Analytics API returns inconsistent metrics after schema update

After upgrading our analytics schema in Pega 8.6, the analytics API is returning inconsistent metrics compared to what we see in the Pega UI dashboards. Some KPIs are off by 20-30%, and certain dimension fields are returning null values that previously had data.

Example API response showing issues:


GET /api/analytics/v1/metrics/case-volume
{"total_cases": 1247, "avg_resolution_time": null}

The dashboard shows 1586 cases and 4.2 days average resolution time for the same period. We recently added new custom fields to our case types and updated the analytics schema to include them. The dashboard widget config seems to be pulling correct data, but the API field mapping is broken. How do we fix API schema versioning to ensure consistent metrics across both UI and API access?

Also verify your field mapping for the resolution time calculation. The null value might indicate the field name changed from something like pyElapsedTime to pyResolutionDuration. Check the analytics schema definition to see the exact field names, then update your API’s data transform to map correctly. The dashboard might have been updated automatically through UI configuration, but API mappings are manual.

The null values for avg_resolution_time suggest a field mapping issue. When you updated the analytics schema, did you also update the corresponding report definition that the API uses? The dashboard widgets might be using a different report definition than the API endpoint. Verify that both the UI dashboard and API service reference the same underlying data source.

Schema updates often break API mappings because the API service doesn’t automatically pick up new field definitions. Check your analytics API service rule - the data mapping might still reference old field names. If you renamed or restructured fields during the schema update, you need to manually update the API’s response data transform to map to the new schema.

Here’s a comprehensive solution addressing all three aspects of your analytics API issues:

1. API Schema Versioning: Implement proper API versioning to handle schema evolution without breaking existing integrations:

Create Versioned API Endpoints:

  • Maintain existing endpoint: /api/analytics/v1/metrics (uses old schema)
  • Create new endpoint: /api/analytics/v2/metrics (uses updated schema)
  • Set deprecation timeline for v1 (e.g., 6 months)

Service Rule Configuration: Create separate service rules for each version:

  • REST_AnalyticsMetrics_v1: References ReportDefinition_CaseMetrics v1
  • REST_AnalyticsMetrics_v2: References ReportDefinition_CaseMetrics v2

Version Header Support: Allow clients to specify version via header as alternative to URL versioning:


GET /api/analytics/metrics
Header: API-Version: 2.0

Implement version routing logic in your service activity that selects the appropriate report definition based on the requested version.
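The routing step above can be sketched as a small resolver: a minimal Python sketch (not actual Pega service-activity code) that picks a report definition from the `API-Version` header, falling back to the version segment in the URL. The `REPORT_BY_VERSION` table and rule names are illustrative assumptions.

```python
# Hypothetical routing table; the rule names here are assumptions, not
# verified Pega rule identifiers.
REPORT_BY_VERSION = {
    "1.0": "ReportDefinition_CaseMetrics_v1",
    "2.0": "ReportDefinition_CaseMetrics_v2",
}

def resolve_report(path, headers):
    """Return the report definition for a request; header takes precedence."""
    version = headers.get("API-Version")
    if version is None:
        # Fall back to URL-based versioning: /api/analytics/v1/metrics -> 1.0
        for segment in path.split("/"):
            if segment.startswith("v") and segment[1:].isdigit():
                version = segment[1:] + ".0"
                break
    if version not in REPORT_BY_VERSION:
        raise ValueError("Unsupported API version: %s" % version)
    return REPORT_BY_VERSION[version]
```

Header-based routing lets clients migrate without changing URLs, while URL-based routing keeps the version visible in logs; supporting both, as sketched here, covers both client styles.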

Schema Migration Guide: Document field mapping changes for API consumers:


v1 → v2 Field Changes:
- pyElapsedTime → pyResolutionDuration
- CaseStatus → pyStatusWork
- AssignedTo → pyAssignmentOperator
- CustomField1 → pyCustomMetric

Provide a migration endpoint that returns both old and new field names during the transition period:


GET /api/analytics/v2/metrics?compatibility=v1
Returns: {"total_cases": 1586, "avg_resolution_time": 4.2, "pyResolutionDuration": 4.2}
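The dual-name payload above can be sketched as a small response builder: when the caller passes `compatibility=v1`, the v2 response also echoes the renamed value under its new schema name. This is an illustrative Python sketch, not Pega service code.

```python
def build_v2_response(metrics, compatibility=None):
    """metrics holds v2 values keyed by the public API field name."""
    response = dict(metrics)
    if compatibility == "v1":
        # Duplicate the renamed metric under the new schema field name as
        # well, mirroring the dual-name payload shown above.
        response["pyResolutionDuration"] = metrics.get("avg_resolution_time")
    return response
```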

2. Dashboard Widget Config Alignment: Ensure API and dashboard use consistent data sources:

Unified Report Definition: Create a master report definition that both dashboard and API reference:

  • Name: ReportDefinition_CaseMetrics_Master
  • Include all fields needed by both UI and API
  • Use column aliases to maintain backward compatibility

Data Transform: MapAnalyticsResponse_v2


Set .total_cases = .CaseCount
Set .avg_resolution_time = .pyResolutionDuration
Set .completion_rate = .CompletionPercentage
Set .sla_compliance = .SLAComplianceRate

Apply this transform in both the API service and dashboard widget configuration to ensure identical field mapping.
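The transform above amounts to a fixed mapping table from report-definition columns to API field names. A minimal Python sketch of the same idea, using the column names listed in the transform (the function name is mine, not a Pega rule):

```python
# API field name -> source column in the report definition, per the
# MapAnalyticsResponse_v2 transform above.
RESPONSE_MAPPING = {
    "total_cases": "CaseCount",
    "avg_resolution_time": "pyResolutionDuration",
    "completion_rate": "CompletionPercentage",
    "sla_compliance": "SLAComplianceRate",
}

def map_response(row):
    # A missing source column surfaces as None, matching the null symptom
    # in the question rather than silently dropping the field.
    return {api: row.get(col) for api, col in RESPONSE_MAPPING.items()}
```

Centralizing the mapping in one table is the point: if both the API and the dashboard read from the same table, a rename only has to be made in one place.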

Dashboard Widget Update: Modify your dashboard widget to explicitly specify field mappings:

  • Widget Type: KPI Metric
  • Data Source: ReportDefinition_CaseMetrics_Master
  • Field Mappings:
    • Display Label: “Total Cases”
    • Data Field: .CaseCount
    • Format: Number

Match these exact field references in your API response structure.

Validation Query: Create a validation report that compares API output to dashboard data:

SELECT
  COUNT(*) as CaseCount,
  AVG(pyResolutionDuration) as AvgResolutionTime,
  SUM(CASE WHEN pyStatusWork='Resolved-Completed' THEN 1 ELSE 0 END) as CompletedCount
FROM pc_work_case
WHERE pxCreateDateTime BETWEEN ? AND ?

Run this query manually to verify which source (API or dashboard) is correct.
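Once you have the three counts (API, dashboard, direct SQL), a quick way to interpret them is majority vote: the source whose count disagrees with the other two is the likely culprit. A small sketch, assuming you have already fetched the counts by hand:

```python
from collections import Counter

def mismatched_sources(counts):
    """Return the sources whose count differs from the majority value.

    counts: dict like {"api": 1247, "dashboard": 1586, "sql": 1586}
    """
    majority, _ = Counter(counts.values()).most_common(1)[0]
    return sorted(s for s, c in counts.items() if c != majority)
```

With the numbers from the question, `{"api": 1247, "dashboard": 1586, "sql": 1586}` would flag `api` as the outlier, pointing at the API-side filter or mapping.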

3. Field Mapping Correction: Fix the specific field mapping issues causing null values and incorrect counts:

Resolution Time Field: The null avg_resolution_time indicates an unmapped field. Update your API data transform:

Before (broken):


Set .avg_resolution_time = .pyElapsedTime

After (fixed):


Set .avg_resolution_time = .pyResolutionDuration

Verify the field exists in your report definition by checking the Columns tab.

Case Count Discrepancy (1247 vs 1586): The 339-case difference suggests a filter mismatch. Check these potential causes:

API Filter Configuration: Your API might be applying additional filters not present in the dashboard:

  • Status filters (excluding certain case statuses)
  • Work pool filters (limiting to specific teams)
  • Date range interpretation (UTC vs local timezone)
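The timezone cause in the last bullet is easy to demonstrate: "start of March 1" interpreted in a local timezone and in UTC are different instants, so each source counts a different set of cases created near midnight. A sketch with an assumed UTC-5 client offset:

```python
from datetime import datetime, timedelta, timezone

# Assumed client offset for illustration (e.g. US Eastern standard time).
local_tz = timezone(timedelta(hours=-5))

# "Start of March 1" as a local-time boundary vs a UTC boundary.
local_start = datetime(2023, 3, 1, tzinfo=local_tz)
utc_start = datetime(2023, 3, 1, tzinfo=timezone.utc)

# The two window starts differ by the UTC offset; cases created in that
# 5-hour gap fall inside one query's range but outside the other's.
gap = local_start - utc_start
```

If the API passes date parameters as UTC while the dashboard resolves them in the operator's locale, a steady stream of cases created in that gap accumulates into exactly the kind of count drift described above.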

Review your report definition’s filter criteria:


API Report Filter: pyStatusWork != 'New' (excludes 339 cases)
Dashboard Report Filter: (no status filter)

Align the filters to match business requirements.

Custom Field Mapping: For newly added custom fields, explicitly map them in the API response:


Set .custom_field_1 = .pyCustomMetric
Set .custom_field_2 = .pyCustomDimension
Set .custom_field_3 = .pyCustomKPI

If custom fields are null, verify they’re included in the report definition’s column list and properly populated in case data.
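To make that verification systematic, you can flag every null field in a sample API response and then check each flagged name against the report definition's column list. A trivial helper sketch (the function is mine, field names follow the mapping above):

```python
def null_fields(response):
    """Return the response fields that came back null, sorted by name."""
    return sorted(k for k, v in response.items() if v is None)
```

Usage: `null_fields({"custom_field_1": 7, "custom_field_2": None})` flags `custom_field_2`; a flagged field is either missing from the report definition's columns or simply unpopulated in the case data.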

Implementation Steps:

  1. Immediate Fix (for v1 API):

    • Update REST_AnalyticsMetrics_v1 to use ReportDefinition_CaseMetrics v2
    • Fix field mappings in data transform
    • Test with existing API consumers
  2. Create v2 API (recommended approach):

    • Clone REST_AnalyticsMetrics_v1 to create v2
    • Update v2 to use new schema with proper field names
    • Add API version routing logic
    • Document changes and notify API consumers
  3. Align Dashboard and API:

    • Ensure both use same report definition
    • Validate field mappings match
    • Test metrics consistency
  4. Validation:

    • Run parallel queries: API vs Dashboard vs Direct SQL
    • Compare results for 1-week sample period
    • Verify all metrics match within 1% tolerance
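The 1% tolerance check in step 4 can be sketched as a per-metric comparison; a minimal Python version, assuming you have pulled matching metric dicts from the API and the dashboard:

```python
def within_tolerance(api, dashboard, rel_tol=0.01):
    """Map each shared metric to True if |api - dash| <= rel_tol * |dash|."""
    return {
        k: abs(api[k] - dashboard[k]) <= rel_tol * abs(dashboard[k])
        for k in api.keys() & dashboard.keys()
    }
```

With the figures from the question, a corrected API count of 1580 vs the dashboard's 1586 passes (difference of 6 against a ~15.9 tolerance), while the original 1247 fails, which is the signal you want from the validation run.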

Testing Checklist:

  • [ ] API v2 returns same case count as dashboard (1586)
  • [ ] avg_resolution_time field no longer null
  • [ ] All custom fields populated correctly
  • [ ] Date range filters produce consistent results
  • [ ] Performance acceptable (< 2 second response time)

Implementing proper API versioning with aligned field mappings will restore consistency between your API and dashboard metrics while maintaining backward compatibility for existing integrations.

Don’t just point the API to v2 - that could break existing API consumers. Implement proper API versioning instead. Create a new API endpoint version (v2) that uses the updated schema, while maintaining v1 for backward compatibility. This way, external systems can migrate to the new API version on their own schedule. Configure both versions to coexist: /api/analytics/v1/metrics and /api/analytics/v2/metrics.

I checked and the API service is using ReportDefinition_CaseMetrics v1, while the dashboard uses ReportDefinition_CaseMetrics v2 (created during the schema upgrade). That explains the discrepancy. Should I update the API to use v2, or is there a better way to handle schema versioning for APIs?