We recently modified a record type schema to add new fields for compliance tracking in our approval workflow. The record type updates were published successfully, and the process model was updated to populate these new fields during case creation.
However, our custom process analytics report that pulls data from this record type is not displaying the newly added fields or their values. The report still shows the old schema structure. We’ve tried refreshing the report designer and re-opening the report definition, but the new fields don’t appear in the available data sources.
The process instances are creating records with the new fields populated correctly (verified in the record view), but the analytics report seems stuck with the old record type schema. Has anyone encountered issues with record type schema synchronization in the analytics report designer? Do we need to manually trigger some kind of refresh or republish the report?
Have you checked if the report is using a view or direct record type reference? If it’s pointing to a saved view, that view definition might need to be updated separately to include the new fields. Also, in Appian 22.x versions, there’s sometimes a delay in the analytics data mart synchronization. The process analytics reports pull from a separate data mart that syncs periodically from the operational database. New schema changes need to propagate through that sync cycle before they’re visible in reports.
I ran into this exact issue last month. Here’s what you need to understand: when you modify a record type schema, three separate systems need to stay in sync — the record type definition itself, the process model data mappings, and the analytics report designer metadata.
For record type schema synchronization: After publishing record type changes, navigate to the Admin Console > Data Management > Analytics Data Sync and check the last sync timestamp. The analytics engine maintains a separate metadata repository that mirrors your record type schemas. This sync typically runs every hour, but you can trigger it manually if you have the proper permissions.
For process model data mapping: Even though your process instances show the correct data, verify that your Write Records smart service explicitly maps the new fields. Open your process model, locate the Write Records node, and confirm the field mappings include all new fields. The mapping configuration should show the source process variable and the target record field. If mappings are missing, the data won’t flow to the record even if the schema exists.
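If your environment supports the a!writeRecords() smart service function, the mapping can be audited in the expression itself rather than in the node’s mapping grid. A minimal sketch, assuming a hypothetical Case record type with new complianceStatus and complianceReviewer fields (substitute your own record type and field references):

```sail
/* Hedged sketch: construct the record explicitly so every new field
   appears in the mapping. Record type and field names are placeholders. */
a!writeRecords(
  records: recordType!Case(
    recordType!Case.fields.caseId: pv!caseId,
    recordType!Case.fields.complianceStatus: pv!complianceStatus,
    recordType!Case.fields.complianceReviewer: pv!reviewer
  )
)
```

If a field is absent from the constructor (or from the node’s mapping grid), it simply won’t be written, even though the schema defines it.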
For analytics report designer refresh: This is the critical step most people miss. Simply refreshing the data source isn’t enough. You need to:
- Open the Report Designer and select your report
- Go to Data Sources tab and click the gear icon next to your record type data source
- Select ‘Reconfigure Data Source’ (not just refresh)
- This opens a configuration dialog; don’t change anything, just click through the wizard and save
- This forces the report to rebuild its internal field mapping from the current record type schema
- Now go to your report fields and you should see the new fields available in the field picker
- Add the new fields to your report layout
- Publish the updated report definition
If the fields still don’t appear after reconfiguring the data source, there’s likely a metadata sync lag. In that case, you have two options: wait for the next scheduled analytics sync (check the Admin Console for timing) or have your admin manually trigger the sync. The sync job is called ‘Process Analytics Metadata Refresh’ in the system jobs list.
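Either way, it helps to confirm the new fields are queryable outside the report, e.g. by testing an expression rule. A hedged sketch, again with hypothetical Case / complianceStatus names:

```sail
/* If this returns populated values but the report designer still
   doesn't list the fields, the problem is report metadata, not data.
   Record type and field names are placeholders. */
a!queryRecordType(
  recordType: recordType!Case,
  fields: {
    recordType!Case.fields.caseId,
    recordType!Case.fields.complianceStatus
  },
  pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 10)
).data
```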
One more thing: if you’re using process report tasks or process-backed records in your analytics, make sure the process model version that includes the new field mappings is set as the default version. Reports pull from the default process model version, so if you’re still pointing to an older version, the new fields won’t be available even after all the syncs complete.
After following these steps, your report should display data from the new fields for any process instances created after the schema change. Historical instances won’t have values for the new fields unless you run a data migration to backfill them.
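If you do need a backfill and the volume is modest, one option is a one-off expression run from a Web API or record action that queries the legacy rows and writes a default. A hedged sketch with hypothetical names (for large volumes, a database script against the source table is usually the better tool):

```sail
/* Hypothetical backfill: give pre-change records a default
   complianceStatus. a!writeRecords() must run in a saveInto or
   Web API context; batch anything beyond a few thousand rows. */
a!localVariables(
  local!legacyCases: a!queryRecordType(
    recordType: recordType!Case,
    filters: a!queryFilter(
      field: recordType!Case.fields.complianceStatus,
      operator: "is null"
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 500)
  ).data,
  a!writeRecords(
    records: a!forEach(
      items: local!legacyCases,
      expression: a!update(
        data: fv!item,
        index: recordType!Case.fields.complianceStatus,
        value: "Not Assessed"
      )
    )
  )
)
```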
I’ve seen similar behavior when the analytics engine hasn’t picked up the schema changes. First thing to check: go to the Report Designer and look at the data source configuration. Sometimes you need to explicitly refresh the data source connection to force it to re-read the record type metadata. There’s usually a refresh icon next to the data source name. Have you tried that yet?
This is typically a caching issue with the analytics metadata layer. The process analytics engine maintains its own metadata cache for record types and process models. When you modify a record type schema, the cache doesn’t always invalidate immediately. You might need to wait for the scheduled cache refresh (usually every 30-60 minutes depending on your configuration) or manually clear the analytics cache if you have admin access. Also verify that your process model is actually writing to those new fields: check the process variables and the mapping in your Write Records smart service.
Thanks for the suggestions. I tried refreshing the data source in the Report Designer but the new fields still don’t show up. I don’t have direct access to clear the analytics cache; I’d need to submit a ticket to our admin team. Is there any workaround that doesn’t require admin intervention? The process model is definitely writing the data correctly, because I can see the values when viewing individual records.