Process analytics dashboard not updating after batch import of historical data

We recently imported 6 months of historical process execution data (approximately 15,000 process instances) through a bulk CSV import into Creatio 8.5. The import completed successfully and the data is visible in the process instance tables, but our process analytics dashboard still shows outdated metrics from before the import.

The dashboard displays process completion rates, average cycle times, and bottleneck analysis - all critical KPIs that should reflect the newly imported historical data. We’ve tried manually refreshing the dashboard and even logging out and back in, but the metrics haven’t updated. The analytics data mart refresh schedule is set to run nightly at 2 AM, and we’ve waited through two refresh cycles with no change.

I suspect the bulk data import isn’t firing the dashboard update triggers that would normally run when processes complete through normal execution. Is there a manual way to force the analytics engine to recalculate metrics from the imported data? This is blocking our quarterly performance reporting.

Also worth checking if your imported process instances have the correct StatusId values. Analytics dashboards often filter by completion status, and if your import used different status codes than the live system, those records won’t appear in completion rate calculations even after the data mart refreshes.

I can provide a complete solution covering all three aspects of your analytics refresh issue:

Analytics Data Mart Refresh: The core problem is that Creatio’s incremental data mart refresh only processes records modified after the last refresh timestamp. Your bulk import created records with historical CreatedOn dates, but the data mart’s ‘last processed’ marker is still set to before your import. Here’s how to fix it:

  1. Navigate to System Designer > Lookups > ‘SysAnalyticsDataMart’ lookup
  2. Find the record for your process analytics dashboard
  3. Clear or set the ‘LastRefreshDate’ field to a date before your oldest imported record
  4. Trigger a manual refresh through Configuration > SQL Console: Execute the stored procedure ‘tsp_RefreshProcessAnalyticsMart’ with parameter @FullRefresh = 1

Alternatively, use the System Designer > Analytics section and select ‘Force Full Rebuild’ for the Process Analytics data mart specifically, rather than rebuilding all marts.
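To see why the nightly incremental refresh never picks up the imported rows, here is a minimal sketch of the filtering logic using an in-memory SQLite table. The table and column names are simplified stand-ins for illustration, not Creatio’s actual data mart schema:

```python
import sqlite3

# Illustrative schema: simplified stand-in for the process log table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ProcessLog (Id INTEGER, CreatedOn TEXT)")

# A live record created after the last refresh marker...
conn.execute("INSERT INTO ProcessLog VALUES (1, '2025-06-10')")
# ...and a bulk-imported record whose CreatedOn predates the marker.
conn.execute("INSERT INTO ProcessLog VALUES (2, '2024-12-01')")

last_refresh = "2025-06-01"

# Incremental refresh only looks at CreatedOn > LastRefreshDate,
# so the historical import (Id=2) is silently skipped.
incremental = conn.execute(
    "SELECT Id FROM ProcessLog WHERE CreatedOn > ? ORDER BY Id",
    (last_refresh,),
).fetchall()
print(incremental)  # [(1,)] - the imported record never enters the mart

# Resetting the marker to before the oldest imported record fixes it.
last_refresh = "2024-01-01"
full = conn.execute(
    "SELECT Id FROM ProcessLog WHERE CreatedOn > ? ORDER BY Id",
    (last_refresh,),
).fetchall()
print(full)  # [(1,), (2,)] - both records are now in scope
```

This is exactly why step 3 above (clearing or back-dating ‘LastRefreshDate’) matters: the marker, not the physical insert time, decides what the incremental pass sees.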

Bulk Data Import Impact: When importing historical process data, you must ensure several fields are correctly populated for analytics to work:

  • StatusId must match your system’s process status lookup values (typically: 1=Running, 2=Completed, 3=Error, 4=Cancelled)
  • CreatedOn and ModifiedOn timestamps must reflect actual historical dates
  • ProcessSchemaId must match existing process definitions in your system
  • All required process parameter values must be present

After import, run a validation query to check data quality:

  SELECT COUNT(*) AS InvalidRecords
  FROM SysProcessLog
  WHERE StatusId NOT IN (1, 2, 3, 4)
     OR CreatedOn IS NULL
     OR ProcessSchemaId NOT IN (
         SELECT Id FROM SysSchema
         WHERE ManagerName = 'ProcessSchemaManager'
     )

A nonzero count indicates records that need correction before analytics will process them correctly.
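A single count tells you something is wrong but not why. Here is a hedged sketch, using an in-memory SQLite database with made-up sample rows, that breaks the check into per-reason counts so each failure mode is visible separately:

```python
import sqlite3

# Simplified stand-ins for SysProcessLog / SysSchema; column names
# mirror the validation query above, but the data is fabricated.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE SysSchema (Id INTEGER, ManagerName TEXT);
CREATE TABLE SysProcessLog (Id INTEGER, StatusId INTEGER,
                            CreatedOn TEXT, ProcessSchemaId INTEGER);
INSERT INTO SysSchema VALUES (10, 'ProcessSchemaManager');
INSERT INTO SysProcessLog VALUES
    (1, 2, '2024-12-01', 10),   -- valid
    (2, 9, '2024-12-02', 10),   -- unknown status code
    (3, 2, NULL,         10),   -- missing CreatedOn
    (4, 2, '2024-12-03', 99);   -- orphaned schema reference
""")

# One named condition per failure mode from the validation query.
checks = {
    "bad_status": "StatusId NOT IN (1, 2, 3, 4)",
    "null_created_on": "CreatedOn IS NULL",
    "orphan_schema": ("ProcessSchemaId NOT IN "
                      "(SELECT Id FROM SysSchema "
                      "WHERE ManagerName = 'ProcessSchemaManager')"),
}
report = {
    name: conn.execute(
        f"SELECT COUNT(*) FROM SysProcessLog WHERE {cond}"
    ).fetchone()[0]
    for name, cond in checks.items()
}
print(report)  # {'bad_status': 1, 'null_created_on': 1, 'orphan_schema': 1}
```

Running each condition separately makes it obvious whether you have a status-mapping problem, a timestamp problem, or orphaned schema references, which determines how you fix the import.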

Dashboard Update Triggers: Process analytics dashboards don’t automatically detect bulk imported data because they rely on real-time process completion events to trigger metric updates. To force recognition of imported data:

  1. Clear the dashboard cache: System Settings > ‘ClearDashboardCache’ = true, then restart the application
  2. Update dashboard widget date filters to explicitly include your historical date range
  3. Verify widget data sources are querying the correct process log tables (SysProcessLog, SysProcessElementLog)
  4. For each dashboard widget, check the ‘Refresh Interval’ setting - set to ‘On Open’ temporarily to see immediate updates

For future bulk imports, implement this workflow:

  • Import data during maintenance window
  • Immediately trigger full data mart refresh
  • Validate dashboard metrics against known totals from source data
  • Clear application cache before users access updated dashboards

Regarding performance during rebuild: Monitor using the ‘Database Performance’ dashboard in System Designer. If CPU exceeds 80% or query wait times spike, pause the rebuild. The rebuild process is resumable - it will continue from where it stopped when restarted. Schedule completion during your next maintenance window.

Finally, document this process for future historical data imports and consider creating a scheduled job that automatically triggers data mart refresh after detecting bulk import operations (check for >1000 records inserted in SysProcessLog within a 1-hour window).
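A sketch of that detection logic, assuming a hypothetical InsertedOn audit column (substitute whatever physical-insert timestamp your import process writes; CreatedOn won’t work here, since the import back-dates it):

```python
import sqlite3

BULK_THRESHOLD = 1000  # rows per hour, per the heuristic above

def bulk_import_detected(conn, now, threshold=BULK_THRESHOLD):
    """Flag a bulk import: more than `threshold` rows landed in the
    last hour. InsertedOn is a hypothetical audit column recording
    the physical insert time, not the back-dated CreatedOn."""
    count = conn.execute(
        "SELECT COUNT(*) FROM SysProcessLog "
        "WHERE InsertedOn >= datetime(?, '-1 hour')", (now,)
    ).fetchone()[0]
    return count > threshold

# Demo with an in-memory database: 1,500 rows inserted at 02:30.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SysProcessLog (Id INTEGER, InsertedOn TEXT)")
conn.executemany(
    "INSERT INTO SysProcessLog VALUES (?, ?)",
    [(i, "2025-06-10 02:30:00") for i in range(1500)],
)

if bulk_import_detected(conn, "2025-06-10 03:00:00"):
    print("bulk import detected - trigger full data mart refresh")
```

In production this check would run on a schedule (e.g. every 15 minutes) and kick off the full data mart refresh instead of printing.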

Yes, data mart rebuilds can impact database performance significantly. They run intensive aggregation queries across large process tables. If possible, schedule this during off-hours or a maintenance window. You can also monitor the database CPU and memory usage to see if it’s affecting user operations. Consider pausing the rebuild and restarting tonight if users are experiencing slowness.

You probably need to trigger a full data mart rebuild rather than waiting for the incremental nightly refresh. Navigate to System Designer > Analytics > Data Mart Settings and look for a ‘Rebuild All’ option. This will reprocess all historical data including your imports. Be aware this can take several hours depending on data volume.