Dashboard filters not updating after data refresh in dashboards module

We’re experiencing a critical issue with our executive dashboards where filter selections aren’t reflecting updated data after scheduled refreshes. Our main sales dashboard has date range filters and regional filters that were working fine until last week.

The data refresh completes successfully (we can see new records in the dataset), but the filter dropdowns still show old values. Users have to manually refresh their browser or clear their cache to see current filter options. This is affecting reporting accuracy for our quarterly reviews.

We’ve checked the dashboard filter configuration and the bindings look correct:

{
  "datasets": {"salesData": "latest_sales_v1"},
  "filters": [{"column": "Region", "dataset": "salesData"}]
}

The data refresh triggers are set to run daily at 3 AM, but the filters lag behind by 12-24 hours. Has anyone dealt with cache invalidation issues in Tableau CRM dashboards after data updates?

Check whether your dashboard uses static filter values or dynamic binding. If the filter configuration specifies a hardcoded list of values instead of pulling them from the dataset dynamically, that would explain why refreshes don’t update the filters. In your dashboard JSON, the filter definition should reference the dataset column directly, without any static value arrays. Also verify that the dataset API name in your filter binding matches the refreshed dataset version exactly.
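One way to audit this in bulk is a quick script over the dashboard JSON. This is a sketch only: it assumes the filter shape shown in the question, and the `values` key used here for hardcoded lists is an assumption, not confirmed Tableau CRM schema.

```python
import json

def find_static_filters(dashboard_json: str):
    """Return filter columns that carry a hardcoded value list instead of a
    purely dynamic dataset binding (the 'values' key is an assumed name)."""
    config = json.loads(dashboard_json)
    static = []
    for f in config.get("filters", []):
        if f.get("values"):  # hardcoded list -> won't follow data refreshes
            static.append(f["column"])
    return static

dashboard = '''{
  "datasets": {"salesData": "latest_sales_v1"},
  "filters": [
    {"column": "Region", "dataset": "salesData"},
    {"column": "Quarter", "dataset": "salesData", "values": ["Q1", "Q2"]}
  ]
}'''

print(find_static_filters(dashboard))  # -> ['Quarter']
```

Any column this flags will keep showing the pinned list no matter how often the data refreshes.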

Did you recently upgrade or change the dataset version? Sometimes when you create a new dataset version, the dashboard continues pointing to the old version for filters even if the main queries use the new version. This creates the exact symptom you’re describing - data updates but filters don’t. Check the dataset references in both your query definitions and filter configurations to ensure they’re all pointing to the same current version. You might need to explicitly update the filter dataset binding to point to the latest version rather than relying on the alias.
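If you want to check the reference mismatch described above programmatically, a minimal sketch (assuming filters reference datasets through the top-level alias map, as in the JSON in the question) could look like this:

```python
def dataset_mismatches(config: dict):
    """Flag filter columns whose dataset reference is not in the dashboard's
    alias map, i.e. likely still pointing at an old versioned name."""
    aliases = set(config.get("datasets", {}))
    return [f["column"] for f in config.get("filters", [])
            if f.get("dataset") not in aliases]

config = {
    "datasets": {"salesData": "latest_sales_current"},
    "filters": [
        {"column": "Region", "dataset": "salesData"},      # matches alias map
        {"column": "Product", "dataset": "salesData_v1"},  # stale reference
    ],
}
print(dataset_mismatches(config))  # -> ['Product']
```

Anything flagged here is a filter that will keep reading the old version's metadata even while the main queries follow the current alias.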

The solution involves three key areas that need to be aligned:

Dashboard Filter Configuration: Your filter binding needs to use the dataset alias rather than a specific version. Update your dashboard JSON to reference the current dataset:

{
  "datasets": {"salesData": "latest_sales_current"},
  "filters": [{"column": "Region", "dataset": "salesData", "useCurrentVersion": true}]
}

Data Refresh Triggers: Ensure your dataflow output is set to overwrite the same dataset alias each time rather than creating versioned outputs. In your dataflow configuration, set the output node to “Update existing dataset” mode instead of “Create new version.” This ensures the dashboard always points to the freshest data without needing manual rebinding.
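For reference, a dataflow output node that writes to the same alias on every run looks roughly like this (an `sfdcRegister` node; the node name and `source` value here are illustrative, not from your org):

```json
{
  "Register_Sales": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "latest_sales_current",
      "name": "Latest Sales",
      "source": "Transform_Sales"
    }
  }
}
```

Because the `alias` stays constant across runs, the dashboard binding never has to chase a new versioned dataset name.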

Cache Invalidation: After fixing the dataset versioning, you need to force a metadata cache refresh. You can do this through the API or by editing and re-saving the dashboard (which triggers cache invalidation). For ongoing prevention, add a post-refresh webhook that calls the dashboard metadata refresh API:

POST /services/data/v52.0/wave/dashboards/{dashboardId}/refreshMetadata
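A minimal post-refresh hook can be sketched with the Python standard library. The endpoint path is the one above; the instance URL, dashboard ID, and bearer-token auth are assumptions you'd replace with your own session handling:

```python
import urllib.request

def build_metadata_refresh_request(instance_url: str, dashboard_id: str,
                                   session_token: str) -> urllib.request.Request:
    """Build (but don't send) the POST that asks the dashboard to rebuild
    its filter metadata after a data refresh completes."""
    url = (f"{instance_url}/services/data/v52.0/wave/dashboards/"
           f"{dashboard_id}/refreshMetadata")
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Bearer {session_token}",
                 "Content-Type": "application/json"},
    )

# Hypothetical instance and dashboard ID for illustration:
req = build_metadata_refresh_request(
    "https://example.my.salesforce.com", "0FK000000000001", "<token>")
print(req.method, req.full_url)
# Send with urllib.request.urlopen(req) once the token is real.
```

Wiring this into whatever triggers after your 3 AM dataflow run closes the gap between data refresh and filter-metadata refresh.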

The key insight is that filter metadata is cached separately from query results. When your dataflow creates a new dataset version, the dashboard queries may pick up the new data through the alias, but the filters can still reference the old version’s metadata until it is explicitly refreshed. Setting useCurrentVersion: true in the filter definitions and using dataflow overwrite mode instead of versioning solves this permanently.

Also verify your scheduled refresh job includes the metadata refresh step. In tcrm-2021, this isn’t automatic and needs to be explicitly configured in the dataflow schedule settings under Advanced Options.

Another thing to verify: are you using a dataflow to refresh, or a direct dataset refresh? Dataflows can have their own caching behavior. Also check whether any transformation or recipe step might be caching filter values. I’ve encountered situations where a computed dimension used in filters had aggressive caching that wasn’t tied to the data refresh schedule.

I’ve seen this exact behavior before. The issue is usually with the dashboard metadata cache not being invalidated when the underlying dataset updates. The filter values get cached separately from the data itself, so even though your data refreshes successfully, the filter dropdown options remain stale until the metadata cache expires naturally (usually 24 hours).