Service case data archival policy not purging closed cases older than retention period in high-volume org

We configured a data archival policy for service cases three months ago to automatically purge closed cases older than 18 months. The policy shows as active in setup, but our storage metrics indicate zero records have been archived. Cases meeting the criteria (Status = Closed AND ClosedDate < 18 months ago) are still present in full.

Checking the archival logs shows successful runs but 0 records processed each time. We have custom triggers on the Case object for notification workflows and a flow that updates related records when cases close. Could these be interfering with the archival process?

Our storage is now at 87% capacity and we’re facing slow query performance on case reports. Need to understand why eligible records aren’t being identified and purged by the policy.

Here’s the comprehensive solution to get your archival working:

1. Archival Policy Criteria Review

Your policy criteria need to be more specific. Navigate to Setup > Data Archival Policies > Your Policy and verify the SOQL matches:

SELECT Id FROM Case
WHERE Status = 'Closed'
AND ClosedDate < LAST_N_MONTHS:18
AND IsArchivalReady__c = true

Add a custom checkbox field IsArchivalReady__c to flag cases that have completed all business processes.

2. Automation Dependency Resolution

Your triggers and flows are preventing archival because they create runtime dependencies. Solutions:

  • Modify your case closure flow to check the IsArchivalReady__c field. Set it to true only after all related record updates complete
  • Guard your notification trigger so its logic only runs outside batch context, e.g. wrap it in if (!Trigger.isDelete && !System.isBatch()) so it is skipped during archival batch jobs
  • Consider converting your after-update trigger to an asynchronous queueable job that respects archival flags
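A minimal sketch of the queueable approach from the last bullet. The class name is illustrative and IsArchivalReady__c is the custom checkbox suggested in step 1; adapt both to your schema:

```apex
// Hypothetical queueable that finishes post-closure work asynchronously
// and then flags cases as safe to archive. Enqueue it from a slimmed-down
// after-update trigger instead of doing the related updates synchronously.
public class MarkCasesArchivalReady implements Queueable {
    private Set<Id> caseIds;

    public MarkCasesArchivalReady(Set<Id> caseIds) {
        this.caseIds = caseIds;
    }

    public void execute(QueueableContext ctx) {
        List<Case> toFlag = [
            SELECT Id, IsArchivalReady__c
            FROM Case
            WHERE Id IN :caseIds AND Status = 'Closed'
        ];
        for (Case c : toFlag) {
            // ... perform any remaining related-record updates here ...
            c.IsArchivalReady__c = true;
        }
        update toFlag;
    }
}
```

From the trigger you would enqueue it with something like System.enqueueJob(new MarkCasesArchivalReady(Trigger.newMap.keySet()));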

3. SOQL for Identifying Eligible Records

Run this diagnostic query to find cases that should archive but aren’t:

SELECT Id, CaseNumber, ClosedDate,
  (SELECT Id FROM CaseComments),
  (SELECT Id FROM EmailMessages)
FROM Case
WHERE Status = 'Closed'
AND ClosedDate < LAST_N_MONTHS:18

If cases have more than 100 related CaseComments or EmailMessages, they might exceed archival batch limits. You’ll need to archive related objects first.
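To surface the over-limit cases directly rather than scanning subquery results, an aggregate query like this can help (assuming the 100-record threshold above, and that CaseComment exposes the Parent relationship for filtering; run the equivalent against EmailMessage as well):

```sql
SELECT ParentId, COUNT(Id) total
FROM CaseComment
WHERE Parent.Status = 'Closed'
  AND Parent.ClosedDate < LAST_N_MONTHS:18
GROUP BY ParentId
HAVING COUNT(Id) > 100
```

Any ParentId returned here identifies a case whose comments need to be archived or trimmed before the case itself will qualify.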

4. Trigger Impact Mitigation

Create a hierarchy custom setting called ArchivalMode__c with a checkbox field IsActive__c. In your triggers, add:

if (ArchivalMode__c.getInstance().IsActive__c) {
  return; // Skip trigger logic during archival
}

Enable this setting before archival runs, disable after completion.
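Assuming ArchivalMode__c is a hierarchy custom setting (which getInstance() requires), the flag can be toggled in Apex, e.g. from Execute Anonymous or a scheduled job that brackets the archival window:

```apex
// Enable archival mode before the run
ArchivalMode__c mode = ArchivalMode__c.getOrgDefaults();
mode.IsActive__c = true;
upsert mode;

// ... archival job runs; triggers short-circuit via the guard above ...

// Disable it again once the run completes
mode.IsActive__c = false;
update mode;
```

Using the org defaults keeps the toggle global; you could instead scope it to the archival job's running user if other automation must keep firing for everyone else.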

5. Storage Impact Resolution

Once the automation conflicts are resolved, schedule your archival policy to run weekly during off-peak hours. At 87% storage capacity you should see relief quickly. After the first successful run, monitor the Archival History related list to confirm record counts match your SOQL query results.

6. Flow Optimization

For the case closure flow that updates related objects: because it touches related records, it must run on the after-save path (‘Actions and Related Records’) rather than as a ‘Fast Field Update’. Configure those related-record updates to run asynchronously, which removes the flow from the synchronous archival dependency chain.

Implement these changes in a sandbox first, then run a test archival on 100 records to validate. Once confirmed working, deploy to production and execute a full archival run. Storage should drop toward the 60-65% range within about 48 hours of the first successful purge, depending on how much case data is actually eligible.

I’ve seen this exact scenario before. The issue is usually a combination of automation conflicts and policy criteria gaps. Check if your flow has a ‘Fast Field Update’ setting - those can block archival even when they shouldn’t logically interfere.

Don’t forget about field history tracking and related records. If your cases have child records (emails, attachments, comments) that aren’t included in the archival scope, the parent case won’t archive. You need to configure archival policies for all related objects first, or use cascade delete rules. Also verify your profile has the necessary permissions - archival policies require specific system permissions to execute.

Check the archival eligibility using this query:

SELECT Id, CaseNumber, Status, ClosedDate
FROM Case
WHERE Status = 'Closed'
AND ClosedDate < LAST_N_MONTHS:18

Then compare those Ids against cases that have trigger execution logs in the last archival run timeframe. The Debug Logs will show you exactly which trigger fired during the archival attempt.

First check your archival policy criteria configuration. The policy might be looking at a different date field than ClosedDate. Go to Setup > Data Archival Policies and verify the exact SOQL criteria being used. Also confirm the case record type is included in the policy scope - if you have multiple record types, the policy might only target specific ones.

Thanks for the insights. I checked the policy criteria and it does reference ClosedDate correctly. We have two active triggers and one flow on Case. The flow updates a related custom object whenever a case closes - that’s probably blocking archival. Is there a way to see which specific automation is causing the blockage?

Your triggers and flows are likely the culprit here. Salesforce archival policies skip records that have active automation dependencies. When you have before/after triggers or flows referencing the Case object, the archival process marks those records as ineligible for safety reasons. You need to evaluate which automations are truly necessary and consider deactivating non-critical ones during archival windows. We had the same issue last year - our notification trigger was preventing 40,000 cases from archiving. We modified it to check a custom field that flagged records as archival-ready, then the policy worked perfectly.
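For reference, the archival-ready guard described above can be a one-line early exit in the notification trigger. Field and trigger names here are illustrative, borrowing IsArchivalReady__c from the earlier answer:

```apex
// Hypothetical notification trigger: skip cases already flagged as
// archival-ready so the archival policy sees no active dependency.
trigger CaseNotificationTrigger on Case (after update) {
    List<Case> toNotify = new List<Case>();
    for (Case c : Trigger.new) {
        if (c.IsArchivalReady__c == true) {
            continue; // record is in its archival window; stay hands-off
        }
        toNotify.add(c);
    }
    // ... existing notification logic runs against toNotify only ...
}
```

The same check can be added as an entry condition on the flow so both automations stand down once a case is flagged.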