Scheduled Flow fails on large CSV import due to batch processing limits and unhandled errors

I built a Scheduled Flow to import records from CSV files stored in Salesforce Files. The flow runs daily at 2 AM to process new account records. It works fine for small files (under 500 records) but fails consistently on larger CSVs with 2,000+ records.

I'm getting these governor limit errors:


```
Flow error: LIMIT_EXCEEDED
SOQL queries: 101 of 100
CPU time: 11,234ms exceeded limit of 10,000ms
Flow Interview failed at element: Loop_Through_CSV_Records
```

The flow reads the CSV content, loops through each row, queries existing Accounts to check for duplicates (matching on Account Number), then either updates or creates a record. I understand that SOQL inside a loop hits governor limits, but Flow Builder doesn't offer bulkification the way Apex does, and Flow's batch-processing options seem limited.

This is causing incomplete imports: only the first 500-600 records process before the transaction times out. I need either a way to optimize the Flow to handle larger datasets, or an Apex batch alternative that keeps the same daily schedule. Has anyone dealt with this?

Check if your CSV parsing logic is efficient. If you’re using formula fields or complex string manipulation inside the loop, that consumes CPU time quickly. Sometimes optimizing the data transformation steps can buy you enough headroom to process more records.

Have you tried using the 'Get Records' element once, before the loop, instead of querying inside it? Query all existing Account Numbers at the start, store them in a collection variable, then use a Decision element to check whether each CSV row's number already exists in that collection. This cuts your SOQL from N queries to 1. You still might hit CPU limits, though, with 2,000 iterations.
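If you do eventually port this to Apex, the same one-query pattern is straightforward to express. A minimal sketch (the `parsedRows` variable and the CSV column order are assumptions, not from the original post):

```apex
// Build the set of Account Numbers first, query once, then check membership in memory.
Set<String> csvNumbers = new Set<String>();
for (List<String> row : parsedRows) {        // parsedRows: List<List<String>>, assumed
    csvNumbers.add(row[1].trim());           // column 1 assumed to hold AccountNumber
}

Map<String, Id> existingByNumber = new Map<String, Id>();
for (Account a : [SELECT Id, AccountNumber FROM Account
                  WHERE AccountNumber IN :csvNumbers]) {   // the single query
    existingByNumber.put(a.AccountNumber, a.Id);
}

// In the per-row loop, existingByNumber.containsKey(num) replaces the per-row Get Records.
```

The same idea the Flow answer describes: one bulk lookup up front, then an in-memory membership check per row.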

Before jumping to Apex, try splitting the work into smaller chunks. One caveat on the usual subflow suggestion: a subflow called from the parent runs in the same transaction, so it does NOT get its own governor limits. To reset limits, each chunk needs a new transaction, e.g. have the scheduled flow publish a Platform Event per chunk of 500 records (passing an offset or file marker rather than the records themselves) and let a platform-event-triggered flow process each chunk. It's hacky but might work for your use case without rewriting everything in Apex.

The Winter '25 release added some Flow optimization features but they won’t solve your fundamental problem. With 2,000+ records you’re beyond what Flow can handle efficiently, even with bulkification best practices.

I've seen this pattern fail many times. Flows are great for simple automation but poor for bulk data processing. The 10-second CPU timeout is a hard limit you can't work around in Flow: even if you optimize the SOQL, you'll still hit CPU time with that many loop iterations. The proper solution is an Apex batch class scheduled via the Apex Scheduler. It processes records in chunks of 200 by default (configurable via the scope parameter of Database.executeBatch), each chunk runs in its own transaction with a fresh set of governor limits, and a QueryLocator-based batch can handle up to 50 million records. Migrating from Flow to Apex is straightforward; I'll describe the pattern below.
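A sketch of that batch pattern. Assumptions not in the original post: the CSV is the latest ContentVersion titled `AccountImport`, and its columns are `Name,AccountNumber`; adjust the query and parsing to your actual file.

```apex
// Batch over CSV lines (Strings) rather than sObjects, since the source is a file.
global class AccountCsvImportBatch implements Database.Batchable<String> {

    global Iterable<String> start(Database.BatchableContext bc) {
        ContentVersion cv = [SELECT VersionData FROM ContentVersion
                             WHERE Title = 'AccountImport' AND IsLatest = true
                             LIMIT 1];
        List<String> rows = cv.VersionData.toString().split('\n');
        rows.remove(0);              // drop the header row
        return rows;
    }

    global void execute(Database.BatchableContext bc, List<String> rows) {
        // Collect this chunk's account numbers, then run ONE duplicate-check query.
        Map<String, Account> byNumber = new Map<String, Account>();
        for (String row : rows) {
            List<String> cols = row.split(',');
            byNumber.put(cols[1].trim(),
                         new Account(Name = cols[0].trim(),
                                     AccountNumber = cols[1].trim()));
        }
        for (Account existing : [SELECT Id, AccountNumber FROM Account
                                 WHERE AccountNumber IN :byNumber.keySet()]) {
            // Carrying the Id over turns the upsert into an update for this row.
            byNumber.get(existing.AccountNumber).Id = existing.Id;
        }
        upsert byNumber.values();    // one DML statement for the whole chunk
    }

    global void finish(Database.BatchableContext bc) {
        // Optional: send a summary email or chain a follow-up job here.
    }
}
```

Each execute() call is its own transaction, so the per-chunk SOQL and CPU usage never accumulates across the file. Note that an Iterable-based start() (needed here, since CSV rows aren't sObjects) doesn't carry the 50-million-record guarantee that a QueryLocator does, but it's far beyond what a Flow loop can handle.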

Your flow architecture is hitting classic governor limits. SOQL inside loops is the main culprit: you're running 2,000+ queries when processing large files, against a limit of 100 per synchronous transaction. Flow Builder isn't designed for bulk operations at this scale. Move to Apex batch processing, which handles chunking automatically; each execute() chunk runs in its own asynchronous transaction with its own limits (200 SOQL queries and 50,000 retrieved rows). I can share a template if needed.
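The scheduling half of that template might look like this; `AccountCsvImportBatch` is a hypothetical batch class (any `Database.Batchable` implementation of your import works here), and the job name and chunk size are placeholders:

```apex
// Sketch: run the import batch daily at 2 AM, matching the original flow's schedule.
global class AccountCsvImportScheduler implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Second argument is the chunk size passed to each execute() call;
        // every chunk runs in its own transaction with fresh governor limits.
        Database.executeBatch(new AccountCsvImportBatch(), 500);
    }
}

// Register once from Anonymous Apex. Apex cron fields are:
// seconds minutes hours day-of-month month day-of-week
// System.schedule('Nightly account CSV import', '0 0 2 * * ?',
//                 new AccountCsvImportScheduler());
```

This replaces the Scheduled Flow one-for-one: the Apex Scheduler fires at 2 AM, and the batch framework handles the chunking that the Flow had to fake with loops.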