OData batch import fails for large supply planning payloads with timeout errors

We’re experiencing timeout errors when importing large supply planning datasets via OData batch requests in D365 F&O 10.0.41. Our planning updates include demand forecasts, inventory levels, and procurement recommendations, typically 15,000-20,000 records per batch.

The batch request structure:


POST /data/$batch HTTP/1.1
Content-Type: multipart/mixed; boundary=batch_123

--batch_123
Content-Type: application/http
Content-Transfer-Encoding: binary

POST /data/SupplyPlanningEntities HTTP/1.1
Content-Type: application/json

{ ...record payload... }
--batch_123--

We’re hitting timeout limits around 12,000 records, and the Data Management framework shows “Request timeout” errors. The OData service seems to have limitations we’re not aware of. Has anyone dealt with batch size optimization for supply planning imports? What are the recommended limits for OData batch operations in D365?

The OData timeout can be adjusted in IIS settings, but that’s not the real issue here. You’re hitting Data Management framework throttling limits designed to protect system performance. For supply planning volumes, consider using the Data Management REST API instead of direct OData - it’s optimized for bulk operations. The DMF API handles batching internally and provides better error recovery. You can submit a data project via API and poll for completion status. This approach gives you better control over batch processing without timeout concerns.
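To illustrate the submit-and-poll pattern, here's a rough Python sketch. `ImportFromPackage` and `GetExecutionSummaryStatus` are the standard DMF package-API actions, but verify them against your environment's metadata; the `post` helper, `base_url`, data project name, and token handling are placeholders for your own authenticated HTTP layer, not real D365 artifacts:

```python
import time

# Standard DMF package-API actions (confirm against your environment's /data metadata):
IMPORT_ACTION = "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage"
STATUS_ACTION = "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus"

def submit_import(post, base_url, package_url, project_name, legal_entity):
    """Submit a data package import; returns the execution ID D365 assigns."""
    body = {
        "packageUrl": package_url,          # blob URL of the uploaded package .zip
        "definitionGroupId": project_name,  # data project defined in Data Management
        "executionId": "",                  # blank -> let D365 generate one
        "execute": True,
        "overwrite": True,
        "legalEntityId": legal_entity,
    }
    return post(base_url + IMPORT_ACTION, body)

def wait_for_completion(post, base_url, execution_id,
                        poll_seconds=15, timeout_seconds=3600):
    """Poll the execution summary status until the job reaches a terminal state."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = post(base_url + STATUS_ACTION, {"executionId": execution_id})
        if status in ("Succeeded", "PartiallySucceeded", "Failed", "Canceled"):
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"DMF execution {execution_id} still running after {timeout_seconds}s")
```

Because the import runs server-side, your client only holds a lightweight polling loop open, so the OData request timeout never comes into play.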

Thanks Maya. We’re using individual operations without changesets. Breaking into 5,000 record batches might work, but that significantly increases our integration time. We need to process these updates every 4 hours for production planning. Is there a way to increase the OData timeout limits, or would that impact other system operations? Our current architecture depends on larger batch sizes to meet the scheduling requirements.
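For what it's worth, splitting into 5,000-record batches doesn't have to serialize the run. A minimal sketch, where the hypothetical `send_batch` callable stands in for whatever submits one OData batch:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import islice

def chunked(records, size=5000):
    """Yield successive fixed-size batches from an iterable of records."""
    it = iter(records)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def import_all(records, send_batch, batch_size=5000, workers=4):
    """Submit batches concurrently; returns send_batch results in batch order."""
    batches = list(chunked(records, batch_size))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(send_batch, batches))
```

With a few workers in flight, total wall-clock time per 4-hour cycle can stay close to what the single large batch took, while each individual request stays well under the timeout threshold. Keep worker counts modest so you don't trip the same throttling limits Maya mentioned.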

I agree with using DMF API for bulk imports. Another consideration: are you processing these synchronously? For supply planning updates that aren’t time-critical within seconds, asynchronous processing through recurring data jobs might be more reliable. You could schedule the import, let D365 handle the batch execution, and monitor via execution history. This removes the timeout constraint entirely since the job runs in the background with proper resource allocation.
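If you go the recurring data job route, the client side reduces to handing a file to the enqueue endpoint and letting D365 schedule execution. A sketch, assuming the recurring-integrations `/api/connector/enqueue` endpoint; the activity ID comes from the data job's scheduling setup in D365, and token acquisition is out of scope here:

```python
import urllib.request

def build_enqueue_request(base_url, activity_id, entity_name, payload, token):
    """Build the HTTP request that hands a file to a recurring data job.

    activity_id identifies the scheduled data job in D365; payload is the
    raw file bytes (zip package or single data file). The returned Request
    can be sent with urllib.request.urlopen or swapped for your HTTP client.
    """
    url = f"{base_url}/api/connector/enqueue/{activity_id}?entity={entity_name}"
    return urllib.request.Request(
        url,
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/zip",  # or text/csv for a single file
        },
    )
```

From there the execution history in Data Management shows each run's outcome, which is usually easier to monitor operationally than per-request OData error handling.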