Bulk update of service cases via API fails with 413 Payload Too Large

Our support operations team needs to bulk update service case statuses and assignments when we rotate support agents or close resolved cases in batches. We’re using the bulk update API endpoint, but when we try to update more than 200 cases at once, we get HTTP 413 Payload Too Large errors.

Current implementation:


PUT /api/service-cases/bulk-update
Body: [{"case_id": 1, "status": "closed"}, ...]
// 500 case objects in the array

We need to process 1,000+ cases weekly, and doing them in small batches is inefficient and error-prone. The API documentation doesn’t specify payload size limits. We’ve tried compressing the request body with gzip, but still hit the limit around 250 cases. Is there a configuration to increase the payload limit, or should we be using a different approach for large-scale case updates?

Instead of sending full case objects, send only the fields you’re actually updating. If you’re just changing status and assignment, each object can be as small as {"id": 123, "status": "closed", "assigned_to": 456}. Remove any read-only fields, timestamps, or nested objects. This can reduce your payload size by 70-80% and let you fit more cases per request. Also make sure you’re setting the Content-Type: application/json header correctly.
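As a rough sketch (in Python; the field names and the "id" key below are assumptions, so check them against the actual case schema), trimming a full case record down to just the updated fields looks like:

```python
import json

# A hypothetical full case record as returned by a GET (field names assumed).
full_case = {
    "case_id": 123,
    "status": "closed",
    "assigned_to": 456,
    "created_at": "2024-01-05T10:00:00Z",   # read-only: drop from the update
    "updated_at": "2024-02-01T09:30:00Z",   # read-only: drop from the update
    "customer": {"id": 9, "name": "Acme"},  # nested object: drop from the update
}

UPDATE_FIELDS = ("status", "assigned_to")

def to_minimal(case):
    """Keep only the id plus the fields actually being changed."""
    minimal = {"id": case["case_id"]}
    minimal.update({f: case[f] for f in UPDATE_FIELDS})
    return minimal

minimal = to_minimal(full_case)
# The trimmed payload is a fraction of the size of the full record.
```

Serializing `minimal` instead of `full_case` is where the 70-80% reduction comes from.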

Don’t forget about idempotency when using the async API. If a job fails partway through or times out, you need to be able to retry without applying the same updates twice. Use the idempotency_key parameter in your job creation request so that duplicate job submissions are deduplicated instead of reprocessed. This is especially important if you’re automating these bulk updates on a schedule.
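One way to get a stable key is to derive it from the batch contents, so an automated retry of the same batch reuses the same key. A sketch, assuming the API accepts an opaque string as the idempotency_key:

```python
import hashlib
import json

def idempotency_key(updates, run_tag):
    """Derive a stable key from the batch contents plus a run identifier,
    so resubmitting the identical batch produces the identical key and
    the server can deduplicate it."""
    canonical = json.dumps(updates, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(f"{run_tag}:{canonical}".encode()).hexdigest()

batch = [{"id": 123, "status": "closed"}, {"id": 124, "status": "closed"}]
key = idempotency_key(batch, "weekly-rotation-2024-06-03")
# Pass `key` as idempotency_key when creating the job; a retry of the
# same batch with the same run_tag reuses it automatically.
```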

Adobe Experience Cloud has an async bulk operations API that’s perfect for this. Instead of synchronous PUT requests, you POST a bulk operation job that processes in the background. You get a job ID back immediately, then poll for completion. This is designed for operations affecting 500+ records. The job API also handles retries automatically if individual case updates fail, giving you better error handling than manual batching.
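The submit-then-poll flow can be sketched like this. Here `create_job` and `get_status` stand in for the actual HTTP calls (POST the job, GET its status), since the real endpoint paths and response shapes aren’t documented in this thread:

```python
import time

def submit_and_poll(create_job, get_status, interval=0.01, timeout=5.0):
    """Submit an async bulk job, then poll its status until it reaches a
    terminal state or the timeout expires."""
    job_id = create_job()
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status(job_id)
        if status in ("completed", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish in {timeout}s")

# Stub demonstration: the 'job' completes on the third status poll.
calls = {"n": 0}
def fake_create():
    return "job-1"
def fake_status(job_id):
    calls["n"] += 1
    return "completed" if calls["n"] >= 3 else "processing"
```

In real use you would raise the polling interval (seconds, with backoff) and replace the stubs with authenticated HTTP calls.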

The 413 error indicates you’re hitting the API gateway’s request size limit, which is typically 1MB for Adobe Experience Cloud endpoints. The limit isn’t about the number of records but the total payload size. Each case object includes all the fields you’re updating plus metadata, so 200-250 cases can easily push the body past 1MB. That would also explain why gzip didn’t help, if the gateway enforces the limit on the decompressed body. You need to reduce the payload size per request.
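Given a hard byte limit, it’s safer to batch by serialized size than by record count. A sketch that greedily packs updates into sub-1MB request bodies (assuming the body is serialized with compact JSON separators, as the size accounting below does):

```python
import json

def chunk_by_size(updates, max_bytes=1_000_000):
    """Greedily pack update objects into batches whose serialized JSON
    array stays under max_bytes (e.g. a 1 MB gateway limit)."""
    batches, current, size = [], [], 2  # 2 bytes for the enclosing "[]"
    for u in updates:
        # +1 byte for the comma separating this item from the previous one.
        item = len(json.dumps(u, separators=(",", ":")).encode()) + 1
        if current and size + item > max_bytes:
            batches.append(current)
            current, size = [], 2
        current.append(u)
        size += item
    if current:
        batches.append(current)
    return batches

updates = [{"id": i, "status": "closed"} for i in range(100)]
batches = chunk_by_size(updates, max_bytes=200)  # tiny limit, for illustration
```

The accounting is slightly conservative (it charges a comma for the first item too), which is the safe direction when the gateway rejects oversized bodies outright.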

The async approach is definitely the way to go for large batches. One thing to watch out for: make sure your bulk operation job doesn’t exceed the maximum job size, which I believe is 5,000 records or 10MB, whichever comes first. For 1,000 cases with minimal payloads, you should be fine with a single job. Also implement proper error handling: check the job status once it finishes and retrieve any failed case updates from the results, so you can retry just those rather than the whole batch.
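Pulling the failures out of a finished job’s results might look like the sketch below. The result shape used here ({"results": [...]} with per-record "ok" flags) is an assumption, so check it against the actual bulk-job response schema:

```python
def failed_updates(job_result):
    """Extract the per-record failures from a completed job's result
    payload (shape assumed for illustration)."""
    return [r for r in job_result.get("results", []) if not r.get("ok")]

# Hypothetical job result with one success and one failure.
result = {
    "job_id": "job-1",
    "status": "completed_with_errors",
    "results": [
        {"id": 123, "ok": True},
        {"id": 124, "ok": False, "error": "case is locked"},
    ],
}

retry_ids = [r["id"] for r in failed_updates(result)]
# retry_ids now holds only the cases that need to be resubmitted.
```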