We’re running into serious API rate limiting issues when trying to bulk import users into account management. Our company recently acquired another organization, and we need to provision about 3,500 users into Adobe Experience Cloud.
We built a script that reads from our HR system CSV and calls the User Management API to create accounts. It works fine for the first 200-300 users, then we start getting 429 rate limit errors. The script continues but about 40% of users fail to provision.
// Naive sequential loop: one request per user, no batching, backoff, or retry
for (const user of userList) {
  await createUser(user);
}
// Error: 429 Too Many Requests after ~300 iterations
We’ve tried adding a 1-second delay between requests, but that would take over an hour for the full import and we still hit limits. The partial provisioning is creating chaos - some departments have access while others don’t. What’s the right approach for bulk user imports with proper batching and retry logic?
You need a proper queue system for this. Push all users into a message queue, then have workers pull batches and process them with rate limiting. We use Redis for tracking which users succeeded and which failed. On 429 errors, put the batch back in the queue with a delay. This way you can resume from failures without re-processing successful users.
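To make the approach above concrete, here is a minimal sketch of the queue-with-resume idea. `createUsers(batch)` is a hypothetical stand-in for your API call, assumed to reject with an error carrying `status: 429` when throttled; the queue and the done-set are in-memory here to keep the sketch self-contained, where in production both would live in Redis (e.g. a list for pending batches and a set of provisioned user IDs) so a crashed worker can resume without re-processing successes.

```javascript
// Sketch only: createUsers(batch) is a hypothetical API wrapper,
// and the in-memory structures stand in for Redis.
async function processQueue(users, createUsers, { batchSize = 50, retryDelayMs = 2000 } = {}) {
  // Split the full user list into batches and enqueue them
  const queue = [];
  for (let i = 0; i < users.length; i += batchSize) {
    queue.push(users.slice(i, i + batchSize));
  }
  const succeeded = new Set(); // in Redis: SADD import:done <userId>

  while (queue.length > 0) {
    // Skip users that already succeeded, so retried batches are idempotent
    const batch = queue.shift().filter((u) => !succeeded.has(u.id));
    if (batch.length === 0) continue;
    try {
      await createUsers(batch);
      batch.forEach((u) => succeeded.add(u.id));
    } catch (err) {
      if (err.status === 429) {
        // Back off, then put the batch back on the queue instead of dropping it
        await new Promise((resolve) => setTimeout(resolve, retryDelayMs));
        queue.push(batch);
      } else {
        throw err; // non-rate-limit failures need their own handling
      }
    }
  }
  return succeeded;
}
```

Because success is tracked per user rather than per run, restarting the worker only retries the remainder of the list.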
Even with the bulk endpoint, you need proper batching. Don’t send all 3,500 at once; break the list into batches of 50-100 users per request. Between batches, implement exponential backoff: start with a 2-second delay and double it on each 429 error, up to a maximum of 60 seconds. Also check the Retry-After header in 429 responses; it tells you exactly how long to wait.
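The batching-plus-backoff logic above can be sketched like this. `sendBatch(batch)` is a hypothetical wrapper around the API call, assumed to reject with an error carrying `status: 429` and, when the server supplies it, a `retryAfter` value in seconds parsed from the Retry-After header; the parameter names and defaults mirror the numbers in the answer, not any Adobe SDK.

```javascript
// Sketch only: sendBatch(batch) is an assumed API wrapper that throws
// { status: 429, retryAfter?: seconds } when the service throttles us.
async function importInBatches(users, sendBatch, { batchSize = 50, baseDelayMs = 2000, maxDelayMs = 60000 } = {}) {
  for (let i = 0; i < users.length; i += batchSize) {
    const batch = users.slice(i, i + batchSize);
    let delayMs = baseDelayMs;
    for (;;) {
      try {
        await sendBatch(batch);
        break; // batch accepted, move on to the next one
      } catch (err) {
        if (err.status !== 429) throw err;
        // Prefer the server's Retry-After value when it is present
        const waitMs = err.retryAfter ? err.retryAfter * 1000 : delayMs;
        await new Promise((resolve) => setTimeout(resolve, waitMs));
        delayMs = Math.min(delayMs * 2, maxDelayMs); // exponential backoff, capped at 60s
      }
    }
  }
}
```

Each batch retries in place until it is accepted, so a burst of 429s slows the import down rather than losing users.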
Check if you have multiple API keys or service accounts. Adobe’s rate limits are per API key, not per organization. If you have access to multiple service accounts, you can parallelize the import across them. But be careful - aggressive parallel requests can trigger additional security throttling.
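If you do have multiple service accounts, the parallel split might look like the following sketch. Each `client` object is assumed to wrap one credential and expose a hypothetical `createUsers(batch)` method; the users are round-robined across clients, and each client paces its own batches to avoid the security throttling mentioned above.

```javascript
// Sketch only: each client wraps one service account / API key and
// exposes an assumed createUsers(batch) method. Rate limits apply per
// key, so each client works through its own slice at its own pace.
async function importAcrossClients(users, clients, { batchSize = 50, pauseMs = 1000 } = {}) {
  // Round-robin the user list so each client gets an even slice
  const slices = clients.map((_, i) => users.filter((_, j) => j % clients.length === i));
  await Promise.all(
    clients.map(async (client, i) => {
      const slice = slices[i];
      for (let j = 0; j < slice.length; j += batchSize) {
        await client.createUsers(slice.slice(j, j + batchSize));
        // Modest per-key pacing: aggressive parallelism can trigger extra throttling
        await new Promise((resolve) => setTimeout(resolve, pauseMs));
      }
    })
  );
}
```

In a real run you would combine this with the backoff logic from the other answer inside each worker, since every key can still hit its own limit.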