We’re having major issues uploading large demand forecast CSVs to our Blue Yonder Luminate 2023.2 cloud environment from our on-prem planning tools. Files over 10MB consistently fail with generic error messages, while smaller files upload fine. Our monthly forecast files typically contain 50,000+ SKU-location combinations and run about 15-18MB.
The upload API returns a 413 error (Request Entity Too Large), which suggests we’re hitting file size limits. I’ve looked through the API documentation but can’t find clear guidance on maximum file sizes or whether there’s a chunked upload strategy we should be using. We tried compressing the CSV to reduce size, but that only gets us to about 12MB and still fails.
POST /api/demand-planning/forecast/upload
Content-Type: text/csv
Content-Length: 15728640
RESPONSE: 413 Request Entity Too Large
Is there a bulk upload endpoint designed for large forecast datasets? Our forecast accuracy depends on timely updates.
Another consideration is network stability between your on-prem environment and the cloud. Large file uploads over unreliable connections will fail even with chunking, so implement retry logic with exponential backoff for each chunk. We also found that uploading during off-peak hours (late evening or early morning) gave us much better success rates because of lower network congestion and faster API response times.
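A minimal sketch of that per-chunk retry pattern, assuming `send` is whatever callable performs the actual HTTP POST of one chunk (the transport here is a stand-in, not the real Luminate API):

```python
import time

def upload_chunk_with_retry(send, chunk, max_retries=5, base_delay=1.0):
    """Retry one chunk upload with exponential backoff.
    `send` is any callable that raises on failure, e.g. an HTTP
    POST of a single chunk (endpoint details are hypothetical)."""
    for attempt in range(max_retries):
        try:
            return send(chunk)
        except Exception:
            if attempt == max_retries - 1:
                raise  # exhausted retries; surface the failure
            # Back off 1s, 2s, 4s, ... before the next attempt.
            time.sleep(base_delay * (2 ** attempt))

# Simulated flaky transport: fails twice, then succeeds.
attempts = {"n": 0}
def flaky_send(chunk):
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient network failure")
    return f"uploaded {len(chunk)} bytes"

print(upload_chunk_with_retry(flaky_send, b"x" * 1024, base_delay=0.01))
```

The key point is that each chunk retries independently, so one transient failure costs you a few seconds rather than the whole 15MB transfer.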
Also check your API authentication token expiration settings. When uploading large files or multiple chunks, the process can take several minutes. If your token expires mid-upload (default is 30 minutes), you’ll get authentication errors partway through. Request a longer-lived token for bulk upload operations, or implement token refresh logic in your upload script. This isn’t obvious from the error messages but causes a lot of failed uploads.
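One way to structure that refresh logic is a small token manager that fetches a new token whenever the current one is close to expiring. `fetch_token` below is a hypothetical callable standing in for your real auth request; it just needs to return the token and its lifetime:

```python
import time

class TokenManager:
    """Keep an API token fresh across a long multi-chunk upload.
    `fetch_token` is a hypothetical callable returning
    (token, lifetime_seconds); swap in your real auth call."""
    def __init__(self, fetch_token, refresh_margin=60):
        self.fetch_token = fetch_token
        self.refresh_margin = refresh_margin  # refresh this many seconds early
        self.token = None
        self.expires_at = 0.0

    def get(self):
        # Refresh when within `refresh_margin` seconds of expiry,
        # so no chunk starts with a token about to lapse.
        if time.time() >= self.expires_at - self.refresh_margin:
            self.token, lifetime = self.fetch_token()
            self.expires_at = time.time() + lifetime
        return self.token

# Simulated auth server issuing 30-minute tokens.
counter = {"n": 0}
def fake_fetch():
    counter["n"] += 1
    return f"token-{counter['n']}", 1800

mgr = TokenManager(fake_fetch)
print(mgr.get())  # fetches a fresh token
print(mgr.get())  # reuses it while still valid
```

Calling `mgr.get()` before each chunk upload means the refresh happens transparently instead of failing partway through the batch.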
Before switching to chunked uploads, consider whether you actually need to upload the entire forecast file each time. The bulk upload API also supports delta uploads where you only send changed forecast values. For 50,000 SKU-locations, you’re probably only updating 10-20% of forecasts in a typical monthly refresh. This could reduce your payload significantly and avoid the file size issue altogether.