Here’s what you need to do, based on similar implementations I’ve architected. The integration event logs are telling you exactly what’s wrong: you’re exceeding processing capacity, not just hitting a simple timeout. Let me address your three key areas systematically.
First, regarding bulk import file size limits: Workday’s documented 10MB limit for performance management imports is conservative, but the real constraint is processing time, not file size. Your 12MB file with 2,534 records is attempting too many complex validations in a single transaction. Split your imports into batches of at most 1,000-1,200 records. For example:
// Batch processing logic
Batch 1: Records 1-1200 (EMP-0001 to EMP-1200)
Batch 2: Records 1201-2400 (EMP-1201 to EMP-2400)
Batch 3: Records 2401-2534 (EMP-2401 to EMP-2534)
// Schedule batches 30 minutes apart
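The batch plan above can be sketched in code. This is a minimal illustration, assuming the records are already loaded into a list; the EMP-xxxx identifiers mirror the plan above and are not a real export.

```python
# Sketch: split an in-memory list of import records into batches of at most
# 1,200, matching the three-batch plan above.

def split_into_batches(records, batch_size=1200):
    """Return consecutive slices of `records`, each at most `batch_size` long."""
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

# Example: 2,534 records -> batches of 1200, 1200, and 134
records = [f"EMP-{n:04d}" for n in range(1, 2535)]
batches = split_into_batches(records)
print([len(b) for b in batches])  # [1200, 1200, 134]
```

Each batch can then be written out as its own import file and scheduled 30 minutes after the previous one.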
Second, for employee ID validation, implement pre-validation before import. Create a separate integration that runs nightly to validate employee eligibility for performance reviews. This caches validation results and dramatically reduces real-time lookup overhead during import. The validation should check: active employment status, eligibility date ranges, organizational assignment, and manager hierarchy. Store results in a staging table that your import process references.
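A minimal sketch of that nightly eligibility pass, assuming employee records are available as dictionaries; the field names and the in-memory dict standing in for the staging table are assumptions, not Workday’s actual data model.

```python
from datetime import date

# Sketch: nightly pre-validation of performance-review eligibility.
# Field names and the dict-based "staging table" are illustrative assumptions.

def is_eligible(emp, as_of):
    """Apply the four checks: active status, eligibility window, org, manager."""
    return (
        emp["status"] == "Active"
        and emp["eligible_from"] <= as_of <= emp["eligible_to"]
        and emp["org"] is not None
        and emp["manager_id"] is not None
    )

def build_staging_cache(employees, as_of):
    """Precompute eligibility so the import does a cheap lookup, not a live check."""
    return {emp["id"]: is_eligible(emp, as_of) for emp in employees}

employees = [
    {"id": "EMP-0001", "status": "Active", "eligible_from": date(2024, 1, 1),
     "eligible_to": date(2024, 12, 31), "org": "Sales", "manager_id": "EMP-0100"},
    {"id": "EMP-0002", "status": "Terminated", "eligible_from": date(2024, 1, 1),
     "eligible_to": date(2024, 12, 31), "org": "Sales", "manager_id": "EMP-0100"},
]
cache = build_staging_cache(employees, date(2024, 6, 1))
print(cache)  # {'EMP-0001': True, 'EMP-0002': False}
```

During import, each record does a single dictionary lookup against the cache instead of a live hierarchy traversal.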
Third, enhance your integration event log monitoring. Configure alerts for memory utilization above 75% and for processing time exceeding 10 minutes. If your integration platform or middleware exposes runtime settings, increase the heap allocation to 2GB and enable detailed logging; the property names below are illustrative, so confirm them against your platform’s documentation:
wt.integration.maxMemory=2048MB
wt.integration.logLevel=DEBUG
wt.integration.timeout=1800000   # 30 minutes, in milliseconds
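The alert thresholds above can be expressed as a small check over integration event records. The record shape here (`memory_pct`, `duration_min`) is an assumption; in practice you would pull these values from your integration event report.

```python
# Sketch: flag integration events that breach the alert thresholds above.
# The event-record fields are illustrative assumptions.

MEMORY_ALERT_PCT = 75    # alert above 75% memory utilization
DURATION_ALERT_MIN = 10  # alert above 10 minutes of processing time

def needs_alert(event):
    """True if the event breached either threshold."""
    return (event["memory_pct"] > MEMORY_ALERT_PCT
            or event["duration_min"] > DURATION_ALERT_MIN)

events = [
    {"id": "INT-001", "memory_pct": 62, "duration_min": 4},
    {"id": "INT-002", "memory_pct": 81, "duration_min": 6},
    {"id": "INT-003", "memory_pct": 70, "duration_min": 14},
]
alerts = [e["id"] for e in events if needs_alert(e)]
print(alerts)  # ['INT-002', 'INT-003']
```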
Additional critical steps: schedule imports during off-peak hours (2-6 AM in your timezone) to avoid resource contention. Implement exponential backoff retry logic for transient failures. If you’re currently using EIB, consider Workday’s Cloud Connect for Integrations; it handles large volumes better. Enable parallel processing if your integration framework supports it, but limit it to 2-3 concurrent threads so you don’t overwhelm the system.
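The exponential backoff suggestion can be sketched as follows. `submit_batch` and `TransientError` are hypothetical stand-ins for whatever call launches your import and whatever exception signals a retryable failure.

```python
import random
import time

class TransientError(Exception):
    """Stand-in for a retryable failure (timeout, throttling, lock contention)."""

def retry_with_backoff(submit_batch, max_attempts=5, base_delay=2.0):
    """Call `submit_batch`, retrying transient failures with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return submit_batch()
        except TransientError:
            if attempt == max_attempts - 1:
                raise  # out of retries; surface the failure
            # Double the wait each attempt, with jitter so parallel retries spread out.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

Permanent failures (bad data, validation errors) should raise a different exception type so they fail fast instead of being retried.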
For your immediate deadline crunch, manually split your 12MB file into three roughly 4MB files of about 850 records each (smaller than the 1,200-record ceiling above, for extra headroom). Process them sequentially with 30-minute gaps. This gets you through the current cycle while you implement the longer-term architectural improvements outlined above. Confirm each batch’s completion in the integration event logs before starting the next.
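The sequential, gap-spaced processing can be sketched as a simple runner. `submit` and `get_status` are hypothetical hooks into however you launch imports and read event status, and the terminal status strings are assumptions.

```python
import time

GAP_SECONDS = 30 * 60  # 30-minute gap between batches

def run_sequentially(batch_files, submit, get_status,
                     poll_interval=60, gap=GAP_SECONDS):
    """Submit each file, poll until its event finishes, then wait out the gap."""
    for i, path in enumerate(batch_files):
        event_id = submit(path)
        # Poll the integration event until it reaches a terminal status.
        while get_status(event_id) not in ("Completed", "Completed with Errors", "Failed"):
            time.sleep(poll_interval)
        if i < len(batch_files) - 1:
            time.sleep(gap)  # spacing before the next batch
```

In the real process, `get_status` would read the integration event log; on a "Failed" status you would stop and investigate before continuing.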
The memory spikes you’re seeing point to inefficient data handling. Review your import mapping and remove any unnecessary custom fields or calculated values that can be populated after import via separate business processes. Focus on core goal data during the import and enrich records afterward through automated workflows.
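Trimming the mapping down to core fields can be sketched as a simple pre-load filter; the field names here are assumptions about a goal-import layout, not Workday’s actual schema.

```python
# Sketch: keep only core goal fields at import time; everything else gets
# populated afterward by separate business processes. Field names are assumed.

CORE_FIELDS = {"employee_id", "goal_id", "goal_title", "due_date", "status"}

def prune_record(record, keep=CORE_FIELDS):
    """Drop custom/calculated fields so they can be enriched post-import."""
    return {k: v for k, v in record.items() if k in keep}

raw = {
    "employee_id": "EMP-0001", "goal_id": "G-17",
    "goal_title": "Close Q3 reviews", "due_date": "2024-09-30",
    "status": "In Progress",
    "calculated_weight": 0.25,            # derivable after import
    "custom_notes": "populate post-import",
}
print(sorted(prune_record(raw)))
```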
One final critical point: verify that your Workday release supports the volume you’re attempting. The 2023R1 release had known performance constraints with bulk operations that were addressed in later releases. If possible, coordinate with your Workday account team to schedule your tenant upgrade before the next review cycle.