Bulk import of non-conformance records fails with timeout errors

I’m trying to import a large CSV file with approximately 2,800 non-conformance records into our Arena QMS cloud instance (AQP 2022.2), but the import consistently fails with timeout errors after processing about 800-900 records. The error message I’m seeing is:


Import failed: Gateway timeout (504)
Processed: 847 of 2800 records
Status: Connection lost to import service

I’ve tried splitting the file into smaller batches of 1,000 records, but even that fails. The bulk import limits documentation mentions 5,000 records per batch, so I should be well within limits. Is there a timeout threshold that’s more restrictive than the batch size limit? Also, when these imports fail, I can’t seem to download an error report to see which records were actually committed versus rolled back.

Here’s a complete solution addressing all three issues: the timeout threshold, error report access, and import optimization.

Understanding Bulk Import Limits and Timeout Thresholds: Arena QMS cloud has two separate constraints: a 5,000-record batch size limit AND a 15-minute execution timeout. Your issue is the timeout, not the batch size. With attachments, you’re processing at roughly 1 second per record, so only about 900 records can complete before the 15-minute window closes. The solution is to separate your import into two phases:

Phase 1 - Import base records without attachments (reduce CSV to core fields only):


Number,Title,Description,Status,Priority,DetectedDate
NC-2024-001,"Component defect","Issue description",Open,High,2024-01-15

This will process at 5-10 records per second, allowing your full 2,800 records to complete in under 10 minutes.
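One way to produce the Phase 1 file is to strip your export down to those core columns programmatically. A minimal sketch, using only the Python standard library; the column names come from the example above and `strip_to_core_fields` is a hypothetical helper, not part of Arena:

```python
import csv
import io

# Core fields from the Phase 1 CSV example; adjust to match your schema.
CORE_FIELDS = ["Number", "Title", "Description", "Status", "Priority", "DetectedDate"]

def strip_to_core_fields(source_csv: str) -> str:
    """Return a CSV with only the core columns, dropping attachment
    paths and CAPA links so the Phase 1 import stays fast."""
    reader = csv.DictReader(io.StringIO(source_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=CORE_FIELDS)
    writer.writeheader()
    for row in reader:
        # Keep only the core columns; any missing one becomes an empty string.
        writer.writerow({f: row.get(f, "") for f in CORE_FIELDS})
    return out.getvalue()
```

Run the output through your normal bulk import; the attachment paths you dropped here become the input for Phase 2.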

Phase 2 - Use the REST API to attach files programmatically:


// Pseudocode - Attachment upload process:
1. Read NC record IDs from Phase 1 import results
2. For each NC record, retrieve associated attachments from staging area
3. POST to /api/nonconformance/{id}/attachments with multipart form data
4. Handle API response and log success/failures individually
5. Implement retry logic for failed uploads (3 attempts with exponential backoff)
// API timeout is 5 minutes per attachment, much more forgiving than bulk import
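Step 5 above is the part worth getting right. A minimal, client-agnostic sketch of the retry loop: the actual HTTP POST is passed in as a callable so any HTTP library works, and `upload_with_retry` is a hypothetical helper, not an Arena API:

```python
import time
from typing import Callable

def upload_with_retry(post: Callable[[], bool],
                      max_attempts: int = 3,
                      base_delay: float = 1.0) -> bool:
    """Call `post` (one attachment upload returning True on success)
    up to max_attempts times, sleeping base_delay * 2**attempt between
    failures, i.e. exponential backoff of 1 s, 2 s, 4 s, ..."""
    for attempt in range(max_attempts):
        if post():
            return True
        if attempt < max_attempts - 1:
            time.sleep(base_delay * 2 ** attempt)
    return False
```

In Phase 2, `post` would wrap the multipart POST to /api/nonconformance/{id}/attachments and return whether the response succeeded; log any final False results for a manual retry pass.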

Error Report Download Access: When bulk imports timeout, navigate to Admin > Data Management > Import History (not System Logs). Find your import job by timestamp - even failed jobs are logged. Click the job ID to access the detailed execution report. This report shows:

  • Records processed before timeout
  • Validation errors per record
  • Transaction rollback confirmation
  • Partial success data (which records validated successfully even if not committed)

Download this report as CSV to identify any data quality issues before retrying.

Optimization Tips: Reduce the batch size to 500 records for imports with complex data relationships. This provides better error isolation and faster feedback. For your 2,800 records, that means six batches (five of 500 plus a final 300) instead of one large import. Each batch completes in 2-3 minutes, well under the timeout threshold, and you can monitor progress more effectively.
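The batching itself is a one-liner once the rows are loaded from the CSV; `split_into_batches` is a hypothetical helper for illustration:

```python
def split_into_batches(rows: list, batch_size: int = 500) -> list:
    """Split import rows into fixed-size batches; the last may be partial."""
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]
```

For 2,800 rows at a batch size of 500 this yields five full batches plus one of 300; submit them sequentially and check the import history between batches.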

Absolutely - that’s the recommended approach for large migrations. Import your NC records without attachments first using a streamlined CSV with just the core fields. Then use the Arena API to attach files in a second pass. This also gives you better error handling since you can retry individual attachment uploads without re-importing the entire record. The API has separate timeout limits that are more forgiving for file operations.

The 5,000 record limit is for total batch size, but there’s also a time-based timeout. Cloud deployments have a 15-minute hard limit on import operations. With 2,800 records failing at around 900, that suggests each record is taking roughly 1 second to process. Are your non-conformance records particularly complex with lots of attachments or related records?
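A quick back-of-the-envelope check of that estimate, using the numbers from the error output:

```python
TIMEOUT_SECONDS = 15 * 60      # 15-minute hard limit on cloud import operations
records_before_timeout = 847   # "Processed: 847 of 2800 records" from the 504 error

# Roughly 1.06 s per record, so only about 900 records fit in the window.
seconds_per_record = TIMEOUT_SECONDS / records_before_timeout
```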

One more thing - verify your CSV encoding and line endings. We’ve seen timeout issues caused by malformed CSV files where the parser gets stuck on special characters in description fields. Use UTF-8 encoding and make sure any text field containing commas or line breaks is enclosed in double quotes. This can dramatically reduce processing time per record.
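A cheap pre-flight check along those lines: decode as UTF-8 and confirm every record parses to the expected field count. A sketch using only the standard library; `preflight_csv` is a hypothetical helper:

```python
import csv
import io

def preflight_csv(raw: bytes, expected_fields: int) -> list:
    """Return a list of problems: a UTF-8 decode failure, or any record
    whose field count differs from expected, a common sign of unquoted
    commas or line breaks in description fields."""
    try:
        text = raw.decode("utf-8")
    except UnicodeDecodeError as exc:
        return [f"not valid UTF-8: {exc}"]
    problems = []
    for recno, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        if row and len(row) != expected_fields:
            problems.append(
                f"record {recno}: {len(row)} fields (expected {expected_fields})")
    return problems
```

Quoted fields with embedded line breaks parse as a single record here, so a clean run suggests the import parser should not stall on them either.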

Yes, many of these NC records have attachments and links to related CAPA items. Each record averages 2-3 attachments (PDFs, images) ranging from 500KB to 5MB. That could definitely be adding processing overhead. Is there a way to import the base records first, then add attachments separately?

For the error report download issue - when an import times out, the transaction is typically rolled back completely, so no partial data is committed. However, Arena should still generate a processing log. Check Admin > System Logs > Import History and look for your failed import job ID. The detailed log should be downloadable from there even if the import didn’t complete successfully.