We’re migrating historical non-conformance records from our legacy system into Qualio using the bulk import API endpoint. The import consistently fails with 422 validation errors, but the error messages aren’t specific enough to identify which fields are causing issues.
We’re sending batches of 50 records at a time using the bulk import endpoint. Our JSON payload includes all required fields according to the documentation: NC number, description, detected date, and status. However, we keep getting generic validation failures.
POST /api/v1/non-conformances/bulk
Response: 422 Unprocessable Entity
{"errors": ["Validation failed for record 12"]}
The schema validation seems inconsistent - sometimes the same payload structure works, other times it fails. We’ve verified required field mapping against the API docs, but there must be something we’re missing about how Qualio validates bulk imports versus single record creation.
Thanks for the suggestion. I’ve verified our date formats are ISO 8601 compliant. The status values we’re using are: “open”, “under_investigation”, “closed”. Are these the correct enum values for qual-2022.2? The API documentation doesn’t list the exact acceptable values for the status field in bulk operations.
One thing to watch out for - if any of your records reference related entities like departments or users, those references need to exist in Qualio first. The bulk endpoint doesn’t create relationships on the fly. I’d recommend enabling verbose error logging in your API client to capture the full validation response. Sometimes there are nested error objects that get truncated in the summary message.
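To make "capture the full validation response" concrete, here's a minimal sketch of a response parser that surfaces nested error objects instead of just the summary string. The nested "details" key is an assumption on my part -- inspect your own verbose responses to see what Qualio actually returns:

```python
import json

def extract_validation_errors(response_body: str) -> list[str]:
    """Flatten top-level and nested error objects from a 422 response.

    Assumes errors may be plain strings or objects with "message" and a
    per-field "details" map (hypothetical shape -- verify against your
    own verbose responses).
    """
    payload = json.loads(response_body)
    messages = []
    for err in payload.get("errors", []):
        if isinstance(err, str):
            messages.append(err)
        elif isinstance(err, dict):
            # Nested error objects may carry per-field detail that the
            # summary message truncates.
            messages.append(err.get("message", ""))
            for field, detail in err.get("details", {}).items():
                messages.append(f"{field}: {detail}")
    return messages
```

Logging every message from this list per failed record should tell you which field is tripping the validator.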
I encountered this exact issue during our data migration project. The problem stems from how the bulk import endpoint handles required field mapping versus the schema validation rules.
Bulk Import Endpoint Usage:
The bulk endpoint in qual-2022.2 requires explicit field mapping in the request headers. You need to include an X-Field-Mapping header that declares which fields you're populating, even if they're documented as required. Without this header, the validator falls back to default mappings that may not match your payload structure.

JSON Schema Validation:
The validation errors occur because the bulk endpoint applies stricter schema validation than single-record creation. Specifically:
- Status values must be uppercase: “OPEN”, “UNDER_INVESTIGATION”, “CLOSED” (not lowercase)
- Date fields require timezone offset even if documentation says it’s optional
- Numeric fields (like severity scores) cannot be sent as strings, even quoted numbers
Required Field Mapping:
Here’s the corrected approach:
POST /api/v1/non-conformances/bulk
Headers:
X-Field-Mapping: nc_number,description,detected_date,status,severity
Content-Type: application/json
Payload:
{
  "records": [
    {
      "nc_number": "NC-2024-001",
      "description": "Material defect",
      "detected_date": "2024-11-15T10:30:00+00:00",
      "status": "OPEN",
      "severity": 3
    }
  ]
}
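If it helps, here's how the request above looks as a small Python helper using only the standard library. Note the X-Field-Mapping format here is taken from this thread, not from official documentation, and the base URL and auth handling are placeholders:

```python
import json
import urllib.request

def build_bulk_request(base_url: str, records: list[dict]) -> urllib.request.Request:
    """Build the bulk-import POST with the X-Field-Mapping header.

    Derives the mapping from the first record's keys, so every record in
    the batch should share the same field set. Auth headers omitted.
    """
    body = json.dumps({"records": records}).encode("utf-8")
    field_mapping = ",".join(records[0].keys()) if records else ""
    return urllib.request.Request(
        url=f"{base_url}/api/v1/non-conformances/bulk",
        data=body,
        method="POST",
        headers={
            "X-Field-Mapping": field_mapping,
            "Content-Type": "application/json",
        },
    )
```

Deriving the mapping from the record keys keeps the header and payload from drifting apart, which is exactly the mismatch the validator seems to punish.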
Additional tips:
- Reduce batch size to 20 records maximum for qual-2022.2
- Enable detailed error responses by adding the ?verbose=true query parameter
- Validate each record against the schema individually before batching
- Ensure all referenced entity IDs (users, departments) exist and are active
The schema validator in the bulk endpoint also checks for duplicate NC numbers across the entire batch, not just against existing records. If your batch contains duplicate numbers internally, it will fail validation even if those numbers don’t exist in Qualio yet.
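A pre-flight check for duplicates within the batch is cheap and saves a round trip. A minimal sketch:

```python
from collections import Counter

def find_internal_duplicates(records: list[dict]) -> list[str]:
    """Return NC numbers that appear more than once within a single batch."""
    counts = Counter(r["nc_number"] for r in records)
    return [nc for nc, n in counts.items() if n > 1]
```

Run this on each batch before submitting; any non-empty result means the whole batch would fail validation regardless of what already exists in Qualio.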
After implementing these changes, our migration success rate went from 40% to 98%, with remaining failures being legitimate data quality issues.
The intermittent failures you’re experiencing suggest a data consistency issue rather than endpoint configuration. I’d recommend testing with a minimal payload first - just the absolute required fields with no optional data. Build up from there to identify which field is causing the validation failure.
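One way to automate that build-up: generate a sequence of payloads that starts with just the required fields and adds one optional field at a time, then submit each until one fails. The required-field list here matches the docs as quoted earlier in the thread; adjust it for your instance:

```python
# Required fields per the API docs quoted above -- adjust as needed.
REQUIRED_FIELDS = ["nc_number", "description", "detected_date", "status"]

def build_test_payloads(full_record: dict) -> list[dict]:
    """Yield payloads that add one optional field at a time on top of the
    required minimum, so the first failing payload isolates the bad field."""
    minimal = {k: full_record[k] for k in REQUIRED_FIELDS}
    payloads = [dict(minimal)]
    for key, value in full_record.items():
        if key not in REQUIRED_FIELDS:
            candidate = dict(payloads[-1])
            candidate[key] = value
            payloads.append(candidate)
    return payloads
```

Submitting these in order against a single-record endpoint (or a batch of one) pinpoints the offending field in at most one request per optional field.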