We’ve configured comprehensive data quality validation rules for our territory management module to prevent invalid territory assignments, but these rules aren’t being enforced during bulk import operations via REST API. When we manually create or update territory records through the UI, validation works perfectly - invalid postal codes are rejected, overlapping territories are flagged, and required fields are enforced.
However, when importing territories through our integration:
POST /crmRestApi/resources/11.13.18.05/territories
Validation: Bypassed
Records Imported: 1,247
Invalid Records: ~340 (estimated)
The bulk import succeeds but creates invalid territory data that violates our quality rules. Our REST API configuration should be triggering the same validation as the UI, and our error handling is set up to reject invalid records. This is causing data integrity issues with territory assignments and impacting sales routing. Why would validation rules work in the UI but not through the API import process?
Thanks for the insights. I checked our API requests and we’re not including the validateData parameter - that’s likely a major part of the issue. However, I’m concerned about performance impact if we enable full validation on bulk imports of 1000+ territories. Is there a way to configure asynchronous validation that doesn’t block the import but still flags invalid records for review? Also, how do I modify the execution context for existing data quality rules without having to recreate them all?
One important aspect of bulk import validation is configuring proper error handling at the API level. Even with validation enabled, if your error handling isn’t configured to stop on validation failures, the import will continue processing invalid records. In your REST API call, set the onError parameter to ABORT rather than CONTINUE. This ensures that when validation fails, the entire batch is rolled back rather than partially committing valid records and leaving invalid ones in limbo.
Let me provide a comprehensive solution that addresses all aspects of bulk import validation for territory management.
Bulk Import Validation Configuration:
The core issue is that REST API bulk imports bypass data quality validation by default for performance reasons. To enable validation, modify your API requests:
- Add validation parameters to your POST request:
  POST /crmRestApi/resources/11.13.18.05/territories
  Header: validateData=true
  Header: validationLevel=FULL
- These headers instruct the API to execute the same validation rules that run in the UI context. Without them, the import service assumes pre-validated data and skips quality checks.
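As a rough sketch, a Python client might assemble these headers like so. The validateData and validationLevel header names are the ones described above; confirm them against your environment before relying on them:

```python
import json

# Hypothetical client-side helper: builds the path, headers, and body for
# a validated bulk import. Header names come from the discussion above.
ENDPOINT = "/crmRestApi/resources/11.13.18.05/territories"

def build_import_request(territories):
    """Return (path, headers, body) for a bulk import with full validation."""
    headers = {
        "Content-Type": "application/json",
        "validateData": "true",      # run the same rules as the UI
        "validationLevel": "FULL",   # full rule set, not a spot check
    }
    body = json.dumps({"territories": territories})
    return ENDPOINT, headers, body

path, headers, body = build_import_request([{"TerritoryName": "West"}])
```

The tuple can then be passed to whatever HTTP client your integration already uses.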
Data Quality Rules Execution Context:
Your existing data quality rules need to be configured to execute in bulk import scenarios:
- Navigate to Setup > Data Quality > Validation Rules
- For each territory validation rule, click Edit
- Go to the “Execution Settings” tab
- Enable these execution contexts:
  - UI_CONTEXT: already enabled (working correctly)
  - API_CONTEXT: enable for REST API operations
  - BULK_IMPORT_CONTEXT: critical for bulk operations
  - INTEGRATION_CONTEXT: enable for scheduled integration jobs
- Set “Validation Timing” to “Pre-Commit” rather than “Post-Commit” so validation happens before records are saved
- For territory-specific validations (overlap detection, postal code validation), set “Execution Priority” to “High” to ensure they run even under heavy load
REST API Configuration:
Configure your integration to use the proper validation framework:
- Modify your bulk import payload to include validation directives:
  {
    "territories": [...],
    "importOptions": {
      "validationMode": "FULL",
      "onError": "ABORT",
      "validateAsync": false
    }
  }
- The ABORT error handling is critical: if any record fails validation, the entire batch is rolled back. This prevents partial imports that leave your territory data in an inconsistent state.
- Set validateAsync to false for synchronous validation during the import. While this impacts performance, it ensures immediate feedback on validation failures.
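A minimal sketch of building that payload in Python. The validationMode, onError, and validateAsync key names are taken from the example above and may differ in your API version:

```python
import json

def build_bulk_payload(territories, abort_on_error=True):
    """Wrap territory records with the import options shown above."""
    return {
        "territories": territories,
        "importOptions": {
            "validationMode": "FULL",
            # ABORT rolls back the whole batch on any validation failure
            "onError": "ABORT" if abort_on_error else "CONTINUE",
            "validateAsync": False,  # synchronous: immediate feedback
        },
    }

payload = build_bulk_payload([{"TerritoryName": "West Coast"}])
body = json.dumps(payload)
```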
Error Handling Strategy:
Implement robust error handling to capture and report validation failures:
- Configure your REST API client to capture validation error responses:
  - HTTP 400 responses indicate validation failures
  - Parse the response body for detailed validation error messages
  - Each failed record will include field-level validation errors
- Set up error logging in your integration, along these lines:

  if response.status_code == 400:
      errors = response.json().get("errors", [])  # field-level messages
      log_validation_errors(errors)
      quarantine_failed_records(errors)
      notify_data_quality_team(errors)
- Create an error handling workflow:
- Failed records go to a quarantine table
- Data quality team reviews and corrects errors
- Corrected records are resubmitted through the same validation process
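To make the workflow concrete, here is a sketch of a quarantine store using a local SQLite table. The table and column names are illustrative, not part of any product schema:

```python
import json
import sqlite3

# Illustrative quarantine store; table and column names are made up.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE territory_quarantine ("
    "record TEXT, errors TEXT, status TEXT DEFAULT 'PENDING_REVIEW')"
)

def quarantine(record, errors):
    """Park a failed record with its validation errors for review."""
    conn.execute(
        "INSERT INTO territory_quarantine (record, errors) VALUES (?, ?)",
        (json.dumps(record), json.dumps(errors)),
    )

def corrected_records():
    """Records the review team has marked CORRECTED, ready to resubmit."""
    rows = conn.execute(
        "SELECT record FROM territory_quarantine WHERE status = 'CORRECTED'"
    )
    return [json.loads(row[0]) for row in rows]

quarantine({"TerritoryName": "Wst Coast"}, ["Invalid postal code"])
```

Resubmission then feeds corrected_records() back through the same validated import path.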
Performance Optimization:
Full validation on bulk imports can impact performance. Implement these optimizations:
- Batch Size Optimization:
- Reduce batch size from 1000+ to 250-500 records per API call
- Smaller batches complete faster and provide better error isolation
- If one batch fails validation, you don’t lose the entire import
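The batching recommendation can be sketched as a simple generator:

```python
def batches(records, size=250):
    """Yield the import in batches of `size` records (250-500 recommended)."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# 1,247 records split into batches of 250: four full batches plus one of 247
chunks = list(batches(list(range(1247)), size=250))
```

Each chunk becomes one API call, so a validation failure aborts only that chunk's batch.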
- Asynchronous Validation Option: for very large imports where performance is critical, use the async validation framework:
- Set validateAsync=true in import options
- The import completes immediately, validation runs in background
- Invalid records are flagged in a validation report
- Use this for initial data loads, not for ongoing operational imports
- Validation Rule Optimization:
- Territory overlap detection is expensive - consider running it as a separate post-import job
- Cache postal code validation data to reduce lookup overhead
- Use indexed fields in validation rule criteria
Territory-Specific Validation Considerations:
Territory management has unique validation requirements:
- Overlap Detection:
- This validation can timeout on large imports
- Configure it as a separate validation phase
- Run overlap detection as a scheduled job after bulk import completes
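If you do move overlap detection to a post-import job, the core check is a pairwise intersection of each territory's coverage. A sketch, assuming coverage is represented as sets of postal codes:

```python
from itertools import combinations

def find_overlaps(coverage):
    """Pairwise overlap check; `coverage` maps territory name -> set of
    postal codes. Returns (territory_a, territory_b, shared_codes) tuples."""
    overlaps = []
    for (a, codes_a), (b, codes_b) in combinations(coverage.items(), 2):
        shared = codes_a & codes_b
        if shared:
            overlaps.append((a, b, sorted(shared)))
    return overlaps

report = find_overlaps({
    "North": {"94101", "94102"},
    "South": {"94110", "94111"},
    "Metro": {"94102", "94110"},  # overlaps both North and South
})
```

Running this as a scheduled job keeps the expensive O(n²) comparison out of the import's critical path.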
- Postal Code Validation:
- Ensure your postal code reference data is up to date
- Configure validation to use cached postal code lookups
- Set appropriate timeout values (default 30 seconds may be too low)
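Client-side, a cached lookup can be as simple as memoizing the membership check. The reference set here is a stand-in for your real postal code data:

```python
from functools import lru_cache

# Stand-in reference data; in practice this would be loaded from your
# postal code reference tables.
VALID_POSTAL_CODES = {"94101", "94102", "94110"}

@lru_cache(maxsize=None)
def is_valid_postal_code(code):
    """Memoized lookup so repeated codes skip the reference-data hit."""
    return code in VALID_POSTAL_CODES

first_check = is_valid_postal_code("94101")   # computed
repeat_check = is_valid_postal_code("94101")  # served from the cache
```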
- Hierarchy Validation:
- Parent territory must exist before child territories are imported
- Sort your import data by hierarchy level
- Import parent territories first, then children
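A sketch of the parent-first ordering, using hypothetical TerritoryName and ParentTerritory fields (substitute whatever your payload actually uses):

```python
def sort_by_hierarchy(records):
    """Order territories so every parent precedes its children.
    Assumes hypothetical TerritoryName / ParentTerritory fields."""
    by_name = {r["TerritoryName"]: r for r in records}

    def depth(record):
        # Walk up the parent chain; top-level territories have depth 0.
        level, parent = 0, record.get("ParentTerritory")
        while parent:
            level += 1
            parent = by_name[parent].get("ParentTerritory")
        return level

    return sorted(records, key=depth)

ordered = sort_by_hierarchy([
    {"TerritoryName": "Bay Area", "ParentTerritory": "West"},
    {"TerritoryName": "West", "ParentTerritory": None},
])
```

Sorting by depth guarantees a parent is always submitted in an earlier (or the same) batch as its children.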
Verification and Monitoring:
After implementing validation:
- Test with a small batch (50 records) including known invalid data
- Verify that validation errors are properly returned and logged
- Confirm that invalid records are rejected and don’t appear in the territory table
- Monitor API response times to ensure validation doesn’t cause unacceptable delays
- Set up alerts for validation failure rates exceeding threshold (e.g., >10% failure rate)
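The alert threshold check is straightforward; a sketch:

```python
FAILURE_RATE_THRESHOLD = 0.10  # alert when more than 10% of records fail

def should_alert(total, failed, threshold=FAILURE_RATE_THRESHOLD):
    """True when the validation failure rate exceeds the alert threshold."""
    if total == 0:
        return False
    return failed / total > threshold

# e.g. the ~340 estimated invalid records out of 1,247 imported
needs_alert = should_alert(1247, 340)
```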
Recommended Implementation Approach:
- Enable validation parameters in your API calls immediately
- Update execution contexts for all territory validation rules
- Implement error handling and quarantine workflow
- Test thoroughly with sample data before processing production imports
- Monitor performance and adjust batch sizes as needed
The combination of proper API parameters, correct execution context configuration, and robust error handling will ensure your data quality rules are enforced consistently across both UI and bulk import operations.