Here’s a complete solution addressing all aspects of reliable batch processing:
Pre-Import Validation: Implement a validation service that runs before batch submission:
-- Pseudocode: pre-import validation check
1. Export journal entries from source system to staging file
2. Query CloudSuite: SELECT account_code FROM chart_of_accounts
3. Compare source account codes against the CloudSuite list
4. Generate validation report listing missing accounts
5. If validation fails: alert accounting team, halt import
6. If validation passes: proceed with batch import
-- Prevents invalid batches from starting
Create a validation query:
SELECT DISTINCT s.account_code
FROM staging_journal_entries s
LEFT JOIN chart_of_accounts c ON s.account_code = c.account_code
WHERE c.account_code IS NULL;
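The same orphan-account check can be exercised end to end; here is a minimal sketch using Python's sqlite3 with an in-memory database. The table and column names mirror the query above, but the sample rows are invented for illustration — adapt them to your actual CloudSuite extract.

```python
import sqlite3

# Hypothetical sample data: '9999' does not exist in the chart of accounts.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE chart_of_accounts (account_code TEXT PRIMARY KEY);
    CREATE TABLE staging_journal_entries (entry_id INTEGER, account_code TEXT);
    INSERT INTO chart_of_accounts VALUES ('1000'), ('2000');
    INSERT INTO staging_journal_entries VALUES (1, '1000'), (2, '9999');
""")

# The validation query from above: account codes with no match in the chart.
missing = [row[0] for row in conn.execute("""
    SELECT DISTINCT s.account_code
    FROM staging_journal_entries s
    LEFT JOIN chart_of_accounts c ON s.account_code = c.account_code
    WHERE c.account_code IS NULL
""")]

if missing:
    # In production: alert the accounting team and halt the import here.
    print("Validation failed; missing accounts:", missing)
```

A nonempty `missing` list is the signal to stop before the batch ever reaches CloudSuite.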
Staging Table Pattern: Implement a three-tier staging architecture:
- journal_entry_staging: Raw import destination
  - All fields from the source system
  - import_batch_id, import_timestamp
  - status: 'PENDING'
- journal_entry_validation_errors: Failed validations
  - Copy of the staging record
  - error_type: 'FK_VIOLATION', 'INVALID_AMOUNT', etc.
  - error_message: Detailed constraint violation info
  - status: 'ERROR'
- journal_entry_production: Validated entries
  - Same schema as staging
  - status: 'IMPORTED'
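A minimal DDL sketch for the three tiers, executed here via sqlite3 so it can be verified. The column lists are trimmed to a few illustrative fields (entry_id, account_code, amount); your real tables would mirror the full journal-entry layout from the source system.

```python
import sqlite3

# Illustrative three-tier schema; column lists are an assumption, not the
# real CloudSuite layout.
ddl = """
CREATE TABLE journal_entry_staging (
    entry_id         INTEGER PRIMARY KEY,
    account_code     TEXT NOT NULL,
    amount           NUMERIC NOT NULL,
    import_batch_id  INTEGER NOT NULL,
    import_timestamp TEXT DEFAULT CURRENT_TIMESTAMP,
    status           TEXT DEFAULT 'PENDING'
);
CREATE TABLE journal_entry_validation_errors (
    entry_id         INTEGER,
    account_code     TEXT,
    amount           NUMERIC,
    import_batch_id  INTEGER,
    error_type       TEXT,   -- 'FK_VIOLATION', 'INVALID_AMOUNT', ...
    error_message    TEXT,
    status           TEXT DEFAULT 'ERROR'
);
CREATE TABLE journal_entry_production (
    entry_id         INTEGER PRIMARY KEY,
    account_code     TEXT NOT NULL,
    amount           NUMERIC NOT NULL,
    import_batch_id  INTEGER NOT NULL,
    status           TEXT DEFAULT 'IMPORTED'
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
tables = {r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
```

Keeping the error table's columns a superset of staging's (staging columns plus error_type/error_message) is what lets the validation pass below copy rows over with a simple `SELECT s.*, ...`.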
Validation stored procedure (illustrative, MySQL-style syntax):
CREATE PROCEDURE validate_journal_entries(IN p_batch_id INT)
BEGIN
    -- Pass 1: log entries whose account code is missing from the chart of accounts
    INSERT INTO journal_entry_validation_errors
    SELECT s.*, 'FK_VIOLATION', 'Invalid account code'
    FROM journal_entry_staging s
    WHERE s.import_batch_id = p_batch_id
      AND NOT EXISTS (
          SELECT 1 FROM chart_of_accounts c
          WHERE c.account_code = s.account_code
      );

    -- Pass 2: promote only entries with no logged validation errors
    INSERT INTO journal_entry_production
    SELECT s.*
    FROM journal_entry_staging s
    WHERE s.import_batch_id = p_batch_id
      AND NOT EXISTS (
          SELECT 1 FROM journal_entry_validation_errors e
          WHERE e.entry_id = s.entry_id
      );
END;
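The two-pass logic can be demonstrated without a stored procedure by running the same statements directly; here is a sketch via sqlite3. The schemas are cut down to a few columns and the sample rows (batch 42, account '9999') are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE chart_of_accounts (account_code TEXT PRIMARY KEY);
    CREATE TABLE journal_entry_staging (
        entry_id INTEGER, account_code TEXT, amount NUMERIC,
        import_batch_id INTEGER);
    CREATE TABLE journal_entry_validation_errors (
        entry_id INTEGER, account_code TEXT, amount NUMERIC,
        import_batch_id INTEGER, error_type TEXT, error_message TEXT);
    CREATE TABLE journal_entry_production (
        entry_id INTEGER, account_code TEXT, amount NUMERIC,
        import_batch_id INTEGER);
    INSERT INTO chart_of_accounts VALUES ('1000');
    INSERT INTO journal_entry_staging VALUES
        (1, '1000', 250.00, 42),   -- valid account code
        (2, '9999', 100.00, 42);   -- unknown account code
""")

batch_id = 42
with conn:  # both passes commit together
    # Pass 1: log entries with unknown account codes
    conn.execute("""
        INSERT INTO journal_entry_validation_errors
        SELECT s.*, 'FK_VIOLATION', 'Invalid account code'
        FROM journal_entry_staging s
        WHERE s.import_batch_id = ?
          AND NOT EXISTS (SELECT 1 FROM chart_of_accounts c
                          WHERE c.account_code = s.account_code)""", (batch_id,))
    # Pass 2: promote only entries with no logged errors
    conn.execute("""
        INSERT INTO journal_entry_production
        SELECT s.* FROM journal_entry_staging s
        WHERE s.import_batch_id = ?
          AND NOT EXISTS (SELECT 1 FROM journal_entry_validation_errors e
                          WHERE e.entry_id = s.entry_id)""", (batch_id,))

errors = [r[0] for r in conn.execute(
    "SELECT entry_id FROM journal_entry_validation_errors")]
promoted = [r[0] for r in conn.execute(
    "SELECT entry_id FROM journal_entry_production")]
```

Entry 2 lands in the error table with a readable reason, while entry 1 is promoted — the bad row never blocks the good one.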
Constraint Error Handling: Configure how constraint violations are handled during import, so a single bad row is logged and reported instead of aborting the run (illustrative configuration):
<ErrorHandling>
<OnConstraintViolation>
<LogToErrorTable>true</LogToErrorTable>
<EmailNotification>accounting-team@company.com</EmailNotification>
<ContinueProcessing>true</ContinueProcessing>
</OnConstraintViolation>
</ErrorHandling>
Batch Transaction Management: Implement mini-batch processing:
- Split 5000 entries into 10 batches of 500
- Process each batch in separate transaction
- Track completion status in batch_control table
- On failure: completed batches remain committed
- Resume processing from failed batch after fixing errors
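The steps above can be sketched as follows, again via sqlite3. The function name `process_in_sub_batches` is hypothetical, the tables follow the earlier examples, and the sub-batch size of 500 from the text is shrunk to 4 for the demo.

```python
import sqlite3

def process_in_sub_batches(conn, entries, batch_id, size=500):
    """Commit each sub-batch in its own transaction; record progress so a
    failed run can be resumed after the errors are fixed."""
    for n, start in enumerate(range(0, len(entries), size), start=1):
        done = conn.execute(
            "SELECT 1 FROM batch_control "
            "WHERE batch_id=? AND sub_batch_number=? AND status='COMPLETE'",
            (batch_id, n)).fetchone()
        if done:
            continue  # resume support: skip sub-batches already committed
        sub = entries[start:start + size]
        with conn:  # one transaction per sub-batch
            conn.executemany(
                "INSERT INTO journal_entry_production VALUES (?, ?)", sub)
            conn.execute(
                "INSERT INTO batch_control VALUES "
                "(?, ?, ?, ?, 0, 'COMPLETE', CURRENT_TIMESTAMP)",
                (batch_id, n, len(sub), len(sub)))

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE journal_entry_production (entry_id INTEGER, amount NUMERIC);
    CREATE TABLE batch_control (
        batch_id INT, sub_batch_number INT, total_entries INT,
        processed_entries INT, error_entries INT, status VARCHAR(20),
        completed_timestamp TIMESTAMP);
""")
entries = [(i, 10.0) for i in range(1, 11)]                # 10 sample entries
process_in_sub_batches(conn, entries, batch_id=1, size=4)  # 3 sub-batches
n_rows = conn.execute("SELECT COUNT(*) FROM journal_entry_production").fetchone()[0]
n_subs = conn.execute("SELECT COUNT(*) FROM batch_control").fetchone()[0]
```

If a sub-batch raises mid-run, `with conn` rolls back only that sub-batch; the earlier COMPLETE rows in batch_control mean a rerun picks up exactly where the failure occurred.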
Batch control tracking:
CREATE TABLE batch_control (
    batch_id INT,
    sub_batch_number INT,
    total_entries INT,
    processed_entries INT,
    error_entries INT,
    status VARCHAR(20),
    completed_timestamp TIMESTAMP,
    PRIMARY KEY (batch_id, sub_batch_number)
);
This architecture catches account code mismatches during pre-validation, logs any entries that slip through to the error table without failing the entire batch, and preserves partial batch completion. Your month-end close becomes resilient: even if 10 of 5,000 entries have invalid account codes, the other 4,990 import successfully and you get a clear error report for manual correction.