Demand planning forecast update workflow fails on data import validation

Our automated forecast update workflow in Infor SCM 2022.2 is failing during the data import phase. We receive forecast data from a third-party analytics platform in JSON format, which the workflow imports and processes nightly.

Since last week, the import validation step fails with “invalid JSON structure” errors, even though the source file format hasn’t changed. Here’s a sample of the failing import file:

{
  "forecastData": [
    {"productId": "P-12345", "period": "2025-02", "quantity": 1500}
  ]
}

The error message indicates the JSON schema validation is failing, but I’ve compared our current files to previous successful imports and they appear identical. This is disrupting our forecast updates and causing planning delays. I suspect the issue might be related to file encoding or changes in how the third-party system exports data, but I’m not sure how to diagnose this properly.

I’ve dealt with this exact scenario multiple times. Here’s a comprehensive approach covering schema validation, file encoding, third-party data compatibility, error handling, and testing.

JSON Schema Validation Fix: First, update your workflow’s JSON schema definition to be more flexible while maintaining data integrity. The strict schema in 2022.2 can reject files with minor variations. Modify your schema to allow additional properties:

{
  "type": "object",
  "required": ["forecastData"],
  "properties": {
    "forecastData": {"type": "array"}
  },
  "additionalProperties": true
}

This allows the third-party system to add new fields without breaking your import. However, still validate required fields explicitly. In your import workflow configuration, enable “Lenient Schema Validation” mode, which accepts valid data even if extra fields are present.

Next, review your schema to ensure it matches the current data structure exactly. Check field names (case-sensitive), data types, and required vs. optional fields. In your workflow’s Import Configuration, enable detailed validation logging to see exactly which field is failing validation.
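For reference, here’s what a fuller schema along those lines might look like: strict on the required fields and formats, tolerant of extra properties. The field names mirror the sample file above, so adjust them to your actual structure:

```json
{
  "type": "object",
  "required": ["forecastData"],
  "additionalProperties": true,
  "properties": {
    "forecastData": {
      "type": "array",
      "items": {
        "type": "object",
        "required": ["productId", "period", "quantity"],
        "additionalProperties": true,
        "properties": {
          "productId": {"type": "string"},
          "period": {"type": "string", "pattern": "^[0-9]{4}-[0-9]{2}$"},
          "quantity": {"type": "number", "minimum": 0}
        }
      }
    }
  }
}
```

The `pattern` on `period` catches silent date-format changes (like "202502") at the validation step instead of letting them flow into planning.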

File Encoding Check and Preprocessing: A UTF-8 BOM marker is the most likely culprit. Add a preprocessing step to your workflow that strips BOM markers before JSON validation. Create a custom import preprocessor script:

function preprocessFile(inputFile) {
  // readFile: the workflow engine's file-reading helper
  var content = readFile(inputFile);
  // Strip a leading UTF-8 BOM (U+FEFF) so the JSON parser never sees it
  content = content.replace(/^\uFEFF/, '');
  return content;
}

This removes the BOM marker (\uFEFF) from the beginning of the file. Configure your workflow to call this preprocessor before the JSON validation step. Alternatively, ask the third-party vendor to export without a BOM; many systems expose this as a configurable option.

Also check for other encoding issues: verify the file is actually UTF-8 and not UTF-16 or another encoding. Add encoding detection to your preprocessor to handle multiple encoding types automatically.

Third-Party Data Compatibility: Since the third-party system may have changed their export format, implement compatibility checks in your workflow. Add a validation step that verifies:

  • All required fields are present
  • Date formats match expected pattern (YYYY-MM)
  • Numeric values are within acceptable ranges
  • Product IDs exist in your master data
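As a sketch, a per-record check covering those four points might look like the following. `validateRecord` and `knownProductIds` are illustrative names, not Infor APIs; the master-data lookup would come from your own environment:

```javascript
// Sketch of a per-record compatibility check. knownProductIds stands in
// for a lookup against your master data; all names here are illustrative.
const PERIOD_PATTERN = /^\d{4}-\d{2}$/;

function validateRecord(record, knownProductIds) {
  const errors = [];
  for (const field of ['productId', 'period', 'quantity']) {
    if (!(field in record)) errors.push('missing field: ' + field);
  }
  if ('period' in record && !PERIOD_PATTERN.test(record.period)) {
    errors.push('bad period format: ' + record.period);
  }
  if ('quantity' in record &&
      (typeof record.quantity !== 'number' || record.quantity < 0)) {
    errors.push('quantity out of range: ' + record.quantity);
  }
  if ('productId' in record && !knownProductIds.has(record.productId)) {
    errors.push('unknown productId: ' + record.productId);
  }
  return errors;
}
```

Returning a list of errors per record (rather than failing on the first) makes the nightly failure logs much easier to triage.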

Create a compatibility layer that transforms the third-party data format to your internal format. This insulates your workflow from vendor changes. For example, if they change date format from “2025-02” to “202502”, your compatibility layer converts it automatically.

Implement version detection by having the third-party system include a format version field in their export. Your workflow can then apply the appropriate transformation logic based on the version.
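A minimal sketch of that version dispatch, assuming the vendor agrees to add a `formatVersion` field; the version numbers and the v2 date fix are illustrative:

```javascript
// Version-aware compatibility layer. "formatVersion" is a field you'd
// ask the vendor to add; the versions and transforms here are examples.
const transforms = {
  // v1: dates already in "YYYY-MM" form, pass records through unchanged
  '1': function (rec) { return rec; },
  // v2: vendor switched dates to "YYYYMM", so re-insert the hyphen
  '2': function (rec) {
    return Object.assign({}, rec, {
      period: rec.period.slice(0, 4) + '-' + rec.period.slice(4)
    });
  }
};

function normalizeExport(payload) {
  const version = payload.formatVersion || '1';
  const transform = transforms[version];
  if (!transform) {
    throw new Error('unsupported export format version: ' + version);
  }
  return payload.forecastData.map(transform);
}
```

Unknown versions fail loudly instead of silently importing malformed data, which is exactly the behavior you want for a nightly unattended job.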

Error Handling and Monitoring: Enhance your workflow’s error handling to provide better diagnostics. When validation fails, log:

  • The exact line/character where parsing failed
  • The expected vs. actual data structure
  • File encoding and size
  • Source system version/export timestamp

Set up alerts for import failures so you’re notified immediately rather than discovering issues the next day. Create a fallback process that retries the import with progressively more lenient validation if strict validation fails.
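The fallback idea can be sketched as a list of validators tried from strictest to most lenient, recording which tier finally accepted the file; the tier structure and validator signatures here are assumptions, not Infor APIs:

```javascript
// Progressive fallback: try validators in order from strictest to most
// lenient, returning the data plus the name of the tier that accepted it.
// Each validator is a function that returns parsed data or throws.
function importWithFallback(content, validators /* [{name, validate}] */) {
  for (const tier of validators) {
    try {
      return { tier: tier.name, data: tier.validate(content) };
    } catch (err) {
      // this tier rejected the file; fall through to the next one
    }
  }
  throw new Error('all validation tiers failed');
}
```

Logging the accepted tier name lets you see when imports are surviving only on lenient validation, which is itself an early-warning signal of vendor-side changes.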

Testing and Validation: Before deploying these fixes to production, test with multiple file variations:

  • Files with BOM markers
  • Files with extra fields
  • Files with different date/number formats
  • Files from different versions of the third-party system

This comprehensive approach will make your forecast import workflow resilient to third-party data format changes while maintaining data quality.

Have you checked the actual file encoding? Sometimes third-party systems change their export encoding from UTF-8 to UTF-8 BOM or Windows-1252, which can cause JSON parsing to fail even though the structure looks correct. Use a hex editor to check for BOM markers at the start of the file.

I checked the encoding and found that the files do have a UTF-8 BOM marker now, which they didn’t have before. The third-party vendor must have changed their export settings. How do I configure the import workflow to handle BOM markers? Should I strip them before import or can Infor SCM handle them automatically?

Infor SCM’s JSON parser in 2022.2 doesn’t automatically strip BOM markers, so you’ll need to handle that in your import workflow. You can either add a preprocessing step to remove the BOM before validation, or configure your third-party vendor to export without BOM. The preprocessing approach is more robust since it handles any encoding variations.

Another thing to verify is the JSON schema definition in your workflow. Infor SCM 2022.2 has strict schema validation, and if the third-party system added any new fields or changed field order, it might trigger validation failures. Check if your schema allows additional properties or if it’s locked to a specific structure. Also look at the import logs - they usually show exactly which field or structure is causing the validation error, not just a generic “invalid JSON” message.

Also check if your third-party system is including any invisible characters or line breaks that weren’t there before. We had an issue where the vendor’s new export tool added carriage returns in unexpected places, which broke our JSON structure validation even though the file looked fine in a text editor. Use a JSON validator tool to check the raw file before it enters your workflow.