Change control JSON payload mapping fails when syncing with external PLM system

Our change-control module integration with an external PLM system is failing during JSON payload transformation. The workflow triggers correctly, but the data mapping step returns schema validation errors.

The PLM system expects a specific nested structure for change requests, but ETQ’s outbound JSON doesn’t match. We’re getting errors like:

Validation failed: Required field 'changeItems.partRevisions' missing
Expected array at 'affectedDocuments', got object

We’ve configured field transformation mapping in the integration middleware, but nested arrays aren’t being handled correctly. The schema validation fails before the payload even reaches the PLM API. Has anyone successfully mapped ETQ change control data to complex nested JSON schemas?

What middleware are you using for the transformation? Some tools handle nested array mapping better than others. Also, are you using ETQ’s built-in REST integration or a custom middleware layer?

We’re using MuleSoft as the integration middleware. ETQ sends the change request data via REST API, MuleSoft transforms it, and forwards it to the PLM system. The transformation logic is written in DataWeave, but nested array handling is where we’re stuck: the ‘affectedDocuments’ field in ETQ is a single object, while PLM expects an array of objects.
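For the object-vs-array mismatch described above, the usual fix is a normalization helper that wraps a single object in an array and passes arrays through unchanged (the DataWeave 2.0 idiom would be `if (payload.affectedDocuments is Array) payload.affectedDocuments else [payload.affectedDocuments]`). Here's a minimal sketch in plain Python to show the pattern; the field names come from this thread, but the surrounding structure is illustrative, not ETQ's actual schema:

```python
def ensure_array(value):
    """Wrap a single object in a list; pass lists through; treat None as empty."""
    if value is None:
        return []
    return value if isinstance(value, list) else [value]

def transform_change_request(etq_payload: dict) -> dict:
    """Map ETQ's outbound shape to the nested structure the PLM side expects.

    Hypothetical mapping for illustration only -- adapt the paths to your
    actual ETQ payload and PLM schema.
    """
    return {
        "changeItems": {
            # PLM requires this to be present as an array even if ETQ omits it
            "partRevisions": ensure_array(
                etq_payload.get("changeItems", {}).get("partRevisions")
            ),
        },
        # ETQ emits a single object here; PLM expects an array of objects
        "affectedDocuments": ensure_array(etq_payload.get("affectedDocuments")),
    }

# Example: a single-object 'affectedDocuments' becomes a one-element array
etq = {"affectedDocuments": {"docId": "SOP-101", "rev": "C"}}
transform_change_request(etq)
```

The same `ensure_array` guard applied in DataWeave should resolve both validation errors, since it handles the zero, one, and many cases uniformly.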

Schema validation should happen after transformation, not before. If validation is failing before the payload reaches PLM, your middleware might be validating against the schema prematurely. Configure your integration flow to transform first, then validate against the target schema. This ensures the validation sees the properly mapped structure.