The issue you’re experiencing stems from three interconnected problems with bulk import validation, server-side duplicate detection, and BOM synchronization in TC 12.3.
Bulk Import Mapping Validation Issue:
The Bulk Import Utility performs initial validation against your CSV mapping but doesn’t execute the full server-side validation chain until processing begins. This creates a false sense of security. Your mapping validation passes because it only checks format and required fields, not actual business rule constraints.
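To see why a mapping check can pass while the server still rejects rows, consider a minimal sketch. All class, field, and column names here are illustrative, not part of the Bulk Import Utility: the point is that the client-side stage only inspects the row itself, while duplicate detection requires server state.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustration of the two validation layers: the mapping stage checks
// format and required fields only; duplicate detection needs the server's
// view of existing part numbers (here simulated by a plain Set).
public class MappingCheck {
    static final List<String> REQUIRED = List.of("number", "name", "context");

    // Format-level check, analogous to what the mapping validation performs.
    public static boolean formatValid(Map<String, String> row) {
        for (String col : REQUIRED) {
            String v = row.get(col);
            if (v == null || v.isBlank()) {
                return false;
            }
        }
        return true;
    }

    // Business-rule check: fails if the number already exists server-side,
    // which the mapping stage never tests.
    public static boolean passesServer(Map<String, String> row,
                                       Set<String> existingNumbers) {
        return !existingNumbers.contains(row.get("number"));
    }
}
```

A row whose number already exists as an iteration or working copy passes `formatValid` but fails `passesServer`, which is exactly the gap that surfaces as a mid-import exception.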
Server-Side Duplicate Detection Root Cause:
The DuplicatePartNumberException is likely triggered by parts in iteration states or working copies that exist in the version graph but aren’t returned by standard queries. The server validation uses PartHelper.service.checkUniqueness() which scans the entire version tree, including checked-out copies and iterations.
// Pre-validation before import: query by part number so existing
// objects are caught before the bulk import starts.
import wt.fc.PersistenceHelper;
import wt.fc.QueryResult;
import wt.part.WTPart;
import wt.query.QuerySpec;
import wt.query.SearchCondition;

QuerySpec qSpec = new QuerySpec(WTPart.class);
qSpec.setAdvancedQueryEnabled(true);
qSpec.appendWhere(
        new SearchCondition(WTPart.class, WTPart.NUMBER,
                SearchCondition.EQUAL, partNumber),  // partNumber: the CSV value being checked
        new int[] { 0 });
QueryResult qr = PersistenceHelper.manager.find(qSpec);
if (qr.hasMoreElements()) {
    // Handle duplicate, including all iterations
}
BOM Synchronization Dependencies Solution:
To maintain BOM integrity during bulk imports, implement a three-phase approach:
- Pre-Import Validation Phase: Run comprehensive duplicate checks using the same API the server uses (PartHelper.service.checkUniqueness()). This catches all versions, iterations, and working copies before the import starts.
- Staged Import with Context Isolation: Break your 500+ part batches into context-specific groups of 50-100 parts. Import all parts for a single product context together, then commit before moving to the next context. This ensures BOM parent-child relationships are available when needed.
- Post-Import BOM Reconciliation: After each batch commits successfully, run a verification script that confirms all BOM links were established. If any are missing due to timing issues, create them in a separate pass.
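The staged-import phase above reduces to a partitioning step, sketched below in plain Java. The `PartRow` record and batch size are illustrative stand-ins for your CSV rows and configuration; in a real import, each emitted batch would be loaded and committed in its own transaction before the next context is touched.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of phase 2: group rows by product context, then slice each
// context group into fixed-size batches so a whole context commits
// before the next one begins.
public class ImportBatcher {
    // Minimal stand-in for one CSV row; fields are hypothetical.
    public record PartRow(String number, String context) {}

    public static List<List<PartRow>> batches(List<PartRow> rows, int batchSize) {
        // Group by context first so BOM parents and children land together.
        Map<String, List<PartRow>> byContext = new LinkedHashMap<>();
        for (PartRow r : rows) {
            byContext.computeIfAbsent(r.context(), k -> new ArrayList<>()).add(r);
        }
        // Slice each context group into batches of at most batchSize.
        List<List<PartRow>> out = new ArrayList<>();
        for (List<PartRow> group : byContext.values()) {
            for (int i = 0; i < group.size(); i += batchSize) {
                out.add(group.subList(i, Math.min(i + batchSize, group.size())));
            }
        }
        return out;
    }
}
```

Because batches never span contexts, a failure in one batch leaves every other context's BOM structure either fully committed or fully absent, which simplifies the reconciliation pass.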
# Batch commit configuration in import properties
wt.bulk.import.batchSize=100
wt.bulk.import.commitInterval=50
wt.bulk.import.validateUniqueness=true
Additional Configuration:
Modify your site.xconf to enable detailed validation logging and adjust transaction timeouts for large imports:
<Property name="wt.bulk.import.validation.detailed"
value="true"/>
<Property name="wt.bulk.import.transaction.timeout"
value="600"/>
Disable any custom event handlers during bulk imports by setting wt.bulk.import.disableCustomValidators=true in your import configuration. Re-enable them after import completes.
This approach addresses all three focus areas: validates mappings using server-side logic, implements proper duplicate detection that matches server behavior, and maintains BOM synchronization by ensuring complete part sets are available before linking. The staged approach prevents partial imports that break BOM dependencies.