Bulk part import in part management fails with duplicate part numbers

We’re experiencing a critical issue with bulk part imports using the Bulk Import Utility in TC 12.3. When importing large batches (500+ parts), the process fails midway with DuplicatePartNumberException errors, even though we’ve validated the import file beforehand.

The mapping validation appears to pass initially, but the server-side duplicate detection kicks in during processing and halts the entire batch. This is blocking our BOM synchronization workflow because incomplete part sets can’t be linked properly.

ERROR: DuplicatePartNumberException at ImportService.java:234
at com.ptc.windchill.part.ImportService.validateUniqueness
at com.ptc.windchill.bulk.BulkImportProcessor.processBatch
Caused by: Part number 'P-45892' already exists in context

The strange part is that these part numbers don’t exist when we query the database directly. Has anyone dealt with server-side validation issues during bulk imports? We need to understand why the duplicate detection is triggering incorrectly and how to ensure BOM dependencies are maintained.

The issue you’re experiencing stems from three interconnected problems with bulk import validation, server-side duplicate detection, and BOM synchronization in TC 12.3.

Bulk Import Mapping Validation Issue: The Bulk Import Utility performs initial validation against your CSV mapping but doesn’t execute the full server-side validation chain until processing begins. This creates a false sense of security. Your mapping validation passes because it only checks format and required fields, not actual business rule constraints.
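One business rule the mapping validation will never catch is duplication *within the import file itself* after normalization. As a minimal client-side sanity check (the trim-and-uppercase normalization rule here is an assumption — match it to however your server compares numbers):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class ImportFilePrecheck {

    // Returns part numbers that appear more than once in the import
    // rows after normalization (trim + uppercase). Assumes the
    // server-side uniqueness check is case-insensitive, which the
    // mapping validation does not replicate.
    public static List<String> findInFileDuplicates(List<String> partNumbers) {
        Set<String> seen = new LinkedHashSet<>();
        List<String> duplicates = new ArrayList<>();
        for (String raw : partNumbers) {
            String normalized = raw.trim().toUpperCase();
            if (!seen.add(normalized) && !duplicates.contains(normalized)) {
                duplicates.add(normalized);
            }
        }
        return duplicates;
    }

    public static void main(String[] args) {
        List<String> rows = List.of("P-45892", "p-45892 ", "P-10001");
        System.out.println(findInFileDuplicates(rows)); // [P-45892]
    }
}
```

Rows like "P-45892" and "p-45892 " pass format checks individually but collide once the server normalizes them, so weeding them out before upload avoids one class of mid-batch failures.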

Server-Side Duplicate Detection Root Cause: The DuplicatePartNumberException is likely triggered by parts in iteration states or working copies that exist in the version graph but aren’t returned by standard queries. The server validation uses PartHelper.service.checkUniqueness() which scans the entire version tree, including checked-out copies and iterations.

// Pre-validation before import: query every version and iteration,
// not just the latest, so checked-out working copies are caught too
import wt.fc.PersistenceHelper;
import wt.fc.QueryResult;
import wt.part.WTPart;
import wt.query.QuerySpec;
import wt.query.SearchCondition;

QuerySpec qSpec = new QuerySpec(WTPart.class);
qSpec.setAdvancedQueryEnabled(true);
qSpec.appendWhere(new SearchCondition(WTPart.class,
    WTPart.NUMBER, SearchCondition.EQUAL, partNumber),
    new int[]{0});
QueryResult qr = PersistenceHelper.manager.find(qSpec);
if (qr.hasMoreElements()) {
    // A duplicate exists somewhere in the version graph:
    // flag this row and exclude it from the import batch
}

BOM Synchronization Dependencies Solution: To maintain BOM integrity during bulk imports, implement a three-phase approach:

  1. Pre-Import Validation Phase: Run comprehensive duplicate checks using the same API the server uses (PartHelper.service.checkUniqueness()). This catches all versions, iterations, and working copies before import starts.

  2. Staged Import with Context Isolation: Break your 500+ part batches into context-specific groups of 50-100 parts. Import all parts for a single product context together, then commit before moving to the next context. This ensures BOM parent-child relationships are available when needed.

  3. Post-Import BOM Reconciliation: After each batch commits successfully, run a verification script that validates all BOM links are established. If any are missing due to timing issues, create them in a separate pass.
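Phases 2 and 3 can be sketched in plain Java. This is only the planning logic — the `Row` record, the context grouping, and the `"parent->child"` link key are illustrative assumptions; the actual creation and verification of parts and BOM links would go through the Windchill APIs:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class StagedImportPlanner {

    // One import row: its product context, its part number, and its
    // BOM parent's part number (null for top-level parts).
    public record Row(String context, String number, String parent) {}

    // Phase 2: group rows by context, then split each context's rows
    // into chunks no larger than batchSize. Each chunk is committed
    // on its own, and a context is finished before the next begins.
    public static List<List<Row>> planBatches(List<Row> rows, int batchSize) {
        Map<String, List<Row>> byContext = new LinkedHashMap<>();
        for (Row r : rows) {
            byContext.computeIfAbsent(r.context(), k -> new ArrayList<>()).add(r);
        }
        List<List<Row>> batches = new ArrayList<>();
        for (List<Row> contextRows : byContext.values()) {
            for (int i = 0; i < contextRows.size(); i += batchSize) {
                batches.add(contextRows.subList(i,
                    Math.min(i + batchSize, contextRows.size())));
            }
        }
        return batches;
    }

    // Phase 3: compare the parent->child links the import file expects
    // against the links actually found after commit, and return the
    // rows whose links must be created in a reconciliation pass.
    public static List<Row> missingLinks(List<Row> rows, Set<String> establishedLinks) {
        List<Row> missing = new ArrayList<>();
        for (Row r : rows) {
            if (r.parent() != null
                    && !establishedLinks.contains(r.parent() + "->" + r.number())) {
                missing.add(r);
            }
        }
        return missing;
    }
}
```

Grouping by context before chunking is what guarantees a child's parent is either in the same chunk or already committed, so the reconciliation pass in phase 3 only has to repair timing gaps, not ordering errors.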

# Batch commit configuration in the import properties file
wt.bulk.import.batchSize=100
wt.bulk.import.commitInterval=50
wt.bulk.import.validateUniqueness=true

Additional Configuration: Modify your site.xconf to enable detailed validation logging and adjust transaction timeouts for large imports:

<Property name="wt.bulk.import.validation.detailed"
    value="true"/>
<Property name="wt.bulk.import.transaction.timeout"
    value="600"/>

Disable any custom event handlers during bulk imports by setting wt.bulk.import.disableCustomValidators=true in your import configuration. Re-enable them after import completes.

This approach addresses all three focus areas: validates mappings using server-side logic, implements proper duplicate detection that matches server behavior, and maintains BOM synchronization by ensuring complete part sets are available before linking. The staged approach prevents partial imports that break BOM dependencies.

I had this exact scenario in TC 12.3 and traced it to the revision handling logic. When parts exist in a different lifecycle state or revision, the standard query won’t find them, but the server-side validation does. Check your import configuration for revision scheme handling.

I’ve seen this before. The duplicate detection might be checking against soft-deleted parts that are still in the database but not visible through standard queries. Try running a cleanup on your part master table first, especially if you’ve had recent deletions or failed imports that left orphaned records.

Look at your import file structure carefully. Are you including the full part context path in your mapping? Sometimes the Bulk Import Utility interprets relative contexts differently than expected, causing it to attempt creation in the wrong container where duplicates actually exist.

The BOM synchronization dependency you mentioned is key here. When bulk imports fail midway, you end up with partial part sets that can’t satisfy BOM structure requirements. I’d recommend implementing a pre-import validation step that queries against the same validation logic the server uses, not just database queries. This catches issues before you start the actual import process.