Running into a blocking issue with our inventory optimization parameter migration using FBDI in Fusion Cloud 23D. The import process fails with unique constraint violations on what appears to be a composite key.
Error from the import log:
ORA-00001: unique constraint violated
Table: INV_OPT_PARAMETERS
Constraint: INV_OPT_PARAMS_UK1
Duplicate key: ORG_101|ITEM_A2345|WAREHOUSE_01
We’re migrating approximately 125,000 inventory optimization parameter records from our legacy WMS. The FBDI template upload fails after processing about 8,000 records. When I check our source data, I don’t see obvious duplicates: each record appears unique when I check item number, organization, and warehouse individually.
Is there a way to identify which specific combination of fields forms the composite key that’s causing this violation? Also, should we be running duplicate checks before the FBDI upload, or does Fusion provide any validation tools?
Good point about the composite nature. I ran a grouping query and found about 340 duplicate combinations out of 125,000 records. Most are cases where we have different safety stock levels or reorder points for the same item-org-warehouse combination, probably entered at different times by different planners. How should we decide which record to keep when there are duplicates?
The composite key for inventory optimization parameters typically includes Organization, Item, Subinventory, and sometimes Planning Group. Your error shows ORG_101|ITEM_A2345|WAREHOUSE_01, which suggests the uniqueness constraint spans these three fields together, not individually.
Check if your source data has multiple records with the same org-item-warehouse combination but different parameter values. That’s usually the culprit.
I’ve dealt with this exact scenario during our 23D upgrade migration. The FBDI import doesn’t provide pre-validation for composite key uniqueness; it simply fails when it hits the duplicate during the database insert. You need to implement your own pre-upload duplicate detection.
Write a SQL query against your staging data that groups by all the key fields and identifies where count > 1. That will show you exactly which combinations are duplicated. In our case, we found that different planning parameters for the same item-org-subinventory combination had been entered over time in the legacy system, creating logical duplicates.
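A minimal Python sketch of that grouping check, for cases where the staging data is exported to a file rather than queried directly. The field names (`org`, `item`, `subinv`) and sample values are illustrative; against a real staging table the equivalent is a `GROUP BY org, item, subinv HAVING COUNT(*) > 1` query.

```python
from collections import Counter

# Hypothetical staging rows; field names and values are placeholders.
rows = [
    {"org": "ORG_101", "item": "ITEM_A2345", "subinv": "WAREHOUSE_01", "safety_stock": 50},
    {"org": "ORG_101", "item": "ITEM_A2345", "subinv": "WAREHOUSE_01", "safety_stock": 75},
    {"org": "ORG_101", "item": "ITEM_B9001", "subinv": "WAREHOUSE_02", "safety_stock": 10},
]

# Count each composite-key combination; any count > 1 is a logical duplicate
# even though the full records differ in their parameter values.
key_counts = Counter((r["org"], r["item"], r["subinv"]) for r in rows)
duplicates = {key: n for key, n in key_counts.items() if n > 1}
print(duplicates)  # {('ORG_101', 'ITEM_A2345', 'WAREHOUSE_01'): 2}
```

Note that the duplicate check groups only on the key fields, not the whole record; that is exactly why records that look unique row-by-row can still collide on the composite key.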
Don’t forget to check for case sensitivity and whitespace issues too. We had “duplicates” that were actually caused by trailing spaces in item numbers or inconsistent casing in organization codes. Your legacy database might treat 'WAREHOUSE_01' and 'WAREHOUSE_01 ' (with a trailing space) as different values, but Oracle Fusion’s composite key might not. Run TRIM() and UPPER() on all key fields before your duplicate check.
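The normalization step is easy to mirror outside the database. A small sketch of the TRIM() + UPPER() cleanup, showing how three visually distinct legacy values collapse to a single key (sample values are illustrative):

```python
def normalize(value: str) -> str:
    """Mirror SQL TRIM() + UPPER(): strip surrounding whitespace, uppercase."""
    return value.strip().upper()

# Trailing-space and lowercase variants collapse to one canonical key,
# which is how "hidden" duplicates surface after normalization.
raw_keys = ["WAREHOUSE_01", "WAREHOUSE_01 ", "warehouse_01"]
print({normalize(k) for k in raw_keys})  # {'WAREHOUSE_01'}
```

Apply the same normalization to every field in the composite key before the grouping check, otherwise the duplicate counts will understate the real collisions.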
Create a business rule for duplicate resolution. Common approaches:
- Keep the most recent record (based on last_update_date)
- Keep the record with the highest safety stock (most conservative)
- Escalate to business users for manual review
We used approach #1 for 90% of duplicates and #3 for critical A-items. Document your decision logic in a migration runbook so it’s auditable and repeatable if you need to re-run the import.
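The “most recent wins” rule (approach #1) can be sketched as a simple group-and-keep pass. Field names and dates below are hypothetical; the same logic is usually expressed in SQL with a window function ranking by last_update_date within each key:

```python
from datetime import date

# Hypothetical duplicate records sharing one composite key;
# last_update_date drives the tie-break.
dupes = [
    {"org": "ORG_101", "item": "ITEM_A2345", "subinv": "WAREHOUSE_01",
     "safety_stock": 50, "last_update_date": date(2023, 3, 1)},
    {"org": "ORG_101", "item": "ITEM_A2345", "subinv": "WAREHOUSE_01",
     "safety_stock": 75, "last_update_date": date(2023, 9, 15)},
]

# Group by the composite key and keep only the most recently updated record.
survivors = {}
for r in dupes:
    key = (r["org"], r["item"], r["subinv"])
    if key not in survivors or r["last_update_date"] > survivors[key]["last_update_date"]:
        survivors[key] = r

# The September record survives; the March one is dropped.
print(survivors[("ORG_101", "ITEM_A2345", "WAREHOUSE_01")]["safety_stock"])  # 75
```

Whichever rule you pick, log the dropped records alongside the survivors so the runbook shows exactly what was discarded and why.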