CI-DS job fails on delta loads in logistics-mgmt (si-2302) due to duplicate keys in source data

Our CI-DS integration job for logistics-mgmt delta loads started failing after we upgraded to si-2302. The job runs successfully for initial loads but consistently fails on delta extractions with duplicate key violations.

Error from the job log:


Error Code: ORA-00001
Duplicate key in index LOGISTICS_SHIPMENT_PK
Key value: SHIP_2024_04_15_001

The CI-DS key mapping uses ShipmentID + CreatedDate as the composite key, which should be unique. However, we’re seeing the same key appearing multiple times in delta extractions when shipments are updated in the source S/4HANA system.

Our delta extraction logic is based on the standard change pointer mechanism, but it seems like updates to existing shipments are being treated as inserts rather than updates. This creates data gaps in our logistics planning as shipments aren’t reflecting current status.

Is there a known issue with CI-DS delta handling in si-2302, or do we need to adjust our key mapping strategy?

This is a common issue when the CI-DS extraction doesn’t properly distinguish between inserts and updates. The problem is likely in how your delta filter is configured. Check if your extraction query includes both the change timestamp AND the change type indicator from the source system.

In S/4HANA, shipment updates should set a change flag that CI-DS needs to read to determine whether to INSERT or UPDATE in IBP.

I checked the extraction query and we’re pulling the change timestamp (AEDAT field) but I don’t see any change type indicator being extracted. The S/4HANA table has a CHNGIND field that marks records as ‘I’ (insert), ‘U’ (update), or ‘D’ (delete). Should this be included in the CI-DS mapping?

Yes, absolutely include the CHNGIND field. Your CI-DS job needs it to determine the operation type. Without it, all delta records are treated as inserts by default, which explains your duplicate key errors.

You’ll need to modify your CI-DS data source to include this field and then configure the target mapping to use it for determining insert vs. update operations. In the CI-DS job configuration, map CHNGIND to the operation type parameter so IBP knows how to process each record.

Consider implementing duplicate filtering logic in your CI-DS transformation layer as a safety net. Even with proper change indicators, network issues or retry logic can sometimes cause the same delta record to be processed twice.

Add a staging table in your CI-DS flow that deduplicates records based on ShipmentID + LastChangeTimestamp before loading to the target IBP table. This prevents duplicate key errors even if the source sends redundant change records.

Also verify that your S/4HANA change pointer configuration is correctly set up for the shipment tables. Sometimes change pointers aren’t activated for all relevant fields, so updates to certain shipment attributes don’t trigger proper change records.

Run transaction BD52 in S/4HANA to check which fields have change document objects activated. For logistics shipments, you typically need change pointers on status fields, dates, and quantities at minimum.

Here’s a comprehensive solution addressing CI-DS key mapping, delta extraction logic, and duplicate filtering:

1. CI-DS Key Mapping Configuration:

Your current composite key (ShipmentID + CreatedDate) is problematic for delta loads because CreatedDate never changes on updates. Modify your key mapping strategy:

  • Primary Key: ShipmentID only (this should be unique)
  • Technical Key: LastChangeTimestamp, added for delta detection

In your CI-DS data source definition, configure:

SELECT
  SHIPMENT_ID,
  CREATED_DATE,
  LAST_CHANGE_TIMESTAMP,
  CHANGE_INDICATOR,
  STATUS,
  PLANNED_DATE
FROM S4_SHIPMENT_TABLE
WHERE LAST_CHANGE_TIMESTAMP > :LAST_EXTRACTION_TIME
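To see how the watermark-based delta filter behaves, here is a minimal, self-contained Python sketch using an in-memory SQLite database. The table and column names mirror the query above but are purely illustrative, not the real S/4HANA schema:

```python
import sqlite3

# Hypothetical stand-in for the S/4HANA shipment table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE s4_shipment_table (
        shipment_id TEXT PRIMARY KEY,
        created_date TEXT,
        last_change_timestamp TEXT,
        change_indicator TEXT,
        status TEXT
    )
""")
conn.executemany(
    "INSERT INTO s4_shipment_table VALUES (?, ?, ?, ?, ?)",
    [
        ("SHIP_001", "2024-04-10", "2024-04-10T08:00:00", "I", "CREATED"),
        ("SHIP_002", "2024-04-12", "2024-04-15T09:30:00", "U", "SHIPPED"),
    ],
)

# Delta filter: only rows changed since the last successful extraction.
last_extraction_time = "2024-04-14T00:00:00"
rows = conn.execute(
    "SELECT shipment_id, change_indicator FROM s4_shipment_table "
    "WHERE last_change_timestamp > ?",
    (last_extraction_time,),
).fetchall()
print(rows)  # → [('SHIP_002', 'U')]
```

Note that SHIP_002 is picked up with its change indicator intact; extracting CHANGE_INDICATOR alongside the timestamp is what later lets the job distinguish updates from inserts.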

2. Delta Extraction Logic Enhancement:

In your CI-DS job configuration (si-2302), add the change operation mapping:

a) Include CHNGIND field from S/4HANA in your extraction

b) Map it to the IBP operation type:

  • ‘I’ → INSERT
  • ‘U’ → UPDATE (this is the critical mapping that is currently missing)
  • ‘D’ → DELETE

In the CI-DS transformation step, add a Groovy script along these lines (assuming the change indicator is exposed as the CHNGIND property; the final else branch makes the default insert case explicit):

def changeInd = message.properties.get('CHNGIND')
if (changeInd == 'U') {
    message.setHeader('CamelSqlOperationType', 'UPDATE')
} else if (changeInd == 'D') {
    message.setHeader('CamelSqlOperationType', 'DELETE')
} else {
    // 'I' or missing indicator: treat the record as an insert
    message.setHeader('CamelSqlOperationType', 'INSERT')
}

3. Duplicate Filtering Implementation:

Create a staging approach with deduplication:

Step 1 - Load to staging table (allows duplicates):

INSERT INTO IBP_LOGISTICS_STAGING
(SHIPMENT_ID, LAST_CHANGE_TIMESTAMP, CHANGE_INDICATOR, ...)
VALUES (?, ?, ?, ...)

Step 2 - Deduplicate using window functions:

MERGE INTO IBP_LOGISTICS_SHIPMENT target
USING (
  SELECT * FROM (
    SELECT s.*,
      ROW_NUMBER() OVER (
        PARTITION BY SHIPMENT_ID
        ORDER BY LAST_CHANGE_TIMESTAMP DESC
      ) AS rn
    FROM IBP_LOGISTICS_STAGING s
  ) WHERE rn = 1
) source
ON (target.SHIPMENT_ID = source.SHIPMENT_ID)
WHEN MATCHED THEN UPDATE SET ...
WHEN NOT MATCHED THEN INSERT ...

Step 3 - Clear staging after successful merge
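The deduplication in step 2 (latest record per ShipmentID wins) can be sketched in plain Python; the record layout here is a simplified, hypothetical tuple of (shipment_id, last_change_timestamp, change_indicator):

```python
# Simulated staging rows, including a redundant earlier change for SHIP_001.
staging = [
    ("SHIP_001", "2024-04-15T08:00:00", "U"),
    ("SHIP_001", "2024-04-15T09:30:00", "U"),  # later change wins
    ("SHIP_002", "2024-04-15T07:00:00", "I"),
]

# Equivalent of ROW_NUMBER() OVER (PARTITION BY shipment_id
# ORDER BY last_change_timestamp DESC) ... WHERE rn = 1:
latest = {}
for shipment_id, ts, chng in staging:
    if shipment_id not in latest or ts > latest[shipment_id][0]:
        latest[shipment_id] = (ts, chng)

deduped = [(sid, ts, chng) for sid, (ts, chng) in sorted(latest.items())]
print(deduped)
```

Only one row per ShipmentID survives, so the subsequent MERGE can never hit a duplicate key from redundant change records.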

4. S/4HANA Change Pointer Verification:

Ensure proper change pointer activation in S/4HANA:

  • Run BD52 to verify change document objects for shipment tables
  • Activate change pointers for: LIKP (delivery header), LIPS (delivery items)
  • Key fields to monitor: WADAT_IST (goods issue date), LFSTA (delivery status), KOSTA (picking status)

5. CI-DS Job Configuration Updates:

Modify your job scheduling:

  • Add error handling for duplicate key scenarios
  • Implement retry logic with exponential backoff
  • Configure job to continue processing remaining records even if some fail
  • Add alerting for consecutive duplicate key failures
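The retry-with-exponential-backoff bullet can be sketched as a generic helper (this is not a CI-DS API, just the pattern):

```python
import time

def with_retry(operation, max_attempts=4, base_delay=1.0):
    """Retry a callable with exponential backoff (base_delay, 2x, 4x, ...)."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error for alerting
            time.sleep(base_delay * 2 ** attempt)

# Example: a load step that fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient duplicate key error")
    return "loaded"

result = with_retry(flaky_load, base_delay=0.01)
print(result)  # → loaded
```

In a real job, you would retry only errors you believe to be transient and route persistent duplicate key failures to the alerting path instead.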

6. Monitoring and Validation:

Implement these checks:

  • Daily reconciliation job comparing S/4HANA shipment counts with IBP
  • Monitor for gaps in LastChangeTimestamp sequence
  • Alert on records stuck in staging table for > 1 hour
  • Track duplicate rate metrics (should be < 0.1% after fixes)
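The daily reconciliation check reduces to a per-day count comparison; here is a hedged sketch where the two dictionaries stand in for counts queried from S/4HANA and IBP (the query side is omitted):

```python
def reconcile(source_counts, target_counts):
    """Compare per-day shipment counts; return days whose counts disagree."""
    mismatches = {}
    for day, src in source_counts.items():
        tgt = target_counts.get(day, 0)
        if src != tgt:
            mismatches[day] = (src, tgt)
    return mismatches

# Hypothetical daily shipment counts from S/4HANA vs. IBP.
s4_counts = {"2024-04-14": 120, "2024-04-15": 98}
ibp_counts = {"2024-04-14": 120, "2024-04-15": 95}
gaps = reconcile(s4_counts, ibp_counts)
print(gaps)  # → {'2024-04-15': (98, 95)}
```

A non-empty result is the signal to inspect the staging table and the LastChangeTimestamp sequence for that day.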

Testing Approach:

  1. Run initial load to populate base data
  2. Update test shipments in S/4HANA with different change types
  3. Verify CI-DS delta job correctly identifies INSERT/UPDATE/DELETE operations
  4. Confirm no duplicate key errors in target table
  5. Validate data consistency between S/4HANA and IBP
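Steps 2-4 of the test plan can be simulated in miniature: apply a delta batch carrying I/U/D indicators to a target keyed on ShipmentID and confirm that updates never collide with existing keys. This is an illustration of the operation-mapping logic, not the IBP load API:

```python
def apply_delta(target, delta):
    """Apply delta records keyed on shipment_id; CHNGIND decides the operation."""
    for rec in delta:
        op, sid = rec["chngind"], rec["shipment_id"]
        if op == "D":
            target.pop(sid, None)          # delete, tolerating missing keys
        else:
            target[sid] = rec["status"]    # 'I' and 'U' both upsert safely
    return target

target = {"SHIP_001": "CREATED"}           # state after the initial load
delta = [
    {"shipment_id": "SHIP_001", "chngind": "U", "status": "SHIPPED"},
    {"shipment_id": "SHIP_002", "chngind": "I", "status": "CREATED"},
    {"shipment_id": "SHIP_003", "chngind": "D", "status": None},
]
final = apply_delta(target, delta)
print(final)  # → {'SHIP_001': 'SHIPPED', 'SHIP_002': 'CREATED'}
```

If the 'U' record for SHIP_001 were treated as an insert instead, this is exactly where the ORA-00001 duplicate key violation from the original report would occur.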

This solution eliminates duplicate key violations by properly handling update operations and provides a safety net through staging table deduplication. The key is ensuring CHNGIND is extracted and mapped correctly so IBP knows whether to insert or update each record.