Here’s a comprehensive solution addressing CI-DS key mapping, delta extraction logic, and duplicate filtering:
1. CI-DS Key Mapping Configuration:
Your current composite key (ShipmentID + CreatedDate) is problematic for delta loads because CreatedDate never changes on updates. Modify your key mapping strategy:
Primary Key: ShipmentID only (this should be unique)
Technical Key: Add LastChangeTimestamp for delta detection
In your CI-DS data source definition, configure:
SELECT
    SHIPMENT_ID,
    CREATED_DATE,
    LAST_CHANGE_TIMESTAMP,
    CHANGE_INDICATOR,
    STATUS,
    PLANNED_DATE
FROM S4_SHIPMENT_TABLE
WHERE LAST_CHANGE_TIMESTAMP > :LAST_EXTRACTION_TIME
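The `:LAST_EXTRACTION_TIME` bind variable is a watermark that must be persisted between job runs. A minimal sketch of that bookkeeping, assuming rows arrive as dictionaries keyed by the column names above (the function name is illustrative, not a CI-DS API):

```python
from datetime import datetime

def extract_delta(rows, last_extraction_time):
    """Return only rows changed since the previous run, plus the new watermark."""
    changed = [r for r in rows if r["LAST_CHANGE_TIMESTAMP"] > last_extraction_time]
    # Advance the watermark only to the newest timestamp actually seen,
    # so a failed run can be re-extracted without losing records.
    new_watermark = max((r["LAST_CHANGE_TIMESTAMP"] for r in changed),
                        default=last_extraction_time)
    return changed, new_watermark

rows = [
    {"SHIPMENT_ID": "S1", "LAST_CHANGE_TIMESTAMP": datetime(2024, 1, 1, 8)},
    {"SHIPMENT_ID": "S2", "LAST_CHANGE_TIMESTAMP": datetime(2024, 1, 2, 9)},
]
changed, wm = extract_delta(rows, datetime(2024, 1, 1, 12))
# Only S2 changed after the previous watermark
```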
2. Delta Extraction Logic Enhancement:
In your CI-DS job configuration (si-2302), add the change operation mapping:
a) Include CHNGIND field from S/4HANA in your extraction
b) Map it to the IBP operation type:
- 'I' → INSERT
- 'U' → UPDATE (this is critical - currently missing)
- 'D' → DELETE
In the CI-DS transformation step, add this Groovy script:
// Map the S/4HANA change indicator to the downstream operation type
def changeInd = message.properties.get('CHNGIND')
if (changeInd == 'U') {
    message.setHeader('CamelSqlOperationType', 'UPDATE')
} else if (changeInd == 'D') {
    message.setHeader('CamelSqlOperationType', 'DELETE')
} else {
    // 'I' (or an unset indicator) falls through to INSERT
    message.setHeader('CamelSqlOperationType', 'INSERT')
}
3. Duplicate Filtering Implementation:
Create a staging approach with deduplication:
Step 1 - Load to staging table (allows duplicates):
INSERT INTO IBP_LOGISTICS_STAGING
    (SHIPMENT_ID, LAST_CHANGE_TIMESTAMP, CHANGE_INDICATOR, ...)
VALUES (?, ?, ?, ...)
Step 2 - Deduplicate using window functions:
MERGE INTO IBP_LOGISTICS_SHIPMENT target
USING (
    SELECT * FROM (
        SELECT s.*,
               ROW_NUMBER() OVER (
                   PARTITION BY SHIPMENT_ID
                   ORDER BY LAST_CHANGE_TIMESTAMP DESC
               ) AS rn
        FROM IBP_LOGISTICS_STAGING s
    ) ranked
    WHERE rn = 1
) source
ON (target.SHIPMENT_ID = source.SHIPMENT_ID)
WHEN MATCHED THEN UPDATE SET ...
WHEN NOT MATCHED THEN INSERT ...
Step 3 - Clear staging after successful merge
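The window-function deduplication in Step 2 can be mirrored in a quick sketch: group staging rows by SHIPMENT_ID and keep only the row with the latest LAST_CHANGE_TIMESTAMP (column names match the SQL above; the function itself is illustrative):

```python
def deduplicate(staging_rows):
    """Keep only the newest row per SHIPMENT_ID, like ROW_NUMBER() ... WHERE rn = 1."""
    latest = {}
    for row in staging_rows:
        key = row["SHIPMENT_ID"]
        if key not in latest or row["LAST_CHANGE_TIMESTAMP"] > latest[key]["LAST_CHANGE_TIMESTAMP"]:
            latest[key] = row
    return list(latest.values())

staging = [
    {"SHIPMENT_ID": "S1", "LAST_CHANGE_TIMESTAMP": 1, "STATUS": "PLANNED"},
    {"SHIPMENT_ID": "S1", "LAST_CHANGE_TIMESTAMP": 3, "STATUS": "SHIPPED"},
    {"SHIPMENT_ID": "S2", "LAST_CHANGE_TIMESTAMP": 2, "STATUS": "PLANNED"},
]
deduped = deduplicate(staging)
# One row per shipment survives; S1 keeps its latest STATUS ('SHIPPED')
```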
4. S/4HANA Change Pointer Verification:
Ensure proper change pointer activation in S/4HANA:
- Run BD52 to verify change document objects for shipment tables
- Activate change pointers for: LIKP (delivery header), LIPS (delivery items)
- Key fields to monitor: WADAT_IST (goods issue date), LFSTA (delivery status), KOSTA (picking status)
5. CI-DS Job Configuration Updates:
Modify your job scheduling:
- Add error handling for duplicate key scenarios
- Implement retry logic with exponential backoff
- Configure job to continue processing remaining records even if some fail
- Add alerting for consecutive duplicate key failures
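The retry-with-backoff and continue-on-failure behavior above can be sketched generically; `load_record` here is a placeholder callable standing in for the actual load step, not CI-DS configuration:

```python
import time

def load_with_retry(records, load_record, max_retries=3, base_delay=1.0):
    """Load each record; retry failures with exponential backoff.

    Records that still fail after max_retries are collected and returned,
    so one bad record does not stop the rest of the batch.
    """
    failed = []
    for record in records:
        for attempt in range(max_retries):
            try:
                load_record(record)
                break
            except Exception:
                if attempt == max_retries - 1:
                    failed.append(record)  # give up, continue with remaining records
                else:
                    time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    return failed
```

A caller would inspect the returned list to drive alerting on consecutive failures.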
6. Monitoring and Validation:
Implement these checks:
- Daily reconciliation job comparing S/4HANA shipment counts with IBP
- Monitor for gaps in LastChangeTimestamp sequence
- Alert on records stuck in staging table for > 1 hour
- Track duplicate rate metrics (should be < 0.1% after fixes)
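The duplicate-rate metric and the daily count reconciliation from the list above reduce to two small helpers (names and thresholds are illustrative, not IBP APIs):

```python
def duplicate_rate(loaded_keys):
    """Fraction of loaded rows whose key was already seen in the same run."""
    if not loaded_keys:
        return 0.0
    unique = len(set(loaded_keys))
    return (len(loaded_keys) - unique) / len(loaded_keys)

def counts_match(s4_count, ibp_count):
    """Daily reconciliation: flag any count mismatch between source and target."""
    return s4_count == ibp_count

rate = duplicate_rate(["S1", "S2", "S2", "S3"])  # one duplicate out of four rows
# A rate like this (25%) is far above the 0.1% target and should trigger an alert
```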
Testing Approach:
- Run initial load to populate base data
- Update test shipments in S/4HANA with different change types
- Verify CI-DS delta job correctly identifies INSERT/UPDATE/DELETE operations
- Confirm no duplicate key errors in target table
- Validate data consistency between S/4HANA and IBP
This solution eliminates duplicate key violations by properly handling update operations and provides a safety net through staging table deduplication. The key is ensuring CHNGIND is extracted and mapped correctly so IBP knows whether to insert or update each record.