Automated cost rollup integration between SAP PLM and S/4HANA improves margin visibility for new product introductions

We implemented automated cost rollup from SAP PLM to S/4HANA using OData services and scheduled jobs. Previously, our finance team manually exported BOM costs weekly, which delayed margin analysis by 5-7 days.

Our solution leverages SAP PLM’s OData API to extract part costs and BOM structures, then pushes aggregated data to S/4HANA material master records via scheduled background jobs. The integration runs nightly and includes delta detection to minimize data transfer.

Key implementation aspects:

  • OData service endpoints configured for cost element extraction
  • Custom scheduled job framework for automated synchronization
  • Real-time margin reporting dashboards in S/4HANA

The automation reduced our cost update cycle from 7 days to overnight, enabling daily margin visibility for product managers. Happy to share technical details and lessons learned.

Did you implement any validation rules on the S/4HANA side before updating the material master? We’re worried about data quality issues propagating from PLM into our financial records. Also, how do you handle currency conversions if PLM costs are in a different currency than the S/4HANA material masters?

Excellent questions on validation and currency handling - these were critical to get right.

OData Integration Setup: We exposed custom OData services in SAP PLM using transaction SEGW. The key entities are:

  • CostElementSet: Extracts material costs with effective dates
  • BOMStructureSet: Retrieves BOM hierarchies for rollup calculations
  • ChangeTrackingSet: Custom entity tracking cost modifications

Endpoint example:


/sap/opu/odata/sap/ZPLM_COST_SRV/CostElementSet
?$filter=ChangedDate gt datetime'2025-03-17T00:00:00'
&$expand=BOMComponents
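For reference, the delta query above can be assembled like this. A minimal Python sketch; the helper name is ours, and only the service path, entity, and query options come from the setup described:

```python
from datetime import datetime

def build_delta_query(base_url, last_run):
    # NB: a real HTTP client (e.g. requests) would percent-encode the spaces
    # and quotes in the $filter expression before sending the request.
    filter_expr = f"ChangedDate gt datetime'{last_run:%Y-%m-%dT%H:%M:%S}'"
    return (f"{base_url}/CostElementSet"
            f"?$filter={filter_expr}"
            f"&$expand=BOMComponents")

url = build_delta_query("/sap/opu/odata/sap/ZPLM_COST_SRV",
                        datetime(2025, 3, 17))
print(url)
```

On each run you would pass the timestamp of the previous successful sync as `last_run`, which is what keeps the nightly extract to a delta rather than a full pull.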

Scheduled Job Automation: We built the integration using SAP Cloud Integration (CPI) rather than custom ABAP jobs for better monitoring and error handling. The iFlow architecture:

  1. Timer-triggered process runs at 2am daily
  2. OData GET from PLM with delta filter
  3. Groovy script aggregates BOM costs (handles multi-level BOMs)
  4. Validation rules check: cost > 0, valid material number, currency code exists
  5. Currency conversion via S/4HANA exchange rate API
  6. OData PATCH to S/4HANA material master (field: ZZ_PLM_COST)
  7. Success/failure logged to custom monitoring table

Margin Reporting: On the S/4HANA side we created custom CDS views joining material master costs with sales order data:

  • View 1: Aggregates PLM costs at product level
  • View 2: Joins with sales pricing for margin calculation
  • Fiori dashboard displays real-time margin by product line

The CDS view automatically refreshes when material costs update, so margin reports are always current.
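The margin calculation behind View 2 boils down to the following (Python sketch; the function and field names are ours, not the actual CDS view logic):

```python
def margin_pct(sales_price, plm_cost):
    """Gross margin % as surfaced in the Fiori dashboard (illustrative)."""
    return round((sales_price - plm_cost) / sales_price * 100, 1)

print(margin_pct(100.0, 70.0))  # 30.0
```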

Validation Strategy: We implemented three-tier validation:

  • Pre-flight validation in CPI before sending to S/4HANA (data type checks, mandatory fields)
  • S/4HANA BAPI validation during material update (standard SAP checks)
  • Post-integration reconciliation report comparing PLM vs S/4HANA costs daily

Any validation failures trigger workflow notifications to data stewards with specific error details.
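The tier-1 pre-flight checks can be sketched like this (Python rather than a CPI Groovy step; field names, length limit, and the currency list are illustrative — in the real setup valid currencies come from S/4HANA configuration, not a hard-coded set):

```python
VALID_CURRENCIES = {"USD", "EUR", "CNY"}   # illustrative subset

def preflight_errors(record):
    """Collect tier-1 validation failures before a record is sent to S/4HANA.
    Mirrors the checks listed above: cost > 0, plausible material number,
    known currency code."""
    errors = []
    if record.get("cost") is None or record["cost"] <= 0:
        errors.append("cost must be > 0")
    material = record.get("material", "")
    if not (material and len(material) <= 40
            and material.replace("-", "").isalnum()):
        errors.append("invalid material number")
    if record.get("currency") not in VALID_CURRENCIES:
        errors.append("unknown currency code")
    return errors

print(preflight_errors({"material": "PUMP-100", "cost": 70.0,
                        "currency": "USD"}))  # []
```

Returning a list of all failures (rather than stopping at the first) gives data stewards the full picture in one notification.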

Currency Handling: PLM costs come in local currencies (USD, EUR, CNY in our case). The CPI integration calls S/4HANA’s currency conversion function module CONVERT_TO_LOCAL_CURRENCY, using exchange rates from table TCURR. We convert everything to company code currency (USD) before updating the material master. The conversion uses month-end rates pulled from S/4HANA to match financial reporting periods.
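The month-end conversion logic looks roughly like this (Python sketch; the rate table here is illustrative — in the real flow rates come from TCURR via the conversion function module):

```python
from datetime import date

# Illustrative month-end rates keyed by (source currency, period).
MONTH_END_RATES = {("EUR", "2025-03"): 1.08,
                   ("CNY", "2025-03"): 0.138}

def to_company_currency(amount, from_ccy, posting_date, target="USD"):
    """Convert a PLM cost to company code currency using the rate
    for the posting period, matching financial reporting periods."""
    if from_ccy == target:
        return round(amount, 2)
    period = posting_date.strftime("%Y-%m")
    rate = MONTH_END_RATES[(from_ccy, period)]
    return round(amount * rate, 2)

print(to_company_currency(100.0, "EUR", date(2025, 3, 20)))  # 108.0
```

Keying the lookup on the posting period (rather than the run date) is what keeps converted costs reconcilable against the financial close.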

Performance Optimization: Beyond batching, we implemented parallel processing in CPI - splitting the material list into 5 parallel threads. This reduced total runtime from 45 minutes to 12 minutes for our 28k active materials. Database indexes on ChangedDate and MaterialNumber fields were critical for OData query performance.
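The 5-way parallel split can be sketched as follows (Python stand-in for the CPI splitter; `process_chunk` is a placeholder for the per-chunk OData sync work):

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_chunks(items, n_chunks):
    """Split the material list into n roughly equal chunks, mirroring the
    5-way parallel split described above."""
    size = -(-len(items) // n_chunks)        # ceiling division
    return [items[i:i + size] for i in range(0, len(items), size)]

def process_chunk(chunk):
    # placeholder: in the real iFlow this is the OData extract/patch work
    return len(chunk)

materials = [f"MAT-{i:05d}" for i in range(28_000)]
chunks = split_into_chunks(materials, 5)
with ThreadPoolExecutor(max_workers=5) as pool:
    processed = sum(pool.map(process_chunk, chunks))
print(processed)  # 28000
```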

Lessons Learned:

  1. Start with read-only OData testing before implementing writes to S/4HANA
  2. Build comprehensive logging from day one - we log every API call with timestamps and response codes
  3. Implement circuit breaker pattern - if error rate exceeds 10% the job pauses and alerts operations
  4. Version your OData services in PLM - we maintain v1 and v2 simultaneously during upgrades
  5. Document currency conversion logic extensively - auditors will ask detailed questions
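The circuit breaker from lesson 3 can be sketched like this (Python; thresholds match the 10% figure above, the alerting hook is illustrative):

```python
class CircuitBreaker:
    """Pause the sync job once the rolling error rate exceeds a threshold.
    min_calls avoids tripping on a single early failure."""
    def __init__(self, threshold=0.10, min_calls=20):
        self.threshold = threshold
        self.min_calls = min_calls
        self.calls = 0
        self.errors = 0
        self.open = False

    def record(self, success):
        self.calls += 1
        if not success:
            self.errors += 1
        if (self.calls >= self.min_calls
                and self.errors / self.calls > self.threshold):
            self.open = True   # real job would pause and alert operations here

breaker = CircuitBreaker()
for ok in [True] * 18 + [False] * 3:
    breaker.record(ok)
print(breaker.open)  # True: 3/21 ≈ 14% > 10%
```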

The ROI was significant - finance estimates we save 120 hours monthly in manual effort, plus the strategic value of daily margin visibility enabled more agile pricing decisions. We’ve since expanded this pattern to sync engineering change data and quality metrics.

Happy to share our CPI iFlow design or OData service metadata if helpful for your implementation planning.

Great use case. How did you handle the delta detection logic? We tried something similar but ended up with full extracts every time because change tracking on cost records was unreliable. Also curious about your error handling strategy when S/4HANA is temporarily unavailable during the scheduled run.

The margin visibility improvement is huge. What kind of performance impact did you see on SAP PLM during the nightly runs? We have about 45,000 active parts, and I’m concerned about system load during the extraction process.

Performance was a concern initially. We batch the OData calls to 500 records per request and add 2-second delays between batches to avoid overwhelming the system. With 45k parts you’d be looking at roughly 90 batches, taking about 25-30 minutes total during off-peak hours. We schedule it at 2am, when user activity is minimal. Also monitor your database connection pool settings - we had to increase max connections from the default 50 to 80 to handle the concurrent OData requests smoothly.
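The batch math above works out like this (Python sketch; `fetch` is a stand-in for one paged OData call):

```python
import time

BATCH_SIZE = 500
DELAY_SECONDS = 2   # pause between batches to limit load on PLM

def plan_batches(n_parts, batch_size=BATCH_SIZE):
    """Number of OData requests needed for n_parts (ceiling division)."""
    return -(-n_parts // batch_size)

def run_batches(parts, fetch, delay=DELAY_SECONDS):
    """Fetch parts in BATCH_SIZE slices with a delay between requests;
    fetch(batch) stands in for one paged call against CostElementSet."""
    results = []
    for i in range(0, len(parts), BATCH_SIZE):
        results.extend(fetch(parts[i:i + BATCH_SIZE]))
        if i + BATCH_SIZE < len(parts):
            time.sleep(delay)
    return results

print(plan_batches(45_000))  # 90 batches, as estimated above
```

At 2 seconds between 90 batches, the delays alone add about 3 minutes; the rest of the 25-30 minute window is the request and processing time itself.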