Real-time SAP invoice integration with Dynamics 365 using Logic Apps triggers

We recently implemented real-time invoice synchronization between SAP ECC and Dynamics 365 Finance using Azure Logic Apps as the integration middleware. The goal was to eliminate the 24-hour reconciliation delay introduced by our previous batch ETL process.

The architecture uses SAP’s OData API to expose invoice creation events, Logic Apps triggers (Consumption tier) to detect new invoices, and the Dynamics 365 Finance Web API for invoice posting. The initial implementation worked, but production rollout surfaced several challenges that required significant refinement.

Key requirements:

  • Sub-5-minute latency from SAP invoice creation to D365 posting
  • Handle 2000+ invoices daily with peak loads of 150/hour
  • Maintain referential integrity for customer and vendor master data
  • Support invoice line item transformations (SAP line items to D365 distribution lines)
  • Error handling with automated retry and manual intervention queue

I’m sharing our implementation approach focusing on Logic Apps trigger setup, data transformation mapping, and error handling automation. The solution reduced our reconciliation window from 24 hours to under 3 minutes average, with 99.7% success rate on first attempt.

Transformation is handled entirely within Logic Apps, using Liquid templates for complex mapping and inline expressions for simple field mappings. We created reusable Liquid maps for invoice header transformation, line-item-to-distribution conversion, tax calculation aggregation, and customer/vendor ID mapping between the two systems. The Liquid templates are stored in an Azure Storage account and referenced by the Logic App, making them version-controlled and updatable without modifying the flow.

We used a hybrid approach. A Logic Apps Recurrence trigger polls SAP OData every 2 minutes with a delta filter (invoices created since the last poll). It’s not truly event-driven, but 2-minute polling gives us sub-5-minute latency while avoiding the SAP system load of constant requests. We also implemented a change tracking table in SAP to optimize the delta query performance.

This is a great use case. What trigger type did you use in Logic Apps - HTTP webhook, recurrence polling, or something else? SAP’s OData implementation doesn’t natively support webhooks in ECC, so I’m curious how you achieved real-time detection.

Here’s the complete implementation covering all three focus areas:

Logic Apps Trigger Setup:

We use a Recurrence trigger with optimized polling configuration:

{
  "type": "Recurrence",
  "recurrence": {
    "frequency": "Minute",
    "interval": 2
  }
}

The trigger executes every 2 minutes and queries SAP OData with delta token:


GET /sap/opu/odata/sap/ZFI_INVOICE_SRV/Invoices?$filter=CreatedDate gt datetime'{LastPollTime}'

Key optimizations:

  1. Change Tracking Table: We created a custom SAP table (ZFI_INV_DELTA) that logs invoice creation timestamps. This avoids expensive queries against main invoice tables.

  2. Stateful Polling: Logic App stores last successful poll time in Azure Table Storage. Each run retrieves this timestamp, queries SAP for newer invoices, then updates the timestamp.

  3. Batch Processing: When polling returns multiple invoices, we use a ForEach loop with concurrency set to 5 to process invoices in parallel while avoiding D365 API throttling.

  4. Trigger Conditions: Added condition to skip execution if SAP maintenance window is active (checked via environment variable).

Alternative considered: SAP Event Mesh for true event-driven architecture, but cost and complexity didn’t justify the marginal latency improvement over 2-minute polling.
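The stateful polling cycle described above can be sketched in plain Python. In the real flow these steps are Logic Apps actions backed by Azure Table Storage; here the state store is a dict and the HTTP call is a stand-in, and `SERVICE_ROOT` / `poll_once` are illustrative names, not actual workflow actions:

```python
from datetime import datetime

# Placeholder for the SAP Gateway service root used in the real flow.
SERVICE_ROOT = "https://sap-gateway.example.com/sap/opu/odata/sap/ZFI_INVOICE_SRV"

def build_delta_query(last_poll_time: datetime) -> str:
    """Build the OData v2 delta filter used on each 2-minute poll."""
    stamp = last_poll_time.strftime("%Y-%m-%dT%H:%M:%S")
    return (f"{SERVICE_ROOT}/Invoices"
            f"?$filter=CreatedDate gt datetime'{stamp}'")

def poll_once(state: dict, fetch, now: datetime) -> list:
    """One polling cycle: read the watermark, query SAP for newer
    invoices, then advance the watermark (persisted to Azure Table
    Storage in the real flow)."""
    url = build_delta_query(state["last_poll_time"])
    invoices = fetch(url)          # stand-in for the HTTP GET to SAP
    state["last_poll_time"] = now
    return invoices
```

The key design point is that the watermark is updated only after a successful fetch, so a failed poll simply re-reads the same window on the next run.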

Data Transformation Mapping:

Three-layer transformation architecture:

Layer 1 - Field Mapping (Inline Expressions): Simple field transformations use Logic Apps expressions:


@{triggerBody()?['InvoiceNumber']}
@{formatDateTime(triggerBody()?['InvoiceDate'], 'yyyy-MM-dd')}

Layer 2 - Liquid Templates (Complex Transformations): Stored in Azure Storage, referenced via integration account:

Invoice Header Template (invoice_header.liquid):

{
  "InvoiceId": "{{ content.BELNR }}",
  "InvoiceDate": "{{ content.BLDAT }}",
  "VendorAccount": "{{ content.LIFNR | prepend: 'V-' }}",
  "CurrencyCode": "{{ content.WAERS }}",
  "InvoiceAmount": {{ content.WRBTR }}
}
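For readers who prefer to see the mapping outside Liquid, the same header transformation expressed as a plain-Python sketch (the SAP field names are the real ones from the template; the function itself is illustrative):

```python
def transform_header(sap: dict) -> dict:
    """Mirror of invoice_header.liquid: SAP FI header fields to D365 names.
    BELNR = document number, BLDAT = document date, LIFNR = vendor number,
    WAERS = currency, WRBTR = amount in document currency."""
    return {
        "InvoiceId": sap["BELNR"],
        "InvoiceDate": sap["BLDAT"],
        "VendorAccount": f"V-{sap['LIFNR']}",  # same as Liquid's prepend: 'V-'
        "CurrencyCode": sap["WAERS"],
        "InvoiceAmount": sap["WRBTR"],
    }
```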

Line Item to Distribution Template (line_distribution.liquid): Converts SAP line items (BSEG table structure) to D365 accounting distributions:

[
{% for item in content.Items %}
{
  "LineNumber": {{ forloop.index }},
  "MainAccount": "{{ item.HKONT }}",
  "Dimension1": "{{ item.KOSTL }}",
  "Amount": {{ item.DMBTR }},
  "TaxGroup": "{{ item.MWSKZ }}"
}{% unless forloop.last %},{% endunless %}
{% endfor %}
]
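The same line-item conversion as a plain-Python sketch, with the BSEG field meanings spelled out (the function is illustrative; the field names are the real ones from the template):

```python
def transform_lines(items: list) -> list:
    """Mirror of line_distribution.liquid: BSEG-style line items to
    D365 accounting distributions. HKONT = GL account, KOSTL = cost
    center, DMBTR = amount in local currency, MWSKZ = tax code."""
    return [
        {
            "LineNumber": i,  # forloop.index is 1-based, as here
            "MainAccount": item["HKONT"],
            "Dimension1": item["KOSTL"],
            "Amount": item["DMBTR"],
            "TaxGroup": item["MWSKZ"],
        }
        for i, item in enumerate(items, start=1)
    ]
```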

Layer 3 - Master Data Lookup: Before transformation, we validate and map reference data:

  • Customer/Vendor IDs: Query Azure SQL mapping table that syncs nightly from both systems
  • GL Accounts: Use JSON mapping file stored in Storage account (updated by finance team)
  • Tax Codes: Direct mapping via Switch statement in Logic App
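Layer 3 can be sketched as a lookup that fails loudly when a mapping is missing, which is what routes an invoice into the Tier 2 error handling described later. The exception class and the dict-shaped mapping table are illustrative stand-ins for the Azure SQL lookup:

```python
class MissingMasterData(Exception):
    """Raised when an SAP ID has no D365 counterpart yet
    (e.g. the nightly sync hasn't caught up)."""

def lookup_d365_vendor(sap_vendor_id: str, mapping: dict) -> str:
    """Resolve an SAP vendor number (LIFNR) to its D365 vendor account
    via the nightly-synced mapping table (modeled here as a dict)."""
    try:
        return mapping[sap_vendor_id]
    except KeyError:
        raise MissingMasterData(f"Vendor not found: {sap_vendor_id}")
```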

Transformation flow:


1. Receive SAP invoice JSON
2. Lookup customer/vendor in mapping table → get D365 ID
3. Apply Liquid template for header transformation
4. ForEach line item:
   - Lookup GL account mapping
   - Apply line distribution template
   - Calculate tax distribution
5. Compose final D365 JSON payload
6. Validate against D365 schema (Parse JSON action)
7. POST to D365 Web API

Error Handling Automation:

Multi-tier error handling with intelligent retry:

Tier 1 - Transient Error Handling: For network timeouts, API throttling, and temporary D365 unavailability, the retry policy on the D365 HTTP action handles retries automatically:

"retryPolicy": {
  "type": "exponential",
  "count": 3,
  "interval": "PT30S",
  "maximumInterval": "PT5M"
}

Tier 2 - Data Dependency Errors: For missing master data (customer not in D365, GL account invalid):

  1. Catch error in scope block
  2. Check error message for specific patterns:
    • “Customer not found” → Queue to master data sync topic
    • “Invalid GL account” → Send to finance team for mapping
  3. Write invoice to Azure Service Bus “retry queue” with 15-minute visibility timeout
  4. Separate Logic App processes retry queue, re-attempts after delay
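The pattern matching in step 2 can be sketched as a small classifier. The message substrings mirror the checks above, and the category names match our Application Insights dimensions; the function itself is illustrative, not the actual Logic App expression:

```python
def categorize_error(message: str) -> str:
    """Route a failure message to a handling category: master-data gaps
    go to the retry/sync path, mapping gaps go to the finance team,
    everything else is treated as a D365 API error."""
    msg = message.lower()
    if "customer not found" in msg or "vendor not found" in msg:
        return "MasterDataMissing"
    if "invalid gl account" in msg:
        return "DataTransformation"
    return "D365APIError"
```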

Retry Logic App polls Service Bus every 5 minutes:


1. Dequeue message from retry queue
2. Check retry count (stored in message properties)
3. If count < 5: Re-attempt transformation and D365 post
4. If successful: Delete message, log success
5. If failed: Increment retry count
   - If count < 5: Return to queue with extended visibility timeout
   - If count >= 5: Move to "manual intervention" queue

Tier 3 - Manual Intervention Queue: Invoices that fail after 5 retries:

  1. Written to separate Service Bus queue with full context (original SAP data, error details, retry history)
  2. Power Automate cloud flow monitors this queue
  3. Creates task in Dynamics 365 Project Operations assigned to integration support team
  4. Team investigates, corrects data, manually resubmits via custom Power App

Error Categorization: We log all errors to Application Insights with custom dimensions:

{
  "ErrorCategory": "DataTransformation|MasterDataMissing|D365APIError|SAPConnectionError",
  "InvoiceNumber": "...",
  "RetryCount": 0,
  "ErrorDetails": "..."
}

This enables Power BI dashboards showing:

  • Success rate by error category
  • Average retry count before success
  • Most common data mapping failures
  • Peak error times (correlates with SAP batch jobs)

Performance Results:

  • Average end-to-end latency: 2 minutes 47 seconds
  • 99.7% success rate on first attempt
  • 0.2% require single retry (usually transient D365 throttling)
  • 0.1% require manual intervention (invalid data from SAP)
  • Peak throughput tested: 200 invoices/hour without throttling

Key Lessons Learned:

  1. Polling vs Webhooks: 2-minute polling proved more reliable than attempting webhook integration with SAP. The latency difference (2 min vs real-time) was acceptable for business requirements.

  2. Liquid Templates: Initially used inline JSON compose actions, but Liquid templates dramatically improved maintainability. Finance team can now update GL mappings without developer involvement.

  3. Retry Strategy: Exponential backoff alone wasn’t sufficient. Needed intelligent categorization to handle data dependency errors differently from transient failures.

  4. Monitoring: Application Insights custom telemetry was critical for production support. We can now identify systematic issues (e.g., specific GL account always fails) versus one-off errors.

  5. Master Data Synchronization: Real-time invoice sync exposed latency in our master data replication. We added a nightly sync of customer/vendor data to Azure SQL, which reduced master data errors by 90%.

The implementation reduced month-end close time by 8 hours and eliminated manual invoice reconciliation work. Total development effort was 6 weeks with 2 developers.

The Liquid template approach is solid for maintainability. What about error scenarios - how do you handle cases where customer master data exists in SAP but not yet in D365? That’s a common race condition in real-time integrations.

And what’s your retry strategy? Immediate retry might fail again if it’s a data dependency issue, but delayed retry increases your latency. Did you implement exponential backoff or a different pattern?

How did you handle the data transformation between SAP invoice structure and D365? The line item models are quite different - SAP uses item-level tax and GL assignments while D365 uses distribution accounting. Did you build transformation logic in Logic Apps or use an intermediate data service?