Here’s the complete implementation covering all three focus areas:
Logic Apps Trigger Setup:
We use a Recurrence trigger with optimized polling configuration:
{
  "type": "Recurrence",
  "recurrence": {
    "frequency": "Minute",
    "interval": 2
  }
}
The trigger fires every 2 minutes and queries the SAP OData service for invoices created after the last poll time (a timestamp filter rather than a true OData delta token):
GET /sap/opu/odata/sap/ZFI_INVOICE_SRV/Invoices?$filter=CreatedDate gt datetime'{LastPollTime}'
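A quick way to sanity-check the filter format is to build the URL in code. The sketch below is illustrative only; the base URL and the OData v2 datetime literal format are assumptions matching the query above.

```python
from datetime import datetime, timezone
from urllib.parse import quote

def build_invoice_query(base_url: str, last_poll: datetime) -> str:
    """Build the OData $filter URL for invoices created after last_poll."""
    # SAP OData v2 expects datetime'yyyy-MM-ddTHH:mm:ss' literals
    literal = last_poll.strftime("%Y-%m-%dT%H:%M:%S")
    filt = f"CreatedDate gt datetime'{literal}'"
    return f"{base_url}/Invoices?$filter={quote(filt)}"

url = build_invoice_query(
    "https://sap.example.com/sap/opu/odata/sap/ZFI_INVOICE_SRV",  # hypothetical host
    datetime(2024, 5, 1, 12, 0, 0, tzinfo=timezone.utc),
)
```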
Key optimizations:
- Change Tracking Table: We created a custom SAP table (ZFI_INV_DELTA) that logs invoice creation timestamps. This avoids expensive queries against the main invoice tables.
- Stateful Polling: The Logic App stores the last successful poll time in Azure Table Storage. Each run retrieves this timestamp, queries SAP for newer invoices, then updates the timestamp.
- Batch Processing: When polling returns multiple invoices, we use a ForEach loop with concurrency set to 5 to process invoices in parallel while avoiding D365 API throttling.
- Trigger Conditions: Added a condition to skip execution if the SAP maintenance window is active (checked via an environment variable).
Alternative considered: SAP Event Mesh for true event-driven architecture, but cost and complexity didn’t justify the marginal latency improvement over 2-minute polling.
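The stateful polling described above can be sketched as a single cycle. Here a plain dict stands in for the Azure Table Storage checkpoint, and the SAP fetch and invoice processing are injected as callables; all names are hypothetical.

```python
from datetime import datetime

# In production the checkpoint lives in Azure Table Storage; a dict
# stands in here so the control flow is visible.
checkpoint = {"last_poll": "2024-05-01T12:00:00"}

def poll_cycle(fetch_invoices, process_invoice, now: datetime) -> int:
    """One polling cycle: read checkpoint, fetch newer invoices,
    process them, then advance the checkpoint only on success."""
    since = checkpoint["last_poll"]
    invoices = fetch_invoices(since)      # SAP OData call in the real flow
    for inv in invoices:
        process_invoice(inv)              # transformation + D365 POST
    # Advance the watermark only after every invoice succeeded, so a
    # failed run is retried from the same point next time.
    checkpoint["last_poll"] = now.strftime("%Y-%m-%dT%H:%M:%S")
    return len(invoices)
```

Advancing the watermark only after all invoices succeed is what makes a failed run safe to re-execute.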
Data Transformation Mapping:
Three-layer transformation architecture:
Layer 1 - Field Mapping (Inline Expressions):
Simple field transformations use Logic Apps expressions:
@{triggerBody()?['InvoiceNumber']}
@{formatDateTime(triggerBody()?['InvoiceDate'], 'yyyy-MM-dd')}
Layer 2 - Liquid Templates (Complex Transformations):
Stored in Azure Storage, referenced via integration account:
Invoice Header Template (invoice_header.liquid):
{
  "InvoiceId": "{{ content.BELNR }}",
  "InvoiceDate": "{{ content.BLDAT }}",
  "VendorAccount": "{{ content.LIFNR | prepend: 'V-' }}",
  "CurrencyCode": "{{ content.WAERS }}",
  "InvoiceAmount": {{ content.WRBTR }}
}
Line Item to Distribution Template (line_distribution.liquid):
Converts SAP line items (BSEG table structure) to D365 accounting distributions:
{% for item in content.Items %}
{
  "LineNumber": {{ forloop.index }},
  "MainAccount": "{{ item.HKONT }}",
  "Dimension1": "{{ item.KOSTL }}",
  "Amount": {{ item.DMBTR }},
  "TaxGroup": "{{ item.MWSKZ }}"
}
{% endfor %}
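For testing the mapping outside Logic Apps, the two templates above can be approximated in Python. The SAP and D365 field names follow the templates; everything else is illustrative.

```python
def map_invoice(sap: dict) -> dict:
    """Python equivalent of the header and line-distribution templates."""
    header = {
        "InvoiceId": sap["BELNR"],
        "InvoiceDate": sap["BLDAT"],
        "VendorAccount": f"V-{sap['LIFNR']}",   # prepend 'V-' as in the template
        "CurrencyCode": sap["WAERS"],
        "InvoiceAmount": sap["WRBTR"],
    }
    header["Distributions"] = [
        {
            "LineNumber": i,                     # forloop.index is 1-based
            "MainAccount": item["HKONT"],
            "Dimension1": item["KOSTL"],
            "Amount": item["DMBTR"],
            "TaxGroup": item["MWSKZ"],
        }
        for i, item in enumerate(sap.get("Items", []), start=1)
    ]
    return header
```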
Layer 3 - Master Data Lookup:
Before transformation, we validate and map reference data:
- Customer/Vendor IDs: Query an Azure SQL mapping table that syncs nightly from both systems
- GL Accounts: Use a JSON mapping file stored in a Storage account (updated by the finance team)
- Tax Codes: Direct mapping via a Switch statement in the Logic App
Transformation flow:
1. Receive SAP invoice JSON
2. Lookup customer/vendor in mapping table → get D365 ID
3. Apply Liquid template for header transformation
4. ForEach line item:
- Lookup GL account mapping
- Apply line distribution template
- Calculate tax distribution
5. Compose final D365 JSON payload
6. Validate against D365 schema (Parse JSON action)
7. POST to D365 Web API
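The seven steps above can be sketched end to end. The lookup tables and the D365 client are injected, and all helper names here are hypothetical, not part of the actual design.

```python
def process_invoice(sap_invoice, vendor_map, gl_map, post_to_d365):
    """Steps 1-7 in miniature: lookup, transform, compose, post."""
    # Step 2: vendor lookup; a miss is a Tier 2 data dependency error
    vendor = vendor_map.get(sap_invoice["LIFNR"])
    if vendor is None:
        raise LookupError(f"Customer not found: {sap_invoice['LIFNR']}")
    # Step 3: header transformation (abbreviated)
    payload = {
        "InvoiceId": sap_invoice["BELNR"],
        "VendorAccount": vendor,
        "Lines": [],
    }
    # Step 4: per-line GL account lookup and distribution
    for item in sap_invoice.get("Items", []):
        account = gl_map.get(item["HKONT"])
        if account is None:
            raise LookupError(f"Invalid GL account: {item['HKONT']}")
        payload["Lines"].append({"MainAccount": account, "Amount": item["DMBTR"]})
    # Steps 5-7: compose and post (schema validation happens in the Logic App)
    return post_to_d365(payload)
```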
Error Handling Automation:
Multi-tier error handling with intelligent retry:
Tier 1 - Transient Error Handling:
For network timeouts, API throttling, temporary D365 unavailability:
{
  "retryPolicy": {
    "type": "exponential",
    "count": 3,
    "interval": "PT30S",
    "maximumInterval": "PT5M"
  }
}
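Under this policy the waits grow roughly as follows. Note that Logic Apps actually randomizes each exponential interval within a range, so the deterministic doubling below is a simplification for illustration.

```python
def backoff_intervals(base_s: int = 30, cap_s: int = 300, count: int = 3) -> list:
    """Illustrative doubling schedule: 30s base, capped at 5 minutes."""
    out, delay = [], base_s
    for _ in range(count):
        out.append(min(delay, cap_s))  # never exceed maximumInterval
        delay *= 2
    return out
```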
Tier 2 - Data Dependency Errors:
For missing master data (customer not in D365, GL account invalid):
- Catch error in scope block
- Check error message for specific patterns:
- “Customer not found” → Queue to master data sync topic
- “Invalid GL account” → Send to finance team for mapping
- Write the invoice to an Azure Service Bus "retry queue" as a scheduled message with a 15-minute delay
- Separate Logic App processes retry queue, re-attempts after delay
The retry Logic App polls the queue every 5 minutes:
1. Dequeue message from retry queue
2. Check retry count (stored in message properties)
3. If count < 5: Re-attempt transformation and D365 post
4. If successful: Delete message, log success
5. If failed: Increment retry count
- If count < 5: Return to the queue with a longer scheduled delay
- If count >= 5: Move to "manual intervention" queue
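The dequeue-and-decide logic above can be sketched as follows. Plain lists stand in for the Service Bus queues, and the function and property names are hypothetical.

```python
MAX_RETRIES = 5

def handle_retry(message, attempt_post, retry_queue, manual_queue):
    """One pass of the retry Logic App: re-attempt, then route by count."""
    count = message.get("retry_count", 0)
    try:
        attempt_post(message["invoice"])   # re-run transformation + D365 POST
        return "success"                   # delete message, log success
    except Exception as err:
        message["retry_count"] = count + 1
        message["last_error"] = str(err)
        if message["retry_count"] >= MAX_RETRIES:
            manual_queue.append(message)   # Tier 3: manual intervention
            return "manual"
        retry_queue.append(message)        # back to the retry queue with delay
        return "requeued"
```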
Tier 3 - Manual Intervention Queue:
Invoices that fail after 5 retries:
- Written to separate Service Bus queue with full context (original SAP data, error details, retry history)
- Power Automate cloud flow monitors this queue
- Creates task in Dynamics 365 Project Operations assigned to integration support team
- Team investigates, corrects data, manually resubmits via custom Power App
Error Categorization:
We log all errors to Application Insights with custom dimensions:
{
  "ErrorCategory": "DataTransformation|MasterDataMissing|D365APIError|SAPConnectionError",
  "InvoiceNumber": "...",
  "RetryCount": 0,
  "ErrorDetails": "..."
}
This enables Power BI dashboards showing:
- Success rate by error category
- Average retry count before success
- Most common data mapping failures
- Peak error times (correlates with SAP batch jobs)
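The ErrorCategory dimension can be derived with a simple pattern match on the error message. The pattern table below is hypothetical; the real mapping lives in the Logic App's error-handling scopes.

```python
# Hypothetical message patterns -> the categories logged above
PATTERNS = {
    "customer not found": "MasterDataMissing",
    "invalid gl account": "MasterDataMissing",
    "429": "D365APIError",            # throttling responses
    "timeout": "SAPConnectionError",
}

def categorize(error_message: str) -> str:
    """Map an error message onto one of the telemetry categories."""
    msg = error_message.lower()
    for pattern, category in PATTERNS.items():
        if pattern in msg:
            return category
    return "DataTransformation"        # fallback bucket
```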
Performance Results:
- Average end-to-end latency: 2 minutes 47 seconds
- 99.7% success rate on first attempt
- 0.2% require a single retry (usually transient D365 throttling)
- 0.1% require manual intervention (invalid data from SAP)
- Peak throughput tested: 200 invoices/hour without throttling
Key Lessons Learned:
- Polling vs Webhooks: Two-minute polling proved more reliable than attempting webhook integration with SAP. The latency difference (2 minutes vs. real time) was acceptable for the business requirements.
- Liquid Templates: We initially used inline JSON Compose actions, but Liquid templates dramatically improved maintainability. The finance team can now update GL mappings without developer involvement.
- Retry Strategy: Exponential backoff alone wasn't sufficient. We needed intelligent categorization to handle data dependency errors differently from transient failures.
- Monitoring: Application Insights custom telemetry was critical for production support. We can now distinguish systematic issues (e.g., a specific GL account that always fails) from one-off errors.
- Master Data Synchronization: Real-time invoice sync exposed latency in our master data replication. We added a nightly sync of customer/vendor data to Azure SQL, which reduced master data errors by 90%.
The implementation reduced month-end close time by 8 hours and eliminated manual invoice reconciliation work. Total development effort was 6 weeks with 2 developers.