Intercompany transaction integration challenges: cloud deployment with API authentication delays

We’re experiencing significant challenges with intercompany transaction integration after moving to D365 cloud (10.0.42). Our setup involves 8 legal entities processing intercompany transactions through custom API integrations that worked smoothly on-premises.

The main issue is transaction delays caused by API authentication overhead. Each intercompany transaction requires API authentication to post across entities, and we’re seeing 3-5 second delays per transaction that compound quickly. On-premises, these calls were essentially instantaneous via direct database access.

Here’s a sample of our authentication pattern:


POST /api/auth/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id={client_id}&client_secret={client_secret}

The authentication token expires every 60 minutes, requiring re-authentication. With hundreds of daily intercompany transactions, this creates noticeable data latency. Our reconciliation automation is also affected - the delayed posting means reconciliation jobs often run before all transactions are committed, requiring manual cleanup.

Has anyone solved API authentication performance issues for high-volume intercompany scenarios in cloud? We’re trying to balance security requirements with transaction throughput.

The authentication overhead is a known challenge in cloud. You should implement token caching and reuse. Don’t authenticate for every transaction - authenticate once, cache the token, and reuse it for the 60-minute validity window. That removes the per-transaction authentication round trip almost entirely, since you pay the 3-5 second cost once per hour instead of once per transaction. Also consider service-to-service (client credentials) authentication with longer-lived tokens rather than user-based authentication.
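A minimal sketch of that caching pattern in Python, assuming an OAuth-style client-credentials endpoint - the URL, form fields, and `expires_in` handling are placeholders to adapt to your gateway, not a specific D365 API:

```python
import json
import time
import urllib.parse
import urllib.request


class TokenCache:
    """Caches an access token and refreshes it shortly before expiry."""

    def __init__(self, token_url, client_id, client_secret, skew_seconds=60):
        self.token_url = token_url
        self.client_id = client_id
        self.client_secret = client_secret
        self.skew_seconds = skew_seconds  # renew a little early to avoid mid-call expiry
        self._token = None
        self._expires_at = 0.0

    def _request_token(self):
        # Client-credentials grant: credentials go in the form body
        body = urllib.parse.urlencode({
            "grant_type": "client_credentials",
            "client_id": self.client_id,
            "client_secret": self.client_secret,
        }).encode()
        with urllib.request.urlopen(self.token_url, data=body) as resp:
            return json.load(resp)

    def get_token(self):
        # Reuse the cached token for its whole validity window (60 min here)
        if self._token and time.time() < self._expires_at - self.skew_seconds:
            return self._token
        payload = self._request_token()
        self._token = payload["access_token"]
        self._expires_at = time.time() + int(payload.get("expires_in", 3600))
        return self._token
```

Every posting call then asks the cache for a token instead of hitting the auth endpoint, so only one authentication round trip happens per hour regardless of transaction volume.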

Beyond token caching, you should batch your intercompany transactions. Instead of posting each transaction individually via API, collect them and post in batches of 50-100. This reduces the API call overhead significantly. Cloud APIs have batch endpoints specifically for this purpose. Your reconciliation timing issues will also improve because transactions post in coordinated batches rather than trickling in over time.
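The collect-and-flush pattern can be sketched as below; `post_batch` is a placeholder for whatever batch endpoint your integration actually calls:

```python
class IntercompanyBatcher:
    """Accumulates intercompany transactions and posts them in fixed-size batches."""

    def __init__(self, post_batch, batch_size=50):
        self.post_batch = post_batch  # callable that sends one batch to the API
        self.batch_size = batch_size
        self._pending = []

    def add(self, transaction):
        self._pending.append(transaction)
        if len(self._pending) >= self.batch_size:
            self.flush()

    def flush(self):
        # Send whatever is queued, including a final partial batch
        if self._pending:
            self.post_batch(self._pending)
            self._pending = []
```

With a batch size of 50, this turns 50 API calls (and 50 chances to pay authentication or network overhead) into one, and a `flush()` at the end of the run picks up the remainder.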

Token caching makes sense - we weren’t doing that. For batching, how do you handle the intercompany posting dependencies? Some transactions depend on others being posted first. Does batching break those dependency chains or is there a way to maintain transaction order?

You can maintain dependencies within batches by using ordered batch processing. The Data Management Framework in D365 cloud supports dependency sequencing in batch imports. Structure your batches with explicit ordering and the framework handles dependencies. This is actually more reliable than on-premise direct database calls because it’s built into the framework rather than custom code.
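One way to build those ordered batches yourself is a level-by-level topological sort: each batch contains only transactions whose prerequisites landed in an earlier batch. A sketch, where the transaction IDs and the `depends_on` map are illustrative:

```python
from collections import defaultdict


def sequence_batches(transactions, depends_on):
    """Group transactions into ordered batches so every transaction's
    dependencies are in an earlier batch (level-by-level topological sort)."""
    indegree = {t: 0 for t in transactions}
    dependents = defaultdict(list)
    for txn, deps in depends_on.items():
        for dep in deps:
            indegree[txn] += 1
            dependents[dep].append(txn)

    current = [t for t in transactions if indegree[t] == 0]
    batches = []
    while current:
        batches.append(current)
        nxt = []
        for t in current:
            for d in dependents[t]:
                indegree[d] -= 1
                if indegree[d] == 0:
                    nxt.append(d)
        current = nxt

    if sum(len(b) for b in batches) != len(transactions):
        raise ValueError("dependency cycle detected")
    return batches
```

Posting batch N only after batch N-1 has committed preserves the dependency chains while still amortizing the API overhead across each batch.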

For reconciliation automation with delayed posting, we implemented a status-checking mechanism. Instead of running reconciliation on a fixed schedule, we query transaction posting status via API and only run reconciliation when all expected transactions are posted. This eliminated the manual cleanup we were doing before. The API provides posting status endpoints that make this straightforward.
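That gate can be sketched as a simple polling loop; `get_status` stands in for whatever posting-status endpoint you expose, and the status string, polling interval, and timeout are illustrative:

```python
import time


def wait_for_postings(get_status, transaction_ids, poll_seconds=30, timeout_seconds=1800):
    """Polls posting status and returns once every expected transaction is
    posted, so reconciliation only starts against a complete data set."""
    deadline = time.time() + timeout_seconds
    remaining = set(transaction_ids)
    while remaining:
        # Drop transactions that have reached the posted state
        remaining = {t for t in remaining if get_status(t) != "Posted"}
        if not remaining:
            break
        if time.time() >= deadline:
            raise TimeoutError(f"{len(remaining)} transactions still unposted")
        time.sleep(poll_seconds)
    return True
```

The reconciliation job calls this first and only proceeds when it returns, replacing the fixed schedule (and the manual cleanup) with an explicit completeness check.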