Batch evaluate endpoint in decision API causes performance bottleneck

Our approval workflow integration is experiencing severe performance degradation when using the decision API’s batch evaluate endpoint. We’re sending batches of 200-300 decision requests at once, and the API timeout handling is inadequate.

The endpoint times out after processing about 150 requests, causing the entire batch to fail. We’ve tried reducing batch size to 100, but then processing takes too long overall. The decision logic itself is relatively simple - checking approval thresholds based on amount and department.

Request pattern:


POST /api/decisions/batch-evaluate
Payload: 250 decision contexts
Timeout: 60 seconds (configured)
Actual: Fails at ~45 seconds

Are there specific batch processing limits in Creatio 8.3 that we should be aware of? How do others handle high-volume decision evaluation in approval workflows?

We faced this exact issue. The problem isn’t just batch size - it’s how the decision engine loads context data. Each decision in the batch triggers database lookups for rules and conditions; with 250 requests, that’s potentially thousands of DB queries.

We solved it by pre-caching frequently used decision rules on the application server side and implementing our own batching logic with an optimal size of 50 requests per call.
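
A minimal sketch of those two ideas (rule cache plus client-side batching). `load_rule` is a stand-in for your own DB accessor, and the size of 50 is just the value that worked for us - neither is a Creatio API:

```python
from itertools import islice

RULE_CACHE: dict = {}  # pre-loaded rule sets, keyed by rule ID

def get_rule(rule_id, load_rule):
    """Return a cached rule set, hitting the database only on a miss."""
    if rule_id not in RULE_CACHE:
        RULE_CACHE[rule_id] = load_rule(rule_id)  # one DB lookup per rule, ever
    return RULE_CACHE[rule_id]

def chunked(contexts, size=50):
    """Yield the decision contexts in batches of `size` for separate API calls."""
    it = iter(contexts)
    while batch := list(islice(it, size)):
        yield batch
```

Warm the cache at application start, then send each chunk as its own batch-evaluate call instead of one oversized request.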

For high-volume scenarios, async is definitely the way to go. The decision API supports webhook callbacks for batch operations. You submit the batch, get an operation ID back immediately, and receive results via webhook when processing completes. This prevents timeout issues and allows the server to optimize processing internally.
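
The submit side of that flow can be sketched like this. `post(path, headers, payload)` stands in for whatever HTTP client you use; the endpoint and header names follow this thread and are not verified against Creatio’s API reference:

```python
def submit_batch_async(post, contexts, callback_url):
    """Submit a batch and return immediately with its operation ID.

    The server processes the batch in the background and later POSTs
    the results to `callback_url`.
    """
    response = post(
        "/api/decisions/batch-evaluate-async",       # hypothetical async endpoint
        headers={"X-Callback-URL": callback_url},    # where results get delivered
        payload={"decisions": contexts},
    )
    return response["operationId"]  # correlate this with the webhook POST later
```

Store the returned operation ID so your webhook handler can match incoming results to the batch you submitted.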

I’ve optimized this exact scenario for multiple clients. The issue combines several factors that need addressing systematically.

Batch Processing Limits: Creatio 8.3 has a hard limit of 1000 items per batch API call, but the practical limit for decision evaluation is much lower - around 75-100 decisions depending on rule complexity. This is because each decision context requires:

  • Rule set loading and compilation
  • Condition evaluation with potential database lookups
  • Result serialization

For your 200-300 request volume, here’s the optimal approach:

1. Implement Smart Batching


// Pseudocode - Optimal batch processing:
1. Split requests into batches of 75 items
2. Send batches with 2-second delay between calls
3. Use async/await pattern for parallel processing
4. Implement exponential backoff on rate limit errors
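
The four steps above could look like this (a sequential sketch for clarity - the step-3 parallelism is omitted, and `call_api` and `RateLimitError` are placeholders for your HTTP client and its 429 error, not Creatio types):

```python
import random
import time

class RateLimitError(Exception):
    """Placeholder for your HTTP client's rate-limit (HTTP 429) error."""

def evaluate_in_batches(call_api, contexts, batch_size=75, delay=2.0, max_retries=5):
    """Split into batches of 75, pause between calls, back off on rate limits."""
    results = []
    for start in range(0, len(contexts), batch_size):
        batch = contexts[start:start + batch_size]
        for attempt in range(max_retries):
            try:
                results.extend(call_api(batch))  # your batch-evaluate call
                break
            except RateLimitError:
                time.sleep(2 ** attempt + random.random())  # exponential backoff with jitter
        else:
            raise RuntimeError(f"batch at offset {start} failed after {max_retries} retries")
        time.sleep(delay)  # throttle between successful calls
    return results
```

The 2-second delay between calls matters more than it looks: it gives the decision engine time to release rule-compilation resources between batches.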

2. API Timeout Handling: The 60-second timeout is too aggressive for batch operations. Configure it differently:

  • Client-side timeout: 120 seconds
  • Server-side processing timeout: 90 seconds (set in web.config)
  • Connection timeout: 30 seconds

Add this to your API configuration:


<system.web>
  <httpRuntime executionTimeout="90"/>
</system.web>

3. Approval Workflow Integration: This is where most of the optimization happens. Instead of synchronously pre-evaluating the whole batch:

  • Use the async callback pattern mentioned earlier
  • Implement decision caching for frequently evaluated rules
  • Consider moving simple threshold checks (amount-based) to client-side validation
  • Use the decision API only for complex multi-criteria evaluations
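
The third bullet - handling simple threshold checks client-side - can be sketched like this. The `department` and `amount` fields come from the question; the `thresholds` map is a hypothetical stand-in for your approval rules:

```python
def split_by_complexity(contexts, thresholds):
    """Decide simple amount-threshold cases locally; send only the rest to the API.

    `thresholds` maps department -> auto-approve limit (an assumption; your
    actual rules may be structured differently).
    """
    local, remote = [], []
    for ctx in contexts:
        limit = thresholds.get(ctx["department"])
        if limit is not None and ctx["amount"] <= limit:
            ctx["decision"] = "approved"   # resolved client-side, no API call
            local.append(ctx)
        else:
            remote.append(ctx)             # needs full multi-criteria evaluation
    return local, remote
```

In practice this often removes the majority of contexts from the batch before it ever reaches the decision API.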

For your specific workflow, structure it as:


// Submit batch with callback
POST /api/decisions/batch-evaluate-async
Headers: X-Callback-URL: https://yourapp/decisions/callback
Payload: {
  "decisions": [...75 contexts...],
  "operationId": "unique-batch-id"
}

The server processes asynchronously and posts results back to your callback URL. This eliminates timeout issues entirely.
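
A minimal sketch of the receiving side, assuming the callback POST echoes the `operationId` and carries per-decision results in submission order (a hypothetical payload shape, patterned on the request above):

```python
PENDING: dict = {}  # operationId -> submitted contexts, recorded at submit time

def handle_callback(payload):
    """Webhook handler body: match results back to the batch we submitted."""
    op_id = payload["operationId"]
    contexts = PENDING.pop(op_id)  # KeyError on unknown or duplicate callbacks
    for ctx, result in zip(contexts, payload["results"]):
        ctx["decision"] = result
    return contexts
```

Popping the entry also makes duplicate webhook deliveries fail loudly instead of silently double-processing a batch.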

Additional Performance Optimizations:

  • Enable decision rule caching in system settings (DecisionService.EnableRuleCache = true)
  • If your approval logic is threshold-based, consider using the simpler /evaluate endpoint for individual decisions rather than batch
  • Implement result caching on your side - if the same decision context is evaluated multiple times, cache the result for 5-10 minutes
  • Monitor the DecisionEnginePerformance counter in Creatio logs to identify slow rules
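
The result-caching bullet above is straightforward to sketch - a small TTL cache keyed on a stable hash of the decision context (the 300-second default matches the 5-10 minute suggestion):

```python
import hashlib
import json
import time

class DecisionCache:
    """Cache evaluation results for identical contexts for a few minutes."""

    def __init__(self, ttl_seconds=300.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (inserted_at, result)

    @staticmethod
    def _key(context):
        # Stable hash: identical contexts hash identically regardless of key order
        return hashlib.sha256(json.dumps(context, sort_keys=True).encode()).hexdigest()

    def get(self, context):
        entry = self._store.get(self._key(context))
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]
        return None  # miss or expired

    def put(self, context, result):
        self._store[self._key(context)] = (time.monotonic(), result)
```

Check the cache before adding a context to a batch, and only submit the misses.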

For 200-300 decisions, the async approach with 75-item batches should complete in under 20 seconds total. The synchronous batch approach will always struggle at this volume due to how the decision engine processes rules sequentially within each batch.

One final consideration: review your decision rule complexity. If rules are doing complex database joins or external API calls, that’s where the real bottleneck is. Simplify rules where possible or pre-compute expensive lookups before submitting to the decision API.
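
Pre-computing an expensive lookup can be as simple as one bulk query before submission, with the value attached to every context so the rules never need a join or external call. `lookup_department_limits` here is a hypothetical bulk-fetch function, not a Creatio API:

```python
def enrich_contexts(contexts, lookup_department_limits):
    """Attach pre-computed department limits so decision rules stay lookup-free."""
    departments = {ctx["department"] for ctx in contexts}
    limits = lookup_department_limits(departments)  # one bulk query, not one per decision
    for ctx in contexts:
        ctx["departmentLimit"] = limits[ctx["department"]]
    return contexts
```

With the limit already on each context, the rule reduces to a pure comparison and evaluates in memory.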