We’re experiencing severe performance degradation with our lead distribution batch job in SAP CX 2105. The job runs every 15 minutes to assign incoming leads based on territory, product expertise, and current workload. Since we scaled to 5000+ daily leads, CPU usage spikes to 85-90% during execution and assignment completion takes 12-18 minutes instead of the expected 3-5 minutes.
Our lead assignment rules evaluate 8 criteria per lead including geographic territory matching, product line specialization, and rep capacity calculations. I suspect the batch job scheduling might not be optimized for this volume, or our rule complexity is causing performance bottlenecks. We haven’t done any CPU profiling yet to identify the exact bottleneck.
Has anyone dealt with similar lead distribution performance issues at scale? What’s the best approach to profile and optimize batch job execution in SAP CX?
I’ve seen this pattern before with complex assignment rules. First step is enabling performance logging for the batch job. In SAP CX administration, go to Job Scheduler settings and enable detailed execution metrics. This will show you time spent in each rule evaluation phase. My guess is your 8-criteria evaluation is running sequentially rather than being optimized. Also check if you’re loading all 5000 leads into memory at once versus processing in smaller batches.
I’ll provide a comprehensive optimization strategy addressing all three focus areas:
Batch Job Scheduling Optimization:
First, adjust the job interval to match lead volume patterns. Instead of a fixed 15-minute interval, use dynamic scheduling: longer intervals during peak hours (e.g. 30 minutes, so each run processes one larger, more efficient batch) and shorter intervals during low-volume periods (e.g. 10 minutes, to keep assignment latency low). Configure the batch size parameter to 750 leads per execution cycle; this balances memory usage against throughput. In the Job Scheduler configuration, set maxConcurrentExecutions=1 to prevent overlapping runs from spiking CPU.
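A minimal sketch of the dynamic-interval idea (all names and thresholds here are illustrative, not SAP CX APIs): pick the next run interval from the pending queue depth, and size each run by a fixed batch parameter.

```java
// Hypothetical sketch of dynamic scheduling. None of these names are
// SAP CX APIs; thresholds are made-up values for illustration.
public class DynamicInterval {
    static final int PEAK_THRESHOLD = 1500;     // pending leads => treat as peak
    static final int PEAK_INTERVAL_MIN = 30;    // larger, more efficient batches
    static final int OFFPEAK_INTERVAL_MIN = 10; // lower latency when quiet
    static final int BATCH_SIZE = 750;          // leads per execution cycle

    /** Next scheduling interval in minutes, given the pending queue depth. */
    static int nextIntervalMinutes(int pendingLeads) {
        return pendingLeads >= PEAK_THRESHOLD ? PEAK_INTERVAL_MIN : OFFPEAK_INTERVAL_MIN;
    }

    /** Number of job runs needed to drain the queue at the fixed batch size. */
    static int runsToDrain(int pendingLeads) {
        return (pendingLeads + BATCH_SIZE - 1) / BATCH_SIZE;
    }
}
```

The point of the ceiling division in `runsToDrain` is sizing: at 750 leads per cycle, a 5000-lead backlog clears in 7 runs rather than one memory-heavy run.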
CPU Profiling Implementation:
Enable the SAP CX Performance Monitor module and configure method-level profiling for your lead distribution job. Key metrics to track: rule evaluation time per lead, database query execution time, and memory allocation patterns. In setups like yours, profiling typically shows database query time dominating; if the bulk of execution time lands in DB queries, that is the primary bottleneck to attack first. Use the Thread Dump Analyzer tool during peak execution to identify CPU-intensive operations, and set up automated profiling alerts for when CPU exceeds 70% during job execution.
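If you want rough per-phase numbers before the full profiling tooling is wired up, a tiny wall-clock timer around each job phase is enough to see where the time goes. This is a generic Java sketch, not an SAP CX API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical per-phase timer for a batch job: wrap each phase
// (DB load, rule evaluation, assignment write-back) and compare shares.
public class PhaseTimer {
    private final Map<String, Long> totals = new LinkedHashMap<>();

    /** Run one phase, accumulating its wall-clock time under a label. */
    public void time(String phase, Runnable work) {
        long start = System.nanoTime();
        work.run();
        totals.merge(phase, System.nanoTime() - start, Long::sum);
    }

    /** Fraction of total recorded time spent in one phase (0.0 to 1.0). */
    public double share(String phase) {
        long total = totals.values().stream().mapToLong(Long::longValue).sum();
        return total == 0 ? 0.0 : (double) totals.getOrDefault(phase, 0L) / total;
    }
}
```

Logging `share("dbQuery")` per run gives you the "X% of time in DB queries" figure to decide whether data loading or rule logic deserves the first optimization pass.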
Lead Assignment Rule Complexity Reduction:
Restructure your 8-criteria evaluation into a two-tier approach:
Tier 1 (Fast Path - 60% of leads): Pre-filter using indexed fields - territory code and primary product line. These leads get assigned in 30-40 seconds using cached lookup tables loaded at job start.
Tier 2 (Detailed Path - 40% of leads): Full 8-criteria evaluation for complex cases. Optimize by:
- Consolidating the territory and product queries into a single JOIN
- Caching rep capacity calculations for the duration of the job run (refreshed every 30 minutes)
- Implementing rule short-circuiting: evaluate the 3 cheapest criteria first, and if any of them fails, skip the remaining 5
- Using the bulk data loading API to fetch all rep availability data upfront
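The short-circuiting point above can be sketched as follows (criteria, records, and method names are illustrative, not SAP CX objects): order criteria cheapest-first and stop at the first failure, so expensive checks only run for leads that survive the cheap ones.

```java
import java.util.List;
import java.util.function.Predicate;

// Illustrative short-circuit rule evaluation. Criteria are ordered
// cheapest-first; evaluation stops at the first failing criterion.
public class ShortCircuitRules {
    record Lead(String territory, String productLine, int score) {}

    /** True only if every criterion passes; stops at the first failure. */
    static boolean matches(Lead lead, List<Predicate<Lead>> criteria) {
        for (Predicate<Lead> c : criteria) {
            if (!c.test(lead)) return false; // skip all remaining criteria
        }
        return true;
    }
}
```

Because evaluation stops at the first failure, a lead that misses on territory never pays for the capacity calculation at the end of the chain.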
Create composite indexes on the lead.territory_code, lead.product_line, and rep.specialization fields. In a similar deployment, these changes cut execution time from 18 minutes to 4.5 minutes for 5000 leads.
Additional Optimizations:
Implement a lead priority queue - hot leads (high value, time-sensitive) process first with simplified rules. Configure job thread pool size to match your server cores (typically 4-8 threads). Monitor the Job Execution History dashboard weekly to track improvement trends and identify regression.
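The hot-lead queue can be sketched with a standard `java.util.PriorityQueue` (the record, fields, and weighting are made up for illustration; they are not SAP CX types):

```java
import java.util.Comparator;
import java.util.PriorityQueue;

// Illustrative hot-lead priority queue: high-value, time-sensitive leads
// are dequeued first, so they are assigned before routine leads.
public class LeadQueue {
    record QueuedLead(String id, int value, boolean timeSensitive) {
        /** Higher score = processed earlier. Weights are arbitrary here. */
        int priority() { return value + (timeSensitive ? 10_000 : 0); }
    }

    private final PriorityQueue<QueuedLead> queue =
        new PriorityQueue<>(Comparator.comparingInt(QueuedLead::priority).reversed());

    void add(QueuedLead lead) { queue.add(lead); }
    QueuedLead next() { return queue.poll(); }
}
```

In practice the dequeue order is what matters: the job drains hot leads through the simplified fast path first, then falls back to the detailed evaluation for the rest of the batch.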
Expected results: CPU usage should drop to 45-55% range, execution time to 4-6 minutes for 5000 leads. The key is moving from per-lead database queries to bulk data loading with in-memory rule evaluation.
Batch queries are the way to go. Instead of evaluating rules per lead, restructure to load all relevant territory and rep data once, then perform rule matching in memory. SAP CX supports bulk data loading APIs. Also consider if all 8 criteria need real-time evaluation - some might be cacheable for the duration of the job run. The product specialization and territory data probably don’t change during a 15-minute window.
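A rough sketch of that restructuring, with plain Java collections standing in for the bulk-loaded data (the class, record, and map shape are illustrative, not SAP CX APIs):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative "load once, match in memory" restructuring: instead of
// querying the database per lead, fetch territory->rep data in one bulk
// read at job start and resolve every lead against the in-memory map.
public class BulkAssigner {
    record Lead(String id, String territory) {}

    /** One bulk query result: territory code -> available rep id. */
    private final Map<String, String> repByTerritory;

    BulkAssigner(Map<String, String> bulkLoadedData) {
        this.repByTerritory = new HashMap<>(bulkLoadedData); // cached for the job run
    }

    /** Assign every lead via in-memory lookups; no per-lead queries. */
    Map<String, String> assignAll(List<Lead> leads) {
        Map<String, String> assignments = new HashMap<>();
        for (Lead lead : leads) {
            String rep = repByTerritory.get(lead.territory());
            if (rep != null) assignments.put(lead.id(), rep);
        }
        return assignments;
    }
}
```

The design choice is that the number of database round trips becomes constant per job run instead of scaling with lead count, which is exactly where a per-lead query pattern falls over at 5000+ leads.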
Don’t forget to review your lead assignment rule complexity score in the system. SAP CX has built-in complexity analysis tools. If your rules are rated high complexity, consider splitting them into fast-track and detailed evaluation paths. Simple leads (clear territory match, obvious product fit) can be assigned quickly, while complex cases go through full 8-criteria evaluation. This reduces average processing time significantly.
Good points. I enabled the performance logging and you’re right - we’re loading all leads at once. The log shows 78% of execution time is spent in database queries for territory matching and rep availability checks. Each of the 8 criteria triggers separate queries. We definitely need to optimize the data loading strategy. Should I look into caching the territory and rep data, or is there a way to make the rule evaluation more efficient with batch queries?