Let me provide a detailed breakdown of our implementation covering the three key components: decision tables, rules engine integration, and audit logging.
Decision Tables Architecture
We designed a four-stage decision table cascade that evaluates loan applications progressively:
Stage 1 - Credit Score Evaluation: This initial table categorizes applicants into risk tiers (Excellent: 750+, Good: 700-749, Fair: 650-699, Poor: <650). Each tier has different approval thresholds for the subsequent stages. This prevents low-credit applicants from proceeding through unnecessary evaluations.
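As a rough illustration, the Stage 1 tiering could be sketched like this. The tier names and score cutoffs come from the description above; the function name and everything else is an assumption, not the actual AgilePoint table definition.

```python
def credit_tier(score: int) -> str:
    """Map a credit score to the Stage 1 risk tier (illustrative only)."""
    if score >= 750:
        return "Excellent"
    if score >= 700:
        return "Good"
    if score >= 650:
        return "Fair"
    return "Poor"  # Poor-tier applicants stop here and skip later stages
```

In the real table this logic lives as rows in the decision-table designer rather than code, which is what makes it versionable and editable by non-developers.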
Stage 2 - Income Verification: The table cross-references stated income with employment history and tax return data. It calculates debt-to-income ratio and flags any discrepancies. We set maximum DTI thresholds at 43% for excellent credit, 38% for good credit, and 33% for fair credit.
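A minimal sketch of the Stage 2 DTI check, assuming the tier labels from Stage 1. The 43/38/33% thresholds come from the description above; Poor-tier applicants never reach this stage, so the table omits them. Function and variable names are hypothetical.

```python
# Maximum debt-to-income ratio per credit tier (from the post).
MAX_DTI = {"Excellent": 0.43, "Good": 0.38, "Fair": 0.33}

def passes_dti(tier: str, monthly_debt: float, monthly_income: float) -> bool:
    """Return True if the applicant's DTI is within the tier's ceiling."""
    dti = monthly_debt / monthly_income
    return dti <= MAX_DTI[tier]
```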
Stage 3 - Debt Assessment: This table analyzes existing debt obligations, payment history on current accounts, and calculates available disposable income. It assigns a debt burden score from 1-100, where higher scores indicate lower risk.
Stage 4 - Final Approval Decision: The final table combines scores from stages 1-3 with loan-specific factors (amount requested, loan term, collateral value for secured loans). It outputs one of four decisions: Auto-Approve, Manual Review Required, Auto-Decline, or Conditional Approval with modified terms.
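To make the cascade concrete, here is one way Stage 4 might combine the earlier results. The four outcome labels come from the description above; the weighting, the 60/80 cutoffs, and the inputs are illustrative assumptions, not the production rules.

```python
def final_decision(tier_ok: bool, dti_ok: bool, debt_score: int) -> str:
    """Illustrative Stage 4 combination of earlier stage results.

    debt_score is the Stage 3 burden score (1-100, higher = lower risk).
    """
    if not tier_ok:
        return "Auto-Decline"
    if dti_ok and debt_score >= 80:
        return "Auto-Approve"
    if dti_ok and debt_score >= 60:
        return "Conditional Approval"   # e.g. with modified terms
    return "Manual Review Required"
```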
Each decision table is maintained in AgilePoint’s Decision Table designer with version control. When we update lending criteria, we create a new version and can roll back if needed. This versioning is crucial for audit trails - we can show exactly which version of the rules was active when a particular loan was evaluated.
Rules Engine Integration
The rules engine connects to multiple data sources through AgilePoint’s Integration Hub. For credit bureau integration, we used REST API connectors configured with OAuth authentication for Experian, TransUnion, and Equifax. The workflow pulls credit reports in parallel to minimize latency, then normalizes the data into a standard format before feeding it to the decision tables.
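The parallel pull described above can be sketched with a thread pool. In production the calls go through AgilePoint's Integration Hub connectors with OAuth; `fetch_report` here is a placeholder stub, and the returned fields are assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

BUREAUS = ["Experian", "TransUnion", "Equifax"]

def fetch_report(bureau: str) -> dict:
    # Placeholder for the OAuth-authenticated REST call to the bureau.
    return {"bureau": bureau, "score": 720}

def pull_all_reports() -> list[dict]:
    # Issue the three calls concurrently so total latency is roughly
    # the slowest single bureau, not the sum of all three.
    with ThreadPoolExecutor(max_workers=len(BUREAUS)) as pool:
        return list(pool.map(fetch_report, BUREAUS))
```

The normalization step would then map each bureau's response into the common schema the decision tables expect.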
We implemented a caching mechanism to avoid redundant credit bureau calls. If an applicant submits multiple loan applications within 30 days, we reuse the cached credit report (with applicant consent). This reduces costs and improves processing speed.
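The 30-day reuse rule can be sketched as a simple keyed cache. Keying by applicant ID and the consent flag handling are simplified assumptions; the real cache presumably lives in shared storage rather than process memory.

```python
from datetime import datetime, timedelta, timezone

CACHE_TTL = timedelta(days=30)
_cache: dict[str, tuple[datetime, dict]] = {}

def get_report(applicant_id: str, fetch, consented: bool) -> dict:
    """Return a cached credit report if it is under 30 days old and the
    applicant consented to reuse; otherwise make a fresh bureau call."""
    now = datetime.now(timezone.utc)
    entry = _cache.get(applicant_id)
    if consented and entry and now - entry[0] < CACHE_TTL:
        return entry[1]               # reuse the cached report
    report = fetch(applicant_id)      # fresh (billable) bureau call
    _cache[applicant_id] = (now, report)
    return report
```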
The rules engine also integrates with our core banking system to verify existing customer relationships, account balances, and transaction history. Existing customers with good standing get preferential scoring in the decision tables.
Audit Logging and Compliance
This was our most critical requirement given regulatory scrutiny. We configured comprehensive audit logging at multiple levels:
Decision-Level Logging: Every decision table evaluation logs the input parameters, matched rules, intermediate scores, and final output. For example, if an applicant is declined, the log shows exactly which rule caused the decline (e.g., DTI exceeded 43%, or credit score below minimum threshold).
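The shape of such a log entry might look like the following; the field names are assumptions, but the content mirrors what the description above says gets captured.

```python
import json
from datetime import datetime, timezone

def log_decision(inputs: dict, matched_rules: list[str],
                 scores: dict, outcome: str) -> str:
    """Serialize one decision-table evaluation as a JSON log entry."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "inputs": inputs,                # parameters fed to the table
        "matched_rules": matched_rules,  # e.g. ["DTI exceeded 0.43"]
        "scores": scores,                # intermediate stage scores
        "outcome": outcome,              # final table output
    }
    return json.dumps(entry)
```

Recording the matched rules, not just the outcome, is what lets you answer "which specific rule caused this decline" months later.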
Data-Source Logging: All external data calls (credit bureaus, employment verification, banking data) are logged with timestamps, response times, and data returned. This helps diagnose any integration issues and proves due diligence in data gathering.
User-Action Logging: When manual reviewers override automated decisions, we capture the reviewer identity, timestamp, override reason (from a predefined list), and supporting documentation. This creates an unbroken audit chain from application to final decision.
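An override record along these lines would capture the fields listed above. The reason codes and field names here are illustrative assumptions; the point is that free-text reasons are rejected in favor of a predefined list.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative predefined reason codes; the real list is business-defined.
OVERRIDE_REASONS = {"compensating_assets", "income_documentation", "bank_error"}

@dataclass(frozen=True)
class OverrideRecord:
    reviewer_id: str
    original_decision: str
    new_decision: str
    reason: str                   # must come from the predefined list
    documents: tuple[str, ...]    # supporting documentation references
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def __post_init__(self):
        if self.reason not in OVERRIDE_REASONS:
            raise ValueError(f"unknown override reason: {self.reason}")
```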
Compliance Reporting: We built custom reports that aggregate decision data for fair lending analysis. These reports can show approval rates by demographic groups, average processing times, override frequencies, and other metrics regulators request during examinations.
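A toy version of the fair-lending aggregation, computing approval rate per group from decision-log entries. The field names and the choice of which outcomes count as approvals are assumptions for illustration.

```python
from collections import defaultdict

# Outcomes treated as approvals for this sketch (an assumption).
APPROVED = {"Auto-Approve", "Conditional Approval"}

def approval_rates(decisions: list[dict]) -> dict[str, float]:
    """Return approval rate per demographic group."""
    totals: dict[str, int] = defaultdict(int)
    approvals: dict[str, int] = defaultdict(int)
    for d in decisions:
        g = d["group"]
        totals[g] += 1
        if d["outcome"] in APPROVED:
            approvals[g] += 1
    return {g: approvals[g] / totals[g] for g in totals}
```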
The audit logs are stored in a separate compliance database with read-only access for most users. Only compliance officers and auditors can query the full logs. Retention is set to 7 years to meet regulatory requirements.
Results and Lessons Learned
After six months in production, we're processing 300-400 loan applications daily with a 70% automated approval rate. Processing time dropped from 3-5 days to 8 minutes for auto-approved loans. Manual review cases still take 24-48 hours, but reviewers now have all the data and preliminary analysis ready, making their work more efficient.

Key lessons: Start with conservative approval rules and gradually loosen them as you gain confidence. We initially auto-approved only the most obvious cases (excellent credit, low DTI, existing customers) and expanded from there. Also, invest heavily in testing with historical data before going live. We ran our decision tables against 10,000 historical applications to validate that automated decisions aligned with our previous manual decisions.
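The historical-replay validation mentioned above amounts to running the automated pipeline over past applications and measuring agreement with the original manual decisions. A sketch, where `evaluate` is a placeholder for the full decision-table cascade:

```python
def backtest(applications: list[dict], evaluate) -> float:
    """Fraction of historical cases where the automated decision
    matched the original manual decision."""
    matches = sum(
        1 for app in applications
        if evaluate(app) == app["manual_decision"]
    )
    return matches / len(applications)
```

Disagreements flagged by a run like this are exactly the cases worth reviewing by hand before loosening the auto-approval rules.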
The ROI has been substantial - we’ve reduced loan processing staff by 40% while handling 50% more application volume. Customer satisfaction improved significantly due to faster decisions, and we’ve seen no increase in default rates compared to manual underwriting.