Automated test case baseline sync between ELM test-mgmt and MBSE tools reduced manual effort by 65%

We implemented an automated baseline synchronization system between ELM test management and our MBSE toolchain that eliminated most manual reconciliation work. Previously, our test team spent 15-20 hours per sprint manually comparing baselines and resolving conflicts between test cases and system model updates.

The solution uses OSLC REST APIs to monitor baseline delivery events in both systems. When a new baseline is created in either ELM or the MBSE tool, our automation detects changes, performs conflict resolution using predefined rules, and synchronizes test case versions automatically. We maintain a complete audit trail of all sync operations for compliance requirements.

Key results: 65% reduction in manual sync effort (from 18 hours to 6 hours per sprint), zero baseline mismatch incidents in the last 6 months, and improved traceability between system requirements and test coverage. The MBSE integration was particularly challenging but worth the effort.

Our conflict resolution uses a priority-based system. System model changes (from MBSE) take precedence for requirement traceability links. Test case content changes (from ELM) take precedence for test steps and expected results. For true conflicts where both sides modified the same attribute, we flag for manual review but still create a merged baseline with both versions documented. About 8% of syncs require manual review.

We use event subscriptions rather than polling. ELM 7.0.3 supports OSLC change events for baseline operations. Our middleware subscribes to these events and triggers the sync workflow. The MBSE tool side required custom adapter development since not all MBSE tools support OSLC events natively. We poll the MBSE API every 5 minutes as a fallback.
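The polling fallback can be sketched as a simple diff of baseline IDs between consecutive polls. This is an illustrative sketch, not our actual middleware code; `fetch_baseline_ids` stands in for whatever the MBSE tool's API exposes:

```python
# Hypothetical sketch of the 5-minute polling fallback for MBSE tools
# that lack native OSLC change events. Names are illustrative.

def detect_new_baselines(previous_ids, current_ids):
    """Return baseline IDs seen in the latest poll but not the previous one."""
    return sorted(set(current_ids) - set(previous_ids))

def poll_once(fetch_baseline_ids, known_ids, on_new_baseline):
    """One polling cycle: fetch, diff, dispatch, return the updated known set."""
    current = fetch_baseline_ids()
    for baseline_id in detect_new_baselines(known_ids, current):
        on_new_baseline(baseline_id)  # triggers the sync workflow
    return set(current)
```

In the real deployment the event subscription is primary and this loop only catches anything the events missed.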

What’s your approach to conflict resolution? When test cases and system models diverge, how does your automation decide which version to keep? We’ve tried rule-based conflict resolution before but found too many edge cases that required manual intervention anyway.

I’m curious about the MBSE integration challenges you mentioned. We’re evaluating similar integration but concerned about vendor lock-in. Did you build a generic OSLC adapter or is it specific to your MBSE tool? Can the solution be extended to other tools without major rework?

Let me provide a comprehensive overview of our automated baseline sync implementation:

OSLC Automation Architecture

We built a middleware service that acts as an OSLC broker between ELM test management and our MBSE platform. The architecture has three main components:

  1. Event Listener Service: Subscribes to ELM baseline delivery notifications using OSLC change events
  2. Conflict Resolution Engine: Applies rules to merge divergent baselines
  3. Audit Service: Logs all operations for compliance reporting

For ELM integration, we use the OSLC Query API to detect baseline changes:

GET /qm/oslc_qm/contexts/{project}/baselines
Accept: application/json
OSLC-Core-Version: 2.0

When a baseline event occurs, we fetch the change set and compare with the MBSE tool’s current baseline state.
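The query above can be issued from middleware with standard HTTP tooling. A minimal sketch using Python's `urllib`, assuming a hypothetical base URL; the path and headers mirror the request shown above:

```python
import urllib.request

# OSLC headers from the request shown above.
OSLC_HEADERS = {
    "Accept": "application/json",
    "OSLC-Core-Version": "2.0",
}

def build_baseline_query(base_url, project):
    """Build the OSLC baseline query request for a QM project context."""
    url = f"{base_url}/qm/oslc_qm/contexts/{project}/baselines"
    return urllib.request.Request(url, headers=OSLC_HEADERS, method="GET")
```

A real client would add authentication (e.g. OAuth or form login for ELM) before sending the request with `urllib.request.urlopen`.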

Baseline Delivery Workflow

The sync process follows this sequence:

  1. Detect baseline creation/update event in either system
  2. Retrieve full baseline metadata including test case versions and requirement links
  3. Query corresponding baseline in the other system
  4. Identify differences using content hash comparison
  5. Apply conflict resolution rules
  6. Create merged baseline in both systems
  7. Update traceability links
  8. Generate audit record
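Step 4, the content-hash comparison, can be sketched as follows. This is a simplified illustration, assuming each baseline is represented as a mapping of artifact IDs to attribute dicts (not our production schema):

```python
import hashlib
import json

def content_hash(artifact):
    """Stable hash of an artifact's synchronized attributes."""
    canonical = json.dumps(artifact, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def diff_baselines(elm_artifacts, mbse_artifacts):
    """Identify added, removed, and changed artifacts between two baselines.

    Each argument maps artifact ID -> attribute dict.
    """
    added = sorted(set(mbse_artifacts) - set(elm_artifacts))
    removed = sorted(set(elm_artifacts) - set(mbse_artifacts))
    changed = sorted(
        aid for aid in set(elm_artifacts) & set(mbse_artifacts)
        if content_hash(elm_artifacts[aid]) != content_hash(mbse_artifacts[aid])
    )
    return {"added": added, "removed": removed, "changed": changed}
```

Hashing a canonical JSON form keeps the comparison stable regardless of attribute ordering in either tool's API response.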

Conflict Resolution Strategy

Our rules prioritize based on artifact type:

  • Requirement traceability: MBSE system is source of truth
  • Test case content: ELM is authoritative
  • Metadata (status, owner, dates): Most recent timestamp wins
  • Custom attributes: Flagged for manual review

For complex conflicts, we create a “conflict baseline” that preserves both versions and notifies the responsible team. This happens in about 8% of syncs, down from 100% manual review before automation.
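The priority table above maps naturally onto a small rules function. A hedged sketch, with attribute categories as illustrative stand-ins for the real schema; values carry a timestamp so the "most recent wins" rule can be applied:

```python
# Priority rules from the strategy above; category names are illustrative.
RULES = {
    "traceability": "mbse",   # MBSE system is source of truth for links
    "test_content": "elm",    # ELM is authoritative for steps/results
    "metadata": "latest",     # most recent timestamp wins
}

def resolve(category, elm_value, mbse_value):
    """Return (winner, needs_review) for one conflicting attribute.

    Each value is a (payload, timestamp) pair.
    """
    rule = RULES.get(category)
    if rule == "mbse":
        return mbse_value[0], False
    if rule == "elm":
        return elm_value[0], False
    if rule == "latest":
        return max(elm_value, mbse_value, key=lambda v: v[1])[0], False
    # Custom attributes: preserve both versions and flag for manual review,
    # mirroring the "conflict baseline" behavior described above.
    return {"elm": elm_value[0], "mbse": mbse_value[0]}, True
```

In our system the flagged cases account for the roughly 8% of syncs that still need a human decision.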

Audit Trail Implementation

Every sync operation generates a compliance record:

  • Timestamp and triggering event
  • Source and target baseline IDs
  • List of synchronized artifacts with version numbers
  • Conflict resolution decisions applied
  • Validation status (auto-approved or requires manual review)
  • Digital signature of the sync operation

These records are stored in a separate compliance database and can generate reports for audits. We export monthly compliance summaries showing all baseline changes, sync operations, and any manual interventions.
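The record structure can be sketched like this. For brevity the "digital signature" here is an HMAC over the canonical record; a regulated deployment might instead use asymmetric signatures with managed keys:

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

def make_audit_record(event, source_id, target_id, artifacts, decisions,
                      status, signing_key):
    """Build one compliance record for a sync operation and sign it.

    artifacts: list of [artifact_id, version] pairs.
    status: "auto-approved" or "manual-review".
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": event,
        "source_baseline": source_id,
        "target_baseline": target_id,
        "artifacts": artifacts,
        "decisions": decisions,
        "status": status,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(signing_key, payload,
                                   hashlib.sha256).hexdigest()
    return record
```

Signing the canonical JSON of the record (before the signature field is added) lets auditors re-verify any record in the compliance database.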

MBSE Integration Challenges

The MBSE integration was indeed the hardest part. Our MBSE tool doesn’t natively support OSLC, so we built a custom adapter that:

  • Translates MBSE model versions to OSLC baseline concepts
  • Maps system requirements to ELM test case traceability links
  • Exposes an OSLC provider interface for the broker to consume

We designed the adapter using a plugin architecture, so it can theoretically support multiple MBSE tools. However, each tool requires specific mapping logic. The adapter is about 60% generic OSLC handling and 40% tool-specific translation.
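The plugin split can be sketched as a generic interface plus per-tool implementations. The interface below is illustrative, not our actual adapter API; `ExampleToolAdapter` stands in for one tool-specific plugin:

```python
from abc import ABC, abstractmethod

class MBSEAdapter(ABC):
    """Generic side of the adapter: the broker only sees this interface."""

    @abstractmethod
    def list_baselines(self):
        """Translate native model versions into OSLC-style baseline dicts."""

    @abstractmethod
    def trace_links(self, baseline_id):
        """Map system requirements in a baseline to ELM test case URIs."""

class ExampleToolAdapter(MBSEAdapter):
    """Illustrative tool-specific plugin for a hypothetical MBSE API."""

    def __init__(self, native_versions):
        self._versions = native_versions

    def list_baselines(self):
        return [{"id": f"BL-{v}", "native_version": v}
                for v in self._versions]

    def trace_links(self, baseline_id):
        return {}  # tool-specific mapping logic would go here
```

Supporting another MBSE tool then means writing one new subclass rather than touching the broker, which is where the roughly 60/40 generic-to-specific split shows up.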

Quantified Benefits

Before automation:

  • 18 hours per sprint manually comparing baselines
  • 12-15 baseline mismatch incidents per quarter
  • 3-day average lag between ELM and MBSE baseline updates

After automation:

  • 6 hours per sprint (only for the 8% requiring manual review)
  • Zero mismatch incidents in 6 months
  • Real-time baseline synchronization (avg 3 minutes)
  • 65% reduction in manual effort
  • Improved test coverage traceability from 78% to 96%

Implementation Recommendations

If you’re considering similar automation:

  1. Start with read-only sync to validate that your OSLC queries work correctly
  2. Build comprehensive logging before attempting write operations
  3. Test conflict resolution rules with historical baseline data
  4. Implement a “dry-run” mode that shows what would be synced without making changes
  5. Plan for about 3-4 weeks of development plus 2 weeks of validation testing
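The dry-run mode from point 4 can be sketched as a flag that plans changes without applying them. A minimal illustration, assuming the diff format `{"added": [...], "removed": [...], "changed": [...]}` and a caller-supplied `apply_change` function:

```python
def sync(diff, apply_change, dry_run=True):
    """Apply a baseline diff, or just report what would be applied.

    diff: {"added": [...], "removed": [...], "changed": [...]}.
    apply_change: callable invoked per planned entry when not a dry run.
    """
    planned = [f"{op}:{aid}"
               for op in ("added", "removed", "changed")
               for aid in diff[op]]
    if dry_run:
        return {"dry_run": True, "planned": planned, "applied": []}
    applied = [apply_change(entry) for entry in planned]
    return {"dry_run": False, "planned": planned, "applied": applied}
```

Defaulting `dry_run` to true is a deliberate safety choice: write operations against two systems of record should be opt-in.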

The investment was significant but paid off within two sprints. The key success factor was getting buy-in from both test management and systems engineering teams on the conflict resolution rules before building the automation.

How do you handle the audit trail requirement? We’re in a regulated industry and need to prove that every baseline sync was validated and approved. Does your system generate compliance reports automatically?