End-to-end requirements traceability in codebeamer cb-24 CI/CD pipeline

We successfully implemented full requirements-to-deployment traceability in our codebeamer cb-24 environment integrated with our CI/CD pipeline. This was driven by audit compliance requirements and the need to identify coverage gaps across the development lifecycle.

Our solution automatically links requirements → user stories → code commits → test cases → test executions → builds → deployments using cb-24’s enhanced traceability matrix and REST API integration. Every production deployment now includes a traceability report showing which requirements are covered and tested.

The implementation took about 6 weeks and involved custom scripts, Jenkins pipeline modifications, and codebeamer workflow automation. Happy to share details about the architecture and lessons learned.

Here’s the complete implementation architecture for end-to-end requirements traceability in cb-24:

System Architecture Overview

Our solution integrates codebeamer cb-24 with Git, Jenkins, automated testing frameworks, and deployment systems to create an unbroken traceability chain from requirements through production deployment.

1. Requirements and User Story Management

All requirements are managed in codebeamer with unique IDs (REQ-XXXX format). User stories link to parent requirements using cb-24’s native association features. Each requirement includes:

  • Acceptance criteria defining verification approach
  • Risk level (determines required test coverage depth)
  • Compliance tags (FDA, ISO, SOC2, etc.)
  • Target release version

2. Code Commit Traceability

Developers reference requirements in commit messages using a standardized format:


[REQ-1234] Implement user authentication

Added OAuth2 integration for single sign-on
Updated security configuration

Git hooks (server-side) parse commit messages and automatically create traceability links via the cb-24 REST API. The hook script validates that referenced requirements exist and are in an appropriate workflow state (approved or in development).
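For anyone curious about the hook internals, the parsing and validation logic boils down to something like the sketch below. This is simplified: in the real hook, `lookup_state` would wrap a GET against the cb-24 REST API, and the allowed-state names depend on your workflow configuration.

```python
import re

# Requirement references in commit messages, e.g. [REQ-1234]
REQ_PATTERN = re.compile(r"\[REQ-(\d+)\]")

# Workflow states in which a requirement may legally receive new commits
# (state names here are illustrative, taken from our workflow).
ALLOWED_STATES = {"approved", "in-development"}

def extract_requirement_ids(commit_message):
    """Return all requirement IDs referenced in a commit message."""
    return ["REQ-" + num for num in REQ_PATTERN.findall(commit_message)]

def validate_commit(commit_message, lookup_state):
    """Reject commits that reference no requirement, or that reference
    requirements in the wrong workflow state. `lookup_state` is a stub
    for the REST API lookup."""
    ids = extract_requirement_ids(commit_message)
    if not ids:
        return False, "commit message contains no [REQ-XXXX] reference"
    for req_id in ids:
        state = lookup_state(req_id)
        if state not in ALLOWED_STATES:
            return False, f"{req_id} is in state '{state}'"
    return True, "ok"
```

The server-side hook (pre-receive in our case) runs this over every pushed commit and rejects the push with the returned message if validation fails.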

3. Automated Test Linkage

Test cases in codebeamer are linked to requirements during test design. Our test automation framework includes metadata mapping:

# Pseudocode - test automation pattern:
# 1. Test class is annotated with @RequirementId("REQ-1234")
# 2. Test execution framework captures the requirement metadata
# 3. After the test run, results are POSTed to the cb-24 API
# 4. Payload includes build number, environment, and timestamp
# 5. cb-24 creates a test execution record linked to the requirement
# See the codebeamer Test Management API documentation, Section 5.3

This creates automatic linkage: Requirement → Test Case → Test Execution → Build.
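The payload assembly behind step 4 of the pattern above looks roughly like this in our Python helper. The field names are illustrative, not the exact cb-24 Test Management API schema; check the API documentation for the real contract.

```python
from datetime import datetime, timezone

def build_test_run_payload(requirement_id, test_name, passed,
                           build_number, environment):
    """Assemble a test-execution record for POSTing to the cb-24 API.

    Field names are illustrative placeholders; the actual schema comes
    from the codebeamer Test Management API.
    """
    return {
        "requirement": requirement_id,
        "testCase": test_name,
        "result": "PASSED" if passed else "FAILED",
        "build": build_number,
        "environment": environment,
        "executedAt": datetime.now(timezone.utc).isoformat(),
    }
```

The adapter layer mentioned later in this post calls a helper like this once per test result, then batches the payloads into a single POST to keep API traffic manageable on large suites.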

4. CI/CD Pipeline Integration

Jenkins pipeline stages interact with codebeamer at key points:

Build Stage:

  • Extract requirement IDs from commits included in build
  • Query cb-24 to verify requirements are approved and testable
  • Create build artifact record in codebeamer with requirement associations

Test Stage:

  • Execute automated test suites (unit, integration, E2E)
  • POST test results to cb-24 test execution API
  • Link results to build and deployment target environment

Deployment Stage:

  • Generate pre-deployment traceability report
  • Verify all requirements have passing tests
  • Create deployment record in codebeamer with full traceability chain
  • Block deployment if coverage gaps exist (configurable by environment)
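The deployment-stage gate reduces to a simple coverage check. Here is a minimal sketch; the real implementation pulls the requirement list and passing test results from cb-24 via the REST API rather than taking plain Python collections.

```python
def coverage_gaps(requirements, passing_results):
    """Return (coverage_ratio, uncovered), where `uncovered` lists
    requirements without a passing test in this build."""
    uncovered = [r for r in requirements if r not in passing_results]
    covered = len(requirements) - len(uncovered)
    ratio = covered / len(requirements) if requirements else 1.0
    return ratio, uncovered

def may_deploy(requirements, passing_results, threshold=1.0):
    """Deployment gate: proceed only when coverage meets the threshold
    configured for the target environment (1.0 for production)."""
    ratio, _ = coverage_gaps(requirements, passing_results)
    return ratio >= threshold
```

Production uses the default threshold of 1.0; lower environments run the same check with a relaxed threshold, which is what "configurable by environment" means above.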

5. Traceability Report Generation

We built custom reporting using cb-24’s REST API and traceability matrix features. The report includes:

  • Requirements Coverage: Percentage of requirements with linked code commits
  • Test Coverage: Percentage of requirements with associated test cases
  • Verification Status: Requirements with passing vs. failing tests
  • Deployment History: When each requirement was deployed to each environment
  • Gap Analysis: Requirements without adequate verification or testing
  • Change Impact: Requirements modified since last release

Reports are generated automatically for each deployment and stored in codebeamer as release documentation.
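The core of the report generation is aggregating link data already fetched from the API. A simplified version of the metric computation, with hypothetical inputs (sets of requirement IDs that have at least one linked commit, test case, or passing execution):

```python
def coverage_metrics(requirements, commit_links, test_links, passing):
    """Compute the headline percentages for the traceability report.

    `commit_links`, `test_links`, and `passing` are sets of requirement
    IDs that have at least one linked commit, test case, or passing
    test execution respectively.
    """
    total = len(requirements)

    def pct(count):
        return round(100.0 * count / total, 1) if total else 0.0

    return {
        "requirements_coverage_pct": pct(sum(1 for r in requirements
                                             if r in commit_links)),
        "test_coverage_pct": pct(sum(1 for r in requirements
                                     if r in test_links)),
        "verified_pct": pct(sum(1 for r in requirements if r in passing)),
        "gaps": [r for r in requirements if r not in test_links],
    }
```

The remaining report sections (deployment history, change impact) come straight from cb-24 item history queries, so they are mostly API plumbing rather than computation.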

6. Workflow Automation and Enforcement

Codebeamer workflows enforce traceability rules:

  • Requirements cannot transition to “Ready for Development” without acceptance criteria
  • User stories must link to parent requirements
  • Code reviews require commit messages with valid requirement references
  • Deployments to production require 100% test coverage (configurable threshold)
  • Failed tests automatically transition requirements to “Verification Failed” state
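The workflow guards above are small predicates evaluated as transition conditions. Two simplified sketches (the field names are hypothetical; in cb-24 these would be conditions on the tracker workflow rather than standalone scripts):

```python
def can_transition_to_ready(requirement):
    """Guard for the 'Ready for Development' transition: block it
    unless acceptance criteria are present and non-empty."""
    return bool(requirement.get("acceptanceCriteria", "").strip())

def story_is_linked(story):
    """Guard for user stories: require a link to a parent requirement."""
    return bool(story.get("parentRequirement"))
```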

7. Change Impact Analysis

When requirements change, automated workflows:

  • Identify all linked items (code, tests, builds, deployments)
  • Create change impact report showing affected artifacts
  • Notify stakeholders of downstream impacts
  • Flag items requiring regression testing
  • Update traceability matrix to reflect changes
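Conceptually, the impact analysis is a traversal of the traceability graph. A minimal sketch, assuming the link data has already been fetched from cb-24's association API into a plain adjacency map:

```python
from collections import deque

def downstream_items(links, changed_item):
    """Breadth-first traversal of traceability links (requirement ->
    stories -> commits -> tests -> builds -> deployments).

    `links` maps an item ID to the IDs it traces to; in practice this
    adjacency map is built from cb-24 association queries.
    """
    affected = set()
    queue = deque([changed_item])
    while queue:
        item = queue.popleft()
        for linked in links.get(item, []):
            if linked not in affected:
                affected.add(linked)
                queue.append(linked)
    return affected
```

Everything the traversal returns gets listed in the change impact report and flagged for regression-test review.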

Implementation Lessons Learned

Success Factors:

  • Developer buy-in through simple commit message conventions
  • Automated enforcement reduces manual traceability overhead
  • Real-time traceability updates provide immediate feedback
  • Integration with existing tools (Git, Jenkins) minimized disruption

Challenges Overcome:

  • Initial resistance to commit message discipline: solved with Git hooks that reject non-compliant commits
  • Performance issues with large traceability queries: implemented caching and incremental updates
  • Handling legacy code without requirement links: created “technical debt” requirements and backfilled traceability
  • Test framework integration complexity: built an adapter layer to support multiple test tools

Key Metrics Achieved:

  • 98% requirement-to-code traceability (up from 45% under the manual process)
  • 100% test-to-requirement linkage
  • Audit preparation time reduced from 3 weeks to 2 days
  • Deployment confidence increased: zero compliance findings in the last two audits
  • Automated traceability reports generated in under 5 minutes vs. 2 days of manual effort

Technical Stack:

  • codebeamer cb-24 with REST API integration
  • Git with server-side hooks for commit parsing
  • Jenkins with custom pipeline libraries
  • Python scripts for API integration and reporting
  • JUnit/TestNG with custom annotations for test metadata

The investment in automated traceability has paid significant dividends in compliance, quality, and development velocity. The visibility into requirements coverage has helped us identify gaps earlier and make data-driven release decisions.

This is exactly what we need for our FDA-regulated medical device development. How did you handle the code commit to requirement linking? Did you use commit message conventions or something more automated? Also curious about how you maintain traceability when requirements change mid-sprint.

We use commit message conventions with requirement IDs. Developers include tags like [REQ-1234] in commits, and our Git hooks parse these and create traceability links via cb-24 REST API. For requirement changes, we implemented a change impact analysis workflow that automatically identifies affected downstream items (code, tests, builds) and flags them for review. The traceability matrix updates in real-time as changes propagate.

How do you handle test execution results in the traceability chain? We’re struggling to automatically link test runs to specific builds and deployments. Also, what happens when tests fail - does that block the traceability report or just flag coverage as incomplete?

Test results are linked through our Jenkins pipeline. After each test run, we POST results to codebeamer’s test execution API with build number and deployment environment metadata:


POST /api/v3/test-runs
buildId: jenkins-build-4523
environment: staging

Failed tests don’t block the report but are clearly flagged as “requirements with failed verification.” This actually helped us identify coverage gaps: requirements that passed unit tests but failed integration tests revealed incomplete acceptance criteria.