Xray test coverage status not rolling back requirement to In Progress

We drive requirement readiness off Xray test coverage. When all linked test cases pass, an automation rule transitions the requirement to “Verified”. When a test fails, it should roll the requirement back to “In Progress”. This worked until last month. Now requirements stay in Verified status even when linked tests fail. Stakeholders see requirements as ready when they’re not.

Our automation rule triggers on test execution status changes and uses this logic:


issue.fields.customfield_10050 == "Failed"

The rule should transition linked requirements back to In Progress, but it only fires sometimes. I think the problem is that the rule evaluates the single test execution that triggered it, not ALL linked tests. If one test fails but others are still passing, the requirement stays Verified. We need to evaluate the aggregate test coverage status across all linked tests and update the requirement accordingly. How do others handle this with Xray?

I see the quality gate field in Xray settings. It’s called “Requirement Test Summary” in our instance. It shows values: Not Tested, Partial, Passed, Failed. I’ll update our automation rule to trigger off this field instead of individual test executions. Do I need to worry about timing issues? Like, if a test execution completes but the summary field hasn’t updated yet, could the rule miss the transition?

That makes sense. So the real-time rule is inherently limited. Could I use a scheduled rule that runs every hour and checks all requirements in Verified status, then queries their linked test executions to see if any have failed? Would that be performant enough for about 500 requirements?

Here’s a complete solution covering the Xray entities involved, the trigger strategy, the workflow changes, and monitoring:

Xray Test Coverage and Execution Entities: Xray uses three key entities:

  1. Test (issue type): The test case definition
  2. Test Execution (issue type): A container for test runs
  3. Test Run: The actual execution result (Pass/Fail/etc.) linking a Test to a Test Execution

Requirements link to Tests via “tests” link type. Tests are executed within Test Executions. The challenge is that requirement status needs to reflect the aggregate of all Test Runs across potentially multiple Test Executions.
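To make the aggregation concrete, here's a minimal Python sketch of how a coverage value could be derived from the latest run status of each linked Test. The status names mirror the "Requirement Test Summary" values discussed above, but the mapping itself is an illustrative assumption, not Xray's actual algorithm:

```python
# Illustrative aggregation of per-test latest run statuses into a
# requirement-level coverage value. The mapping logic is an
# assumption for illustration, not Xray's internal implementation.

def coverage_status(latest_run_statuses):
    """latest_run_statuses: most recent run status ("PASS", "FAIL",
    "TODO", ...) for each Test linked to the requirement."""
    if not latest_run_statuses:
        return "Not Tested"
    if any(s == "FAIL" for s in latest_run_statuses):
        return "Failed"
    if all(s == "PASS" for s in latest_run_statuses):
        return "Passed"
    # Some tests passed, the rest haven't been run yet.
    return "Partial"

print(coverage_status(["PASS", "FAIL", "PASS"]))  # Failed
print(coverage_status(["PASS", "TODO"]))          # Partial
```

The key point the sketch makes: one failing test is enough to poison the aggregate, which is exactly what a per-execution trigger cannot see.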

Automation Rules Triggered from Test Executions: Your current approach triggers on individual test execution changes, which can’t evaluate aggregate status. Here’s a better trigger strategy:

  1. Use Xray’s built-in “Requirement Test Summary” custom field (enable in Xray > Settings > Test Coverage)

  2. This field automatically aggregates all test results for a requirement

  3. Trigger your automation rule on field value changes:

    
    WHEN: Field value changed
    FIELD: Requirement Test Summary
    CONDITION: New value = "Failed" OR "Partial"
    ACTION: Transition issue to "In Progress"
    
  4. Create a second rule for successful verification:

    
    WHEN: Field value changed
    FIELD: Requirement Test Summary
    CONDITION: New value = "Passed"
    ACTION: Transition issue to "Verified"
    

Link-Based Evaluation of All Related Tests: If you can’t use the Xray summary field, implement a link-based evaluation using a scheduled rule:


TRIGGER: Scheduled (every 2 hours)
JQL: project = REQ AND status = Verified AND issueFunction in hasLinks("tests")
FOR EACH: Matching issues
BRANCH: Related issues where linkType = "tests"
CONDITION: Advanced compare - Check if ANY linked test has latest execution status = FAIL
ACTION: Transition parent requirement to "In Progress"
COMMENT: "Requirement rolled back due to failed test: {{issue.key}}"

This queries all requirements in Verified status, examines their linked tests, and rolls back any that have failed executions. Note that `issueFunction` JQL functions require the ScriptRunner add-on; without it, restrict the JQL to `project = REQ AND status = Verified` and let the branch and condition filter out requirements with no linked tests.
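If the built-in branch/condition combination proves too limited, the same sweep can be scripted. This sketch keeps the rollback decision as a pure function; the `search_issues` and `latest_test_statuses` helpers named in the comments are hypothetical wrappers around the Jira search REST endpoint and the Xray API, not real library calls:

```python
# Sketch of an hourly sweep: find Verified requirements whose linked
# tests have a failing latest run, and collect them for rollback.

ROLLBACK_JQL = 'project = REQ AND status = Verified'

def requirements_to_roll_back(latest_statuses_by_req):
    """latest_statuses_by_req: {"REQ-1": ["PASS", "FAIL"], ...}
    Returns the requirement keys that should transition back."""
    return [
        key
        for key, statuses in latest_statuses_by_req.items()
        if any(s == "FAIL" for s in statuses)
    ]

# Wiring (pseudocode):
#   for req in search_issues(ROLLBACK_JQL):        # paginated search
#       statuses[req] = latest_test_statuses(req)  # Xray API, per req
#   then transition each key in requirements_to_roll_back(statuses).

print(requirements_to_roll_back({"REQ-1": ["PASS", "FAIL"],
                                 "REQ-2": ["PASS", "PASS"]}))  # ['REQ-1']
```

At roughly 500 requirements this is one paginated JQL search plus one status lookup per requirement, which should be well within what an hourly scheduled job can handle.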

Requirement Workflow Status Rollback: Implement a two-tier workflow for requirements:

  1. Draft → Ready for Test → In Test → Verified → Approved

  2. Add a transition “Roll Back” from Verified to In Test with these conditions:

    • Requirement Test Summary != “Passed”
    • OR any linked test execution status = Failed
  3. Make this transition available to automation rules (not just manual users)

  4. Add a post-function that logs the rollback reason:

    
    Rolled back due to test failure
    Failed tests: [list of test keys]
    Timestamp: {{now}}
    

Quality Gate Custom Field Integration: Integrate the Xray quality gate field into your workflow:

  1. Enable “Requirement Test Summary” in Xray settings

  2. Add this field to your requirement screen layout

  3. Create a condition on the “Verify” transition:

    
    Requirement Test Summary = "Passed"
    

    This prevents manual verification if tests haven’t all passed

  4. Create an automation rule that monitors this field:

    
    TRIGGER: Field value changed (Requirement Test Summary)
    CONDITION:
      IF new value = "Failed" OR "Partial"
      AND current status = "Verified"
    THEN: Transition to "In Test"
    ADD COMMENT: "Automated rollback: Test coverage incomplete or failed"
    
  5. Add a dashboard gadget showing requirements with mismatched status:

    
    JQL: project = REQ
    AND status = Verified
    AND "Requirement Test Summary" != "Passed"
    

    This helps identify requirements that slipped through before the automation was fixed

Complete Automation Rule Configuration: Replace your current rule with these two rules:

Rule 1: Rollback on Test Failure


Name: Requirement - Rollback on Test Failure
Trigger: Field value changed
Field: Requirement Test Summary
IF Conditions:
  - New value = "Failed" OR "Partial"
  - Current status = "Verified" OR "Approved"
THEN Actions:
  - Transition issue: "Roll Back to In Test"
  - Add comment: "Automated rollback due to test failure. Summary status: {{issue.Requirement Test Summary}}"
  - Send notification to: {{issue.Assignee}}, {{issue.Reporter}}

Rule 2: Verify on All Tests Passed


Name: Requirement - Auto-Verify on Test Success
Trigger: Field value changed
Field: Requirement Test Summary
IF Conditions:
  - New value = "Passed"
  - Current status = "In Test"
  - All linked tests have status "Done" (optional additional check)
THEN Actions:
  - Transition issue: "Verify"
  - Add comment: "All tests passed. Requirement automatically verified."
  - Send notification to: Stakeholders
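To check that the two rules cover every state without overlapping, it can help to write their combined decision logic as a single pure function. This is only an illustration of the rules above; the real logic lives in Jira Automation:

```python
# Combined decision table of Rule 1 (rollback) and Rule 2 (verify),
# keyed on the new summary value and the requirement's current status.

def transition_for(summary_value, current_status):
    if summary_value in ("Failed", "Partial") and \
            current_status in ("Verified", "Approved"):
        return "Roll Back to In Test"
    if summary_value == "Passed" and current_status == "In Test":
        return "Verify"
    return None  # neither rule fires

print(transition_for("Failed", "Verified"))  # Roll Back to In Test
print(transition_for("Passed", "In Test"))   # Verify
```

Writing it out this way makes one gap visible: a requirement already in Verified whose summary flips to "Passed" again triggers nothing, which is the desired no-op.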

Handling Timing and Edge Cases:

  • Xray updates the summary field within 5-10 seconds of test execution completion
  • If you need immediate rollback, add a third rule triggered directly by test execution status changes that sets a “Needs Review” flag
  • The summary field-based rule then clears the flag and performs the actual transition
  • For requirements with 100+ linked tests, the summary field update may take up to 30 seconds

Verification and Monitoring: Create a saved filter to monitor the system:


project = REQ
AND status = Verified
AND "Requirement Test Summary" in ("Failed", "Partial", "Not Tested")
ORDER BY updated DESC

Run this daily to catch any requirements that weren’t properly rolled back. This gives you confidence that the automation is working correctly and stakeholders are seeing accurate requirement status.

Xray test executions are separate entities from test cases. Your automation rule triggers when an execution changes, but it can only see that one execution. To evaluate all executions for a requirement, you’d need to query all linked test cases, then query all executions for those test cases, then aggregate the results. Jira automation rules can’t do that kind of nested query easily. You might need a scripted solution or a scheduled rule that periodically recalculates requirement status.
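For anyone attempting the scripted route, the nested aggregation reduces to: for each linked test, keep only its most recent run across all Test Executions, then aggregate. A rough sketch; the run tuples are hypothetical, and the field names will depend on how you pull results from the Xray API:

```python
from datetime import datetime

# Each run: (test_key, finished_at, status). A test may appear in many
# Test Executions; only its most recent run should count.
runs = [
    ("TEST-1", datetime(2024, 5, 1), "FAIL"),
    ("TEST-1", datetime(2024, 5, 3), "PASS"),  # later run supersedes
    ("TEST-2", datetime(2024, 5, 2), "FAIL"),
]

def latest_status_per_test(runs):
    latest = {}
    for test_key, finished_at, status in runs:
        if test_key not in latest or finished_at > latest[test_key][0]:
            latest[test_key] = (finished_at, status)
    return {k: v[1] for k, v in latest.items()}

print(latest_status_per_test(runs))
# {'TEST-1': 'PASS', 'TEST-2': 'FAIL'}
```

The "latest run wins" step is exactly what the built-in summary field does for you, which is why triggering off that field is so much simpler.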

We use a quality gate custom field that Xray updates automatically. It has values like “All Passed”, “Some Failed”, “Not Tested”. Our automation rule watches that field on the requirement issue. When it changes to “Some Failed”, we transition the requirement to In Progress. When it changes to “All Passed”, we transition to Verified. This is much more reliable than trying to evaluate individual test executions. You need to enable this field in Xray settings under Test Coverage.

Xray updates the summary field asynchronously, usually within a few seconds of the test execution completing. There’s a small window where the field might be stale, but it’s rare. If you’re concerned, add a condition to your rule that checks whether the summary field value matches the expected state based on recent test executions. Or just accept the slight delay; it’s usually not an issue in practice.