Test cycles and execution status lost when cloning issues with fix versions for release preparation

Our release managers clone stories and bugs with fix versions when preparing for the next release cycle. The cloned issues carry over the test coverage links from Xray, but the test execution history gets corrupted or lost entirely.

We use fix version-based test cycles in Xray (e.g., “v2.5 Regression” contains all tests for stories in Fix Version 2.5). When we clone Story-123 to Story-456 and update the fix version to 2.6, the cloned story still shows test results from the v2.5 cycle:


Original: Story-123 → Fix Version 2.5 → Test Cycle "v2.5 Regression" → Tests executed
Cloned: Story-456 → Fix Version 2.6 → Still shows v2.5 test results (incorrect)

QA cannot reliably plan testing because the release board reports show inflated test coverage. We need cloned stories to preserve test coverage (which tests apply) but reset execution status for the new release. How do other teams handle this pattern?

Yes, filter on the linked issue's type in your automation rule. Note that smart values don't support JavaScript-style callbacks like {{issue.issueLinks.filter(link => ...)}}; instead, add a "Branch: For Linked issues" component, an issue-fields condition inside the branch that checks Issue Type equals "Test Execution", and then the "Delete Link" action for those specific links. Keep the links to Test issues intact so QA knows what needs retesting.
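The selection logic the branch performs can be sketched in script form (e.g., for a ScriptRunner script or an external REST client). The link dicts below are a simplified stand-in for entries in Jira's real issuelinks payload, not the actual API shape:

```python
def execution_links(issue_links):
    """Select links whose linked issue is a Test Execution.

    Each link dict is a simplified stand-in for an entry in
    Jira's issuelinks field.
    """
    return [link for link in issue_links
            if link["linked_issue_type"] == "Test Execution"]

links = [
    {"id": "10001", "linked_issue_type": "Test"},
    {"id": "10002", "linked_issue_type": "Test Execution"},
]
to_delete = execution_links(links)  # only the execution link is selected
```

The Test link is left untouched, which is exactly what keeps coverage visible while clearing stale results.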

Xray’s test execution records are tied to specific test plans and cycles, not just the story links. When you clone a story, the test links copy over but they still reference executions from the original cycle. You need to either create a new test cycle for v2.6 and re-execute tests, or use Xray’s automation to unlink execution history from cloned issues.

That approach makes sense. How do you identify which links are Test Executions vs Tests? They both use the “tests” link type in our setup. Is there a way to filter by the linked issue type in the automation rule?

Another consideration: if you’re using release boards with filters like fixVersion = 2.6 AND "Test Coverage" > 80%, the cloned stories will incorrectly show as tested. Update your board filters to exclude issues where the test execution cycle doesn’t match the fix version. We use a custom field to track the target test cycle and filter on that.

Here’s a complete solution for managing test cycles during release cloning:

Understanding Fix Version-Based Test Cycles: Xray organizes test executions into cycles that are typically aligned with fix versions or sprints. When you clone a story from one release to another, Jira copies all issue links including those to Test and Test Execution issues. However, the Test Execution records still reference the original cycle (e.g., “v2.5 Regression”), creating confusion about which tests have actually been run for the new release.
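The confusion can be made concrete with a toy model (the class names are illustrative, not Xray's actual schema): an execution record carries its cycle with it, so a link copied during cloning keeps pointing at the old cycle no matter what the story's new fix version says:

```python
from dataclasses import dataclass, replace

@dataclass
class Execution:
    test_key: str
    cycle: str    # e.g. "v2.5 Regression"
    status: str   # "PASS" / "FAIL" / "TODO"

@dataclass
class Story:
    key: str
    fix_version: str
    executions: list  # execution links, copied verbatim on clone

original = Story("STORY-123", "2.5",
                 [Execution("TEST-1", "v2.5 Regression", "PASS")])

# Cloning copies every link; only the key and fix version change,
# so the execution record still references the old cycle.
clone = replace(original, key="STORY-456", fix_version="2.6")

stale = [e for e in clone.executions
         if e.cycle != f"v{clone.fix_version} Regression"]
```

Here `stale` is non-empty for the clone: its only execution belongs to the v2.5 cycle, which is precisely the "inflated coverage" the question describes.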

Cloning Patterns for Release Preparation: Establish a standard cloning workflow:

  1. Clone the story/bug and update the fix version to the target release (e.g., 2.5 → 2.6)
  2. Preserve links to Test issues (these define what needs testing)
  3. Remove links to Test Execution issues (these represent historical results)
  4. Add the cloned story to a new test cycle for the target release

Preserving Test Coverage While Resetting Results: Use automation to handle the link cleanup after cloning:


Trigger: Issue Cloned (or Issue Created with specific label/field)
Condition: Issue Type in (Story, Bug) AND Fix Version changed
Action: For each linked issue where type = "Test Execution"
  → Delete link between current issue and Test Execution
Action: Add comment "Test executions from previous release removed. Please re-execute tests in {{issue.fixVersions.first.name}} cycle."

This automation preserves the “tests” links to Test issues (defining coverage requirements) while removing “is tested by” links to Test Execution records (historical results). QA can see which tests apply but won’t see stale pass/fail status.

Automation to Relink Tests After Clone: Create a follow-up automation that adds cloned stories to the appropriate test cycle:


Trigger: Issue Updated (Fix Version field changed)
Condition: Fix Version is not EMPTY AND Test Execution links removed
Action: Find or create Test Plan for {{issue.fixVersions.first.name}}
Action: Add current issue to Test Plan
Action: Trigger test execution creation (via Xray REST API or manual QA workflow)

Alternatively, use a scheduled rule that runs daily to identify newly cloned stories and batch-add them to the current release’s test plan.
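The scheduled variant boils down to grouping newly cloned stories by fix version and attaching each group to that release's plan. A sketch of that batching logic; `plans` is a plain dict standing in for Xray's test-plan REST endpoint, and the "v<version> Regression" naming convention is assumed from the examples above:

```python
from collections import defaultdict

def batch_assign(cloned_stories, plans):
    """Group cloned stories by fix version and append each group to
    the matching test plan, creating the plan entry if missing.
    A real scheduled rule would call Xray's REST API here instead
    of mutating a dict."""
    grouped = defaultdict(list)
    for story in cloned_stories:
        grouped[story["fix_version"]].append(story["key"])
    for version, keys in grouped.items():
        plans.setdefault(f"v{version} Regression", []).extend(keys)
    return plans

plans = batch_assign(
    [{"key": "STORY-456", "fix_version": "2.6"},
     {"key": "STORY-457", "fix_version": "2.6"}],
    {},
)
```

Running this once a day gives the same end state as the per-clone rule, at the cost of a delay before coverage reports are accurate.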

Release Board Reporting Expectations: Update your release board filters to accurately reflect test status:


JQL: fixVersion = "2.6" AND
     issueFunction in linkedIssuesOf("project = TEST AND
       issuetype = 'Test Execution' AND
       'Test Cycle' = 'v2.6 Regression'", "is tested by")

This filter only shows stories with test executions in the correct cycle, preventing false positives from cloned issues. Add a separate filter for “Needs Retesting” to identify cloned stories without current-cycle executions:


JQL: fixVersion = "2.6" AND
     issueFunction in linkedIssuesOf("issuetype = Test", "tests") AND
     NOT issueFunction in linkedIssuesOf("issuetype = 'Test Execution' AND
       'Test Cycle' = 'v2.6 Regression'", "is tested by")

Additional Recommendations:

  • Use a custom field “Target Test Cycle” on stories to explicitly track which cycle should test them
  • Configure Xray’s test plan to automatically include issues by fix version
  • Train release managers to verify test status after cloning using the “Needs Retesting” filter
  • Consider using Xray’s test plan cloning feature instead of story cloning for better test cycle alignment

This approach ensures cloned stories maintain test coverage definitions while resetting execution history, giving QA accurate visibility into what needs retesting for each release.

We run into this constantly. Our workaround is to use a post-clone automation rule that removes the Test Execution links (not the Test links themselves) from the cloned story. This preserves which tests need to be run but clears the stale results. Then we bulk-add the cloned stories to a new test cycle for the upcoming release.