Let me provide a comprehensive overview of our implementation covering all four focus areas:
GitLab CI Webhook Integration: We configured GitLab to send pipeline completion webhooks to our integration service endpoint. The webhook configuration in GitLab includes pipeline events with test report artifacts enabled. Our service validates webhook signatures using GitLab’s secret token to prevent unauthorized requests. The critical design decision was making the webhook receiver stateless - it immediately queues the event and returns 200 OK to GitLab, preventing timeout issues. The actual Rally API interaction happens asynchronously in background workers.
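A minimal sketch of that stateless receiver pattern (the queue, secret value, and function name are illustrative; GitLab sends its secret token in the X-Gitlab-Token request header):

```python
import hmac
from collections import deque

event_queue = deque()  # stand-in for a real job queue (e.g. Redis, RabbitMQ)

SECRET_TOKEN = "example-secret"  # illustrative; load from environment in practice

def handle_gitlab_webhook(headers: dict, body: dict) -> int:
    """Validate the token, enqueue the event, and return an HTTP status immediately."""
    token = headers.get("X-Gitlab-Token", "")
    # Constant-time comparison avoids leaking the token via timing
    if not hmac.compare_digest(token, SECRET_TOKEN):
        return 403
    if body.get("object_kind") != "pipeline":
        return 200  # acknowledge but ignore non-pipeline events
    event_queue.append(body)  # Rally submission happens in a background worker
    return 200
```

The handler does no I/O beyond the enqueue, which is what keeps GitLab's delivery attempt well inside its timeout window.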
REST API Result Mapping: The mapping layer is a five-stage transformation pipeline:
// Pseudocode - Transformation stages:
1. Parse JUnit XML from GitLab artifacts
2. Lookup Rally TestCase references from mapping table
3. Transform to Rally JSON schema with required fields
4. Validate payload against Rally API specification
5. Submit batch of TestCaseResults via POST request
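Stages 1–3 can be sketched as follows (the sample XML, mapping table, and payload field names are illustrative, not our exact schema):

```python
import xml.etree.ElementTree as ET

JUNIT_XML = """<testsuite name="smoke" tests="2">
  <testcase classname="auth" name="test_login" time="0.4"/>
  <testcase classname="auth" name="test_logout" time="0.2">
    <failure message="assertion failed"/>
  </testcase>
</testsuite>"""

# Illustrative mapping table: "classname.name" -> Rally TestCase ObjectID
CASE_MAP = {"auth.test_login": 111, "auth.test_logout": 222}

def junit_to_rally_results(xml_text: str, build: str) -> list:
    """Parse JUnit XML, look up Rally refs, and emit TestCaseResult payloads."""
    results = []
    for case in ET.fromstring(xml_text).iter("testcase"):
        key = f"{case.get('classname')}.{case.get('name')}"
        object_id = CASE_MAP.get(key)
        if object_id is None:
            continue  # in production: push to the QA review queue instead
        verdict = "Fail" if case.find("failure") is not None else "Pass"
        results.append({
            "TestCaseResult": {
                "TestCase": f"/testcase/{object_id}",
                "Verdict": verdict,
                "Build": build,
            }
        })
    return results
```

Validation and batch submission (stages 4–5) then operate on the returned list.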
We handle the FormattedID-to-ObjectID mapping by caching Rally’s TestCase collection locally and refreshing it daily. This avoids repeated API queries during result submission. For test cases without Rally mappings, we log them to a review queue where QA can establish the associations.
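The cache-plus-review-queue behavior amounts to something like this (class and fetcher are a hypothetical sketch, not our actual code):

```python
import time

class TestCaseCache:
    """Daily-refreshed FormattedID -> ObjectID cache with a review queue."""

    def __init__(self, fetch_fn, ttl_seconds=86400):
        self._fetch = fetch_fn      # callable returning {FormattedID: ObjectID}
        self._ttl = ttl_seconds     # 86400 s = refresh at most once a day
        self._data = {}
        self._loaded_at = 0.0
        self.review_queue = []      # unmapped IDs awaiting QA association

    def lookup(self, formatted_id):
        # Refresh the whole collection in one bulk fetch when stale
        if time.time() - self._loaded_at > self._ttl:
            self._data = self._fetch()
            self._loaded_at = time.time()
        object_id = self._data.get(formatted_id)
        if object_id is None:
            self.review_queue.append(formatted_id)
        return object_id
```

One bulk fetch per day replaces per-result API lookups, which is where the performance win comes from.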
Test Cycle Automation: Sprint-based test set creation is triggered by Rally’s iteration change events. We subscribe to Rally’s webhook notifications for iteration updates. When a new sprint starts, our automation:
- Queries Rally for user stories in the current iteration
- Extracts test case associations from user story requirements
- Creates a test set named "Sprint-{N}-Automated-Tests"
- Populates it with the associated test cases
- Configures the test set’s TestFolder to match the sprint structure
The user story associations come from Rally’s Requirements field on test cases. We maintain these relationships as part of our test case authoring process.
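The assembly step above can be sketched as a pure function over the iteration's stories (the story dict shape and field names are assumptions for illustration):

```python
def build_sprint_test_set(sprint_number: int, stories: list) -> dict:
    """Collect test case refs from iteration stories into a TestSet payload."""
    test_case_refs = []
    for story in stories:
        # Each story carries the test case associations maintained at authoring time
        test_case_refs.extend(story.get("TestCases", []))
    return {
        "TestSet": {
            "Name": f"Sprint-{sprint_number}-Automated-Tests",
            # De-duplicate: the same test case may back multiple stories
            "TestCases": sorted(set(test_case_refs)),
        }
    }
```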
Artifact Synchronization: This was indeed the most complex component. Rally’s REST API supports attachment uploads via multipart/form-data, but we optimized for performance by storing large artifacts in GitLab and linking them:
# For small artifacts (<5MB): Direct upload to Rally
attachment_data = {
    "AttachmentContent": {
        "Content": base64_encoded_data
    },
    "Artifact": test_result_ref,
    "Name": "test_output.log"
}
# For large artifacts: Store link in Notes field
notes = f"Test logs: {gitlab_artifact_url}"
Screenshots and small logs get uploaded directly to Rally as attachments. Performance test results and video recordings remain in GitLab, with URLs stored in Rally’s Notes field. This hybrid approach keeps Rally responsive while maintaining artifact access.
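The size-based dispatch reduces to a small function (names and the return shape are illustrative; the 5 MB cutoff is the one stated above):

```python
import base64

SIZE_LIMIT = 5 * 1024 * 1024  # 5 MB threshold from the hybrid policy

def prepare_artifact(name: str, data: bytes, test_result_ref: str, gitlab_url: str):
    """Return a direct-upload payload for small artifacts, a Notes link otherwise."""
    if len(data) < SIZE_LIMIT:
        return ("attachment", {
            "AttachmentContent": {
                "Content": base64.b64encode(data).decode("ascii")
            },
            "Artifact": test_result_ref,
            "Name": name,
        })
    # Large artifacts stay in GitLab; only the URL goes into Rally's Notes field
    return ("notes_link", f"Test logs: {gitlab_url}")
```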
Results: After implementing this integration, our QA team reduced manual result logging from 2-3 hours daily to zero. Test result accuracy improved because we eliminated human transcription errors. The automation also enabled real-time test coverage dashboards in Rally that update within minutes of pipeline completion. Total development effort was about 3 weeks for a two-person team, and we’ve been running this in production for 6 months with 99.5% uptime. The 85% time savings calculation comes from comparing pre/post automation QA team time allocation - they now spend those hours on exploratory testing instead of data entry.