We’re experiencing workflow delays in our quality change management process when users attach large files (typically test reports and compliance documents over 50MB). The approval workflow gets stuck at the review stage, and sometimes the change order shows as “Processing” indefinitely.
This is creating serious compliance issues because our quality changes have strict approval timelines. Users report that after uploading a large PDF, the workflow doesn’t advance to the next approver for hours, or sometimes requires manual intervention to restart. The File Manager logs show successful uploads, but something is preventing the workflow from progressing. Has anyone dealt with large attachment handling in quality workflows?
From a compliance perspective, you need to ensure file processing doesn’t impact audit trails. Make sure your workflow logs are capturing when attachments are added and when transitions occur. We implemented a policy limiting attachments to 100MB per file and required users to compress larger documents. This reduced our workflow stalls by about 80% while still meeting documentation requirements.
We had similar issues with our engineering change workflows. The problem was twofold: First, our File Manager vault was on slow network storage, causing upload delays. Second, WebLogic was configured with default timeout settings that were too aggressive for large files. Moving the vault to faster storage and increasing WebLogic transaction timeouts from 30 to 300 seconds resolved most of our stalls.
Check your File Manager configuration in agile.properties. There are settings for max file size, concurrent uploads, and virus scanning timeouts that can all impact workflow progression. If you have antivirus scanning enabled, large files can take several minutes to scan, during which the workflow may appear stuck. Consider adjusting scan timeouts or excluding certain file types from scanning if your security policy allows.
Interesting - I’ll check the storage performance and WebLogic timeout settings. Are there any recommended attachment size limits we should enforce? And is there a way to monitor which workflows are stuck due to file processing?
I’ll provide a complete solution addressing all aspects of large attachment handling in quality workflows.
Attachment Size Limits: Establish and enforce practical constraints:
- Set maximum single file size to 100MB in File Manager configuration
- Configure in agile.properties: `filemanager.max.file.size=104857600` (100MB expressed in bytes)
- Implement client-side validation to warn users before upload
- For larger documents, require compression or splitting into multiple files
- Document the policy in user guidelines with rationale
- Our 100MB limit eliminated 90% of workflow timeout issues
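The client-side validation mentioned above can be sketched as a simple pre-upload size check. This is a minimal illustration assuming a 100MB cap; the function name and message wording are our own, not part of any Agile API:

```python
import os

# 100 MB, matching filemanager.max.file.size=104857600
MAX_ATTACHMENT_BYTES = 100 * 1024 * 1024

def validate_attachment(path):
    """Return (ok, message) before handing the file to the uploader."""
    size = os.path.getsize(path)
    if size > MAX_ATTACHMENT_BYTES:
        return False, (
            f"{os.path.basename(path)} is {size / 2**20:.1f} MB; "
            "compress it or split it to stay under the 100 MB limit."
        )
    return True, "OK"
```

Wiring a check like this into the upload form (or a pre-upload script) warns users before they wait out a doomed upload.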
File Manager Storage Optimization: Ensure adequate infrastructure:
- Verify vault storage is on high-performance SAN or NAS, not remote file shares
- Test I/O throughput - should handle 50MB files in under 30 seconds
- Allocate dedicated storage pool for quality attachments if possible
- Monitor disk space - maintain at least 20% free space to prevent performance degradation
- Enable compression on the storage layer to reduce actual disk usage
- Check network bandwidth between File Manager and storage - minimum 1Gbps recommended
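A quick way to sanity-check the vault against the 50MB-in-30-seconds target is a small write probe like the one below. This is a sketch: it measures raw sequential write throughput to a directory, which only approximates File Manager's actual I/O pattern:

```python
import os
import time

def measure_write_throughput(directory, size_mb=50, chunk_mb=4):
    """Write size_mb of random data to `directory`, fsync, and
    return (elapsed_seconds, throughput_mb_per_sec)."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    path = os.path.join(directory, "throughput_probe.bin")
    start = time.monotonic()
    written = 0
    with open(path, "wb") as f:
        while written < size_mb * 1024 * 1024:
            f.write(chunk)
            written += len(chunk)
        f.flush()
        os.fsync(f.fileno())  # force data to disk so the timing is honest
    elapsed = time.monotonic() - start
    os.remove(path)
    return elapsed, (written / 2**20) / elapsed
```

Run it against the actual vault mount point; if a 50MB probe takes anywhere near 30 seconds, the storage layer is the bottleneck.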
Workflow Timeout Tuning: Adjust timeout settings for large file scenarios:
- Edit workflow transitions to extend timeout from default 30s to 300s
- In Agile Java Client, go to Admin > Workflow > Edit Transition
- Set “Timeout” property to 300 seconds for approval transitions
- Configure the WebLogic JTA timeout: raise `timeout-seconds` from the 30-second default to 300 (Admin Console: domain > Configuration > JTA, or the `<jta>` element in config.xml)
- Increase the stuck thread detection threshold to 600 seconds (Admin Console: server > Configuration > Tuning, Stuck Thread Max Time) so long-running file operations aren't flagged prematurely
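For reference, the JTA timeout corresponds to a fragment like this in the domain's config.xml (placement abbreviated; prefer making the change through the Admin Console so it is persisted and propagated correctly):

```xml
<!-- config.xml, domain level: raise the JTA transaction timeout
     from the 30-second default to 300 seconds -->
<jta>
  <timeout-seconds>300</timeout-seconds>
</jta>
```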
Log Monitoring Strategy: Implement proactive tracking:
- Enable detailed File Manager logging in log4j.properties
- Set `log4j.logger.com.agile.filemanager=DEBUG`
- Create automated script to scan logs for “Processing” status exceeding 5 minutes
- Monitor workflow_status table for changes stuck in transition
- Set up alerts for file upload failures or timeout exceptions
- Review logs weekly to identify patterns in workflow stalls
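The automated log scan for stuck "Processing" statuses can be sketched as follows. The log line format here is an assumption; adjust the regex to match your actual File Manager log layout and status vocabulary:

```python
import re
from datetime import datetime, timedelta

# Assumed log line format (adjust to your actual logs), e.g.:
# 2024-05-01 10:15:22 INFO ChangeOrder C00123 status=Processing
LINE_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) .*?"
    r"ChangeOrder (?P<id>\S+) status=(?P<status>\S+)"
)

def stuck_changes(lines, now, threshold=timedelta(minutes=5)):
    """Return change IDs whose most recent logged status is
    'Processing' and older than the threshold."""
    latest = {}
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            ts = datetime.strptime(m.group("ts"), "%Y-%m-%d %H:%M:%S")
            latest[m.group("id")] = (ts, m.group("status"))
    return sorted(
        cid for cid, (ts, status) in latest.items()
        if status == "Processing" and now - ts > threshold
    )
```

Scheduled every few minutes, a script like this feeds the alerting described above instead of relying on weekly manual review alone.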
Additional Best Practices:
- Increase the virus scanning timeout to 600 seconds for large files, or exclude trusted file types from scanning if your security policy allows
- Configure asynchronous file processing to prevent workflow blocking
- Implement file upload progress indicators so users know processing is occurring
- Schedule maintenance window to clear orphaned file locks
- Create “fast track” workflow variant for changes without large attachments
- Train users on optimal file formats (compressed PDFs vs. uncompressed)
Compliance Safeguards:
- Ensure audit trail captures file attachment timestamps
- Configure workflow to log when large files cause delays
- Implement notification to quality managers when workflows exceed SLA
- Maintain separate log of all compliance-related change approvals
- Test disaster recovery procedures for File Manager vault
Monitoring Dashboard Queries:
Create SQL queries to identify problematic workflows:
- Changes with attachments over 50MB that are still in-progress
- Workflows stuck in same status for over 2 hours
- File Manager operations with duration exceeding 120 seconds
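As a starting point, the stuck-workflow query might look like this. The `workflow_status` table is mentioned above, but the column names here are illustrative only; Agile's actual schema varies by version, so map them to your environment before use:

```sql
-- Workflows stuck in the same status for over 2 hours
-- (column names are illustrative, not guaranteed Agile schema)
SELECT ws.change_number,
       ws.status,
       ws.status_entered_at
FROM   workflow_status ws
WHERE  ws.status = 'Processing'
  AND  ws.status_entered_at < SYSDATE - 2/24
ORDER BY ws.status_entered_at;
```

The other two queries follow the same shape: join attachment size metadata for the over-50MB filter, and File Manager operation logs for the duration filter.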
After implementing these changes, our quality change workflows handle 50-100MB attachments without stalling, maintaining our compliance timelines. The combination of size limits, storage optimization, and timeout tuning provides robust performance while ensuring audit integrity.