Our ERP system generates signed URLs for uploading purchase order attachments to Cloud Storage, but we’re losing files because URLs expire before users complete uploads. The workflow allows users to attach multiple documents (quotes, specifications, contracts) to a PO, but if they take more than 15 minutes, uploads fail with 403 errors.
Current signed URL generation:
url = bucket.blob(filename).generate_signed_url(
    expiration=datetime.timedelta(minutes=15),
    method='PUT'
)
This is causing missing attachments in our purchase order workflow, especially for large files or when users are interrupted. We’ve had complaints from procurement teams about lost work. What’s the right balance between security and usability for signed URL expiration in file upload scenarios?
Have you looked into resumable uploads? They’re specifically designed for this use case. With resumable uploads, even if the connection drops or the user pauses, they can resume from where they left off. The session URL expires after 7 days by default, which is much more forgiving. This would solve both your reliability and timeout issues without compromising security.
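For reference, creating a resumable session with the google-cloud-storage client looks roughly like this (a sketch; the helper name and the bucket/object names in the comments are illustrative):

```python
def start_resumable_upload(blob, content_type='application/pdf', size=None):
    """Create a resumable upload session for a Cloud Storage blob.

    The returned session URI is valid for up to 7 days; the client
    uploads (and resumes after interruptions) against it instead of
    a short-lived signed URL.
    """
    return blob.create_resumable_upload_session(
        content_type=content_type, size=size)

# Typical use (requires google-cloud-storage and credentials):
#   from google.cloud import storage
#   blob = storage.Client().bucket('po-attachments').blob('po-123/quote.pdf')
#   session_uri = start_resumable_upload(blob, 'application/pdf')
```

The client then PUTs bytes to the session URI and, if interrupted, queries the session and resumes from the last confirmed byte.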
For the purchase order workflow, you might also want to implement client-side upload progress tracking. That way, your ERP UI can detect when an upload is taking too long and proactively request a new signed URL before the current one expires. We do this with a 2-minute warning threshold: if the upload isn't complete with 2 minutes remaining, the client refreshes the URL automatically.
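The threshold check itself is trivial; a sketch of the client-side logic (the 2-minute margin is the one mentioned above, the names are otherwise assumed):

```python
import time

REFRESH_MARGIN_SECONDS = 120  # refresh when less than 2 minutes remain

def needs_refresh(issued_at, lifetime_seconds, now=None):
    """True when the signed URL is within the refresh margin of expiring,
    so the UI should request a fresh URL before the current one dies."""
    now = time.time() if now is None else now
    remaining = (issued_at + lifetime_seconds) - now
    return remaining <= REFRESH_MARGIN_SECONDS
```

Run it on a timer alongside the upload and call your URL-refresh endpoint when it flips to True.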
Here’s a comprehensive solution addressing all three focus areas:
Signed URL Expiration Strategy:
Increase expiration to 60-90 minutes for better reliability while maintaining security:
url = bucket.blob(filename).generate_signed_url(
    expiration=datetime.timedelta(minutes=60),
    method='PUT',
    content_type='application/octet-stream'
)
For sensitive environments, implement URL refresh:
# Client requests refresh when 80% of the URL lifetime has elapsed
if (elapsed_time / expiration_time) > 0.8:
    new_url = refresh_signed_url(file_id)
File Upload Reliability:
Switch to resumable uploads for files >5MB:
from google.cloud import storage

client = storage.Client()
bucket = client.bucket(bucket_name)
blob = bucket.blob(filename)
blob.chunk_size = 5 * 1024 * 1024  # forces a chunked, resumable upload

with open(local_file, 'rb') as file_obj:
    blob.upload_from_file(file_obj, checksum='md5')
Implement client-side chunked uploads with retry logic for network interruptions. Add upload progress tracking so users know their files are processing.
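A minimal retry wrapper with exponential backoff (a sketch; `do_upload` stands in for whatever performs a single upload attempt):

```python
import time

def upload_with_retry(do_upload, max_attempts=4, base_delay=1.0, sleep=time.sleep):
    """Retry a flaky upload callable with exponential backoff.

    `do_upload` is any zero-arg callable that raises on failure;
    the last failure is re-raised once attempts are exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return do_upload()
        except Exception:
            if attempt == max_attempts:
                raise
            sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...
```

In production you'd likely narrow the `except` to transient network errors rather than catching everything.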
Purchase Order Workflow Integration:
- Generate signed URLs with 60-minute expiration when user opens PO attachment dialog
- Store upload session metadata in your ERP database (file_id, expiration, status)
- Implement server-side validation after upload completes
- Add webhook endpoint to receive Cloud Storage notifications on successful uploads
- Display real-time upload status in PO interface
- Prevent PO submission if required attachments are missing or uploads pending
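Cloud Storage notifications arrive as Pub/Sub push messages; the webhook mostly needs to decode the envelope and look for OBJECT_FINALIZE events. A sketch of the parsing step (field names follow the Pub/Sub push format; the endpoint wiring is up to your ERP framework):

```python
import base64
import json

def parse_gcs_notification(envelope):
    """Extract event type, object name, and payload from a Pub/Sub push
    message carrying a Cloud Storage notification.

    OBJECT_FINALIZE means the upload completed successfully.
    """
    msg = envelope['message']
    attrs = msg.get('attributes', {})
    payload = {}
    if 'data' in msg:
        payload = json.loads(base64.b64decode(msg['data']).decode())
    return attrs.get('eventType'), attrs.get('objectId'), payload
```

On OBJECT_FINALIZE, mark the matching upload session row in your ERP database as complete.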
For your specific use case with 2-5 files per PO:
- Generate all signed URLs upfront when attachment dialog opens
- Use 60-minute expiration (plenty of time for multiple uploads)
- Implement parallel uploads for better user experience
- Add automatic retry for failed uploads
- Show clear error messages if URLs expire (with one-click regenerate option)
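For 2-5 files, parallel uploads are easy with a thread pool; a sketch (each value in `upload_fns` is a zero-arg callable that performs one file's upload and raises on failure):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def upload_all(upload_fns, max_workers=4):
    """Run per-file upload callables in parallel.

    Returns {filename: None} on success or {filename: exception}
    on failure, so the UI can retry or flag individual files.
    """
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(fn): name for name, fn in upload_fns.items()}
        for fut in as_completed(futures):
            name = futures[fut]
            try:
                fut.result()
                results[name] = None
            except Exception as exc:
                results[name] = exc
    return results
```

Threads are fine here because the work is network-bound, not CPU-bound.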
This approach balances security (URLs still expire, scoped to specific operations) with reliability (enough time for real-world upload scenarios). Monitor actual upload times using Cloud Storage metrics to fine-tune expiration values based on your user patterns.
Before extending expiration times, consider your security requirements. Are these URLs being shared or embedded in emails? If so, longer expiration increases exposure window. A better approach might be implementing a token refresh mechanism where your ERP backend can regenerate URLs on demand when the client detects approaching expiration. This gives you both security and reliability.
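The refresh mechanism can be a thin backend wrapper that re-signs the same object; a sketch (the helper name is illustrative, and the endpoint/auth wiring depends on your ERP backend):

```python
from datetime import timedelta

def regenerate_signed_url(bucket, object_name, minutes=15):
    """Issue a fresh short-lived signed PUT URL for the same object.

    The client calls the endpoint wrapping this when it detects its
    current URL is about to expire, keeping each URL's window short.
    """
    return bucket.blob(object_name).generate_signed_url(
        expiration=timedelta(minutes=minutes), method='PUT')
```

Pair this with an authorization check that the requesting user actually owns the PO attachment in question.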
The URLs are generated on-demand when users click ‘Attach File’ in the PO form. They’re not shared externally or emailed. Users typically upload 2-5 files per PO, with sizes ranging from 500KB to 50MB. The 15-minute limit seemed reasonable initially, but we didn’t account for users multitasking or network variability.
15 minutes is quite short for file uploads, especially if users are uploading multiple files or dealing with slow connections. I’d recommend extending to at least 60 minutes for better upload reliability. The security risk is minimal since signed URLs are scoped to specific objects and operations. You could also implement resumable uploads for larger files, which would handle interruptions better than simple PUT requests.
Another consideration: are you validating file uploads server-side after completion? If URLs expire during upload, you lose the file, but you also need to ensure the PO workflow knows about the failure. Implement webhook callbacks or polling to confirm successful uploads, so your ERP can retry or alert users about missing attachments before the PO is submitted for approval.
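Server-side confirmation can be as simple as checking that the object exists and recording its metadata; a sketch using the google-cloud-storage `Bucket.get_blob` call (the helper name and return shape are assumptions):

```python
def confirm_upload(bucket, object_name):
    """Verify an attachment actually landed in Cloud Storage.

    `get_blob` returns None if the object doesn't exist, i.e. the
    upload never completed, so the ERP can alert the user or retry.
    """
    blob = bucket.get_blob(object_name)
    if blob is None:
        return {'status': 'missing'}
    return {'status': 'uploaded', 'size': blob.size, 'md5': blob.md5_hash}
```

Run this for every expected attachment before allowing the PO to move into the approval stage.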