We’re running into a critical issue with our contract data exports in Zoho CRM 2021. When attempting to export contract records that have multiple large PDF attachments (typically 5-10MB each), the export process fails midway through with a timeout error.
Here’s what we’re seeing:
GET /crm/v2/Contracts?fields=all&include_attachments=true
Response: 504 Gateway Timeout
Partial records returned: 47 of 203
The API seems to handle pagination limits inconsistently - sometimes we get 100 records, other times only 40-50 before timing out. We’ve tried adjusting page size parameters, but can’t find documentation on attachment size limits per API call. Our concern is data integrity verification - we need complete backups of all contract records with attachments for compliance audits, but we can’t confirm if all data is being captured. Has anyone dealt with large attachment exports through the API successfully?
For compliance backups, I always use cursor-based pagination when available, but Zoho CRM doesn’t support that for all modules. Use the ‘page’ parameter with a fixed per_page value of 100 (don’t go higher). Track the ‘more_records’ boolean in responses - when it’s false, you’ve reached the end. For data integrity verification, maintain a local checksum database of record IDs and modification timestamps. After each export batch, compare against that checksum database to detect any missed records.
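A minimal sketch of that page/more_records loop. Note that `fetch_page` here is a simulated stand-in for the real Zoho CRM request (the actual call needs an OAuth token and the endpoint shown in the original post); it is stubbed so the control flow is clear and testable:

```python
# fetch_page simulates the Zoho CRM paginated response shape
# ({"data": [...], "info": {"more_records": bool}}) for a 203-record
# module, matching the export described in the question.
def fetch_page(page, per_page=100):
    total = 203
    start = (page - 1) * per_page
    records = [{"id": f"rec_{i}"}
               for i in range(start, min(start + per_page, total))]
    return {"data": records,
            "info": {"more_records": start + per_page < total}}

def export_all_record_ids(per_page=100):
    ids, page = [], 1
    while True:
        resp = fetch_page(page, per_page)
        ids.extend(r["id"] for r in resp["data"])
        # 'more_records' is the authoritative end-of-data signal;
        # never infer the end from a short page.
        if not resp["info"]["more_records"]:
            break
        page += 1
    return ids
```

The key point is the loop terminates on ‘more_records’, not on receiving fewer records than per_page - which, as the original post shows, can happen mid-export.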
I’ve seen similar timeout issues when attachment sizes aren’t considered in pagination. The default page size doesn’t account for attachment payload, so large files cause unexpected failures. Try implementing conditional pagination - fetch metadata first, then attachments separately for records with large files. This approach gives you better control over data integrity verification.
The 504 timeout is typically triggered when the total response payload exceeds the gateway limit (usually around 50MB). Your issue isn’t the page size parameter - it’s the cumulative attachment size in a single request. Zoho’s API has an undocumented soft limit of approximately 30-40MB per response. When you request fields=all with include_attachments=true, you’re loading everything into memory at once. I’d recommend a two-phase approach: first export contract metadata with pagination set to 200 records, then make separate attachment download requests using the attachment IDs. This way you can verify data integrity at each step and avoid incomplete backups.
One thing nobody mentioned - check your connection timeout settings on the client side. Default HTTP client timeouts are often 30-60 seconds, which isn’t enough for attachment-heavy responses.
I’ve implemented this exact solution for several clients dealing with contract exports. Here’s a comprehensive approach that addresses all three focus areas:
Attachment Size Limits Strategy:
Zoho CRM has a per-file attachment limit of 20MB and a practical per-response payload limit of approximately 35-40MB. Never request attachments inline with record data. Instead, use a three-phase approach:
// Phase 1: Fetch contract metadata only
GET /crm/v2/Contracts?fields=id,Contract_Name,Modified_Time&per_page=200&page=1
// Phase 2: For each contract, get attachment metadata
GET /crm/v2/Contracts/{record_id}/Attachments
// Phase 3: Download attachments individually
GET /crm/v2/Contracts/{record_id}/Attachments/{attachment_id}
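One way to orchestrate those three phases in a script - the endpoint paths mirror the GET requests above, and `api_get` is a hypothetical injected callable (in production it would wrap an authenticated HTTP client), so the flow can be exercised without real credentials:

```python
def export_contracts(api_get):
    # Phase 1: metadata-only pages of 200 records
    contracts, page = [], 1
    while True:
        resp = api_get(f"/crm/v2/Contracts?fields=id,Contract_Name,"
                       f"Modified_Time&per_page=200&page={page}")
        contracts.extend(resp["data"])
        if not resp["info"]["more_records"]:
            break
        page += 1

    # Phase 2: attachment metadata per contract
    plan = []
    for c in contracts:
        meta = api_get(f"/crm/v2/Contracts/{c['id']}/Attachments")
        for att in meta["data"]:
            plan.append((c["id"], att["id"]))

    # Phase 3: individual attachment downloads
    files = {}
    for rec_id, att_id in plan:
        files[(rec_id, att_id)] = api_get(
            f"/crm/v2/Contracts/{rec_id}/Attachments/{att_id}")
    return contracts, files
```

Because each phase is a separate small request, no single response carries the cumulative attachment payload that was triggering the 504s.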
API Pagination Best Practices:
Always use the ‘page’ parameter with per_page=200 for metadata-only requests. For attachment downloads, implement a queue system with retry logic.
- Track the ‘info’ object in API responses - it contains ‘page’, ‘per_page’, ‘count’, and ‘more_records’ fields
- Never rely on page count calculations; always check the ‘more_records’ boolean
- Implement a state machine that tracks: records_fetched, attachments_identified, attachments_downloaded. This gives you checkpoint recovery if the process fails
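A minimal sketch of that checkpoint state, persisted to a JSON file so a failed run can resume from the last completed stage. The file layout and stage names are illustrative, not a Zoho convention:

```python
import json
import os

# Ordered stages from the state machine described above.
STAGES = ("records_fetched", "attachments_identified",
          "attachments_downloaded")

def load_state(path):
    """Return {record_id: latest_completed_stage}, or {} on first run."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {}

def advance(state, record_id, stage, path):
    """Record that record_id has completed `stage` and persist the state."""
    assert stage in STAGES
    state[record_id] = stage
    with open(path, "w") as f:
        json.dump(state, f)
```

On restart, any record whose stage is not ‘attachments_downloaded’ is re-queued from the stage after the one recorded.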
Data Integrity Verification:
Implement a three-level verification system:
- Record-level: Compare exported record count against API’s total count from the ‘info’ object
- Attachment-level: Verify each downloaded file’s size matches the ‘File_Size’ field from attachment metadata
- Checksum-level: Generate SHA-256 hashes of downloaded files and store them in your backup database
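The attachment-level and checksum-level checks above can be combined into one pass over the downloaded bytes. ‘File_Size’ is the metadata field named above; where the expected hash comes from (your backup database) is assumed:

```python
import hashlib

def verify_attachment(data, expected_size, expected_sha256=None):
    """Return True iff the downloaded bytes pass both checks."""
    # Attachment-level: size must match the File_Size metadata field.
    if len(data) != expected_size:
        return False
    # Checksum-level: SHA-256 must match the stored hash, when one exists
    # (on first export there is nothing to compare against yet).
    if expected_sha256 is not None:
        if hashlib.sha256(data).hexdigest() != expected_sha256:
            return False
    return True
```

Store the computed digest alongside the record ID so subsequent compliance runs have a baseline to compare against.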
For your specific timeout issue, modify your export script to:
- Set HTTP client timeout to 180 seconds for attachment downloads
- Implement exponential backoff (start with 5s delay, double on each retry, max 60s)
- Add a download queue with priority based on attachment size (download smaller files first)
- Use parallel downloads with a concurrency limit of 3 to respect rate limits
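The retry and queue tuning above might look like this - exponential backoff capped at 60s, and a smallest-first queue via a heap. The `download` callable and the `sleep` parameter are injected stand-ins for the real attachment request and delay (concurrency via e.g. a thread pool with 3 workers is omitted for brevity):

```python
import heapq
import time

def backoff_delays(base=5, cap=60):
    """Yield 5, 10, 20, 40, 60, 60, ... seconds."""
    delay = base
    while True:
        yield delay
        delay = min(delay * 2, cap)

def download_smallest_first(attachments, download, max_retries=4,
                            sleep=time.sleep):
    """attachments: iterable of (size_bytes, attachment_id) tuples."""
    heap = list(attachments)
    heapq.heapify(heap)                  # smallest size pops first
    results = {}
    while heap:
        _size, att_id = heapq.heappop(heap)
        delays = backoff_delays()
        for attempt in range(max_retries):
            try:
                results[att_id] = download(att_id)
                break
            except IOError:
                if attempt == max_retries - 1:
                    raise                # exhausted retries; surface it
                sleep(next(delays))      # back off before retrying
    return results
```

Downloading smaller files first means a late failure costs you the fewest completed transfers, and the injectable `sleep` keeps the retry path testable.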
For compliance audits, maintain an export log table with columns: record_id, export_timestamp, attachment_count, total_size_bytes, checksum, verification_status. This gives you an audit trail proving data integrity. I’ve used this pattern to successfully export over 50,000 contract records with attachments totaling 2TB without data loss.
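The audit-trail table above, sketched as a SQLite schema - the column names follow the post; the types and the allowed verification_status values are assumptions:

```python
import sqlite3

def create_export_log(conn):
    # One row per exported contract record; checksum is the SHA-256
    # of the concatenated attachment hashes (or NULL before verification).
    conn.execute("""
        CREATE TABLE IF NOT EXISTS export_log (
            record_id           TEXT PRIMARY KEY,
            export_timestamp    TEXT NOT NULL,
            attachment_count    INTEGER NOT NULL,
            total_size_bytes    INTEGER NOT NULL,
            checksum            TEXT,
            verification_status TEXT
                CHECK (verification_status IN
                       ('pending', 'verified', 'failed'))
        )
    """)
```

For an audit, `SELECT COUNT(*) ... WHERE verification_status != 'verified'` returning zero is the proof-of-completeness query.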
Have you checked the API rate limits? The timeout could also be rate limiting kicking in when processing attachment-heavy requests. Zoho CRM API has a limit of 100 API calls per minute for the Professional edition. Large attachment exports consume more processing time per call, potentially triggering rate limits even if you’re not hitting the call count. Consider implementing exponential backoff retry logic and monitor your API usage in the developer console.
Thanks for the suggestions. I’ve been testing the two-phase approach, but I’m still unclear on the best pagination strategy. Should I be using the ‘page’ parameter or the ‘offset’ parameter for contract records? And how do I ensure I’m not missing any records when the API returns inconsistent page sizes?