Automating enterprise reporting workflows using REST API for scheduled multi-format exports

We’ve successfully automated our enterprise reporting workflows using the Cognos Analytics REST API, eliminating 20+ hours of weekly manual effort. Previously, our reporting team manually ran 50+ reports daily, exported them to various formats (PDF, Excel, CSV), and distributed them to different stakeholder groups via email. The process was error-prone and time-consuming.

We built a Python-based orchestration system around the REST API that handles the entire workflow: it runs on a schedule, generates reports with the appropriate parameters, exports to the required formats, and distributes the output via email or file shares. This has cut report delivery time from hours to minutes and freed our team for higher-value analysis work.

Key implementation aspects: API automation for report execution, scheduled report generation with error handling, and multi-format export management. Happy to share implementation details and lessons learned.

We use OAuth2 client credentials for API authentication with a dedicated service account. This avoids password expiration issues. For report parameters, we maintain a JSON configuration file that maps each report to its schedule, parameters, output formats, and distribution list. The orchestration system reads this config at runtime. We also store execution history in a PostgreSQL database for auditing and troubleshooting.
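
For anyone wiring this up, here is a minimal sketch of the client-credentials token flow with caching. The endpoint URL and response field names follow the standard OAuth2 grant (RFC 6749), not the exact values of our identity provider, so treat them as illustrative:

```python
import json
import time
import urllib.parse
import urllib.request

class TokenManager:
    """Fetches and caches an OAuth2 client-credentials token for the
    service account. Endpoint and field names are illustrative."""

    def __init__(self, token_url, client_id, client_secret):
        self.token_url = token_url
        self.client_id = client_id
        self.client_secret = client_secret
        self._token = None
        self._expires_at = 0.0

    def _request_body(self):
        # Standard client-credentials grant body (RFC 6749, section 4.4)
        return urllib.parse.urlencode({
            "grant_type": "client_credentials",
            "client_id": self.client_id,
            "client_secret": self.client_secret,
        }).encode()

    def token(self):
        # Reuse the cached token until shortly before it expires
        if self._token and time.time() < self._expires_at - 60:
            return self._token
        req = urllib.request.Request(
            self.token_url,
            data=self._request_body(),
            headers={"Content-Type": "application/x-www-form-urlencoded"},
        )
        with urllib.request.urlopen(req) as resp:
            payload = json.load(resp)
        self._token = payload["access_token"]
        self._expires_at = time.time() + payload.get("expires_in", 3600)
        return self._token
```

The cache matters in practice: a busy worker pool would otherwise hammer the token endpoint on every API call.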

Great questions! Let me provide a comprehensive overview of our implementation covering all these aspects.

API Automation Architecture:

We built a Python-based orchestration system using these components:

  1. Scheduler: APScheduler for job scheduling
  2. API Client: Custom wrapper around requests library for Cognos API calls
  3. Queue Manager: RabbitMQ for distributing report generation tasks
  4. Worker Pool: 5 worker processes that execute reports in parallel
  5. Storage: PostgreSQL for configuration and execution history
  6. Notification: Integration with Slack and email (SMTP)

The workflow:


Scheduler → Queue → Workers → API Calls → Export → Distribute → Log
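
As an in-process sketch of that pipeline (the real deployment uses RabbitMQ and separate worker processes; here a queue.Queue and a thread pool stand in for both):

```python
import queue
import threading

def run_pipeline(jobs, execute, workers=5):
    """Drain a job queue with a fixed pool of workers.

    `execute` stands in for the API call / export / distribute steps;
    collected results stand in for the "Log" step."""
    q = queue.Queue()
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                job = q.get_nowait()
            except queue.Empty:
                return                       # queue drained, worker exits
            outcome = execute(job)
            with lock:
                results.append(outcome)
            q.task_done()

    for job in jobs:
        q.put(job)
    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

The production version differs mainly in durability: RabbitMQ keeps the queue alive across worker restarts, which an in-process queue cannot.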

Scheduled Reports Implementation:

Configuration structure (JSON):

{
  "reports": [
    {
      "id": "i8A1B2C3D4E5F",
      "name": "Daily Sales Report",
      "schedule": "0 6 * * *",
      "parameters": {
        "region": "{{DYNAMIC:user.region}}",
        "date": "{{DYNAMIC:yesterday}}"
      },
      "outputs": [
        {"format": "PDF", "recipients": ["exec@company.com"]},
        {"format": "XLSX", "recipients": ["sales@company.com"]}
      ]
    }
  ]
}

The system supports:

  • Cron-style schedules for flexible timing
  • Dynamic parameter resolution (dates, user attributes, etc.)
  • Multiple output formats per report
  • Conditional execution based on data availability
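
The dynamic parameter resolution can be sketched like this. The token names (yesterday, user.region) mirror the sample config above; the resolver itself is an illustrative simplification, not our exact code:

```python
import re
from datetime import date, timedelta

def resolve_parameters(params, context):
    """Replace {{DYNAMIC:...}} placeholders from the JSON config with
    concrete values; literal parameters pass through untouched."""
    def resolve(value):
        if not isinstance(value, str):
            return value
        match = re.fullmatch(r"\{\{DYNAMIC:([\w.]+)\}\}", value)
        if not match:
            return value                      # literal parameter
        token = match.group(1)
        if token == "yesterday":
            return (date.today() - timedelta(days=1)).isoformat()
        if token in context:                  # e.g. "user.region"
            return context[token]
        raise KeyError(f"unknown dynamic token: {token}")
    return {k: resolve(v) for k, v in params.items()}
```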

Error Handling Strategy:

Implemented at multiple levels:

  1. Parameter Validation (before execution):
def validate_parameters(report_id, params):
    # Fail fast: ask the server to check parameters before execution
    response = cognos_api.post(
        f'/api/v1/reports/{report_id}/validate',
        json={'parameters': params}
    )
    if not response.ok:
        raise ValidationError(response.json()['errors'])
  2. Execution Retry Logic:
@retry(stop=stop_after_attempt(3),
       wait=wait_exponential(multiplier=1, min=4, max=60))
def execute_report(report_id, params):
    response = cognos_api.post(
        f'/api/v1/reports/{report_id}/execute',
        json={'parameters': params, 'async': True}
    )
    return response.json()['executionId']
  3. Failure Notifications:
  • Immediate Slack alert for critical reports
  • Daily digest email with all failures
  • Dashboard showing execution status by report
  4. Graceful Degradation:
  • If a report fails, other reports continue
  • Partial success (e.g., PDF succeeds, Excel fails) is handled
  • Failed jobs are queued for manual review
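
The per-output isolation is the key piece; roughly (export and distribute here are stand-ins for the helpers shown elsewhere in this post):

```python
def run_outputs(report, export, distribute):
    """Run every configured output for one report, isolating failures:
    a failed format does not abort the others, and failures are
    collected for the manual-review queue."""
    failures = []
    for output in report["outputs"]:
        try:
            artifact = export(report["id"], output["format"])
            distribute(artifact, output["recipients"])
        except Exception as exc:  # broad on purpose: isolate each output
            failures.append({"format": output["format"], "error": str(exc)})
    return failures               # non-empty => flagged for manual review
```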

Multi-Format Export Management:

Format-specific handling:

  1. Format Configuration (stored in report metadata):
{
  "format": "XLSX",
  "options": {
    "includeFormatting": true,
    "includeLayout": true,
    "columnWidths": "auto",
    "sheetName": "Sales Data"
  }
}
  2. Async Export for Large Reports:
def export_report(execution_id, output_format, timeout=1800):
    # Submit the export request; async avoids holding the connection open
    export_response = cognos_api.post(
        f'/api/v1/executions/{execution_id}/export',
        json={'outputFormat': output_format, 'async': True}
    )
    export_id = export_response.json()['exportId']

    # Poll for completion, bounded so a stuck export cannot hang a worker
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = cognos_api.get(
            f'/api/v1/exports/{export_id}/status'
        ).json()

        if status['state'] == 'completed':
            return download_export(export_id)
        if status['state'] == 'failed':
            raise ExportError(status['error'])

        time.sleep(10)

    raise ExportError(f'Export {export_id} timed out after {timeout}s')
  3. Format-Specific Processing:
  • PDF: Add watermarks, merge multiple reports
  • Excel: Apply cell formatting, add summary sheets
  • CSV: Handle encoding, delimiter options
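
For CSV, the encoding/delimiter step looks roughly like this (the encoding and delimiter values are examples of what a downstream consumer might demand, not Cognos defaults):

```python
import csv
import io

def normalize_csv(raw_bytes, src_encoding="utf-8", delimiter=";"):
    """Re-encode a comma-delimited CSV export and rewrite it with a
    different delimiter for downstream tools."""
    text = raw_bytes.decode(src_encoding)
    rows = list(csv.reader(io.StringIO(text)))   # parse the original CSV
    out = io.StringIO()
    csv.writer(out, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return out.getvalue().encode("utf-8")
```

Going through the csv module rather than str.replace matters once field values themselves contain commas or quotes.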

Infrastructure and Scaling:

Deployment architecture:

  • Environment: On-premises (Docker containers)
  • Orchestration: Kubernetes for container management
  • Scaling: Horizontal scaling of worker pods based on queue depth
  • Resource Limits: Max 10 concurrent API calls to Cognos (to avoid overload)

Load management:

class CognosAPIClient:
    def __init__(self, max_concurrent=10):
        self.semaphore = asyncio.Semaphore(max_concurrent)

    async def execute_report(self, report_id, params):
        async with self.semaphore:
            # Only max_concurrent requests execute simultaneously
            return await self._execute(report_id, params)

Peak time handling:

  • Stagger report execution (high-priority reports first)
  • Pre-generate reports during off-peak hours when possible
  • Cache frequently-requested reports
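
The staggering itself is simple priority ordering. A sketch, assuming a `priority` field in the report config (lower number = more important) that the sample JSON above doesn't show:

```python
def stagger_schedule(reports, start_minute=0, gap_minutes=5):
    """Assign each report a start slot: high-priority reports get the
    earliest slots, the rest follow at a fixed spacing."""
    ordered = sorted(reports, key=lambda r: r.get("priority", 99))
    return [(r["id"], start_minute + i * gap_minutes)
            for i, r in enumerate(ordered)]
```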

Distribution Implementation:

  1. Email Distribution:
def distribute_via_email(report_file, recipients, metadata):
    msg = EmailMessage()
    msg['Subject'] = f"{metadata['name']} - {date.today()}"
    msg['From'] = 'reports@company.com'
    msg['To'] = ', '.join(recipients)

    msg.set_content(f"Please find attached {metadata['name']}")

    # Derive the MIME type from the file extension so XLSX/CSV
    # attachments are not mislabeled as PDF
    ctype, _ = mimetypes.guess_type(report_file)
    maintype, subtype = (ctype or 'application/octet-stream').split('/', 1)

    with open(report_file, 'rb') as f:
        msg.add_attachment(
            f.read(),
            maintype=maintype,
            subtype=subtype,
            filename=os.path.basename(report_file)
        )

    smtp.send_message(msg)
  2. File Share Distribution:
  • Mount network shares in worker containers
  • Organize by date and report type
  • Set appropriate file permissions
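
The date/report-type layout on the share reduces to a small path helper, e.g. <root>/sales/2024-01-15/Daily Sales Report.pdf. The scheme is our own convention, nothing Cognos mandates:

```python
from datetime import date
from pathlib import Path

def share_path(root, report_type, report_name, fmt, run_date):
    """Build the destination path on the mounted network share,
    organized by report type and run date."""
    return (Path(root) / report_type / run_date.isoformat()
            / f"{report_name}.{fmt.lower()}")
```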

Monitoring and Observability:

Metrics tracked:

  • Report execution success rate (target: >98%)
  • Average execution time by report
  • Export time by format
  • Distribution success rate
  • API response times

Dashboard built in Cognos showing:

  • Daily execution summary
  • Failed reports with error details
  • Execution time trends
  • Top slowest reports

Results and Benefits:

  • Time Savings: 20 hours/week manual effort eliminated
  • Reliability: 99.2% success rate (vs. ~85% manual)
  • Speed: Reports delivered in 10-30 minutes vs. 2-4 hours
  • Scalability: Can handle 200+ reports/day with current infrastructure
  • Audit Trail: Complete execution history for compliance

Lessons Learned:

  1. Start with a small subset of reports to prove the concept
  2. Invest in robust error handling early - it pays off
  3. Monitor Cognos server load and implement rate limiting
  4. Use async API calls for any report that takes >30 seconds
  5. Keep configuration in version control (Git)
  6. Build a simple UI for non-technical users to view execution status
  7. Document API usage patterns for capacity planning

The entire system runs with minimal maintenance - we spend about 2 hours/week on it vs. the 20+ hours of manual work it replaced. Happy to provide code samples for specific components if helpful!

Error handling was critical for us. We implemented retry logic with exponential backoff: if a report fails, we retry up to three times with increasing delays. For persistent failures, we send Slack notifications to our ops team with the error details. We also validate report parameters before submission using the API’s validation endpoint. Failed reports are logged to our database with full error context, and each morning a summary job checks for failures from the previous day and sends a digest email.

Very interested in the multi-format export aspect. How do you handle format-specific requirements? For example, Excel exports might need specific column widths or formatting, while PDFs need page breaks. Do you configure this in the Cognos reports themselves or through API parameters? Also, what’s your approach to large reports that take minutes to generate - do you use async API calls?