Automated CDN log export to OCI Object Storage for security monitoring

Our security team needed to analyze CDN access logs for threat detection, but manually downloading logs daily from OCI Console was time-consuming and error-prone. We implemented an automated solution using OCI Events and Functions that’s been running flawlessly for 6 months.

The challenge was that CDN logs are generated hourly and stored temporarily. We needed them in Object Storage for long-term retention and SIEM integration. Here’s our implementation approach that eliminated manual log downloads and provides real-time security monitoring.

How do you handle authentication and permissions for the Function to access CDN logs and write to Object Storage? Also, what’s the processing time - do you have any lag between log generation and SIEM availability?

Functions use Dynamic Groups and IAM policies for authentication - no hardcoded credentials. The Dynamic Group includes all Functions in our compartment, and we grant it permission to read from the CDN log bucket and write to our security bucket.

Processing time is typically 2-5 minutes from log generation to SIEM availability. The Function itself runs in under 30 seconds, but there’s some delay in event triggering and log availability. For security monitoring, this near-real-time processing is acceptable.

Yes, exactly. OCI Events monitors CDN log creation events and triggers a Function automatically. The Function downloads the log file, parses it, enriches with additional metadata (geolocation, threat intel), and uploads to a dedicated Object Storage bucket.

For SIEM integration, we export logs in CEF (Common Event Format) which our security tools can parse natively. The whole pipeline runs serverless, so costs are minimal - we’re processing about 2GB of logs daily and paying less than $10/month for Functions execution.

Let me share the complete implementation details for anyone looking to build a similar solution. This automated CDN log pipeline has been handling our security monitoring needs reliably, processing 60GB+ of logs monthly with zero manual intervention.

Architecture Overview: CDN Logs Generated → OCI Events Trigger → Function Processes → Object Storage → SIEM Integration

Component 1: CDN Log Delivery to Object Storage

First, configure CDN to deliver access logs to an Object Storage bucket. In OCI Console:

  1. Navigate to CDN Distribution → Logs
  2. Enable Access Logs
  3. Specify destination bucket: `cdn-logs-raw`
  4. Logs are delivered hourly in gzip format

Log files follow naming pattern: `cdn-access-logs/distribution-id/YYYY/MM/DD/HH/logfile.gz`

Component 2: OCI Events and Functions Automation

Create an Event Rule to trigger when new CDN logs arrive:

{
  "eventType": "com.oraclecloud.objectstorage.createobject",
  "data": {
    "compartmentId": "ocid1.compartment...",
    "bucketName": "cdn-logs-raw",
    "resourceName": "cdn-access-logs/*"
  }
}

This triggers our Function whenever a new log file is created in the bucket.
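To make the trigger concrete, here's a minimal sketch of how the Function could pull the bucket and object name out of the event payload. The field names (`resourceName`, `additionalDetails.bucketName`) follow the documented OCI Object Storage event shape, and the commented handler signature follows the Fn Python FDK; treat the exact names as assumptions to verify against your own event payloads.

```python
import io
import json

def extract_object_info(event: dict) -> dict:
    """Pull the namespace, bucket, and object name out of an
    Object Storage 'createobject' event payload."""
    data = event.get("data", {})
    details = data.get("additionalDetails", {})
    return {
        "namespace": details.get("namespace"),
        "bucket": details.get("bucketName"),
        "object_name": data.get("resourceName"),
    }

# In the Fn Python FDK, the event arrives as the request body:
#
# def handler(ctx, data: io.BytesIO):
#     event = json.loads(data.getvalue())
#     info = extract_object_info(event)
#     # download info["object_name"] from info["bucket"] and process...
```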

Component 3: Function Implementation

Our Function (Python 3.9) performs these steps:

  1. Download log file from Object Storage
  2. Decompress gzip content
  3. Parse CDN log format (space-delimited)
  4. Enrich each log entry with:
    • GeoIP lookup (country, city, ASN)
    • Threat intelligence check (known malicious IPs)
    • User-agent parsing (device type, browser)
  5. Convert to CEF format for SIEM
  6. Upload processed logs to security bucket

Key Function code snippet:

import gzip, json                   # decompress log files, serialize payloads
from oci import object_storage      # OCI SDK client for bucket reads/writes

def enrich_log_entry(log_line):
    # Field positions assume our space-delimited CDN access log layout
    fields = log_line.split()
    return {
        'timestamp': fields[0],
        'client_ip': fields[1],
        'request': fields[6],
        'status': fields[8],
        'geo': lookup_geoip(fields[1])  # GeoIP helper defined elsewhere in the Function
    }
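Step 5 (CEF conversion) could look like the sketch below. The header layout (`CEF:Version|Vendor|Product|DeviceVersion|SignatureID|Name|Severity|Extension`) follows the CEF spec; the vendor/product strings and the choice of extension keys here are placeholders, not the exact values our pipeline emits.

```python
def to_cef(entry: dict) -> str:
    """Render one enriched log entry as a CEF line.

    Header values (vendor/product/version) are placeholders; the
    extension keys (src, request, outcome) come from the CEF dictionary.
    """
    extension = " ".join([
        f"src={entry['client_ip']}",
        f"request={entry['request']}",
        f"outcome={entry['status']}",
    ])
    return ("CEF:0|ExampleVendor|CDNLogPipeline|1.0|"
            "cdn-access|CDN access log|3|" + extension)
```

The Function emits one CEF line per log entry and newline-joins them before uploading to the security bucket.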

Component 4: IAM Configuration

Dynamic Group for Functions:


ALL {resource.type = 'fnfunc', resource.compartment.id = 'ocid1.compartment...'}

IAM Policy:


Allow dynamic-group cdn-log-functions to read objects in compartment Security where target.bucket.name='cdn-logs-raw'
Allow dynamic-group cdn-log-functions to manage objects in compartment Security where target.bucket.name='cdn-logs-processed'

Component 5: SIEM Integration for Analytics

Once logs are in the cdn-logs-processed bucket in CEF format, we use two integration methods:

  1. Real-time streaming: OCI Events triggers another Function that forwards logs to our SIEM via syslog
  2. Batch processing: SIEM pulls from Object Storage hourly using pre-authenticated requests
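For the real-time path, the forwarding Function could push CEF lines over UDP syslog along these lines. The collector host/port are placeholders, and the `send` parameter is injected purely so the transport can be swapped out in tests; `<134>` is the RFC 3164 priority for facility local0, severity informational.

```python
import socket

def forward_cef_lines(lines, host="siem.example.internal", port=514, send=None):
    """Forward CEF lines to a syslog collector over UDP.

    Host/port are illustrative placeholders. `send` may be injected
    for testing; by default each line becomes one UDP datagram.
    """
    if send is None:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send = lambda msg: sock.sendto(msg, (host, port))
    count = 0
    for line in lines:
        # <134> = facility local0 (16*8) + severity informational (6)
        send(f"<134>{line}".encode("utf-8"))
        count += 1
    return count
```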

The processed logs include security-relevant fields:

  • Source IP with geolocation and threat score
  • Request patterns (SQL injection attempts, path traversal)
  • Anomalous user-agent strings
  • Rate limiting violations
  • Response codes indicating attacks (403, 404 patterns)

Operational Benefits:

  • Automation: Zero manual intervention - logs flow automatically from CDN to SIEM
  • Cost: ~$8/month for Functions execution + Object Storage costs
  • Retention: 90-day retention in Object Storage with lifecycle policies for archival
  • Scalability: Handles traffic spikes automatically - Functions scale based on log volume
  • Security: No credentials in code, all authentication via Dynamic Groups

Monitoring and Alerting:

We monitor the pipeline using:

  • OCI Metrics for Function invocations and errors
  • Alarms for Function failures (alert security team)
  • Daily summary of processed log volume (detect anomalies)

Lessons Learned:

  1. Use a Function timeout of 120 seconds - some log files are large and take time to process
  2. Implement retry logic for transient Object Storage errors
  3. Add Function logging to OCI Logging for debugging
  4. Use Object Storage lifecycle policies to archive processed logs after 90 days
  5. Batch multiple log entries in single SIEM forwards to reduce overhead
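The retry logic from lesson 2 can be as simple as a generic exponential-backoff wrapper like the one below (the OCI Python SDK also ships built-in retry strategies in `oci.retry`, which may be preferable). Function and parameter names here are illustrative.

```python
import random
import time

def call_with_retries(fn, attempts=4, base_delay=0.5, retriable=(Exception,)):
    """Call fn(), retrying transient failures with exponential backoff
    plus jitter. Re-raises the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except retriable:
            if attempt == attempts - 1:
                raise
            # 0.5s, 1s, 2s, ... plus up to 100ms of jitter
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

In the Function, the `get_object`/`put_object` calls would be wrapped in this, with `retriable` narrowed to the SDK's service-error types.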

This architecture has proven reliable and cost-effective for our security monitoring needs. The serverless approach means we pay only for actual log processing, and the automation eliminates the manual toil that was consuming security team time. The enrichment capabilities give our security analysts much better context for threat detection compared to raw CDN logs.

Have you considered using OCI Logging Analytics instead of building a custom pipeline? It has built-in CDN log parsing and can integrate with SIEM tools. Just curious about your architectural decision - custom Functions vs managed service.