Azure Blob Storage data connector returns 403 error in cloud

I’m trying to set up a data connector to Azure Blob Storage from our Snowflake cloud deployment and consistently getting 403 Forbidden errors. The same connection configuration worked fine in our test environment, but fails in production cloud.

Error message:


Error: 403 Forbidden - Access Denied
Storage Account: proddata2025
Container: analytics-raw
Blob: customer_data/*.parquet

I’ve verified the SAS token is valid and has read permissions. Is there something specific about Snowflake cloud deployments that requires additional Azure configuration? The token works when I test it directly with Azure Storage Explorer.

403 errors with valid SAS tokens usually mean IP restrictions. Check if your Azure Storage account has network rules that only allow specific IP ranges. Snowflake cloud instances use dynamic IP addresses, so you need to whitelist the entire Snowflake cloud IP range for your region.
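To confirm whether network rules are the culprit, you can inspect the storage account's rule set with the Azure CLI. A minimal check, assuming the account name from the error above and an authenticated `az` session; a `defaultAction` of `Deny` means only the listed networks get through:

```shell
# Show the storage account's network rule set.
# "defaultAction": "Deny" means only listed IP ranges / VNets are allowed.
az storage account show \
  --name proddata2025 \
  --query networkRuleSet \
  --output json
```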

Good point about IP whitelisting. I checked and our Azure Storage account has “Selected networks” enabled with only our corporate IP ranges. How do I find the Snowflake cloud IP ranges for whitelisting? Is there a published list?

Also verify the SAS token permissions are comprehensive enough. You need at least Read and List permissions for data ingestion. If you’re doing incremental loads, you might also need Write permission. Check the SAS token expiration date too - tokens generated for testing might have short validity periods.
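Since a SAS token is just a URL query string, you can sanity-check those fields directly without any Azure tooling. A sketch with a made-up token value (real tokens also carry a `sig` parameter); note that `sp` holds the permissions, `se` the expiry, and an optional `sip` field can embed an IP restriction in the token itself, which would block Snowflake's IPs even if the account firewall is open:

```shell
# A SAS token is a URL query string: split on '&' and read the fields
# that matter: sp = permissions, se = expiry, sip = allowed IP range.
SAS='sv=2024-11-04&ss=b&srt=co&sp=rl&se=2026-05-13T00:00:00Z&sip=203.0.113.0-203.0.113.255'
perms=$(echo "$SAS" | tr '&' '\n' | sed -n 's/^sp=//p')
expiry=$(echo "$SAS" | tr '&' '\n' | sed -n 's/^se=//p')
iprange=$(echo "$SAS" | tr '&' '\n' | sed -n 's/^sip=//p')
echo "permissions=$perms"
echo "expiry=$expiry"
echo "ip-range=$iprange"
```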

Here’s a complete rundown covering the three areas involved: authentication method, token permissions, and network access:

Azure Role Assignment (Recommended Approach): Instead of SAS tokens, use an Azure AD service principal:

  1. Create a service principal in Azure AD:
az ad sp create-for-rbac --name snowflake-connector
  2. Assign the Storage Blob Data Reader role:
az role assignment create --assignee [service-principal-id] \
  --role "Storage Blob Data Reader" \
  --scope /subscriptions/[sub-id]/resourceGroups/[rg]/providers/Microsoft.Storage/storageAccounts/proddata2025
  3. Configure the Snowflake connection with the service principal credentials instead of a SAS token.
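Before touching the Snowflake side, it's worth confirming that the assignment from step 2 actually landed. A quick check, using the same placeholder IDs as above:

```shell
# List role assignments for the service principal at the storage
# account scope; expect "Storage Blob Data Reader" in the output.
az role assignment list \
  --assignee [service-principal-id] \
  --scope /subscriptions/[sub-id]/resourceGroups/[rg]/providers/Microsoft.Storage/storageAccounts/proddata2025 \
  --output table
```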

SAS Token Permissions (If You Must Use Tokens): If sticking with SAS tokens, ensure comprehensive permissions:

  • Read (r): Required
  • List (l): Required for directory operations
  • Allowed resource types: Container + Object
  • Allowed services: Blob
  • Expiration: Set reasonable duration (90+ days for production)

Generate with:

az storage container generate-sas --account-name proddata2025 \
  --name analytics-raw --permissions rl --expiry 2026-05-13
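You can also exercise the generated token outside Azure Storage Explorer, straight against the Blob REST API. Assuming the SAS query string is in $SAS, a container listing should return 200; a 403 here with a token you know is valid points at network rules rather than permissions:

```shell
# List Blobs call against the container, authenticated by the SAS token.
# Prints only the HTTP status code (200 = OK, 403 = blocked).
curl -s -o /dev/null -w '%{http_code}\n' \
  "https://proddata2025.blob.core.windows.net/analytics-raw?restype=container&comp=list&$SAS"
```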

Cloud IP Whitelisting: If using network restrictions on Azure Storage:

  1. Get Snowflake cloud IP ranges from documentation (varies by region)
  2. Add to Azure Storage firewall:
    • Azure Portal > Storage Account > Networking > Firewall
    • Add IP ranges: e.g., 52.1.0.0/16 for US East
  3. Or better: Enable “Allow Azure services” which includes Snowflake cloud
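If you go the firewall route, the same changes can be scripted instead of clicked through the portal. A sketch using the example range from above (resource group is a placeholder):

```shell
# Add an allowed IP range to the storage account firewall.
az storage account network-rule add \
  --account-name proddata2025 \
  --resource-group [rg] \
  --ip-address 52.1.0.0/16

# Or let Azure services bypass the firewall entirely.
az storage account update \
  --name proddata2025 \
  --resource-group [rg] \
  --bypass AzureServices
```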

Production Best Practice: Use Azure Private Link for Snowflake-to-Azure Storage connections. This keeps traffic on Microsoft backbone network, avoids public internet, and eliminates IP whitelisting:

  • Create Private Endpoint for Storage Account
  • Configure Snowflake to use Private Link connection
  • No firewall rules needed, no IP management
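The storage side of that setup can be sketched with the CLI; VNet and subnet names here are placeholders, and the Snowflake side is configured separately through their Private Link onboarding process:

```shell
# Create a private endpoint for the blob service of the storage account.
az network private-endpoint create \
  --name pe-proddata2025-blob \
  --resource-group [rg] \
  --vnet-name [vnet-name] \
  --subnet [subnet-name] \
  --private-connection-resource-id "/subscriptions/[sub-id]/resourceGroups/[rg]/providers/Microsoft.Storage/storageAccounts/proddata2025" \
  --group-id blob \
  --connection-name proddata2025-blob-plink
```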

Why Your Test Worked but Production Failed: Your test environment likely has “Allow all networks” enabled on Azure Storage, while production has “Selected networks” with IP restrictions. Production also might have shorter SAS token expiration or different RBAC policies.

Immediate Fix: Add Snowflake cloud IP range to Azure Storage firewall or switch to service principal authentication. Long-term, implement Private Link for secure, reliable cloud-to-cloud connectivity without ongoing IP management.

Snowflake publishes IP ranges in their documentation under network policies. But honestly, a better approach is to use Azure role-based access instead of SAS tokens for cloud-to-cloud connections. Set up a managed identity or service principal with proper RBAC roles (Storage Blob Data Reader). This avoids IP whitelisting issues entirely and is more secure since you’re not managing token expiration.