Analytics ETL job for Tableau extracts runs slowly with Snowflake data source

Our Tableau Desktop 2023.2 extract creation from Snowflake is taking 4-5 hours for a dataset that’s only 2.5 million rows. This is killing our analytics refresh cycle.

The query hitting Snowflake:

SELECT orders.*, customers.segment, products.category
FROM orders
JOIN customers ON orders.customer_id = customers.id
JOIN products ON orders.product_id = products.id
WHERE orders.order_date >= '2023-01-01';
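One way to check whether the time is going to compute on the Snowflake side (rather than network transfer to Tableau) is to look at recent query runtimes in Snowflake's query history. A rough sketch, using the INFORMATION_SCHEMA.QUERY_HISTORY table function (the ILIKE filter on 'orders' is just an illustrative way to find the extract query):

-- Recent queries with their elapsed time and the warehouse size they ran on
SELECT query_id,
       warehouse_size,
       total_elapsed_time / 1000 AS elapsed_seconds,
       query_text
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 50))
WHERE query_text ILIKE '%orders%'
ORDER BY start_time DESC;

If elapsed_seconds for the extract query accounts for most of the 4-5 hours, the bottleneck is in Snowflake; if the query itself finishes quickly, look at the transfer and extract-write side instead.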

We’re using an X-Small Snowflake warehouse for cost reasons. Tableau’s extract optimization settings are at their defaults; we haven’t touched any performance tuning options. Query pushdown should be happening automatically, right? The slow extract is delaying our analytics refreshes, and business users are complaining about stale dashboards. What’s the best approach to speed this up?

The X-Small warehouse is your bottleneck. For 2.5M rows with two joins, you want at least a Medium warehouse: an X-Small runs on a single compute node with limited CPU and memory, and each step up in warehouse size roughly doubles the compute available. A 4-5 hour runtime for this volume points to compute limits, not network transfer limits. Scale the warehouse up for the extract window, then scale it back down for normal queries; since Snowflake bills per second of warehouse uptime, a larger warehouse that finishes much faster often costs about the same.
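The scale-up/scale-down cycle above can be scripted around the extract job. A minimal sketch, assuming the warehouse is named ANALYTICS_WH (substitute your own warehouse name):

-- Before the extract: resize to Medium for more compute
ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'MEDIUM';

-- ... run the Tableau extract refresh here ...

-- After the extract: back to X-Small for normal interactive queries
ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'XSMALL';

-- Optional: suspend quickly when idle so the warehouse never bills while unused
ALTER WAREHOUSE ANALYTICS_WH SET AUTO_SUSPEND = 60;

Resizing takes effect for new queries immediately, so running the ALTER just before kicking off the extract is usually enough; you can also create a dedicated warehouse for extracts so interactive users never compete with the ETL load.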