
In our last few posts, we’ve looked at several ways to monitor your Snowflake account.
Today, we’re diving into a different, but equally powerful question:
When are my Snowflake warehouses actually running?
This can give you some great insight into usage patterns, process timing, and maintenance windows.
The Query: Mapping Warehouse Usage by Hour & Day
Let’s start with the SQL query that fuels this insight:
WITH size_factors AS (
    SELECT 'XSMALL' AS SIZE, 1 AS MULTIPLIER UNION ALL
    SELECT 'SMALL', 2 UNION ALL
    SELECT 'MEDIUM', 4 UNION ALL
    SELECT 'LARGE', 8 UNION ALL
    SELECT 'XLARGE', 16 UNION ALL
    SELECT 'XXLARGE', 32 UNION ALL
    SELECT 'XXXLARGE', 64
),
warehouse_sizes AS (
    SELECT
        WAREHOUSE_NAME,
        UPPER(SIZE) AS SIZE
    FROM MONITORING.WAREHOUSE_SNAPSHOT
),
day_hour_grid AS (
    SELECT
        d.day_name,
        d.day_order,
        LPAD(h.hour::string, 2, '0') AS hour_of_day
    FROM (
        SELECT * FROM VALUES
            (1, '1 - SUN'), (2, '2 - MON'), (3, '3 - TUE'), (4, '4 - WED'),
            (5, '5 - THU'), (6, '6 - FRI'), (7, '7 - SAT')
    ) AS d(day_order, day_name)
    CROSS JOIN (
        SELECT SEQ4() AS hour FROM TABLE(GENERATOR(ROWCOUNT => 24))
    ) AS h
),
warehouse_usage AS (
    SELECT
        CASE DAYOFWEEK(h.START_TIME)
            WHEN 1 THEN '1 - SUN'
            WHEN 2 THEN '2 - MON'
            WHEN 3 THEN '3 - TUE'
            WHEN 4 THEN '4 - WED'
            WHEN 5 THEN '5 - THU'
            WHEN 6 THEN '6 - FRI'
            WHEN 7 THEN '7 - SAT'
        END AS day_name,
        LPAD(TO_CHAR(h.START_TIME, 'HH24'), 2, '0') AS hour_of_day,
        SUM(h.AVG_RUNNING * f.MULTIPLIER * (1.0 / 12)) AS estimated_credits
    FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_LOAD_HISTORY h
    JOIN warehouse_sizes s ON h.WAREHOUSE_NAME = s.WAREHOUSE_NAME
    JOIN size_factors f ON s.SIZE = f.SIZE
    WHERE h.START_TIME >= DATEADD(DAY, -30, CURRENT_DATE)
    GROUP BY 1, 2
)
SELECT
    g.day_name,
    g.hour_of_day,
    COALESCE(w.estimated_credits, 0) AS estimated_credits,
    g.day_order
FROM day_hour_grid g
LEFT JOIN warehouse_usage w
    ON g.day_name = w.day_name
    AND g.hour_of_day = w.hour_of_day
ORDER BY g.day_order, g.hour_of_day;
Let’s Break It Down
Here’s what’s going on:
- size_factors CTE: Maps Snowflake warehouse sizes to their respective credit multipliers (i.e., XSMALL = 1, SMALL = 2, and so on).
- warehouse_sizes CTE: Uses the warehouse snapshot from our previous post to get each warehouse’s size.
- day_hour_grid CTE: Creates a full 7-day x 24-hour grid using a CROSS JOIN. Each hour is left-padded (LPAD) to keep the ordering clean for visuals, and each day name is prefixed with a number (1 - SUN, etc.) so days sort properly in visuals and queries.
- warehouse_usage CTE: This is where the magic happens. It sums estimated credit consumption per hour, per day of the week, based on warehouse activity in the past 30 days. The load history reports in 5-minute intervals (12 per hour), so we multiply by (1.0 / 12) to convert each interval into its fraction of an hour.
- Final SELECT: We LEFT JOIN the full day-hour grid with actual warehouse usage so that even unused time slots appear (with 0 credits). That’s essential for clean heatmaps.
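To make the credit math concrete, here’s a small Python sketch of the same calculation the warehouse_usage CTE performs. The multipliers come from the size_factors CTE above; the interval values are made-up illustrative numbers, not real account data, and it assumes one load-history row per 5-minute interval.

```python
# Sketch of the credit estimate computed in the warehouse_usage CTE.
# Each 5-minute interval contributes (avg_running * size_multiplier) / 12
# credit-hours, since there are 12 such intervals per hour.

SIZE_MULTIPLIERS = {
    "XSMALL": 1, "SMALL": 2, "MEDIUM": 4,
    "LARGE": 8, "XLARGE": 16, "XXLARGE": 32, "XXXLARGE": 64,
}

def estimated_credits(intervals):
    """Sum estimated credits over (size, avg_running) 5-minute intervals."""
    return sum(SIZE_MULTIPLIERS[size] * avg_running / 12
               for size, avg_running in intervals)

# A LARGE warehouse fully busy for one hour (avg_running = 1.0 across all
# 12 intervals) estimates out to its multiplier: 8 credits.
busy_hour = [("LARGE", 1.0)] * 12
print(estimated_credits(busy_hour))  # 8.0
```

Note this is an estimate: AVG_RUNNING reflects query concurrency, not exact billed compute, so treat the output as a usage-pattern signal rather than a billing figure.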
Building the Heatmap (aka HeatGrid)
Now we can set up the HeatGrid in Snowflake using the following columns and settings:
- Cell Value → ESTIMATED_CREDITS
- Rows → HOUR_OF_DAY
- Columns → DAY_NAME
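If you want to sanity-check the shape of the result outside the HeatGrid, this Python sketch mirrors what the final LEFT JOIN produces: a fully populated 7 x 24 grid where missing slots default to 0. The usage dictionary here is a hypothetical stand-in for the warehouse_usage CTE output.

```python
# Sketch of the zero-filled grid produced by the final LEFT JOIN.
# Only hours with activity appear in 'usage'; the grid fills the rest with 0,
# just like COALESCE(w.estimated_credits, 0) in the query.

DAYS = ["1 - SUN", "2 - MON", "3 - TUE", "4 - WED",
        "5 - THU", "6 - FRI", "7 - SAT"]
HOURS = [f"{h:02d}" for h in range(24)]

# Illustrative usage rows: (day_name, hour_of_day) -> estimated_credits
usage = {("2 - MON", "04"): 6.5, ("3 - TUE", "04"): 7.1}

grid = {(d, h): usage.get((d, h), 0) for d in DAYS for h in HOURS}

print(len(grid))                 # 168 cells: 7 days x 24 hours
print(grid[("2 - MON", "04")])   # 6.5
print(grid[("1 - SUN", "00")])   # 0
```

The zero-padded hour strings and numbered day names are what keep the rows and columns sorting correctly, which is why the SQL goes to the trouble of LPAD and the "1 - SUN" prefixes.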
You’ll get a beautiful, color-coded grid showing your warehouse activity across days and hours.
What Can You Learn From This?
After this is completed, you will have something like the following:

Here’s a sample interpretation:
You might see that your warehouses spike at 4 AM, Sunday through Friday, but do absolutely nothing between Midnight and 3 AM. That’s valuable insight.
This tells you:
- Whether processes are running as scheduled (or too frequently).
- When maintenance can safely happen.
- When resource contention might occur.
Final Thoughts
Visualizations like this aren’t just pretty — they’re operationally strategic. With a single query and a clean chart, you can uncover when Snowflake is doing the heavy lifting — and when it’s just chilling in the cloud.
Add it to your regular monitoring toolkit. Trust me, future-you will thank you.
Until next time, may your queries be fast and your warehouses always spark joy.