The author

Ben Kentzer

Head of Data Engineering

03 May 2024

Using Data to Optimise Snowflake Costs

Whether you host Snowflake for your clients or run it as an end-user company, cost management should be on your radar.

In times when every penny makes a difference to your bottom line, keeping an eye on all this is imperative.

Breaking down costs by project, department, or client will show you where your high-usage areas are, so you can see what can be made more efficient. Remember, using a larger warehouse does not always increase your costs; in fact, it is often quite the opposite, because a larger warehouse can finish the same work in proportionally less time.
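To illustrate why a larger warehouse can cost the same or less, here is a minimal sketch. It uses the standard Snowflake credit rates per warehouse size (each size up doubles the rate), but the query timings are hypothetical:

```python
# Snowflake bills a running warehouse per second, and each size up doubles
# the credit rate (Small = 2, Medium = 4, Large = 8 credits/hour for
# standard warehouses). If a larger warehouse finishes the work
# proportionally faster, the total cost is the same or lower.
CREDITS_PER_HOUR = {"S": 2, "M": 4, "L": 8}

def query_cost(size: str, runtime_hours: float) -> float:
    """Credits consumed by a query of the given runtime on the given size."""
    return CREDITS_PER_HOUR[size] * runtime_hours

# Hypothetical timings: a query that takes 2 hours on a Small warehouse
# but parallelises well enough to finish in 25 minutes on a Large.
small_cost = query_cost("S", 2.0)      # 4.0 credits
large_cost = query_cost("L", 25 / 60)  # ~3.3 credits

print(f"Small: {small_cost:.1f} credits, Large: {large_cost:.1f} credits")
```

The larger warehouse only wins when the query actually scales with the extra compute; a query that runs at the same speed on a Large warehouse would cost four times as much.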

Setting up for Success

The cost reporting within Snowflake is excellent. Within Snowsight, you can break down the costs within an account — giving you such insights as which users are running the longest queries, and which departments are storing the most data.

This image shows an example of the compute usage for one of our clients across March:

All of this data is available in the Snowflake-provided databases (the ACCOUNT_USAGE and ORGANIZATION_USAGE schemas in the shared SNOWFLAKE database). This enables you to break down costs by user, department, organisation, or any other category that fits your business model.
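As a sketch of that breakdown: in practice the rows would come from a query against SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY; here we use hypothetical sample data and assume each warehouse maps to a single cost centre:

```python
from collections import defaultdict

# Hypothetical metering rows: (warehouse_name, credits_used).
# Real data would come from ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY.
metering_rows = [
    ("FINANCE_WH", 12.5),
    ("MARKETING_WH", 4.0),
    ("FINANCE_WH", 7.5),
    ("BATCH_LOAD_WH", 30.0),
]

# Hypothetical mapping, made possible by giving each cost centre
# its own warehouse(s).
warehouse_to_cost_centre = {
    "FINANCE_WH": "Finance",
    "MARKETING_WH": "Marketing",
    "BATCH_LOAD_WH": "Central IT",
}

def credits_by_cost_centre(rows, mapping):
    """Sum credits per cost centre using the warehouse-to-centre mapping."""
    totals = defaultdict(float)
    for warehouse, credits in rows:
        totals[mapping.get(warehouse, "Unallocated")] += credits
    return dict(totals)

print(credits_by_cost_centre(metering_rows, warehouse_to_cost_centre))
# {'Finance': 20.0, 'Marketing': 4.0, 'Central IT': 30.0}
```

Note the "Unallocated" bucket: any warehouse without an owner shows up there, which is itself a useful finding.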

Since Snowflake charges separately for data stored and queries run, our best practice for making the most of this reporting is:

  1. Be as granular as you need to be with the compute used for queries:
    - Give each cost centre its own compute warehouse(s) that only it can use.
    - Give batch processes dedicated compute warehouses.
  2. Decide who pays for data storage and other overheads:
    - A corporate data store charged to central costs.
    - User-defined marts owned by, and charged to, the relevant department.
    - Data transfer (for replication).


Alongside the standard Snowsight reporting, the underlying data can be used to build additional reports, with the outputs published in your reporting tool of choice.

This lets you produce a baseline: a snapshot of performance and cost at a particular point in time, which you then use to track changes over time.
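A baseline comparison can be as simple as storing credits per warehouse for one period and measuring percentage change in the next. A minimal sketch, with hypothetical figures:

```python
# Hypothetical snapshots of credits consumed per warehouse.
baseline = {"FINANCE_WH": 20.0, "BATCH_LOAD_WH": 30.0}  # e.g. March
current = {"FINANCE_WH": 26.0, "BATCH_LOAD_WH": 24.0}   # e.g. April

def pct_change(baseline, current):
    """Percentage change in credits per warehouse versus the baseline."""
    return {
        wh: round(100 * (current[wh] - baseline[wh]) / baseline[wh], 1)
        for wh in baseline
    }

print(pct_change(baseline, current))
# {'FINANCE_WH': 30.0, 'BATCH_LOAD_WH': -20.0}
```

Published into your reporting tool, this immediately highlights growth areas (here, the hypothetical FINANCE_WH is up 30%) against areas where optimisation is paying off.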


Once your account is set up as you need, and you have your reporting ready, you can start on optimisation.

Some starting points to consider:

  • Take the “low-hanging fruit” first: the largest queries, which have the greatest potential to reduce your costs, or those on the critical path of your batch schedule.
  • Change one thing at a time, so you can understand the impact of each action.
  • Regularly review against your original baseline; each review becomes a new baseline to compare against.
  • If a change reduces performance, roll back and re-assess.
  • Data and system architecture doesn’t have to be set in stone. It can be an expensive option, but migrating to a different data design can sometimes pay dividends.
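The review-and-rollback loop above can be sketched as a simple decision rule. The threshold here (any increase triggers a rollback) is an assumption; in practice you might tolerate small regressions or weigh cost against runtime:

```python
# Sketch of the review loop: apply one change, compare the new measurement
# against the baseline, keep the change only if it did not increase cost,
# and roll the baseline forward. The keep/roll-back rule is a hypothetical
# simplification of a real review.
def review(baseline_credits: float, new_credits: float) -> tuple:
    """Return (decision, next_baseline) for a single reviewed change."""
    if new_credits <= baseline_credits:
        return ("keep", new_credits)        # new measurement becomes the baseline
    return ("roll back", baseline_credits)  # revert the change and re-assess

print(review(100.0, 85.0))   # ('keep', 85.0)
print(review(100.0, 120.0))  # ('roll back', 100.0)
```

Because only one thing changed per review, the decision is unambiguous: whatever moved the number is the thing you keep or revert.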

Understanding your costs is the first step towards optimisation. The details provided within the Snowflake environment will give you the baseline for any changes that you make, and to understand where the growth areas are.

Jaywing can help you get to a great place with understanding your costs, giving you a springboard to get more value from your Snowflake investment as well as understand where your cost pressures are.

For more information, talk to one of our data experts.