Google BigQuery is a serverless data warehouse, but its costs still depend on the amount of data you store and process. Google Cloud recommends a mix of query and storage optimizations, combined with cost-monitoring tools. Below, we summarize the recommended techniques and strategies (based on official documentation and expert insights) for minimizing BigQuery costs.
1. Reduce data processed
1.1 Don’t use SELECT *
Avoid using SELECT *, as it significantly increases the number of bytes scanned during query execution. With SELECT *, BigQuery reads every column in the table, regardless of whether the query actually needs it. Therefore, if your table contains 20, 50, or more columns, you will be billed for scanning every column, even those that are not used. To optimize performance and reduce costs, always specify only the columns that are necessary for your analysis.
SELECT * scans all columns — more data, higher cost:
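For example (the project, dataset, and table names below are hypothetical):

```sql
-- Scans every column of the table, so every column is billed
SELECT *
FROM `my_project.sales_dataset.orders`;
```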

Selecting specific columns — less data, lower cost:
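A sketch of the same query, restricted to the columns the analysis actually needs (same hypothetical table as above):

```sql
-- Only these two columns are read and billed
SELECT order_id, order_total
FROM `my_project.sales_dataset.orders`;
```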

1.2 Filter by Partition Key
When working with partitioned tables, always filter using the partition key (such as date or the designated partition column). Omitting it causes BigQuery to scan the entire table, which increases both processing time and cost.
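A minimal sketch, assuming a hypothetical `orders` table partitioned on an `order_date` column:

```sql
-- The filter on the partition column prunes all partitions outside
-- the range, so only the matching partitions are scanned and billed
SELECT order_id, order_total
FROM `my_project.sales_dataset.orders`
WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31';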

1.3 Avoid using LIMIT
Using LIMIT alone does not reduce query costs because BigQuery scans the entire dataset before applying the limit. To minimize data scanned and reduce costs, you should apply WHERE filters and select only the necessary columns. LIMIT is only useful for restricting the number of rows returned in the result set.
Example with LIMIT — the query will still process 2.11 GB:
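A sketch of this pattern (hypothetical table names):

```sql
-- LIMIT only trims the result set; the full table is still scanned
SELECT order_id, order_total
FROM `my_project.sales_dataset.orders`
LIMIT 10;
```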
Instead, use explicit WHERE conditions to narrow the data scope, since this is what actually reduces the amount of data scanned:
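For example, assuming `order_date` is the table's partition column:

```sql
-- The WHERE filter on the partition column cuts the bytes scanned;
-- LIMIT then only caps the rows returned to the client
SELECT order_id, order_total
FROM `my_project.sales_dataset.orders`
WHERE order_date = '2024-01-15'
LIMIT 10;
```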
1.4 Use SELECT * EXCEPT
Consider using SELECT * EXCEPT when you need most, but not all, columns in a table. Querying a subset of columns in this way can significantly reduce the volume of data read during query execution, which both lowers query costs and improves performance.
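A sketch, assuming the table has a few wide columns (here hypothetically named `customer_notes` and `raw_payload`) that the analysis does not need:

```sql
-- Reads every column except the wide free-text ones,
-- which are excluded from the bytes scanned
SELECT * EXCEPT (customer_notes, raw_payload)
FROM `my_project.sales_dataset.orders`;
```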

2. Using Cached Results to Reduce Costs
BigQuery caches query results for up to 24 hours. If the same query is rerun with unchanged data, it retrieves results from the cache, avoiding additional data scan charges.
Caching is enabled by default, so repeated identical queries over unchanged tables are free.
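You can verify which of your recent jobs were served from cache by querying the jobs metadata in INFORMATION_SCHEMA (the `region-us` qualifier below is an assumption — use your dataset's region):

```sql
-- Jobs with cache_hit = TRUE were answered from the results cache
-- and report zero bytes billed
SELECT job_id, cache_hit, total_bytes_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_USER
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY creation_time DESC;
```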

3. Check the estimated cost before running a query
Before running an expensive query, estimate the data scanned using the query validator tool. In the Console, enter your query and check the validator message for the estimated bytes.
Dry-run returns the estimated bytes without using any slots, so it’s free. This helps you estimate cost and adjust the query (e.g., by adding filters) before actual execution.
In the query editor, the validator message — e.g. “This query will process 557.19 KB when run.” — shows the estimated bytes before the query is actually executed.

4. Limit Query Costs
Use the maximum bytes billed setting to cap query costs under the on-demand pricing model. If the query would read more bytes than the limit, it fails without incurring a charge.

5. Delete Unnecessary Data
Set an expiration on tables or datasets that hold temporary data or data you no longer need. Keeping large result sets in BigQuery storage costs money. If you don’t need permanent access to the results, set a default table expiration so BigQuery deletes the data for you automatically.
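Both kinds of expiration can be set with DDL (dataset and table names below are hypothetical):

```sql
-- Auto-delete this table 7 days from now
ALTER TABLE `my_project.temp_dataset.staging_results`
SET OPTIONS (
  expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
);

-- Default expiration for all new tables created in a dataset
ALTER SCHEMA `my_project.temp_dataset`
SET OPTIONS (default_table_expiration_days = 7);
```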

6. Reduce Data Before Using a JOIN
Minimize the data processed before a JOIN by aggregating first. GROUP BY and aggregate functions are resource-intensive because they shuffle data across workers, so use GROUP BY only when needed. When combining GROUP BY and JOIN, aggregate as early as possible in the query to shrink the data volume that the join has to process.
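A sketch of this pattern, using hypothetical `order_lines`, `orders`, and `customers` tables:

```sql
-- Aggregate order lines per order first, then join the much smaller
-- pre-aggregated result, instead of joining the raw line items
WITH order_totals AS (
  SELECT order_id, SUM(line_amount) AS order_total
  FROM `my_project.sales_dataset.order_lines`
  GROUP BY order_id
)
SELECT c.customer_id, SUM(t.order_total) AS lifetime_value
FROM `my_project.sales_dataset.customers` AS c
JOIN `my_project.sales_dataset.orders` AS o
  ON o.customer_id = c.customer_id
JOIN order_totals AS t
  ON t.order_id = o.order_id
GROUP BY c.customer_id;
```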

7. Use Materialized Views
In BigQuery, materialized views are precomputed query results that are stored and automatically refreshed when the underlying data changes. They improve performance and reduce costs by enabling queries to read from cached results instead of scanning the base table directly. As a result, queries run significantly faster and consume fewer resources.
You can create a materialized view like this:
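For instance, a daily revenue rollup over a hypothetical `orders` table:

```sql
CREATE MATERIALIZED VIEW `my_project.sales_dataset.daily_revenue` AS
SELECT order_date, SUM(order_total) AS revenue
FROM `my_project.sales_dataset.orders`
GROUP BY order_date;
```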

Then, you can query the materialized view instead:
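Assuming a hypothetical `daily_revenue` materialized view over the orders table:

```sql
-- Reads the precomputed aggregate instead of rescanning the base table
SELECT order_date, revenue
FROM `my_project.sales_dataset.daily_revenue`
WHERE order_date >= '2024-01-01';
```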

8. Use the WHERE clause
Use a WHERE clause to limit the data processed by the query. Prefer BOOL, INT64, FLOAT64, or DATE columns in the WHERE clause, since comparisons on these types are faster than on STRING or BYTES columns.
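A sketch combining filters on these cheaper types (column names are hypothetical):

```sql
-- Filters on DATE, BOOL, and INT64 columns evaluate faster
-- than equivalent string comparisons
SELECT order_id, order_total
FROM `my_project.sales_dataset.orders`
WHERE order_date = '2024-01-15'
  AND is_refunded = FALSE
  AND store_id = 42;
```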

Conclusion
Optimizing costs in BigQuery isn’t just about saving money — it’s about working smarter. By using partitioning, clustering, selective queries, and storage strategies, you can significantly reduce unnecessary expenses while maintaining performance. Small adjustments, big savings.