Export query results to Blob Storage
====================================

This document describes how to export the result of a query that runs against a
[BigLake table](/bigquery/docs/biglake-intro) to your Azure Blob Storage.

For information about how data flows between BigQuery and
Azure Blob Storage, see
[Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).

Limitations
-----------

For a full list of limitations that apply to BigLake tables
based on Amazon S3 and Blob Storage, see
[Limitations](/bigquery/docs/omni-introduction#limitations).
Before you begin
----------------
Ensure that you have the following resources:
- A [connection to access your Blob Storage](/bigquery/docs/omni-azure-create-connection).
  Within the connection, you must create a policy for the Blob Storage
  container path that you want to export to. Then, within that policy,
  create a role that has the
  `Microsoft.Storage/storageAccounts/blobServices/containers/write` permission.
- A [Blob Storage BigLake table](/bigquery/docs/omni-azure-create-external-table).
- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing),
  then ensure that you have enabled the
  [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com)
  for your project. For information about pricing, see
  [BigQuery Omni pricing](/bigquery/pricing#bqomni).
Export query results
--------------------

BigQuery Omni writes to the specified Blob Storage location regardless of any existing
content. The export query can overwrite existing data or mix the query result
with existing data. We recommend that you export the query result to an empty
Blob Storage container.
1. In the Google Cloud console, go to the **BigQuery** page.

   [Go to BigQuery](https://console.cloud.google.com/bigquery)

2. In the **Query editor** field, enter a GoogleSQL export query:

   ```sql
   EXPORT DATA WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
   OPTIONS(
     uri="azure://AZURE_STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME/FILE_PATH/*",
     format="FORMAT"
   )
   AS QUERY
   ```

   Replace the following:
   - `CONNECTION_REGION`: the region where the connection was created.
   - `CONNECTION_NAME`: the name of the connection that you created with the
     necessary permission to write to the container.
   - `AZURE_STORAGE_ACCOUNT_NAME`: the name of the Blob Storage account to
     which you want to write the query result.
   - `CONTAINER_NAME`: the name of the container to which you want to write
     the query result.
   - `FILE_PATH`: the path where you want to write the exported files. It
     must contain exactly one wildcard `*` anywhere in the leaf directory of
     the path string, for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and
     `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the
     number of files exported. BigQuery determines the file count and sizes.
     If BigQuery decides to export two files, then `*` in the first file's
     filename is replaced by `000000000000`, and `*` in the second file's
     filename is replaced by `000000000001`. For a filled-in example, see the
     sketch after this list.
   - `FORMAT`: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.
   - `QUERY`: the query to analyze the data that is stored in a
     BigLake table.

   **Note:** To override the default project, use the
   `--project_id=PROJECT_ID` parameter. Replace `PROJECT_ID` with the ID of
   your Google Cloud project.
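   For illustration only, a filled-in export query might look like the
   following sketch. The connection, storage account, container, file path,
   dataset, and table names here are hypothetical placeholders, not values
   from this guide:

   ```sql
   -- Hypothetical example: export query results as Parquet files to the
   -- container "my-container". All identifiers (connection, storage
   -- account, container, dataset, table) are illustrative placeholders.
   EXPORT DATA WITH CONNECTION `azure-eastus2.my-omni-connection`
   OPTIONS(
     uri="azure://mystorageaccount.blob.core.windows.net/my-container/exports/sales_*.parquet",
     format="PARQUET"
   )
   AS SELECT id, amount, sale_date
      FROM mydataset.sales
      WHERE sale_date = DATE "2024-01-01"
   ```

   With this URI, if BigQuery splits the result into two files, the
   container would hold `exports/sales_000000000000.parquet` and
   `exports/sales_000000000001.parquet`.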
Troubleshooting
---------------
If you get an error related to `quota failure`, then check if you have reserved
capacity for your queries. For more information about slot reservations, see
[Before you begin](#before_you_begin) in this document.
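As a minimal sketch, reserving and assigning slots with reservation DDL might
look like the following; the admin project, location path, reservation name,
and slot count are assumptions for illustration, and your organization might
manage reservations through the console instead:

```sql
-- Minimal sketch, assuming an admin project "my-admin-project" and the
-- BigQuery Omni location "azure-eastus2"; all names and the slot count
-- are illustrative placeholders.

-- Create a reservation with 100 slots.
CREATE RESERVATION `my-admin-project.region-azure-eastus2.my-reservation`
OPTIONS (slot_capacity = 100);

-- Assign a project's query jobs to that reservation.
CREATE ASSIGNMENT `my-admin-project.region-azure-eastus2.my-reservation.my-assignment`
OPTIONS (
  assignee = "projects/my-project",
  job_type = "QUERY");
```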
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-27 UTC."],[[["\u003cp\u003eThis guide outlines the process of exporting query results from a BigLake table to Azure Blob Storage.\u003c/p\u003e\n"],["\u003cp\u003eBefore exporting, you must establish a connection to your Blob Storage with appropriate write permissions and have a BigLake table in place.\u003c/p\u003e\n"],["\u003cp\u003eThe export process involves using a specific GoogleSQL \u003ccode\u003eEXPORT DATA\u003c/code\u003e query with connection details, target URI, desired format, and the source query.\u003c/p\u003e\n"],["\u003cp\u003eIt is recommended to export query results to an empty Blob Storage container, because the export query can overwrite or mix with existing data.\u003c/p\u003e\n"],["\u003cp\u003eIf you are receiving \u003ccode\u003equota failure\u003c/code\u003e error, check if you have reserved capacity for your queries.\u003c/p\u003e\n"]]],[],null,["# Export query results to Blob Storage\n====================================\n\nThis document describes how to export the result of a query that runs against a\n[BigLake table](/bigquery/docs/biglake-intro) to your\nAzure Blob Storage.\n\nFor information about how data flows between BigQuery and\nAzure Blob Storage,\nsee [Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).\n\nLimitations\n-----------\n\nFor a full list of limitations that apply to BigLake tables\nbased on Amazon S3 and Blob Storage, see [Limitations](/bigquery/docs/omni-introduction#limitations).\n\nBefore you begin\n----------------\n\nEnsure that you have the following resources:\n\n\n- A [connection to access your Blob Storage](/bigquery/docs/omni-azure-create-connection). Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the `Microsoft.Storage/storageAccounts/blobServices/containers/write` permission.\n- An [Blob Storage BigLake table](/bigquery/docs/omni-azure-create-external-table).\n\n\u003c!-- --\u003e\n\n- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing), then ensure that you have enabled the [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com) for your project. For information about pricing, see [BigQuery Omni pricing](/bigquery/pricing#bqomni).\n\nExport query results\n--------------------\n\nBigQuery Omni writes to the specified Blob Storage location regardless of any existing\ncontent. The export query can overwrite existing data or mix the query result\nwith existing data. We recommend that you export the query result to an empty\nBlob Storage container.\n\n1. In the Google Cloud console, go to the **BigQuery** page.\n\n [Go to BigQuery](https://console.cloud.google.com/bigquery)\n2. 
In the **Query editor** field, enter a GoogleSQL export query:\n\n ```bash\n EXPORT DATA WITH CONNECTION \\`CONNECTION_REGION.CONNECTION_NAME\\`\n OPTIONS(\n uri=\"azure://\u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e.blob.core.windows.net/\u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e/\u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e/*\",\n format=\"\u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e\"\n )\n AS QUERY\n ```\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e: the region where the connection was created.\n - \u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e: the connection name that you created with the necessary permission to write to the container.\n - \u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e: the name of the Blob Storage account to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e: the name of the container to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e: the path where you want to write the exported file to. It must contain exactly one wildcard `*` anywhere in the leaf directory of the path string, for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's filename is replaced by `000000000000`, and `*` in the second file's filename is replaced by `000000000001`.\n - \u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.\n - \u003cvar translate=\"no\"\u003eQUERY\u003c/var\u003e: the query to analyze the data that is stored in a BigLake table.\n\n| **Note:** To override the default project, use the `--project_id=`\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e parameter. Replace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with the ID of your Google Cloud project.\n\nTroubleshooting\n---------------\n\nIf you get an error related to `quota failure`, then check if you have reserved\ncapacity for your queries. For more information about slot reservations, see\n[Before you begin](#before_you_begin) in this document.\n\nWhat's next\n-----------\n\n- Learn about [BigQuery Omni](/bigquery/docs/omni-introduction).\n- Learn how to [export table data](/bigquery/docs/exporting-data).\n- Learn how to [query data stored in Blob Storage](/bigquery/docs/query-azure-data).\n- Learn how to [set up VPC Service Controls for BigQuery Omni](/bigquery/docs/omni-vpc-sc)."]]