Schedule Airflow DAGs
This document describes how to schedule
Airflow directed acyclic graphs (DAGs)
from
Cloud Composer 3 on the
Scheduling page in BigQuery, including how to trigger DAGs
manually, and how to view the history and logs of past DAG runs.
About managing Airflow DAGs in BigQuery
The Scheduling page in BigQuery provides tools to
schedule Airflow DAGs that run in your Cloud Composer 3 environments.
Airflow DAGs that you schedule in BigQuery are executed in
one or more Cloud Composer environments in your project. The
Scheduling page in BigQuery combines information for
all Airflow DAGs in your project.
During a DAG run, Airflow schedules and executes the individual tasks that make
up a DAG in the sequence defined by the DAG. On the Scheduling page in
BigQuery, you can view the statuses of past DAG runs, explore detailed logs of
those runs and their tasks, and view details about each DAG.

Note that you can't manage Cloud Composer environments in BigQuery. To manage
environments, for example to create an environment, install dependencies for
your DAG files, or upload, delete, or change individual DAGs, use
Cloud Composer.
To learn more about Airflow's core concepts such as Airflow DAGs, DAG runs,
tasks, or operators, see the
Core Concepts
page in the Airflow documentation.
To learn more about Cloud Composer environments, see the
Cloud Composer 3 overview page
in the Cloud Composer documentation.
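For orientation, the following is a minimal sketch of what a scheduled DAG file
looks like in Airflow 2. The dag_id, schedule, and commands are placeholders
for illustration and are not part of this guide.

```python
# A minimal scheduled DAG with two dependent tasks. The dag_id, schedule,
# and commands are placeholders for illustration only.
import pendulum

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_daily_report",
    schedule="@daily",                            # one DAG run per day
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,                                # don't backfill past dates
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo 'extract data'")
    report = BashOperator(task_id="report", bash_command="echo 'build report'")

    extract >> report                             # report waits for extract
```

When a file like this is in an environment's DAGs folder, Airflow creates one
DAG run per day and runs the report task only after the extract task succeeds.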
Before you begin
Enable the Cloud Composer API.
Make sure that your Google Cloud project has at least one Cloud Composer 3
environment with at least one DAG file already uploaded:

- To get started with Airflow DAGs, follow the instructions in the Run an
  Apache Airflow DAG in Cloud Composer 3 guide. As part of this guide, you
  create a Cloud Composer 3 environment with the default configuration, upload
  a DAG to it, and check that Airflow runs it.
- For detailed instructions on uploading an Airflow DAG to a
  Cloud Composer 3 environment, see Add and update DAGs. A minimal upload
  sketch follows this list.
- For detailed instructions on creating a Cloud Composer 3 environment, see
  Create Cloud Composer environments.
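As referenced in the list above, here is one possible way to copy a DAG file
into an environment's DAGs folder with the Cloud Storage Python client. The
bucket name and file names are hypothetical; the console and gcloud workflows
described in the linked guides work just as well.

```python
# Sketch: copy a local DAG file into a Cloud Composer environment's DAGs
# folder using the Cloud Storage client. The bucket name and file names are
# hypothetical; find your environment's bucket on its details page.
from google.cloud import storage


def upload_dag(bucket_name: str, local_path: str, dag_filename: str) -> None:
    """Upload the file to dags/<dag_filename>, where Airflow picks it up."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(f"dags/{dag_filename}")
    blob.upload_from_filename(local_path)
    print(f"Uploaded {local_path} to gs://{bucket_name}/dags/{dag_filename}")


if __name__ == "__main__":
    upload_dag("us-central1-example-env-bucket", "quickstart.py", "quickstart.py")
```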
Required permissions
To get the permissions that you need to schedule Airflow DAGs, ask your
administrator to grant you the following IAM roles on the project:

- To view Airflow DAGs and their details: Environment and Storage Object
  Viewer (roles/composer.environmentAndStorageObjectViewer)
- To trigger and pause Airflow DAGs: Environment and Storage Object User
  (roles/composer.environmentAndStorageObjectUser)

These predefined roles contain the permissions required to schedule Airflow
DAGs:

- To view Airflow DAGs and their details: composer.dags.list,
  composer.environments.list
- To trigger and pause Airflow DAGs: composer.dags.list,
  composer.environments.list, composer.dags.execute

You might also be able to get these permissions with custom roles or other
predefined roles. For more information about Cloud Composer 3 IAM, see Access
control with IAM in the Cloud Composer documentation.

Manually trigger an Airflow DAG
When you manually trigger an Airflow DAG, Airflow runs the DAG once,
independently of the schedule specified for the DAG.

To manually trigger a selected Airflow DAG, follow these steps:

1. In the Google Cloud console, go to the Scheduling page.
2. Do either of the following:
   - Click the name of the selected DAG, and then on the DAG details page,
     click Trigger DAG.
   - In the row that contains the selected DAG, click View actions in the
     Actions column, and then click Trigger DAG.

A programmatic alternative is sketched after these steps.
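Besides the console steps above, a DAG can also be triggered through the
environment's Airflow REST API. The sketch below assumes that Application
Default Credentials are configured and that you already know the environment's
Airflow web server URL; the URL and dag_id shown are placeholders, and this is
not part of the Scheduling page workflow.

```python
# Sketch: trigger a DAG run through the Airflow 2 stable REST API of a
# Cloud Composer environment. The web server URL and dag_id are placeholders;
# Application Default Credentials with the cloud-platform scope are assumed.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

web_server_url = "https://example-dot-us-central1.composer.googleusercontent.com"  # placeholder
dag_id = "example_daily_report"                                                     # placeholder

# POST /api/v1/dags/{dag_id}/dagRuns creates a manually triggered DAG run.
response = session.post(f"{web_server_url}/api/v1/dags/{dag_id}/dagRuns", json={"conf": {}})
response.raise_for_status()
print("Created run:", response.json()["dag_run_id"])
```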
View Airflow DAG run logs and details
To view details of a selected Airflow DAG, follow these steps:

1. In the Google Cloud console, go to the Scheduling page.
2. Click the name of the selected DAG.
3. On the DAG details page, select the Details tab.
4. To view past DAG runs, select the Runs tab.
   - Optional: The Runs tab displays DAG runs from the last 10 days by
     default. To filter DAG runs by a different time range, select a time
     range in the 10 days drop-down menu, and then click OK.
   - Optional: To display additional columns with DAG run details in the list
     of all DAG runs, click Column display options, select columns, and then
     click OK.
   - To view details and logs for a selected DAG run, select that DAG run.
5. To view a visualization of the DAG with task dependencies, select the
   Diagram tab. To view task details, select a task on the diagram.
6. To view the source code of the DAG, select the Code tab.
7. Optional: To refresh the displayed data, click Refresh.

A sketch for retrieving the same run information programmatically follows
these steps.
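If you prefer to inspect run history outside the Runs tab, the following
sketch lists recent DAG runs through the same Airflow REST API. The web server
URL and dag_id are placeholders.

```python
# Sketch: list recent DAG runs for one DAG through the Airflow REST API,
# a programmatic counterpart to the Runs tab. URL and dag_id are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

web_server_url = "https://example-dot-us-central1.composer.googleusercontent.com"  # placeholder
dag_id = "example_daily_report"                                                     # placeholder

# GET /api/v1/dags/{dag_id}/dagRuns returns run state and timing information.
runs = session.get(f"{web_server_url}/api/v1/dags/{dag_id}/dagRuns", params={"limit": 25}).json()

for run in runs["dag_runs"]:
    print(run["dag_run_id"], run["state"], run["start_date"])
```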
View all Airflow DAGs
To view Airflow DAGs from all Cloud Composer 3 environments in your
Google Cloud project, follow these steps:

1. In the Google Cloud console, go to the Scheduling page.
2. Optional: To display additional columns with DAG details, click Column
   display options, select columns, and then click OK.

A programmatic way to enumerate the same DAGs is sketched after these steps.
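As a programmatic counterpart to the Scheduling page's project-wide view, the
following sketch enumerates Cloud Composer environments with the
google-cloud-orchestration-airflow client and lists the DAGs in each through
the Airflow REST API. The project ID, location, and reliance on the
environment's airflow_uri field are assumptions for illustration.

```python
# Sketch: enumerate Cloud Composer environments in one location and list the
# DAGs in each through the Airflow REST API. The project, location, and use of
# the environment's airflow_uri field are assumptions for illustration.
import google.auth
from google.auth.transport.requests import AuthorizedSession
from google.cloud.orchestration.airflow import service_v1

project = "example-project"   # placeholder
location = "us-central1"      # placeholder

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)
composer = service_v1.EnvironmentsClient(credentials=credentials)

for env in composer.list_environments(parent=f"projects/{project}/locations/{location}"):
    # The environment's Airflow web server URL also serves the REST API.
    dags = session.get(f"{env.config.airflow_uri}/api/v1/dags", params={"limit": 100}).json()
    for dag in dags.get("dags", []):
        print(env.name, dag["dag_id"], "paused" if dag["is_paused"] else "active")
```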
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-25 UTC."],[[["\u003cp\u003eThe Scheduling page in BigQuery allows users to manage and schedule Airflow DAGs running in Cloud Composer 3 environments within their Google Cloud project.\u003c/p\u003e\n"],["\u003cp\u003eUsers can manually trigger DAGs, view past run statuses and detailed logs, and examine DAG details, including visualizations and source code, all from the Scheduling page.\u003c/p\u003e\n"],["\u003cp\u003eBefore using the Scheduling features, users must enable the Cloud Composer API and ensure they have at least one Cloud Composer 3 environment with an uploaded DAG.\u003c/p\u003e\n"],["\u003cp\u003eSpecific IAM roles are required to view, trigger, and pause Airflow DAGs, such as Environment and Storage Object Viewer and Environment and Storage Object User, which can be granted by an administrator.\u003c/p\u003e\n"],["\u003cp\u003eThis feature is in "Pre-GA" (Preview) stage, meaning it's available "as is" with limited support, and users can provide feedback or request assistance via email.\u003c/p\u003e\n"]]],[],null,["# Schedule Airflow DAGs\n=====================\n\nThis document describes how to schedule\n[Airflow directed acyclic graphs (DAGs)](/composer/docs/composer-3/composer-overview#about-airflow)\nfrom\n[Cloud Composer 3](/composer/docs/composer-3/composer-overview) on the\n**Scheduling** page in BigQuery, including how to trigger DAGs\nmanually, and how to view the history and logs of past DAG runs.\n\nAbout managing Airflow DAGs in BigQuery\n---------------------------------------\n\nThe **Scheduling** page in BigQuery provides tools to\nschedule Airflow DAGs that run in your Cloud Composer 3 environments.\n\nAirflow DAGs that you schedule in BigQuery are executed in\none or more Cloud Composer environments in your project. The\n**Scheduling** page in BigQuery combines information for\nall Airflow DAGs in your project.\n\nDuring a DAG run, Airflow schedules and executes individual tasks that make up\na DAG in a sequence defined by the DAG. On the **Scheduling** page in\nBigQuery, you can view statuses of past DAG runs, explore\ndetailed logs of all DAG runs and all tasks from these DAG runs, and view\ndetails about DAGs.\n| **Note:** You can't manage Cloud Composer environments in BigQuery. To manage environments, for example, to create an environment, install dependencies for your DAG files, upload, delete, or change individual DAGs, you use Cloud Composer.\n\nTo learn more about Airflow's core concepts such as Airflow DAGs, DAG runs,\ntasks, or operators, see the\n[Core Concepts](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/index.html)\npage in the Airflow documentation.\n\nTo learn more about Cloud Composer environments, see the\n[Cloud Composer 3 overview](/composer/docs/composer-3/composer-overview) page\nin the Cloud Composer documentation.\n\nBefore you begin\n----------------\n\n1.\n\n\n Enable the Cloud Composer API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=composer.googleapis.com)\n2. 
Make sure that your Google Cloud project has at least one Cloud Composer 3 environment, with at least one already uploaded DAG file:\n - To get started with Airflow DAGs, follow the instructions in the [Run an\n Apache Airflow DAG in Cloud Composer 3](/composer/docs/composer-3/run-apache-airflow-dag) guide. As a part of this guide, you create a Cloud Composer 3 environment with the default configuration, upload a DAG to it, and check that Airflow runs it.\n - For detailed instructions to upload an Airflow DAG to a Cloud Composer 3 environment, see [Add and update\n DAGs](/composer/docs/composer-3/manage-dags).\n - For detailed instructions to create a Cloud Composer 3 environment, see [Create\n Cloud Composer environments](/composer/docs/composer-3/create-environments).\n\n### Required permissions\n\n\nTo get the permissions that\nyou need to schedule Airflow DAGs,\n\nask your administrator to grant you the\nfollowing IAM roles on the project:\n\n- To view Airflow DAGs and their details: [Environment and Storage Object Viewer](/iam/docs/roles-permissions/composer#composer.environmentAndStorageObjectViewer) (`roles/composer.environmentAndStorageObjectViewer`)\n- To trigger and pause Airflow DAGs: [Environment and Storage Object User](/iam/docs/roles-permissions/composer#composer.environmentAndStorageObjectUser) (`roles/composer.environmentAndStorageObjectUser`)\n\n\nFor more information about granting roles, see [Manage access to projects, folders, and organizations](/iam/docs/granting-changing-revoking-access).\n\n\nThese predefined roles contain\n\nthe permissions required to schedule Airflow DAGs. To see the exact permissions that are\nrequired, expand the **Required permissions** section:\n\n\n#### Required permissions\n\nThe following permissions are required to schedule Airflow DAGs:\n\n- To view Airflow DAGs and their details: ` composers.dags.list, composer.environments.list `\n- To trigger and pause Airflow DAGs: ` composers.dags.list, composer.environments.list, composer.dags.execute`\n\n\nYou might also be able to get\nthese permissions\nwith [custom roles](/iam/docs/creating-custom-roles) or\nother [predefined roles](/iam/docs/roles-overview#predefined).\n\n\u003cbr /\u003e\n\nFor more information about Cloud Composer 3 IAM, see\n[Access control with IAM](/composer/docs/composer-3/access-control)\nin Cloud Composer documentation.\n\nManually trigger an Airflow DAG\n-------------------------------\n\nWhen you manually trigger an Airflow DAG, Airflow runs\nthe DAG once, independently from the schedule specified for the DAG.\n\nTo manually trigger a selected Airflow DAG, follow these steps:\n\n1. In the Google Cloud console, go to the **Scheduling** page.\n\n [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)\n2. Do either of the following:\n\n - Click the name of the selected DAG, and then\n on the **DAG details** page, click **Trigger DAG**.\n\n - In the row that contains the selected DAG,\n click more_vert\n **View actions** in the **Actions** column, and then click\n **Trigger DAG**.\n\nView Airflow DAG run logs and details\n-------------------------------------\n\nTo view details of a selected Airflow DAG, follow these steps:\n\n1. In the Google Cloud console, go to the **Scheduling** page.\n\n [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)\n2. Click the name of the selected DAG.\n\n3. On the **DAG details** page, select the **Details** tab.\n\n4. 
To view past DAG runs, select the **Runs** tab.\n\n 1. Optional: The **Runs** tab displays DAG runs from the last 10 days by\n default. To filter DAG runs by a different time range, in\n the **10 days** drop-down menu, select a time range, and then click\n **OK**.\n\n 2. Optional: To display additional columns with DAG run details in the list\n of all DAG runs, click view_column\n **Column display options** , and then select columns and click **OK**.\n\n 3. To view details and logs for a selected DAG run, select a DAG run.\n\n5. To view a visualization of the DAG with task dependencies,\n select the **Diagram** tab.\n\n 1. To view task details, select a task on the diagram.\n6. To view the source code of the DAG, select the **Code** tab.\n\n7. Optional: To refresh the displayed data, click **Refresh**.\n\nView all Airflow DAGs\n---------------------\n\nTo view Airflow DAGs from all Cloud Composer 3 environments in your\nGoogle Cloud project, follow these steps:\n\n1. In the Google Cloud console, go to the **Scheduling** page.\n\n [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)\n2. Optional: To display additional columns with DAG details,\n click view_column\n **Column display options** ,\n and then select columns and click **OK**.\n\nPause an Airflow DAG\n--------------------\n\nTo pause a selected Airflow DAG, follow these steps:\n\n1. In the Google Cloud console, go to the **Scheduling** page.\n\n [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)\n2. Do either of the following:\n\n - Click the name of the selected DAG, and then\n on the **DAG details** page, click **Pause DAG**.\n\n - In the row that contains the selected DAG,\n click more_vert\n **View actions** in the **Actions** column, and then click **Pause DAG**.\n\nTroubleshooting\n---------------\n\nFor instructions to troubleshoot Airflow DAGs, see\n[Troubleshooting Airflow DAGs](/composer/docs/composer-3/troubleshooting-dags)\nin Cloud Composer documentation.\n\nWhat's next\n-----------\n\n- Learn more about [writing Airflow DAGs](/composer/docs/composer-3/write-dags).\n- Learn more about [Airflow in Cloud Composer 3](/composer/docs/composer-3/composer-overview#about-airflow)."]]