# Monitor Dataplex Universal Catalog logs

Last updated 2025-08-25 UTC.

Dataplex Universal Catalog job logs can be viewed, searched, filtered, and archived in
[Cloud Logging](/logging/docs).

To understand the costs, see
[Google Cloud Observability pricing](/stackdriver/pricing).

For more information about logging retention, see
[Logs retention periods](/logging/quotas#logs_retention_periods).

To disable all logs or exclude logs from Logging, see
[Exclusion filters](/logging/docs/exclusions).

To route logs from Logging to Cloud Storage, BigQuery,
or Pub/Sub, see
[Routing and storage overview](/logging/docs/routing/overview).

Access Dataplex Universal Catalog service logs in Logging
---------------------------------------------------------

Dataplex Universal Catalog publishes its service logs to Cloud Logging.

Replace the following:

- `PROJECT_ID`: the ID of your project

To access Logging, you can use the
[Logs Explorer](/logging/docs/view/logs-explorer-interface) in the
Google Cloud console, the
[`gcloud logging` commands](/sdk/gcloud/reference/logging), or
the [Logging API](/logging/docs/apis).

### Query data scan event logs

When you use Dataplex Universal Catalog to
[create and run a data scan](/dataplex/docs/use-auto-data-quality), a data scan
event log is produced in Logging for the resulting job.

### Console

1. In the Google Cloud console, go to the **Logs explorer** page.

   [Go to Logs explorer](https://console.cloud.google.com/logs/query)

2. In the **Logs Explorer** view, find the **Query** tab.

3. Click the **Resource** menu.

4. Select **Cloud Dataplex DataScan**. Click **Apply**.

5. Click the **Log name** menu.

6. In the **Search log names** field, enter
   `dataplex.googleapis.com%2Fdata_scan`. Select **data_scan** and
   click **Apply**.

7. Optional: Filter the logs to a specific data scan ID or location by
   adding the following filters in the log query:

   ```
   resource.labels.location="LOCATION"
   resource.labels.datascan_id="DATA_SCAN_ID"
   ```

8. Click **Run query**.
### gcloud

To read your data scan event log entries, use the
[`gcloud logging read` command](https://cloud.google.com/sdk/gcloud/reference/logging/read)
with the following query:

```
gcloud logging read \
 'resource.type="dataplex.googleapis.com/DataScan" AND
 logName=projects/PROJECT_ID/logs/dataplex.googleapis.com%2Fdata_scan AND
 resource.labels.location=LOCATION AND
 resource.labels.datascan_id=DATA_SCAN_ID' \
 --limit 10
```

### REST

To list log entries, use the
[`entries.list` method](/logging/docs/reference/v2/rest/v2/entries/list).
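The filter used in the gcloud command can also be assembled programmatically, for example before passing it to the Logging API or a client library. A minimal sketch in Python; the helper name is illustrative and not part of any Google Cloud library:

```python
# Illustrative helper (not part of any Google Cloud library): builds the
# same Logging filter shown in the gcloud command above.
def data_scan_log_filter(project_id: str, location: str, datascan_id: str) -> str:
    log_name = f"projects/{project_id}/logs/dataplex.googleapis.com%2Fdata_scan"
    clauses = [
        'resource.type="dataplex.googleapis.com/DataScan"',
        f"logName={log_name}",
        f"resource.labels.location={location}",
        f"resource.labels.datascan_id={datascan_id}",
    ]
    # Logging filters combine clauses with AND.
    return " AND ".join(clauses)

print(data_scan_log_filter("my-project", "us-central1", "daily-scan"))
```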
### Query data quality scan rule result logs

When you use Dataplex Universal Catalog to
[create and run a data quality scan](/dataplex/docs/use-auto-data-quality), a
data quality scan rule result log is produced in Logging for the
resulting job.

### Console

1. In the Google Cloud console, go to the **Logs explorer** page.

   [Go to Logs explorer](https://console.cloud.google.com/logs/query)

2. In the **Logs Explorer** view, find the **Query** tab.

3. Click the **Resource** menu.

4. Select **Cloud Dataplex DataScan**. Click **Apply**.

5. Click the **Log name** menu.

6. In the **Search log names** field, enter
   `dataplex.googleapis.com%2Fdata_quality_scan_rule_result`. Select
   **data_quality_scan_rule_result** and click **Apply**.

7. Optional: Filter the logs to a specific data scan ID or location by
   adding the following filters in the log query:

   ```
   resource.labels.location="LOCATION"
   resource.labels.datascan_id="DATA_SCAN_ID"
   ```

8. Click **Run query**.

### gcloud

To read your data quality scan rule result log entries, use the
[`gcloud logging read` command](https://cloud.google.com/sdk/gcloud/reference/logging/read)
with the following query:

```
gcloud logging read \
 'resource.type="dataplex.googleapis.com/DataScan" AND
 logName=projects/PROJECT_ID/logs/dataplex.googleapis.com%2Fdata_quality_scan_rule_result AND
 resource.labels.location=LOCATION AND
 resource.labels.datascan_id=DATA_SCAN_ID' \
 --limit 10
```

### REST

To list log entries, use the
[`entries.list` method](/logging/docs/reference/v2/rest/v2/entries/list).
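Once retrieved, rule result entries can be summarized in a script. A sketch in Python; the payload field names (`ruleName`, `passed`) are assumptions for illustration, so inspect your own log entries for the exact `jsonPayload` schema:

```python
from collections import Counter

def failed_rule_counts(entries):
    """Count failing evaluations per rule across a list of log entries.

    Assumes each entry's jsonPayload carries "ruleName" and "passed"
    fields; these names are illustrative, not a documented schema.
    """
    failures = Counter()
    for entry in entries:
        payload = entry.get("jsonPayload", {})
        if payload.get("passed") is False:
            failures[payload.get("ruleName", "unknown")] += 1
    return failures

sample = [
    {"jsonPayload": {"ruleName": "non_null_check", "passed": False}},
    {"jsonPayload": {"ruleName": "non_null_check", "passed": True}},
    {"jsonPayload": {"ruleName": "range_check", "passed": False}},
]
print(failed_rule_counts(sample))
# Counter({'non_null_check': 1, 'range_check': 1})
```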
### Query discovery logs

When you use Dataplex Universal Catalog to discover data in assets, a discovery
log is produced in Logging.

### Console

1. In the Google Cloud console, go to the **Logs explorer** page.

   [Go to Logs explorer](https://console.cloud.google.com/logs/query)

2. In the **Logs Explorer** view, find the **Query** tab.

3. Click the **Resource** menu.

4. Select **Cloud Dataplex Zone**. Click **Apply**.

5. Click the **Log name** menu.

6. In the **Search log names** field, enter
   `dataplex.googleapis.com%2Fdiscovery`. Select **discovery** and
   click **Apply**.

7. Optional: Filter the logs to a specific asset by adding the following
   filters in the log query:

   ```
   resource.labels.location="LOCATION"
   resource.labels.lake_id="LAKE_ID"
   resource.labels.zone_id="ZONE_ID"
   jsonPayload.assetId="ASSET_ID"
   ```

8. Click **Run query**.

### gcloud

To read your discovery log entries, use the
[`gcloud logging read` command](https://cloud.google.com/sdk/gcloud/reference/logging/read)
with the following query:

```
gcloud logging read \
 'resource.type="dataplex.googleapis.com/Zone" AND
 logName=projects/PROJECT_ID/logs/dataplex.googleapis.com%2Fdiscovery AND
 resource.labels.location=LOCATION AND
 resource.labels.lake_id=LAKE_ID AND
 resource.labels.zone_id=ZONE_ID AND
 jsonPayload.assetId=ASSET_ID' \
 --limit 10
```

### REST

To list log entries, use the
[`entries.list` method](/logging/docs/reference/v2/rest/v2/entries/list).
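When you read discovery logs across a whole zone, it can help to bucket the entries by asset. A sketch in Python; `jsonPayload.assetId` is the field used in the filters above, while other payload fields vary by discovery event type:

```python
from collections import defaultdict

def entries_by_asset(entries):
    """Group discovery log entries by the asset that produced them."""
    grouped = defaultdict(list)
    for entry in entries:
        asset_id = entry.get("jsonPayload", {}).get("assetId")
        if asset_id:
            grouped[asset_id].append(entry)
    return dict(grouped)

sample = [
    {"jsonPayload": {"assetId": "raw-bucket"}},
    {"jsonPayload": {"assetId": "raw-bucket"}},
    {"jsonPayload": {"assetId": "curated-table"}},
]
print({k: len(v) for k, v in entries_by_asset(sample).items()})
# {'raw-bucket': 2, 'curated-table': 1}
```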
### Query metadata job logs

When you [run a metadata import job](/dataplex/docs/import-metadata), metadata
job logs are produced in Logging.

### Console

1. In the Google Cloud console, go to the **Logs explorer** page.

   [Go to Logs explorer](https://console.cloud.google.com/logs/query)

2. In the **Logs Explorer** view, find the **Query** tab.

3. Click the **Resource** menu.

4. Select **Cloud Dataplex Metadata Job**.

5. Optional: To filter the logs to a specific location or metadata job ID,
   select a location or job ID.

6. Click **Apply**.

7. Click the **Log name** menu.

8. Type `dataplex.googleapis.com%2Fmetadata_job` and then select
   **metadata_job**.

9. Click **Apply**.

### gcloud

To read your metadata job log entries, use the
[`gcloud logging read` command](https://cloud.google.com/sdk/gcloud/reference/logging/read)
with the following query:

```
gcloud logging read \
 'resource.type="dataplex.googleapis.com/MetadataJob" AND
 logName=projects/PROJECT_ID/logs/dataplex.googleapis.com%2Fmetadata_job AND
 resource.labels.location=LOCATION AND
 resource.labels.metadata_job_id=METADATA_JOB_ID' \
 --limit 10
```

### REST

To list log entries, use the
[`entries.list` method](/logging/docs/reference/v2/rest/v2/entries/list).
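The same query can be issued through the `entries.list` REST method, whose request body takes `resourceNames`, `filter`, `orderBy`, and `pageSize` fields. A sketch of an equivalent request body, built in Python with the same placeholder values as the gcloud command:

```python
import json

# Request body for POST https://logging.googleapis.com/v2/entries:list,
# equivalent to the gcloud query above. Replace PROJECT_ID, LOCATION, and
# METADATA_JOB_ID with real values before sending.
project_id = "PROJECT_ID"
body = {
    "resourceNames": [f"projects/{project_id}"],
    "filter": (
        'resource.type="dataplex.googleapis.com/MetadataJob" AND '
        f"logName=projects/{project_id}/logs/dataplex.googleapis.com%2Fmetadata_job AND "
        "resource.labels.location=LOCATION AND "
        "resource.labels.metadata_job_id=METADATA_JOB_ID"
    ),
    "orderBy": "timestamp desc",  # newest entries first
    "pageSize": 10,               # mirrors --limit 10
}
print(json.dumps(body, indent=2))
```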
### Query process logs

When you use Dataplex Universal Catalog to
[schedule and run tasks](/dataplex/docs/schedule-custom-spark-tasks), a process
log is produced in Logging for the resulting job.

### Console

1. In the Google Cloud console, go to the **Logs explorer** page.

   [Go to Logs explorer](https://console.cloud.google.com/logs/query)

2. In the **Logs Explorer** view, find the **Query** tab.

3. Click the **Resource** menu.

4. Select **Cloud Dataplex Task**. Click **Apply**.

5. Click the **Log name** menu.

6. In the **Search log names** field, enter
   `dataplex.googleapis.com%2Fprocess`. Select **process** and click
   **Apply**.

7. Optional: Filter the logs to a specific task by adding the following
   filters in the log query:

   ```
   resource.labels.location="LOCATION"
   resource.labels.lake_id="LAKE_ID"
   resource.labels.task_id="TASK_ID"
   ```

8. Click **Run query**.

### gcloud

To read your process log entries, use the
[`gcloud logging read` command](https://cloud.google.com/sdk/gcloud/reference/logging/read)
with the following query:

```
gcloud logging read \
 'resource.type="dataplex.googleapis.com/Task" AND
 logName=projects/PROJECT_ID/logs/dataplex.googleapis.com%2Fprocess AND
 resource.labels.location=LOCATION AND
 resource.labels.lake_id=LAKE_ID AND
 resource.labels.task_id=TASK_ID' \
 --limit 10
```

### REST

To list log entries, use the
[`entries.list` method](/logging/docs/reference/v2/rest/v2/entries/list).

What's next
-----------

- Learn more about [Cloud Logging](/logging/docs).
- Learn about [Dataplex Universal Catalog monitoring](/dataplex/docs/monitoring).