The ML.CONVERT_COLOR_SPACE function
This document describes the ML.CONVERT_COLOR_SPACE scalar function, which lets
you convert images from the RGB color space to a different color space.
You can use ML.CONVERT_COLOR_SPACE with the ML.PREDICT function
(/bigquery/docs/reference/standard-sql/bigqueryml-syntax-predict) or chain it
with other functions or subqueries.
Syntax
ML.CONVERT_COLOR_SPACE(image, target_color_space)
Arguments
ML.CONVERT_COLOR_SPACE takes the following arguments:
image: a STRUCT value that represents an RGB image in one of the
following forms:
- STRUCT<ARRAY<INT64>, ARRAY<FLOAT64>>
- STRUCT<ARRAY<INT64>, ARRAY<INT64>>
The first array in the struct must contain the dimensions of the image.
It must contain three INT64 values, which represent the image height (H),
width (W), and number of channels (C).
The second array in the struct must contain the image data. The length of
the array must equal H x W x C from the preceding array. If the image data
is FLOAT64, each value in the array must be in the range [0, 1). If the
image data is INT64, each value in the array must be in the range [0, 255).
The struct value must be <= 60 MB.
target_color_space: a STRING value that specifies the target color space.
Valid values are HSV, GRAYSCALE, YIQ, and YUV.
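As a local illustration of the argument rules above, the following Python sketch checks a (dimensions, data) pair the way the image argument description requires. The validate_image helper is hypothetical and not part of the BigQuery API; it only mirrors the layout and range constraints stated in this section.

```python
# Hypothetical helper: checks the image struct layout described above.
# Not part of BigQuery; a local sketch of the documented constraints.

def validate_image(dims, data):
    """Validate the (dimensions, data) pair for the image argument."""
    if len(dims) != 3:
        raise ValueError("dimensions array must hold exactly H, W, and C")
    h, w, c = dims
    if len(data) != h * w * c:
        raise ValueError("data array length must equal H x W x C")
    if all(isinstance(v, int) for v in data):
        lo, hi = 0, 255          # INT64 image data: values in [0, 255)
    else:
        lo, hi = 0.0, 1.0        # FLOAT64 image data: values in [0, 1)
    if not all(lo <= v < hi for v in data):
        raise ValueError(f"each value must be in [{lo}, {hi})")
    return True

# A 1x2 RGB image (H=1, W=2, C=3) with INT64-style data:
validate_image([1, 2, 3], [200, 0, 0, 0, 128, 0])  # returns True
```

Note that the data array is flat: the struct carries the dimensions separately, and the length check ties the two arrays together.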
Output
ML.CONVERT_COLOR_SPACE returns a STRUCT value that represents the
modified image in the form STRUCT<ARRAY<INT64>, ARRAY<FLOAT64>>.
The first array in the struct represents the dimensions of the image, and
the second array in the struct contains the image data, similar
to the image input argument. Each value in the second array is in the
range [0, 1).
Note: If you reference ML.CONVERT_COLOR_SPACE in SQL statements in the
BigQuery editor, the function output might be too large to display. If this
occurs, write the output to a table instead.
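This document does not specify the conversion formulas themselves. As an illustration only, a common RGB-to-grayscale mapping uses the ITU-R BT.601 luma weights, which keeps outputs in [0, 1) for inputs in [0, 1); the one-channel output shape here is likewise an assumption, not taken from this document:

```python
# Illustration only: one common RGB -> grayscale mapping (ITU-R BT.601
# luma weights). The document does not state which formula
# ML.CONVERT_COLOR_SPACE uses; this only shows the flat-array layout,
# with output values staying in [0, 1).

def rgb_to_grayscale(dims, data):
    h, w, c = dims
    assert c == 3, "expects an RGB image with 3 channels"
    gray = []
    for i in range(0, len(data), 3):
        r, g, b = data[i], data[i + 1], data[i + 2]
        gray.append(0.299 * r + 0.587 * g + 0.114 * b)
    # Assumed one-channel output shape for a grayscale result.
    return ([h, w, 1], gray)

dims, gray = rgb_to_grayscale([1, 2, 3], [0.5, 0.5, 0.5, 0.0, 1.0, 0.0])
# dims is [1, 2, 1]; gray[0] is approximately 0.5 (the weights sum to 1)
```

Because the weights sum to 1, an equal-valued RGB pixel maps to that same value in grayscale, which is why the first pixel above stays near 0.5.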
Example
The following example uses the ML.CONVERT_COLOR_SPACE function within the
ML.PREDICT function to change the color space for input images from RGB to
GRAYSCALE:
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-25 UTC."],[[["\u003cp\u003eThe \u003ccode\u003eML.CONVERT_COLOR_SPACE\u003c/code\u003e function converts images from the \u003ccode\u003eRGB\u003c/code\u003e color space to a specified target color space, like \u003ccode\u003eHSV\u003c/code\u003e, \u003ccode\u003eGRAYSCALE\u003c/code\u003e, \u003ccode\u003eYIQ\u003c/code\u003e, or \u003ccode\u003eYUV\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eThis function accepts an \u003ccode\u003eimage\u003c/code\u003e argument, which is a \u003ccode\u003eSTRUCT\u003c/code\u003e containing the image dimensions and data, and a \u003ccode\u003etarget_color_space\u003c/code\u003e argument specifying the desired color space.\u003c/p\u003e\n"],["\u003cp\u003eThe function outputs a \u003ccode\u003eSTRUCT\u003c/code\u003e value representing the modified image, with dimensions in the first array and the converted image data in the second array, with each value between \u003ccode\u003e[0, 1)\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eML.CONVERT_COLOR_SPACE\u003c/code\u003e function can be utilized within the \u003ccode\u003eML.PREDICT\u003c/code\u003e function or combined with other functions or subqueries, such as shown in the example with \u003ccode\u003eML.DECODE_IMAGE\u003c/code\u003e.\u003c/p\u003e\n"]]],[],null,["# The ML.CONVERT_COLOR_SPACE function\n===================================\n\nThis document describes the `ML.CONVERT_COLOR_SPACE` scalar function, which lets\nyou convert images that have an `RGB` color space to a different color space.\nYou can use 
`ML.CONVERT_COLOR_SPACE` with the [`ML.PREDICT`\nfunction](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-predict) or\nchain it with other functions or subqueries.\n\nSyntax\n------\n\n```sql\nML.CONVERT_COLOR_SPACE(image, target_color_space)\n```\n\n### Arguments\n\n`ML.CONVERT_COLOR_SPACE` takes the following arguments:\n\n- `image`: a `STRUCT` value that represents an `RGB` image in one of the\n following forms:\n\n - `STRUCT\u003cARRAY\u003cINT64\u003e, ARRAY\u003cFLOAT64\u003e\u003e`\n - `STRUCT\u003cARRAY\u003cINT64\u003e, ARRAY\u003cINT64\u003e\u003e`\n\n The first array in the struct must contain the dimensions of the image.\n It must contain three `INT64` values, which represent the image height (H),\n width (W), and number of channels (C).\n\n The second array in the struct must contain the image data. The\n length of the array must be equivalent to H x W x C from the preceding\n array. If the image data is in `FLOAT64`, each value in the array must be\n between `[0, 1)`. If the image data is in `INT64`, each value in the array\n must be between `[0, 255)`.\n\n The struct value must be \\\u003c= 60 MB.\n- `target_color_space`: a `STRING` value that specifies the target color space.\n Valid values are `HSV`, `GRAYSCALE`, `YIQ`, and `YUV`.\n\nOutput\n------\n\n`ML.CONVERT_COLOR_SPACE` returns a `STRUCT` value that represents the\nmodified image in the form `STRUCT\u003cARRAY\u003cINT64\u003e, ARRAY\u003cFLOAT64\u003e\u003e`.\n\nThe first array in the struct represents the dimensions of the image, and\nthe second array in the struct contains the image data, similar\nto the `image` input argument. Each value in the second array is between\n`[0, 1)`.\n| **Note:** If you reference `ML.CONVERT_COLOR_SPACE` in SQL statements in the BigQuery editor, it is possible for the function output to be too large to display. 
If this occurs, write the output to a table instead.\n\nExample\n-------\n\nThe following example uses the `ML.CONVERT_COLOR_SPACE` function within the\n`ML.PREDICT` function to change the color space for input images from `RGB` to\n`GRAYSCALE`: \n\n```sql\nCREATE OR REPLACE TABLE mydataset.model_output\nAS (\n SELECT *\n FROM\n ML.PREDICT(\n MODEL `mydataset.mymodel`,\n SELECT\n ML.CONVERT_COLOR_SPACE(ML.DECODE_IMAGE(data), 'GRAYSCALE')\n AS image,\n uri\n FROM `mydataset.images`)\n);\n```\n\nWhat's next\n-----------\n\n- For information about feature preprocessing, see [Feature preprocessing overview](/bigquery/docs/preprocess-overview).\n- For information about the supported SQL statements and functions for each model type, see [End-to-end user journey for each model](/bigquery/docs/e2e-journey)."]]