PaLM2TextEmbeddingGenerator(
    *,
    model_name: typing.Literal[
        "textembedding-gecko", "textembedding-gecko-multilingual"
    ] = "textembedding-gecko",
    version: typing.Optional[str] = None,
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None
)

PaLM2 text embedding generator LLM model.
Parameters

| Name | Description |
|---|---|
| model_name | str, default "textembedding-gecko". The model to use for text embedding. "textembedding-gecko" returns model embeddings for text inputs; "textembedding-gecko-multilingual" returns model embeddings for text inputs and supports over 100 languages. |
| version | str or None. Model version. Accepted values are "001", "002", "003", "latest", etc. Uses the default version if unset. See https://cloud.google.com/vertex-ai/docs/generative-ai/learn/model-versioning for details. |
| session | bigframes.Session or None. BigQuery session in which to create the model. If None, the global default session is used. |
| connection_name | str or None. Connection to connect with the remote service, a string of the format <PROJECT_NUMBER/PROJECT_ID>. |
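A minimal construction sketch, assuming a configured BigQuery DataFrames session and an existing Cloud resource connection; the connection string below is a hypothetical placeholder, not a value taken from this reference:

```python
from bigframes.ml.llm import PaLM2TextEmbeddingGenerator

# Hypothetical connection name; replace with your own connection identifier.
embedder = PaLM2TextEmbeddingGenerator(
    model_name="textembedding-gecko",
    connection_name="my-project.us.my-connection",  # placeholder (assumed format)
)
```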
Methods

__repr__

__repr__()

Print the estimator's constructor with all non-default parameter values.
get_params

get_params(deep: bool = True) -> typing.Dict[str, typing.Any]

Get parameters for this estimator.

Parameters

| Name | Description |
|---|---|
| deep | bool, default True. |

Returns

| Type | Description |
|---|---|
| Dictionary | A dictionary of parameter names mapped to their values. |
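A short illustrative call, assuming the `embedder` instance from the construction sketch above:

```python
# Inspect the constructor parameters of the model object.
params = embedder.get_params()
print(params)  # dictionary of parameter names mapped to their values
```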
predict

predict(
    X: typing.Union[bigframes.dataframe.DataFrame, bigframes.series.Series]
) -> bigframes.dataframe.DataFrame

Predict the result from the input DataFrame.

Parameters

| Name | Description |
|---|---|
| X | bigframes.dataframe.DataFrame or bigframes.series.Series. Input DataFrame or Series, which must contain a column named "content"; only that column is used as input. Content can include preamble, questions, suggestions, instructions, or examples. |

Returns

| Type | Description |
|---|---|
| bigframes.dataframe.DataFrame | DataFrame of shape (n_samples, n_input_columns + n_prediction_columns). Returns predicted values. |
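A minimal sketch of generating embeddings, assuming the `embedder` from the construction sketch above; the names of the prediction columns are determined by the underlying model and are not specified in this reference:

```python
import bigframes.pandas as bpd

# The input must have a column named "content"; only that column is used.
df = bpd.DataFrame(
    {"content": ["What is BigQuery?", "Explain vector embeddings."]}
)

embeddings = embedder.predict(df)
embeddings.head()
```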
to_gbq

to_gbq(
    model_name: str, replace: bool = False
) -> bigframes.ml.llm.PaLM2TextEmbeddingGenerator

Save the model to BigQuery.

Parameters

| Name | Description |
|---|---|
| model_name | str. The name of the model. |
| replace | bool, default False. Whether to replace the model if it already exists. |

Returns

| Type | Description |
|---|---|
| PaLM2TextEmbeddingGenerator | Saved model. |
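A sketch of persisting the model, assuming the `embedder` from the construction sketch above; the dataset and model names are hypothetical placeholders:

```python
# Save the model to BigQuery, replacing it if a model with that name exists.
saved = embedder.to_gbq("my_dataset.my_embedding_model", replace=True)
```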