
Leveraging GenAI with Google BigQuery in Astrato

Get the most out of BigQuery ML & AI capabilities with Astrato.

Written by Piers Batchelor
Updated this week

Native Integration

Watch this space: a native integration is coming soon.

Using GenAI via SQL

BigQuery now supports calling large language models (LLMs) directly from SQL with the ML.GENERATE_TEXT function, which sends a prompt to a Vertex AI model through a remote model reference.

πŸ”§ Prerequisites

Before using LLM completions (a one-time setup sketch follows this list):

  1. Enable the Vertex AI API in your GCP project.

  2. Ensure your BigQuery dataset is in a supported region (e.g., us-central1).

  3. Assign the required IAM roles to your service account:

    • Vertex AI User

    • BigQuery Data Editor (if writing results to a table)
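
ML.GENERATE_TEXT is called through a remote model that points at a Vertex AI endpoint over a BigQuery Cloud resource connection. Below is a minimal one-time setup sketch: it assumes a Cloud resource connection named us.vertex_conn already exists in your project and that its service account has the Vertex AI User role; the dataset and model names are placeholders.

-- One-time setup (sketch): create a remote model over the text-bison endpoint
-- so ML.GENERATE_TEXT can reference it. The connection, dataset and model
-- names below are placeholders for your own.
CREATE OR REPLACE MODEL `your-project.your_dataset.text_model`
  REMOTE WITH CONNECTION `your-project.us.vertex_conn`
  OPTIONS (ENDPOINT = 'text-bison');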


πŸ§ͺ Basic Example

SELECT ml_generate_text_llm_result AS generated_text
FROM ML.GENERATE_TEXT(
  MODEL `your-project.your_dataset.text_model`,
  (SELECT 'Write a tweet about SQL analytics' AS prompt),
  STRUCT(TRUE AS flatten_json_output)
);

This sends the prompt to the model and returns the completion as a string in the ml_generate_text_llm_result column (aliased here as generated_text).
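
Without flatten_json_output, the raw response is returned as JSON in the ml_generate_text_result column, which also carries metadata such as safety attributes. A minimal sketch of pulling the text out of it; the JSON path assumes the text-bison response shape, so verify it against your own results:

-- Extract the completion text from the raw JSON response.
-- The '$.predictions[0].content' path is an assumption based on the
-- text-bison response format; adjust it if your results differ.
SELECT
  JSON_VALUE(ml_generate_text_result, '$.predictions[0].content') AS generated_text
FROM ML.GENERATE_TEXT(
  MODEL `your-project.your_dataset.text_model`,
  (SELECT 'Write a tweet about SQL analytics' AS prompt)
);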


βš™οΈ Customizing the Response

Use additional parameters for more control (further tuning options are sketched after the parameter list below):

SELECT ml_generate_text_llm_result AS output
FROM ML.GENERATE_TEXT(
  MODEL `your-project.your_dataset.text_model`,
  (SELECT 'Summarise this product review:' AS prompt),
  STRUCT(0.7 AS temperature, 256 AS max_output_tokens, TRUE AS flatten_json_output)
);
  • temperature: controls randomness of the response (0.0 = most deterministic, 1.0 = most creative)

  • max_output_tokens: maximum length of the reply, in tokens
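
Beyond temperature and max_output_tokens, the settings STRUCT also accepts sampling controls such as top_k and top_p for PaLM-based endpoints; treat the exact set of supported options as something to verify for your BigQuery ML version. A sketch of a near-deterministic configuration:

SELECT ml_generate_text_llm_result AS output
FROM ML.GENERATE_TEXT(
  MODEL `your-project.your_dataset.text_model`,
  (SELECT 'Summarise this product review:' AS prompt),
  STRUCT(
    0.0 AS temperature,        -- most deterministic sampling
    128 AS max_output_tokens,  -- cap the reply length
    1 AS top_k,                -- consider only the single most likely token
    TRUE AS flatten_json_output
  )
);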


πŸ“¦ Example with Table Data

SELECT
  review_text,
  ml_generate_text_llm_result AS summary
FROM ML.GENERATE_TEXT(
  MODEL `your-project.your_dataset.text_model`,
  (
    SELECT review_text, review_text AS prompt
    FROM my_dataset.product_reviews
  ),
  STRUCT(0.5 AS temperature, 128 AS max_output_tokens, TRUE AS flatten_json_output)
);

Each row of product_reviews is sent to the model as a prompt, and the completion is returned alongside the original review_text.
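
To persist the summaries (this is where the BigQuery Data Editor role from the prerequisites comes in), write the output of the same query to a table. A minimal sketch; the destination table name is a placeholder:

-- Materialise the generated summaries into a new table.
CREATE OR REPLACE TABLE my_dataset.product_review_summaries AS
SELECT
  review_text,
  ml_generate_text_llm_result AS summary
FROM ML.GENERATE_TEXT(
  MODEL `your-project.your_dataset.text_model`,
  (
    SELECT review_text, review_text AS prompt
    FROM my_dataset.product_reviews
  ),
  STRUCT(0.5 AS temperature, 128 AS max_output_tokens, TRUE AS flatten_json_output)
);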


πŸ“˜ Model Options

You can use either of the following Vertex AI models (the choice is made via the remote model's ENDPOINT option; see the sketch after this list):

  • text-bison@001 – best for general prompts

  • chat-bison@001 – optimized for chat-style interactions
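
The model you get is determined by the ENDPOINT option of the remote model, not by the query itself. Below is a sketch of pointing a second remote model at the chat endpoint; whether chat-bison is accepted as a remote-model endpoint depends on your region and BigQuery ML version, so treat the endpoint name as an assumption to verify.

-- Hypothetical: a second remote model over a chat-style endpoint.
-- The endpoint name is an assumption; check the endpoints supported for
-- remote models in your project before relying on it.
CREATE OR REPLACE MODEL `your-project.your_dataset.chat_model`
  REMOTE WITH CONNECTION `your-project.us.vertex_conn`
  OPTIONS (ENDPOINT = 'chat-bison');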
