
BigQuery is Google’s fully managed, highly scalable data warehouse and analytics platform, capable of storing and analyzing large volumes of data. You can try out this process on your own device by using the big-query-sink demo in the integration_test directory of the RisingWave repository.
PREMIUM FEATURE: This is a premium feature. For a comprehensive overview of all premium features and their usage, please see RisingWave premium features.

Prerequisites

Before sinking data from RisingWave to BigQuery, ensure the following:
  • The BigQuery table you want to sink to is accessible from RisingWave.
  • You have an upstream materialized view or table in RisingWave to sink data from.
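The examples below sink from a materialized view named mv1. A minimal upstream setup might look like the following sketch; the table name, columns, and types are illustrative assumptions, not part of the connector:

```sql
-- Illustrative upstream objects; schema is an assumption for the examples below.
CREATE TABLE user_behaviors (
    user_id int,
    target_id varchar,
    event_time timestamp
);

CREATE MATERIALIZED VIEW mv1 AS
SELECT user_id, target_id, event_time
FROM user_behaviors;
```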

Syntax

CREATE SINK [ IF NOT EXISTS ] sink_name
[FROM sink_from | AS select_query]
WITH (
   connector='bigquery',
   connector_parameter = 'value', ...
);

Parameters

| Parameter | Description |
| :-------- | :---------- |
| sink_name | Name of the sink to be created. |
| sink_from | A clause that specifies the direct source from which data will be output. `sink_from` can be a materialized view or a table. Either this clause or `select_query` must be specified. |
| AS select_query | A SELECT query that specifies the data to be output to the sink. Either this query or a `sink_from` clause must be specified. See SELECT for the syntax and examples of the SELECT command. |
| type | Required. Data format. Allowed formats: <br/>• `append-only`: Output data with insert operations. <br/>• `upsert`: For this type, you need to set corresponding permissions and primary keys based on the BigQuery documentation. |
| force_append_only | Optional. If `true`, forces the sink to be append-only, even if it cannot be. |
| bigquery.local.path | Optional. The file path of the JSON key file on your local machine. Details can be found in Service Accounts under your Google Cloud account. At least one of `bigquery.credentials`, `bigquery.local.path`, or `bigquery.s3.path` must be specified. |
| bigquery.s3.path | Optional. The file path of the JSON key file in S3. Details can be found in Service Accounts under your Google Cloud account. At least one of `bigquery.credentials`, `bigquery.local.path`, or `bigquery.s3.path` must be specified. If this parameter is used, `aws.credentials.access_key_id`, `aws.credentials.secret_access_key`, and `region` are also required. |
| bigquery.project | Required. The BigQuery project ID. |
| bigquery.dataset | Required. The BigQuery dataset ID. |
| bigquery.table | Required. The BigQuery table you want to sink to. |
| bigquery.credentials | Optional. The JSON key credentials as a string. Both plain JSON and Base64-encoded JSON are accepted; Base64 is recommended for inline use to avoid escaping issues. At least one of `bigquery.credentials`, `bigquery.local.path`, or `bigquery.s3.path` must be specified. |
| bigquery.retry_times | Optional. The number of times the system retries a BigQuery insert operation before returning an error. Defaults to 5. |
| auto_create | Optional. Defaults to `false`. If `true`, a new table is automatically created in BigQuery when the specified table is not found. |
| aws.credentials.access_key_id | Optional. The access key ID for accessing the S3 file storing the JSON key. Required when `bigquery.s3.path` is specified. |
| aws.credentials.secret_access_key | Optional. The secret access key for accessing the S3 file storing the JSON key. Required when `bigquery.s3.path` is specified. |
| region | Optional. The AWS region of the S3 file storing the JSON key. Required when `bigquery.s3.path` is specified. |
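For `type = 'upsert'`, key columns must be declared on the sink. The following is a hedged sketch only: the upstream view `user_mv`, its key column `user_id`, and the `primary_key` option are assumptions (RisingWave sinks commonly declare key columns this way); verify the exact option name and the required BigQuery-side permissions for your RisingWave version.

```sql
-- Sketch of an upsert sink; names and the primary_key option are assumptions.
CREATE SINK big_query_sink_upsert
FROM user_mv
WITH (
    connector = 'bigquery',
    type = 'upsert',
    primary_key = 'user_id',
    bigquery.credentials = '${base64_encoded_service_account_json}',
    bigquery.project = '${project_id}',
    bigquery.dataset = '${dataset_id}',
    bigquery.table = '${table_id}'
);
```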

Examples

RisingWave supports three ways to provide the Google Cloud service account JSON key for authenticating with BigQuery.

Option 1: Inline credentials string
Example using inline credentials
CREATE SINK big_query_sink_credentials
FROM mv1
WITH (
    connector = 'bigquery',
    type = 'append-only',
    bigquery.credentials = '${base64_encoded_service_account_json}',
    bigquery.project = '${project_id}',
    bigquery.dataset = '${dataset_id}',
    bigquery.table = '${table_id}',
    force_append_only = 'true'
);
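Base64-encoding the key avoids quote-escaping issues in the SQL string literal. A minimal sketch of producing such a value, assuming a service account key downloaded from the Google Cloud console (the JSON contents below are fabricated placeholders):

```python
import base64
import json

# Fabricated placeholder for a service-account key; a real key comes from the
# Google Cloud console (Service Accounts -> Keys -> Add key -> JSON).
service_account_json = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key_id": "abc123",
    "client_email": "sink@my-project.iam.gserviceaccount.com",
})

# Base64-encode the JSON so it can be pasted into bigquery.credentials
# as a single quote-safe string.
encoded = base64.b64encode(service_account_json.encode("utf-8")).decode("ascii")

print(f"bigquery.credentials = '{encoded}'")
```

The Base64 alphabet contains no quote characters, so the encoded value can be embedded directly in the `WITH` clause.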
Option 2: Local JSON key file
Example using a local JSON key file
CREATE SINK big_query_sink_local
FROM mv1
WITH (
    connector = 'bigquery',
    type = 'append-only',
    bigquery.local.path = '${bigquery_service_account_json_path}',
    bigquery.project = '${project_id}',
    bigquery.dataset = '${dataset_id}',
    bigquery.table = '${table_id}',
    force_append_only = 'true'
);
Option 3: JSON key file stored in S3

When the JSON key file is stored in S3, you must also provide the AWS credentials and region to access it.
Example using a JSON key file stored in S3
CREATE SINK big_query_sink_s3
FROM mv1
WITH (
    connector = 'bigquery',
    type = 'append-only',
    bigquery.s3.path = '${s3_service_account_json_path}',
    bigquery.project = '${project_id}',
    bigquery.dataset = '${dataset_id}',
    bigquery.table = '${table_id}',
    aws.credentials.access_key_id = '${aws_access_key}',
    aws.credentials.secret_access_key = '${aws_secret_access}',
    region = '${aws_region}',
    force_append_only = 'true'
);

Data type mapping

| RisingWave Data Type | BigQuery Data Type |
| :------------------- | :----------------- |
| boolean | bool |
| smallint | int64 |
| integer | int64 |
| bigint | int64 |
| real | unsupported |
| double precision | float64 |
| numeric | numeric |
| date | date |
| character varying (varchar) | string |
| time without time zone | time |
| timestamp without time zone | datetime |
| timestamp with time zone | timestamp |
| interval | interval |
| struct | struct |
| array | array |
| bytea | bytes |
| JSONB | JSON |
| serial | int64 |
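Since `real` has no BigQuery mapping, one workaround is to cast such columns to `double precision` (which maps to `float64`) in an `AS select_query` sink, so only supported types reach BigQuery. A hedged sketch, where the upstream view and column names are illustrative assumptions:

```sql
-- Cast an unsupported real column to double precision (-> float64) at sink time.
-- mv_with_real_column and its columns are illustrative, not part of the connector.
CREATE SINK big_query_sink_cast AS
SELECT
    id,
    score::double precision AS score
FROM mv_with_real_column
WITH (
    connector = 'bigquery',
    type = 'append-only',
    bigquery.credentials = '${base64_encoded_service_account_json}',
    bigquery.project = '${project_id}',
    bigquery.dataset = '${dataset_id}',
    bigquery.table = '${table_id}',
    force_append_only = 'true'
);
```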