Extract data from a BigQuery table to a GCS bucket.
type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
id: gcp_bq_extract_to_gcs
namespace: company.team

tasks:
  - id: extract_to_gcs
    type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
    destinationUris:
      - "gs://bucket_name/filename.csv"
    sourceTable: "my_project.my_dataset.my_table"
    format: CSV
    fieldDelimiter: ';'
    printHeader: true
Properties

compression (Dynamic: YES): The compression to use for exported files. If not set, exported files are not compressed.

destinationUris (Dynamic: YES): The list of fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) where the extracted table should be written.

fieldDelimiter (Dynamic: YES): The delimiter to use between fields in the exported data. Defaults to ",".

format (Dynamic: YES): The exported file format. If not set, the table is exported in CSV format.

impersonatedServiceAccount (Dynamic: YES): The GCP service account to impersonate.

jobTimeout (Dynamic: NO): Optional job timeout in milliseconds. If this limit is exceeded, BigQuery may attempt to terminate the job.

labels (Dynamic: YES): The labels associated with this job. You can use these to organize and group your jobs. Label keys and values can be no longer than 63 characters and can contain only lowercase letters, numeric characters, underscores, and dashes; international characters are allowed. Label values are optional. Label keys must start with a letter, and each label in the list must have a different key.
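As a sketch of the constraints above, labels can be expressed as a simple key/value map on the task (the keys and values here are illustrative, not part of the original example):

```yaml
- id: extract_to_gcs
  type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
  sourceTable: "my_project.my_dataset.my_table"
  destinationUris:
    - "gs://bucket_name/filename.csv"
  labels:
    team: data-eng   # lowercase letters, numerics, underscores, dashes only
    env: prod        # keys must start with a letter and be unique
```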
location (Dynamic: YES): The geographic location where the dataset should reside. This property is experimental and may change or be removed. See Dataset Location.

printHeader (Dynamic: NO): Whether to print a header row in the results. By default a header is printed.

projectId (Dynamic: YES): The GCP project ID.

retryAuto (Dynamic: NO): Automatic retry for retryable BigQuery exceptions. Some exceptions (especially rate limits) are not retried by the BigQuery client by default, so a transparent retry (distinct from Kestra's own retry) handles this case. The default is an exponential backoff starting at 5 seconds, up to a maximum of 15 minutes and ten attempts.

retryMessages (Dynamic: YES; default ["due to concurrent update", "Retrying the job may solve the problem", "Retrying may solve the problem"]): The messages that trigger an automatic retry. Each message is tested as a case-insensitive substring of the full error message.

retryReasons (Dynamic: YES; default ["rateLimitExceeded", "jobBackendError", "backendError", "internalError", "jobInternalError"]): The reasons that trigger an automatic retry.
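A minimal sketch of overriding the retry lists described above, assuming both properties accept a plain list of strings as their defaults suggest:

```yaml
- id: extract_to_gcs
  type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
  sourceTable: "my_project.my_dataset.my_table"
  destinationUris:
    - "gs://bucket_name/filename.csv"
  retryMessages:              # matched as case-insensitive substrings
    - "due to concurrent update"
  retryReasons:               # matched against the BigQuery error reason
    - "rateLimitExceeded"
    - "backendError"
```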
scopes (Dynamic: YES; default ["https://www.googleapis.com/auth/cloud-platform"]): The GCP scopes to be used.

serviceAccount (Dynamic: YES): The GCP service account.

sourceTable (Dynamic: YES): The table to export.

useAvroLogicalTypes (Dynamic: NO): Optional; applies only when format is set to "AVRO". If set, this flag indicates whether to extract applicable column types (such as TIMESTAMP) to their corresponding AVRO logical types (timestamp-micros) instead of only their raw types (avro-long).
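For an AVRO export with logical types, a hedged sketch (assuming the flag is exposed as `useAvroLogicalTypes`, per the description above; bucket and table names are illustrative):

```yaml
- id: extract_avro
  type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
  sourceTable: "my_project.my_dataset.my_table"
  destinationUris:
    - "gs://bucket_name/export-*.avro"
  format: AVRO
  useAvroLogicalTypes: true   # TIMESTAMP -> timestamp-micros instead of raw long
```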
Outputs

The destination URI file.
The number of extracted files.
The job ID.
The source table.