Node Description
The Google BigQuery Writer node enables you to execute DML (Data Manipulation Language) operations on BigQuery tables. Unlike the standard BigQuery node, which is designed for reading data with SELECT queries, this node is designed specifically for write operations and has no row limits.
Node Inputs
Required Fields
- Google Cloud Project
  The ID of your Google Cloud project where BigQuery is enabled.
  Example: "my-cloud-project"

- Query Type
  The type of DML operation to perform.
  Available options:
  - INSERT: Add new rows to a table
  - UPDATE: Modify existing rows in a table
  - MERGE: Combine INSERT and UPDATE operations based on conditions
  - DELETE: Remove rows from a table
  Example: "insert"

- BigQuery SQL Query
  A valid DML SQL query to execute on the BigQuery dataset. Examples of each query type are shown in the Example Usage section below.
Using Dynamic Parameters
You can use dynamic parameters in your SQL queries with the {{parameter_name}} syntax. This allows you to pass values from previous nodes in your workflow.
Example with dynamic parameters:
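The following is an illustrative sketch; the table name and the {{user_id}} and {{new_status}} parameters are hypothetical placeholders for values supplied by earlier nodes in your workflow:

```sql
-- {{user_id}} and {{new_status}} are hypothetical dynamic parameters
-- resolved from previous workflow nodes before the query runs
UPDATE `my-cloud-project.app_data.users`
SET status = '{{new_status}}'
WHERE user_id = '{{user_id}}';
```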
Node Output
The Google BigQuery Writer node provides the following output:

Execution Result
A JSON object containing execution information about the DML operation.

Example Usage
1. Insert New Records
Google Cloud Project: "my-cloud-project"
Query Type: INSERT
BigQuery SQL Query:
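The original example query did not survive formatting; the following is a minimal sketch, assuming a hypothetical `my_dataset.customers` table and columns:

```sql
-- Hypothetical table and columns, for illustration only
INSERT INTO `my-cloud-project.my_dataset.customers` (customer_id, name, email, created_at)
VALUES ('C-1001', 'Jane Doe', 'jane@example.com', CURRENT_TIMESTAMP());
```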
2. Update Existing Records
Google Cloud Project: "ecommerce-project"
Query Type: UPDATE
BigQuery SQL Query:
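A minimal sketch for this scenario, assuming a hypothetical `sales.orders` table and an {{order_id}} parameter from a previous node:

```sql
-- Hypothetical table, columns, and {{order_id}} parameter, for illustration only
UPDATE `ecommerce-project.sales.orders`
SET status = 'shipped', shipped_at = CURRENT_TIMESTAMP()
WHERE order_id = '{{order_id}}';
```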
3. Delete Records
Google Cloud Project: "data-cleanup-project"
Query Type: DELETE
BigQuery SQL Query:
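A minimal sketch for this scenario, assuming a hypothetical `logs.events` table with an `event_date` column; here, rows older than 90 days are removed:

```sql
-- Hypothetical table and retention window, for illustration only
DELETE FROM `data-cleanup-project.logs.events`
WHERE event_date < DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY);
```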
4. Merge Data from Source to Target
Google Cloud Project: "sync-project"
Query Type: MERGE
BigQuery SQL Query:
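A minimal sketch for this scenario, assuming hypothetical `staging.products` and `prod.products` tables: matching rows in the target are updated from the staging table, and new rows are inserted.

```sql
-- Hypothetical staging and production tables, for illustration only
MERGE `sync-project.prod.products` AS target
USING `sync-project.staging.products` AS source
ON target.product_id = source.product_id
WHEN MATCHED THEN
  UPDATE SET price = source.price, updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (product_id, price, updated_at)
  VALUES (source.product_id, source.price, source.updated_at);
```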
Node Functionality
The Google BigQuery Writer node is designed for data manipulation tasks in BigQuery. Use it for:
- Inserting new data into BigQuery tables from workflow outputs
- Updating existing records based on conditions
- Merging data from staging tables to production tables
- Deleting outdated or unwanted records
- Building ETL pipelines that write processed data back to BigQuery
Tool Activation
You must activate the tool via the Integrations menu. You must also authenticate with Google OAuth, and your administrator must assign the correct IAM roles for accessing BigQuery in your GCP account.

Required IAM roles for write operations: the BigQuery Data Viewer role is not sufficient. You need at least BigQuery Data Editor permissions to execute INSERT, UPDATE, MERGE, or DELETE queries.
Important Notes
- This node only accepts DML queries (INSERT, UPDATE, MERGE, DELETE). SELECT queries should use the standard Google BigQuery node.
- There are no row limits on write operations, but be mindful of BigQuery quotas and costs.
- Always test your queries on a small dataset or staging environment before running on production data.
- Use parameterized queries with the {{variable}} syntax to prevent SQL injection when working with dynamic data.

