
Node Description

The Google BigQuery Writer node enables you to execute DML (Data Manipulation Language) operations on BigQuery tables. Unlike the standard Google BigQuery node, which reads data with SELECT queries, this node is built specifically for write operations and has no row limits.

Node Inputs

Required Fields

  1. Google Cloud Project: The ID of your Google Cloud project where BigQuery is enabled. Example: "my-cloud-project"
  2. Query Type: The type of DML operation to perform. Available options:
    • INSERT: Add new rows to a table
    • UPDATE: Modify existing rows in a table
    • MERGE: Combine INSERT and UPDATE operations based on conditions
    • DELETE: Remove rows from a table
    Example: "insert"
  3. BigQuery SQL Query: A valid DML SQL query to execute on the BigQuery dataset. INSERT Example:
    INSERT INTO `project.dataset.table` (column1, column2)
    VALUES ("value1", "value2")
    
    UPDATE Example:
    UPDATE `project.dataset.table`
    SET column1 = "new_value"
    WHERE column2 = "condition"
    
    MERGE Example:
    MERGE `project.dataset.target_table` T
    USING `project.dataset.source_table` S
    ON T.id = S.id
    WHEN MATCHED THEN
      UPDATE SET T.value = S.value
    WHEN NOT MATCHED THEN
      INSERT (id, value) VALUES (S.id, S.value)
    
    DELETE Example:
    DELETE FROM `project.dataset.table`
    WHERE column1 = "value_to_delete"
    

Using Dynamic Parameters

You can use dynamic parameters in your SQL queries with the {{parameter_name}} syntax, which lets you pass values from previous nodes in your workflow. Example with dynamic parameters:
INSERT INTO `project.dataset.users` (name, email, status)
VALUES ("{{user_name}}", "{{user_email}}", "{{user_status}}")

Node Output

The Google BigQuery Writer node provides the following output:
Execution Result: A JSON object containing execution information about the DML operation. Example Output:
{
  "success": true,
  "job_id": "job_abc123xyz",
  "affected_rows": 5,
  "query_type": "insert",
  "execution_time_ms": 1250
}

Example Usage

1. Insert New Records

Google Cloud Project: "my-cloud-project" Query Type: INSERT BigQuery SQL Query:
INSERT INTO `my-cloud-project.analytics.page_views` (page_url, visitor_id, timestamp)
VALUES ("{{page_url}}", "{{visitor_id}}", CURRENT_TIMESTAMP())

2. Update Existing Records

Google Cloud Project: "ecommerce-project" Query Type: UPDATE BigQuery SQL Query:
UPDATE `ecommerce-project.orders.order_status`
SET status = "shipped", updated_at = CURRENT_TIMESTAMP()
WHERE order_id = "{{order_id}}"

3. Delete Records

Google Cloud Project: "data-cleanup-project" Query Type: DELETE BigQuery SQL Query:
DELETE FROM `data-cleanup-project.logs.old_entries`
WHERE created_at < TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)

4. Merge Data from Source to Target

Google Cloud Project: "sync-project" Query Type: MERGE BigQuery SQL Query:
MERGE `sync-project.main.customers` T
USING `sync-project.staging.new_customers` S
ON T.customer_id = S.customer_id
WHEN MATCHED THEN
  UPDATE SET T.email = S.email, T.name = S.name
WHEN NOT MATCHED THEN
  INSERT (customer_id, email, name) VALUES (S.customer_id, S.email, S.name)

Node Functionality

The Google BigQuery Writer node is designed for data manipulation tasks in BigQuery. Use it for:
  • Inserting new data into BigQuery tables from workflow outputs
  • Updating existing records based on conditions
  • Merging data from staging tables to production tables
  • Deleting outdated or unwanted records
  • Building ETL pipelines that write processed data back to BigQuery (see the sketch below)
This node complements the standard Google BigQuery node by providing write capabilities, enabling complete data workflows within DNG Builder.
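
For example, the staging-to-production and ETL write-back patterns above can often be expressed as a single INSERT ... SELECT statement executed by this node. A minimal sketch, using hypothetical project, dataset, and table names:
INSERT INTO `my-cloud-project.analytics.daily_summary` (event_date, event_count)
SELECT DATE(event_timestamp) AS event_date, COUNT(*) AS event_count
FROM `my-cloud-project.staging.raw_events`
WHERE DATE(event_timestamp) = CURRENT_DATE()
GROUP BY event_date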

Tool Activation

You must activate the tool via the Integrations menu, authenticate with Google OAuth, and have your administrator assign the correct IAM roles for BigQuery access in your GCP account. Required IAM roles for write operations:
  • BigQuery Data Editor (roles/bigquery.dataEditor), or
  • BigQuery Admin (roles/bigquery.admin)
Note: The BigQuery Data Viewer role is not sufficient for write operations. You need at least BigQuery Data Editor permissions to execute INSERT, UPDATE, MERGE, or DELETE queries.

Important Notes

  • This node only accepts DML queries (INSERT, UPDATE, MERGE, DELETE). For SELECT queries, use the standard Google BigQuery node.
  • There are no row limits on write operations, but be mindful of BigQuery quotas and costs.
  • Always test your queries on a small dataset or staging environment before running on production data (see the preview sketch below).
  • Use parameterized queries with {{variable}} syntax to prevent SQL injection when working with dynamic data.
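
As a concrete way to follow the testing advice above, you can preview how many rows a destructive statement would touch before running it: issue the same WHERE clause as a SELECT in the standard Google BigQuery node, then execute the DML with this node. A sketch based on the log-cleanup example from earlier (table names are illustrative):
-- Preview in the standard Google BigQuery node (which handles SELECT queries)
SELECT COUNT(*) AS rows_to_delete
FROM `data-cleanup-project.logs.old_entries`
WHERE created_at < TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)

-- Then run the DELETE with the Google BigQuery Writer node
DELETE FROM `data-cleanup-project.logs.old_entries`
WHERE created_at < TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)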