BigQuery is Google Cloud’s fully managed, serverless data warehouse that enables super-fast SQL queries. Cargo’s native integration with BigQuery allows you to connect your data warehouse, create data models, and execute SQL operations directly in your workflows.

How to set up BigQuery

Prerequisites

Before connecting BigQuery to Cargo, ensure you have:
  • An active Google Cloud Project with billing enabled
  • BigQuery API enabled
  • A service account with appropriate permissions

Creating a Service Account

To connect Cargo to BigQuery, you need a service account with the necessary permissions:
  1. Go to the Google Cloud Console
  2. Navigate to IAM & Admin > Service Accounts
  3. Click Create service account
  4. Give the service account a name (e.g., cargo-bigquery)
  5. Grant the following roles:
    • BigQuery Data Editor – Read and write data in BigQuery tables
    • BigQuery Job User – Run queries and jobs
  6. Click Done
  7. Click on the newly created service account
  8. Go to the Keys tab
  9. Click Add Key > Create new key > JSON
  10. Save the downloaded JSON file securely—you’ll need its contents for the connection
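
Before pasting the key into Cargo, you can optionally sanity-check it locally. The sketch below is a minimal example using the google-cloud-bigquery Python client; the key file name cargo-bigquery-key.json is a placeholder for whatever you saved in step 10.

```python
# Minimal sanity check for the downloaded service account key.
# Assumes: `pip install google-cloud-bigquery` and a key file named
# cargo-bigquery-key.json (placeholder name) in the working directory.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "cargo-bigquery-key.json"
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# A trivial query exercises the BigQuery Job User role (bigquery.jobs.create).
result = client.query("SELECT 1 AS ok").result()
print(list(result))  # expect one row with ok = 1
```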

Connection details

To set up the connection, provide the following details when creating the connector:
  • Service Account – Paste the full JSON content of your service account key file

BigQuery actions

Once connected, you can use BigQuery in your workflows with the following actions:

Insert

Insert new records into a BigQuery table.

Configuration
  • Dataset – The BigQuery dataset containing the target table
  • Table – The table to insert data into
  • Mappings – Map columns to values using expressions
Use cases
  • Lead capture – Insert new leads from form submissions or enrichment workflows
  • Event logging – Record workflow events and outcomes
  • Data aggregation – Store computed results for reporting
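
For reference, an Insert action is roughly equivalent to appending rows with the BigQuery client library. The sketch below is illustrative only (Cargo performs the operation for you); the project, dataset, table, and column names are placeholders.

```python
# Roughly what an Insert action does: append rows to an existing table.
# `my-project.my_dataset.leads` is a placeholder table reference.
from google.cloud import bigquery

client = bigquery.Client()

rows = [
    {"email": "ada@example.com", "source": "form_submission"},
]
errors = client.insert_rows_json("my-project.my_dataset.leads", rows)
if errors:
    raise RuntimeError(f"Insert failed: {errors}")
```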

Update

Update existing records in a BigQuery table based on a matching column.

Configuration
  • Dataset – The BigQuery dataset containing the target table
  • Table – The table to update
  • Matching Column – The column to match records against
  • Matching Value – The value to match (supports expressions)
  • Mappings – Map columns to new values using expressions
Use cases
  • Data enrichment – Update records with enriched data from external sources
  • Status updates – Mark records as processed or update stages
  • Sync external changes – Keep BigQuery in sync with CRM or other systems
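
Conceptually, an Update action corresponds to a parameterized UPDATE statement keyed on the matching column. The sketch below is illustrative only; the table, column names, and values are placeholders.

```python
# A parameterized UPDATE keyed on a matching column (placeholder names).
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("match_value", "STRING", "lead_123"),
        bigquery.ScalarQueryParameter("new_stage", "STRING", "processed"),
    ]
)
client.query(
    """
    UPDATE `my-project.my_dataset.leads`
    SET stage = @new_stage
    WHERE lead_id = @match_value
    """,
    job_config=job_config,
).result()
```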

Upsert

Create new records or update existing ones based on a matching column.

Configuration
  • Dataset – The BigQuery dataset containing the target table
  • Table – The table to upsert into
  • Matching Column – The column to match records against
  • Matching Value – The value to match (supports expressions)
  • Mappings – Map columns to values using expressions
Use cases
  • Data sync – Keep your warehouse updated regardless of whether records exist
  • Idempotent operations – Safely retry operations without creating duplicates
  • Master data management – Maintain a single source of truth
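
Conceptually, an upsert maps to a BigQuery MERGE statement: update the row when the matching column matches, insert it otherwise. The sketch below is illustrative only, with placeholder table and column names.

```python
# An upsert expressed as a BigQuery MERGE (placeholder names).
from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE `my-project.my_dataset.leads` AS target
USING (SELECT @match_value AS lead_id, @email AS email) AS source
ON target.lead_id = source.lead_id
WHEN MATCHED THEN
  UPDATE SET email = source.email
WHEN NOT MATCHED THEN
  INSERT (lead_id, email) VALUES (source.lead_id, source.email)
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("match_value", "STRING", "lead_123"),
        bigquery.ScalarQueryParameter("email", "STRING", "ada@example.com"),
    ]
)
client.query(merge_sql, job_config=job_config).result()
```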

Delete

Delete records from a BigQuery table based on a matching column.

Configuration
  • Dataset – The BigQuery dataset containing the target table
  • Table – The table to delete from
  • Matching Column – The column to match records against
  • Matching Value – The value to match (supports expressions)
Use cases
  • Data cleanup – Remove outdated or invalid records
  • GDPR compliance – Delete personal data on request
  • Workflow automation – Remove processed records from staging tables
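
Conceptually, a Delete action corresponds to a DELETE statement filtered on the matching column. The sketch below is illustrative only, with placeholder names.

```python
# A parameterized DELETE keyed on a matching column (placeholder names).
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("match_value", "STRING", "lead_123")
    ]
)
client.query(
    "DELETE FROM `my-project.my_dataset.staging_leads` WHERE lead_id = @match_value",
    job_config=job_config,
).result()
```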

BigQuery data models

Cargo allows you to create data models on top of your BigQuery data that can be used to trigger Plays and power workflows.

Creating BigQuery data models

To create a BigQuery data model:
  1. Navigate to Data Models in Cargo
  2. Click Create data model
  3. Select BigQuery as the source
  4. Configure the following fields:
    • Name – Choose a descriptive name for your model
    • Slug – Set a unique identifier that cannot be changed once created
    • Dataset – Select the BigQuery dataset containing your data
    • Table – Select the table or view to model
    • ID Column – The column containing unique record identifiers
    • Title Column – The column to display as the record title
    • Cursor Column – (Optional) Column for incremental syncing (date or number)
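
The cursor column is what enables incremental syncing: each sync only needs rows whose cursor value is greater than the last value seen. As an illustration only (Cargo manages the cursor for you), an incremental pull might look like the sketch below, with placeholder table and column names.

```python
# Illustration of cursor-based incremental reads: fetch only rows whose
# cursor column advanced past the last synced value (placeholder names).
import datetime

from google.cloud import bigquery

client = bigquery.Client()

# Persisted from the previous sync run.
last_cursor = datetime.datetime(2024, 1, 1, tzinfo=datetime.timezone.utc)

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("cursor", "TIMESTAMP", last_cursor)
    ]
)
rows = client.query(
    """
    SELECT id, title, updated_at
    FROM `my-project.my_dataset.accounts`
    WHERE updated_at > @cursor
    ORDER BY updated_at
    """,
    job_config=job_config,
).result()
```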

Using BigQuery data models

Once created, your BigQuery data model can be used to:
  • Trigger Plays – Start automated workflows when data changes
  • Power enrichment – Use BigQuery data to enrich records in workflows
  • Create segments – Filter and target specific records from your data

Required permissions

Ensure your service account has the following IAM roles:
  • BigQuery Data Editor – Read and write data in BigQuery tables
  • BigQuery Job User – Run queries and jobs
  • BigQuery Data Viewer – (Optional) Read access to additional datasets outside your main dataset

Required permissions breakdown

  • bigquery.tables.create – Create new tables
  • bigquery.tables.getData – Read data from tables
  • bigquery.tables.updateData – Write data to tables
  • bigquery.jobs.create – Execute queries and jobs
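
If you want to verify these permissions before connecting, a dry-run query is a cheap check: it validates table access and the ability to create jobs without actually running the query. This is an optional sketch using the Python client, with placeholder key-file and table names.

```python
# Optional check: a dry-run query validates permissions (bigquery.jobs.create
# plus read access to the table) without billing a full query run.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "cargo-bigquery-key.json"  # placeholder key file name
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT * FROM `my-project.my_dataset.leads` LIMIT 10",  # placeholder table
    job_config=job_config,
)
print(f"Dry run OK, would process {job.total_bytes_processed} bytes")
```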

Network configuration

If you restrict access to your BigQuery instance, add these Cargo IP addresses to your VPC firewall rules:
  • 3.251.34.134
  • 54.220.135.99
  • 79.125.105.52

Security

  • All BigQuery connections use Google’s secure authentication
  • Service account keys are encrypted at rest
  • Data in transit is encrypted using TLS
  • Cargo never overwrites existing tables—it always creates its own