BigQuery is Google Cloud’s fully managed, serverless data warehouse that enables super-fast SQL queries. Cargo’s native integration with BigQuery allows you to connect your data warehouse, create data models, and execute SQL operations directly in your workflows.
How to set up BigQuery
Prerequisites
Before connecting BigQuery to Cargo, ensure you have:
An active Google Cloud Project with billing enabled
BigQuery API enabled
A service account with appropriate permissions
Creating a Service Account
To connect Cargo to BigQuery, you need a service account with the necessary permissions:
Go to the Google Cloud Console
Navigate to IAM & Admin → Service Accounts
Click Create service account
Give the service account a name (e.g., cargo-bigquery)
Grant the following roles:
BigQuery Data Editor – Read and write data in BigQuery tables
BigQuery Job User – Run queries and jobs
Click Done
Click on the newly created service account
Go to the Keys tab
Click Add Key → Create new key → JSON
Save the downloaded JSON file securely—you’ll need its contents for the connection
Create the service account:
gcloud iam service-accounts create cargo-bigquery \
  --display-name="Cargo BigQuery Service Account"
Grant BigQuery permissions:
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:cargo-bigquery@your-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:cargo-bigquery@your-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"
Create and download a key file:
gcloud iam service-accounts keys create cargo-bigquery-key.json \
  --iam-account=cargo-bigquery@your-project-id.iam.gserviceaccount.com
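Before pasting the key into Cargo, you can sanity-check that the downloaded JSON contains the fields every Google service-account key file carries. A minimal sketch (the field list is the standard key format; the file name matches the gcloud step above):

```python
import json

# Fields present in every Google service-account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def check_key(key: dict) -> list[str]:
    """Return the missing fields; an empty list means the key looks complete."""
    missing = sorted(REQUIRED_FIELDS - key.keys())
    if "type" in key and key["type"] != "service_account":
        missing.append("type=service_account")
    return missing

# Usage: check_key(json.load(open("cargo-bigquery-key.json")))
```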
Connection details
To set up the connection, provide the following details when creating the connector:
Service Account – Paste the full JSON content of your service account key file
BigQuery actions
Once connected, you can use BigQuery in your workflows with the following actions:
Insert
Insert new records into a BigQuery table.
Configuration
Dataset – The BigQuery dataset containing the target table
Table – The table to insert data into
Mappings – Map columns to values using expressions
Use cases
Lead capture – Insert new leads from form submissions or enrichment workflows
Event logging – Record workflow events and outcomes
Data aggregation – Store computed results for reporting
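Conceptually, the Insert action issues a DML INSERT against the chosen dataset and table, with the mappings supplying the column values. A hedged Python sketch of the statement such an action might build (the helper and the named-parameter style are illustrative, not Cargo's actual implementation):

```python
def build_insert(dataset: str, table: str, mappings: dict) -> str:
    """Build a parameterized BigQuery INSERT statement.

    `mappings` maps column names to named query parameters, which keeps
    workflow values out of the SQL string itself.
    """
    cols = ", ".join(mappings)
    params = ", ".join(f"@{p}" for p in mappings.values())
    return f"INSERT INTO `{dataset}.{table}` ({cols}) VALUES ({params})"

# Example: a lead-capture insert
sql = build_insert("crm", "leads", {"email": "email", "source": "source"})
# INSERT INTO `crm.leads` (email, source) VALUES (@email, @source)
```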
Update
Update existing records in a BigQuery table based on a matching column.
Configuration
Dataset – The BigQuery dataset containing the target table
Table – The table to update
Matching Column – The column to match records against
Matching Value – The value to match (supports expressions)
Mappings – Map columns to new values using expressions
Use cases
Data enrichment – Update records with enriched data from external sources
Status updates – Mark records as processed or update stages
Sync external changes – Keep BigQuery in sync with CRM or other systems
Upsert
Create new records or update existing ones based on a matching column.
Configuration
Dataset – The BigQuery dataset containing the target table
Table – The table to upsert into
Matching Column – The column to match records against
Matching Value – The value to match (supports expressions)
Mappings – Map columns to values using expressions
Use cases
Data sync – Keep your warehouse updated regardless of whether records exist
Idempotent operations – Safely retry operations without creating duplicates
Master data management – Maintain a single source of truth
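In BigQuery, an upsert keyed on a matching column is naturally expressed as a MERGE statement: update the row when the key matches, insert it when it doesn't. A hedged sketch of what such a statement could look like (this is the standard MERGE pattern, not necessarily the SQL Cargo generates):

```python
def build_upsert(dataset: str, table: str, match_col: str, mappings: dict) -> str:
    """Build a BigQuery MERGE that updates a matching row or inserts a new one."""
    updates = ", ".join(f"{c} = @{p}" for c, p in mappings.items())
    insert_cols = ", ".join([match_col] + list(mappings))
    insert_vals = ", ".join([f"@{match_col}"] + [f"@{p}" for p in mappings.values()])
    return (
        f"MERGE `{dataset}.{table}` T "
        f"USING (SELECT @{match_col} AS {match_col}) S "
        f"ON T.{match_col} = S.{match_col} "
        f"WHEN MATCHED THEN UPDATE SET {updates} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )
```

Because MERGE either updates or inserts in one atomic statement, retrying it with the same matching value never creates a duplicate row, which is what makes the operation idempotent.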
Delete
Delete records from a BigQuery table based on a matching column.
Configuration
Dataset – The BigQuery dataset containing the target table
Table – The table to delete from
Matching Column – The column to match records against
Matching Value – The value to match (supports expressions)
Use cases
Data cleanup – Remove outdated or invalid records
GDPR compliance – Delete personal data on request
Workflow automation – Remove processed records from staging tables
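The Delete action maps onto a single parameterized DML DELETE keyed on the matching column. A minimal sketch (again illustrative, not Cargo's actual statement):

```python
def build_delete(dataset: str, table: str, match_col: str) -> str:
    """Build a parameterized BigQuery DELETE keyed on the matching column."""
    return f"DELETE FROM `{dataset}.{table}` WHERE {match_col} = @match_value"

# build_delete("staging", "events", "id")
# -> DELETE FROM `staging.events` WHERE id = @match_value
```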
BigQuery data models
Cargo allows you to create data models on top of your BigQuery data that can be used to trigger Plays and power workflows.
Creating BigQuery data models
To create a BigQuery data model:
Navigate to Data Models in Cargo
Click Create data model
Select BigQuery as the source
Configure the following fields:
Name – Choose a descriptive name for your model
Slug – Set a unique identifier that cannot be changed once created
Dataset – Select the BigQuery dataset containing your data
Table – Select the table or view to model
ID Column – The column containing unique record identifiers
Title Column – The column to display as the record title
Cursor Column – (Optional) Column for incremental syncing (date or number)
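The optional Cursor Column is what enables incremental syncing: instead of rescanning the whole table, each sync reads only rows whose cursor value is past the last one seen. A hedged sketch of the query shape (the exact query Cargo runs is internal; this only illustrates the idea):

```python
def incremental_query(dataset: str, table: str, cursor_col: str,
                      last_cursor=None) -> str:
    """Query rows added since the previous sync, ordered by the cursor column."""
    base = f"SELECT * FROM `{dataset}.{table}`"
    if last_cursor is not None:
        # Only rows past the cursor value recorded on the previous run.
        base += f" WHERE {cursor_col} > @last_cursor"
    return base + f" ORDER BY {cursor_col}"
```

On the first run there is no stored cursor, so the full table is read; every later run filters on the cursor and remembers the largest value it saw.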
Using BigQuery data models
Once created, your BigQuery data model can be used to:
Trigger Plays – Start automated workflows when data changes
Power enrichment – Use BigQuery data to enrich records in workflows
Create segments – Filter and target specific records from your data
Required permissions
Ensure your service account has the following IAM roles:
BigQuery Data Editor – Read and write data in BigQuery tables
BigQuery Job User – Run queries and jobs
BigQuery Data Viewer – (Optional) Read access to additional datasets outside your main dataset
Required permissions breakdown
bigquery.tables.create – Create new tables
bigquery.tables.getData – Read data from tables
bigquery.tables.updateData – Write data to tables
bigquery.jobs.create – Execute queries and jobs
Network configuration
If you restrict access to your BigQuery instance, add these Cargo IP addresses to your VPC firewall rules:
3.251.34.134
54.220.135.99
79.125.105.52
Security
All BigQuery connections use Google’s secure authentication
Service account keys are encrypted at rest
Data in transit is encrypted using TLS
Cargo never overwrites existing tables—it always creates its own