Permissions
What Cargo can do
- Read data from datasets and tables, even if they are spread across multiple projects
- Write data into new datasets and tables
- Overwrite existing datasets and tables (Cargo always creates its own datasets and tables when needed)
Before you begin
You need an existing Google Cloud project with billing enabled and a payment method attached. Follow the official Google guide to create one. This guide covers enabling and creating the following elements in your GCP project:
- BigQuery API & Cloud Resource Manager API
- Cloud Storage bucket (dedicated to Cargo)
- Service Account with appropriate permissions
If you have an existing BigQuery project and technical knowledge, you may skip any steps you've already completed.
Enable the necessary APIs
Cargo uses two Google APIs that must be enabled:
- Go to the Google Cloud Console
- Select APIs & Services
- Select Enabled APIs & Services
- Search for and enable the following APIs:
  - BigQuery API
  - Cloud Resource Manager API
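If you prefer the command line, the same two APIs can be enabled with a single `gcloud` call. This is a sketch; the project ID below is a placeholder to replace with your own.

```shell
# Placeholder project ID -- substitute your own.
PROJECT_ID="my-gcp-project"

# Enable both APIs Cargo needs in one command.
gcloud services enable \
  bigquery.googleapis.com \
  cloudresourcemanager.googleapis.com \
  --project="$PROJECT_ID"
```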
Create a storage bucket
To enable Cargo to load and unload data from BigQuery as your system of records, you need a dedicated storage bucket.
- Go to the Google Cloud Console
- Search for Cloud Storage in the search bar
- Click Create bucket and follow the setup wizard
- Note down the bucket name and location for later
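The same bucket can be created from the CLI. A minimal sketch, assuming placeholder project, bucket, and location names:

```shell
# Placeholder names -- substitute your own project, bucket, and location.
PROJECT_ID="my-gcp-project"
BUCKET="cargo-staging-bucket"
LOCATION="EU"

# Create a bucket dedicated to Cargo. Note the location: you will
# need it again when filling in the Cargo connection form later.
gcloud storage buckets create "gs://$BUCKET" \
  --project="$PROJECT_ID" \
  --location="$LOCATION"
```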
Step 1: Create a service account
Create a service account with the necessary permissions for Cargo to interact with BigQuery. The steps below use the Google Cloud Console; the same setup can be done with the gcloud CLI.
- Go to the Google Cloud Console
- Navigate to IAM & Admin → Service Accounts
- Click Create service account
- Give the service account a name (e.g., cargo-bigquery)
- Grant the following roles:
  - BigQuery Data Editor – Read and write data in BigQuery tables
  - BigQuery Job User – Run queries and jobs
  - Storage Object User – Read/write to the Cloud Storage bucket
- Click Done
- Click on the newly created service account
- Go to the Keys tab
- Click Add Key → Create new key → JSON
- Save the downloaded JSON file securely—you’ll need its contents for the connection
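The steps above can be sketched with the gcloud CLI as well. The project ID and display name are placeholders; the three roles match the ones listed above.

```shell
PROJECT_ID="my-gcp-project"   # placeholder -- use your own project ID
SA_NAME="cargo-bigquery"
SA_EMAIL="$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com"

# Create the service account.
gcloud iam service-accounts create "$SA_NAME" \
  --project="$PROJECT_ID" \
  --display-name="Cargo BigQuery"

# Grant the three roles from the list above at the project level.
for ROLE in roles/bigquery.dataEditor roles/bigquery.jobUser roles/storage.objectUser; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:$SA_EMAIL" \
    --role="$ROLE"
done

# Download a JSON key -- keep this file secret; its contents go into
# the Cargo connection form.
gcloud iam service-accounts keys create cargo-key.json \
  --iam-account="$SA_EMAIL"
```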
Step 2: Grant additional permissions (optional)
If you need Cargo to access data outside the default dataset, grant the service account read access to those datasets (for example, the BigQuery Data Viewer role on each one).
Step 3: Verify permissions
Ensure the Cargo service account has the following permissions:

| Permission | Purpose |
|---|---|
| bigquery.tables.create | Create new tables |
| bigquery.tables.getData | Read table data |
| bigquery.tables.updateData | Update table data |
| bigquery.jobs.create | Run BigQuery jobs |
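One way to check this from the CLI is to list the roles bound to the service account; the names below are placeholders. The roles granted in Step 1 (BigQuery Data Editor and BigQuery Job User) include all four permissions in the table above.

```shell
PROJECT_ID="my-gcp-project"   # placeholder -- use your own project ID
SA_EMAIL="cargo-bigquery@$PROJECT_ID.iam.gserviceaccount.com"

# Print every role bound to the service account. The output should
# include roles/bigquery.dataEditor, roles/bigquery.jobUser, and
# roles/storage.objectUser.
gcloud projects get-iam-policy "$PROJECT_ID" \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:$SA_EMAIL" \
  --format="value(bindings.role)"
```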
Step 4: Connect to Cargo
Now that everything is set up, connect BigQuery as your system of records in Cargo:
- Navigate to Workspace settings → System of records
- Fill in the settings form:
- Service Account: Paste the contents of your JSON key file
- Location: Select the location you chose when creating the storage bucket
- Bucket: Enter the name of your storage bucket
- Scope: Select Dataset
- Dataset: Enter the name of your dedicated dataset for Cargo (e.g., cargo_dataset)
- Click Setup
Your BigQuery integration is now complete!

