This guide walks you through setting up BigQuery as your system of records in Cargo. By the end, Cargo will have the necessary permissions to read and write data efficiently.

Permissions

What Cargo can do
  • Read data from datasets and tables, even if they are spread across multiple projects
  • Write data into new datasets and tables
What Cargo will never do
  • Overwrite existing datasets and tables (Cargo always creates its own datasets and tables when needed)

Before you begin

You need an existing Google Cloud project with billing enabled and a valid payment method attached. Follow the official Google guide to create one. This guide covers enabling or creating the following in your GCP project:
  • BigQuery API & Cloud Resource Manager API
  • Cloud Storage bucket (dedicated to Cargo)
  • Service Account with appropriate permissions
If you have an existing BigQuery project and technical knowledge, you may skip any steps you’ve already completed.

Enable the necessary APIs

Cargo uses two Google APIs that must be enabled:
  1. Go to the Google Cloud Console
  2. Select APIs & Services
  3. Select Enabled APIs & Services
  4. Search for and enable the following APIs:
    • BigQuery API
    • Cloud Resource Manager API
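If you prefer the command line, both APIs can also be enabled with gcloud (the project ID below is a placeholder):
gcloud services enable bigquery.googleapis.com \
    cloudresourcemanager.googleapis.com \
    --project=your-project-id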

Create a storage bucket

To enable Cargo to load and unload data from BigQuery as your system of records, you need a dedicated storage bucket.
  1. Go to the Google Cloud Console
  2. Search for Cloud Storage in the search bar
  3. Click Create bucket and follow the setup wizard
  4. Note down the bucket name and location for later
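Equivalently, the bucket can be created from the command line. The bucket name and location below are placeholders; pick a location that matches where your BigQuery data lives:
gcloud storage buckets create gs://your-cargo-bucket \
    --project=your-project-id \
    --location=EU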

Step 1: Create a service account

Create a service account with the necessary permissions for Cargo to interact with BigQuery.
  1. Go to the Google Cloud Console
  2. Navigate to IAM & Admin → Service Accounts
  3. Click Create service account
  4. Give the service account a name (e.g., cargo-bigquery)
  5. Grant the following roles:
    • BigQuery Data Editor – Read and write data in BigQuery tables
    • BigQuery Job User – Run queries and jobs
    • Storage Object User – Read/write to the Cloud Storage bucket
  6. Click Done
  7. Click on the newly created service account
  8. Go to the Keys tab
  9. Click Add Key → Create new key → JSON
  10. Save the downloaded JSON file securely—you’ll need its contents for the connection
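The same service account, role grants, and key can also be created with gcloud. This is a sketch using placeholder project and account names; adjust them to your setup:
gcloud iam service-accounts create cargo-bigquery \
    --project=your-project-id

# Grant the three roles listed above
for role in roles/bigquery.dataEditor roles/bigquery.jobUser roles/storage.objectUser; do
    gcloud projects add-iam-policy-binding your-project-id \
        --member="serviceAccount:[email protected]" \
        --role="$role"
done

# Download a JSON key for the connection form
gcloud iam service-accounts keys create cargo-bigquery-key.json \
    --iam-account=[email protected]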

Step 2: Grant additional permissions (optional)

If you need Cargo to access data outside its dedicated dataset, grant the service account read access on each additional dataset. Note that the bq CLI's add-iam-policy-binding command targets tables and views, so the simplest way to grant access at the dataset level is a GRANT statement:
bq query --use_legacy_sql=false \
    'GRANT `roles/bigquery.dataViewer`
     ON SCHEMA `your-project-id.other_dataset`
     TO "serviceAccount:[email protected]"'
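You can confirm the grant took effect by inspecting the dataset's access entries (same placeholder dataset name as above):
bq show --format=prettyjson your-project-id:other_dataset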

Step 3: Verify permissions

Ensure the Cargo service account has the following permissions:
Permission                    Purpose
bigquery.tables.create        Create new tables
bigquery.tables.getData       Read table data
bigquery.tables.updateData    Update table data
bigquery.jobs.create          Run BigQuery jobs
Check service account permissions:
gcloud projects get-iam-policy your-project-id \
    --flatten="bindings[].members" \
    --format='table(bindings.role)' \
    --filter="bindings.members:[email protected]"
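As a final smoke test, you can authenticate as the service account and run a trivial query. This sketch assumes the JSON key file created earlier and that your default project is set; it will fail fast if a permission is missing:
gcloud auth activate-service-account \
    --key-file=cargo-bigquery-key.json
bq query --use_legacy_sql=false 'SELECT 1'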

Step 4: Connect to Cargo

Now that everything is set up, connect BigQuery as your system of records in Cargo:
  1. Navigate to Workspace settings → System of records
  2. Fill in the settings form:
    • Service Account: Paste the contents of your JSON key file
    • Location: Select the location you chose when creating the storage bucket
    • Bucket: Enter the name of your storage bucket
    • Scope: Select Dataset
    • Dataset: Enter the name of your dedicated dataset for Cargo (e.g., cargo_dataset)
  3. Click Setup
Your BigQuery integration is now complete!

Next steps