## Permissions

What Cargo can do:

- Read data from datasets and tables, even if they are spread across multiple projects
- Write data into new datasets and tables
- Overwrite existing datasets and tables (Cargo always creates its own datasets and tables when needed)
## Before you begin

You need an existing Google Cloud project with billing enabled and a valid payment method. Follow the official Google guide to create one. This guide covers enabling and creating the following elements in your GCP project:

- BigQuery API & Cloud Resource Manager API
- BigQuery dataset (dedicated to Cargo)
- Object Storage bucket (dedicated to Cargo)
- Service account

If you already have a BigQuery project and the relevant technical knowledge, you may skip any steps you have already completed.
## Enable the necessary APIs

Cargo uses two Google APIs that must be enabled:

- Go to the Google Cloud Console
- Select APIs & Services
- Select Enabled APIs & Services
- Search for and enable the following APIs:
  - BigQuery API
  - Cloud Resource Manager API
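If you prefer the command line, both APIs can be enabled in a single `gcloud` call. This is a sketch; `my-project` is a placeholder for your own project ID:

```sh
# Enable the BigQuery and Cloud Resource Manager APIs
# ("my-project" is a placeholder project ID)
gcloud services enable \
    bigquery.googleapis.com \
    cloudresourcemanager.googleapis.com \
    --project=my-project
```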
## Create a storage bucket

To enable Cargo to load and unload data from BigQuery, you need a dedicated storage bucket.

- Go to the Google Cloud Console
- Search for Cloud Storage in the search bar
- Click Create bucket and follow the setup wizard
- Note down the bucket name and location for later
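As a CLI alternative, the bucket can be created with `gcloud storage`. A minimal sketch; the bucket name `my-cargo-bucket` and the location `EU` are placeholders you should replace with your own values:

```sh
# Create a dedicated bucket for Cargo; note the name and location for later
gcloud storage buckets create gs://my-cargo-bucket \
    --project=my-project \
    --location=EU
```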
## Step 1: Create a dedicated dataset

Create a dedicated dataset for Cargo where all data managed by Cargo will be stored.

1. Go to the Google Cloud Console
2. Search for BigQuery in the search bar
3. In BigQuery Studio, click the three dots next to your project name
4. Click Create dataset
5. Name it `cargo_dataset`
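The same dataset can be created with the `bq` CLI. A sketch assuming the placeholder project `my-project`; choose a location consistent with your storage bucket:

```sh
# Create the dedicated dataset in the same location as the bucket
bq --location=EU mk \
    --dataset \
    my-project:cargo_dataset
```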
## Step 2: Create a service account

Create a service account with the necessary permissions for Cargo to interact with BigQuery.

- Go to the Google Cloud Console
- Navigate to IAM & Admin → Service Accounts
- Click Create service account
- Give the service account a name (e.g., `cargo-service-account`)
- Grant the following roles:
  - BigQuery Data Editor
  - BigQuery Job User
  - Storage Object User
- Click Done
- Click on the newly created service account
- Go to the Keys tab
- Click Add Key → Create new key → JSON
- Save the downloaded JSON file securely
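The equivalent CLI flow, sketched with placeholder names (`my-project`, `cargo-service-account`, `cargo-key.json`):

```sh
# Create the service account
gcloud iam service-accounts create cargo-service-account \
    --project=my-project

# Grant the three roles at the project level
for role in roles/bigquery.dataEditor roles/bigquery.jobUser roles/storage.objectUser; do
  gcloud projects add-iam-policy-binding my-project \
      --member="serviceAccount:cargo-service-account@my-project.iam.gserviceaccount.com" \
      --role="$role"
done

# Create and download a JSON key; store this file securely
gcloud iam service-accounts keys create cargo-key.json \
    --iam-account=cargo-service-account@my-project.iam.gserviceaccount.com
```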
## Step 3: Grant additional permissions (optional)
If you need Cargo to access data outside the dedicated dataset, grant read access to the other datasets, as sketched below.
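One way to do this is BigQuery's SQL `GRANT` statement, run here through `bq query`. A sketch; `other_dataset` is a hypothetical dataset name, and the service account address matches the placeholders from Step 2:

```sh
# Give the service account read-only access to one additional dataset
bq query --use_legacy_sql=false \
  'GRANT `roles/bigquery.dataViewer`
   ON SCHEMA `my-project.other_dataset`
   TO "serviceAccount:cargo-service-account@my-project.iam.gserviceaccount.com"'
```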
## Step 4: Verify permissions

Ensure the Cargo service account has the following permissions:

| Permission | Purpose |
|---|---|
| `bigquery.datasets.create` | Create new datasets |
| `bigquery.tables.create` | Create new tables |
| `bigquery.tables.getData` | Read table data |
| `bigquery.tables.updateData` | Update table data |
| `bigquery.jobs.create` | Run BigQuery jobs |
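To check what the service account can actually do, you can list its project-level role bindings (a sketch reusing the placeholder names from the previous steps):

```sh
# Show every role bound to the Cargo service account in the project
gcloud projects get-iam-policy my-project \
    --flatten="bindings[].members" \
    --format="table(bindings.role)" \
    --filter="bindings.members:cargo-service-account@my-project.iam.gserviceaccount.com"
```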
## Step 5: Connect to Cargo

Now that everything is set up, connect BigQuery to Cargo:

- Navigate to Workspace settings → System of records
- Fill in the settings form:
  - Service Account: Paste the contents of your JSON key file
  - Location: Select the location you chose when creating the storage bucket
  - Bucket: Enter the name of your storage bucket
  - Scope: Select `Dataset`
  - Dataset: Enter `cargo_dataset` (or your dataset name from Step 1)
- Click Setup

Your BigQuery integration is now complete!
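If the connection fails, you can sanity-check the key locally before retrying. A sketch assuming the `cargo-key.json` file from Step 2:

```sh
# Authenticate as the service account and confirm it can list datasets
gcloud auth activate-service-account --key-file=cargo-key.json
bq ls --project_id=my-project
```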

