Permissions
What Cargo can do
- Read data from datasets and tables, even if they are spread across multiple projects
- Write data into new datasets and tables
- Overwrite existing datasets and tables (Cargo always creates its own datasets and tables when needed)
Before you begin
To start, you need an existing Google Cloud project with billing enabled and a valid payment method; follow the official Google guide to create one. Once the project is created, continue with this guide, which covers enabling and creating the necessary elements in your new GCP project:
- BigQuery API & Cloud Resource Manager API
- BigQuery dataset (dedicated to Cargo)
- Object Storage Bucket (dedicated to Cargo)
- Service Account
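If you prefer the terminal, the two APIs listed above can be enabled with the gcloud CLI. A sketch; `my-gcp-project` is a placeholder project ID, substitute your own:

```shell
# Enable the BigQuery API and the Cloud Resource Manager API
# for the project (placeholder project ID).
gcloud services enable \
  bigquery.googleapis.com \
  cloudresourcemanager.googleapis.com \
  --project=my-gcp-project
```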
To enable Cargo to load and unload data from BigQuery, it needs a dedicated storage bucket. To create a new bucket:
- Go to the Google Cloud Console
- Search for Object storage in the search bar
- Create a new bucket and follow the steps
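The console steps above can also be done from the gcloud CLI. A sketch, assuming a placeholder project ID and bucket name:

```shell
# Placeholder values; replace with your own.
PROJECT_ID="my-gcp-project"
BUCKET_NAME="cargo-staging-bucket"

# Create a dedicated bucket for Cargo's load/unload staging data.
# Pick a location consistent with the BigQuery dataset created next.
gcloud storage buckets create "gs://${BUCKET_NAME}" \
  --project="$PROJECT_ID" \
  --location=US
```

Bucket names are globally unique, so the placeholder above will need to be changed.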
Step 1: Create a dedicated dataset
Create a dedicated dataset for Cargo where all data managed by Cargo will be stored, for example a dataset called "cargo_dataset". You can create it through the BigQuery console:
- Go to the Google Cloud Console
- Search for BigQuery in the search bar
- In BigQuery Studio, click the three dots next to the project name
- Click Create dataset
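Equivalently, the dataset can be created with the bq CLI. A sketch, assuming a placeholder project ID and the "cargo_dataset" name suggested above:

```shell
# Create the dataset dedicated to Cargo (placeholder project ID).
# Note the location you choose here: the Cargo setup form in step 5 asks for it.
bq mk \
  --dataset \
  --location=US \
  --description="Dataset dedicated to Cargo" \
  my-gcp-project:cargo_dataset
```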
Step 2: Create a service account for Cargo
Grant the necessary permissions for Cargo to run commands as an authenticated service account on the dataset you just created. To create a service account:
- Go to the Google Cloud Console
- Click on IAM & Admin
- Click on Service Accounts
- Click on Create service account
- Give the service account a name
- Grant the following roles: BigQuery Data Editor, BigQuery Job User, Storage Object User
- Click on Done
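The same service account, role grants, and the key file needed in step 5 can also be produced with the gcloud CLI. A sketch; the project ID and service-account name are placeholders:

```shell
# Placeholder values; replace with your own.
PROJECT_ID="my-gcp-project"
SA_NAME="cargo-sa"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account
gcloud iam service-accounts create "$SA_NAME" \
  --project="$PROJECT_ID" \
  --display-name="Cargo service account"

# Grant the three roles listed above
for ROLE in roles/bigquery.dataEditor roles/bigquery.jobUser roles/storage.objectUser; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="$ROLE"
done

# Create the JSON key file whose content is pasted into Cargo in step 5
gcloud iam service-accounts keys create cargo-sa-key.json \
  --iam-account="$SA_EMAIL"
```

Keep the generated key file safe; it grants everything the service account can do.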
Step 3: Grant additional permissions
If you need Cargo to access data outside the dedicated dataset, grant the service account read access to those other datasets.
Step 4: Verify permissions
Make sure the Cargo service account has the following permissions:
bigquery.datasets.create
bigquery.tables.create
bigquery.tables.getData
bigquery.tables.updateData
bigquery.jobs.create
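One way to check is to list the roles bound to the service account on the project; the three roles granted in step 2 (BigQuery Data Editor, BigQuery Job User, Storage Object User) together cover the five permissions above. A sketch with placeholder values:

```shell
# Placeholder values; replace with your own.
PROJECT_ID="my-gcp-project"
SA_EMAIL="cargo-sa@${PROJECT_ID}.iam.gserviceaccount.com"

# List every role granted to the Cargo service account on the project.
gcloud projects get-iam-policy "$PROJECT_ID" \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:${SA_EMAIL}" \
  --format="value(bindings.role)"
```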
Step 5: Setup system of records
Now that we have all the required elements, navigate to the workspace settings and select "System of records". Fill in the settings form with the data gathered in the previous steps:
- Copy and paste the content of the service account key file into the field labeled Service Account
- Select the location that was chosen when creating the dataset in step 1
- Fill in the name of the bucket created in the "Before you begin" section
- Select Dataset as scope
- Fill in the name of the BigQuery dataset created in step 1
- Click Setup
Next steps
- Create your first data model using BigQuery tables
- Set up data connectors to import data from other sources
- Configure model relationships to connect different datasets
- Set up filters and segments for targeted plays