PostgreSQL is a powerful, open-source relational database system with over 35 years of active development. Cargo’s native integration with PostgreSQL allows you to connect your database, create data models, and execute SQL operations directly in your workflows.
Supabase compatible — This integration works with Supabase, Neon, Railway, and any PostgreSQL-compatible database.
How to set up PostgreSQL
Prerequisites
Before connecting PostgreSQL to Cargo, ensure you have:
- An active PostgreSQL server (version 9.6 or higher)
- Network connectivity between Cargo and your PostgreSQL server
- A database user with appropriate permissions
- SSL enabled for secure connections (recommended; a quick check is shown below)
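If you want to confirm these prerequisites before creating the connector, you can run a few checks from any SQL client. This is a minimal sketch and assumes you can already reach the server with a working login:

```sql
-- Check the server version (9.6 or higher is required)
SELECT version();

-- Confirm that SSL is enabled on the server
SHOW ssl;

-- Confirm that your current session is actually using SSL
SELECT ssl, version AS tls_version
FROM pg_stat_ssl
WHERE pid = pg_backend_pid();
```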
Connection details
To set up the connection, provide the following details when creating the connector:
| Field | Description |
|---|---|
| Host | Your PostgreSQL server hostname or IP address |
| Port | Default is 5432 |
| Database | Your database name |
| Username | The Cargo service user (e.g., cargo_user) |
| Password | The user’s password |
| SSL | Enable SSL/TLS encryption (recommended) |
Standard PostgreSQL
Use your PostgreSQL server’s hostname and credentials directly.
Host: db.example.com
Port: 5432
Database: myapp
Username: cargo_user
Password: ********
SSL: true
Supabase
Find your connection details in Project Settings → Database. Use the Direct connection string for best performance with Cargo.
Host: db.xxxxxxxxxxxx.supabase.co
Port: 5432
Database: postgres
Username: postgres
Password: ********
SSL: true
Neon
Find your connection details in your Neon project dashboard.
Host: ep-xxxxx.region.aws.neon.tech
Port: 5432
Database: neondb
Username: your_username
Password: ********
SSL: true
PostgreSQL actions
Once connected, you can use PostgreSQL in your workflows with the following actions:
Insert
Insert new records into a PostgreSQL table.
Configuration
| Field | Description |
|---|---|
| Schema | The PostgreSQL schema containing the target table |
| Table | The table to insert data into |
| Mappings | Map columns to values using expressions |
Use cases
- Lead capture – Insert new leads from form submissions or enrichment workflows
- Event logging – Record workflow events and outcomes
- Data aggregation – Store computed results for reporting
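Conceptually, each configured mapping becomes a column/value pair in a SQL INSERT. The sketch below is only an illustration; the `public.leads` table and its columns are hypothetical, not something Cargo creates for you:

```sql
-- Hypothetical lead-capture table; in practice the values would come
-- from the expressions configured in the action's mappings
INSERT INTO public.leads (email, first_name, source, created_at)
VALUES ('jane@example.com', 'Jane', 'webform', now());
```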
Update
Update existing records in a PostgreSQL table based on a matching column.
Configuration
| Field | Description |
|---|---|
| Schema | The PostgreSQL schema containing the target table |
| Table | The table to update |
| Matching Column | The column to match records against |
| Matching Value | The value to match (supports expressions) |
| Mappings | Map columns to new values using expressions |
Use cases
- Data enrichment – Update records with enriched data from external sources
- Status updates – Mark records as processed or update stages
- Sync external changes – Keep PostgreSQL in sync with CRM or other systems
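The update action behaves like a SQL UPDATE filtered on the matching column. A hypothetical sketch, again using a made-up `public.leads` table:

```sql
-- Matching column: email; matching value: supplied by an expression
-- Mappings: stage and enriched_at receive new values
UPDATE public.leads
SET stage = 'processed',
    enriched_at = now()
WHERE email = 'jane@example.com';
```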
Upsert
Create new records or update existing ones based on a matching column. Uses PostgreSQL’s native ON CONFLICT clause for optimal performance.
Configuration
| Field | Description |
|---|---|
| Schema | The PostgreSQL schema containing the target table |
| Table | The table to upsert into |
| Matching Column | The column to match records against |
| Matching Value | The value to match (supports expressions) |
| Mappings | Map columns to values using expressions |
Use cases
- Data sync – Keep your database updated regardless of whether records exist
- Idempotent operations – Safely retry operations without creating duplicates
- Master data management – Maintain a single source of truth
The upsert action requires a unique constraint or primary key on the matching column.
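In SQL terms this corresponds to INSERT ... ON CONFLICT ... DO UPDATE, which is why the matching column needs a unique constraint. A hypothetical sketch on the same made-up `leads` table:

```sql
-- The matching column must be unique (or the primary key)
ALTER TABLE public.leads ADD CONSTRAINT leads_email_key UNIQUE (email);

-- Upsert: insert a new row, or update the existing one on conflict
INSERT INTO public.leads (email, first_name, stage)
VALUES ('jane@example.com', 'Jane', 'new')
ON CONFLICT (email)
DO UPDATE SET first_name = EXCLUDED.first_name,
              stage = EXCLUDED.stage;
```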
Delete
Delete records from a PostgreSQL table based on a matching column.
Configuration
| Field | Description |
|---|
| Schema | The PostgreSQL schema containing the target table |
| Table | The table to delete from |
| Matching Column | The column to match records against |
| Matching Value | The value to match (supports expressions) |
Use cases
- Data cleanup – Remove outdated or invalid records
- GDPR compliance – Delete personal data on request
- Workflow automation – Remove processed records from staging tables
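The delete action maps to a SQL DELETE filtered on the matching column. A hypothetical sketch using a made-up staging table:

```sql
-- Remove a processed record from a hypothetical staging table
DELETE FROM public.staging_leads
WHERE email = 'jane@example.com';
```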
PostgreSQL data models
Cargo allows you to create data models on top of your PostgreSQL data that can be used to trigger Plays and power workflows.
Creating PostgreSQL data models
To create a PostgreSQL data model:
- Navigate to Data Models in Cargo
- Click Create data model
- Select PostgreSQL as the source
- Configure the following fields:
| Field | Description |
|---|---|
| Name | Choose a descriptive name for your model |
| Slug | Set a unique identifier that cannot be changed once created |
| Schema | Select the PostgreSQL schema containing your data |
| Table | Select the table or view to model |
| ID Column | The column containing unique record identifiers |
| Title Column | The column to display as the record title |
| Cursor Column | (Optional) Column for incremental syncing (date or number) |
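If a table does not expose a convenient ID, title, or cursor column, you can point the model at a view instead (the table-or-view choice above). The sketch below is hypothetical and only illustrates the shape such a view might take, with `updated_at` serving as the cursor for incremental syncing:

```sql
-- Hypothetical view shaped for a Cargo data model:
-- a unique id, a human-readable title, and an updated_at cursor
CREATE VIEW public.lead_model AS
SELECT id,
       email AS title,
       updated_at
FROM public.leads;
```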
Using PostgreSQL data models
Once created, your PostgreSQL data model can be used to:
- Trigger Plays – Start automated workflows when data changes
- Power enrichment – Use PostgreSQL data to enrich records in workflows
- Create segments – Filter and target specific records from your data
Database user setup
As a security best practice, create a dedicated user with only the permissions Cargo needs:
-- Create a dedicated user
CREATE USER cargo_user WITH PASSWORD 'your_secure_password';
-- Grant connect privilege
GRANT CONNECT ON DATABASE your_database TO cargo_user;
-- Grant usage on schema
GRANT USAGE ON SCHEMA public TO cargo_user;
-- Grant permissions on specific tables
GRANT SELECT, INSERT, UPDATE, DELETE ON TABLE your_table TO cargo_user;
-- Or grant on all tables in a schema
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO cargo_user;
-- For future tables created in the schema
ALTER DEFAULT PRIVILEGES IN SCHEMA public
GRANT SELECT, INSERT, UPDATE, DELETE ON TABLES TO cargo_user;
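To confirm the user ended up with only the intended privileges, you can query the standard information_schema views. This is a quick sanity check, not something Cargo requires:

```sql
-- List the table privileges held by cargo_user
SELECT table_schema, table_name, privilege_type
FROM information_schema.role_table_grants
WHERE grantee = 'cargo_user'
ORDER BY table_schema, table_name;
```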
Network configuration
If you restrict access to your PostgreSQL server, add these Cargo IP addresses to your firewall or security group:
3.251.34.134
54.220.135.99
79.125.105.52
Supabase
In Supabase, network restrictions are configured in Project Settings → Database → Network restrictions.
AWS RDS
Update your RDS security group to allow inbound traffic on port 5432 from Cargo’s IPs.
Standard PostgreSQL
Update your pg_hba.conf to allow connections from Cargo’s IP addresses.
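For a self-hosted server, the pg_hba.conf entries might look like the sketch below. It assumes the database and user names from the setup above; on servers older than PostgreSQL 10, use md5 instead of scram-sha-256:

```
# Allow Cargo's IPs to reach your_database as cargo_user, over SSL only
hostssl  your_database  cargo_user  3.251.34.134/32   scram-sha-256
hostssl  your_database  cargo_user  54.220.135.99/32  scram-sha-256
hostssl  your_database  cargo_user  79.125.105.52/32  scram-sha-256
```

After editing pg_hba.conf, reload the configuration (for example with SELECT pg_reload_conf();) so the new rules take effect.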
Security
- All PostgreSQL connections can be encrypted using SSL/TLS
- Credentials are securely stored and encrypted at rest
- Cargo uses dedicated user credentials with minimal required permissions
- We recommend enabling SSL and using strong passwords