Firecrawl is a web scraping API that turns any website into clean, structured data. Cargo’s native integration with Firecrawl allows you to scrape individual pages or crawl entire websites directly from your workflows.

How to set up Firecrawl

You can use Firecrawl in two ways:
  1. Cargo credits – Use Firecrawl through Cargo’s managed integration
  2. Your own API key – Connect your Firecrawl account

Connection details

  • API Key – Your Firecrawl API key
Get your API key from the Firecrawl dashboard.

Firecrawl actions

Scrape

Extract content from a single URL.
Use cases
  • Page content extraction – Get clean text from any webpage
  • Product data – Scrape product information from e-commerce sites
  • News monitoring – Extract articles and updates
  • Research – Gather information from multiple sources
Configuration
  • URL – URL of the page to scrape
Credit cost: 0.05 credits per page
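Under the hood, a scrape action boils down to one request against Firecrawl's API with the page URL in the body. A minimal sketch of what that request looks like, assuming Firecrawl's public v1 `/scrape` endpoint (the endpoint path, header format, and `fc-` key prefix are assumptions about Firecrawl's REST API, not Cargo internals — Cargo's managed integration handles all of this for you):

```python
FIRECRAWL_SCRAPE_URL = "https://api.firecrawl.dev/v1/scrape"  # assumed v1 endpoint
API_KEY = "fc-your-api-key"  # placeholder; from the Firecrawl dashboard

def build_scrape_payload(url: str) -> dict:
    """Build the request body for a single-page scrape.

    The only required field Cargo exposes for this action is the URL.
    """
    return {"url": url}

payload = build_scrape_payload("https://example.com/pricing")

# The request itself would be an authenticated POST, roughly:
#   POST https://api.firecrawl.dev/v1/scrape
#   Authorization: Bearer fc-your-api-key
#   body: {"url": "https://example.com/pricing"}
```

The response contains the page content as clean text, which the workflow can pass to downstream steps.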

Crawl

Recursively crawl a website and extract content from multiple pages.
Use cases
  • Site-wide extraction – Crawl documentation or knowledge bases
  • Competitive analysis – Extract content from competitor sites
  • Content aggregation – Gather content across multiple pages
  • Data collection – Build datasets from websites
Configuration
  • URL – Starting URL for the crawl
  • Maximum depth – How deep to crawl (0 = single page only)
  • Includes – Path patterns to include
  • Excludes – Path patterns to exclude
  • Limit – Maximum pages to crawl
Options
  • Ignore sitemap – Don’t use the site’s sitemap for discovery
  • Allow backward links – Allow crawling back to previously linked pages
  • Allow external links – Follow links to external websites
Use Include and Exclude patterns to focus your crawl on specific sections of a website.
Credit cost: 0.05 credits per page crawled

Firecrawl data models

Create data models from crawled websites to power your workflows.

Crawl extractor

Crawl a website and create records from each page.
Configuration
  • URL – Starting URL for the crawl
  • Maximum depth – How deep to crawl
  • Includes – Path patterns to include
  • Excludes – Path patterns to exclude
  • Limit – Maximum pages (default: 10,000)
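Conceptually, the crawl extractor turns each crawled page into one record in the data model. A sketch of that mapping, assuming Firecrawl-style result fields such as `markdown` and `metadata.sourceURL` (the field names are assumptions about the crawl response shape; Cargo performs this mapping for you):

```python
def pages_to_records(pages: list[dict]) -> list[dict]:
    """Map crawled pages to flat records, one record per page."""
    records = []
    for page in pages:
        meta = page.get("metadata", {})
        records.append({
            "url": meta.get("sourceURL", ""),   # page the content came from
            "title": meta.get("title", ""),     # page <title>, if present
            "content": page.get("markdown", ""),  # cleaned page content
        })
    return records

# One crawled page becomes one record:
sample_pages = [
    {
        "markdown": "# Getting started\nInstall the SDK...",
        "metadata": {"sourceURL": "https://docs.example.com/start",
                     "title": "Getting started"},
    },
]
records = pages_to_records(sample_pages)
```

Each record then becomes a row in the data model, ready to power enrichment or segmentation steps in your workflows.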