Overview
Get a list of all jobs submitted by your client account with filtering, pagination, and sorting options.

Query Parameters

- page: Page number (starting from 1). Example: ?page=2
- page_size: Number of jobs per page (max: 100). Example: ?page_size=25
- status_filter: Filter by job status. Options: queued, processing, completed, failed, cancelled. Example: ?status_filter=completed
- type_filter: Filter by job type. Options: spiderSite, spiderMaps. Example: ?type_filter=spiderSite
- sort_by: Field to sort by. Options: created_at, updated_at, status. Example: ?sort_by=updated_at
- sort_order: Sort order. Options: asc, desc. Example: ?sort_order=asc

Response
- Total number of jobs matching the filter
- Current page number
- Number of items per page
- Total number of pages available
- Array of job objects

Job Object
- Unique job identifier (UUID)
- Job type (spiderSite or spiderMaps)
- Current job status
- The URL that was scraped
- ISO 8601 timestamp when job was created
- ISO 8601 timestamp of last update
- ID of the worker that processed the job (if assigned)

Example Request
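The original request snippet is not reproduced on this page, so the following is a minimal sketch in Python. The endpoint path (/jobs), base URL, and bearer-token authentication are assumptions for illustration; only the query parameters come from the documentation above.

```python
import requests

# Hypothetical base URL, endpoint path, and auth header; substitute your own.
BASE_URL = "https://api.example.com"
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

# List completed jobs, newest first, 25 per page (parameters documented above).
response = requests.get(
    f"{BASE_URL}/jobs",
    headers=HEADERS,
    params={
        "page": 1,
        "page_size": 25,
        "status_filter": "completed",
        "sort_by": "created_at",
        "sort_order": "desc",
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```
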
Example Response
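The response body itself is not shown here; the sketch below only illustrates the fields described in the Response and Job Object sections. The field names are assumptions apart from created_at, updated_at, status, and page_size, which appear in the parameter documentation above, and all values are placeholders.

```json
{
  "total": 42,
  "page": 1,
  "page_size": 25,
  "total_pages": 2,
  "jobs": [
    {
      "id": "550e8400-e29b-41d4-a716-446655440000",
      "type": "spiderSite",
      "status": "completed",
      "url": "https://example.com",
      "created_at": "2024-01-15T10:30:00Z",
      "updated_at": "2024-01-15T10:35:00Z",
      "worker_id": "worker-42"
    }
  ]
}
```
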
Pagination Example
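No pagination snippet survives here, so this is a sketch under the same assumptions as the request example above (hypothetical /jobs endpoint, base URL, and auth header, plus assumed jobs and total_pages field names). It walks every page with the maximum page_size and spaces calls to respect the rate limit mentioned in the Notes.

```python
import time

import requests

BASE_URL = "https://api.example.com"                 # hypothetical
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # hypothetical

def fetch_all_jobs(**filters):
    """Collect every job matching the given filters by walking all pages."""
    jobs, page = [], 1
    while True:
        response = requests.get(
            f"{BASE_URL}/jobs",
            headers=HEADERS,
            params={"page": page, "page_size": 100, **filters},  # max page_size, fewer calls
            timeout=30,
        )
        response.raise_for_status()
        data = response.json()
        jobs.extend(data["jobs"])            # assumed field name
        if page >= data["total_pages"]:      # assumed field name
            return jobs
        page += 1
        time.sleep(0.6)  # stay under the 100 requests/minute limit

all_spider_site_jobs = fetch_all_jobs(type_filter="spiderSite")
```
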
Use Cases
Get Recent Completed Jobs
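A possible parameter set for this query, built only from the parameters documented above; pass it as params to the request sketch under Example Request.

```python
# Completed jobs, newest first.
params = {
    "status_filter": "completed",
    "sort_by": "created_at",
    "sort_order": "desc",
    "page_size": 25,
}
```
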
Get Failed Jobs for Debugging
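One way to express this filter; sorting by updated_at is a suggestion, not something this page prescribes.

```python
# Failed jobs, most recently updated first, for inspecting what went wrong.
params = {
    "status_filter": "failed",
    "sort_by": "updated_at",
    "sort_order": "desc",
}
```
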
Get All SpiderMaps Jobs
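A parameter set for listing every spiderMaps job, regardless of status.

```python
# All spiderMaps jobs; the maximum page_size keeps the page count down.
params = {
    "type_filter": "spiderMaps",
    "page_size": 100,
}
```
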
Get Jobs Currently Processing
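And a parameter set for jobs still in flight.

```python
# Jobs currently being processed.
params = {
    "status_filter": "processing",
}
```
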
Notes
- Default sorting: Jobs are sorted by created_at in descending order (newest first) by default.
- Performance: Use the maximum page_size=100 for fewer API calls when fetching large datasets.
- Rate limits apply: Each request counts toward your 100 requests/minute limit. When iterating through many pages, implement rate limiting in your code (see the sketch below).
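As a sketch of the client-side rate limiting suggested above: spacing requests at least 60/100 = 0.6 seconds apart keeps a single client under the 100 requests/minute limit. The endpoint, base URL, and auth header below are again assumptions.

```python
import time

import requests

BASE_URL = "https://api.example.com"                 # hypothetical
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # hypothetical

MIN_INTERVAL = 60 / 100  # 100 requests/minute => at least 0.6 s between calls
_last_call = 0.0

def throttled_get(path, **params):
    """GET with simple client-side spacing to respect the documented rate limit."""
    global _last_call
    wait = MIN_INTERVAL - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)
    _last_call = time.monotonic()
    response = requests.get(f"{BASE_URL}{path}", headers=HEADERS, params=params, timeout=30)
    response.raise_for_status()
    return response.json()

first_page = throttled_get("/jobs", page=1, page_size=100)
```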
