## Overview

List endpoints return paginated results to improve performance and reduce payload sizes. The API uses offset-based pagination with page numbers.
| Parameter | Type | Default | Max | Description |
|---|---|---|---|---|
| `page` | integer | 1 | - | Page number to retrieve |
| `limit` | integer | 50 | 100 | Items per page |
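With offset-based pagination, page N with a given limit covers items (N-1) × limit through N × limit - 1. The hypothetical helper below sketches that math; clamping `limit` to 100 mirrors the Max column above, though how the server actually responds to out-of-range values is an assumption here, not documented behavior.

```javascript
// Hypothetical helper, not part of the API client: computes which
// zero-based item range a given page covers. Clamping to 100 mirrors
// the documented Max; server handling of out-of-range values is assumed.
function pageWindow(page = 1, limit = 50) {
  const cappedLimit = Math.min(Math.max(limit, 1), 100);
  const offset = (page - 1) * cappedLimit;
  return { offset, limit: cappedLimit };
}
```

For example, page 2 with the default limit of 50 starts at item 50 (zero-based).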
Paginated responses include metadata about the current page and total results:

```json
{
  "data": [...],
  "meta": {
    "version": "v1",
    "pagination": {
      "currentPage": 1,
      "totalPages": 5,
      "totalCount": 237,
      "hasNextPage": true,
      "hasPreviousPage": false
    }
  }
}
```

- `currentPage`: The page number of the current result set
- `totalPages`: Total number of pages available
- `totalCount`: Total number of items across all pages
- `hasNextPage`: Whether there are more pages after this one
- `hasPreviousPage`: Whether there are pages before this one
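The two boolean flags follow directly from `currentPage` and `totalPages`. As a sketch (a hypothetical client-side helper, not an API feature), you can derive them yourself to sanity-check a response:

```javascript
// Derive the expected navigation flags from the page counters.
// The API already returns these flags; this is only a consistency check.
function derivePaginationFlags({ currentPage, totalPages }) {
  return {
    hasNextPage: currentPage < totalPages,
    hasPreviousPage: currentPage > 1,
  };
}
```

For the example response above (`currentPage: 1`, `totalPages: 5`) this yields `hasNextPage: true` and `hasPreviousPage: false`, matching the payload.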
## Making Paginated Requests

```bash
# First page
curl "https://api.inspecto.com/api/third-party/v1/vehicles?page=1&limit=50" \
  -H "Authorization: Bearer $API_KEY"

# Second page
curl "https://api.inspecto.com/api/third-party/v1/vehicles?page=2&limit=50" \
  -H "Authorization: Bearer $API_KEY"
```
## Iterating Through All Pages

```javascript
async function fetchAllVehicles() {
  const allVehicles = [];
  let page = 1;
  let hasMore = true;

  while (hasMore) {
    const response = await client.get('/vehicles', {
      params: { page, limit: 100 }
    });

    allVehicles.push(...response.data.data);
    hasMore = response.data.meta.pagination.hasNextPage;
    page++;
  }

  return allVehicles;
}
```
## Best Practices

Request the maximum number of items (100) per page to minimize API calls.

```javascript
// Good: fewer requests
const response = await client.get('/vehicles', {
  params: { limit: 100 }
});

// Bad: more requests needed
const response = await client.get('/vehicles', {
  params: { limit: 10 }
});
```
Use filters to reduce the total number of items before paginating.

```javascript
const response = await client.get('/vehicles', {
  params: {
    status: 'OK',
    type: 'TRACTOR_UNIT',
    limit: 100
  }
});
```
### Handle Pagination Changes

Data may change between page requests. Handle potential inconsistencies.

```javascript
async function fetchPageSafely(page) {
  try {
    return await client.get('/vehicles', {
      params: { page, limit: 100 }
    });
  } catch (error) {
    if (error.response?.status === 404) {
      // Page no longer exists; data changed
      return null;
    }
    throw error;
  }
}
```
For very large datasets, process pages as they arrive instead of loading all into memory.

```javascript
async function* streamVehicles() {
  let page = 1;
  let hasMore = true;

  while (hasMore) {
    const response = await client.get('/vehicles', {
      params: { page, limit: 100 }
    });

    yield* response.data.data;
    hasMore = response.data.meta.pagination.hasNextPage;
    page++;
  }
}

// Usage
for await (const vehicle of streamVehicles()) {
  await processVehicle(vehicle);
}
```
Filters are not remembered between requests; send the same filter parameters with every page so the result set stays consistent:

```javascript
// All pages will include the same filters
const params = {
  status: 'OK',
  type: 'TRACTOR_UNIT',
  limit: 100
};

// Page 1
const page1 = await client.get('/vehicles', { params: { ...params, page: 1 } });

// Page 2 with same filters
const page2 = await client.get('/vehicles', { params: { ...params, page: 2 } });
```
- **Parallel Requests**: Fetch multiple pages in parallel if order doesn't matter.
- **Cache Results**: Cache paginated results to avoid redundant requests.
- **Monitor Total Count**: Track `totalCount` to detect when data changes.
- **Optimize Page Size**: Use appropriate page sizes for your use case.
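The "Monitor Total Count" practice above can be sketched as follows. This is a hypothetical helper, not part of the API client: `fetchPage` stands in for a call like `client.get('/vehicles', { params: { page, limit: 100 } })` returning `response.data`, and the restart-on-change policy is one possible way to handle a shifting dataset.

```javascript
// Compare each page's totalCount against the first page's value;
// if it changes mid-iteration, the dataset shifted, so restart.
async function fetchAllWithCountCheck(fetchPage, maxRestarts = 3) {
  const all = [];
  let page = 1;
  let expectedCount = null;
  let restarts = 0;

  while (true) {
    const { data, meta } = await fetchPage(page);
    const { totalCount, hasNextPage } = meta.pagination;

    if (expectedCount === null) {
      expectedCount = totalCount;
    } else if (totalCount !== expectedCount) {
      if (++restarts > maxRestarts) {
        throw new Error('totalCount keeps changing; giving up');
      }
      // Data changed mid-iteration: discard and restart from page 1
      all.length = 0;
      page = 1;
      expectedCount = null;
      continue;
    }

    all.push(...data);
    if (!hasNextPage) return all;
    page++;
  }
}
```

The `maxRestarts` cap prevents an endless loop if the dataset changes on every pass; tune it (or replace the restart with a warning) to suit your consistency needs.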
### Example: Parallel Page Fetching

```javascript
async function fetchPagesInParallel(totalPages) {
  const pageNumbers = Array.from({ length: totalPages }, (_, i) => i + 1);

  const results = await Promise.all(
    pageNumbers.map(page =>
      client.get('/vehicles', {
        params: { page, limit: 100 }
      })
    )
  );

  return results.flatMap(r => r.data.data);
}

// Usage: fetch page 1 to learn the page count, then fetch all pages in parallel
const firstPage = await client.get('/vehicles', { params: { limit: 100 } });
const totalPages = firstPage.data.meta.pagination.totalPages;

if (totalPages > 1) {
  const allVehicles = await fetchPagesInParallel(totalPages);
}
```