Using the Data API

Adding Data in Batches

Use the import flow for high-volume JSONL ingestion.

The flow has two steps:

  1. Request an upload URL
  2. Upload a JSONL file to that URL

This pattern is ideal for migrations, historical backfills, or scheduled bulk loads that are too large for request/response ingest patterns.

Endpoint: POST /v1/data/{namespace}/import with an empty JSON object as the request body.

curl -X POST "https://api.safetykit.com/v1/data/products/import" \
  -H "Authorization: Bearer sk_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{}'

Example response:

{
  "status": "pending_upload",
  "requestId": "req_01h2m7qdmdjckc30e1mnq6xqfd",
  "objectKey": "TEAM_ID/data-api-request-input/products/req_01...jsonl",
  "upload_url": "https://...",
  "expires_at": "2026-02-19T12:34:56.000Z"
}
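In code, you would keep two fields from this response: upload_url (where to PUT the file) and requestId (needed later to track completion). A minimal Python sketch, parsing the example response shown above:

```python
import json

# Example import-flow response, copied from the docs above.
response_body = """
{
  "status": "pending_upload",
  "requestId": "req_01h2m7qdmdjckc30e1mnq6xqfd",
  "objectKey": "TEAM_ID/data-api-request-input/products/req_01...jsonl",
  "upload_url": "https://...",
  "expires_at": "2026-02-19T12:34:56.000Z"
}
"""

resp = json.loads(response_body)
upload_url = resp["upload_url"]  # pre-signed URL for the PUT upload
request_id = resp["requestId"]   # keep this to poll or match webhooks later
assert resp["status"] == "pending_upload"
```

Note the expires_at field: the pre-signed URL is only valid until that timestamp, so request a fresh URL if the upload is delayed.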

Upload your JSONL file to the returned upload_url with an HTTP PUT:

curl -X PUT "{upload_url}" \
  -H "Content-Type: application/json" \
  --data-binary @data.jsonl

The PUT {upload_url} call is made to a pre-signed object storage URL returned by the import endpoint above, not to a standalone SafetyKit API route.
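The same upload step can be sketched in Python with the standard library. This mirrors the curl command above (--data-binary sends the file bytes unmodified); build_upload_request is a helper name chosen here, not part of any SafetyKit SDK:

```python
from urllib import request

def build_upload_request(upload_url: str, jsonl_path: str) -> request.Request:
    """Build the PUT request that uploads a local JSONL file to the
    pre-signed URL returned by the import endpoint."""
    with open(jsonl_path, "rb") as f:
        body = f.read()  # send the file bytes as-is, like --data-binary
    return request.Request(
        upload_url,
        data=body,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )

# To actually send it (requires network access to the pre-signed URL):
# with request.urlopen(build_upload_request(upload_url, "data.jsonl")) as resp:
#     print(resp.status)
```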

The uploaded file must be line-delimited JSON (JSONL):

  • one valid JSON object per line
  • every line includes an id field

Example data.jsonl:

{"id":"product_1","title":"Item 1","price":24.0}
{"id":"product_2","title":"Item 2","price":18.5}
{"id":"product_3","title":"Item 3","price":9.99}

After upload, SafetyKit processes objects asynchronously and emits a result per object. Large imports can take time to complete, so always track the returned requestId and monitor completion via webhooks or polling.
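The status endpoint itself is not shown on this page (see Request Status and Polling), but the polling loop is generic: check periodically with exponential backoff until the import reaches a terminal state. In this sketch, check_status is a caller-supplied placeholder for whatever status call your integration uses, and the terminal state names are assumptions:

```python
import time

def wait_for_completion(request_id, check_status, timeout=3600.0,
                        initial_delay=1.0, max_delay=60.0, sleep=time.sleep):
    """Poll `check_status(request_id)` with exponential backoff until it
    returns a terminal status or `timeout` seconds of waiting elapse.
    "completed" and "failed" are assumed terminal states."""
    delay = initial_delay
    waited = 0.0
    while True:
        status = check_status(request_id)
        if status in ("completed", "failed"):
            return status
        if waited >= timeout:
            raise TimeoutError(f"import {request_id} still running after {waited:.0f}s")
        sleep(delay)
        waited += delay
        delay = min(delay * 2, max_delay)  # back off: 1s, 2s, 4s, ... up to max_delay
```

Webhooks avoid this loop entirely when you can receive callbacks; polling remains useful behind firewalls or in batch scripts.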

See Request Status and Polling and Receiving Results for retrieval patterns.