Add large batch of data
/v1/data/{namespace}/import
Request a pre-signed upload URL for importing large JSONL batches into a namespace. After receiving upload_url, upload your JSONL file with PUT {upload_url} and the header Content-Type: application/json, placing each JSON object on its own line.
Path Parameters
namespace: string
The namespace to ingest data into
Returns
expires_at: string
ISO 8601 timestamp at which upload_url expires (12 hours after issuance)
objectKey: string
S3 object key where uploaded JSONL will be processed from
requestId: string
Unique identifier for this import request
status: "pending_upload"
Import request is waiting for file upload
upload_url: string
Pre-signed URL for uploading JSONL content via PUT with Content-Type: application/json
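The 12-hour expiry window can be checked client-side before attempting an upload. This is a minimal sketch: expires_at is taken from the example response below, and the "now" value is a hypothetical clock time, not part of the API.

```python
from datetime import datetime, timezone

expires_at = "2026-01-15T12:30:00.000Z"  # value from the Returns example
# fromisoformat on Python < 3.11 rejects a trailing "Z", so normalize it first
deadline = datetime.fromisoformat(expires_at.replace("Z", "+00:00"))

now = datetime(2026, 1, 15, 0, 30, tzinfo=timezone.utc)  # hypothetical clock time
remaining = deadline - now
print(remaining.total_seconds() / 3600)  # hours left before the URL expires
```

If the remaining time is non-positive, request a fresh upload_url rather than PUT-ing to the expired one.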
Add large batch of data
curl https://api.safetykit.com/v1/data/$NAMESPACE/import \
-X POST \
-H "Authorization: Bearer $SAFETYKIT_API_KEY"
{
"expires_at": "2026-01-15T12:30:00.000Z",
"objectKey": "TEAM_ID/data-api-request-input/products/request_01K....jsonl",
"requestId": "req_01h2m7qdmdjckc30e1mnq6xqfd",
"status": "pending_upload",
"upload_url": "https://s3.amazonaws.com/..."
}
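The upload step that follows the request above can be sketched in Python. The record fields below are hypothetical examples, not a documented schema; only the one-object-per-line format and the Content-Type: application/json header come from the endpoint description.

```python
import json

def to_jsonl(records):
    """Serialize a list of dicts to JSONL: one compact JSON object per line."""
    return "\n".join(json.dumps(r, separators=(",", ":")) for r in records) + "\n"

# Hypothetical records; the real fields depend on your namespace's data.
records = [
    {"id": "prod_1", "title": "Widget"},
    {"id": "prod_2", "title": "Gadget"},
]
body = to_jsonl(records)

# With the upload_url from the import response, the upload itself would be
# (sketch, using the third-party requests library):
#   requests.put(upload_url, data=body.encode("utf-8"),
#                headers={"Content-Type": "application/json"})
print(body)
```

Each line of the uploaded file must be a complete, standalone JSON object; do not wrap the objects in an array.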