Add large batch of data
client.data.import(namespace: string, body?: DataImportParams, options?: RequestOptions): DataImportResponse { expires_at, objectKey, requestId, status, upload_url }
/v1/data/{namespace}/import
Request a pre-signed upload URL for importing large JSONL batches into a namespace. After receiving upload_url, upload your JSONL file with PUT {upload_url} and the header Content-Type: application/json, placing each JSON object on its own line.
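The upload step described above can be sketched as follows. This is a minimal sketch, not part of the SDK: the helper names and sample records are hypothetical, and it assumes a runtime with a global fetch (Node 18+).

```typescript
// Serialize records as JSONL: one JSON object per line.
// (toJsonl is a hypothetical helper, not an SDK function.)
function toJsonl(records: object[]): string {
  return records.map((r) => JSON.stringify(r)).join('\n');
}

// PUT the JSONL body to the pre-signed upload_url returned by the import request.
async function uploadJsonl(uploadUrl: string, records: object[]): Promise<void> {
  const res = await fetch(uploadUrl, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: toJsonl(records),
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
}
```

You would call uploadJsonl(response.upload_url, records) with the response from the import request, before the expires_at deadline passes.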
Parameters
namespace: string
The namespace to ingest data into
body: DataImportParams
Returns
DataImportResponse
import Safetykit from 'safetykit';
const client = new Safetykit({
apiKey: process.env['SAFETYKIT_API_KEY'], // This is the default and can be omitted
});
const response = await client.data.import('namespace');
console.log(response.expires_at);
{
"expires_at": "2026-01-15T12:30:00.000Z",
"objectKey": "TEAM_ID/data-api-request-input/products/request_01K....jsonl",
"requestId": "req_01h2m7qdmdjckc30e1mnq6xqfd",
"status": "pending_upload",
"upload_url": "https://s3.amazonaws.com/..."
}