
Add large batch of data

client.data.import(namespace: string, body?: DataImportParams, options?: RequestOptions): DataImportResponse { expires_at, objectKey, requestId, status, upload_url }
post/v1/data/{namespace}/import

Request a pre-signed upload URL for importing large JSONL batches into a namespace. After receiving upload_url, upload your JSONL file with PUT {upload_url} and the header Content-Type: application/json, placing each JSON object on its own line.

Parameters
namespace: string

The namespace to ingest data into

body: DataImportParams
Returns
DataImportResponse { expires_at, objectKey, requestId, status, upload_url }

Response containing an upload URL and metadata for large-batch import processing. Use PUT {upload_url} to upload JSONL (one JSON object per line). upload_url expires in 12 hours.

expires_at: string

ISO timestamp when upload_url expires (12 hours after issuance)

objectKey: string

S3 object key where uploaded JSONL will be processed from

requestId: string

Unique identifier for this import request

status: "pending_upload"

Import request is waiting for file upload

upload_url: string

Pre-signed upload URL for PUT-ing JSONL content (Content-Type: application/json)

Add large batch of data
import Safetykit from 'safetykit';

const client = new Safetykit({
  apiKey: process.env['SAFETYKIT_API_KEY'], // This is the default and can be omitted
});

const response = await client.data.import('namespace');

console.log(response.expires_at);
{
  "expires_at": "2026-01-15T12:30:00.000Z",
  "objectKey": "TEAM_ID/data-api-request-input/products/request_01K....jsonl",
  "requestId": "req_01h2m7qdmdjckc30e1mnq6xqfd",
  "status": "pending_upload",
  "upload_url": "https://s3.amazonaws.com/..."
}