
Add large batch of data

data.import_(namespace: str) -> DataImportResponse
POST /v1/data/{namespace}/import

Request a pre-signed upload URL for importing large JSONL batches into a namespace. After receiving upload_url, upload your JSONL file with PUT {upload_url} and the header Content-Type: application/json, putting each JSON object on its own line.
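Serializing records as JSONL can be sketched with a small helper (the helper name and record fields are illustrative, not part of the SDK); the commented PUT call shows how the body would then be sent to the pre-signed URL, assuming the third-party requests library:

```python
import json


def to_jsonl(records):
    """Serialize records as JSONL: one compact JSON object per line."""
    return "\n".join(json.dumps(record, separators=(",", ":")) for record in records)


body = to_jsonl([{"id": "sku_1", "title": "Widget"}, {"id": "sku_2", "title": "Gadget"}])

# Upload the JSONL body to the pre-signed URL with the required header, e.g.:
#   requests.put(upload_url, data=body.encode("utf-8"),
#                headers={"Content-Type": "application/json"})
```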

Parameters
namespace: str

The namespace to ingest data into

Returns
class DataImportResponse:

Response containing an upload URL and metadata for large-batch import processing. Use PUT {upload_url} to upload JSONL (one JSON object per line). upload_url expires in 12 hours.

expires_at: str

ISO timestamp when upload_url expires (12 hours after issuance)

object_key: str

S3 object key where uploaded JSONL will be processed from

request_id: str

Unique identifier for this import request

status: Literal["pending_upload"]

Import request is waiting for file upload

upload_url: str

Pre-signed upload URL for PUT-ing JSONL content (Content-Type: application/json)
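Because upload_url expires 12 hours after issuance, a client may want to check expires_at before retrying an upload. A minimal sketch of such a check (this helper is an assumption, not part of the SDK):

```python
from datetime import datetime, timezone


def is_expired(expires_at: str) -> bool:
    """True once the ISO-8601 expiry timestamp (e.g. from expires_at) has passed."""
    # fromisoformat on older Pythons rejects a trailing "Z", so normalize it first.
    expiry = datetime.fromisoformat(expires_at.replace("Z", "+00:00"))
    return datetime.now(timezone.utc) >= expiry
```

If the URL has expired, request a fresh one by calling the import endpoint again rather than retrying the PUT.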

Add large batch of data

import os
from safetykit import Safetykit

client = Safetykit(
    api_key=os.environ.get("SAFETYKIT_API_KEY"),  # This is the default and can be omitted
)
response = client.data.import_(
    "namespace",
)
print(response.expires_at)
Returns Examples
{
  "expires_at": "2026-01-15T12:30:00.000Z",
  "object_key": "TEAM_ID/data-api-request-input/products/request_01K....jsonl",
  "request_id": "req_01h2m7qdmdjckc30e1mnq6xqfd",
  "status": "pending_upload",
  "upload_url": "https://s3.amazonaws.com/..."
}