## Import

`data.import_(namespace) -> DataImportResponse`

**post** `/v1/data/{namespace}/import`

Request a pre-signed upload URL for importing large JSONL batches into a namespace. After receiving `upload_url`, upload your JSONL file with `PUT {upload_url}` and the header `Content-Type: application/json`, placing each JSON object on its own line.

### Parameters

- `namespace: str` The namespace to ingest data into

### Returns

- `class DataImportResponse: …` Response containing an upload URL and metadata for large-batch import processing. Use `PUT {upload_url}` to upload JSONL (one JSON object per line). `upload_url` expires in 12 hours.
  - `expires_at: str` ISO timestamp when `upload_url` expires (12 hours after issuance)
  - `object_key: str` S3 object key the uploaded JSONL will be processed from
  - `request_id: str` Unique identifier for this import request
  - `status: Literal["pending_upload"]` Import request is waiting for file upload
  - `upload_url: str` Pre-signed URL for PUT-ing JSONL content (`Content-Type: application/json`)

### Example

```python
import os
from safetykit import Safetykit

client = Safetykit(
    api_key=os.environ.get("SAFETYKIT_API_KEY"),  # This is the default and can be omitted
)
response = client.data.import_(
    "namespace",
)
print(response.expires_at)
```
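The second half of the flow, uploading the JSONL file to the returned `upload_url`, is not shown in the SDK example above. A minimal sketch using only the standard library (the helper names `to_jsonl` and `upload_jsonl` are illustrative, not part of the SDK):

```python
import json
import urllib.request

def to_jsonl(records):
    """Serialize a list of dicts as JSONL: one JSON object per line."""
    return "\n".join(json.dumps(r) for r in records).encode("utf-8")

def upload_jsonl(upload_url, records):
    """PUT the JSONL payload to the pre-signed URL with the required header."""
    req = urllib.request.Request(
        upload_url,
        data=to_jsonl(records),
        method="PUT",
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 on a successful upload

# Usage with the import response obtained above:
# upload_jsonl(response.upload_url, [{"id": 1, "text": "hello"}, {"id": 2, "text": "world"}])
```

Remember that the pre-signed URL expires 12 hours after issuance, so the upload must complete within that window.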