
Limits and Best Practices

Operational guidance for reliability, throughput, and data quality.

  • Use stable IDs: reuse the same id for updates to the same entity.
  • Prefer webhooks for completion: use polling for reconciliation and monitoring.
  • Keep payloads consistent: maintain field names and types per namespace.
  • Send complete context: richer objects improve review quality.
  • Use the import flow for very large datasets: avoid overloading the real-time ingest endpoint.
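The first two conventions above can be sketched together. This is a minimal illustration, not the actual API schema: the field names ("id", "namespace", "title", "description") and the helper are assumptions for the example.

```python
def build_payload(entity_id: str, title: str, description: str) -> dict:
    """Build an ingest payload that reuses the same stable id on every update.

    Field names here are illustrative; use your namespace's actual schema.
    """
    return {
        "id": entity_id,            # stable id: identical for create and update
        "namespace": "listings",    # keep field names and types consistent per namespace
        "title": title,
        "description": description, # send complete context, no placeholder text
    }

# Both calls carry the same id, so the second is treated as an update
# to the same entity rather than a new one.
first = build_payload("listing-123", "Vintage lamp", "Brass desk lamp, 1960s.")
update = build_payload("listing-123", "Vintage lamp", "Brass desk lamp, restored.")
```

Because the id is stable, downstream decisions can be traced to a single entity across every version you ingest.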

As your volume grows, these conventions reduce incident risk and make decisions easier to trace from ingest to final action.

  • keep media URLs publicly retrievable at processing time
  • avoid placeholder text ("N/A", empty strings) in key fields such as title and description
  • include enough context fields for policy decisions
  • retry transient failures (429, 500, 503) with backoff
  • treat ingest requests as asynchronous fire-and-forget; do not block waiting for results
  • store requestId for auditability and reconciliation
  • implement idempotent webhook handlers so duplicate delivery retries do not produce duplicate side effects
  • keep API keys server-side only
  • verify webhook signatures on every event
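The retry guidance above can be sketched as exponential backoff with jitter. This is a client-agnostic sketch under assumptions: `send` stands in for your HTTP client and is assumed to return a `(status_code, body)` pair, and the timing constants are illustrative.

```python
import random
import time

RETRYABLE_STATUSES = {429, 500, 503}  # transient failures worth retrying

def post_with_backoff(send, payload, max_attempts=5, base=0.5):
    """Retry transient failures with exponential backoff and full jitter.

    `send` is any callable returning (status_code, body); plug in your
    actual HTTP client. Non-retryable statuses are returned immediately.
    """
    status, body = None, None
    for attempt in range(max_attempts):
        status, body = send(payload)
        if status not in RETRYABLE_STATUSES:
            return status, body
        # Full jitter: sleep a random amount in [0, base * 2**attempt].
        time.sleep(random.uniform(0, base * 2 ** attempt))
    raise RuntimeError(f"gave up after {max_attempts} attempts (last status {status})")
```

Jitter spreads retries out so that many clients recovering from the same outage do not all hit the API again at the same instant.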
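The last two items, idempotent webhook handlers and signature verification, can be sketched together. The HMAC-SHA256 scheme and hex signature format are assumptions here (check the signing scheme documented in Verifying Signatures), and the in-memory dedupe set stands in for a durable store.

```python
import hashlib
import hmac

def verify_signature(secret: bytes, raw_body: bytes, signature_hex: str) -> bool:
    """Constant-time check of an HMAC-SHA256 hex signature over the raw body.

    The exact signing scheme is an assumption; use the one your
    webhook documentation specifies.
    """
    expected = hmac.new(secret, raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)

_seen_event_ids: set[str] = set()  # in production, use a durable store (DB, Redis)

def handle_event(event_id: str, apply_side_effect) -> bool:
    """Idempotent handler: duplicate deliveries of the same event are no-ops.

    Returns True if the side effect ran, False if the event was a duplicate.
    """
    if event_id in _seen_event_ids:
        return False  # already processed; skip side effects
    _seen_event_ids.add(event_id)
    apply_side_effect()
    return True
```

Verify the signature before parsing the body, and key deduplication on the event id so that delivery retries never run the same side effect twice.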

For authentication details, see Authentication. For webhook security, see Verifying Signatures. For endpoint-level behavior and response schemas, see the API Reference.