NSFW Content Detection API

Detect NSFW content across every media type with one API.

Tunable thresholds, clear actions, and review queues.

What it detects

  • Sexual content & nudity
  • Suggestive imagery
  • Explicit text
  • Sexual self-harm signals
  • CSAM signals
  • Custom rules

Why developers choose Vettly

  • Same NSFW policy across text, image, and video
  • Tunable thresholds per category
  • Soft-queue with the review action
  • CSAM signal detection included
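The tunable thresholds and actions above can be sketched as a small decision function. This is a hypothetical illustration, not Vettly's actual defaults: the threshold values and the `decide` helper are assumptions; only the action names (such as `review`) come from the API examples in this page.

```python
# Hypothetical sketch of per-category thresholds mapped to actions.
# Threshold values below are assumptions, not Vettly's real defaults.
REVIEW_THRESHOLD = 0.60   # at or above: soft-queue for human review
BLOCK_THRESHOLD = 0.90    # at or above: reject outright

def decide(categories: dict[str, float]) -> str:
    """Return 'block', 'review', or 'allow' for a set of category scores."""
    top = max(categories.values(), default=0.0)
    if top >= BLOCK_THRESHOLD:
        return "block"
    if top >= REVIEW_THRESHOLD:
        return "review"
    return "allow"
```

Tuning the two constants per category is how a marketplace policy can stay strict on nudity while tolerating suggestive imagery.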

Example request

```bash
curl -X POST https://api.vettly.dev/v1/check \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "https://example.com/image.jpg", "contentType": "image"}'
```
Example response

```json
{
  "flagged": true,
  "action": "review",
  "categories": {
    "sexual": 0.88,
    "violence": 0.04
  },
  "policy": "marketplace-safe",
  "latency_ms": 318
}
```
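A client might route on the returned `action` field like this. The sketch below parses the example response above; the `route` helper and its return strings are illustrative assumptions, not part of the API.

```python
import json

def route(result: dict) -> str:
    """Map a /v1/check response to a moderation decision.
    The 'review' branch would soft-queue the item for human moderation;
    helper and return values here are illustrative, not Vettly's API."""
    if result.get("action") == "review":
        # Pick the highest-scoring category to show the reviewer.
        top = max(result["categories"], key=result["categories"].get)
        return f"queued for review ({top})"
    if result.get("flagged"):
        return "blocked"
    return "allowed"

# The example response from above:
example = json.loads('{"flagged": true, "action": "review", '
                     '"categories": {"sexual": 0.88, "violence": 0.04}, '
                     '"policy": "marketplace-safe", "latency_ms": 318}')
```

Because the response carries an action and not just scores, the branch logic stays the same across text, image, and video checks.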

Compared to image-only detectors

Vettly classifies NSFW content across text, images, and video with one policy, and returns a recommended action (such as review), not just raw probability scores.

Get an API key

Start making decisions in minutes with a Developer plan and clear upgrade paths.
