Image Upload Moderation API

Block unsafe images before they reach your storage bucket.

Synchronous checks at upload time with category-aware decisions.

What it detects

  • Sexual content & nudity
  • Violence & gore
  • Self-harm
  • CSAM signals
  • Hate symbols
  • Custom rules

Why developers choose Vettly

  • Sub-500ms image decisions
  • Block at upload to keep storage clean
  • Category-aware policies tuned per surface
  • Evidence + audit trail for every decision

Example request

```bash
curl -X POST https://api.vettly.dev/v1/check \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "https://example.com/image.jpg", "contentType": "image"}'
```

Example response

```json
{
  "flagged": true,
  "action": "review",
  "categories": {
    "sexual": 0.88,
    "violence": 0.04
  },
  "policy": "marketplace-safe",
  "latency_ms": 318
}
```
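
A minimal upload-gate sketch in Python, using the response shape shown above. The `should_block` helper and the treatment of the `"block"` and `"review"` actions are illustrative assumptions, not part of the API; adapt them to your own policy.

```python
# Gate an upload on a Vettly-style decision payload.
# The decision dict mirrors the example response above; the helper
# name and the block/review handling are illustrative assumptions.

def should_block(decision: dict) -> bool:
    """Keep the file out of the storage bucket when the API flags it
    and asks to block or review."""
    return decision["flagged"] and decision["action"] in ("block", "review")

decision = {
    "flagged": True,
    "action": "review",
    "categories": {"sexual": 0.88, "violence": 0.04},
    "policy": "marketplace-safe",
    "latency_ms": 318,
}

if should_block(decision):
    print("upload rejected")  # reject before writing to storage
```

In practice you would build `decision` from the JSON body of the `/v1/check` response at upload time, and only persist the file when `should_block` returns `False`.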

Compared to AWS Rekognition

Vettly returns decisions and actions, not just labels, and pairs image checks with text and video moderation under shared policies.

Get an API key

Start making decisions in minutes with a Developer plan and clear upgrade paths.