Content Moderation for COPPA Compliance
Build under-13 experiences that hold up to COPPA review.
Stricter policies, parental review flows, and exportable audit trails.
What it detects
- Adult content for child accounts
- PII solicitation patterns
- Stranger contact attempts
- Self-harm content
- Predatory grooming signals
- Custom rules
Why developers choose Vettly
- Pre-built child-safe policy template
- Per-account policy routing
- Exportable audit logs
- Parental appeal workflow
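Per-account policy routing can be sketched on the client side: pick a stricter policy for each request based on the account's audience. Note that the `policy` request field and the `child_safe` policy name are illustrative assumptions here; this page only shows `policy` echoed back in responses.

```python
# Sketch: route each moderation request to an audience-specific policy.
# The "policy" request field and the policy names ("child_safe", "default")
# are illustrative assumptions, not confirmed parts of the Vettly API.

def build_check_request(content: str, *, is_child_account: bool) -> dict:
    """Build a /v1/check payload, picking a stricter policy for under-13 users."""
    return {
        "content": content,
        "contentType": "text",
        "policy": "child_safe" if is_child_account else "default",
    }

child_payload = build_check_request("what school do you go to?", is_child_account=True)
adult_payload = build_check_request("hi there", is_child_account=False)
```

An HTTP client would then POST the payload to `https://api.vettly.dev/v1/check` with the same headers as the curl example below.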
Example request
```bash
curl -X POST https://api.vettly.dev/v1/check \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "You are terrible.", "contentType": "text"}'
```

Example response
```json
{
  "flagged": true,
  "action": "block",
  "categories": {
    "harassment": 0.93,
    "hate": 0.02
  },
  "policy": "default",
  "latency_ms": 142
}
```

Compared to one-size-fits-all moderation
Distinct policies per audience let you protect kids without dumbing down adult experiences.
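Whichever policy a request is routed to, the `action` field in the response drives what the app does next. A minimal dispatch sketch, assuming the allow / review / block actions shown elsewhere on this page (the return values and review behavior are hypothetical):

```python
# Sketch: map a moderation response's "action" to application behavior.
# The response shape follows the JSON example above; the specific outcomes
# (e.g. queueing for a parental/moderator review) are illustrative.

def apply_decision(response: dict) -> str:
    """Return what the app should do with the checked content."""
    action = response.get("action", "allow")
    if action == "block":
        return "rejected"           # never show the content
    if action == "review":
        return "queued_for_review"  # hold for a human moderator
    return "published"              # allow: show the content

decision = apply_decision({
    "flagged": True,
    "action": "block",
    "categories": {"harassment": 0.93, "hate": 0.02},
    "policy": "default",
})
```

For child accounts, the "review" branch is where a parental appeal workflow would typically hook in.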
Keep exploring
Content Moderation API
One endpoint for text, image, and video moderation.
Image Moderation API
Policy-driven image checks with clear allow, review, and block actions.
Video Moderation API
Async video moderation without stitching together multiple vendors.
Content Moderation in Next.js
Add content moderation to a Next.js App Router project in minutes. Server-side API routes, React Server Components, and edge runtime examples.
Get an API key
Start making decisions in minutes with a Developer plan and clear upgrade paths.