Content Moderation for EdTech
Safer classrooms, tutoring sessions, and student forums.
COPPA-aware policies plus auditable decisions for administrators.
What it detects
- Bullying & harassment
- Adult content for student accounts
- PII solicitation
- Self-harm signals
- Off-task spam
- Custom rules
Why developers choose Vettly
- COPPA-aware policy templates
- Per-classroom or per-grade thresholds
- Exportable audit trails
- Role-based dashboard access
Example request
```bash
curl -X POST https://api.vettly.dev/v1/check \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "You are terrible.", "contentType": "text"}'
```
Example response
```json
{
  "flagged": true,
  "action": "block",
  "categories": {
    "harassment": 0.93,
    "hate": 0.02
  },
  "policy": "default",
  "latency_ms": 142
}
```
Compared to wordlist filters
AI category scores catch novel bullying patterns and adversarial spelling that wordlists miss.
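In an application, the category scores and `action` field above drive what happens to a student's message. A minimal sketch of that routing step; the `route_decision` helper, its threshold, and the "hide" / "queue_review" / "post" outcomes are illustrative assumptions for a classroom app, not part of the Vettly API:

```python
# Sketch: turn a /v1/check response into a classroom decision.
# Field names (flagged, action, categories) follow the example response above;
# everything else here is an illustrative assumption.

def route_decision(response: dict, review_threshold: float = 0.5) -> str:
    """Map a moderation response to what the classroom app should do."""
    if response.get("action") == "block":
        return "hide"  # keep the message out of the classroom feed
    top_score = max(response.get("categories", {}).values(), default=0.0)
    if response.get("flagged") or top_score >= review_threshold:
        return "queue_review"  # route to a teacher/moderator queue
    return "post"  # safe to publish

example = {
    "flagged": True,
    "action": "block",
    "categories": {"harassment": 0.93, "hate": 0.02},
    "policy": "default",
    "latency_ms": 142,
}
print(route_decision(example))  # hide
```

Keeping the block/review split in one helper makes per-classroom thresholds a single parameter change rather than scattered conditionals.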
Read the COPPA compliance guide
Keep exploring
Content Moderation API
One endpoint for text, image, and video moderation.
Image Moderation API
Policy-driven image checks with clear allow, review, and block actions.
Video Moderation API
Async video moderation without stitching together multiple vendors.
Content Moderation for Healthcare
Moderate patient communities, telehealth chat, and provider reviews with HIPAA-aware audit trails and clinically tuned policies.
Get an API key
Start making decisions in minutes with a Developer plan and clear upgrade paths.