Image Moderation for Social Apps: A Developer's Guide


Social apps that accept user-uploaded photos face a moderation problem that text filters can't solve. Users post profile pictures, share photos in feeds, and upload images in messages. Without image moderation, explicit content, violence, and other policy violations slip through unchecked.

This guide covers how to implement image moderation for a social app using the Vettly API — from basic NSFW detection to policy-driven decisions that match your community standards. For background on how Vettly handles images alongside text and video, see the Multi-Modal documentation.

Why Image Moderation Is Different

Text moderation analyzes words and context. Image moderation analyzes visual content: nudity, violence, drugs, weapons, hate symbols, and more. The challenges are different:

  • No keyword matching: you can't grep an image for banned words
  • Context matters: a medical diagram and explicit content can look similar to simple classifiers
  • Volume: a single user can upload dozens of photos in minutes
  • Storage: you need to handle the image lifecycle — check before display, delete if blocked

Basic Image Check

The simplest integration checks an image URL against your policy:

check-image.ts

import { Vettly } from '@vettly/sdk';

const vettly = new Vettly(process.env.VETTLY_API_KEY);

const result = await vettly.check({
  imageUrl: 'https://cdn.example.com/uploads/user-photo.jpg',
  policy: 'social-app',
});

// result.action: 'allow' | 'flag' | 'block'
// result.categories: ['nudity', 'violence', ...]
// result.decisionId: 'dec_abc123'

The response includes the decision, the categories that were detected, and a decisionId for your audit trail.
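The shape of that response can be modeled as a TypeScript type. This is a sketch based only on the fields shown above — the SDK's actual exported types may differ — plus a small helper mapping a decision to the statuses used later in this guide:

```typescript
// Assumed shape of a Vettly check result, from the fields in the example above.
type ModerationAction = 'allow' | 'flag' | 'block';

interface CheckResult {
  action: ModerationAction; // the policy decision
  categories: string[];     // detected categories, e.g. ['nudity']
  decisionId: string;       // stable ID for your audit trail
}

// Map a decision onto the status your app stores alongside the photo.
function photoStatus(result: CheckResult): 'published' | 'pending_review' | 'rejected' {
  switch (result.action) {
    case 'allow':
      return 'published';
    case 'flag':
      return 'pending_review';
    case 'block':
      return 'rejected';
  }
}
```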

Integration Patterns

Check Before Display

The most common pattern: check the image after upload but before it's visible to other users.

routes/upload.ts

app.post('/api/photos', async (req, res) => {
  // 1. Upload image to storage (S3, Cloudflare R2, etc.)
  const { url, key } = await storage.upload(req.file);

  // 2. Check the image
  const result = await vettly.check({
    imageUrl: url,
    policy: 'social-app',
  });

  // 3. Handle the decision
  if (result.action === 'block') {
    await storage.delete(key); // Clean up
    return res.status(422).json({ error: 'Image violates guidelines' });
  }

  // 4. Save with moderation metadata
  await db.photos.create({
    url,
    key,
    userId: req.user.id,
    moderationId: result.decisionId,
    status: result.action === 'flag' ? 'pending_review' : 'published',
  });

  return res.status(201).json({ url, status: result.action });
});

Profile Photo Moderation

Profile photos are visible everywhere — in feeds, comments, search results, and messages. Apply stricter policies:

routes/profile.ts

app.put('/api/profile/photo', async (req, res) => {
  const { url, key } = await storage.upload(req.file);

  const result = await vettly.check({
    imageUrl: url,
    policy: 'profile-photo', // Stricter policy
  });

  if (result.action !== 'allow') {
    await storage.delete(key); // Delete by storage key, not URL
    return res.status(422).json({
      error: 'Profile photos must meet our community standards',
    });
  }

  await db.users.update(req.user.id, { avatarUrl: url });
  return res.json({ avatarUrl: url });
});

Batch Checking for Albums

When a user uploads multiple photos at once, check them in parallel:

batch-check.ts

const imageUrls = uploadedFiles.map(f => f.url);

const results = await Promise.all(
  imageUrls.map(url =>
    vettly.check({ imageUrl: url, policy: 'social-app' })
  )
);

// Remove blocked images from the album. Promise.all preserves order,
// so results[idx] corresponds to uploadedFiles[idx].
for (const [idx, result] of results.entries()) {
  if (result.action === 'block') {
    await storage.delete(uploadedFiles[idx].key);
  }
}

Setting Up Policies

Vettly policies let you control which categories trigger which actions. For a social app, a typical policy might look like:

  • Block: nudity, child safety, extreme violence, hate symbols
  • Flag for review: suggestive content, mild violence, drugs
  • Allow: everything else

You configure this in the Vettly dashboard or via the API. Policies are versioned, so you can tighten or relax rules without losing history.
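The block/flag/allow mapping above can also be expressed as plain data, which is handy for unit-testing your app's handling logic without calling the API. This is an illustrative local model, not Vettly's policy format — the category names and the "most severe action wins" rule are assumptions:

```typescript
type Action = 'allow' | 'flag' | 'block';

// Local mirror of the example social-app policy; unlisted categories are allowed.
const socialAppPolicy: Record<string, Action> = {
  nudity: 'block',
  child_safety: 'block',
  extreme_violence: 'block',
  hate_symbols: 'block',
  suggestive: 'flag',
  mild_violence: 'flag',
  drugs: 'flag',
};

// The most severe action across all detected categories wins: block > flag > allow.
function decide(categories: string[], policy: Record<string, Action>): Action {
  const rank: Record<Action, number> = { allow: 0, flag: 1, block: 2 };
  let worst: Action = 'allow';
  for (const c of categories) {
    const action = policy[c] ?? 'allow';
    if (rank[action] > rank[worst]) worst = action;
  }
  return worst;
}
```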

Handling Edge Cases

User Appeals

When a user's photo is blocked, give them a way to appeal. Store the decisionId and let users submit an appeal that routes to your moderation team:

routes/appeals.ts

app.post('/api/appeals', async (req, res) => {
  const { decisionId, reason } = req.body;

  await db.appeals.create({
    decisionId,
    userId: req.user.id,
    reason,
    status: 'pending',
  });

  return res.status(201).json({ status: 'appeal_submitted' });
});

Temporary URLs

If your images use signed URLs that expire, make sure the URL is still valid when Vettly checks it. Either use long-lived URLs for the moderation check or upload the image to a temporary public URL first.
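One way to guard against mid-check expiry is to verify the URL has enough remaining lifetime before submitting it. This helper is a sketch, assuming you track when the URL was signed and for how long (the names and the 30-second buffer are illustrative):

```typescript
// Returns true if a signed URL will still be valid `bufferMs` from now —
// enough headroom for the moderation service to fetch the image.
function urlStillValid(
  signedAtMs: number,    // when the URL was signed (epoch ms)
  expiresInSec: number,  // lifetime the URL was signed with
  bufferMs = 30_000,     // headroom for the check itself (assumption)
  nowMs = Date.now()
): boolean {
  const expiresAtMs = signedAtMs + expiresInSec * 1000;
  return expiresAtMs - nowMs > bufferMs;
}
```

If the URL is too close to expiry, re-sign it with a longer lifetime before calling the check.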

Performance Considerations

  • Image size: Vettly accepts URLs, so the image doesn't pass through your server twice. Keep upload sizes reasonable (< 10MB) for fast checks.
  • Latency: image checks typically take 200-500ms. For real-time feeds, check before display. For background uploads, check async.
  • Caching: if the same image is re-uploaded (e.g., shared across posts), cache the decision by image hash to avoid redundant checks.
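The caching idea above can be sketched with a content hash as the key. This is a minimal in-memory version (a production app would back it with Redis or your database; the wrapper shape is an assumption):

```typescript
import { createHash } from 'node:crypto';

type Action = 'allow' | 'flag' | 'block';

// Decision cache keyed by image content hash.
const decisionCache = new Map<string, Action>();

// Hash the raw bytes so identical re-uploads map to the same key.
function imageKey(bytes: Buffer): string {
  return createHash('sha256').update(bytes).digest('hex');
}

// Wraps a moderation check (e.g. a call to vettly.check) with the cache.
async function checkWithCache(
  bytes: Buffer,
  check: () => Promise<Action>
): Promise<Action> {
  const key = imageKey(bytes);
  const cached = decisionCache.get(key);
  if (cached) return cached;

  const action = await check();
  decisionCache.set(key, action);
  return action;
}
```

A re-upload of byte-identical content returns the cached decision without a second API call; a changed image hashes to a new key and is checked fresh.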

Add image moderation to your social app

Vettly detects NSFW content, violence, hate symbols, and more — driven by your policy, not a black box. Free tier includes 1,000 image checks per month.