How to Moderate User-Generated Content in a React Native App
React Native apps that accept user-generated content — posts, comments, photos, profile bios — need server-side moderation before that content reaches other users. Client-side word lists are trivially bypassed and won't satisfy App Store or Play Store review teams.
This guide walks through adding content moderation to a React Native app using the Vettly API. By the end, you'll have pre-publish filtering, user reporting, and user blocking — the three pillars required by both Apple and Google. For the full SDK reference and Expo-specific setup, see the React Native integration docs.
Architecture Overview
Moderation should happen on your backend, not in the app binary. The flow is:
- User submits content in the React Native app
- Your backend receives the content and calls the Vettly API
- Vettly returns `allow`, `flag`, or `block`
- Your backend saves or rejects the content based on the decision
- The app displays the result to the user
This keeps your moderation logic server-side, where it can't be bypassed by a modified client.
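The decision-handling step above can be sketched as a small pure function. This is illustrative only — `resolvePostStatus` and the `PostStatus` values are names invented for this guide, not part of the Vettly SDK:

```typescript
// The three moderation decisions described in the flow above.
type ModerationAction = 'allow' | 'flag' | 'block';

// Hypothetical post lifecycle states for this guide's examples.
type PostStatus = 'published' | 'pending_review' | 'rejected';

// Map a moderation decision to the post's fate:
// allow -> publish immediately, flag -> hold for human review,
// block -> reject and never show to other users.
function resolvePostStatus(action: ModerationAction): PostStatus {
  switch (action) {
    case 'allow':
      return 'published';
    case 'flag':
      return 'pending_review';
    case 'block':
      return 'rejected';
  }
}
```

Keeping this mapping in one function makes it easy to audit and change review policy later without touching route handlers.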
Setting Up the SDK
Install the Vettly SDK in your backend project:
```shell
npm install @vettly/sdk
```
Initialize the client with your API key:
```typescript
import { Vettly } from '@vettly/sdk';

const vettly = new Vettly(process.env.VETTLY_API_KEY);
```
Pre-Publish Content Filtering
Before saving any user-generated content to your database, run it through the check endpoint:
```typescript
app.post('/api/posts', async (req, res) => {
  const { text, imageUrl } = req.body;

  const result = await vettly.check({
    content: text,
    imageUrl,
    policy: 'community-safe',
  });

  if (result.action === 'block') {
    return res.status(422).json({
      error: 'Content violates community guidelines',
      categories: result.categories,
    });
  }

  // Save the post with the decision ID for audit trails
  await db.posts.create({
    text,
    imageUrl,
    moderationId: result.decisionId,
    status: result.action === 'flag' ? 'pending_review' : 'published',
  });

  return res.status(201).json({ status: 'ok' });
});
```
On the React Native side, handle the rejection gracefully:
```typescript
const submitPost = async () => {
  try {
    const res = await fetch(`${API_URL}/api/posts`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', Authorization: token },
      body: JSON.stringify({ text, imageUrl }),
    });

    if (res.status === 422) {
      Alert.alert('Content blocked', 'Your post violates community guidelines.');
      return;
    }

    navigation.navigate('Feed');
  } catch (err) {
    Alert.alert('Error', 'Something went wrong. Please try again.');
  }
};
```
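If you end up surfacing more server responses than just the blocked case, a small helper keeps the alert copy in one place. A sketch only: the 422 status matches the backend handler in this guide, but `messageForStatus` and its return shape are invented for illustration:

```typescript
// Map backend HTTP status codes to user-facing alert copy.
// 422 is the "content blocked" status used by the /api/posts handler.
// A null return means success: show no alert and navigate onward.
function messageForStatus(status: number): { title: string; body: string } | null {
  if (status === 422) {
    return { title: 'Content blocked', body: 'Your post violates community guidelines.' };
  }
  if (status >= 500) {
    return { title: 'Error', body: 'Something went wrong. Please try again.' };
  }
  return null;
}
```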
Adding User Reporting
Every piece of UGC needs a report button. In React Native, this is typically a menu item in a long-press action sheet or a three-dot menu:
```typescript
const reportContent = async (contentId: string, reason: string) => {
  await fetch(`${API_URL}/api/reports`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', Authorization: token },
    body: JSON.stringify({ contentId, reason }),
  });

  Alert.alert('Reported', 'Thanks for keeping the community safe.');
};
```
On your backend, forward the report to Vettly:
```typescript
app.post('/api/reports', async (req, res) => {
  const { contentId, reason } = req.body;

  await vettly.reports.create({
    contentId,
    reason,
    reportedBy: req.user.id,
  });

  return res.status(201).json({ status: 'reported' });
});
```
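Report payloads come from untrusted clients, so it's worth validating the reason before forwarding it. A minimal sketch — the reason list below is invented for this example; use whatever taxonomy your app defines:

```typescript
// Hypothetical set of report reasons this app accepts from clients.
const VALID_REASONS = new Set(['spam', 'harassment', 'nsfw', 'other']);

// Returns true only for a string that is one of the accepted reasons,
// so the backend can 400 anything else before calling the moderation API.
function isValidReason(reason: unknown): reason is string {
  return typeof reason === 'string' && VALID_REASONS.has(reason);
}
```

Rejecting malformed reasons early keeps junk out of your moderation queue and makes reports easier to triage.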
Adding User Blocking
Blocking must be functional, not cosmetic. When a user blocks another user, the blocked user's content should disappear from feeds, search results, and direct messages.
```typescript
app.post('/api/blocks', async (req, res) => {
  const { userId } = req.body;

  await vettly.blocks.create({
    userId,
    blockedBy: req.user.id,
  });

  return res.status(201).json({ status: 'blocked' });
});

// When fetching feeds, filter out blocked users
app.get('/api/feed', async (req, res) => {
  const blocks = await vettly.blocks.list({ blockedBy: req.user.id });
  const blockedIds = blocks.map(b => b.userId);

  const posts = await db.posts.find({
    authorId: { $nin: blockedIds },
    status: 'published',
  });

  return res.json(posts);
});
```
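If your datastore can't express a `$nin`-style query, the same filter works in application code. A sketch with a `Set` for constant-time membership checks; the `Post` shape here is illustrative:

```typescript
// Minimal post shape for this example.
interface Post {
  id: string;
  authorId: string;
  status: string;
}

// Remove posts authored by anyone the viewer has blocked,
// and keep only published posts (flagged ones stay hidden).
function filterBlocked(posts: Post[], blockedIds: string[]): Post[] {
  const blocked = new Set(blockedIds);
  return posts.filter(p => !blocked.has(p.authorId) && p.status === 'published');
}
```

The same filter should run anywhere content surfaces: feeds, search results, and direct messages, as noted above.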
Image Moderation
If your app accepts photos, you need image moderation too. Vettly handles text and images in the same API call:
```typescript
const result = await vettly.check({
  imageUrl: uploadedImage.url,
  policy: 'community-safe',
});

if (result.action === 'block') {
  // Delete the uploaded image and reject the post
  await storage.delete(uploadedImage.key);
  return res.status(422).json({ error: 'Image violates guidelines' });
}
```
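Since the same check call handles text, images, or both, a small helper can assemble the payload from whatever the post contains. A sketch under the assumption that the field names match the earlier examples; `buildCheckPayload` itself is a name invented here:

```typescript
// Payload shape matching the check calls used throughout this guide.
interface CheckPayload {
  content?: string;
  imageUrl?: string;
  policy: string;
}

// Build a check payload from whichever fields the post actually has.
// Returns null when there is nothing to moderate.
function buildCheckPayload(text?: string, imageUrl?: string): CheckPayload | null {
  if (!text && !imageUrl) return null;
  const payload: CheckPayload = { policy: 'community-safe' };
  if (text) payload.content = text;
  if (imageUrl) payload.imageUrl = imageUrl;
  return payload;
}
```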
App Store & Play Store Compliance
With pre-publish filtering, user reporting, and user blocking in place, your React Native app meets the core moderation requirements for both stores:
- Apple Guideline 1.2: Content filtering (1.2.1), reporting (1.2.2), blocking (1.2.3)
- Google Play Developer Policy: User-generated content policy requires mechanisms to moderate, report, and block
Include a note in your App Store review submission explaining your moderation setup. For a ready-to-use template, see our App Store Guideline 1.2 guide.
Ship moderation in your React Native app today
Vettly covers text, image, and video moderation with a single API. Start free — no credit card required.