App Store Rejection for User-Generated Content: How to Fix It
You submitted your app. A few days later, App Review sent it back with a rejection citing Guideline 1.2 — User Generated Content. You're not alone — this is one of the most common rejection reasons for apps that let users post text, images, or video.
The good news: the fix is specific and predictable. Apple tells you exactly what they want. This guide decodes the rejection message, explains what Apple is checking for, and walks through the fix step by step.
The Rejection Message
Apple's rejection emails for Guideline 1.2 typically look like this:
Guideline 1.2 - Safety - User Generated Content
Your app enables the display of user-generated content but does not have the proper precautions in place. Apps with user-generated content must include: a method for filtering objectionable material, a mechanism for users to flag objectionable content, the ability to block abusive users, and published contact information.
Apple is looking for four things. If any one is missing or hard to find, you get rejected.
What Apple Is Actually Checking
When a reviewer opens your app, they test these scenarios:
Content Filtering
The reviewer posts something obviously objectionable and checks if it appears in the feed. Client-side word lists don't count — they want server-side moderation.
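The point is where the decision runs, not what the decision is. As a minimal sketch (all names hypothetical, and the word list is only a stand-in for a real moderation backend such as an API call), a server-side gate means every write path consults the check before anything is persisted, so a modified client cannot skip it:

```javascript
// Minimal sketch of a server-side moderation gate (names hypothetical).
// The word list is a placeholder for a real moderation service; what
// matters is that the decision runs in the API, not in the app binary.
const BLOCKED_TERMS = ['badword1', 'badword2'];

function moderationDecision(text) {
  const lowered = text.toLowerCase();
  const hit = BLOCKED_TERMS.some((term) => lowered.includes(term));
  return hit ? 'block' : 'publish';
}

// Every write path calls the gate before persisting.
function handleCreatePost(text, save) {
  if (moderationDecision(text) === 'block') {
    return { status: 422 }; // rejected server-side; nothing is stored
  }
  save(text);
  return { status: 201 };
}
```

Because the gate sits in front of the database write, a tampered client that skips its local filter still cannot get objectionable content into the feed.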
User Reporting
The reviewer looks for a "Report" button on every type of UGC — posts, comments, profiles, messages. If any content type is missing a report option, you fail.
User Blocking
The reviewer blocks a user and verifies that the blocked user's content disappears from feeds, search, and messages. A cosmetic block that only toggles the button state without hiding any content will be caught.
Published Contact Info
A way for users to contact you about moderation issues. This can be an email address, a support page, or an in-app contact form.
Common Mistakes That Cause Rejection
- Client-side only filtering. Word lists in your app binary are trivially bypassed. Apple wants server-side moderation that works even if the client is modified.
- Report button missing on some content types. If your app has posts, comments, and profiles, every one needs a report option — not just posts.
- Block doesn't actually block. Blocking a user must prevent their content from appearing in feeds, messages, and search. A button that does nothing visible will be caught.
- No explanation in App Review Notes. If the reviewer can't find your moderation system, they reject first and ask questions later. Always explain your setup in the notes.
- Images and video unchecked. Text filtering alone isn't enough if your app accepts photos or video.
- Moderation only works with test accounts. Reviewers create their own accounts. If your moderation depends on specific user setup, it won't be visible during review.
Step-by-Step Fix
Step 1: Add Server-Side Content Filtering
Every piece of UGC needs to pass through a moderation check before other users see it. Install the Vettly SDK and add a check before saving content:
```javascript
import { Vettly } from '@vettly/sdk';

const vettly = new Vettly(process.env.VETTLY_API_KEY);

app.post('/api/posts', async (req, res) => {
  const { text, imageUrl } = req.body;

  const result = await vettly.check({
    content: text,
    imageUrl,
    policy: 'community-safe',
  });

  if (result.action === 'block') {
    return res.status(422).json({ error: 'Content violates guidelines' });
  }

  await db.posts.create({
    text,
    imageUrl,
    moderationId: result.decisionId,
    status: result.action === 'flag' ? 'pending_review' : 'published',
  });

  return res.status(201).json({ status: 'ok' });
});
```
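One design decision worth making explicitly: what happens when the moderation check itself fails (timeout, outage)? Publishing unchecked content on error would reopen the gap the reviewer is testing for. A hedged sketch of a fail-closed wrapper (the `checkOrHold` helper and its action strings are hypothetical, not part of any SDK):

```javascript
// Hypothetical fail-closed wrapper: if the moderation check throws,
// hold the content for manual review instead of publishing it unchecked.
async function checkOrHold(check, content) {
  try {
    const result = await check(content);
    return result.action; // e.g. 'publish' | 'flag' | 'block'
  } catch (err) {
    return 'flag'; // fail closed: queue for human review, don't publish
  }
}
```

Failing closed means a moderation outage degrades to "posts take longer to appear," never to "unfiltered content goes live."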
Step 2: Add Report UI on Every Piece of UGC
Add a report option to every content type — posts, comments, profiles, messages. On your backend, forward reports to Vettly:
```javascript
app.post('/api/reports', async (req, res) => {
  const { contentId, reason } = req.body;

  await vettly.reports.create({
    contentId,
    reason,
    reportedBy: req.user.id,
  });

  return res.status(201).json({ status: 'reported' });
});
```
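Reports are more useful if they trigger action automatically. A common pattern, sketched here with a hypothetical helper (the threshold and field names are assumptions, not a Vettly API), is to hide content once enough distinct users have reported it, pending human review:

```javascript
// Hypothetical auto-hide rule: hide content after N independent reports.
// Counting distinct reporters prevents one user from tripping the
// threshold by reporting the same content repeatedly.
function shouldAutoHide(reports, threshold = 3) {
  const distinctReporters = new Set(reports.map((r) => r.reportedBy));
  return distinctReporters.size >= threshold;
}
```

Pair this with a moderation queue so hidden content is reviewed rather than silently deleted.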
Step 3: Add User Blocking That Actually Works
When a user blocks someone, filter that user's content from all surfaces:
app.post('/api/blocks', async (req, res) => {await vettly.blocks.create({userId: req.body.userId,blockedBy: req.user.id,});return res.status(201).json({ status: 'blocked' });});// Filter blocked users from feedsapp.get('/api/feed', async (req, res) => {const blocks = await vettly.blocks.list({ blockedBy: req.user.id });const blockedIds = blocks.map(b => b.userId);const posts = await db.posts.find({authorId: { $nin: blockedIds },status: 'published',});return res.json(posts);});
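Feeds are one surface; messaging is the one reviewers most often find unprotected. A sketch of the same block data applied to direct messages (the `canMessage` helper is hypothetical; the design choice here is that a block silences both directions):

```javascript
// Hypothetical messaging check: a conversation is allowed only if
// neither participant has blocked the other. Enforcing the block in
// both directions also keeps the blocker from being tempted to harass
// someone who can no longer respond.
function canMessage(senderId, recipientId, blocks) {
  return !blocks.some(
    (b) =>
      (b.blockedBy === recipientId && b.userId === senderId) ||
      (b.blockedBy === senderId && b.userId === recipientId)
  );
}
```

Apply the same check to search results and profile visibility so the block holds on every surface the reviewer tests.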
Step 4: Write Clear App Review Notes
In App Store Connect, open the "App Review Information" section and add notes that map directly to the requirements:
Content Moderation Implementation:

- Filtering (1.2.1): All user-generated content is screened server-side via the Vettly API before display. Text and images are checked against our community-safe policy.
- Reporting (1.2.2): Users can report any content (posts, comments, profiles) via the flag icon / three-dot menu on each piece of content.
- Blocking (1.2.3): Users can block other users from their profile. Blocked users are filtered from feeds, search, and messages.
- Contact: [email protected]
How to Resubmit
- Update your app binary with the moderation changes
- In App Store Connect, go to the rejected submission
- Update the "App Review Information" notes with the template above
- Reply in the Resolution Center with a brief explanation of what you fixed
- Submit for review
Apple typically re-reviews within 1–2 days.
Go Deeper
For a preventative walkthrough of all four requirements, see How to Pass Apple's App Store Guideline 1.2. For the full compliance checklist with implementation details, see the App Store compliance documentation.
Fix your App Store rejection today
Vettly covers all four Guideline 1.2 requirements with a single integration. Start with the free tier and resubmit before your next review.