Meeting the EU Digital Services Act (DSA) Requirements for Content Moderation
The EU Digital Services Act (DSA) is the most comprehensive content moderation regulation to date. It applies to any online platform that serves EU users — regardless of where the company is headquartered. If you run a platform with user-generated content and have EU users, the DSA affects you.
This guide covers the DSA's key moderation requirements and how to implement them technically.
Who Does the DSA Apply To?
The DSA creates a tiered system based on platform size:
- All intermediary services: basic notice-and-action obligations
- Hosting services (including UGC platforms): additional transparency and moderation requirements
- Online platforms (with public-facing UGC): notice-and-action, complaint handling, transparency reporting
- Very large online platforms (VLOPs): the most extensive obligations, including systemic risk assessments
Most UGC apps fall into the "online platforms" category. The requirements below focus on that tier.
Core Requirements
1. Notice-and-Action Mechanism
The DSA requires platforms to have a mechanism for users to report illegal content. This is similar to reporting mechanisms you may already have, but with specific requirements:
- Reports must be easy to submit — no account required
- Reports must include the reason the content is illegal (not just "offensive")
- Platforms must acknowledge receipt of the report
- Platforms must make a timely, diligent, and objective decision
- The reporter must be notified of the outcome
```javascript
app.post('/api/reports', async (req, res) => {
  const { contentId, reason, legalBasis, reporterEmail } = req.body;

  const report = await vettly.reports.create({
    contentId,
    reason,
    metadata: {
      legalBasis, // DSA requires specifying the legal basis
      reporterEmail, // For outcome notification
      jurisdiction: 'EU',
    },
    reportedBy: req.user?.id || 'anonymous', // No account required
  });

  // DSA: acknowledge receipt
  if (reporterEmail) {
    await sendEmail(reporterEmail, {
      subject: 'Report received',
      body: `Your report (ID: ${report.id}) has been received and will be reviewed.`,
    });
  }

  return res.status(201).json({
    reportId: report.id,
    status: 'received',
  });
});
```
2. Statement of Reasons
When you remove content or restrict an account, you must provide a "statement of reasons" that includes:
- What action was taken (removal, demotion, restriction)
- The legal or policy basis for the action
- The facts and circumstances relied upon
- How the decision was made (automated, human, or both)
- Information about redress — how to appeal
```javascript
app.post('/api/moderation/action', async (req, res) => {
  const { contentId, action, reason, legalBasis, isAutomated } = req.body;

  // Record the action with DSA-required metadata
  const decision = await db.moderationActions.create({
    contentId,
    action, // 'remove', 'restrict', 'demote'
    reason,
    legalBasis, // e.g., 'illegal_content', 'terms_of_service'
    decisionMethod: isAutomated ? 'automated' : 'human',
    statementOfReasons: {
      action,
      basis: legalBasis,
      facts: reason,
      method: isAutomated ? 'Automated content moderation system' : 'Human reviewer',
      redress: 'You may appeal this decision within 6 months via /api/appeals',
    },
  });

  // Notify the content creator
  await notifyUser(decision);

  return res.json({ decisionId: decision.id });
});
```
3. Internal Complaint-Handling System
Users whose content is moderated must be able to appeal. The DSA requires:
- Appeals must be available for at least 6 months after the decision
- Appeals must be handled by qualified staff (not fully automated)
- The outcome must be communicated in a timely manner
```javascript
app.post('/api/appeals', async (req, res) => {
  const { decisionId, reason } = req.body;
  const decision = await db.moderationActions.findById(decisionId);

  // DSA: check the 6-month appeal window
  const sixMonthsAgo = new Date();
  sixMonthsAgo.setMonth(sixMonthsAgo.getMonth() - 6);
  if (decision.createdAt < sixMonthsAgo) {
    return res.status(410).json({ error: 'Appeal window has expired' });
  }

  const appeal = await db.appeals.create({
    decisionId,
    userId: req.user.id,
    reason,
    status: 'pending_human_review', // DSA: must involve human review
  });

  return res.status(201).json({ appealId: appeal.id });
});
```
4. Transparency Reporting
Online platforms must publish annual transparency reports that include:
- Number of content moderation decisions (broken down by type)
- Number of reports received
- Median time to act on reports
- Number of appeals and their outcomes
- Use of automated tools and their error rates
Vettly's dashboard tracks all moderation decisions, categories, and decision IDs. You can export this data for your transparency report.
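As a sketch, the per-decision records shown earlier can be aggregated into the metrics a transparency report needs. The record shapes here (`decisions`, `reports`, `appeals` arrays with `action`, `decisionMethod`, `receivedAt`/`actionedAt` timestamps, and `outcome` fields) are illustrative assumptions about your own audit-trail storage, not a Vettly API:

```javascript
// Sketch: aggregate stored moderation records into DSA transparency metrics.
// All field names are assumed shapes for your audit trail, not a Vettly API.
function buildTransparencyReport(decisions, reports, appeals) {
  const countBy = (items, key) =>
    items.reduce((acc, item) => {
      acc[item[key]] = (acc[item[key]] || 0) + 1;
      return acc;
    }, {});

  // Median hours between report receipt and action
  const hours = reports
    .filter((r) => r.actionedAt)
    .map((r) => (r.actionedAt - r.receivedAt) / 3600000)
    .sort((a, b) => a - b);
  const median =
    hours.length === 0
      ? null
      : hours.length % 2
      ? hours[(hours.length - 1) / 2]
      : (hours[hours.length / 2 - 1] + hours[hours.length / 2]) / 2;

  return {
    decisionsByAction: countBy(decisions, 'action'), // removal, restriction, demotion
    decisionsByMethod: countBy(decisions, 'decisionMethod'), // automated vs. human
    reportsReceived: reports.length,
    medianTimeToActionHours: median,
    appealsReceived: appeals.length,
    appealOutcomes: countBy(appeals, 'outcome'),
  };
}
```

Running this over a year's worth of records gives you the decision counts, report volume, median response time, and appeal outcomes the DSA asks for in one object you can render into your published report.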
5. Trusted Flaggers
The DSA introduces "trusted flaggers" — organizations designated by EU member states whose reports must be prioritized. If you receive a report from a trusted flagger, you must process it with priority and without undue delay.
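One way to implement this is to tag incoming reports against a registry of trusted flaggers and route them to a priority queue. The domain-based registry and queue names below are illustrative assumptions (you would maintain the registry from your member state's published designations), not part of the DSA or the Vettly API:

```javascript
// Sketch: route trusted-flagger reports to a priority queue.
// TRUSTED_FLAGGER_DOMAINS is an illustrative registry you would populate
// from the trusted-flagger designations published by EU member states.
const TRUSTED_FLAGGER_DOMAINS = new Set(['example-hotline.eu']);

function prioritizeReport(report) {
  const domain = (report.reporterEmail || '').split('@')[1];
  const trusted = TRUSTED_FLAGGER_DOMAINS.has(domain);
  return {
    ...report,
    priority: trusted ? 'high' : 'normal',
    queue: trusted ? 'trusted-flagger' : 'standard',
  };
}
```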
Automated Moderation Under the DSA
If you use automated content moderation (which Vettly provides), the DSA has specific requirements:
- Inform users that automated tools are used for content moderation
- Report accuracy — include error rates in transparency reports
- Human oversight — automated decisions should be reviewed by humans, especially for legal content assessments
- No general monitoring obligation — the DSA does not require you to proactively scan all content, but if you do, you must follow the above rules
This means automated moderation is allowed and even expected, but you can't rely on it exclusively. Flagged content should route to human reviewers, and users must know that automated systems are involved.
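A minimal routing sketch for keeping humans in the loop, assuming your automated classifier returns a `confidence` score and a `legalAssessment` flag (both field names and thresholds are illustrative, not a Vettly API):

```javascript
// Sketch: route automated moderation results so humans stay in the loop.
// Field names and thresholds are illustrative assumptions.
function routeAutomatedDecision(result) {
  // Anything requiring a legal judgment goes to a human, regardless of score
  if (result.legalAssessment) {
    return { route: 'human_review', reason: 'legal assessment required' };
  }
  // Auto-action only clear-cut policy violations
  if (result.confidence >= 0.95) {
    return { route: 'auto_action', reason: 'high-confidence policy match' };
  }
  // Borderline scores get human review rather than silent removal
  if (result.confidence >= 0.6) {
    return { route: 'human_review', reason: 'borderline confidence' };
  }
  return { route: 'no_action', reason: 'below action threshold' };
}
```

The key design choice is that automation only *narrows* the human workload; it never makes the final call on legally sensitive content, which matches the DSA's human-oversight expectations.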
Implementation Checklist
- Notice-and-action mechanism (no account required for reports)
- Statement of reasons for all moderation actions
- Internal complaint/appeal system (6-month window, human review)
- Transparency reporting infrastructure
- Trusted flagger prioritization
- User notification on moderation actions
- Audit trail for all decisions (Vettly decisionId)
- Terms of service updated to reflect DSA obligations
- Designated point of contact for EU authorities
Common Pitfalls
- Fully automated appeals. The DSA requires human involvement in appeals. An automated re-review doesn't count.
- No notification to users. Silently removing content without explanation violates the statement-of-reasons requirement.
- Missing legal basis. "Violates our terms" is not enough. You need to specify which term and why the content violates it.
- Ignoring small platform status. Even if you're not a VLOP, the basic DSA obligations still apply to all platforms with EU users.
Build DSA-compliant moderation
Vettly provides the moderation infrastructure — policy-driven decisions, audit trails, and reporting — that makes DSA compliance achievable without building everything from scratch.