
Snapchat Enforcement Timeline

Every moderation action Snapchat reports under the EU Digital Services Act — grouped by violation type and linked to the Snapchat Community Guidelines that govern each enforcement bucket. Crucial for advertisers targeting Gen Z audiences.

Updated daily
90,777 actions in last 7 days
1 platform

90,777 Snapchat actions in total

| Category | Total | Sat Apr 25 | Fri Apr 24 | Thu Apr 23 | Wed Apr 22 | Mon Apr 20 | Sun Apr 19 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Community guideline violations | 51,466 | 8,101 | 8,327 | 7,855 | 8,495 | 9,362 | 9,326 |
| Unsafe, non-compliant or prohibited products | 19,203 | 2,836 | 3,080 | 3,440 | 3,849 | 3,217 | 2,781 |
| Cyber violence | 9,371 | 1,441 | 1,444 | 1,395 | 1,634 | 1,683 | 1,774 |
| Scams and/or fraud | 4,194 | 562 | 600 | 808 | 784 | 634 | 806 |
| Protection of minors | 2,669 | 437 | 354 | 521 | 515 | 448 | 394 |
| Illegal or harmful speech | 1,888 | 305 | 323 | 259 | 347 | 344 | 310 |
| Violence | 1,849 | 362 | 293 | 343 | 305 | 349 | 197 |
| Risk for public security | 65 | 6 | 6 | 4 | 3 | 28 | 18 |
| Self-harm | 60 | 10 | 9 | 5 | 11 | 8 | 17 |
| Intellectual property infringements | 12 | 0 | 1 | 11 | 0 | 0 | 0 |
| Animal welfare | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Consumer information infringements | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Cyber violence against women | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Data protection and privacy violations | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Negative effects on civic discourse or elections | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Unspecified notices | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Daily total | 90,777 | 14,060 | 14,437 | 14,641 | 15,943 | 16,073 | 15,623 |
Heat scale: low · mid · high. Per row, relative to that category's max. Source: EU DSA Transparency Database (CC BY 4.0).
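The per-row normalization the legend describes can be sketched in a few lines. The tercile cutoffs below (1/3 and 2/3 of the row's maximum) are an illustrative assumption, not the site's actual thresholds:

```python
def heat_levels(row, bins=("low", "mid", "high")):
    """Map a category's daily counts to low/mid/high buckets,
    relative to that row's own maximum (as the legend describes).
    The 1/3 and 2/3 cutoffs are assumed for illustration."""
    peak = max(row) or 1  # guard against all-zero rows
    levels = []
    for count in row:
        ratio = count / peak
        if ratio < 1 / 3:
            levels.append(bins[0])
        elif ratio < 2 / 3:
            levels.append(bins[1])
        else:
            levels.append(bins[2])
    return levels

# 'Risk for public security' row from the matrix above
levels = heat_levels([6, 6, 4, 3, 28, 18])
```

Because the scale is relative to each row's own peak, the 28-action day in a 65-action category shades as dark as an 9,000-action day in the top category.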
Read this

Most Snapchat accounts find out too late.

Three patterns the operators who keep their Snapchat accounts watch for — and how to read this page like one of them.

01

Why Snapchat's enforcement skews toward youth safety

Snapchat's EU user base is younger than almost any other major platform's, and a non-trivial share of it is under 18. That produces a structurally different enforcement profile: while broad 'Community guideline violations' still lead the matrix above by raw volume, youth-safety categories such as 'Protection of minors' carry more weight here than on most large platforms. Snap's automated systems aggressively remove suspected underage accounts, sextortion attempts, and grooming behaviour, all of which fall under the DSA's protection-of-minors bucket. The sidebar links each category to the corresponding Snapchat Community Guidelines section.

02

What this means for advertisers targeting Gen Z

Brands running Snap Ads in regulated verticals (alcohol, gambling, finance, supplements) should treat any spike in 'Unsafe, non-compliant or prohibited products' or 'Scams and/or fraud' as a direct signal that Snap is tightening its ad-eligibility model. Because Snapchat enforces both content and ad policies aggressively when a campaign risks reaching the wrong age demographic, an enforcement bump often precedes ad-account warnings by 5–10 days. Creators and brand ambassadors should also watch the 'Cyber violence' row, which on Snap captures bullying-adjacent enforcement that affects creator-led campaigns.
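A minimal sketch of the kind of check an advertiser could script against this table: compare the most recent days of a regulated-vertical category against the days before. The 3-day windows and 25% threshold are illustrative assumptions, not Snap's internals:

```python
def tightening_signal(daily_counts, threshold=0.25):
    """Compare the mean of the most recent 3 days against the prior
    3 days and flag a rise above `threshold` as a possible
    policy-tightening signal. Window sizes and the 25% threshold
    are illustrative, not anything Snap publishes."""
    recent = sum(daily_counts[-3:]) / 3
    prior = sum(daily_counts[:3]) / 3
    change = (recent - prior) / prior if prior else float("inf")
    return change, change > threshold

# 'Unsafe, non-compliant or prohibited products' row, oldest day first
counts = [2781, 3217, 3849, 3440, 3080, 2836]
change, flagged = tightening_signal(counts)
```

On the current data the category is actually easing (roughly a 5% decline), so no flag fires; a sustained rise above the threshold would be the cue to audit ad creatives before warnings arrive.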

03

How to read the Snapchat matrix

Snap's overall daily volumes are smaller than Meta's or TikTok's, but the per-category trends are sharper because the categories are more targeted. The heat scale is normalized per row, so a darkening 'Protection of minors' cell signals a real trend within that category, not just a reflection of high baseline volume. Policy banners above the matrix mark days where our scanner detected a Snap Community Guidelines or Ad Policy update — these are the leading indicators that the matrix is about to shift.

The rules they got banned for

Every action above stems from one of these Snapchat rules.

Not knowing what changed in these rules is what got the accounts in the table suspended, demonetized, or removed. Read the rule, or get alerted the moment Snapchat updates it — your call.

Community guideline violations: 57% · 51,466
Unsafe, non-compliant or prohibited products: 21% · 19,203
Cyber violence: 10% · 9,371
Scams and/or fraud: 5% · 4,194
Protection of minors: 3% · 2,669
Illegal or harmful speech: 2% · 1,888
Violence: 2% · 1,849
Risk for public security: <1% · 65
Self-harm: <1% · 60
Intellectual property infringements: <1% · 12

Frequently asked questions

Where does this Snapchat enforcement data come from?
Every action is sourced from the European Commission's DSA Transparency Database. Snap submits each moderation decision — Snap removals, account suspensions, restrictions — under Article 24(5) of the Digital Services Act. We aggregate their daily submissions under CC BY 4.0.
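For readers who want to pull the raw statements themselves, a hedged sketch of assembling a query against the database follows. The base URL matches the public DSA Transparency Database site, but the API path and parameter names here are assumptions; check the official API documentation before relying on them:

```python
from urllib.parse import urlencode

# Hypothetical: the host is the real DSA Transparency Database, but
# the /api/v1/statements path and parameter names are assumed for
# illustration -- consult the official API docs for the real schema.
BASE = "https://transparency.dsa.ec.europa.eu/api/v1/statements"

def build_query(platform, created_from, created_to, page=1):
    """Assemble a query URL for one platform and one date window."""
    params = {
        "platform_name": platform,        # assumed parameter name
        "created_at_from": created_from,  # assumed parameter name
        "created_at_to": created_to,      # assumed parameter name
        "page": page,
    }
    return f"{BASE}?{urlencode(params)}"

url = build_query("Snapchat", "2025-04-19", "2025-04-25")
```

Fetching `url` with any HTTP client then returns paginated statements of reasons, which is the same feed this page aggregates daily.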
How do enforcement categories map to Snapchat's Community Guidelines?
DSA defines 16 enforcement categories. We surface the most relevant Snapchat Community Guidelines section for each category. Snapchat's guidelines emphasize minor safety, sexual content involving minors, and deceptive practices — areas where their enforcement is particularly aggressive.
Why does Snapchat have such a focus on minor safety enforcement?
A large share of Snapchat's user base in the EU is under 25, and a non-trivial percentage is under 18. This drives both heavy enforcement around protection of minors and a tighter ad policy framework around alcohol, gambling, and adult content targeting.
How often is this timeline updated?
New entries are added every morning after our ingestion cron pulls yesterday's data from the DSA Transparency API. Expect each day's snapshot to appear roughly 12–18 hours after the calendar day ends.
How do Snapchat policy changes relate to enforcement spikes?
When Snapchat updates Community Guidelines or Ad Policies (tracked by our Policy Change Scanner), enforcement in that category typically rises 3–10 days later. We surface related policy changes inline with the timeline.
Can I get alerted when Snapchat enforcement spikes in my category?
Yes — our Pro plan includes anomaly alerts. Particularly useful for advertisers in regulated categories (alcohol, gambling, finance) targeting younger audiences.
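The spike detection behind such alerts can be approximated with a simple z-score over a trailing window. This is a generic sketch of the technique, not the product's actual model; the 2-sigma threshold and the new-day count are assumptions:

```python
from statistics import mean, stdev

def spike_alert(history, today, z_threshold=2.0):
    """Flag today's count as a spike if it sits more than
    `z_threshold` standard deviations above the trailing mean.
    The 2-sigma threshold is illustrative, not the product's."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return today > mu, float("inf") if today > mu else 0.0
    z = (today - mu) / sigma
    return z > z_threshold, z

# 'Risk for public security': six trailing days (oldest first),
# then a hypothetical new day of 35 actions
alerted, z = spike_alert([18, 28, 3, 4, 6, 6], 35)
```

Low-volume rows like this one are exactly where z-scores help: 35 actions is trivial next to the top category, but it is well over two standard deviations above this category's own recent baseline.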

Track Snapchat's policies — before enforcement hits your ads.

Snapchat's policies emphasize youth safety and minor protection more than most platforms. Get alerted the moment Snap updates a Community Guideline or Ad Policy, so you can adjust before enforcement waves.

Create free account