Every content moderation action TikTok reports under the EU Digital Services Act — grouped by violation type and linked to the specific Community Guideline section that governs each enforcement bucket.
Three patterns the operators who keep their TikTok accounts watch for — and how to read this page like one of them.
01
How TikTok moderates content in the EU
TikTok runs one of the most aggressive automated moderation pipelines in the EU. A disproportionate share of its daily DSA-reported actions comes from removing accounts of users who appear to be under 13 (the 'Protection of minors' bucket) and from coordinated inauthentic behaviour (which lands under 'Integrity and authenticity'). Account-level actions — suspensions and terminations — often outweigh content-level removals, which is the opposite of how Meta operates. The categories visible above map directly to TikTok's own Community Guidelines sections; each link in the sidebar takes you to the exact rule TikTok cites when justifying a takedown.
02
What this means for creators and brands on TikTok
Creators in lifestyle, parenting, finance, and dietary-supplement niches should watch the 'Sensitive and mature themes' and 'Unsafe products' rows closely — these are the categories where TikTok's enforcement tightens fastest after a Community Guideline update. Branded content, including Spark Ads and TikTok Shop, is held to a stricter bar than organic posts. A spike in 'Integrity and authenticity' enforcement typically signals tightening on undisclosed paid partnerships and engagement manipulation, both of which can result in account-level penalties that hit revenue immediately.
03
Reading the heatmap and policy banners
Each row's heat is scaled to that category's own 7-day max, so even low-volume categories like 'Animal welfare' or 'Self-harm' show meaningful day-to-day variance. The blue banners above the table are policy updates we detected on TikTok's own published guidelines. When you see a banner followed by a darkening row in the same enforcement category over the next 3–10 days, that's the signature of a policy change actually translating into action — the moment to revisit your content strategy or compliance review.
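The per-row scaling described above can be sketched in a few lines. The counts below are illustrative, not actual TikTok figures:

```python
def heat_row(counts):
    """Scale one category's daily counts to that row's own 7-day max (0.0-1.0)."""
    window = counts[-7:]        # last 7 days for this category only
    peak = max(window) or 1     # guard against division by zero on all-zero rows
    return [round(c / peak, 2) for c in window]

# Illustrative counts for a low-volume category such as 'Animal welfare'
print(heat_row([3, 1, 0, 4, 2, 2, 1]))  # → [0.75, 0.25, 0.0, 1.0, 0.5, 0.5, 0.25]
```

Because each row normalizes against its own peak rather than a global one, a 4-action day in 'Animal welfare' renders just as dark as a 40,000-action day in 'Protection of minors'.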
The rules they got banned for
Every action above stems from one of these TikTok rules.
Not knowing what changed in these rules is what got the accounts in the table suspended, demonetized, or removed. Read the rule, or get alerted the moment TikTok updates it — your call.
Where does this TikTok enforcement data come from?
Every action is sourced from the European Commission's DSA Transparency Database. TikTok submits each content moderation decision — video removals, account bans, restrictions — under Article 24(5) of the Digital Services Act. We aggregate their daily submissions, which are published under CC BY 4.0.
How do enforcement categories map to TikTok's actual Community Guidelines?
The DSA defines 16 enforcement categories. We surface the most relevant TikTok Community Guideline section (Safety and Civility, Youth Safety, Integrity and Authenticity, etc.) for each category in the sidebar. If you see a spike in 'Protection of minors' enforcement, you can jump straight to TikTok's Youth Safety policy to see what changed.
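The sidebar mapping amounts to a simple lookup. A minimal sketch — only the pairs stated on this page are real; the third entry is a hypothetical placeholder, and the fallback label is an assumption:

```python
# Maps DSA enforcement categories (sentence case, as shown in the table rows)
# to the TikTok Community Guidelines section linked in the sidebar.
CATEGORY_TO_GUIDELINE = {
    # Mappings stated on this page:
    "Protection of minors": "Youth Safety",
    "Integrity and authenticity": "Integrity and Authenticity",
    # Hypothetical mapping, for illustration only:
    "Sensitive and mature themes": "Sensitive and Mature Themes",
}

def guideline_for(category):
    # Unmapped categories fall back to a generic overview label (an assumption)
    return CATEGORY_TO_GUIDELINE.get(category, "Community Guidelines (overview)")

print(guideline_for("Protection of minors"))  # → Youth Safety
```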
Why does TikTok have such high creator/account action volume?
TikTok operates one of the most active short-video platforms in the EU and applies aggressive automated moderation. A large share of daily enforcement is automated removal of underage accounts (under-13 policy) and integrity violations like coordinated inauthentic behavior. Account-level actions outweigh content actions on many days.
How often is this timeline updated?
New entries are added every morning after our ingestion cron pulls yesterday's data from the DSA Transparency API. Expect each day's snapshot to appear roughly 12–18 hours after the calendar day ends.
How do TikTok policy changes relate to enforcement spikes?
When TikTok updates a Community Guideline or Branded Content policy (tracked by our Policy Change Scanner), enforcement in that category typically rises 3–10 days later. We surface related policy changes inline with the timeline where they align with enforcement dates.
Can I get alerted when TikTok enforcement spikes in my sector?
Yes — our Pro plan includes anomaly alerts that notify you by email when enforcement in a specific category spikes significantly above the baseline. Free plan users can view the data but don't receive real-time alerts.
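Under the hood, a spike alert is just a comparison of today's count against a rolling baseline. A minimal sketch, assuming a simple mean-plus-standard-deviation threshold — the page does not specify the actual Pro-plan detector:

```python
import statistics

def is_spike(history, today, sigmas=3.0):
    """Flag today's count if it exceeds the baseline mean + N standard deviations.

    history: recent daily counts for one enforcement category (e.g. last 28 days).
    """
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against a perfectly flat baseline
    return today > mean + sigmas * stdev

baseline = [40, 38, 45, 41, 39, 44, 42]  # illustrative daily counts
print(is_spike(baseline, 120))  # well above baseline → True
print(is_spike(baseline, 46))   # within normal variance → False
```

A longer history window makes the baseline less sensitive to one noisy day; the `sigmas` threshold trades alert frequency against false positives.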
Track TikTok's policies — before enforcement hits your account.
TikTok's enforcement on creators and advertisers is fast — bans first, appeals later. Get alerted the moment a Community Guideline shifts in your category, so you can adjust before the wave lands.