EU Digital Services Act Fines X (Twitter) €120 Million: Ad Transparency Enforcement Enters New Phase
The European Commission issued a €120M DSA fine to X — the first non-compliance decision under the Act — for deceptive blue checkmark design, opaque ad repositories, and blocking researcher access.
Inside This Compliance Report
1. Enforcement Overview: The First DSA Non-Compliance Decision
2. Blue Checkmark Deception: How X's Verification System Failed DSA Standards
3. Ad Repository Failures: What Information X Was Missing
4. Researcher Data Access: Excessive Delays and Systemic Barriers
5. Remediation Deadlines: What X Must Fix and When
6. DSA Advertising Prohibitions: Children, Special Categories, and Profiling
7. TikTok and AliExpress: Voluntary Commitments vs Formal Enforcement
8. What This Means for Advertisers Buying Inventory on EU Platforms
9. Frequently Asked Questions
Enforcement Overview: The First DSA Non-Compliance Decision
On March 31, 2026, the European Commission issued a €120 million fine to X Corp (formerly Twitter) for breaching the Digital Services Act — marking the first formal non-compliance decision under the DSA since its VLOP obligations entered into force in August 2023. The fine is the culmination of an investigation that began in December 2023, when the Commission opened formal proceedings against X, and that produced a preliminary findings notice in July 2024.
The DSA imposes heightened obligations on Very Large Online Platforms (VLOPs) — platforms with more than 45 million monthly active users in the EU. X was designated a VLOP in April 2023. The obligations include maintaining transparent, searchable ad repositories; providing researchers with meaningful data access; and ensuring that identity and trust signals presented to users are accurate and not misleading.
"This decision sends an unambiguous message: the Digital Services Act is not aspirational guidance. It is binding law with real financial consequences. Platforms that treat compliance as optional will face enforcement that scales with their size and reach." — European Commission DSA Enforcement Statement, March 2026
The €120M fine, while significant, sits well below the DSA's theoretical maximum of 6% of X's total worldwide annual turnover, a point the Commission noted in its decision summary. The fine level reflects the Commission's assessment of the severity, duration, and intentionality of the violations, as well as X's cooperation — or lack thereof — during the investigation. The Commission also retained the right to impose additional periodic penalty payments if X fails to meet its remediation obligations within the specified deadlines.
This enforcement action directly affects advertisers operating on X in the EU, compliance teams at competing platforms watching regulatory precedent develop, and any business using X's ad tools to reach European audiences. Track ongoing DSA enforcement actions on our Policy Change Tracker.
| Violation Area | DSA Article | Severity | Remediation Deadline |
|---|---|---|---|
| Blue checkmark deceptive design | Art. 25 (Dark patterns) | High | 60 working days |
| Ad repository incomplete information | Art. 39 (Ad transparency) | Critical | 90 working days (action plan) |
| Researcher data access barriers | Art. 40 (Data access) | High | 90 working days (action plan) |
Blue Checkmark Deception: How X's Verification System Failed DSA Standards
The Commission's most publicly visible finding concerns X's blue checkmark — a symbol that, from its introduction in 2009 through approximately 2022, exclusively denoted that an account had been verified by the platform as authentic: confirming the account genuinely represented the public figure, journalist, organization, or entity it claimed to be. That meaning was widely understood by users and had become a foundational trust signal in the platform's information ecosystem.
Following Elon Musk's acquisition of Twitter in October 2022, the company repurposed the blue checkmark as a paid subscription benefit under the "X Premium" (formerly "Twitter Blue") product. Any user who paid the monthly subscription fee could obtain a blue checkmark, regardless of whether their identity had been verified. Simultaneously, many previously verified accounts that did not subscribe lost their checkmarks — further eroding the consistency of the signal.
The DSA addresses this type of interface manipulation under Article 25, which prohibits platforms from deploying "dark patterns": interface designs that deceive or manipulate users, or that otherwise materially distort or impair their ability to make free and informed decisions. The Commission found that X's continued use of the blue checkmark as both a legacy verification indicator and a paid subscription feature created a systematically deceptive user experience that violated this provision.
"Users encountering a blue checkmark on X reasonably believe it signals verified identity. When that signal is available for purchase without verification, the platform has weaponized a trust indicator against the very users it was meant to protect."
The practical consequences for advertisers and users are significant. Advertising adjacent to accounts bearing checkmarks — implicitly trusted as verified entities — carries a different risk profile than advertising adjacent to unverified accounts. If users cannot accurately interpret which accounts are genuinely verified, brand safety assessments based on account verification status become meaningless. See our Compliance Rules Tool for DSA-specific requirements affecting ad placement decisions.
X has 60 working days to overhaul the verification system to eliminate the deceptive interpretation. Options the Commission considers compliant include using visually distinct symbols for paid subscribers vs. identity-verified accounts, or discontinuing the checkmark for paid subscribers entirely and reserving it only for verified identities.
Ad Repository Failures: What Information X Was Missing
Article 39 of the DSA requires VLOPs to maintain a publicly accessible, searchable repository of all advertisements served on the platform. The repository must include, at minimum: the content of the advertisement, the period during which it was displayed, the total number of users reached, the Member States where it was displayed, the targeting parameters used (in aggregate), and — critically — the identity of the natural or legal person who paid for the advertisement.
The Commission's investigation found X's ad repository systematically deficient across two critical dimensions:
- Missing ad content and topic data: The repository did not consistently include the actual content or subject matter of advertisements. Without this, users, journalists, and researchers cannot search the repository for ads relating to specific topics — defeating the transparency purpose entirely.
- Missing advertiser identity: The repository failed to identify the legal entity paying for advertisements. This is perhaps the most serious omission: advertiser identity is the information that allows scrutiny of who is funding what messaging on the platform, which is essential for detecting political manipulation, undisclosed lobbying, and coordinated inauthentic behavior.
Beyond missing data fields, the Commission found that X had implemented design features and access barriers that made the repository difficult to use even where data was nominally present. These included interface limitations that prevented bulk searches, rate limits that made systematic auditing impractical, and technical structures that obscured the relationship between ads and their funding sources.
| Required DSA Repository Field | X's Compliance Status | Impact on Auditability |
|---|---|---|
| Ad content and creative | Incomplete / inconsistent | Topic-based searches unreliable |
| Legal entity paying for ad | Missing | Advertiser accountability impossible |
| Targeting parameters (aggregate) | Partial | Profiling audit severely limited |
| Display period | Present | Functional |
| Reach (total users) | Present | Functional |
| Member States displayed | Present | Functional |
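For teams folding repository audits into their due diligence, the sketch below shows what a simple field-completeness check could look like. It assumes a hypothetical JSON export of repository records; the field names (`ad_content`, `advertiser_legal_entity`, and so on) are illustrative and do not reflect X's or any platform's actual schema.

```python
# Hypothetical field-completeness check for DSA Article 39 ad repository records.
# Field names below are illustrative assumptions, not any platform's real schema.

REQUIRED_FIELDS = [
    "ad_content",               # content/creative of the advertisement
    "advertiser_legal_entity",  # natural or legal person who paid for the ad
    "display_period",           # period during which the ad was displayed
    "total_reach",              # total number of users reached
    "member_states",            # EU Member States where the ad was displayed
    "targeting_parameters",     # aggregate targeting parameters used
]

def audit_repository_records(records: list[dict]) -> dict:
    """Count how often each required field is missing or empty across records."""
    missing_counts = {f: 0 for f in REQUIRED_FIELDS}
    for record in records:
        for f in REQUIRED_FIELDS:
            if record.get(f) in (None, "", [], {}):
                missing_counts[f] += 1
    return missing_counts

if __name__ == "__main__":
    sample = [
        {"ad_content": "Creative A", "display_period": "2026-01-01/2026-01-15",
         "total_reach": 120000, "member_states": ["DE", "FR"]},
    ]
    for f, count in audit_repository_records(sample).items():
        print(f"{f}: missing in {count} of {len(sample)} records")
```

A check like this cannot establish compliance on its own, but it makes gaps such as missing advertiser identity visible at scale before a campaign team relies on the repository.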
For advertisers, the repository gaps create a secondary compliance risk: if your competitors are running undisclosed political ads or misleading campaigns on X and the repository cannot expose them, your brand may be associated with a platform that regulators increasingly view as an enforcement problem. Visit the Knowledge Base for guidance on evaluating platform ad repositories as part of your due diligence process.
Researcher Data Access: Excessive Delays and Systemic Barriers
Article 40 of the DSA grants vetted researchers — those affiliated with institutions and approved through a formal process — the right to access platform data necessary to understand systemic risks. This provision exists because independent academic and civil society research is essential to identifying harms that platforms may not self-report: disinformation networks, algorithmic amplification of harmful content, coordinated inauthentic behavior, and ad targeting that disproportionately harms protected groups.
The Commission found that X imposed excessive delays in processing researcher access requests, effectively rendering the right meaningless for time-sensitive research. Academic investigations of election-related content, for example, have a narrow window of relevance — data access granted months after an election has occurred cannot support the real-time analysis that public interest research requires.
Beyond delays, the investigation identified structural barriers including:
- Overly restrictive eligibility criteria that excluded legitimate researchers from qualifying for data access
- Non-transparent approval processes with no clear timeline or appeal mechanism
- Technical API limitations that constrained the scope of approved research even for researchers who gained access
- Changes to X's developer API pricing and access tiers following the 2022 acquisition that made previously accessible data commercially unaffordable for non-profit research institutions
"When researchers cannot access data, harms go undetected. The DSA's researcher access provision is not a courtesy — it is a structural accountability mechanism. Blocking it has the same effect as blocking an audit."
The API pricing changes deserve particular attention: after X significantly increased API access costs in 2023 — effectively eliminating free and low-cost research access — the academic research community lost much of its capacity to independently study the platform. The Commission's finding suggests that pricing structures designed to commercially exclude researchers, even if facially neutral, can constitute a DSA violation when they systematically undermine the researcher access right.
Remediation Deadlines: What X Must Fix and When
The Commission's non-compliance decision includes binding remediation requirements with specific deadlines. These are not voluntary commitments — failure to meet them triggers additional enforcement, including periodic penalty payments that accrue daily until compliance is achieved.
| Requirement | Deadline | What Compliance Looks Like | Consequence of Non-Compliance |
|---|---|---|---|
| Fix identity verification system (blue checkmark) | 60 working days (~12 weeks) | Visually distinct or separate signals for paid vs. verified; no deceptive overlap | Periodic penalties up to 5% of average daily global turnover |
| Submit action plan: ad repository and researcher access | 90 working days (~18 weeks) | Binding plan with timelines for full Art. 39 and Art. 40 compliance | Periodic penalties; possible escalation to 6% revenue fine |
The two-track timeline reflects the Commission's assessment of complexity. The blue checkmark issue, while high-profile, is a product design decision that can be implemented relatively quickly: it requires a UI change and a policy update. The ad repository and researcher access problems, by contrast, require infrastructure development, legal framework changes, and potentially new data systems. Requiring an action plan rather than immediate compliance for these issues gives X a structured path forward while maintaining accountability.
Notably, the Commission did not specify exactly what the action plan must contain — leaving X some flexibility in how it proposes to achieve compliance. However, the Commission retains the right to reject an action plan it considers insufficient and to impose additional measures or penalties accordingly. This creates an incentive for X to negotiate the plan's content with Commission officials before formal submission.
Advertisers with significant EU-facing campaigns on X should monitor the Commission's DSA enforcement database and X's public compliance reports for updates on remediation progress. Our Policy Change Tracker will update when formal milestones are reached or new enforcement actions are initiated.
DSA Advertising Prohibitions: Children, Special Categories, and Profiling
Separate from the enforcement action against X, the DSA establishes two absolute prohibitions on targeted advertising that apply to all VLOPs and VLOSEs (Very Large Online Search Engines) operating in the EU. These are not transparency requirements — they are outright bans that create direct compliance obligations for advertisers as well as platforms.
Prohibition 1: Targeted Advertising to Minors
Article 28 of the DSA prohibits VLOPs from presenting targeted advertising to users they know or should reasonably know to be minors. This applies regardless of the advertising category — it is not limited to age-sensitive products like alcohol or gambling. Any advertising that uses personal data or behavioral signals to select which users see it cannot be directed at users under 18 in the EU.
The practical implications for advertisers are significant. If a platform cannot guarantee that a targeted audience segment excludes minors, running targeted campaigns on that platform may expose advertisers to secondary liability under national laws implementing the DSA. Advertisers should:
- Request explicit confirmation from platforms that age exclusion is enforced at the targeting layer, not just at the account registration layer
- Avoid audience segments that skew young without explicit age floor controls
- Document their targeting parameters as evidence of due diligence
Prohibition 2: Profiling-Based Advertising Using Special Category Data
Article 26(3) of the DSA prohibits targeted advertising based on profiling when the profiling uses special categories of personal data. The DSA incorporates the GDPR's definition of special categories, which includes:
- Racial or ethnic origin
- Political opinions
- Religious or philosophical beliefs
- Trade union membership
- Genetic data
- Biometric data (where used for unique identification)
- Health data
- Data concerning sex life or sexual orientation
This prohibition is broader than many advertisers realize. It does not only cover explicitly collected special category data — it covers inferred data as well. An audience segment built by inferring health conditions from browsing behavior, or religious affiliation from content consumption patterns, falls within the prohibition. Third-party data providers selling "health interest" or "religious affinity" segments for EU targeting are potentially non-compliant with this provision, and advertisers purchasing such segments carry exposure.
"The DSA's advertising prohibitions do not require intent or knowledge of the underlying data. If a segment was built using special category data — even if the advertiser didn't know — the resulting targeting may still violate the ban. Due diligence on data provenance is now a legal obligation."
Use our Compliance Rules Tool to check specific targeting scenarios against DSA advertising prohibitions before activating EU campaigns.
TikTok and AliExpress: Voluntary Commitments vs Formal Enforcement
While X received a formal non-compliance decision, the Commission confirmed in the same enforcement cycle that TikTok and AliExpress have made binding commitments to provide fully compliant ad repositories. This two-track outcome illustrates the Commission's enforcement strategy: platforms that cooperate with supervisory dialogue and make credible compliance commitments can avoid formal penalties, while platforms that resist or delay face the full enforcement apparatus.
TikTok's commitment covers its EU ad repository under Article 39, committing to include all required data fields — including advertiser identity and ad topic — within a specified implementation timeline. AliExpress, primarily relevant as a marketplace platform, committed to similar standards for its advertising transparency infrastructure.
| Platform | DSA Status | Ad Repository | Enforcement Track |
|---|---|---|---|
| X (Twitter) | Non-compliance decision issued | Deficient — missing advertiser identity and ad topic | Formal enforcement + financial penalty |
| TikTok | Under ongoing supervision | Committed to full compliance | Supervisory dialogue |
| AliExpress | Under ongoing supervision | Committed to full compliance | Supervisory dialogue |
| Meta (Facebook/Instagram) | Under separate proceedings | Ad Library partially compliant | Ongoing investigation |
The TikTok and AliExpress commitments are legally binding — failure to implement them on schedule would constitute a DSA violation eligible for the same enforcement path as X's case. However, the absence of a fine reflects the Commission's preference for cooperative compliance over adversarial enforcement where platforms demonstrate genuine intent to comply.
For advertisers, this distinction matters less in practice than the underlying compliance status. Until TikTok and AliExpress implement their committed improvements, their repositories remain limited tools for competitive intelligence and compliance verification. Advertisers should treat all platform ad repositories as works-in-progress until the Commission confirms full compliance.
What This Means for Advertisers Buying Inventory on EU Platforms
The X enforcement action, combined with the DSA's advertising prohibitions and the broader regulatory trajectory, creates a set of concrete compliance obligations and risk considerations for every advertiser running campaigns targeting EU users on social media platforms.
Immediate Actions for EU Advertisers
- Audit targeting parameters for special category data: Review every active EU campaign and every audience segment used in EU targeting. Identify any segment built using health, religious, ethnic, political, or other special category dimensions — even if acquired through a third-party data provider. Pause or restructure non-compliant segments before the next campaign cycle.
- Implement robust minor exclusion: For all EU campaigns using behavioral or interest-based targeting, verify that each platform's minor exclusion mechanisms are active and sufficient. Request platform confirmation in writing. Document this verification for compliance records.
- Monitor X remediation status: If X is part of your EU media mix, track the Commission's published updates on X's compliance with its remediation deadlines. A platform under active enforcement with unresolved violations carries elevated brand safety risk. Consider adjusting EU budget allocation on X until remediation milestones are confirmed.
- Use DSA ad repositories for competitive intelligence: Once platforms achieve full repository compliance, the repositories will become valuable tools for monitoring competitor advertising, verifying that your own ads are accurately represented, and auditing targeting compliance. Build repository access into your standard campaign monitoring workflow.
- Review data provider contracts: Any third-party data provider supplying EU audience segments should be contractually required to warrant that their data does not include special categories and complies with GDPR and DSA requirements. Indemnification provisions for DSA violations should be standard in new contracts.
Strategic Considerations for Platform Selection
The DSA enforcement landscape is creating differentiated risk profiles across platforms. Platforms with formal non-compliance decisions carry regulatory risk that can affect advertisers through association, brand safety exposure, and potential secondary liability in some jurisdictions. Platforms that are actively cooperating with Commission supervision represent a lower-risk environment, even if their repositories are not yet fully compliant.
"Platform selection for EU campaigns is increasingly a compliance decision, not just a reach decision. An enforcement action against a platform creates downstream risk for every advertiser whose brand appears in that environment during the enforcement period."
The DSA enforcement trajectory suggests that fines will become larger and more frequent as the Commission builds enforcement capacity and jurisprudence. The €120M X fine, while the first, will not be the last. Platforms across the VLOP list are under active supervision, and the Commission has signaled that enforcement will become "more adversarial" — a direct reference to moving from cooperative compliance dialogue to punitive action where platforms fail to engage meaningfully.
Documentation and Due Diligence
In an environment where regulatory scrutiny of the entire advertising ecosystem is intensifying, documented due diligence is your primary defense. For EU campaigns, maintain records of the following (a minimal record structure is sketched after the list):
- Targeting parameters used, with dates and campaign IDs
- Platform confirmations of minor exclusion and special category data exclusion
- Data provider compliance attestations
- Any compliance-related communications with platform ad teams
- Internal review and approval records for EU campaign launches
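For teams that keep campaign due diligence logs programmatically, a minimal sketch of how such a record might be structured is shown below. The class and field names are illustrative assumptions; no regulator prescribes this format.

```python
# Hypothetical structure for an EU campaign due-diligence record.
# Field names are illustrative; no regulator mandates this exact format.

from dataclasses import dataclass, field, asdict
from datetime import date
import json

@dataclass
class EUCampaignComplianceRecord:
    campaign_id: str
    platform: str
    launch_date: date
    targeting_parameters: dict               # parameters as configured, by campaign ID and date
    minor_exclusion_confirmed: bool          # written platform confirmation on file
    special_category_review_passed: bool     # internal review of segment provenance
    data_provider_attestations: list = field(default_factory=list)   # file references
    platform_communications: list = field(default_factory=list)      # email/ticket references
    internal_approver: str = ""

    def to_json(self) -> str:
        """Serialize the record for an audit trail; dates become ISO strings."""
        return json.dumps(asdict(self), default=str, indent=2)

if __name__ == "__main__":
    record = EUCampaignComplianceRecord(
        campaign_id="eu-q2-launch-001",
        platform="X",
        launch_date=date(2026, 4, 1),
        targeting_parameters={"age_min": 18, "geo": ["DE", "FR"], "interests": ["outdoor sports"]},
        minor_exclusion_confirmed=True,
        special_category_review_passed=True,
        data_provider_attestations=["provider-a-dsa-gdpr-warranty-2026.pdf"],
        internal_approver="compliance@example.com",
    )
    print(record.to_json())
```

Keeping these records in a structured, exportable form makes it easier to answer a regulator's or a partner's question with evidence rather than recollection.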
This documentation matters not only for potential regulatory inquiries but for demonstrating that your organization exercises appropriate oversight over its advertising practices — an increasingly important signal to both regulators and brand safety-conscious business partners.
Stay ahead of DSA enforcement developments and every platform policy change affecting EU advertising on our Policy Change Tracker. For a comprehensive analysis of DSA requirements applicable to your specific advertising use cases, visit the Knowledge Base and use our Compliance Rules Tool.
Frequently Asked Questions
The following questions represent the most common concerns raised by advertisers and compliance professionals following the European Commission's DSA enforcement action against X.
Why did the European Commission fine X €120 million under the DSA?
The European Commission found X in breach of three core DSA obligations for Very Large Online Platforms. X's blue checkmark system was deemed deceptive because it repurposed a verified-identity signal as a paid subscription feature without adequately informing users. X's ad repository failed to include advertiser identity and ad topic data required under Article 39. And X placed excessive delays and structural barriers on researchers seeking data access under Article 40. Together, these failures produced the first formal non-compliance decision and financial penalty under the Digital Services Act.
What specific information is X's ad repository missing?
The DSA's Article 39 requires repositories to include the content and topic of each ad, and the identity of the legal entity that paid for it. X's repository was missing both. Without advertiser identity, it is impossible to trace who funds particular campaigns — essential for detecting political manipulation and undisclosed lobbying. The repository also had design and access barriers that undermined usability even where data was nominally present, preventing meaningful auditing by journalists, researchers, and civil society organizations.
What are the remediation deadlines X must meet?
X has 60 working days to fix its blue checkmark system to eliminate the deceptive user experience. For ad repository and researcher access issues, X has 90 working days to submit a binding action plan — not necessarily implement full compliance, but commit to a detailed timeline for achieving it. The Commission can reject insufficient plans and impose additional periodic penalty payments of up to 5% of average daily global turnover for ongoing non-compliance.
How does the DSA ban on targeted advertising affect my EU campaigns on X?
The DSA prohibits targeting minors with any behavioral or interest-based advertising, and prohibits profiling-based ads using special categories of personal data (religion, ethnicity, health, political opinions, sexual orientation). These prohibitions apply to advertisers, not just platforms. Review all EU audience segments for special category data provenance, verify minor exclusion is enforced at the targeting layer, and document your compliance process as a defense against potential regulatory scrutiny.
Should I reduce EU ad spend on X given this enforcement action?
The enforcement action does not prohibit advertising on X in the EU. However, a platform under active non-compliance proceedings carries elevated brand safety risk — your ads may appear adjacent to content or accounts that are themselves under regulatory scrutiny. A risk-based approach would involve monitoring X's remediation progress, diversifying EU social media spend across platforms with cleaner DSA compliance profiles, and reassessing X's weighting in your EU media mix as the remediation deadlines approach. The Commission's periodic penalty payments, if triggered, could accelerate X's compliance timeline — or, if paid without compliance, signal the platform's willingness to absorb costs rather than change practices.
What do TikTok and AliExpress's DSA commitments mean for advertisers on those platforms?
TikTok and AliExpress made binding commitments to provide fully compliant ad repositories under DSA Article 39, avoiding formal non-compliance decisions. For advertisers, this means both platforms are on a clear compliance trajectory — though their repositories are not yet fully compliant. Treat their repositories as limited tools until the Commission confirms implementation. The commitments do not affect the DSA's absolute advertising prohibitions (targeting minors, special category profiling) which apply regardless of repository status.
Don't miss the next policy change.
Subscribe to the Policy Change Tracker — get weekly digests or instant Pro alerts across all 8 platforms. Or try our free Keyword Risk Checker first.