EU’s X Fine Isn’t A Threat To Free Speech. It’s A Defense Of It.
Imran Ahmed / Dec 18, 2025

Imran Ahmed is the CEO & founder of the Center for Countering Digital Hate (CCDH).
When the European Commission announced its €120 million fine against X, a chorus of attacks began almost instantly. Detractors insisted that Brussels was punishing “free speech.” Commentators were quick to repeat the rebuttal: this isn’t about free speech at all, it’s about transparency.
But this sets up a false dichotomy. The Center for Countering Digital Hate's (CCDH) experience with researcher access to platform data shows that without strong transparency rules, it is free speech itself that is under threat.
X has spent years fighting scrutiny, even suing researchers who exposed the harms on its platform. When X sued CCDH in the United States, a federal judge threw the case out under California’s anti-SLAPP law, calling it a clear attempt to silence our public interest research – to silence free speech. Then and now, our research shows a proliferation of harms on X, amplified by opaque algorithmic systems that determine what each user sees. Without independent access to the underlying data, the public can’t debate these systems, let alone their effect on civil discourse, minors, or vulnerable communities. X’s lawsuit against us was a deliberate effort to shut down that debate.
For researchers, X’s lawsuit against us was the clearest possible demonstration of why access to data can’t depend on the goodwill of platforms. If the only route to data is through the courts, powerful companies can use litigation to exhaust or intimidate critics. That is precisely why civil society has long argued that access must be guaranteed by law.
When the EU created that guarantee through the Digital Services Act, CCDH and others tested whether platforms would comply. We deliberately chose the lowest-burden public data request under Article 40(12): access to the top 1,000 most-viewed posts on each platform. It was a straightforward, easily generated dataset – a minimal lift for any major social media company. Yet every platform refused, including X, despite the DSA making cooperation a legal obligation. What could have been a simple demonstration of good-faith compliance instead revealed something troubling: even with laws in place, platforms will still attempt to do almost nothing, even at the easiest threshold, to allow researchers to speak freely about how their systems shape our online world.
X’s fine was avoidable; Musk’s platform stands out not only for its hostility to scrutiny but for its apparent strategy of non-compliance. While non-compliance with researcher access requirements is still common, most platforms at least seem to be engaging more constructively with the Commission. So far, TikTok and AliExpress have both made commitments to work towards compliance, which the Commission has accepted. The choice is clear: enter into constructive dialogue with the regulator or risk a hefty fine. This carrot-and-stick approach is beginning to show signs of working, and with it, a real chance of meaningful transparency.
Transparency rules are not abstract regulatory hoops to jump through; they are essential democratic infrastructure allowing researchers, journalists and civil society to understand and speak freely about the corporations governing our information environment. The DSA recognizes this. But as our experience proves, these rules only matter when regulators are prepared to enforce them – which is exactly what the EU is now doing.
Free speech is at the core of this case, just not in the way Musk’s supporters want you to think. In fighting for the public’s right to know how platforms are shaping their online world, the EU’s digital regulator has set an important precedent. Now, others should follow.
Editor’s note: X has filed a notice of appeal to the Ninth Circuit, and the lawsuit remains pending.