EU Disinformation Code Takes Effect Amid Censorship Claims and Trade Tensions

Ramsha Jahangir / Jul 1, 2025

Flags of the European Union alongside a European Commission building.

As of July 1, 2025, Europe’s Code of Conduct on Disinformation is officially in effect. What was once a voluntary self-regulatory framework is now locked into the Digital Services Act (DSA), requiring Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to meet tougher transparency and auditing obligations aimed at stamping out disinformation. Full compliance with the Code now counts as a key risk-mitigation measure and marker of DSA compliance. And come audit time, tech platforms will have to prove they’re sticking to their commitments – or face scrutiny from Brussels.

The Code comes into effect just ahead of high-stakes trade talks between the EU and the US, with a July 9 deadline looming. But the EU has so far held firm. “The DSA and the DMA are not on the table in trade negotiations,” a Commission spokesperson said Monday, adding “we are not going to adjust the implementation of our legislation based on actions of third countries. If we started down that road, we’d have to do it with many countries.”

This pushback isn’t limited to the EU. Canada is facing similar heat from the US after introducing its own digital services tax on American Big Tech firms, which President Donald Trump condemned as “obviously copying the European Union.” Joel Kaplan, Meta’s chief global affairs officer, was quick to praise Trump for “standing up for American tech companies in the face of unprecedented attacks from other governments.” Shortly thereafter, trade talks with Canada were abruptly suspended – until Ottawa scrapped its digital tax.

While the EU steps up enforcement to hold platforms accountable and protect public discourse, Brussels finds itself navigating a rising tide of censorship accusations, especially from Washington, where MAGA-aligned officials are watching closely – and tech platforms are rallying behind them.

Censorship or systemic risk?

One of the central anxieties around the Code’s elevation into a DSA instrument is whether it undermines freedom of expression. While the EU Commission has consistently framed the Code as a voluntary mechanism, its transformation into a compliance tool under Article 35 of the DSA means that failing to adhere to its commitments may now trigger investigations or fines.

In May, Rep. Jim Jordan (R-OH), Chairman of the House Judiciary Committee, and four other congressmen sent a letter to EU Commissioner Michael McGrath. They argued that since the DSA requires platforms to systematically censor “disinformation” and most companies won’t create separate moderation systems for Europe and the rest of the world, the DSA could set de facto global censorship standards, restricting Americans’ online speech.

The European Commission maintains this is a misreading of the law. “The [Disinformation] Code is not about censorship,” Thomas Regnier, a Commission spokesperson, told Tech Policy Press in an email. “On the contrary, it is a framework aiming to create a transparent, fair and safe online environment, while fully upholding the fundamental rights of users, including freedom of expression.” Freedom of expression, the spokesperson added, “lies at the heart of the DSA.”

The Commission emphasized that the distinction lies in the DSA’s structural focus. Rather than going after individual pieces of content, the law zeroes in on transparency, accountability, and systemic risk, targeting opaque recommender algorithms and ad networks that shape what users see.

“The Code of Practice on Disinformation is not geared toward content removal,” Regnier said. “Its commitments aim to protect against manipulation of online platforms, giving users more context and tools to navigate safely – not suppressing content.”

“Content moderation does not mean censorship,” the spokesperson added. “The DSA requires platforms to be transparent about moderation practices, including ‘shadow banning,’ and empowers users to challenge decisions.”

For Clare Melford, CEO of the Global Disinformation Index (GDI), the “censorship” framing is not only flawed – it’s deliberate. “Trying to say governments are censoring is a fundamental misunderstanding of how technology works,” she said. “The speech that is actually being suppressed is moderate speech, because it’s less profitable.”

She pointed to recommender algorithms, designed to maximize engagement, which end up amplifying polarizing or sensational content while civil society actors are left fending off bad-faith claims of censorship for calling this dynamic out. “This rhetoric is designed to chill the whole sector — from civil society groups to advertisers to funders,” Melford said. “Why would you work in this space if everything gets framed as censorship?”

Auditing disinformation

The Code of Practice on Disinformation began as a voluntary initiative in 2018 and was strengthened in 2022. It now serves as a template for risk mitigation obligations under Article 35 of the DSA. While signing on remains optional, platforms are expected to meet standards similar to those outlined in the Code, and failure to do so can negatively impact their compliance audits under the DSA.

“Compliance with the Code is voluntary. Compliance with the DSA is not,” noted the Commission spokesperson.

Under the DSA, platforms designated as VLOPs are required to undergo annual independent audits. Those audits will assess, in part, whether disinformation risks have been adequately addressed — and the Code’s commitments, where relevant, will serve as benchmarks.

According to civil society signatories of the Code, the strength of the DSA depends on the robustness of its audits. “Without a clear audit framework and access to meaningful data, these audits won’t be credible,” said Clare Melford of GDI.

“The real risk isn’t censorship – it’s that auditors won’t know what to look for. Civil society has the expertise to guide that, but we’re not being systematically integrated.” She expressed hope that the Commission would soon finalize clear guidelines for audit implementation.

Paula Gori of the European Digital Media Observatory (EDMO) echoes that concern, noting that some VLOP and VLOSE signatories of the Code have unsubscribed from their commitments. A recent EDMO report flagged “consistent gaps in transparency, independent oversight and measurable outcomes across all commitments,” warning that the Code “risks remaining performative” if the companies do not step up.

“I can imagine that additional risk mitigation measures will need to be implemented and assessed. On top, there will be the assessment of VLOPSEs (currently X), which are not signatories of the Code,” said Gori.

“Enforcers and auditors will need to do intense work to keep all these different layers together,” she noted, calling for harmonized risk identification and assessment methodologies developed by the VLOPSEs as well as comparable reporting structures. “Assessment methodologies by auditors should also be aligned.”

High stakes ahead

The EU’s evolving disinformation strategy is as much about framing as it is about enforcement. As Brussels doubles down on digital rules amid high-stakes transatlantic trade talks, it insists these regulations are non-negotiable – even as Washington raises concerns over censorship and regulatory overreach.

What remains unclear is whether platforms will genuinely reform, and if the Commission’s audits will have the clarity and teeth needed to hold them accountable.

As Melford of GDI puts it: “The Code has the potential to work. But only if it’s backed by transparent data, credible audits, and a Commission willing to follow through.”
