Meta’s Policy Changes Test European Leaders’ Resolve on Digital Regulation

Maria Koomen / Jan 10, 2025

President of the European Commission Ursula von der Leyen addresses a Joint Sitting of the Houses of the Oireachtas (Dec. 1, 2022, Flickr)

Maria Koomen is a founding member of Women Against Fascism. In response to Meta’s policy changes, Women Against Fascism organized an open letter urging European Commission President Ursula von der Leyen to use the full strength of EU legislation to prevent harms to Europeans.

The recent policy shifts by Meta — abandoning its third-party fact-checking program in favor of "Community Notes" and relaxing its moderation policies on issues such as immigration and gender — are rightly raising alarm across Europe.

This move appears to prioritize corporate expediency over the public good and starkly conflicts with the protections mandated by the European Union's regulatory framework.

Meta’s decision, under the pretext of fostering free expression, risks undermining the EU’s efforts to combat misinformation, a critical challenge in an era where false narratives with unprecedented reach can incite violence, distort public discourse, and endanger democracy.

The riots in France and Ireland in recent years are stark reminders of how quickly misleading content can spiral into real-world harm. Freedom of speech cannot serve as a shield for the abdication of responsibility in curbing misinformation.

The EU now faces a key test of its resolve in safeguarding citizens from the harms of unchecked misinformation in the digital age. But what explains the sudden intensity of concern in this case?

A key reason is geopolitics.

The incoming Vice President of the United States, JD Vance, has already threatened to drop support for NATO, which Europe needs for defense against Russian aggression, unless the EU goes easy on Elon Musk’s X platform, where a decision in an investigation opened last year is still pending.

The writing is on the wall: the incoming Trump administration is prepared to use the military and economic might of the US to hobble EU regulation where it wants.

Vance’s threat is another sign of what Marietje Schaake of Stanford describes in her recently published book as The Tech Coup: the usurpation of government functions by the titans of the tech industry, which has now put some of those same corporate actors in the driving seat of US foreign policy.

A second reason is the well-grounded fear that more extreme online content will push Europe itself further toward the political extremes.

The retreat from content moderation signaled by Meta promises to expose online users in Europe to more radical, false, and violent content, which is likely to worsen societal polarization and make it easier for extreme political forces to capture eyeballs and win votes.

Extreme political forces are already on the march, with the recently elected European Parliament the most right-wing since direct elections began in 1979. Without the rigorous checks provided by independent third-party fact-checking, the risk of radicalization and exposure to violence-inducing falsehoods escalates.

The third reason for concern is the threat to the most vulnerable in our societies.

Meta’s withdrawal of funding and programs for fact-checking all but guarantees that women and minors, people affected by mental illness, and marginalized groups will face ever more powerful and sustained waves of online abuse.

The EU has a duty to ensure that the highest levels of protection are extended to these groups. Whatever the shortcomings of third-party fact-checkers, the new policy of “Community Notes” can reasonably be expected to deliver worse outcomes.

To be sure, Meta’s termination of its third-party fact-checking program amounts to a decision by a private corporation to run its business differently. But the public has already grown wary of the way Big Tech players like Meta collect, commodify, and abuse their personal data.

And where regulators have sought to rein in surveillance capitalism, as the scholar Shoshana Zuboff has called these business practices, they have also put rules in place to address disinformation and harmful content on platforms like Meta’s in this brave new digital era.

The EU’s Digital Services Act (DSA) promises a robust framework to protect European citizens online, requiring platforms to conduct thorough risk assessments and implement risk mitigation measures.

In 2022, Meta committed to the EU’s Code of Practice on Disinformation, a landmark initiative to address false narratives. This code is set to become a formal Code of Conduct under the DSA, making deviations like Meta’s recent policy change deeply troubling.

Meta’s patchy track record, including ongoing cases of alleged non-compliance, further highlights the need for vigilance. Its new policies must undergo rigorous scrutiny, and the results must be made publicly available.

The EU is at a critical juncture.

The DSA was introduced with promises of robust action to protect civic discourse and electoral integrity in the digital age. Allowing platforms to circumvent these obligations undermines not just the law but the very values it seeks to uphold.

True, the task facing the EU is not an easy one. The likes of X and Meta can field teams of lawyers with almost unlimited funds and increasingly count on the backing of a reliably belligerent American president in Donald Trump.

In the immediate term, X and Meta will seek to slow down the DSA investigations with procedural maneuvers and threats of litigation.

Therein lies the challenge for the EU. We believe regulators in Brussels, tasked with protecting hundreds of millions of citizens, must focus on the magnitude of the stakes. They must prioritize speed, decisiveness, and clarity, and resist the urge to obey in advance by accommodating Meta’s demands rather than enforcing the rules as they are written.

Citizens, civil society, and scholars alike are urgently calling for transparency, accountability, and protection. Ensuring Meta’s adherence to the DSA will set a clear precedent for all platforms operating in Europe.

The alternative — appeasement of Elon Musk and Mark Zuckerberg — can only leave Europe even more exposed to the cascading harms of digital misinformation.

Authors

Maria Koomen
Maria Koomen is the Governance Lead at ICFG, where she specializes in emerging technologies and democracy. Her team works on the overarching, bigger-picture questions of technology governance.
