Online Election Manipulation Is a Challenge for Democracy. It’s About to Get a Whole Lot Worse.

Milan Wiertz / Feb 17, 2025

A mural by an unknown artist in Brooklyn, New York, photographed in 2021.

In January, a Canadian government inquiry into foreign interference concluded that disinformation is one of the greatest challenges to the country’s democracy. That warning comes at a critical juncture: social media platforms appear to be retreating from the efforts they had made to address disinformation campaigns in elections.

With the 2024 year of elections behind us, two recent developments are emblematic of the state of social media and elections. One is the shocking extent and impact of social media manipulation in the first round of the 2024 Romanian Presidential election, which prompted the unprecedented annulment of the outcome. The other is the dismantling of content moderation by major platforms, led by X (formerly Twitter) under Elon Musk and followed by Meta, which recently announced it would sharply scale back efforts to police content on its platforms. These changes are part of a broader retreat and set a dangerous precedent that is likely to lower the bar still further for other companies. The revelations of foul play in the Romanian election suggest platforms were already failing to adequately protect election integrity; the recent policy changes indicate the problem will only get worse.

Romania’s experience with election manipulation serves as a warning, but it also offers a glimmer of hope. It underscores that platforms must, more than ever, be compelled to defend election integrity, while showing that effective regulation can hold platforms accountable and avert the worst of what is to come, provided that governments heed the warning.

Caught red-handed: social media’s failure to address electoral manipulation

Platforms have claimed for years that they care about election integrity and that they are proactively taking measures to defend democracy against influence operations. Recent events in Romania indicate that these purported efforts are not nearly sufficient.

The first round of Romania’s November 2024 Presidential election saw the shock victory of Călin Georgescu, a far-right independent candidate running on a pro-Russian platform. Georgescu won despite having barely registered in opinion polls and not participating in major televised debates.

Georgescu’s campaign relied heavily on active efforts to manipulate social media engagement. This included a coordinated effort to flood TikTok with friendly messaging by exploiting the platform’s tendency to over-promote content based on similarity. An investigation by the Atlantic Council’s DFRLab suggests that these efforts were likely at least partly automated, evading the platform’s filters for suspicious activity.

Criticism has also mounted over TikTok’s failure to enforce its own ban on political advertising and its mislabeling of pro-Georgescu content as entertainment, which significantly boosted the candidate’s visibility. According to an investigation by Global Witness, differential treatment by the platform’s algorithm caused Georgescu to be promoted five times as frequently as his competitor in the lead-up to the second round of the election.

While TikTok faced the majority of the backlash, it was not alone. An investigation by CheckFirst revealed that Meta failed to take action against a plethora of ads on its platforms that breached its content and transparency standards. Meta, for its part, said it did not detect any significant disinformation effort during the election.

The country’s Constitutional Court swiftly considered the allegations and annulled the result. Less clear, however, is whether rerunning the election will undo the support Georgescu unfairly garnered, since his campaign tapped into real frustrations felt by many Romanians.

The episode is a warning: despite platforms’ insistence that they have the problem under control, they remain vulnerable to coordinated interference efforts, with very real consequences for elections.

Waning defenses: the U-turn on content moderation

Platform manipulation and foreign interference strategies are becoming ever more sophisticated, yet social media companies appear to be throwing in the towel.

Elon Musk’s takeover of X in 2022 ushered in a significant change of direction for the platform, including the disbanding of its Trust & Safety team and the effective removal of verification badges, which had been the most reliable means of checking account authenticity. Taken together, these actions dismantled the apparatus responsible for holding back the flood of misinformation, hate speech, and automated behavior that now dominates the platform.

Other platforms appear intent on following suit. On January 7th, Meta CEO Mark Zuckerberg announced that the company would make several drastic changes to the way harmful content is addressed on Facebook and Instagram. Some are US-specific (for now), such as the abandonment of independent fact-checking. Others will have an immediate global impact. These include narrowing the scope of content the platform actively polices to strictly illegal content and other high-severity violations. Similarly, Meta will no longer demote political content, a strategy it previously promoted as bolstering electoral integrity. The threshold for content removal will also rise significantly.

By Zuckerberg’s own admission, this will lead to more “bad stuff” on the platform. In the spirit of his earlier mantra, “move fast and break things,” the announcement betrays little concern for the harm these changes will cause. Yet the harms are no less real.

Platforms were already failing to address these concerns when they at least appeared publicly committed to taking on malicious content and electoral manipulation. With the general trend now pointing toward less, not more, intervention, the Romania scenario becomes all the more likely in the years to come.

Tests for democratic resilience: upcoming elections in Germany and Canada

The combination of ever more sophisticated manipulation strategies with the apparent abandonment of safeguards by platforms raises the alarming prospect that upcoming elections might face highly effective yet uncontained interference efforts. What can be done about it?

In the case of Romania, TikTok’s apparent inability to prevent manipulation of its platform prompted the European Commission to open formal proceedings against the platform to determine whether it violated its obligations under the Digital Services Act (DSA). These include the requirement to proactively identify and address systemic risks and to abide by a variety of transparency measures with regard to content moderation, algorithms, and political advertising.

The drastic nature of the Constitutional Court’s intervention left the country deeply divided, and more evidence is needed to assess the extent to which Romanian law or DSA obligations were breached. The episode highlights the need not only for clear standards but also for sufficient platform transparency and access to data, so that problematic behavior can be identified and addressed earlier in the electoral process.

The February elections in Germany will take place under the same DSA framework. Germany recently participated in a joint stress test with the European Commission and several large social media companies to assess the platforms’ ability to handle attempts at interference during the election. The country also succeeded in forcing X to hand over election-related data under the DSA. Nevertheless, reports of Russian-backed interference campaigns in Germany, both online and in the physical world, have already begun pouring in.

While imperfect, EU rules have given governments levers to pull to obtain information and pressure platforms to take action. Still, their effectiveness remains largely unproven, and the Romanian election is set to serve as a crucial test of whether the DSA has teeth.

More concerning is the position of countries such as Canada, whose ability to hold platforms accountable is significantly more limited. While Canada has some provisions in place regarding political advertising and can take action against perpetrators of interference, it has thus far relied on voluntary agreements with platforms to fill the gaps in its online electoral integrity framework. What is lacking is a framework that, as the DSA does, makes platforms responsible for addressing disinformation and foreign interference.

Choices and consequences

The combined development of ever more effective social media manipulation strategies and major platforms’ retreat from content moderation means that online electoral interference is not going away. On the contrary, it is likely to get much worse.

The recent election in Romania provides a stark reminder of why this matters and what the consequences of an effective interference campaign can be on a platform that neglects its responsibility to act. It also reinforces the message that democratic governments need to take proactive steps to ensure that platforms feel compelled to safeguard democratic society against attacks of all sorts. Platforms’ claims that they could self-regulate were never credible. Now, they lie in tatters.

Authors

Milan Wiertz
Milan Wiertz is a researcher at the Centre for the Study of Democratic Institutions (CSDI) and a Political Science student at Sciences Po and The University of British Columbia. His research focuses on tech and democracy, with a particular interest in EU Digital Policy.
