Will the DSA Save Democracy? The Test of the Recent Presidential Election in Romania

Joan Barata, Elena Lazăr / Jan 27, 2025

December 8, 2024 - Mogoșoaia, Romania: Far-right runoff candidate for the presidency of Romania, Călin Georgescu, speaks to the press at a closed polling station after the elections were legally annulled. Shutterstock

Voting manipulation through social media platforms. Illegal campaign financing on TikTok. Cyber-attacks. Suspected Russian interference. The circumstances around Romania’s troubled December presidential election bear all the hallmarks of a modern political thriller. But the circumstances around the election and its annulment also present an opportunity to consider the application of Europe’s Digital Services Act (DSA) and its limitations.

Background

The decision by the Constitutional Court of Romania (hereinafter, the CCR) to annul the country's recent presidential election sent shockwaves across Europe. CCR Judgment 32 of 6 December 2024 annulled the election, ordered the electoral process to resume, and extended the mandate of the incumbent president until new elections are held and a new president's mandate is validated. Even though this is not the first time an election has been annulled or postponed in Europe, the main reasons presented by the Court had a serious impact beyond Romania's borders and the Eastern European region.

The CCR found, in particular, that "the electoral process was marred (…) by multiple irregularities and violations of electoral law that distorted the free and fair nature of the vote cast of citizens and the equality of chances of the electoral candidates, affected the transparent and fair nature of the electoral campaign and disregarded the legal regulations on its financing" (paragraph 5). In the Court’s view, such irregularities in the electoral campaign created a blatant inequality between a candidate who manipulated digital technologies and the other candidates participating in the electoral process. The outsized exposure of one candidate thus led to a corresponding reduction in the online media exposure of the other candidates during the electoral process.

It is worth noting that the “affected” candidate, Călin Georgescu, requested interim measures from the European Court of Human Rights, asking in particular that the Constitutional Court's decision be suspended and the electoral process be resumed. The Strasbourg Court dismissed the request for interim measures as falling outside the scope of Rule 39 of the Rules of Court since, per its well-established practice, Mr. Georgescu's claim did not concern irreparable harm.

The Romanian case clearly resonated with European Union institutions concerned about the misuse of social media platforms. On 17 December 2024, the European Commission decided to open formal proceedings against TikTok under the Digital Services Act (DSA). The Commission requested information from TikTok to determine what actions the platform took to reduce potential algorithmic bias in the election. The proceedings will focus “on management of risks to elections or civic discourse” linked to TikTok's recommender systems, notably the risks “linked to the coordinated inauthentic manipulation or automated exploitation of the service,” as well as TikTok's policies on political advertisements and paid-for political content.

Risk assessments as limited tools

At the end of 2024, the largest online platforms and search engines made public the first batch of risk assessment and audit reports on systemic risks, pursuant to the obligations contained in Articles 34, 35, and 37 of the DSA. The main systemic risks enumerated by the DSA (the dissemination of illegal content; negative effects on the exercise of fundamental rights, on civic discourse and electoral processes, and on public security; as well as risks regarding gender-based violence, the protection of public health and minors, and physical and mental well-being) and the corresponding mitigation responsibilities for platforms are obviously relevant to the issues raised by the Romanian election and its annulment.

The main issue is, firstly, whether these legal instruments should have worked better to identify and effectively mitigate the specific risks associated with the election process. Secondly, we must also ask whether the reports and audits produced under the DSA are fit to provide useful parameters to actors such as courts or the Commission when they face the responsibility of assessing online harms and imposing remedies. To use the words of Svea Windwehr in a recent piece on this issue for the Electronic Frontier Foundation, the question can also be formulated as whether documents produced by platforms under the DSA may provide a “smoking gun” establishing a decisive link between what happened online and the results of the election, or at least supporting the conclusion that the risks posed by “specific regional and linguistic aspects of national elections” (in the words of the Commission) had not been diligently mitigated. The most probable answer to each of these questions is no, for a series of reasons.

First, the risk assessment and mitigation measures contemplated by the DSA have very particular characteristics in terms of enforcement. At this stage, the Commission has yet to provide guidelines or best practices regarding the “reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks” that must be put in place under the DSA (Article 35.3). This is therefore an area where the most relevant decisions are delegated to online platforms, based on the specific characteristics of their services and their own internal resources and processes. In other words, the legislator not only privatized the adoption of individual compliance measures but also implicitly acknowledged the limits of statutory regulation in pre-determining specific responses to a wide range of undesired outcomes.

Second, it is important to highlight that the notion of systemic risks encompasses a series of broad categories. In particular, platforms must treat as a risk not only the possible dissemination of content that is illegal or imperils the exercise of fundamental rights, but also the presence of a significant volume of legal-but-harmful messages and behaviors with possible negative effects on electoral processes, among other areas. In this context, it is perhaps too optimistic to assume that platforms could consistently and thoroughly incorporate into their policies and systems all the legal restrictions and nuances applicable to elections under the legislation of the 27 Member States, or anticipate every hypothetical form of misuse of their services within the particular context of specific national elections. Given the very early stage of the DSA's implementation, it is also evident that some of the measures envisaged by platforms in this area have yet to be tested or benchmarked. One very illustrative case: due to the complexity of properly moderating paid-for political content, some platforms, such as TikTok, decided not to host this kind of content at all. The Romanian experience has shown, however, that certain forms of (allegedly paid) political messaging became almost impossible to tackle when included as short mentions in long influencer videos mainly focused on, for example, make-up trends.

Lastly, it is important not to forget that the flexibility granted to platforms is exercised within the necessarily limited powers of regulators such as the Commission. Regulators may only play a limited role in evaluating whether risk assessments, mitigation measures, and audit reports are satisfactory, in accordance with very broad and vague DSA provisions that give little indication of what constitutes adequate compliance. Moreover, several analyses of the reports published by the 19 VLOPs and VLOSEs have noted the absence of a meaningful assessment of the role of platform design in relation to risk, an insufficient focus on the data and metrics used to evaluate risks and mitigations, and the inconsistencies that result from significant differences in product logic, policies, user base, and design choices across platforms.

In the absence, noted above, of standards, templates, or indicators, the question of how far regulators can go (legally and technically) in their “assessments of the assessments” remains unanswered. It is important to underscore that a significant portion of systemic risks, particularly in electoral processes, concerns not illegal content or activities but legal-but-harmful practices, which is more problematic in terms of impact on freedom of expression. This means that extensive use of regulatory powers to impose specific criteria and processes on platforms would create a non-negligible risk of unacceptable political interference in the conduct of electoral processes and campaigns, especially coming from a body with an obvious political profile, such as the European Commission.

Turning back to the specific context of the Romanian elections, it is also necessary to note that the National Authority for Communications Administration and Regulation (ANCOM) has uncovered alleged irregularities in the handling of political content and election-related notifications, and informed the European Commission as recently as January 21 that TikTok had "not acted swiftly" on requests from Romanian authorities to secure the election. ANCOM oversees digital platforms under the DSA and has been designated as Romania’s national Digital Services Coordinator, thus acting as the “czar” among Romanian institutions involved in managing the online space. Even though TikTok made some public statements explaining its decisions and its exchanges with the competent bodies, the platform is seen by part of the public and the political establishment as a tool for malicious interference; a member of ANCOM’s board even issued an initial, and ultimately retracted, threat to ban the app.

Despite uncovering these irregularities and rapidly informing the European Commission, national competent bodies apparently failed to proactively and transparently engage with platforms during several critical electoral moments. We must also consider that, beyond the central role of the European Commission regarding DSA compliance by VLOPs, it is Coimisiún na Meán, the Digital Services Coordinator for Ireland, that is associated with the Commission's investigation, since Ireland is TikTok's country of establishment in the EU. This overlap of relevant authorities adds a further layer of complexity to the matter and shows how the slow, methodical, yet uncharted pace of analyzing systemic risks and setting standards may not suffice to deliver on the accountability and democratic promises of the DSA.

Conclusion

The DSA is an important piece of legislation. It establishes new rights for users and introduces new transparency and accountability obligations for platforms.

However, the DSA cannot be seen as a definitive set of rules guaranteeing, in all cases and circumstances, the completely fair and transparent conduct of electoral processes, particularly when it comes to the free formation of opinion among online platform users. Such a belief could only lead to a degree of State intervention in political discussions and debates that would be clearly unacceptable from a human rights and democracy perspective. Furthermore, the design and enforcement of systemic risk mitigation strategies will, in most cases, not yield sufficient indicators to assess whether certain forms of illegitimate interference (including, in some cases, techniques that are not necessarily illegal) have had an actual and significant impact on the outcome of an election.

Finally, it is important not to forget that, as some insightful reports on the Romanian case have shown, foreign meddling in elections is often the result of a complex combination of factors, many of them not necessarily connected to the use or misuse of online services: technical interference at the level of IT infrastructure, exploitation of public-facing applications, data manipulation via cyberattacks, and, very importantly, the exploitation of pre-existing societal rifts and vulnerabilities such as mistrust and disillusionment with the democratic process, media illiteracy, the manipulation of diaspora communities by extremist groups and organizations, and poor electoral management and legal enforcement.

Authors

Joan Barata
Joan Barata works on freedom of expression, media regulation, and intermediary liability issues. He is a Senior Fellow at The Future of Free Speech. He is also a Fellow of the Program on Platform Regulation at the Stanford Cyber Policy Center. He has published a large number of articles and books on...
Elena Lazăr
Elena Lazăr is currently a lawyer at the law firm Lazar Elena, specializing in European human rights law and new technologies law. She graduated from the Law Faculty of the University of Bucharest in 2010 and from the Franco-Romanian College, Paris I Panthéon-Sorbonne. She continued with a Master's ...
