
Digital Services Act Roundup: February - March 2024

Jordi Calvet-Bademunt / Apr 10, 2024

Overview: The Digital Services Act (DSA), Europe’s online safety rulebook, finally became applicable to all companies on February 17, 2024. Until then, it only applied to the so-called very large online platforms (VLOPs) and very large online search engines (VLOSEs). February 17th was also the date EU Member States were required to designate the national DSA regulators, called Digital Services Coordinators (DSCs). The DSCs are in charge of enforcing the DSA along with the European Commission. Around ten Member States had yet to designate a DSC by the end of March.

The European Commission also ramped up enforcement during February and March, opening two formal proceedings against TikTok and AliExpress and issuing requests for information, including on generative AI, to several companies. In addition, Ireland, one of the countries with a DSC in place, contacted several companies to ask for information regarding DSA compliance. Finally, in March, the European Commission issued its “Guidelines for providers of VLOPs and VLOSEs on the mitigation of systemic risks for electoral processes” ahead of the European elections in June 2024.

Enforcement Actions

Formal proceedings. The European Commission announced two formal proceedings in February and March concerning two VLOPs. It first opened proceedings against TikTok to assess whether the platform promotes behavioral addictions or creates so-called ‘rabbit hole effects.’ The Commission is also analyzing whether TikTok takes appropriate and proportionate measures to ensure a “high level of privacy, safety, and security for minors,” provides a searchable and reliable repository for its advertisements, and complies with obligations regarding researchers’ access to TikTok’s publicly accessible data.

A few weeks later, the Commission announced proceedings against AliExpress. These proceedings focus on whether this platform adequately limits the dissemination of illegal content and protects consumers and minors, for example, by enforcing its terms prohibiting products posing health risks and restricting access to pornographic material for minors. The Commission is also investigating whether AliExpress complies with the obligations to allow users to notify illegal content and assess the reliability and completeness of the information from its traders to ensure traceability.

Regarding AliExpress’ recommender systems, the Commission is interested in whether the company provides sufficient transparency on the main parameters used and offers at least one option that is not based on profiling. Finally, like in TikTok’s case, the Commission is analyzing whether AliExpress provides a searchable and reliable repository for its advertisements and gives researchers access to its publicly accessible data.

As mentioned in a previous roundup, the European Commission announced its first formal proceedings against X in December. Those proceedings focus on compliance with the DSA obligations related to countering the dissemination of illegal content, the effectiveness of measures taken to combat information manipulation, X's measures to increase the transparency of its platform, and a suspected deceptive design of the user interface, especially in relation to Blue checks.

Requests for Information. In addition to initiating two formal proceedings, the Commission sent requests for information to Meta, X, TikTok, Snap, Microsoft, and Google regarding the dissemination and generation of AI content. The Commission inquired about “mitigation measures for risks linked to generative AI, such as so-called ‘hallucinations’ where AI provides false information, the viral dissemination of deepfakes, as well as the automated manipulation of services that can mislead voters.” The Commission justified its requests based on several goals, including protecting electoral processes, avoiding the dissemination of illegal content, safeguarding fundamental rights, fighting gender-based violence, protecting minors, ensuring mental well-being, and protecting personal data, consumers, and intellectual property.

The Commission also sent LinkedIn a request for information asking how it complies with the prohibition on presenting advertisements based on profiling using special categories of personal data. This request was based on a complaint submitted to the Commission by civil society organizations.

Finally, the Commission sent Meta a request for information asking for details regarding the ‘subscription for no ads’ options for both Facebook and Instagram, in addition to topics discussed in previous requests, like the handling of terrorist content or the protection of minors. It is worth noting that a few weeks later, the Commission opened proceedings against Meta under the Digital Markets Act, or DMA, to investigate whether its “pay or consent” model in the EU complies with that Act, which “requires gatekeepers to obtain consent from users when they intend to combine or cross-use their personal data across different core platform services.” The DMA is Europe’s rulebook to make markets fairer and more contestable.

At a national level, there were reports that the Irish enforcer, Coimisiún na Meán, contacted companies in Ireland, including the online marketplace Temu, about their obligations under the DSA. The regulator requested “certain information, including user numbers.” The number of users is relevant to determining whether a service is considered a VLOP or VLOSE and, hence, subject to additional obligations and to the European Commission's supervision.
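For a sense of the arithmetic behind designation: the DSA sets the VLOP and VLOSE threshold at 45 million average monthly active recipients in the EU, corresponding to roughly 10 percent of the Union's population. The following minimal Python sketch illustrates that threshold check; the service names and user figures are hypothetical placeholders, not actual reported numbers.

```python
# Illustrative sketch of the DSA designation threshold (Article 33):
# a service becomes a VLOP/VLOSE candidate once its average monthly
# active recipients in the EU reach 45 million (~10% of the EU population).

VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU

def meets_vlop_threshold(avg_monthly_active_recipients: int) -> bool:
    """Return True if the user count meets the DSA designation threshold."""
    return avg_monthly_active_recipients >= VLOP_THRESHOLD

# Hypothetical example figures -- not actual reported user numbers.
for service, users in {"service_a": 52_000_000, "service_b": 8_500_000}.items():
    status = "meets" if meets_vlop_threshold(users) else "is below"
    print(f"{service}: {users:,} monthly active recipients {status} the 45M threshold")
```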

For more information on the formal proceedings and requests for information issued by the European Commission, please see the DSA Enforcement Tracker published by The Future of Free Speech.

Court Decisions. The Court of Justice of the EU (CJEU) rejected Amazon’s request to suspend its obligation to make an advertisement repository publicly available as required by the DSA for VLOPs. The lower court, the General Court, had initially suspended this obligation, but the CJEU overturned the decision. Among other considerations, the CJEU pointed out that the suspension would have delayed, potentially for several years, the achievement of the objectives of the DSA and potentially “allow an online environment threatening fundamental rights to persist or develop.” As a result, Amazon will have to make the repository available.

Institutional Framework

Digital Services Coordinators. The DSA finally became fully applicable on February 17, 2024. The date was also when Member States were required to designate their national DSA enforcers, the so-called Digital Services Coordinators or DSCs. DSCs are in charge of enforcing all DSA rules for non-VLOPs and non-VLOSEs and, in cooperation with the European Commission, some of the rules applicable to VLOPs and VLOSEs.

Still, as of mid-February, two-thirds of the EU Member States, including France and Germany, had yet to officially establish their DSCs. Germany finally designated its DSC on March 21. The European Commission curates a list of the DSCs. Most countries have assigned the role to their national media and telecoms regulators, but in a handful of cases, like Denmark and Luxembourg, the antitrust or consumer protection authorities are in charge.

Task Force on Age Verification. The European Commission hosted the second meeting of this task force, during which Member States stressed the need for a harmonized EU approach to age verification and the important role of the EU Digital Identity Wallet in this regard. When adopted, this wallet will allow EU citizens, residents, and businesses to identify themselves or provide confirmation of certain personal information.

European Board for Digital Services. The European Commission hosted the first two meetings of the European Board for Digital Services. The Board is an independent advisory group composed of the Member States’ DSCs and chaired by the European Commission. At those meetings, the national DSCs and the Commission discussed issues like recent DSA enforcement actions, a draft regulation regarding templates for the transparency reports companies must submit, and a draft regulation ensuring researchers’ access to data from VLOPs and VLOSEs. The Board also discussed the results of the public consultation on the DSA Election Guidelines and the next steps for converting voluntary Codes of Practice - like the Code of Practice on Disinformation - into binding Codes of Conduct.

Roundtable with Civil Society. In February, the European Commission hosted an online roundtable with civil society organizations, including The Future of Free Speech, to discuss how these organizations can contribute to implementing the DSA. The Commission provided details on its DSA enforcement activities and asked the participants about their organizations' ongoing and planned activities. The discussion addressed possible ways civil society can support the collection of information and data about DSA compliance, particularly concerning systemic risks. This roundtable followed demands for closer and more formal involvement of civil society in the enforcement of the DSA. A representative of the Center for Democracy and Technology echoed this demand during a symposium on AI and digital regulation co-hosted by The Future of Free Speech in March in Brussels.

Guidance on Elections

Electoral integrity. In late March, the European Commission adopted its “Guidelines for providers of VLOPs and VLOSEs on the mitigation of systemic risks for electoral processes.” This document provides guidance on how VLOPs and VLOSEs can comply with the DSA’s obligation to conduct risk assessments and implement mitigation measures for adverse effects on civic discourse and electoral processes.

While the guidelines are not binding, the Commission has stated that companies that do not follow them “must prove to the Commission that the measures undertaken are equally effective in mitigating the risks.” If the Commission considers the alternative measures unsuitable, it can request further information or start formal proceedings under the Digital Services Act. The Commission also plans to conduct a stress test at the end of April. According to the Commission, the measures included in the guidelines aim to:

  • Reinforce VLOPs’ and VLOSEs’ internal processes, for example, by setting up internal teams with adequate resources, using available information on local context-specific risks, and considering how users search for information before, during, and after elections.
  • Implement elections-specific risk mitigation measures tailored to each electoral period and local context. These measures include promoting official information on electoral processes, labeling political advertising, implementing media literacy initiatives, and adapting the recommender systems to empower users and reduce the monetization and virality of content that threatens the integrity of electoral processes.
  • Adopt specific mitigation measures linked to the creation and dissemination of generative AI content, such as labeling deepfakes and other AI-generated content.
  • Cooperate with EU-level and national authorities, independent experts, and civil society organizations to foster an efficient exchange of information before, during, and after elections.
  • Adopt specific measures, including an incident response mechanism, during an electoral period to reduce the impact of incidents that could significantly affect the election outcome or turnout.
  • Assess the effectiveness of the measures through post-election reviews, including by publishing a non-confidential version of the post-election review documents.

The guidelines were adopted after a one-month consultation process. Industry and civil society generally welcomed the additional clarity on how the Commission will implement the DSA. Still, as a representative from the Computer & Communications Industry Association noted during a symposium on AI and digital regulation, these guidelines were put forward quite late in the process. The European elections are scheduled for early June, and the platforms have already been preparing moderation and other policy responses for many months.

Regulations

Data-sharing Platform. In February, the European Commission adopted a regulation establishing the arrangements for the functioning of a data-sharing platform between Member States, the Commission, and the European Board for Digital Services. This platform, called AGORA, is expected to be “the backbone for the supervision, investigation, enforcement, and monitoring of services within the scope of the DSA.” It started operating on February 17.

Regulation on Independent Audits. Also in February, the Regulation on Independent Audits under the DSA entered into force. The DSA requires VLOPs and VLOSEs to retain an independent auditor to conduct annual audits. This auditor must assess the company's compliance with DSA obligations and with any commitments undertaken under adopted codes of conduct and crisis protocols. The regulation provides a framework to guide VLOPs, VLOSEs, and auditors in preparing and issuing audits. To ensure comparability, it sets out mandatory templates for the audit reports - to be produced by auditors - and the audit implementation report - to be issued by VLOPs and VLOSEs. The reports must present “a clear opinion concerning the compliance of the audited service with the DSA, and an opinion on the compliance with each code of conduct and crisis protocol under which an audited VLOP or VLOSE made voluntary commitments.”

*****


Authors

Jordi Calvet-Bademunt
Jordi Calvet-Bademunt is a Research Fellow at the Future of Free Speech Project and a Visiting Scholar at Vanderbilt University. His research focuses on freedom of expression in the digital space. Jordi has almost a decade of experience as a policy analyst at the Organization for Economic Co-operation and Development (OECD).
