Digital Services Act Roundup: July 2023 – January 2024

Jordi Calvet-Bademunt, Joan Barata / Feb 8, 2024

Office of the European Commission in Brussels. Shutterstock

Background. The European Commission has been hard at work since the Digital Services Act (DSA) entered into force in November 2022. The DSA aims to create “a safe, predictable and trusted online environment.” To do so, it imposes a number of due process, transparency, and due diligence obligations on online companies, referred to by the law as providers of “intermediary services.”

The DSA will generally be enforced by national authorities called Digital Services Coordinators, which European Union Member States must designate by February 17, 2024. In the case of the so-called Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), the European Commission will have oversight.

Overview. The roundup below focuses on DSA enforcement by the European Commission and is based on official press releases issued by the Commission, the European Court of Justice, and interested parties. Since July 2023, there have been a number of important actions and developments related to the DSA. Most of this activity has concerned the preparatory steps necessary to establish a comprehensive framework for enforcing the DSA. The Commission has already begun to enforce the law, sending over twenty requests for information and initiating one formal proceeding, against X. The coming months are expected to be equally, if not more, eventful. We anticipate more developments at the national level as the DSA’s applicability extends to all companies, not just VLOPs and VLOSEs, starting February 17, 2024.

Enforcement Actions

VLOP and VLOSE Designations. The DSA already applies to 17 companies operating 22 VLOPs and VLOSEs, namely: Alibaba (AliExpress), Amazon (Amazon Store), Apple (App Store), Aylo Freesites (Pornhub), Booking.com (Booking.com), Google (Google Search, Google Play, Google Maps, Google Shopping, YouTube), LinkedIn (LinkedIn), Meta (Facebook, Instagram), Microsoft (Bing), Pinterest (Pinterest), Snap (Snapchat), Technius (Stripchat), TikTok (TikTok), Twitter (X), WebGroup Czech Republic (XVideos), Wikimedia Foundation (Wikipedia), and Zalando (Zalando). The law will start applying to the rest of the providers of intermediary services on February 17, 2024.

Most VLOPs and VLOSEs (19 of the 22) were designated as such in April 2023. In December 2023, the European Commission announced the designation of the remaining three: Pornhub, Stripchat, and XVideos. Zalando appealed its designation as a VLOP in June 2023. Amazon also appealed its designation and, in addition, requested interim measures. The European Court of Justice partially granted those measures, exempting Amazon from making an advertisement repository publicly available as the DSA requires.

Requests for Information. In addition, the Commission has sent 24 requests for information to Alibaba (two), Amazon (two), Apple (two), Booking.com, Google (three), LinkedIn, Meta (four), Microsoft, Pinterest, Snap (two), TikTok (three), X, and Zalando. The concerns underlying these requests are diverse. Seventeen deal with data access for eligible researchers. The protection of minors is explicitly mentioned in eight others, and the term ‘illegal content’ is used in six. ‘Lawful but awful’ content is also referred to, in various ways, in several requests. For more information, you can access the DSA Enforcement Tracker published by The Future of Free Speech.

Formal Proceedings. In December, the European Commission announced a formal proceeding against X. The proceeding focuses on X’s compliance with DSA obligations related to countering the dissemination of illegal content, the effectiveness of its measures against information manipulation, the measures it has taken to increase the transparency of its platform, and a suspected deceptive design of the user interface, especially in relation to Blue checks.

Reports & Transparency

Under the DSA, VLOPs and VLOSEs must regularly prepare two main types of reports. The first deals with how companies assess and mitigate systemic risks. The second, called transparency reports, addresses more general aspects of content moderation.

Systemic Risks Reports. VLOPs and VLOSEs submitted the first batch of reports on systemic risks to the European Commission in the summer of 2023, but they have yet to be made public. We will likely have to wait until the second half of 2024 before we have access to the companies’ reports on systemic risks and accompanying audits. In August, the Commission published a study proposing an approach to assess the effectiveness of online platforms’ measures against Russian disinformation based on the DSA’s systemic risk obligations. We analyzed the study and critiqued its approach in a Tech Policy Press article.

Transparency Reports. The first transparency reports were published by the designated VLOPs and VLOSEs in October 2023 (Pornhub, Stripchat, and XVideos had not been designated yet). Tech Policy Press has a tracker for the transparency reports and has analyzed the content of the ones submitted so far. In essence, all companies issued their reports on time, but their quality and granularity varied dramatically. In addition, their content was difficult to compare because of “seemingly different interpretations of DSA Articles that VLOPs and VLOSEs are obligated to comply with.” As explained below, the Commission aims to tackle this issue by adopting a regulation to standardize the transparency reports.

Public Databases. The European Commission also launched the DSA Transparency Database with a revamped interface. Platforms must explain the reasons behind their content moderation decisions in this database. According to the Commission, the database will enable the public to “track the content moderation decisions taken by providers of online platforms in almost real-time.” Also, the Commission launched the Digital Services Terms and Conditions Database in December. As its name suggests, this database features the terms and conditions of digital services. It uses an automated system that scrutinizes the terms and conditions in the database multiple times daily and highlights new changes.

Regulations and Public Consultations

Auditing Platforms’ Compliance with the DSA. In October 2023, the European Commission adopted the Delegated Regulation on independent audits. The binding regulation provides a framework to guide the auditors who will assess the compliance of VLOPs and VLOSEs with the DSA at least once a year. Unless the European Parliament or the Council of the European Union opposes these rules, they will apply shortly.

Limiting the Spread of Illegal Content. In the context of the Israel-Hamas conflict, the European Commission also issued a non-binding recommendation directed at Member States. It aimed to coordinate countries’ responses to the spread and amplification of illegal content, focusing on “terrorist content or unlawful hate speech.” The recommendation encouraged countries to designate their Digital Services Coordinators ahead of the legal deadline of February 17 so that they could join an informal network of prospective Digital Services Coordinators.

The recommendation also proposed an “incident response mechanism” to organize cooperation between the Commission and this network and to enhance their response to the dissemination of illegal content. The mechanism can involve meetings and the exchange of information collected at the national level. According to public information, the mechanism has been used at least once thus far. The recommendation also encouraged VLOPs and VLOSEs to draw up incident protocols for “extraordinary circumstances” such as “an international armed conflict or terror attacks.”

Data-sharing Platforms and Transparency Reporting. The Commission has also issued public consultations. On December 8, it published one regarding AGORA, a data-sharing platform between Member States and the Commission. AGORA will be an information-sharing system supporting communications among the national Digital Services Coordinators, the Commission, and the European Board for Digital Services (which is composed of the Digital Services Coordinators).

On the same date, the Commission published a consultation on the Implementing Regulation on transparency reporting. The regulation aims to increase the quality and harmonization of the transparency reports so as to guarantee the same level of transparency and accountability across platforms. It is set to be adopted in the first months of 2024 and will mandate the form and content of the reports by laying down mandatory templates and standardizing the reporting periods for all service providers.

Coordination and Cooperation with Member States

Since the DSA was adopted, there have been several instances of bilateral and multilateral cooperation between the European Commission and national authorities. As discussed above, the European Commission established an informal network of prospective national Digital Services Coordinators following the Commission’s October recommendation on limiting the spread of illegal content. The network will hold meetings until the DSA comes fully into effect on February 17. There have been four meetings thus far, although the Commission has provided details on only three.

The network was first proposed to enhance the cooperation between the Commission and national authorities in response to the “dissemination of illegal online content, in particular where it pose[d] a clear risk of intimidating groups of population or destabilising political and social structures in the Union.” However, the network members subsequently agreed to discuss other issues related to the preparedness for enforcement of the DSA as well.

The Commission also announced it had signed administrative arrangements with the Dutch, French, Irish, and Italian authorities to support the Commission’s supervisory and enforcement powers under the DSA. This support includes organizing the “exchange of information, data, good practices, methodologies, technical systems, and tools” to investigate compliance.

Moreover, the Commission set up a task force on age verification with Member States. The task force, which held its first meeting in January 2024, aims to foster cooperation “to identify best practices and standards in age verification.” The DSA imposes a number of obligations on companies to protect minors, such as explaining the terms of use of services primarily directed at minors in a way children can understand, putting in place measures to ensure minors’ privacy and safety, and restricting advertisements targeted at minors.

Authors

Jordi Calvet-Bademunt
Jordi Calvet-Bademunt is a Research Fellow at the Future of Free Speech Project and a Visiting Scholar at Vanderbilt University. His research focuses on freedom of expression in the digital space. Jordi has almost a decade of experience as a policy analyst at the Organization for Economic Co-operation and Development (OECD).
Joan Barata
Joan Barata works on freedom of expression, media regulation, and intermediary liability issues. He is a Senior Fellow at Justitia’s Future of Free Speech project. He is also a Fellow of the Program on Platform Regulation at the Stanford Cyber Policy Center. He has published a large number of articles on these topics.
