
Digital Services Act Roundup: April - May 2024

Jordi Calvet-Bademunt / May 30, 2024

This piece is part of a series published to mark the first 100 days since the full implementation of Europe's Digital Services Act on February 17, 2024. You can read more items in the series here.

Overview: In April and May, the European Commission opened three formal proceedings involving very large online platforms (VLOPs), two against Meta and one against TikTok. Also, starting in April, Pornhub, Stripchat, and XVideos began complying with the Digital Services Act’s (DSA) VLOP rules after being designated in December 2023. Moreover, Shein, an online fashion retailer, was designated as a VLOP and will be subject to the stricter VLOP rules and oversight by the European Commission by the end of August 2024.

Several countries missed the February deadline for Member States to appoint their national Digital Services Coordinators (DSCs), who serve as enforcers for the DSA at the national level. In April, the Commission urged Cyprus, the Czech Republic, Estonia, Poland, Portugal, and Slovakia to designate and fully empower their DSCs.

Finally, the Commission signed a cooperation agreement with Ofcom, the United Kingdom’s regulator in charge of enforcing the British Online Safety Act, held talks with the US regarding the protection of minors and transparency, and explored an information-sharing arrangement with Japan on online platform regulation.

Enforcement Actions

Formal Proceedings. The European Commission announced three formal proceedings in April and May, two of them directed at Meta, specifically its platforms Facebook and Instagram. The first proceeding concerns suspected infringements in relation to (1) deceptive advertising and disinformation, which may “present a risk to civic discourse, electoral processes and fundamental rights, as well as consumer protection;” (2) the demotion of political content in the recommender systems; (3) the “non-availability of an effective third-party real-time civic discourse and election-monitoring tool” ahead of the upcoming European elections; and (4) its mechanism to flag illegal content, which should allow users to notify the platform of illegal content in a user-friendly and easily accessible way. In an attempt to allay the EU concerns that triggered the investigation, Meta added safety features to its misinformation tracking tool CrowdTangle for use during the European Parliament elections in June.

The second proceeding against Meta relates to the protection of minors. The Commission is investigating (1) whether Facebook’s and Instagram’s interfaces “may exploit the weaknesses and inexperience of minors and cause addictive behaviour;” (2) the appropriateness of Meta’s age-verification tools; and (3) Meta’s “measures to ensure a high level of privacy, safety and security for minors.”

Finally, the Commission also opened a proceeding against TikTok. The Commission had already opened proceedings against the company in February to assess whether it promoted behavioral addictions. This second proceeding concerns the launch of TikTok Lite in France and Spain. The Commission is interested in TikTok Lite’s reward system, which allows users to earn points by performing certain “tasks,” such as watching videos or liking content. The DSA requires VLOPs to submit a risk assessment report, including measures to mitigate potential systemic risks, before launching any new functionalities likely to have a critical impact on systemic risks.

The Commission is interested in whether TikTok complied with this obligation and announced “its intention to impose interim measures consisting in the suspension of the TikTok Lite rewards programme in the EU.” The Commission’s main concern is the impact of the reward system on mental health, including minors’ mental health, “especially as a result of the new feature stimulating addictive behavior.” TikTok announced soon after the Commission’s announcement that it was suspending the reward system in TikTok Lite while it addressed the Commission’s concerns.

Requests for Information. In April and May, the Commission issued a handful of requests for information. Two of them were addressed to TikTok concerning TikTok Lite. The Commission also sent a request to X concerning its decision to decrease the resources it devotes to content moderation, as well as its risk assessment and mitigation measures regarding generative AI. Microsoft also received a request regarding generative AI risks; the company had already received a similar request in March.

Designation. The Commission designated Shein as a VLOP. The Commission has so far designated 23 VLOPs and very large online search engines (VLOSEs). Shein, an online fashion retailer, communicated that it had an average of more than 45 million monthly users in the EU, the threshold to become a VLOP. Shein will now have to comply with the stricter DSA rules applicable to VLOPs by the end of August 2024. In its press release, the Commission highlighted Shein’s obligations for diligent surveillance of illegal products, enhanced consumer protection measures, and greater transparency and accountability. Before its designation as a VLOP, Shein already had to comply with the general obligations under the DSA.

The stricter rules applicable to VLOPs also took effect in April for Pornhub, Stripchat, and XVideos, which were designated in December 2023. The Commission highlighted their obligations “concerning the measures to protect minors from harmful content and to address the dissemination of illegal content.” In April, Temu, an online marketplace, announced that it had around 75 million monthly users in the EU, above the DSA threshold for VLOPs. Its designation will likely follow soon.

Supervisory Fee. Zalando challenged the supervisory fee the European Commission charges providers of VLOPs and VLOSEs to cover the costs it incurs in enforcing the DSA. This fee is capped at 0.05% of a provider’s worldwide annual net income. Zalando disagrees with the “Commission’s fee calculation methodology and wants more transparency on this.” Meta and TikTok have also challenged the supervisory fee.

Transparency Reports. The VLOPs and VLOSEs designated in April 2023 were due to publish their second round of transparency reports by May 6. This group of companies published their first set of reports in October 2023. The transparency reports contain “detailed information on the platforms’ content moderation practices, including the count of user notifications and submissions from trusted flaggers, the volume of content automatically removed by the platforms, the number of orders from national judicial or administrative bodies, and metrics on the accuracy of automatic systems processing user notices and moderating content.” Transparency reports are part of the DSA measures to enhance transparency in content moderation. The Commission is working on establishing uniform reporting templates and harmonized reporting cycles.

Guidance on Elections. The “Guidelines for providers of VLOPs and VLOSEs on the mitigation of systemic risks for electoral processes” became applicable in late April. This document provides guidance on how VLOPs and VLOSEs can comply with the DSA’s obligation to conduct risk assessments and implement mitigation measures for adverse effects on civic discourse and electoral processes. More details are available in the previous roundup.

Stress Test. The Commission conducted a simulation exercise, or “stress test,” with platforms, DSCs, and civil society organizations to test “readiness against election manipulation and interference in relation to the European election.” The stress test featured a series of scenarios involving attempted election manipulation and interference, as well as cyber-enabled information manipulation and hybrid threats. The aim was to test platforms’ readiness to address manipulative behavior that could occur in the run-up to the elections, in particular the different manipulative tactics, techniques, and procedures.

Institutional Framework

Digital Services Coordinators. Member States were required to designate their national DSA enforcers, the so-called DSCs, by February 17, 2024. DSCs are in charge of enforcing all DSA rules for non-VLOPs and non-VLOSEs and, in cooperation with the European Commission, some of the rules applicable to VLOPs and VLOSEs. In late April, the Commission called on six countries - Cyprus, the Czech Republic, Estonia, Poland, Portugal, and Slovakia - to designate and fully empower their DSCs and opened infringement proceedings against these countries. Cyprus, the Czech Republic, and Portugal had designated their DSCs but still had “to empower them with the necessary powers and competencies to carry out their tasks, including the imposition of sanctions in cases of non-compliance.” The countries have two months to respond and address the shortcomings raised by the Commission. In early April, the Commission had reportedly sent reminders to 21 Member States urging them to designate their DSCs. Estonia, Poland, and Slovakia had yet to designate their DSCs at the time of publication.

Whistleblower Tool. The European Commission launched a whistleblower tool for the DSA. The tool will enable individuals to provide information and unmask harmful practices by VLOPs and VLOSEs. The information can be provided anonymously and in any format, such as reports or email exchanges. The Commission created a similar tool for the Digital Markets Act (DMA).

International Cooperation. The Commission signed a cooperation agreement with Ofcom, the United Kingdom’s regulator enforcing the British Online Safety Act. Both regulators expressed interest in the “protection of minors online, age-appropriate design technologies, the transparency of online platforms, risk assessments, and the impact of algorithms on systemic risks for society.” The cooperation can be conducted “through technical expert dialogues, joint training of technical staff, sharing of best practices, joint studies and coordinated research projects.”

The US and the EU also discussed cooperation regarding the obligations of online platforms at April’s Trade and Technology Council (Council). The US and the EU stated “that online platforms should exercise greater responsibility in ensuring that their services contribute to an online environment that protects, empowers, and respects their users.” They highlighted platforms’ responsibility regarding “mental health,” “the development of children and youth,” and “technology-facilitated gender-based violence.” The US and the EU published “Joint Principles on Combatting Gender-based Violence in the Digital Environment.” These principles complement the “Joint High-Level Principles on the Protection and Empowerment of Children and Youth,” published in May 2023, and aim to enhance data access from online platforms for independent research.

Enhancing data access for independent researchers, one of the DSA’s objectives, also played a prominent role at the Council. The US and the EU held a workshop focused “on access to platform data and using this data to combat technology-facilitated gender-based violence.” They also published a status report “on mechanisms for researcher access to online platform data, which builds upon efforts undertaken by the academic and research community.” The objective is to disseminate information about the new and improved possibilities for accessing data. The two global powers further stressed that online platforms “should exercise greater responsibility in [...] protecting human rights defenders online.” Last March, they published the shared policy document, “Recommended Actions for Online Platforms on Protecting Human Rights Defenders Online.”

Finally, Japan and the EU agreed to explore establishing a regular information-sharing channel on online platform regulations, including the DSA and DMA, in the context of their second Digital Partnership Council.

European Board for Digital Services. The European Board for Digital Services held its third and fourth meetings in April, with another scheduled for May 28. The third meeting discussed issues such as the DSA election guidelines (an overview of this document is available in the previous roundup), the results of the stress test of platforms’ election readiness, updates regarding disinformation and hate speech codes, and developments in the implementation and enforcement of the DSA.

Authors

Jordi Calvet-Bademunt
Jordi Calvet-Bademunt is a Research Fellow at The Future of Free Speech and a Visiting Scholar at Vanderbilt University. His research focuses on freedom of expression in the digital space. Jordi has almost a decade of experience as a policy analyst at the Organization for Economic Co-operation and Development (OECD).
