
Digital Services Act Roundup: June - July 2024

Jordi Calvet-Bademunt / Aug 6, 2024

Overview. The European Commission issued the first-of-its-kind preliminary findings under the Digital Services Act (DSA). X was accused of employing a deceptive user interface, failing to adhere to advertising transparency requirements, and failing to provide researchers with adequate access to data. X now has the opportunity to respond to these preliminary findings. In a separate case, following a request for information, LinkedIn turned off the feature that enabled advertisers to target users based on their membership in LinkedIn Groups. XNXX, an adult content platform, was designated a Very Large Online Platform (VLOP).

The Commission also urged Belgium, Spain, Croatia, Luxembourg, the Netherlands, and Sweden to ensure they have fully empowered Digital Services Coordinators (DSCs). Finally, the Commission signed cooperation agreements with Australia’s eSafety Commissioner and the European Regulators Group for Audiovisual Media Services.

Enforcement Actions

Preliminary Findings. The European Commission shared with X its preliminary view that the company is breaching the DSA. The Commission had already opened formal proceedings against AliExpress, Meta, TikTok, and X, but this is the first time it has issued preliminary findings. The Commission finds that the design and operation of X’s “verified accounts” interface, with its “Blue checkmark,” are deceptive. It also argues that X fails to comply with the required transparency in advertising and falls short in providing researchers with access to its public data.

Notably, the preliminary findings say nothing about assessing and mitigating systemic risk. I recently published an article with a comprehensive review of what we know about the findings and how they compare to the opening of the proceedings against X. The company can now reply to the Commission’s preliminary findings.

Requests for Information. The Commission sent a request for information to Amazon, asking the company to provide more details on the measures taken to comply with the obligations related to the transparency of recommender systems and their parameters, as well as the provisions on maintaining an ad repository and its risk assessment report.

Temu and Shein also received requests for information. The Commission sought information on the measures adopted to comply with the obligations regarding the ‘Notice and Action mechanism’ (which enables users to report illegal products), online interfaces (which must be designed to avoid misleading or manipulating users through ‘dark patterns’), the protection of minors, the transparency of recommender systems, trader traceability, and compliance by design. The requests followed a complaint submitted by consumer organizations. Temu and Shein were designated as VLOPs last May and April, respectively, and are supervised by the European Commission and the Irish Digital Services Coordinator.

The Commission also sent requests for information to Pornhub, Stripchat, and XVideos. The companies must provide more details on how they are assessing and mitigating risks related to the protection of minors online, including through the use of age assurance mechanisms, and how they are preventing the amplification of illegal content and gender-based violence. In addition, the Commission requested information on the companies’ internal organization to ensure DSA compliance, including whether they have independent and well-resourced internal teams with sufficient authority. Pornhub, Stripchat, and XVideos were designated as VLOPs last April.

In the EU, LinkedIn turned off the functionality allowing advertisers to target LinkedIn users with ads based on their membership in LinkedIn Groups. This decision follows a request for information regarding the compliance of its ad targeting system with the DSA. There were concerns that LinkedIn was providing advertisers with the possibility to target users based on special categories of personal data, such as racial or ethnic origin, political opinions, religious or philosophical beliefs, or others.

Designation. The Commission designated XNXX, an adult content platform, as a VLOP. XNXX must now comply with the stricter DSA rules applicable to VLOPs by mid-November 2024. The Commission highlighted XNXX’s obligations to empower and protect users online, “prevent minors from accessing pornographic content online, including with age-verification tools,” provide access to publicly available data to researchers, and publish an ad repository.

Court Decisions. The EU General Court denied Pornhub’s request to postpone the disclosure of advertiser details mandated by the DSA. The court emphasized the DSA’s crucial role in creating a secure online environment. Pornhub’s operator, Aylo, “faces financial risks but not existential threats due to this requirement.” The court dismissed concerns that revealing advertiser information could disrupt competition or jeopardize Pornhub’s operations, highlighting the importance of adhering to EU digital regulations. Pornhub has appealed to the EU’s top court, the Court of Justice, “to avoid having to disclose the natural names of users in its ad repository.”

Post-Election Report. In July 2024, the European Board for Digital Services published a post-election report on the European elections of June 2024. The report provides an overview of the actions taken by the European Commission and DSCs to monitor compliance and enforce the DSA in the context of the European elections. The report highlights actions like the Guidelines on Election Integrity and the enforcement actions undertaken under the DSA. The report also refers to the actions undertaken under the Code of Practice on Disinformation, a self-regulatory framework signed by industry and civil society organizations and promoted by the Commission. According to the report, “the actions to ensure preparedness and coordination among all relevant stakeholders were successful, as no major or systemic disinformation incidents were identified that disrupted the elections.” Importantly, the report urges signatories to convert the Code of Practice on Disinformation into a Code of Conduct, so it becomes linked to the DSA enforcement mechanism.

Call for Evidence. The Commission launched a call for evidence to gather feedback for its upcoming guidelines on protecting minors online. The call aims to collect information on the scope and approach of the guidelines, as well as good practices. In line with the DSA, the guidelines will advise platforms on how to implement high levels of privacy, safety, and security for minors online. The guidelines “will apply to all online platforms that are accessible to minors, including those that are not directed to minors, but still have underaged users, for instance, due to inadequate age-assurance mechanisms.”

Transparency Reports. Pornhub, XVideos, and Stripchat, designated as VLOPs in December 2023, published their first transparency reports in June. The transparency reports must contain information “concerning content moderation on the platforms’ services, detailing the number of notices they receive from users, the number of pieces of content taken down on the platform's own initiative, the number of orders they receive from national judicial or administrative authorities, and the accuracy and rate of error of their automated content moderation systems. The reports must also include information on content moderation teams, including their qualifications and linguistic expertise.” Transparency reports are part of the DSA measures to enhance transparency in content moderation. The Commission is working on establishing uniform reporting templates and harmonized reporting cycles.

Institutional Framework

Digital Services Coordinators. In July, the European Commission opened infringement proceedings against Belgium, Spain, Croatia, Luxembourg, the Netherlands, and Sweden. Public information is still limited, but according to the Commission, these countries either did not designate the authorities in charge of enforcing the DSA nationally - the DSCs - or “did not empower these authorities to perform the tasks” required by the DSA. Countries had until 17 February 2024 to designate their Digital Services Coordinator. In April, the Commission opened infringement proceedings against Cyprus, Czechia, Estonia, Poland, Portugal, and Slovakia for the same reasons.

European Board for Digital Services. The fifth and the sixth meetings of the European Board for Digital Services took place. During the sixth meeting, the Board discussed, among other issues, ongoing cases, complaints, priorities, a pilot project on data access for researchers, and the conversion of the code of conduct on hate speech into a DSA code. The Board also considered the post-election report on the European Elections mentioned above. In addition, the Board discussed the complementarity of DSA cases and enforcement actions under other instruments, such as consumer protection.

In the fifth meeting, DSCs discussed the lessons learned from the European elections, developments on the empowerment of DSCs, ongoing cases, and priorities. They also addressed the code of conduct on hate speech and guidelines on the online protection of minors. The Board also agreed to create eight working groups, including on the “integrity of the information space,” the “protection of minors,” and other topics.

Institutional Cooperation. The Commission signed an administrative arrangement with the eSafety Commissioner, Australia’s online safety regulator, to support the enforcement of social media regulations. The arrangement covers areas “such as transparency and accountability of online platforms, risk assessment, and mitigation particularly as regards illegal content, algorithms, and artificial intelligence, as well as measures like age-appropriate design and age verification to protect minors online.” The arrangement will involve information exchanges, including expert dialogues, joint training of technical staff, and sharing of best practices.

The Commission and the European Regulators Group for Audiovisual Media Services (ERGA), which gathers national media regulators, agreed to structure their ongoing cooperation to support the Commission's supervision and enforcement activities under the DSA. The cooperation will focus on the supervision of VLOPs and Very Large Online Search Engines (VLOSEs). ERGA will facilitate the gathering of information at the national level and produce reports on issues related to media pluralism, disinformation, and the protection of minors, among others.

The Commission, DSCs’ representatives, and independent experts discussed the DSA with all EU candidate countries and potential candidates. Representatives from Albania, Bosnia and Herzegovina, Georgia, Kosovo, Moldova, Montenegro, North Macedonia, Serbia, Türkiye, and Ukraine attended the workshop. The event focused on the assessment and mitigation of “societal risks, algorithmic auditing, independence requirements for national regulators, [and operationalizing] processes such as the certification of trusted flaggers and independent dispute settlement bodies.” This workshop kicked off further country-specific engagement on the implementation of the DSA.

Roundtables. The Commission hosted the second online roundtable with Civil Society Organisations (CSOs) to discuss the implementation of the DSA. The roundtable covered various topics, including the protection of minors, tackling online hate speech and disinformation, and crisis response mechanisms in the context of the DSA. The Commission also presented the proposed scope and approach of upcoming Commission guidelines on the protection of minors for providers of online platforms. CSOs provided feedback on transparency reports from providers of intermediary services and gave input on the DSA transparency database that contains reasons for restrictions to online content or accounts. The Commission expressed interest in receiving “well-founded complaints and concrete evidence” concerning systemic non-compliance with the DSA.

In addition, the Commission hosted an online roundtable with VLOPs, VLOSEs, DSCs, and other public authorities. Participants discussed perceived threats to electoral processes, cooperation among authorities, experts, and civil society organizations, and platform data access mechanisms. The roundtable participants concluded that “there has been no widespread distribution of deepfakes of Generative AI content measured so far.”

Calls for Tenders. The Commission launched a call for tenders to seek technical assistance for market intelligence (e.g., monitoring technological developments, digital threats, and social media data related to systemic risks), evidence gathering regarding prominent and recurring risks (e.g., gender-based violence), and compliance monitoring, particularly regarding the transparency and traceability of online sales and the promotion of illegal goods. The services will support the European Commission in conducting DSA supervisory tasks regarding VLOPs and VLOSEs. The call for tenders closes on September 12, 2024.

The Commission launched another call for tenders for a study into online advertising. The study will provide the Commission with “an updated comprehensive overview and understanding of the most relevant developments (e.g., technological, regulatory, market) in the online advertising sector, map the main issues and identify possible regulatory gaps.” The study will assess the impact of the DSA and the Digital Markets Act, an EU regulation dealing with market fairness and contestability, on the online advertising sector.

Authors

Jordi Calvet-Bademunt
Jordi Calvet-Bademunt is a Senior Research Fellow at The Future of Free Speech and a Visiting Scholar at Vanderbilt University. He is also the Chair of the Programming Committee of the Harvard Alumni for Free Speech and has been a fellow at the Internet Society. At The Future of Free Speech, Jordi f...
