The Digital Services Act: How is Europe Planning to Regulate Tech?
Mathias Vermeulen / Apr 23, 2022
Mathias Vermeulen is a director at AWO in Brussels and an affiliated researcher at the Centre for Law, Science, Technology and Society at the Vrije Universiteit Brussel.
The European institutions have just announced a political deal, or so-called ‘provisional agreement’, on the Digital Services Act (DSA), a new set of rules for online platforms, search engines, online marketplaces and every other significant provider of digital services. The DSA will fully enter into force in the first quarter of 2024, but sooner (four months after publication of the final text in the Official Journal of the EU) for very large online platforms.
The rules have the potential to create a fundamental paradigm shift in holding technology platforms to account, and have been labeled a potential gold standard for regulators around the world. Press releases from the European Council, the European Parliament and the European Commission offer soundbites on the contours of the agreement. But what exactly has been decided?
The short answer is: we don’t know yet. In yesterday’s marathon meeting, the European Parliament and the Council made a deal on roughly 15 key sticking points that had not been resolved in the past three months (more on those below), but since a final text won’t be available for at least a couple of weeks, the details remain unknown. And in regulating tech, the devil is always in the details.
But first, let’s take one step back.
How did we get here?
The current DSA draft was developed after countless hours of discussions, roundtables, position papers and lobbying activities. Some numbers: in the summer of 2020, the European Commission organized a consultation in which it received 2,863 contributions, and it commissioned a number of legal and economic studies, which fed into the extensive impact assessment that accompanied its proposal for a Digital Services Act in December 2020. Both the European Parliament and the Council then scrutinized this proposal throughout 2021 in order to arrive at their respective negotiation positions.
After considering 2,300 amendments tabled by members of every political group, the European Parliament adopted its suggestions to amend the Commission proposal on 20 January 2022. The Council had previously done the same and adopted its suggested amendments in November 2021.
Between January and April of this year, negotiators from the European Parliament and the Council, assisted by the Commission, held 17 ‘technical’ expert meetings and four political meetings in order to arrive at a compromise. Yesterday, the fifth and final political meeting took place, at which legislators pored over roughly 700 pages to arrive at a consensus on the most contentious points.
This deal now needs to be finalized at the ‘technical level’, scrutinized by the legal services to identify any anomalies in the text, and translated into all the official languages of the EU. Ultimately, the final text will be approved by a vote in the plenary of the European Parliament and in the Council. These votes are usually formalities.
The DSA’s biggest innovation is ironically its least controversial aspect
The most innovative aspect of the DSA is twofold. Firstly, the DSA creates a set of obligations for tech companies that will force them to properly assess and mitigate the harms their products can cause. Crucially, both these assessments and the mitigation measures can be scrutinized by independent auditors and external researchers.
Secondly, the DSA is a data-gathering machine, in the sense that companies will be subject to a number of new transparency obligations adapted to the type and nature of the service concerned. For instance, YouTube will face far more transparency obligations than a simple hosting service.
The DSA also updates and streamlines the existing notice and action system for illegal content (as defined by the national laws of EU Member States), which has been in force for more than 20 years and was therefore relatively uncontroversial.
What was still up for discussion?
Enough was still at stake to keep the negotiators talking for more than 16 hours. Despite some of the early announcements on the deal, it’s probably best to reserve judgment on these topics until the final text of the agreement is available.
Online advertising
Probably the most discussed feature of the DSA, and the topic of an earlier podcast I participated in here at Tech Policy Press. The European Parliament insisted on including provisions that would ban targeting based on data collected from minors and on so-called ‘sensitive’ data (i.e., data that reveals your political opinions, sexual orientation, ethnic origin, etc.), whereas the Council generally was not keen to go beyond what is already provided for in the General Data Protection Regulation (GDPR). Here, the devil is really in the details, and there is a lot of room to spin the final outcome without outsiders being able to check the exact wording.
“Dark patterns”
The Parliament wanted to include a ban on so-called dark patterns in the Digital Services Act, and proposed a non-exhaustive list of such deceptive design practices. The Council was initially keen to limit some of these provisions to marketplaces and recommender systems only. Ultimately it seems there has been a deal that providers shall not “design, organize or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of recipients of their service to make free and informed choices”.
Crisis response mechanism
Extraordinary times require extraordinary measures. This has been a principle of human rights law for decades. As long as there are robust time limits and sufficient safeguards, it makes sense to apply this principle to online platforms as well. When a war breaks out, you don’t want to rely on the goodwill of platforms to take necessary measures, for instance to cut off the advertising revenue of sites whose business model is to spread Russian propaganda. The discussion between Parliament and Council focused on the governance mechanism and the appropriate safeguards that should be put in place.
Liability of online search engines
One of the most controversial, last-minute proposals for the DSA would make search engines subject to a specific notice and action regime, creating a duty to delist search results that lead users to illegal content. One (isolated) voice in the European Parliament even wanted to go further and suggested that if illegal content is flagged, not just the relevant web pages but the entire website should be removed from the search results. Discussions also focused on new responsibilities for smaller search engines, but it’s unclear which compromises made it into the final agreement. Euractiv reports that the final text “consists of a case-by-case assessment of the responsibilities of Google and the likes for illegal content, which is left to be clarified by a legal review."
Dissemination of “revenge porn”
The Parliament was keen to include additional obligations on platforms that are primarily used for the dissemination of user-generated porn, including a requirement to ensure human moderation and a qualified notification procedure with rapid processing of notices and takedowns without delay. It appears that no separate article on revenge porn was retained in the final text.
Other issues
Other topics of discussion included:
- Whether to set up a specific ‘waiver’ system or other exemptions so that medium-sized enterprises would not be subject to certain obligations;
- The amounts very large platforms should pay as ‘supervisory fees’ to the European Commission (under the principle of ‘the polluter pays’);
- To what extent accessibility requirements for persons with disabilities should be imposed on various types of platforms;
- How online marketplaces could identify illegal content and goods appearing on their platforms and subsequently remove them;
- A right to compensation for recipients of the service when damage or loss occurs due to the infringement of a DSA obligation.
The sheer variety of topics illustrates the breadth and depth of the DSA’s provisions, which will be a source of discussion for many months (and even years) to come. One thing is clear: this political deal marks the beginning of a new era of platform regulation, not only in Europe but probably beyond as well.