Fixing the Digital Services Act to Address "Dark Patterns"

Julian Jaursch / Mar 30, 2022

Julian Jaursch is a project director at the not-for-profit think tank Stiftung Neue Verantwortung in Berlin, Germany. He analyzes and develops policy proposals on platform regulation and disinformation. He recently published a paper on platform design issues in the DSA, which emerged from a collaboration with the Max Planck Institute for Human Development.

The social media feed that never ends. The pop-up on the travel booking page that virtually screams “74 people are watching this right now! Only 3 offers left!” The page to change privacy settings or delete an account that can only be found after navigating a maze of clicks: these are all corporate design decisions that shape millions of people’s everyday online experiences. At best, such design practices are merely annoying. At worst, they intrude on people’s freedom of choice in the name of maximizing attention and profit. That is why the draft European Digital Services Act (“DSA”) for tech platforms needs to focus explicitly on platform design.

One of the stated goals of the DSA is to ensure a “transparent and safe online environment” across the European Union (EU). New rules for online platforms, including tech giants such as Facebook, Google and TikTok, are intended to help achieve this. For the first time, it is not just about setting guidelines for moderating and deleting individual pieces of content. Rather, the DSA is a kind of mandatory handbook of compliance rules for tech companies, including, for example, rules on how they must report on their business practices. However, what was missing from the European Commission’s original draft was any consideration of platform design.

If the DSA is to be the progressive set of rules that EU lawmakers have been talking about for years and that many people are hoping for, it must contain a stand-alone article on platform design with clear definitions, transparency requirements and also prohibitions. Proposals for such a design article now exist, but the Commission, member states and the European Parliament (EP) have not yet been able to agree on a compromise. The EU member states had suggested a ban on deceptive design, but only for online marketplaces. The EP goes further and wants to ban such practices on all online platforms. This is the right approach because deceptive design practices do not only occur on online shopping sites but also on social networks or video apps, where people inform themselves and form their opinions.

However, the EP proposals also need improvement. A sensible regulation of platform design should not rely solely on prohibitions. It should also allow insights into platform design processes, for example, through mandatory design reports. After all, design can be used not only to mislead people. There are many researchers and practitioners in the fields of user interface (UI) and user experience (UX) design who are working on ethical or “prosocial” platform design. For example, researchers have found that pop-ups with verified facts can help people deal with disinformation online. The DSA should encourage platforms to evaluate such design measures and disclose new approaches and their results. But neither the EP nor the Council, made up of EU member states, has made proposals of this kind. This means that, at the moment, it does not look like such requirements will be included in the DSA.

It is, therefore, all the more important that the rules lawmakers ultimately agree on for the DSA are consistently enforced. Well-thought-out oversight structures are crucial. The responsible authorities must not only have sufficient expertise and resources of their own but must also exchange information with external experts, including UX/UI experts. Improvements in the draft are also needed in this regard: so far, involving external expertise is an option, not a requirement. Lawmakers urgently need to change this.

The EU countries’ plan would make the development of expertise an obligation for the Commission, which, according to their suggestion, should play a key role in platform supervision. This makes sense but should also explicitly include the involvement of external experts. The Council’s proposal to make the Commission responsible for supervising very large online platforms would be most likely to ensure strong platform oversight and enforcement. In the long term, however, a separate, independent and specialized EU agency for platform oversight should be established.

The EU could use the DSA to emphasize the importance of dealing with annoying pop-ups and misleading buttons. Such design practices have often been referred to as “dark patterns”, including in the DSA. It may seem minor, but it would send an important signal if the EU stopped using this term. It has served its purpose in drawing attention to the issue of platform design. What is needed now is a more precise term such as “deceptive design practices”. Moreover, as design expert Kat Zhou says, the term “dark patterns” perpetuates a problematic dualism between light/good and dark/evil.

The fact that not even the terminology regarding platform design is completely settled shows how much of a moving target this policy field is. The EU’s ambition to find rules here is well-placed and admirable but requires even more careful consideration of potential consequences and enforcement than usual. Since the DSA negotiations have not completely drawn to a close yet, lawmakers still have the chance to improve the amendments on the table.

A German version of this article appeared in the newsletter Europe.Table.
