Do the DSA and DMA Have What It Takes to Take on Dark Patterns?

Eli MacKinnon, Jennifer King / Jun 23, 2022

Dr. Jennifer King is the Privacy and Data Policy Fellow at the Stanford University Institute for Human-Centered Artificial Intelligence; Eli MacKinnon, a recent graduate of Stanford’s Master’s in International Policy program, works for Skiff, a company building private collaboration tools designed to make the internet less creepy.

European Parliament, Wikimedia Commons

Global policymakers are increasingly fed up with manipulative and coercive user interface designs. In the last few years, legislation aimed at curbing the use of “dark patterns” — those irritating and manipulative design tactics that digital interface designers use to get people to do what they want, like a consent choice that shouts “Accept” and whispers “Reject,” or a subscription that’s near impossible to cancel — has emerged from California to Brussels. There’s a growing consensus among lawmakers and the public that the designers of software interfaces have an inordinate amount of influence over the choices their customers make, and that companies use that influence to exert decisional interference, coercing, manipulating, and even deceiving customers into making choices that benefit the company at the customers’ expense.

In the U.S., for example, there have been multiple efforts to rein in this influence. California’s new and forthcoming privacy laws aim to prevent decisional interference both when consumers exercise their right to opt out of data collection and when they consent to the collection of personal data; a bill in Washington state does the same. And recently, the Federal Trade Commission announced it was seeking public comment as it revises its advertising disclosure guidelines, specifically to address the growing use of manipulative designs.

The latest regulatory attempts, however, have come in the form of the E.U.’s landmark pair of internet legislation packages, the Digital Services Act (DSA) and Digital Markets Act (DMA). The DMA’s final text, published in May, includes at least two provisions with implications for dark patterns. And the DSA, while not yet finalized, addresses dark patterns in more detail in its final drafts.

The impact of both packages, like that of Europe’s comprehensive data privacy regulation, the General Data Protection Regulation (GDPR), also has the potential to spill far beyond the E.U.’s borders, as it can be costly and complicated for a company to balance different legal requirements in different regions. But the world shouldn’t hang its hopes on the DSA and DMA to free it from the effects of manipulative design. The strength and scope of the DSA’s limits on dark patterns were an area of late-stage contention during the negotiation process, with some negotiators reportedly wrangling to water them down.

Efforts to moderate the laws appear to have been somewhat successful — neither legislative package amounts to an unbridled attack on dark patterns. Still, if the European Commission succeeds in enforcing them (a big “if”), they will amount to some of the most far-reaching regulations on manipulative digital interface design to date. It’s worth assessing, then, what both laws likely will — and won’t — achieve with respect to reining in dark patterns.

What Does the DMA Say About Dark Patterns?

Broadly speaking, the DMA is aimed at ensuring fair competition between online businesses, while the Digital Services Act takes on broader social concerns and consumer protections related to online commerce. And while the DSA addresses a comparatively wide range of online services, the DMA concerns itself only with a class of online businesses termed “gatekeepers.” Such companies must either have 45 million or more monthly active users and an annual turnover of 7.5 billion euros, or meet the following three qualitative criteria (see the illustrative sketch after the quoted criteria):

(a) it has a significant impact on the internal market; (b) it provides a core platform service which is an important gateway for business users to reach end users; and (c) it enjoys an entrenched and durable position, in its operations, or it is foreseeable that it will enjoy such a position in the near future.
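
For readers who want the two-pronged test stated precisely, here is a minimal, purely illustrative sketch in TypeScript of the threshold logic as we have summarized it. The type and function names are our own inventions, and the qualitative criteria (a)–(c), which require case-by-case legal assessment, are reduced here to a single placeholder flag:

```typescript
// Purely illustrative: the DMA's gatekeeper test as summarized above.
// All names here are hypothetical, not drawn from the regulation itself.

interface PlatformProfile {
  monthlyActiveUsers: number;        // monthly active end users
  annualTurnoverEUR: number;         // annual turnover, in euros
  meetsQualitativeCriteria: boolean; // placeholder for criteria (a)-(c)
}

function isGatekeeperCandidate(p: PlatformProfile): boolean {
  // Quantitative prong: 45 million+ monthly active users AND
  // 7.5 billion euros in annual turnover.
  const meetsQuantitativeTest =
    p.monthlyActiveUsers >= 45_000_000 &&
    p.annualTurnoverEUR >= 7_500_000_000;

  // Otherwise, the qualitative prong can still capture the company.
  return meetsQuantitativeTest || p.meetsQualitativeCriteria;
}
```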

Recital 70 of the DMA, which contains its main language on dark patterns, takes care to narrow the scope of the prohibition, forbidding dark patterns’ use only by gatekeepers and only in the context of attempts to circumvent other obligations the law puts in place.

Recital 70 reads, in part:

Gatekeepers should not engage in behaviour that would undermine the effectiveness of the prohibitions and obligations laid down in this Regulation. Such behaviour includes the design used by the gatekeeper, the presentation of end-user choices in a non-neutral manner, or using the structure, function or manner of operation of a user interface or a part thereof to subvert or impair user autonomy, decision-making, or choice.

This definitional language echoes that of the DETOUR Act, proposed in the U.S. in 2019, which, despite dying in session, has lived on as the definition of dark patterns in the California Privacy Rights Act (CPRA) as well as the new Colorado Privacy Act. It defines dark patterns as a form of decisional interference, which presupposes that companies can cure the interference by presenting choices to their customers in a way that is neutral, or at least not inherently self-preferencing. Depending on how regulators choose to interpret the DMA, this framing could have profound implications for how gatekeepers’ future choice architectures are presented to consumers. A design that asymmetrically emphasizes one choice over another, for example by highlighting one decisional button through size, color, or both while leaving the other deemphasized or grayed out, will likely be deemed “non-neutral,” especially if it emphasizes a choice with economic or data disclosure impacts for the user.
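
To make this concrete, the following minimal sketch renders the kind of asymmetric consent choice we describe, with one option that shouts and one that whispers. The element names, styles, and button copy are all hypothetical:

```typescript
// Illustrative sketch of an asymmetric ("non-neutral") consent choice.
// Both buttons work, but one is styled to dominate the user's attention
// while the other is made to look like a disabled afterthought.

function renderConsentDialog(root: HTMLElement): void {
  const accept = document.createElement("button");
  accept.textContent = "Accept all";
  // Dominant choice: large, high-contrast, and pre-focused.
  accept.style.cssText =
    "font-size:18px; padding:12px 32px; background:#1a73e8; color:#fff; border:none;";
  accept.autofocus = true;

  const reject = document.createElement("button");
  reject.textContent = "Reject";
  // Deemphasized choice: tiny and grayed out, though fully functional.
  reject.style.cssText =
    "font-size:11px; padding:2px; background:none; color:#aaa; border:none;";

  root.append(accept, reject);
}
```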

The DMA’s Recital 63 also puts a particular focus on one well-known dark pattern that researchers have aptly termed “hard to unsubscribe,” barring gatekeepers from making it “unnecessarily difficult or complicated for business users or end users to unsubscribe from a core platform service.” This recital is likely directed primarily at Amazon, given the Norwegian Consumer Council’s findings and complaint against Amazon over the difficulty of unsubscribing from Amazon Prime.

The verdict, then, is that the DMA may root out some dark patterns, but only for “gatekeeper” companies, and only in contexts where those dark patterns relate directly to the law’s other provisions (for example, a gatekeeper wouldn’t be able to use a dark pattern to solicit user consent to receive targeted advertising). When major actors like Amazon employ dark patterns, the effects are felt at scale by many millions of customers because of these companies’ size and reach. However, there is a long tail of small companies that are also guilty of exploiting harmful dark patterns, often relying upon interface-based deception as a core aspect of their business models. While the effects of these deceptions may not be as far-reaching, they can sometimes pose more direct harm, as in the case of outright fraud. Smaller companies whose entire business models are based upon aggressive or highly deceptive dark patterns can often operate under the radar until they finally aggregate enough bilked and angry consumers that regulators take action. When regulators do act against these smaller players, it won’t be because of the DMA.

What Does the DSA Say About Dark Patterns?

Given its broader applicability, the DSA has drawn more attention from researchers of dark patterns. It concerns itself not only with large online platforms such as Facebook and YouTube, but with all online “intermediaries” — a class of services that includes any online business that stores or transmits information on behalf of a third party. Everything from internet service providers and hosting services to messaging apps, email providers, and Amazon-style marketplaces falls under this umbrella. Taken together, these intermediary services account for a large swath of the global digital economy.

In theory, the DSA might have had a decisive impact on the prevalence of dark patterns online. However, those hoping for a broad, aggressive approach to dark patterns may be underwhelmed by the final version.

Who Is and Isn’t Affected?

The DSA includes the following language about dark patterns in Article 23a:

Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of recipients of their service to make free and informed decisions.

On first read, this language may seem ambitious and far-reaching, but it is notable for whom it leaves out. Embedded in the above paragraph is the conclusion to a key debate that played out among DSA lawmakers and centered on this question: Should the dark pattern ban apply to all intermediary services or only to online platforms?

By specifying that the prohibition on such deceptive design tactics applies only to online platforms, the drafters of the DSA opted to scope down the law’s potential impact on the use of dark patterns. Rather than facilitating an E.U. internet where all intermediary services are subject to this Article, it proposes one where only a limited — albeit important — subset of them is.

This means that a wide range of intermediary services — including businesses foundational to online commerce, such as ISPs, web-hosting services, and domain name registrars — are not subject to the ban. These services often have consumer-facing businesses and are certainly not immune to the allure of manipulative design, so their exemption from the DSA’s dark pattern regulations is significant.

What Practices Are Affected?

Concerns around the limited scope of the DSA’s dark patterns ban center not only on who is affected, but also on what practices are affected.

The apparently broad prohibition on efforts to interfere with user autonomy and choice laid out in Article 23a above is followed in the DSA by language that lays the groundwork for the European Commission to issue guidance on three specific types of dark patterns:

The Commission may issue guidance on the application of paragraph 1 to specific practices, notably:

(a) giving more prominence to certain choices when asking the recipient of the service for a decision;

(b) repeatedly requesting a recipient of the service to make a choice where such a choice has already been made, especially by presenting a pop-up that interferes with user experience;

(c) making the procedure of terminating a service more difficult than subscribing to it.

Among researchers, these three patterns are generally called “asymmetric choice,” “nagging,” and “hard to unsubscribe,” respectively. We provide an example of asymmetric choice in our DMA discussion above.
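
As a further illustration, here is a minimal sketch of “nagging” (pattern (b) above), again with a hypothetical storage key and prompt copy. A single guard clause is all that separates a design that respects a recorded choice from one that re-prompts on every visit:

```typescript
// Illustrative sketch of "nagging": re-asking for a decision the user has
// already recorded. The storage key and prompt text are hypothetical.

function promptForNotifications(): void {
  const alreadyDeclined =
    localStorage.getItem("notifications-declined") === "true";

  // A choice-respecting design stops here once the user has said no.
  // A nagging design omits this check and asks again on every page load.
  if (alreadyDeclined) return;

  const optedIn = window.confirm("Turn on notifications?");
  if (!optedIn) {
    localStorage.setItem("notifications-declined", "true");
  }
}
```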

While these three patterns are widely prevalent and can cause direct consumer harm, they represent nothing close to an exhaustive list of the dark patterns currently employed by online platforms. It might be that these three were called out because there were concerns that existing EU consumer protection statutes did not clearly include them. But the DSA’s explicit focus on these three patterns suggests that the initial focus of the dark pattern prohibition will be on a relatively narrow range of manipulative practices.

Moreover, the DSA’s language does not indicate that these three tactics are in fact prohibited examples of deceptive design. Rather, it merely tasks the Commission with issuing guidance on whether and how they may be subject to regulation under the DSA — far from an explicit ban. If the Commission ends up focusing its efforts on these three patterns to the exclusion of others, it would be a missed opportunity to make a strong statement about rebalancing online design in favor of consumers.

Exemption for Overlap with Existing Regulation

The DSA also specifies that its prohibition on dark patterns applies only to practices that are not already covered by two pieces of existing E.U. legislation: the GDPR (Regulation 2016/679) and the Unfair Commercial Practices Directive (Directive 2005/29/EC). It reads:

The prohibition referred to in paragraph 1 shall not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679.

The potential issue with this stipulation is that neither the GDPR nor the Unfair Commercial Practices Directive — the latter of which could be argued to implicate dark patterns via its broad prohibitions of unfair business practices — has succeeded in broadly reining in dark patterns. Insofar as these two laws do make room for regulating dark patterns, enforcement has been relatively weak and constrained to narrow contexts, such as cookie consents.

Some DSA drafters had hoped that the DSA could succeed where these directives had failed. But the DSA’s explicit deference to these past laws dims that prospect. In essence, the DSA cedes enforcement in any area touched by the two theoretically relevant laws already on the books, neither of which has significantly curbed the proliferation of dark patterns.

The DSA Swipes Left on Manipulative Algorithms

Finally, many of today’s most prolific dark patterns don’t announce themselves in the form of an annoying pop-up or even in a static user interface — rather, they may manifest in dynamic, personalized interfaces that are driven by machine-learning algorithms honed through the ongoing collection of data. In terms of fighting manipulative algorithms, the DSA goes where U.S. regulators have yet to tread. Article 26 requires very large online platforms to “diligently identify, analyse, and assess systemic risks stemming from the design, including algorithmic systems, functioning, and use made of their services,” and to conduct risk assessments accordingly. The assessments must include “any actual or foreseeable negative effects for the exercise of fundamental rights,” which include not only the collection and use of personal data, but also consumer protection generally, as well as the rights of children. While much of the discussion about Article 26 has focused on the impact of the distribution of content on large platforms, we think this Article can be read more broadly to cover algorithmically powered dark patterns and interface designs, as well as the design of algorithmic systems that result in behavioral harms, such as addiction.

We are presently experiencing a shift as manipulative design moves beyond a narrow focus on static user interfaces and expands to include tactics that rely on dynamic, data-driven algorithms that interfere with individuals’ decisions in subtler ways, such as content feeds that drive seemingly endless engagement. The DSA opens the door to requiring very large companies to at least acknowledge the risks these design choices pose to their customers. This could be quite significant for online services such as Instagram, which produced its own research demonstrating that its service poses substantial risks, particularly negative mental health consequences, to teens. In the future, standing by despite knowing that your design negatively impacts your customers may not be an option under the DSA.

Conclusion

After more than a decade’s worth of accumulated evidence and research on companies’ use of manipulative design, we are finally seeing regulators take aim at these practices. How effective these new regulatory tools will be, however, remains an open question. Watch this space.
