Taking Action on Dark Patterns
Justin Hendrix, Caroline Sinders / Apr 26, 2021

Introduction
On Thursday, April 29th, the Federal Trade Commission (FTC) will host a workshop on the problem of “dark patterns.” The term, little more than a decade old, was coined by user experience designer Harry Brignull in 2010 to describe “deceptive user interfaces.” At its workshop, in which Brignull will participate along with a roster of experts, the FTC seeks to “explore the ways in which user interfaces can have the effect, intentionally or unintentionally, of obscuring, subverting, or impairing consumer autonomy, decision-making, or choice.”
The FTC’s focus on dark patterns is not new: Commissioner Rohit Chopra issued a statement regarding dark patterns in September 2020. Lawmakers including Sens. Mark R. Warner (D-VA) and Deb Fischer (R-NE) introduced a bill in the last Congress called the Deceptive Experiences to Online Users Reduction (DETOUR) Act. And various bodies in the U.S. and abroad have moved to protect citizens from dark patterns. In California, for instance, Attorney General Xavier Becerra recently announced additional regulations under the California Consumer Privacy Act (CCPA) intended to ban “confusing language or unnecessary steps such as forcing them to click through multiple screens or listen to reasons why they shouldn’t opt out.” And in Europe, there is an active dialogue about how the GDPR applies to dark patterns and the concept of consent.
Indeed, we regard this as a time not merely to ask questions but to act. The harms to people, as consumers and as citizens, are well enough understood at this point that it is incumbent on regulators and lawmakers to respond. Ryan Calo, a University of Washington law school professor and an invited speaker at the upcoming workshop, noted in a 2014 paper that “regulators and courts should only intervene where it is clear that the incentives of firms and consumers are not aligned.” In his view, the systematization of extremely personal marketing practices, coupled with these divergent interests, should prompt action. We assess that there is more than enough evidence that this threshold has been met, and that regulators should urgently pilot new rules to address the harms caused by dark patterns. Where there is overreach, it can be addressed; there is more danger now in a failure to act.
Nevertheless, the workshop and the public comment period on this issue represent an opportunity to shape potential new rules, regulations and legislation related to dark patterns, and to shed light on harms caused by them that may be addressed through existing mechanisms at the FTC or other entities. It is also a unique opportunity to consider the definition of dark patterns and how it has shifted as design practices, technologies and user behaviors have evolved in the decade since the term was first employed.
Panel discussions at the FTC workshop will focus on themes including:
- What are dark patterns, and why are they employed?
- How do dark patterns affect consumers?
- How do dark patterns specifically affect communities of color?
- How do dark patterns target kids and teens?
- How can we best continue to address dark patterns?
Tech Policy Press convened an ad hoc working group to discuss these issues ahead of the FTC workshop, and arrived at a handful of key considerations for the lawmakers, regulators, and experts participating in it. The working group included:
- Wafa Ben-Hassine, Principal, Responsible Technology, Omidyar Network
- Elinor Carmi, Postdoctoral Research Associate, Liverpool University
- David Carroll, Associate Professor of Media Design, Parsons School of Design at The New School
- Joël Carter, Research Associate at the Center for Media Engagement, University of Texas at Austin
- Sarah Drinkwater, Director, Responsible Technology, Omidyar Network
- Chris Gilliard, Harvard Kennedy School Shorenstein Center Visiting Research Fellow
- Justin Hendrix, Editor of Tech Policy Press
- Jesse Lehrich, co-founder, Accountable Tech
- Caroline Sinders, design researcher and artist working in human rights and technology
Our purpose is to help inform the discussion at the upcoming workshop, and open a dialogue with Tech Policy Press readers about dark patterns and what should be done to address them.
Key Considerations
In general, our working group arrived at four key considerations we believe the FTC should focus on: defining dark patterns; centering the impact and harms of dark patterns; looking beyond the individual designer and analyzing decision ecosystems; and considering dark patterns in relation to AI and machine learning.
1. Defining dark patterns
The FTC rightly seeks a clear and robust definition of a dark pattern. In our ad hoc working group, arriving at an encompassing definition proved challenging. What is or isn’t a dark pattern? When does a dark pattern stop and malicious design or nudging start? What is simply sales?
- Dark patterns may have started as a design issue: the name is a nod to design patterns, and the first definitions focused on UX/UI and surface-level layout. But the concept now rightly contains a much larger set of concerns and harms, extending to deeper technical, infrastructural, business-model and policy issues.
- In our ad hoc working group, participants referenced nudging, persuasive design, addictive design, malicious design, confusing design, manipulative design and general ‘bad design.’ Dark patterns can be all of those things: these phenomena all manifest in opaque communication to the consumer and an increased risk that they will be tricked into doing something that was not their original intent. But these same descriptors (nudge, persuasion, addiction, maliciousness, confusion, bad design) can also manifest in ways that might not be regarded as dark patterns.
- We suggest the FTC look at disinformation as a corollary in terms of defining and scoping this area. As with disinformation, defining dark patterns requires describing a range of diverse phenomena and actors, with significant risk of misapplication. One example of an attempt at a definition, or set of definitions, of disinformation is Claire Wardle’s Seven Categories of Information Disorder.
2. Centering the impact and harms of dark patterns
Our working group found it imperative to emphasize and center the harms and outcomes of dark patterns, as opposed to their intent. Harm, harm reduction and harm analysis need to be included in the definition as well as in the adjudication and regulation of dark patterns. A good place to start is with Ryan Calo’s definition of harms, which recognizes three main typologies: “economic harm, privacy harm and vulnerability as autonomy harm.” Our recommendation is to build impact and harm analysis into the process when analyzing dark patterns:
- Focus on context. The dark patterns definition needs to be more robust and focused on harms, vulnerabilities and the specific domains where dark patterns are implemented. Not all industries are the same, and this kind of nuance is necessary: dark patterns may be employed differently in healthcare, for instance, than in consumer finance. For example, the same graphical UI/UX design of a dark pattern may be less harmful when placed on a website in an entertainment context than in a personal finance context. It is important to name the explicit types of design choices that are dark patterns, like a countdown clock on shopping websites (illustrated in the sketch after this list) or a ‘bait and switch,’ but we suggest that the context in which the dark pattern operates, and who it harms, needs to be included in the definition. The prevalence of a dark pattern is important to note, but it cannot be divorced from the context in which it appears.
- Understanding the asymmetry of harm against different kinds of users. There needs to be a deep understanding of how some dark patterns directly target vulnerable populations, such as groups marginalized by race or gender, older populations, younger populations, and lower-income or financially vulnerable populations. The FTC should continue to emphasize the importance of including voices that are representative of, and have specialist knowledge about, more vulnerable populations.
- Intention vs. impact: some definitions highlight that companies deploy dark patterns both ‘intentionally and unintentionally.’ It is likely more important to measure the impact of dark patterns and their harms than to try to investigate or measure intention.
- Understanding that this is an ecosystem/systemic problem: There needs to be consideration of how dark patterns exist and are made in context, including an organizational context and an industrial context. Dark patterns often do not emerge simply from design decisions made by one individual but are representative of an entire company’s motivations, culture and/or management structures. All too often, product choices across industries emerge from dominant companies with careless cultures.
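To make the countdown example above concrete, here is a minimal sketch, in TypeScript, of how a ‘false urgency’ countdown of the kind commonly seen on shopping websites can be wired up. The element id and the ten-minute window are hypothetical, chosen only for illustration; the point is that the deadline is manufactured on every page load rather than tied to any real expiring offer.

```typescript
// Illustrative sketch (hypothetical): a "false urgency" countdown.
// The deadline is recomputed from the current time on each page load,
// so every visitor sees the same manufactured pressure regardless of
// whether any offer actually expires.

const FAKE_WINDOW_MS = 10 * 60 * 1000; // always "10 minutes left"

function startFakeCountdown(el: HTMLElement): void {
  // Deadline is relative to page load, not to a real offer.
  const deadline = Date.now() + FAKE_WINDOW_MS;

  const tick = () => {
    const remaining = Math.max(0, deadline - Date.now());
    const minutes = Math.floor(remaining / 60_000);
    const seconds = Math.floor((remaining % 60_000) / 1000);
    el.textContent = `Offer ends in ${minutes}:${String(seconds).padStart(2, "0")}!`;
    if (remaining > 0) {
      setTimeout(tick, 1000);
    }
  };
  tick();
}

// Hypothetical banner element on a shopping page.
const banner = document.getElementById("urgency-banner");
if (banner) {
  startFakeCountdown(banner);
}
```

Whether such a timer is harmful depends on the context described above: the same pattern is far more consequential on a personal finance site than on an entertainment one.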
3. Thinking through where AI, machine learning and deep learning impact dark patterns
There is much to be considered as to the extent to which various computational methods are changing design and impacting communities. However, this is where the definition of dark patterns is key. Are dark patterns just design, or the manifestation of design? Or are dark patterns the manifestation of systems, structures, code, and policies wrapped in design? When folding AI into the definition of dark patterns, are we also including questions of users’ consent to, and understanding of, these AI systems?
- We suggest that the FTC and others, again, refer back to the definition and scope of dark patterns. Some concerns around the relationship between computation and dark patterns may be better addressed under different rules.
- To illustrate the nuance of this issue, consider a few brief examples, all involving algorithmic recommendation and search. What is the difference between algorithmic pricing and search results on Amazon, versus Target A/B testing a product in different physical stores? In this particular case, a consumer can build a much clearer mental model and understanding of a physical store. When a consumer picks up a physical object, like a bar of soap, in a Target, all of the other soap does not magically disappear or change in price based on the consumer’s actions. But in digital spaces with algorithmic pricing and search, this kind of manipulation does occur.
- However, compare algorithmic pricing and search on Amazon with an algorithmic timeline on Facebook, or algorithmically targeted ads in Google. Are these the same kinds of dark patterns, or a manifestation of different harms using the same kind of tech? We might argue the latter are more opaque and, while very targeted, less direct than the action of someone searching for a specific product in a marketplace.
Recommendations for further research
A key point that came up in our ad hoc working group is the lack of codified best practices around ‘good design’ within the design industry. We recommend research on design practices that would prevent or mitigate the emergence of dark patterns. This kind of research, and the definitions it produces, could also help identify when dark patterns arise and how to analyze their harms.
We also recommend that the FTC work with design researchers, policy experts and technologists to better understand the ecosystems that create dark patterns. To create impactful and useful regulation, regulators must understand design culture and the design pipelines through which dark patterns manifest in products.
Conclusion
The upcoming FTC workshop represents a unique moment to advance the consensus on these issues and to identify areas where more consideration and research is required. Our ad hoc working group intends to convene again to further consider these issues, advance the work on the definition of harms, and ultimately provide future input to regulators and lawmakers in the U.S., EU and elsewhere. If you have a contribution to make, contact us.
The time to interrupt dark patterns is now.