Design's time to shine: UX and digital product experts must demand new rules that prohibit exploitative Dark Patterns

David Carroll / Jun 8, 2021

Designers and technologists have a window of opportunity to demand new rules to end exploitative tech practices, argues David Carroll.

Across the world, a new movement of people who develop technologies and products is asserting itself, demanding better, more humane practices from the tech industry. In the United States, the evolving atmosphere around tech regulation offers user experience and product design professionals an opportunity to elevate their practice and shape new rules to the benefit of citizens, ending exploitative practices that dishonor their profession.

Times have changed. Once upon a time, tech could do no wrong. Those were the days. But the current efforts in the United States to rein in tech platforms represent a stark shift from the previously held consensus that regulation harms technology innovation. Now, we are beginning to recognize that the social harms imposed by unaccountable tech platforms remain unpriced by market forces or tax collectors.

As such, the prevailing preference for government regulation has revved up from “light-touch” to “how hard”. Polls suggest Americans want more regulation of technology companies. Remarkably, a robust American tech regulatory regime is suddenly in the realm of possibility. An unserious scenario in 2011, the prospect of a national data privacy law in 2021 is indeed graspable, while antitrust probes roll along and various ideas for Section 230 reform are bandied about. But, even if winds blow in the direction of protection, is our regulatory apparatus up to the task, or as they say in the UK, “fit-for-purpose?”

The UK has been self-critical of its regulatory shortcomings, admitting that its own Information Commissioner’s Office (ICO) and Competition and Markets Authority (CMA) struggle to achieve their mandates single-handedly in the context of overseeing tech titans who dominate multi-sided markets by exploiting personal data. These two previously disconnected and disparate regulators have joined forces. Announced in the summer of 2020, the cooperation has now been codified in a joint statement. Similarly, the Australian authority’s innovative thinking around publisher bargaining rights represents another transformative regulatory vision. Expect the Australian Competition and Consumer Commission (ACCC) model known as the News Media Bargaining Code to be replicated elsewhere (despite some valid concerns about Rupert Murdoch that are peculiar to the Australian context).

The US also needs to recognize this shift toward cooperation among regulatory bodies, fostering interdisciplinary approaches to problem solving. The tech regulation discourse stateside is typically cast as a conflict between a patchwork of state data privacy laws and preemptive national privacy bills, with ongoing antitrust investigations humming along in the background, while reform of the poorly understood Section 230 of the Communications Decency Act upstages the more bipartisan and less rancorous data privacy debate. Our legal system, with its specializations, case law, and regulatory armatures, prefers that the Federal Trade Commission enforce privacy protections, that the Department of Justice bust trusts, and that Congress reform Section 230 by popular demand.

Fundamentally, the system favors deference to business interests, not regulatory overreach. Software giants now dominate multiple industries and enjoy monopoly power in many spheres of everyday life, cutting across concerns about consumer protection and corporate liability. The situation raises the question: is the independent and largely uncoordinated regulatory status quo in the United States up to the task of effectively limiting the power of tech titans?

In a weird flex, Facebook has been spending lavishly on a national mass media advertising campaign suggesting support for new regulation because the old rules are obsolete. These ads are reminiscent of the motivational posters that adorn its headquarters in Menlo Park, California. It’s almost as if one of those posters touts the slogan: law is like shipping code. Meanwhile, most tech policy wonks are heads-down in their respective silos, churning over privacy reform, lobbied loopholes, antitrust actions, and liability reforms, hoping to boil it all down to an oversimplification: opt-in or opt-out? Few have been thinking across disciplines, tracing a through line from the abuse of privacy to the software-eats-the-world business models that build networked monopolies.

An exception to that characterization would be the FTC’s recent workshop on so-called Dark Patterns, which are user interface conventions specifically designed to manipulate people into making choices in software that they would otherwise not make. The workshop exemplifies the trans-disciplinary approach by assembling researchers employing complementary methodologies to evaluate industry practices. Further widening perspectives, academics at the workshop shared panels with professional UX and digital product designers active in the tech industry who were willing and able to freely describe the incentives and internal cultures that yield dark patterns. Kat Zhou, a product designer at Spotify, openly assailed her own industry for structurally incentivizing the construction of dark patterns at our collective expense.
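The default effect that many dark patterns exploit can be sketched in a few lines. This is a hypothetical model, not any company's actual code, and the 90/10 split of users who ignore versus toggle the checkbox is an illustrative assumption: when a consent checkbox is pre-checked, everyone who closes the dialog without interacting is silently enrolled.

```python
def consent_outcome(default_opt_in: bool, user_toggled: bool) -> bool:
    """Final enrollment state after the dialog closes.

    If the user never touches the checkbox, the default decides.
    """
    return (not default_opt_in) if user_toggled else default_opt_in

# Hypothetical assumption: 90 of 100 users close the dialog
# without touching the checkbox; 10 flip it from its default.
toggled = [False] * 90 + [True] * 10

enrolled_prechecked = sum(consent_outcome(True, t) for t in toggled)
enrolled_unchecked = sum(consent_outcome(False, t) for t in toggled)

print(enrolled_prechecked)  # 90 enrolled under the pre-checked default
print(enrolled_unchecked)   # 10 enrolled under the neutral default
```

Nothing about the users changes between the two conditions; only the default does, yet enrollment swings from 10% to 90%. That gap is the business value of the pattern.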

Significantly, the workshop afforded an opportunity for the presentation of a new paper by Jamie Luguri and Lior Strahilevitz that empirically demonstrates how effective various dark patterns are at achieving abusive business objectives despite, or perhaps because of, their negative impacts on consumers. Additionally, the study shows how different dark pattern elements interact and amplify each other’s effects. But, most critically, the researchers surmise that industry has used comparable social science methodologies to empirically measure the efficacy of manipulative interfaces for quite some time now. In other words, it has been the tech industry’s dirty secret that its mean and nasty dark patterns work ridiculously well. Growth hackers have always known that manipulative and unfair UX is a 10x force-multiplier. Finally, in 2021, cross-disciplinary academics are showing federal regulators and other stakeholders how the general public has been in the dark about dark patterns.

Related from Tech Policy Press: Taking Action on Dark Patterns

Opponents of overly restrictive data privacy and protection rules will point to the consent clutter that laws like the GDPR and CCPA seem to have proliferated across websites, piling deceptive control panels and legalese on top of the usual clutter of ads and sign-up sheets. However, the discussions at the FTC’s Dark Patterns workshop recognized that user interfaces for cookie consent are themselves specimens of dark patterns. On some level, the regulatory debris could be better understood as a mal-adaptation to the fairness principles embedded in data regulations.
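The asymmetry of a typical consent banner can be made concrete. A hypothetical sketch (the purpose list and click counts are illustrative assumptions, not measurements of any real site): accepting everything is one prominent button, while rejecting requires opening a settings panel, unticking each purpose, and confirming.

```python
# Hypothetical click counts for an asymmetric cookie-consent banner.
ACCEPT_ALL_CLICKS = 1  # one prominent "Accept all" button

# Assumed tracking purposes buried in the settings panel.
PURPOSES = ["analytics", "ads", "personalization", "partners"]

def clicks_to_reject(purposes: list) -> int:
    """Open the settings panel, untick each purpose, then confirm."""
    open_panel = 1
    confirm = 1
    return open_panel + len(purposes) + confirm

print(ACCEPT_ALL_CLICKS)           # 1 click to say yes
print(clicks_to_reject(PURPOSES))  # 6 clicks to say no
```

A sixfold difference in effort is not a neutral presentation of equivalent choices; it is friction engineered to steer the outcome, which is why the workshop treated such banners as dark patterns rather than compliance.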

The promise of breaking down policy wonk silos should inspire specialists with their heads down inside these legal and regulatory boxes to look up and broaden the scope of their imaginations, and quickly. Platforms must be regulated with a transformed and newly collaborative legal regime that intersects the levers of power across data exploitation, market consolidation, and consumer protection. The Biden administration’s leaders in the tech regulatory space appear to get it, including big thinkers such as Lina Khan, Tim Wu, and Jonathan Kanter.

Once we let go of the limited view of disciplinary specializations, it is possible to recognize we are contending with a single issue: the asymmetric power of the dominant tech firms. The field of UX, a sub-discipline of design adjacent to graphic design, is overdue for its regulatory reckoning, because the interface designer is positioned to directly translate business logic into digital products. The manipulation is right there on the surface.

Interface designers need to elevate the field into the regulated domain, much as architects must earn certification and licensure and acquire a deep knowledge of how to design with building codes in mind. We have no equivalent of building codes for writing software code. Nor, as Ian Bogost has pointed out, is there an oath for UX design like that of civil engineers, whose bridges must not fall down. This is despite the fact that these UX positions are typically touted as conferring the privilege to make digital products that will be used by billions of people and “change the world.”

A window is open for change. And while the political winds may blow in a different direction in 2022 or 2024, the gathering momentum of industry professionals who refuse to participate in exploitative practices, combined with interdisciplinary, collaborative thinking in government and the regulatory bodies, means push will eventually come to shove. We’ll all be better for it.

Authors

David Carroll
David Carroll is an associate professor of media design at Parsons School of Design at The New School. He is known as an advocate for data rights by legally challenging Cambridge Analytica in the UK in connection with the US presidential election of 2016, resulting in the only criminal conviction of...