Toward Prosocial Tech Design Governance

Lisa Schirch, Lena Slachmuijlder, Ravi Iyer / Dec 14, 2023

Lisa Schirch, Lena Slachmuijlder, and Ravi Iyer are the three co-chairs of the Council on Technology and Social Cohesion.

Image: Jamillah Knowles & We and AI / Better Images of AI / People and Ivory Tower AI 2 / CC-BY 4.0

Technology governance often focuses on decisions about what to do with harmful digital content, not on the design of technology itself. Journalist and Nobel Peace Prize winner Maria Ressa aptly describes this approach as like cleaning a dirty stream one glass of water at a time. A focus on removing harmful digital content distracts from the opportunity to look upstream at what incentivizes the creation and promotion of that content in the first place.

In February 2023, we held a conference in San Francisco on “Designing Technology for Social Cohesion” that brought together over 200 people: half from the tech community and half experts in bridgebuilding and peacebuilding. A running theme throughout the event was the need to focus more on tech design and less on content moderation.

With regulators and civil society groups increasingly interested in regulating design, new models of governance are needed. As co-chairs of the newly formed Council on Technology and Social Cohesion, we invite government tech regulators, policymakers, and legislators, as well as private investors, technology companies, and civil society, to formally consider how we might collectively advance prosocial tech design governance.

Moving beyond content governance

A field that began with the development of “community guidelines” to deter spam, fraud, and malware has expanded into a multi-billion dollar global infrastructure. Tens of thousands of trust and safety practitioners wrestle with digital content every hour of every day. They focus on transparency, data ethics, and freedom from harm while removing trillions of pieces of harmful content and identifying and de-platforming spammers. Despite major investments from tech companies, tech insiders describe a manic game of whack-a-mole against a rising tide of harmful content.

A key assumption underlying content governance is that the main problem facing social media is harmful user-generated content. In this view, malevolent actors misuse tech products to cause harm to others while the technology product itself is regarded as a neutral mirror, simply reflecting the worst elements of human behavior.

Tech regulations often begin from the perspective of content governance. The Digital Services Act (DSA), the UK Online Safety Act, and Australia’s eSafety regulations all set out detailed protocols for addressing illegal content. Another example, the German Network Enforcement Act, requires social media platforms with over 2 million users to remove “clearly illegal” content within 24 hours and all illegal content within 7 days of it being posted. Violations of the Act carry fines of up to 50 million euros.

But the experience of tech companies’ trust and safety efforts suggests that content-based government regulations are not enough. By some estimates, Facebook removes only 3 to 5% of hate speech in the US, while many other countries receive far less attention. Political actors, cyber armies, and a thriving disinformation industry persist because the design of these systems favors the spread of divisive messages. And content moderation is inherently polarizing: some political voices will inevitably be moderated (censored) more than others, because they are more likely to produce content that violates platform policies.

The importance of focusing on design

Tech companies have long known that the design of their products shapes human behavior. Tech products are not neutral mirrors that simply reflect their users. In 1999, Harvard Law School professor Larry Lessig described how tech designs steer human behavior, arguing that computer code functions as law. Tech designs determine what users see, what privacy they have, and how they can interact with others.

Stanford behavior scientist BJ Fogg’s 2003 book and subsequent popular course on “persuasive technology” described how tech platforms could persuade and subtly steer individuals toward specific actions. Fogg hoped companies would use his psychological insights to steer humans toward prosocial behavior. Instead, Silicon Valley entrepreneurs and big tech executives used Fogg’s insights on design to maximize user engagement, a proxy for profit.

Civil society actors and journalists have documented how the design of technology products can incentivize disinformation and polarizing narratives of “us” and “them.” To maximize user engagement, engineers designed tech platforms as gladiator arenas that amplify the voices that get the most attention, which are often the most divisive. Public fights on social media earn more distribution. Tech companies employ endless design “tweaks” to bolster user engagement, including affordances like the share button, infinite scroll, and algorithms that recommend content aimed at increasing users’ time on the platform. Such designs contribute to division and polarization in societies, undermining democratic institutions and public trust.

What does it mean to design tech for social cohesion?

Social cohesion is crucial for a society to deal with diversity peacefully and address grievances without violence. The trust and collaboration that underpin social cohesion lead to more peaceful and just societies, enabling all other indicators of human development to progress. Numerous academic disciplines have studied what drives toxic polarization, and those forces apply both online and offline.

This is why the Council on Technology and Social Cohesion is raising awareness of how prosocial tech design can strengthen social cohesion. Tech product affordances enable users to do certain things, such as sell, buy, like, share, watch, post, or comment. But affordances also limit what users can do. If there is no affordance for identifying common ground in a tense online conversation, users cannot highlight areas of agreement. If there is no affordance for humanizing and contextualizing a stranger’s post, other users may not be inclined to give that person the benefit of the doubt or ask questions to learn more.

Tech product algorithms often aim to maximize user engagement. User interface choices make it difficult for users to signal what content they aspire to consume, as distinct from what they merely engage with. Divisive content is often highly engaging: publishers and politicians report an incentive to create more division than they would like, and users report being worn out by political discussions as a result. Many people across society report a desire to move beyond the divisions they see online. Prosocial or bridging algorithms could better enable users to understand where they share an identity or values, or where there may be agreement or common ground across divides, as the sketch below illustrates.
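To make the contrast concrete, here is a minimal sketch in Python of the difference between ranking by raw engagement and ranking by a simple bridging score. The posts, the two opinion groups, and the scoring functions are our illustrative assumptions, not any platform’s actual ranking code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes_a: int = 0  # engagement from hypothetical opinion group A
    likes_b: int = 0  # engagement from hypothetical opinion group B

def engagement_score(post: Post) -> int:
    """Rank by total engagement, regardless of who it comes from."""
    return post.likes_a + post.likes_b

def bridging_score(post: Post) -> int:
    """Cap the score at the approval of the less enthusiastic group,
    so one-sided applause counts for little."""
    return 2 * min(post.likes_a, post.likes_b)

posts = [
    Post("Us-vs-them rant", likes_a=900, likes_b=10),
    Post("Proposal both sides partly endorse", likes_a=400, likes_b=350),
]

# Engagement ranking surfaces the divisive post (score 910 vs. 750)...
print(max(posts, key=engagement_score).text)
# ...while bridging ranking surfaces the cross-group post (700 vs. 20).
print(max(posts, key=bridging_score).text)
```

The design choice is the point: a bridging score ties a post’s reach to the approval of its least enthusiastic audience, so content that only one side applauds cannot dominate a feed.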

Many civil society organizations have developed resources for implementing prosocial tech design. For example, the Prosocial Design Network hosts a library of evidence-based design interventions. New_Public has created a variety of resources to support Community by Design, a space for communities to share, learn, and design for better digital public spaces. The Neely Center Design Code for Social Media is a “building code” for social media platforms that identifies nine specific, content-agnostic minimum standards, based on the most effective practices from within tech platforms. The University of Notre Dame and the Toda Peace Institute have published Tech Design Principles for Peacebuilding and Conflict Transformation.

Deliberative technologies such as Pol.is and Remesh have affordances that help people “listen at scale” to others with different opinions. Both have algorithms that surface common ground between groups, showing where there is agreement. Similar bridging algorithms have been tested within tech platforms with measurable positive effects. Both tools incentivize people to find creative policy solutions, a task that is nearly impossible in polarized offline policy settings. These platforms already have a successful track record of supporting social cohesion. For example, government agencies in Taiwan, Austria, Finland, the UK, Libya, and Yemen have used platforms like Pol.is and Remesh, sometimes dubbed “AI mediators,” in citizens’ assemblies, constitutional reform, referendums, and inclusive peace processes, and to generate policy proposals backed by diverse groups and interests.
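To give a sense of how such tools surface agreement, here is a simplified sketch of a group-informed consensus computation of the kind Pol.is describes: cluster participants by their voting patterns, then score each statement by its agreement across all clusters. The toy vote matrix, the cluster count, and the scoring details are our illustrative assumptions, not either platform’s production pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy vote matrix: rows = participants, columns = statements.
# 1 = agree, 0 = disagree (real systems also handle "pass" and missing votes).
votes = np.array([
    [1, 1, 0, 1],
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
])

# Step 1: cluster participants into opinion groups by how they voted.
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(votes)

# Step 2: score each statement by multiplying its per-group agreement rates,
# so a statement scores high only when every opinion group tends to agree.
for stmt in range(votes.shape[1]):
    rates = [votes[groups == g, stmt].mean() for g in np.unique(groups)]
    print(f"statement {stmt}: per-group agreement {rates}, "
          f"consensus {np.prod(rates):.2f}")
```

On this toy data the two opinion groups split sharply on two statements, yet the final statement earns full agreement from both and tops the consensus ranking, which is exactly the common ground such platforms are designed to highlight.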

Tech products designed to strengthen social cohesion offer a hopeful path toward robust and participatory democratic engagement. They also raise important critical questions about representation, meaningful participation, and inclusion.

What is Prosocial Tech Design Governance?

Tech design governance refers to a system of decision-making that encompasses the products tech companies design, government regulation of those companies, and civil society-led research, innovation, and initiatives that support more inclusive public decision-making. It builds on the assertion that tech designs steer human behavior by encouraging, affording, and amplifying some behaviors while making others impossible or invisible.

Design governance is not unique to the tech sector. It includes a range of tools, measures, standards, incentives, and penalties that policymakers deploy to steer a sector toward particular policy outcomes, and other sectors already use it to improve public safety and social cohesion.

Elinor Ostrom won the Nobel Prize in economics in 2009 for her groundbreaking design principles for managing the public commons: how communities govern themselves from the ground up to manage shared resources. In England, the Commission for Architecture and the Built Environment (CABE) developed a range of design governance tools, beginning by gathering evidence and knowledge about the impacts of different urban and housing designs. The Commission then promoted prosocial designs that balanced the interests and needs of different groups in society. In urban design governance, the goal was not simply top-down policy regulation but an entire ecosystem of evidence, guidance, and incentives supporting government design regulations. We seek to replicate this approach.

Prosocial tech design governance thus includes a range of government incentives and disincentives for how technology companies design social media, as well as community-led processes that support prosocial tech designs.

Policymakers can move toward tech design governance

Governments can do more to incentivize and regulate tech product designs that amplify positive, prosocial human behavior and put up more guardrails to disincentivize negative human behaviors.

Legislators have begun to govern design, and we can learn from these efforts’ successes and failures. The UK’s Age Appropriate Design Code led to significant platform design changes (e.g., limits on contacting strangers) that not only affect the prevalence of violating content but meaningfully change the experience for all users. California’s Age-Appropriate Design Code and Australia’s Safety by Design provisions also focus explicitly on design. The Quad Principles on Technology Design, Development, Governance, and Use affirm that technologies should, by design, support shared democratic values and respect for universal human rights. However, more remains to be done: requirements that companies assess their own design risks have not proven especially effective, and there remains much more room for society to meaningfully participate in technology design decisions.

Over the next year, the Council on Technology and Social Cohesion will host a series of workshops and conferences to surface lessons learned in advocating for and implementing prosocial tech design governance. We aim to develop a Blueprint for Prosocial Tech Design Governance that offers policy frameworks to help international and state policymakers, as well as private funders, incentivize prosocial tech design. The Blueprint will explore options for regulatory tech building codes and standards such as the Hague Peace Data Standard, proposals to tax tech companies that contribute to toxic online polarization, and other ways to reform online incentives for regulators, investors, the next generation of technologists, and users.

Technology is shaping our norms, our relationships, and our potential to collaborate and progress as a society. This influence is too consequential to leave to tech companies alone. It is urgent that governments, civil society, academics, and technologists act to ensure tech delivers prosocial benefits, and design governance offers an effective path forward.

Authors

Lisa Schirch
Lisa Schirch is Professor of the Practice of Technology and Peacebuilding at the University of Notre Dame and Senior Fellow at the Toda Peace Institute.
Lena Slachmuijlder
Lena Slachmuijlder is Executive Director, Digital Peacebuilding, at Search for Common Ground.
Ravi Iyer
Ravi Iyer is Managing Director of the Neely Center for Ethical Leadership and Decision Making at the University of Southern California’s Marshall School of Business. He worked at Meta for more than four years in data science, research, and product management roles focused on improving the company’s algorithmic impact on society.
