Information Integrity by Design: The Missing Piece of Values-Aligned Tech

Camille Stewart Gloster / Oct 15, 2024

Privacy, security, and inclusion are not natural elements of our technologies. They are largely a result of market demands — for safety and equity and belonging — precipitating movements (privacy by design, security by design, and product inclusion) that have pushed for these concepts to be viewed as defaults in the technologies on which we depend every day. New standards were created, best practices normalized, and awareness brought that transformed how we build technology and what we’ve come to expect of it. We must now think about the creation of an analogous framework for information integrity that seeks to incorporate these values from the earliest stages of development.

These movements were in some ways a response to technology’s “move fast and break things” mentality, which has proven untenable. It is clear that bolting on mitigations for unintended harms is insufficient. The democratization of the information ecosystem, and the more recent democratization of AI tools, are changing the information space, and we have an opportunity to make sure the change is for the better. The information environment has benefited greatly from technological innovations—supporting communities in times of crisis, elevating marginalized voices, and mobilizing global movements for racial justice and gender equality. However, in the current attention economy, inaccurate and hateful content designed to polarize users and generate strong emotions is often the content that generates the most engagement. The fear, and too often the reality, is that algorithms inadvertently encourage and amplify mis- and disinformation and hate speech.

We cannot create an environment where our tools work harder for the malicious actors than for the people they seek to serve. This demands a new way of building and leveraging tools to combat information integrity issues.

Companies that want to future-proof themselves are realizing that prioritizing the impacts of their creations from the outset is simply good business. One area of impact, however, remains largely unaddressed, and it is becoming the most urgent of all, with the most significant liabilities: the impact of designed media systems on our information environment.

Information as a design process

We no longer arrive at a media source and simply consume it, as we did in the days of print or even television. Our relationship to information is now mediated by a middleman: the majority of internet users get their information through social media platforms and media apps. These platforms are complex, designed environments in which the information itself is secondary to the systems of engagement around it.

Although the explosive success of platforms demonstrates a responsiveness to a market need, it is now evident the current model is not sustainable if we want to have a healthy information environment. The problem it introduces has become obvious: quantity of engagement is too often inversely related to quality of information. When what you have to say or share is less important than the ability to get a reaction, the noise almost always drowns out the signal. The results have been nothing short of destructive for society, and have become major liabilities for the platforms themselves, as well as affiliated businesses, leading to regulatory pressure, brand pressure, and mass user disillusionment.

What we often forget, however, is that these unintended, often divisive outcomes are largely the result of upstream design and process decisions. Allowing anonymous users to go viral is a design decision; providing zero in-line context about a user’s previous activity — whether they are a brand-new user or a high-volume spammer, whether they frequently break community rules, or whether they show zero balance in the sources they share — is a design decision; providing zero real-time feedback on the divisiveness of a post a user is about to publish — especially in the age of generative AI — is a design decision.

In just the same way we would consider a poorly secured password flow a negligent cybersecurity decision, we should begin to view effortlessly toxic information environments through a similar lens. And just as with other “by design” movements, we can rely on definitions to establish the standards of this new one.

In the context of democratic discourse, which is the foundation of any healthy society, information integrity takes on and goes beyond the traditional definition that originates in information security. In this new context, information integrity refers to:

  • Accuracy: correct or precise information, including fact-checking efforts and disinformation monitoring.
  • Consistency: steady access, lack of censorship.
  • Reliability: enabling sources of information that are reliable, independent, and transparent.
  • Fidelity: exactness with which information is copied, and understood by others as originally intended.
  • Safety: unlikely to expose people to danger, risk, or injury; includes digital safety and cybersecurity.
  • Transparency: the quality of work being done in an open way without secrets.

When these factors are considered in the design, rather than addressed as a reaction to bad behavior or unintended confusion, and an organization's position relative to each factor is made public through proactive reporting, there is a better opportunity for positive, sustainable results. Prioritizing “information integrity by design” does not mean systems won’t be attacked or exploited, but it does mean that the skill level required to perpetrate a successful attack or confuse the information space will increase.

Information integrity reality check, and in practice

The same social media platforms and news outlets at the center of our chaotic information ecosystem are in fact taking steps to combat misinformation, disinformation, and synthetic content. Often this is done via fact-checking initiatives and through Trust & Safety teams responsible for content management, in an effort to stop and prevent the weaponization of their platforms and outlets. However, these mitigating actions reinforce quantity and mindless, impulsive interaction; they do little to reorient incentives toward the mindful, high-quality, informed interactions that proactive work could produce. Tech companies have a unique opportunity to use their insights and tools to augment user experiences in ways that reduce the overwhelming, seemingly impossible moderation burden their current interfaces produce. Doing so would also head off much of the criticism and justifiable concern around censorship by better equipping users to think for themselves, rather than making information decisions for them.

That said, solutions must be integrative: they have to consider not just the user experience, but the realities of business demands. While the status quo is not sustainable, it would be naive to promote changes that reduce engagement volume without making a clear case for the economic benefits. Voluntary industry-led standards like the Coalition for Content Provenance and Authenticity (C2PA) and Data & Trust Alliance’s Data Provenance Standards are a step in the right direction – but these standards have limited scope, are mostly focused on provenance rather than the full range of information integrity components, and are not yet widely adopted.

To have any chance of fixing our dysfunctional relationship with information, we need solutions that can take on the powerful incentives, integration scale, and economic pull of the attention economy as we know it, and realign the market. One good example is the emerging platform Readocracy, designed from the outset with features that give users far more control over, and context for, their information experience. These include control over the algorithm, nudges that direct attention more mindfully, and indicators of how informed commenters are on the subjects they are discussing. By shifting the value of attention from the estimated $800 billion digital advertising market to the roughly $2.6 trillion knowledge economy, it still makes a compelling economic and behavioral case at scale. It also illustrates the kind of people-centered, technically sound, and values-aligned information ecosystem that can emerge from a shift to information integrity by design, creating an environment that is more beneficial and lucrative for all involved.

Government’s role in shifting market forces

Government efforts can encourage and support the desired market shift by implementing policies that enhance the transparency, accountability, and quality of the plurality of information sources. This work aligns with existing mandates like the commitment to “ensure that market forces and public programs alike reward security and resilience, and promote the collaborative stewardship of our digital ecosystem,” as stated in the 2023 US National Cybersecurity Strategy. One way the US federal government has lived up to that commitment is through the fierce promotion and resourcing of the “Secure by Design” initiative led by the US Cybersecurity and Infrastructure Security Agency (CISA), which has led leading companies to voluntarily commit to making progress toward seven security goals.

An information integrity by design initiative can focus on promoting the six components of information integrity outlined above so readers and researchers can make informed decisions about the integrity of the information provided. Government promotion and support can drive corporate adoption of the concept, much as they have done for security by design, privacy by design, and, most recently, safety by design.

Information integrity by design can redesign our desired reality

Fundamentally, we all want to keep ourselves and our families safe, we want to maintain some level of privacy, we want our technology to recognize our unique identities, we want access to trustworthy information, and we want it to support our economic opportunities. As “by design” movements have taught us, technology can reach its greatest innovative potential while recognizing the people it serves and guarding against potential harms. Prioritizing and incentivizing “information integrity by design” is our opportunity to better align our technology and our values.

Information integrity deserves fierce advocacy from governments, the intellectual ingenuity of civil society, and the creative muscle of industry. Collectively, we can move from chasing our aspirations for information integrity to realigning incentives to support our desired reality by structuring and coordinating actions and policies around these components from the beginning. This reorientation will help to bring democracies closer to the goal of a less chaotic information environment.

Authors

Camille Stewart Gloster
Camille Stewart Gloster, Esq. is the CEO of CAS Strategies, LLC and the former Deputy National Cyber Director for Technology & Ecosystem Security for The White House. In her role, Camille led technology, supply chain, data security, and cyber workforce and education efforts for the Office of the Nat...
