Contending for Democracy on Social Media and Beyond

Richard Reisman, Chris Riley / Sep 22, 2022

Richard Reisman is a nonresident senior fellow at Lincoln Network; Chris Riley is senior fellow for internet governance at the R Street Institute.

Churchill walks through the ruins of Coventry Cathedral, 1942. Wikimedia Commons

“No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time…”

-- Winston Churchill, 11 November 1947

Today, the futures of both democracy and the internet lie at a crossroads. These futures are intertwined in complex and critical ways, with each threatened from within, and by one another. While debates over the extent of the internet’s contributions to democracy’s degradation will continue for many years to come, this piece seeks to look beyond this concern to examine where we might be going, and to argue for a new direction.

This is the fourth in a series of pieces that consider (1) the democratic nature of the internet and (2) the internet’s deep impact on democracy itself. The prior articles in the series include:

  1. Delegation, Or, The Twenty Nine Words That The Internet Forgot.

This article explores why this emphasis on user control is far more important than generally recognized, and how an architecture designed to make high levels of user control manageable can enhance the nuance, context, balance, and value in human discourse that current social media are tragically degrading. It contends that the more power to control the receipt of information that is delegated to the user level, the healthier the overall discourse will be.

  2. Understanding Social Media: An Increasingly Reflexive Extension of Humanity

Modern media tools increasingly do more than merely reflect the world they present – they shape it, such that content and context are inextricably interwoven in a reflexive chain of collaborative transformation. We need the power to shape the information tools we each use to be delegated by us to intermediaries we choose, because that devolves the reflexive power to shape what we see of the world for ourselves.

  3. Community and Content Moderation in the Digital Public Hypersquare

How and by whom should this new power to shape public discourse be managed? The best path forward is to recognize and reinforce the critical role of interconnection within the hypersquare of digital public spaces, and build into that information ecosystem an open and evolving infrastructure of tools and services for community empowerment and deliberation. That can bring new power to the generative processes of social intelligence and adaptive cooperation that historically fueled the progress and survival of healthy societies in an ever-changing world. That means the architecture has to be structured so that we each have more agency to select the communities we choose to participate in and follow.

The overall agenda behind this series is to theorize and advocate for the creation of a new architecture: a market-based ecosystem of delegated intermediaries that operate in between end users and the services and platforms of social media, as well as the broader information ecosystem. The motivation is that in governance and society – and in the shaping of tools to support both – a broadly democratic collective intelligence is essential to dealing with diverse needs and values in an environment of increasingly dynamic stressors and existential threats.

Technology: Part of the Problem, Part of the Solution

This goal comes at a tumultuous time. The internet, and social media in particular, is often cast as detrimental to social cohesion and democracy (even if opinions vary widely as to its degree of responsibility). Authoritarian intervention in online speech is on the rise, and not just in countries historically associated with internet repression; the annual Freedom on the Net report from Freedom House concluded that global internet freedom has declined for the 11th consecutive year. Public confidence in the superiority of democracy over authoritarianism seems dangerously weakened, at least in some corners. Nations with competing forms of governance – including Russia and China – seek to capitalize on these trends, using the internet as their vehicle.

Certainly, as Eric Schnurer articulates in a thorough analysis, democracy is being disrupted in important ways by the effects of the internet, and its future shape may look quite different. Divya Siddarth frames existing forms of democracy as an unfinished form of human collective intelligence, and surveys ways they might be beneficially transformed by new mechanisms, technologies, and systems for augmentation.

Among the evolving conversations around internet governance, there is increasing advocacy for decentralization as a part, but not all, of a corrective to that future. But what kind and degree of decentralization would really be beneficial? Charley Johnson observes that decentralized technology does not translate directly to decentralized power. Further, Jonathan Zittrain describes the potential for greater change through greater community governance, unlocked by more distributed architectures. Technical solutions, including experiments with Web3 and DAOs (Decentralized Autonomous Organizations, based on blockchains), have so far disappointed as alternatives to what their proponents see as the brittle and antiquated technologies that underpin existing models of democracy. In all of these debates, the key question is how to distribute and decentralize control in an effective and pro-democratic way.

One of the mantras of this series is the idea that “we shape our tools and thereafter our tools shape us” (a quote indirectly tied to Winston Churchill). The very personal nature of today’s tools leads inevitably to the conclusion that democracy requires strong individual participation in determining how the process of obtaining information (digital and social media technology in the broadest sense) works for each person. That, in turn, helps the tools themselves to be honed in democratic ways, and thus, in a virtuous circle, to strengthen democracy writ large.

One element that will help preserve democracy and the internet is to delegate the navigation of the overwhelming modern information ecosystem to intermediaries that are empowered both technically and legally to act on behalf of an individual, and that are held in check through both market and regulatory forces. This proposed structure is both a metaphor for representative democracy, in that individuals can choose among options for agent representatives to carry their preferences forward collectively, and at the same time a tool to improve real-world representative democracy, by improving the long-term outlook for applying social wisdom to manage misinformation and mitigate what the RAND Corporation calls “truth decay.” It also serves as a restoration of the critical role, traditionally held by a diversity of communities and institutions, in informally mediating how citizens inform themselves, which is a function that has been disintermediated by today’s social media platforms with no suitable replacement. Delegated intermediaries could step in to absorb and share that responsibility.

In the longer term, these tools may lead toward new forms of “digital democracy” as a “digital transformation” of representative democracy (recognizing the need to overcome hurdles for any chance of feasibility, such as today’s active work on election security). Sebastian Berg and Jeanette Hofmann provide a rich survey of thinking that pulls in this direction, suggesting “[n]ew forms of digital engagement that go hand in hand with organizational reforms … re-intermediating established democratic settings in open-ended ways that defy linear narratives of demise or renewal.” They point to the complex and often paradoxical relationship between governments, platforms, and citizens, with blurring boundaries and struggles over foundational principles.

As corporate and regulatory actors contend with digital accountability and responsibility, user agency – as realized through delegation to intermediaries – can be a powerful lever for citizen choice guided by social wisdom, and can help obviate any call for excessive paternalism. This is true both in the near term and farther into the future, as such a structure could lay the groundwork for a far deeper digital transformation of democracy that effectively represents a diversity of networked publics joined together in humanity’s collective struggle with increasingly dynamic and severe challenges.

Modern democracy is in disarray, but there may be light at the end of democracy’s dark tunnel.

Since at least January 6, 2021, American news headlines have been fraught with a new level of concern for the future of the nation. The United States is not alone among democracies in harboring such concerns. Authoritarians are gaining votes in elections throughout Europe, while the world’s largest democracy, India, is on a worrisome trajectory according to several measures.

Massive divisions within various societies have fractured the ideal of democracy as a system that functions on orderly, objective, fact-based discourse and the balancing of values and interests, which results in broadly clear and accepted outcomes. In a world of extreme polarization and “alternative facts,” the workings of democracy become more challenging. Agonistic democracy theory embraces the modern dynamic of contestation, exercised through politics and power. This does not, of course, extend to unhealthy conflict driven by intentional falsehoods, incitement to violence, and those seeking to exclude people from democratic processes entirely. But it does embrace that the outcome of democratic processes need not be consensus or the clear emergence of a single objective shared decision in all circumstances, thus embracing an “opposition” that is “loyal” to the common welfare.

Conflict is part of democracy, and will continue to be, and democracy is not inherently worse off as a result. Rather, just as democracy is weakened by the prevalence of unhealthy conflict, so too it is weakened by attempts to suppress healthy conflict that is agonistic, rather than antagonistic, whether sourced in doctrinaire extremes of liberalism—a form of conflict that regards opposing views as irrational and conflict as temporary, pending a demonstration of the one true way—or authoritarianism, the forced suppression of alternative views. Conflict over truth and value in human society is inevitable, especially in an age of rapid change that only promises to accelerate. Some argue that more paternal—some might say more principled, others authoritarian—governance is needed to deal with these stressors, but robust and healthy democratic processes are arguably the most adaptable, and therefore ensuring they work effectively is more important than ever.

Through this lens, disinformation is harmful to democracy in part because it is a technique to subvert or suppress honest conflicts of beliefs or values held in good faith, and the unknowing spread of misinformation is little better. Additionally, the internet is a powerful tool to conduct smear campaigns intended to suppress and demonize the speech and perspectives of the out group. The rise of direct, citizen engagement in politics shows the power of individual and collective action but creates as a corollary a fair amount of both misinformation and active attempts at speech suppression online – all now compounded by the reflexivity of social media.

At the same time, more engagement with politics demonstrably increases “political efficacy,” a measure of the perceived impact of individual action on political processes. Importantly, this increase applies as much to online engagement as to offline, reinforcing that, despite the seeming shallowness of “clicktivism,” online political action is a powerful force. This reflects the behavioral economic principle that “hedonic utility” is important to human welfare and motivation: “not only what, but also how matters.” The “how” of efficacy is that government must be perceived as not only representative of, but also responsive to, its publics. As these publics become networked with powerful social media tools, intra- and extra-governmental forms of pluralistic representation can complement one another to create an emergent, contingent order. That order derives strength from healthy agonisms that perceive worthy opponents or adversaries in contestation, while discouraging unhealthy antagonisms that perceive evil enemies at war.

This is a quandary: public confidence in government is down, democratic institutions appear weakened, and while public engagement can improve confidence, it simultaneously spreads conflict and anti-democratic suppression of conflict. Certainly, individuals can help by accepting a moral responsibility to improve the information ecosystem through self-education and critical thought. But that isn’t a systemic response, and it is therefore unlikely to match the effects of technology without systemic support.

Within democratic governments, accepting the role of politics and power as a part of agonistic democracy, rather than a fundamental tension to it, can provide a roadmap for reshaping the administrative state to better foster democratic legitimacy. And there’s no time like the present, considering the breakdown in legal liberalism as a theory of vindicating rights and truth through the court system. For decades, advocates for justice pursued compelling interpretations of the law in the belief that a smart and strategic argument would be sufficiently powerful to rule the day; but such assumptions are no longer reliable.

Delegation can help restore trust in the internet and, maybe, in democracy.

The internet can have a positive role to play in the health of future democracy, given the increasingly central role of digital platforms in the information ecosystem. In particular, recommender systems, long developed and studied by computer scientists, are a new area of focus for social science and public policy intervention (where they are more often labeled bluntly as “algorithms,” creating a mystique that distracts from understanding the fundamentals of how and why they make recommendations). Most scholarship and public policy attention focuses on harm and potential mitigation. While mitigating harm is important, it doesn’t constitute a pathway to use the internet and recommender services that draw on and refine human judgment to improve democracy. Some examples of a more positive focus include work by Aviv Ovadya, Jonathan Stray and co-author Reisman.

Although aimed at issues of competition and antitrust, Frank Pasquale’s comparison of “Jeffersonians” and “Hamiltonians” illustrates two popular visions of change for the status quo of the internet. The Jeffersonian view calls for a greater dispersal of power and influence, illustrated best by Senator Elizabeth Warren’s (D-MA) call to “break up big tech.” In contrast, the Hamiltonian view (to which Pasquale attributes thinkers as distinct from one another as Peter Thiel, Rob Atkinson, and Evgeny Morozov) recognizes the value of centralized data and looks to regulatory intervention rather than structural change.

The theory of delegation can be seen as a third option in this landscape; a perhaps Madisonian middle between Jefferson and Hamilton that builds and draws on public dialog, deliberation, and enlightenment. While its overall value has been highlighted in prior posts in this series, delegation has a particularly strong relationship to democracy and strong potential for improving democratic outcomes both for and through the internet. This tie goes even further than the ability of delegation to mitigate the harms arising from reflexivity online; delegation creates new structures complementing and augmenting existing ones (and replacing those mediators that are being lost), and with these new structures come new levers of power and influence that can harness collective will to create continuous improvements in responsibility.

Delegation provides a lever for cutting through the centralization versus decentralization debates noted above, which are more usefully thought of not as binary poles, but as patterns of distribution. Distribution of control is an issue for both systems of democracy and of code (and the distinction is blurring). Pasquale ties this to Hayek’s “knowledge problem,” which argues for bias toward local control because that is where ties between knowledge and stake are most direct. This is fundamental to a clearer understanding of how power is best distributed. Divya Siddarth, Danielle Allen, and E. Glen Weyl point to the need to “focus on the degree rather than type, of decentralization,” in terms of the principle of “subsidiarity” and “composable local control,” which they say focuses on:

1. Keeping data as close as possible to the social context of creation.

2. A plurality of solutions linked and integrated through coordinated mechanisms of federation and interoperability.

3. Leveraging and extending relationships of online and offline trust and institutions.

They argue the way to achieve the best distribution of control (in contrast to the DAO model) is “subsidiarity, not redundancy–a network of networks, not a ledger.” Johnson argues in the same vein (quoting Sarah Jamie Lewis) that what matters is “how trust and power are given, distributed and interact.”

Separating out the common core connectivity layers of platforms (including allowing them to reach the natural network-effect benefits of centralization) from the more user-specific layer of recommender systems is step one. That then creates room for a new range of intermediaries, to whom users would delegate the function of information selection and recommendation that serves each of them by helping to manage the otherwise overwhelming flood that is the modern information ecosystem. The new choices and values that this delegation then unlocks reinforce and expand the inherent nature of the internet as a digital public hypersquare composed of a diversity of communities and institutions. (Similar issues apply to delegated intermediaries for managing user data for user and communal benefit, as those, too, are important, but less urgently central to preserving democracy.)

Creating this diversity of spaces allows for greater diversity of approaches to recommendation. Some experiments are occurring, such as one showing that group moderation can help with fighting disinformation by working at a more granular and local level (an example of subsidiarity). Delegation greatly expands the landscape of opportunity for experimentation and innovation in recommendation. Because the delegated intermediaries that rank the items that go into a user’s feed would be managed separately from underlying platforms, users would choose a suite of intermediaries freely, without removal of potentially valuable information located at the platform layer. The result is a harnessing of powerful market incentives to identify and reward the most successful intermediaries, even while recognizing that “success” in this context is a complex concept – and that regulatory oversight of this ecosystem will still be needed.

Market forces and other levers of influence will shape delegated intermediaries to help improve the information ecosystem.

Operating as delegated intermediaries, recommender services would face a range of market forces encouraging steady and continuous improvements in responsibility. To begin, users freely choosing intermediaries is a natural market force, albeit one that can by itself reward irresponsibility in some ways. But the landscape of intermediary activity, and thus the potential space for competition for user interest, is broad. A wide range of network-level, cross-community moderation remedies are available that are less “overbroad and underinclusive” than removal (whether based on government or “platform law”), and intermediaries experimenting with these tools at greater scale than what can be done at entire platforms creates room for interesting growth and healthy user development. Eric Goldman provides a detailed taxonomy of “expanded non-removal” content moderation remedies that avoid those pitfalls, Ellen Goodman expands on underlying principles of fidelity and friction for “empowering autonomous individual choice,” and co-author Reisman earlier suggested sophisticated strategies for making filter bubbles more permeable – open to cross-fertilization for both serendipity and, more pointedly, what Cass Sunstein called “surprising validators” that might open minds to alternative views. Of course more extreme remedies might be used to isolate dangerous communities when broad revulsion leads key providers of lower-level network infrastructure to unilaterally cut essential services, but such bulk removal actions risk being overbroad and may become a slippery slope toward abuse.

Regulatory intervention will bring pressure for greater responsibility as well, notably through the Digital Services Act, which will take effect in the European Union in the coming months. Here, shifting to a more diffuse landscape of intervention may help by reducing the costs and complexities of investigating the practices of a smaller service, one focused solely on the recommendation function and not the broader platform responsibilities. Greater complexity in the market risks making identification and attribution of violation more difficult, but this can perhaps be offset through transparency measures. And the lower cost of prosecuting a violating company, once identified, can increase the efficacy of law enforcement bodies with fewer resources, such as U.S. state attorneys general (in a future, still uncertain, where the United States adopts comparable law) or the individual digital services regulators of EU member states. While malicious coordinated influence campaigns will seek to jump across community mediation boundaries, such crossings can be tracked, analyzed, and responded to in real time with friction and other cross-community countermeasures that are narrowly targeted and independently overseen.

U.S. law and regulatory action also favor strong reliance on user choice, as noted in the first article of this series. It seems generally forgotten that Section 230 states (in a non-controversial preamble) “it is the policy of the United States … to encourage the development of technologies which maximize user control over what information is received by individuals … who use the Internet …” In the same spirit are the “delegatability” provisions of the Senate ACCESS Act, reintroduced this May, which would operationalize user rights of choice.

Additional levers encouraging responsibility come from the critical community that surrounds the tech industry. This field of advocacy and research can organize substantial pressure, even absent hard law, to bring about fine-grained changes to policy and practice. This works best where the desired outcome has broad public support, and where the obstacle to change is the inherent myopia of large institutions rather than willful resistance. The Check My Ads Institute, for example, has had great success in pressing companies to stop supporting media personalities and outlets that spread the election falsehoods that contributed to the January 6 insurrection. Over time, any rogue intermediary recommendation engine with sufficient adoption to cause harm will be identified and called out by the critical community.

This lens of harnessing market forces and public engagement to encourage better outcomes is in contrast to calls for de-privatization or the establishment of parallel, public-centered internet services and stacks. While non-commercial services have the potential to add value alongside commercial offerings, just as public broadcasting has its place in the broadcast channel lineup, they will exist as part of future markets and a complement to private sector offerings, not as a replacement for them. Thus they, too, will contribute to market forces by representing alternatives that are in many ways better by virtue of being more public, and being less susceptible to claims of government overreach.

Paternalism does not make for healthy democracy in the way structured agency can.

A frequent response to the ideas presented thus far in this series is, to put it bluntly, a deep cynicism regarding the wisdom of empowering individuals with greater agency over their experience of the information environment. It’s certainly true that human rationality is limited, and that the average internet user gravitates toward content that reinforces their worldview (and with a preference for content with a strong emotional charge), so that, with no check on these tendencies, the future of human society could devolve towards Idiocracy.

But mandating from the top down a singular and non-inclusive vision of goodness is not sustainable, as it will reinforce feelings of discontent and distrust in information gatekeepers and democratic institutions, and further undermine the internet’s “political efficacy,” the belief that user engagement and participation will result in a better outcome for users. In a future with delegated intermediaries facilitating the expression of user agency, there would be limited need for top-down control. The additional levers of influence described above are powerful tools to balance individual will and choice with overarching societal responsibilities.

This objection also discounts the diversity of delegation services that can emerge (like the traditional diversity of publishers, websites, and institutions), and the ability of each user to compose many such services into a rich tapestry of recommended feed inputs. Some delegated services may be harmful, but others will expose users to broader perspectives, even if only by random variation, reducing the need for a heavy hand.

There will always be outliers, troubled individuals who seek out content and community in the darkest and most destructive corners of the internet. But, providing options for meaningful choice, coupled with a steady increase in public education and awareness of digital citizenship, will, on balance, result in a healthier future that augments our collective human wisdom and channels it to nurture productive agonistic visions and counter destructive antagonistic ones. The reason democratic societies place high value on freedom of expression and assembly is that limiting those freedoms limits the ability of societies to learn and adapt to changing environments. Slavery abolitionists were once viewed by the powers that be as destructive.

Bringing this back to markets and subsidiarity, the fundamental issue, again, is one of balance and who gets to decide. Top-down optimization has theoretical appeal to efficiency, but that is constrained by limitations in knowledge and by diverse contexts and values that require local, bottom-up inputs based on the richness of direct human judgments. Markets do not stay fair without oversight, but they excel at solving the knowledge problem and the diversity of values problem in a way that top-down control does not. The reinvigoration of user choice from an open market in intermediaries would make our media tools—and thus our democracy—more open and responsive to a diverse and evolving blend of cosmopolitan, local, and cultural perspectives, which may at times converge, while retaining the adaptive potential of agonism.

Conclusion

Democracy and the internet today face challenges that are not only intersecting and compounding, but also fundamentally similar. Many see the power structures and processes of each as rigged by misinformation and manipulation, with too little room for individual engagement or agency to do much good. In the face of black box online recommendation engines and widespread allegations (no matter how poorly founded) of fraud in the democratic system, public confidence is challenged.

Technology and the business models that drive it are currently on a path that accelerates this trend. Artificial intelligence, everything associated with Web3, and the metaverse all offer narratives of complex technology beyond the individual’s ability to create and control. The future of elections may follow as well; both the U.S. Presidency and the balance of representation in the European Parliament will be at stake in 2024, and these democratic cycles are certain to be the target of substantial disinformation campaigns and, at least in the United States, preemptive allegations of fraud and misbehavior.

Yet, the decay of truth and democracy can be mitigated in part through a more effective information ecosystem. While it can’t be healed by 2024, there is time to put both the internet and democracy on a path towards greater health and trust. Unlocking more agency in managing the filtering and recommendation layer that sits between individuals and the information ecosystem, and then facilitating the delegation of that agency to a diversity of empowered and effective intermediaries that help distribute control of the information ecosystem in ways that are both democratic and flexible, will help reverse that decay.

Delegation can provide a powerful foundational mechanism to help support the healthy pluralistic dialog needed to feed a more inclusive and effective representative democracy. That can enable democracy to remain the least “worst” system of government – at a time when that is more important than ever.

– – –

This is the fourth in a series of related essays by Reisman and Riley in Tech Policy Press:

  1. Delegation, Or, The Twenty Nine Words That The Internet Forgot
  2. Understanding Social Media: An Increasingly Reflexive Extension of Humanity
  3. Community and Content Moderation in the Digital Public Hypersquare
  4. Contending for Democracy on Social Media and Beyond

Authors

Richard Reisman
Richard Reisman (@rreisman) is a non-resident senior fellow at the Foundation for American Innovation, contributing author to the Centre for International Governance Innovation’s Freedom of Thought Project, and a frequent contributor to Tech Policy Press. He is on the team that convened the symposiu...
Chris Riley
Chris Riley is Executive Director of the Data Transfer Initiative and a Distinguished Research Fellow at the University of Pennsylvania’s Annenberg Public Policy Center. Previously, he was a senior fellow for internet governance at the R Street Institute. He has worked on tech policy in D.C. and San...
