
How Social Media Intensifies U.S. Polarization - and What Can Be Done About It

Justin Hendrix / Sep 13, 2021

As a House select committee investigates the causes of the January 6 Capitol insurrection, requesting a variety of data from technology firms that propagated false claims about the election and whose products were used to facilitate the attack, the NYU Stern Center for Business and Human Rights is releasing a new report finding that social media platforms have played an important role in exacerbating polarization and its consequences, including democratic erosion and political violence. The report was authored by Paul Barrett, the Center's deputy director; NYU research fellow Grant Sims; and me.

The report, Fueling the Fire: How Social Media Intensifies U.S. Political Polarization—And What Can Be Done About It, also makes recommendations about what the tech industry and government can do to address the problem.

Based on a review of social science research and more than 40 interviews with scholars, industry insiders, experts and activists, the report finds that, while rising levels of polarization in the U.S. predate and are not mainly caused by social media, the platforms have played an instrumental role in exacerbating the trend in recent years. Absent significant reforms by the federal government and the social media companies themselves, the platforms will continue to contribute to some of the worst consequences of polarization. These include declining trust in institutions; scorn for facts; legislative dysfunction; erosion of democratic norms; and, ultimately, real-world violence, such as the January 6 insurrection.


The report acknowledges that not all polarization is necessarily bad: certainly, campaigns for social and racial justice are often polarizing. But the fact that some polarization is unavoidable, or even productive, does not mean we should be indifferent to social media's effect on division more generally.

The report also acknowledges and explores evidence that key problems in U.S. politics (the decline in trust of fellow citizens and important institutions; the rejection of shared facts and the promotion of falsehoods; legislative dysfunction; the erosion of democratic norms; and, ultimately, radicalization and violent extremism) are expressed asymmetrically on the political right. It notes that racial animus is a key driver of polarization on the political right, citing research including a recent paper from Lilliana Mason, Julie Wronski, and John Kane.

To neutralize social media's role in exacerbating polarization and its consequences, the report offers a set of recommendations to government and to the platforms. It calls on the White House to bring the issue to the fore by means of a bipartisan blue-ribbon commission or some other high-visibility vehicle.

For the government, recommendations include:

Mandating more disclosure about the inner workings of the platforms.

“We do not know even what we do not know concerning a host of pathologies attributed to social media and digital communication technologies,” Nathaniel Persily, a law professor at Stanford, wrote recently. Congress and the Biden administration should require that Facebook, Twitter, YouTube, and other platforms share certain data on their ranking, recommendation, and removal algorithms for researchers and regulators to use.

Empowering the Federal Trade Commission to create and enforce new industry standards.

The FTC’s oversight of social media needs to go much further than data disclosure. We urge Congress to pass legislation authorizing the agency to collaborate with social media companies and other stakeholders to create standards for industry conduct that would be enforceable by the government. The FTC should draft a new set of rules that define the level of reasonable care expected of the social media companies in addressing hateful, extremist, or threatening content online.

Investing in alternative social media platforms.

Congress should provide funding through federal agencies to develop new, pro-democratic social media platforms. Given the market dominance of current incumbents, public support is necessary to nurture alternatives. One worthy idea is the development of “public service digital media,” as scholars such as Ethan Zuckerman at the University of Massachusetts, Amherst have proposed. Another comes from members of a Stanford working group that advocates a dramatic overhaul of existing social media platforms. They propose separating the basic social networks that billions of people have joined from the algorithmic functions of ranking and moderating content.

Investigating social media’s role in the events of January 6.

The House select committee should dedicate substantial resources to investigating how social media was used to plan the Capitol insurrection and how to make similar events less likely in the future. The committee has a crucial opportunity to shed light on the consequences of partisan hatred and on how the interaction among social media, hyper-partisan news media, political leaders, and protesters motivated the violence of January 6.

For social media companies, recommendations include:

Changing algorithms to stop rewarding inflammatory content.

Social media users’ eagerness to see their posts go viral leads to the spread of extreme, divisive content and what has come to be called “performative politics.” A number of researchers, including Jaime Settle of William & Mary, Jonathan Haidt of NYU, and José Marichal of California Lutheran University, argue that social media companies ought to remove or downplay platform features that contribute to polarizing online performances: for example, by hiding “like” and share counts on posts, de-emphasizing extremist political content in news feeds, and promoting authoritative sources.

Investing more in content moderation.

The platforms should double the number of human content moderators who police harmful content online, and bring them in-house, rather than outsourcing this critical work to third-party contractors. A larger moderator corps would also allow supervisors to rotate assignments more frequently so that reviewers exposed to the most disturbing content could switch to less brutal material.

Partnering with civil society groups to fight disinformation.

Social media companies should deepen their partnerships with nonprofit organizations dedicated to identifying and combating false and harmful content online, especially around issues like elections and COVID-19. But the companies must carefully assess the agendas and relative capabilities of those offering assistance. Many Palestinians, for example, have pointed out that they have experienced an unjustified degree of censorship on social media because the Israeli government has a proficient cyber unit that flags large quantities of allegedly hateful and violent Palestinian content. The Palestinians lack comparable capacity, creating an asymmetry in that conflict.

Making depolarizing adjustments more transparent.

If the platforms do undertake more substantial moderation efforts and other interventions aimed at reducing polarization, they must do so transparently. Transparency is the only way to counter suspicions that such measures are designed to manipulate politics or otherwise exert illegitimate influence. Platforms must be much more open about what they’re doing, how they’re doing it, and what content might potentially get blocked in the process.

Experimenting with depolarization interventions.

Facebook and others should use their massive user bases and analytical data to experiment with algorithmic interventions that encourage civility and may depolarize users. Jonathan Stray, a researcher at UC Berkeley's Center for Human-Compatible AI, theorizes that platforms could create metrics to track polarization and then respond with algorithmic adjustments designed to elevate the terms of online conflict and thus ease partisan hatred.

- - -

Finally, the report urges a reckoning with the current political reality in the United States. From political violence to the erosion of voting rights in dozens of states, there are profound problems ahead. “This is a do-or-die moment for American democracy,” Hakeem Jefferson, a political scientist at Stanford, told me in a recent interview.

While social media companies cannot rescue the United States from itself, they must reform their practices to neutralize the harm they cause to democracy. Government will have to play a role; the firms cannot be trusted to self-regulate. All parties must act with urgency.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
