Study of social media, collective behavior should be a “crisis discipline,” researchers say

Justin Hendrix / Jun 17, 2021

Social media, messaging apps and other digital communications technologies restructure the ways in which information flows, and thus how humans interact with one another, how they make sense of the world, and how they come to consensus on how to deal with problems.

Now, more than a dozen researchers at multiple universities who study technology, behavior and complex systems believe the study of the impact of communications technology on collective behavior should be regarded as a "crisis discipline," noting that "the vulnerability of these systems to misinformation and disinformation poses a dire threat to health, peace, global climate and more." They call on researchers and social media executives to take a Hippocratic oath and pledge first to do no harm to humanity.

"Our social adaptations evolved in the context of small hunter-gatherer groups solving local problems through vocalizations and gestures," write the researchers in a paper, Stewardship of global collective behavior, set to appear on the Proceedings of the National Academy of Sciences (PNAS). "Now we face complex challenges from pandemics to climate change- and we communicate on dispersed networks connected by digital technologies and social media."

Citing challenges such as vaccine refusal, election-related misinformation, racism and violent extremism, the paper says that "the structure of our social networks and the patterns of information that flow through them are directed by engineering decisions made to maximize profitability," and that the social changes technology has contributed to are "drastic, opaque, effectively unregulated, and massive in scale."

“We have built and adopted technology that alters behavior at global scales without a theory of what will happen or a coherent strategy for reducing harm,” said Joseph B. Bak-Coleman, a postdoctoral researcher at the University of Washington’s Center for an Informed Public and the lead author of the paper.

Drawing parallels to other complex collective behaviors observed in nature, including schools of fish, ant colonies, flocking birds, and other animals, the paper focuses on four specific themes at the intersection of communications technologies and global collective behavior:

1. Increased scale of human social networks: The scale of human connection has grown rapidly in the span of a few short decades, a sliver of the 12,000-year history of modern humans during which our institutions have evolved. "Expanding the scale of a collectively behaving system by eight orders of magnitude is certain to have functional consequences," say the researchers. "In short, changes in scale alone have the potential to alter a group’s ability to make accurate decisions, reach a clear majority, and cooperate."

2. Changes in network structure: "For most of our evolutionary past, Homo sapiens may have maintained meaningful social contacts with, at most, hundreds of others and often far fewer," note the researchers. Now, new network structures are possible. This can have positive implications, such as connections between "individuals that do not fit in their local communities because of their beliefs and preferences," but it can also contribute to harmful phenomena: "echo chambers and polarization, eroded trust in government, world-wide spread of local economic instabilities, global consequences of local electorate decisions, difficulty coordinating responses to pandemics," and others.

3. Information fidelity and correlation: "Early human communication was largely biological (e.g. vocalizations, gestures, speech), relatively slow, and inherently noisy, allowing information to mutate and degrade as it moved throughout a network," the researchers write. Noise, latency and decay of information signals are known to be important in other information systems: "evidence from fish schools revealed that noise and decay are important for preventing the spread of false alarms." Today's rapid and global information flows may "overwhelm cognitive processes and yield less accurate decisions," eliminate "barriers that may previously have functioned as filters on the type of information that is shared," and "alter and define power relationships," among a variety of other implications.

4. Algorithmic feedback: Observing multiple features of algorithmic systems and their application, the researchers conclude that "we are offloading our evolved information-foraging processes onto algorithms. But these algorithms are typically designed to maximize profitability, with often insufficient incentive to promote an informed, just, healthy, and sustainable society." Indeed, "Given that algorithms and companies are already altering our global patterns of behavior for financial reasons, there is no safe hands-off approach."

Given the implications of these changes introduced by technology, the researchers suggest that understanding the impacts of communications technology on collective behavior should be regarded as a crisis discipline and "join the ranks of other crisis disciplines such as medicine, conservation biology, and climate science." This effort will require a variety of disciplines, including computational social science, technology studies, economics, law, public policy, systemic risk, and international relations.

The researchers suggest there should be a "Hippocratic oath for anyone studying or intervening into collective behavior, whether from within academia or from within social media companies and other tech firms. Decisions that impact the structure of society should not be guided by voices of individual stakeholders but instead by values such as non-maleficence, benevolence, autonomy and justice."

“We urgently need to understand this and move forward with focus on developing social systems that promote well-being instead of creating shareholder value by commandeering our collective attention,” said co-author Carl T. Bergstrom, a UW professor of biology and a faculty member at the Center for an Informed Public, in a statement.

"Inaction on the part of scientists and regulators will hand the reins of our collective behavior to a small number of individuals at for-profit companies," conclude the authors.

Additional co-authors on the paper include Rachel Moran at the UW; Mark Alfano at Delft University of Technology and Australian Catholic University; Wolfram Barfuss at University of Tübingen; Miguel A. Centeno, Andrew S. Gersick, Daniel I. Rubenstein and Elke U. Weber at Princeton University; Iain D. Couzin at University of Konstanz; Jonathan F. Donges at Stockholm University; Mirta Galesic and Albert B. Kao at Santa Fe Institute; Pawel Romanczuk at Humboldt Universität zu Berlin; Kaia J. Tombak at Hunter College of the City University of New York; and Jay J. Van Bavel and Jennifer Jacquet at New York University.
