Evaluating Alternatives to Content Removal on Social Media Platforms

Justin Hendrix / Apr 13, 2022
Image: Report on the zoological collections made in the Indo-Pacific Ocean during the voyage of H.M.S. 'Alert' 1881-2 (Pl. VII) / Wikimedia Commons

What to do about the various genera and species of problematic content in the sometimes sinister yet always fertile jungles of social media is a question that has occupied tech, media, and political elites for years. Yet it might be argued that in the past few months we have reached the end of the beginning of the struggle with these questions. One reason is that lawmakers around the world are variously considering, and in some cases imposing, stricter rules on what can and should be done with such content, marking the start of a new phase. Another is that the options for 'what can be done' are better theorized than in the past.

For that we can thank people like Eric Goldman, Associate Dean for Research, Professor of Law, and Co-Director of the High Tech Law Institute at the Santa Clara University School of Law, who is widely regarded as one of the most eminent scholars studying the evolving field of content moderation. In the Michigan Technology Law Review, Goldman has published an article that "describes dozens of remedies that Internet services have actually imposed," providing "a normative framework to help Internet services and regulators navigate these remedial options to address the many difficult tradeoffs involved in content moderation."

I'm pleased to share the final published version of my article "Content Moderation Remedies," a project I started over 3 years ago https://t.co/k5OW7aMb5H The article examines dozens of alternatives to content removals pic.twitter.com/d3YmQiAmB4

— Eric Goldman (he/him) (@ericgoldman) April 13, 2022

Noting the myriad ways in which a binary approach to content removal (leave it up or take it down, the "paramount solution for violative content or actions") is codified in laws around the world as well as in principles such as the Manila Principles and the Santa Clara Principles, Goldman seeks to push past the binary to arrive at a "taxonomy of remedy options." In this taxonomy he identifies five categories:

(1) actions against individual content items;

(2) actions against an online account;

(3) actions to reduce the visibility of violations, which can be implemented against individual content items or an entire account;

(4) actions to impose financial consequences for violations, which also can be implemented against individual content items or an entire account; and

(5) a miscellaneous category for actions that do not fit into the other categories.

For instance, the category of "actions against individual content items," or 'content regulation,' contains "eight remedies," including removal, suspension, relocation, editing or redacting, appending warnings, adding counterspeech, disabling comments, and so on. The category on visibility includes remedies such as the infamous "shadowban," removal from search indices, limits on promotion of the content, and the introduction of other forms of friction to stop or slow its spread. On platforms that permit users to monetize content or conduct other transactions, financial penalties can be imposed. And all of these interventions can be combined in creative ways.
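To make the combinability point concrete, here is a minimal, hypothetical sketch of how a platform's trust and safety system might represent this taxonomy internally. Every name in it (RemedyCategory, Remedy, ModerationDecision) is an illustrative invention, not anything drawn from Goldman's article or from any real platform's API.

```python
# Illustrative sketch only: one possible internal representation of
# Goldman's five remedy categories. All names are hypothetical.
from dataclasses import dataclass, field
from enum import Enum, auto

class RemedyCategory(Enum):
    CONTENT_ACTION = auto()         # (1) actions against individual content items
    ACCOUNT_ACTION = auto()         # (2) actions against an online account
    VISIBILITY_REDUCTION = auto()   # (3) reduce visibility of content or account
    FINANCIAL_CONSEQUENCE = auto()  # (4) demonetization, withheld payouts, etc.
    OTHER = auto()                  # (5) miscellaneous (e.g., law enforcement referral)

@dataclass
class Remedy:
    category: RemedyCategory
    name: str     # e.g., "remove", "append warning", "exclude from search"
    target: str   # "content" or "account"

@dataclass
class ModerationDecision:
    # A decision holds a *list* of remedies, not a single keep/remove boolean.
    remedies: list[Remedy] = field(default_factory=list)

# Combining remedies: label the post and reduce its reach, without removal.
decision = ModerationDecision(remedies=[
    Remedy(RemedyCategory.CONTENT_ACTION, "append warning label", "content"),
    Remedy(RemedyCategory.VISIBILITY_REDUCTION, "exclude from search index", "content"),
])
```

The only point of the sketch is structural: once a moderation decision is a list of remedies rather than a boolean, the combinations Goldman catalogs become straightforward to express.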

Goldman also notes other, perhaps more intense options that are available, such as reporting content to law enforcement (required by some governments, particularly in the case of child sexual abuse material) and more creative interventions based on notions of restorative justice.

Goldman worries that as regulators try their hand at new schemes to standardize remedies, they may "eliminate (intentionally or not) the possibility of diverse remedial schemes" that have developed organically. And, he notes, there are technological and design challenges to overcome; regulation runs the risk of being miscalibrated if it does not take these into account. Goldman also considers the implications of moderation transparency regimes, many of which are under consideration, including in the United States Congress, noting that "regulators should consider how increased remedy options may affect any mandated transparency obligations, including the understandability of the reports and the production costs" of such information.

Ultimately, Goldman concludes, "[c]ontent moderation is hard. It is not possible to moderate content in a way that pleases everyone." Goldman is concerned that if governments make a hash of it by imposing poorly considered regulations in what he regards as the "current regulatory maelstrom," they may in fact kill off the more interesting flora and fauna of content moderation remedies that have evolved in the wild.

Read the full article here.
