Senators hear testimony on social media and harmful content

Justin Hendrix / Oct 28, 2021

In a hearing in the U.S. Senate Committee on Homeland Security & Governmental Affairs titled "Social Media Platforms and the Amplification of Domestic Extremism & Other Harmful Content," experts on social media implored Senators to mandate transparency from social media firms and to introduce reforms that can contain some of the harms and externalities they produce.

Chaired by Senator Gary Peters (D-MI), the hearing featured panelists including:

  • The Honorable Karen Kornbluh*, Director, Digital Innovation and Democracy Initiative and Senior Fellow, The German Marshall Fund of the United States, Opening Testimony
  • David L. Sifry, Vice President, Center for Technology and Society, Anti-Defamation League, Opening Testimony
  • Cathy O'Neil, Ph.D., Chief Executive Officer, O'Neil Risk Consulting & Algorithmic Auditing, Opening Testimony
  • Nathaniel Persily, Ph.D., Co-Director, Stanford Cyber Policy Center and James B. McClatchy Professor of Law, Stanford Law School, Opening Testimony
  • Mary Anne Franks, D.Phil., Professor of Law and Michael R. Klein Distinguished Scholar Chair, University of Miami, Opening Testimony

The experts focused on the need for independent researchers to have access to platform data, as well as the necessity of holding tech firms accountable to their own terms of service.

"Facebook and the other Silicon Valley Platforms have lost their right to secrecy," said Persily. "We need national transparency legislation that will allow researchers, other than those tied to the profit-maximizing mission of the firms, to get access to the data that will shed light on the most pressing questions related to the effects of social media on society."

Persily has put forward draft legislation that would create such a mechanism for researchers to access platform data.

Franks pointed to the necessity of reform given the potential harms of future technologies now in development at the major tech firms. Citing emerging technologies such as artificial intelligence and virtual and augmented reality, she said that tech firms have adopted a common practice to "aggressively push new, untested, and potentially dangerous products into the public realm and worry about the consequences later, if at all."

Senators raised free speech concerns, as well as worries that efforts to contain disinformation may be seen as partisan. Senator Rob Portman (R-OH) put the question directly: "How do you figure out what is speech that is peaceful expression of points of view that we should be encouraging, and what is content that should be filtered in some way?"

"For the First Amendment, freedom of speech, freedom of association-- it's extremely important that the government not be in the business of deciding what's true and what's not true," responded Kornbluh. "That's why some of these revelations from the whistleblower are so important, because she focuses us upstream of the content, at the mechanics of the platform, and how it's driving this content just to service itself, and to service its ad revenues. And if we focus on design elements and we focus on transparency especially, that furthers First Amendment concerns, it furthers freedom of speech and association."

One of the more notable exchanges occurred between Franks and Senator Ron Johnson (R-WI). Senator Johnson argued that tech firms favor liberal interests and that content moderation practices at social media companies are "violating people's constitutional rights." He said he had personally convened people who expressed concerns about vaccines and who had participated in a group that was removed by Facebook.

"Yes, we do have a First Amendment, we do have a right to free speech, but we also know of course that private companies are not obligated to take all comers," replied Franks. "They are allowed to make their own decisions about what is considered to be high quality or low quality content, they can make any number of decisions. And I think that we would applaud them in many cases for making those decisions."

In a hearing in the Senate Committee on Homeland Security & Gov Affairs today, @ma_franks had a sharp response to Sen. Ron Johnson, who pushed the idea that social media moderation favors the left. She noted the asymmetry of the problem on the right as it relates to extremism. pic.twitter.com/nsPrxMplMV

— Justin Hendrix (@justinhendrix) October 28, 2021

She pointed to non-consensual pornography as an example of a category of content that it is appropriate for major platforms to reject. On the question of bias, Franks rejected the notion that the platforms favor left wing ideologies, and pointed to evidence that right wing extremism is an asymmetric problem on social media.

"The data actually do indicate that right wing content is more amplified on these social media platforms than left wing content, and that right wing content is disproportionately associated with real world violence-- not hurt feelings, not people being upset, but in fact actual violence, actual armed insurrections, actual notions of terrorism and anarchy."

A question from Senator Alex Padilla (D-CA) touched on the problem of disinformation in non-English language communities and in countries abroad.

"It's a cost issue, on top of the fact that the filters for hateful or extreme content are essentially keyword searches," said O'Neil. Pointing to the complexity of these problems, she suggested that the companies, to date, simply are not willing to invest in the human resources necessary to address these issues at scale. "You need a lot of experts working full time on this, and they simply won't pay for that."

Concerns about engagement- and surveillance-advertising-driven business models were also top of mind. Multiple questions and responses implicated the relationship between platform business models and the amplification of extremism and harmful content.

"This won't change until there is a clear shift in the incentive systems that they use to be able to do this business, and that's where Congress must act," said Sifry.

*Kornbluh serves on the Tech Policy Press masthead.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
