
House Committee Explores Social Media Research and Data Transparency

Justin Hendrix / Sep 28, 2021

Referring to COVID-19 vaccine disinformation, House Committee on Science, Space & Technology Chairwoman Rep. Eddie Bernice Johnson (D-TX30) proclaimed that “we must not leave the black box of social media disinformation unexamined. Navigating the difficulties and extending access to data will not be easy, but failing to do so will have devastating consequences.”

So began a hearing in the Subcommittee on Investigations and Oversight on researcher access to social media data. The tone of the virtual hearing was serious but notably collegial. The Chairman of the Subcommittee, Rep. Bill Foster (D-IL11), is himself a PhD physicist. Before being elected to Congress, he worked at Fermilab, the United States Department of Energy laboratory that conducts experiments in high-energy particle physics, where he was on the team that discovered the top quark. And the Ranking Member, Rep. Jay Obernolte (R-CA8), holds a master’s degree in computer science and a doctorate in public administration.

In his opening remarks, Rep. Obernolte raised his concern about how to balance freedom of expression with “the effort to combat the spread of misinformation,” pointing to the competing theories on the origin of COVID-19 as an example of “how these two ideas are in tension.” “Figuring out as a society how to balance those two competing interests is critical, because on the one hand, as recent events have shown, we all have a vested interest in trying to figure out how to stop the spread of misinformation, but on the other hand history has shown us repeatedly that if we allow censorship to take the place of misinformation that will take us down a very dark path as a society.” Rep. Obernolte said platforms such as Facebook and Twitter “need a seat at the table” to bring their expertise to the problem, in addition to the independent researchers before the committee.

Laura Edelson, a PhD Candidate at NYU Tandon School of Engineering, referred to the need for impression data.

“As researchers, we don’t want to just come to the conclusion that ‘misinformation is very engaging.’ We would also like to understand how we could stop that, how we could design systems to make misinformation less engaging, and in order to do that one of the things we really need is impression data. This is something that would be really, really crucial to actually getting to solutions, and it’s something that Facebook doesn’t make available through CrowdTangle.”

She referenced a proposed technical standard, which she will soon publish, describing what data Facebook should make available about its advertising.

Rep. Obernolte asked Edelson about the notion of a safe harbor for researchers and companies to share data. Edelson referenced a proposal from the Knight First Amendment Institute to create a safe harbor for researchers working with social media data.

Rep. Obernolte pressed his question about how to strike a balance on stopping the spread of misinformation “without suppressing free speech.” “You can’t yell ‘fire’ in a crowded theater, that’s now recognized as something that’s not an infringement on free speech because of its potential to cause harm,” he said. He suggested understanding the intent of a user will be important to that solution.

In reply, Dr. Kevin Leicht, Professor, University of Illinois Urbana-Champaign Department of Sociology, referred to methods to introduce “cognitive interference” into social media systems, such as “more effective labels.” “There will be some types of misinformation that it is simply not in the public interest to control, or necessarily stop the spread of, and others that it is more vital for public health or public safety,” he said.

Edelson pointed to research she has done showing that misinformation outperforms factual content across the political spectrum. She said it should be possible to address the problem by building systems where “user interactions is not the driving force of what content is promoted.”

Rep. Sean Casten (D-IL6), also a scientist, brought up recent reports in the Wall Street Journal about how Facebook favors influencers, including some that spread misinformation. Edelson said the fact that some influencers have more latitude on the site than most users is a major problem. Rep. Casten noted, “I think we are all fond of the framing that ‘freedom of speech’ and ‘freedom of reach’ are two separate things.”

Mentioning he has been “rather persuaded by Roger McNamee in his writing,” Rep. Casten also queried whether users should have more agency or ownership over their own data.

“If we were to wave a wand tomorrow, and change the premise such that everyone owned their own data, that they could opt in to sharing that data and the metadata around their data, so that they truly had portability, so that they could still say ‘I actually find it useful that this device knows where I am and where I want to go and can have all the automated…' if we were to do all that, does that change the environment that you would have where essentially we would have to get permission for the data from the public rather than from the companies?”

Dr. Alan Mislove, Professor and Interim Dean, Khoury College of Computer Sciences, Northeastern University, responded that what Casten described “is essentially democratizing your ownership of your data…. One way you could move towards that is to give users legal rights over the data that these companies already have on them. So for example, Facebook allows you to extract your data from the site, but there are many things they don’t provide you,” he said, noting that if users could get access to their complete data they could consent to provide it to researchers at will.

The researchers agreed it is in the public’s interest to make far more data available to independent researchers.

“Tobacco companies don’t get to decide who does research on smoking. The idea that social media companies get to decide who studies them is perverse,” said Edelson.

Dr. Mislove pointed out that studying the effects of social media algorithms is particularly difficult. “Having a regime where Congress would require all data to be released to be able to be studied would allow us to tease out both malicious actors as well as the role of the platform itself,” he said.

Rep. Obernolte queried whether the business model of current social media platforms should be outlawed.

“You’re right, there is probably an inherent systemic problem with platforms whose business model is built around maximizing user engagement,” said Edelson. “Some of these risks and harms are particularly acute for the youngest users,” she noted.

Dr. Mislove said one way to make the business model more “benign” would be to “allow more competition” over users and user attention. He said he thinks the abuse of users would be “reduced somewhat” if the companies weren’t quite so large.
