
Evading Scrutiny, YouTube Is Slow To Change Its Policies

Justin Hendrix / Jun 13, 2022

Among experts who study social media platforms and their policies, it is understood that despite its massive, global scale, YouTube evades scrutiny compared to other major tech firms, from both journalists and academic researchers. evelyn douek, a scholar at Harvard Law School and an affiliate of the Knight First Amendment Institute at Columbia University, has referred to the company’s ability to fly under the radar as YouTube “magic dust.”

There is evidence that this relative lack of scrutiny results in lethargy in how the company’s policies evolve in response to demonstrable harms. A new report on YouTube from the NYU Stern Center for Business and Human Rights, authored by the Center’s Deputy Director, Paul Barrett, and me, considers this dynamic and its impact.

The main reason YouTube receives less scrutiny is that it is more difficult and expensive to analyze a large volume of videos than it is to search for words or phrases in a text data set of Facebook or Twitter posts. Budget-conscious academic researchers, as well as watchdogs in nonprofit and non-governmental organizations, weigh the feasibility of competing projects; dissecting video costs far more in human hours and computing resources.

Over time, this results in less research into YouTube. In the words of Kate Starbird, a researcher at the University of Washington, YouTube is “almost inscrutable,” despite the fact that it “fits centrally” into phenomena such as the spread of disinformation. “We can see pieces here and there,” says Starbird, “but we can't systematically collect large scale data from YouTube to use in our research.”

For instance, a keyword search of the May 2022 International Communication Association (ICA) global conference, which brings together hundreds of scholars who study media and technology, bears out this discrepancy. Across hundreds of research presentations, YouTube features in roughly half as many (40) as Facebook (77) or Twitter (87). One researcher, presenting the results of a study focused on election disinformation, explained that the results centered on Facebook and Twitter because the team was only analyzing text.

Fewer research insights mean less journalistic scrutiny, which in turn means fewer policy changes. At the same ICA conference, Dr. Nahema Marchal, a researcher studying digital media, politics and democracy at the University of Zurich, presented preliminary findings of a project analyzing the extent to which news media coverage of social media platforms drove changes to those platforms’ policies. The results she presented suggest a relationship between journalistic scrutiny and policy changes at Facebook, Twitter and YouTube. Of the three platforms, however, YouTube received the least coverage and implemented the fewest policy changes (36), compared with Facebook (126) and Twitter (90).

Super innovative work tracking platform policy changes in response to journalism coverage presented by @nahema_marchal #ica22 pic.twitter.com/OEnPMGQuVf

— Daniel Kreiss (@kreissdaniel) May 30, 2022

What’s more, less is known about YouTube’s policies and practices when it comes to content moderation in general. YouTube’s parent company, Google, says that it has more than 20,000 people around the world working on content moderation, but it declines to specify how many do hands-on review of YouTube videos. In an interview for the NYU report, Brendan Nyhan, a political scientist at Dartmouth College, told us that YouTube has managed to keep enforcement of its content rules “totally opaque.”

In response to the NYU report, YouTube contested this description, saying: “We use a variety of ways to convey our efforts to the public, from top executives having tough conversations with the press to blog posts providing overviews on how YouTube works.”

But clearly, it could do more, both to disclose information about how the platform works and to facilitate greater access to the data researchers need to study YouTube. Such scrutiny, while it may be uncomfortable for the company and its executives, should help YouTube evolve its policies in response to real-world problems, ultimately mitigating the risk of more profound controversy.

To do more thorough empirical studies, researchers have told us that they need more information via YouTube’s API, or application programming interface. Currently, YouTube provides access only to content and metadata that are available at the time researchers connect to the API. To understand how a given video gained attention over time, they need access to historic data as well.
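To make that limitation concrete, consider what collecting YouTube data looks like today. The sketch below is an illustration only, not a workflow recommended by YouTube: it polls the public YouTube Data API v3 for a single video’s current statistics, because the API offers no way to ask what those statistics were at an earlier point in time. The YT_API_KEY environment variable and the video ID are hypothetical placeholders.

```python
# Minimal sketch (not YouTube's recommended workflow): the public YouTube Data
# API v3 returns only a video's statistics as they stand right now, so building
# a view-count history means polling on a schedule and storing snapshots yourself.
import os
import time

import requests

API_URL = "https://www.googleapis.com/youtube/v3/videos"
API_KEY = os.environ["YT_API_KEY"]   # hypothetical environment variable holding a real API key
VIDEO_ID = "dQw4w9WgXcQ"             # illustrative video ID

def snapshot(video_id: str) -> dict:
    """Fetch the video's current statistics (view, like, and comment counts)."""
    resp = requests.get(
        API_URL,
        params={"part": "statistics", "id": video_id, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    return items[0]["statistics"] if items else {}

# Poll hourly; the API offers no way to ask what the statistics were last week.
history = []
for _ in range(3):  # shortened loop for the sketch
    history.append({"timestamp": time.time(), **snapshot(VIDEO_ID)})
    time.sleep(3600)
```

Each response is only a point-in-time snapshot; unless polling happened to begin before a video took off, the record of how it gained attention is unrecoverable, which is the gap that access to historic data would fill.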

Social scientists tell us that another potentially fruitful feature would be the ability for researchers to retrieve random samples of YouTube content, a need that arises when a given search query would otherwise return an unmanageable number of results. Random sampling would allow researchers to make inferences from a subset of data and would better support the tracking of important trends; the sketch below illustrates the impractical workaround available today. Other needed disclosures go beyond data access via the API: at present, YouTube doesn’t systematically disclose when it makes policy changes that may bear on the availability of problematic content.
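To see why a native random-sample feature matters, here is the crude alternative researchers face without one: rejection sampling over randomly generated, ID-shaped strings, checking each candidate against the same public videos endpoint used above. This is a hypothetical illustration, not an endorsed method; valid IDs make up only a tiny fraction of the 11-character ID space, so nearly all probes miss, and each probe still consumes API quota.

```python
# Illustration only: rejection sampling over randomly generated, ID-shaped strings.
# Valid IDs are a tiny fraction of the 11-character ID space, so almost every probe
# misses; the point is to show why a native random-sample endpoint would matter.
import os
import random
import string

import requests

API_URL = "https://www.googleapis.com/youtube/v3/videos"
API_KEY = os.environ["YT_API_KEY"]  # hypothetical environment variable, as in the sketch above

ID_ALPHABET = string.ascii_letters + string.digits + "-_"  # base64url-style characters

def random_candidate_id() -> str:
    """Generate a random 11-character string shaped like a YouTube video ID."""
    return "".join(random.choices(ID_ALPHABET, k=11))

def exists(video_id: str) -> bool:
    """Ask the API whether this ID corresponds to a retrievable public video."""
    resp = requests.get(
        API_URL,
        params={"part": "id", "id": video_id, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    return len(resp.json().get("items", [])) > 0

# Probe a batch of random candidates; expect essentially all of them to miss.
hits = [vid for vid in (random_candidate_id() for _ in range(1000)) if exists(vid)]
print(f"valid videos found: {len(hits)} out of 1000 probes")
```

A sampling capability provided by YouTube itself would give researchers the same statistical footing without this waste.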

Of course, YouTube’s hand may soon be forced. In Europe, the Digital Services Act (DSA) will introduce requirements for social media platforms to provide researchers with access to platform data. A recent working group created a roadmap for providing that access in a way that is compatible with Europe’s privacy regulation. And in the U.S., legislative proposals such as the Platform Accountability and Transparency Act would do the same.

Social media companies are not the sole cause of democratic backsliding, eroding trust in commonly held facts, and surging nativist resentment. Other forces—including economic globalization, heightened inequality, racism, hyper-partisan cable television, and a decades-long trend toward extreme political polarization—are also roiling the U.S. and many other societies. What social media companies have done is act as a powerful accelerant, pouring fuel on the flames and amplifying users’ worst instincts.

In this fraught environment, YouTube, like other influential platforms, has an obligation to do more to counter exploitation of its prodigious capacity to spread harmful content. More scrutiny will help it meet that obligation.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
