Now What? Fixing Facebook to Protect Us All

Nora Benavidez, Carmen Scurato / Oct 26, 2021

Nora Benavidez is a senior counsel and the director of digital justice and civil rights at Free Press Action, where Carmen Scurato is an associate legal director and senior counsel.

Since Frances Haugen’s congressional testimony, dozens of news outlets have published exposés based on the documents she brought forward, detailing Facebook’s knowledge of its role in fomenting the violence of Jan. 6 and the many ways it has sidestepped safety to allow disinformation, hate, and human rights abuses to spread across the globe.

Facebook’s moderation and enforcement decisions consistently treat civil and human rights as secondary concerns. What’s worse, the company’s own research confirms that it knew these threats to the safety of its users and others were serious problems, but it did nothing to stop them. Instead, the company engaged in a serial cover-up, misleading the public and regulators around the world.

The reality is worse for non-English Facebook users and countries outside the United States. Safety measures are not equitably distributed across all the countries and languages that Facebook operates in. Internal documents show that Facebook takes action on only 0.6 percent of violent and inciting content across its platform; meanwhile, calls for violence and hate have led to genocide in places like Ethiopia and Myanmar.

Facebook executives knew all of this, and made decisions that repeatedly undermined efforts to reduce misinformation and hate across the platform. According to Bloomberg, Facebook knew which users were “serial hate-speech offenders” who also more regularly shared misinformation, yet 99 percent of them remain active on the platform. The company also established XCheck, a program giving special treatment to powerful users and exempting them from enforcement actions at the expense of safety. This failure to enforce means toxic and incendiary content gets amplified, and events like January 6 show how the spread of lies and incitement to violence can play a role in a direct and immediate assault on our democracy.

Facebook has shown again and again that it doesn’t have the will to fix itself. While it has blanketed Washington, D.C. with ads calling on lawmakers to pass new platform regulations, as The Markup detailed last month, many of the “regulations” Facebook claims to be open to are things the company already does or is already compelled to do under the European Union’s General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA). But what would meaningful government oversight and regulation look like for Facebook, and for other social media platforms such as Twitter and YouTube?

First, we must center people, including the people most harmed, in the regulatory conversation. Facebook users, real people across the globe, are often sidelined in more abstract policy debates about the First Amendment rights of platforms. Let’s bring back into focus the ways we can move forward to protect and empower people: Congress should pass a comprehensive data-privacy law with civil-rights protections. The law should limit the collection of personal data and put an end to abusive practices that fuel algorithmic discrimination. Privacy legislation should also mandate regular transparency reports as a mechanism for oversight and enforcement, as well as enable external researcher access.

Fortunately, the Algorithmic Justice and Online Platform Transparency Act, put forward by Sen. Ed Markey and Rep. Doris Matsui, would prohibit algorithmic processes on online platforms that discriminate on the basis of race, age, gender, ability and other protected characteristics, and would require online platforms to publish annual public reports detailing their content-moderation practices.

Second, we must consider solutions to our broken and inequitable digital news ecosystem, which leaves online users with few high-quality, diverse news sources to counter what they passively consume on Facebook or other social media. Congress should create a tax on online advertising and direct the revenue to support high-quality noncommercial and local journalism. A 2-percent tax on the targeted-advertising revenues of the top-10 online platforms would yield more than $2 billion for a national endowment to support journalism that meets the needs of communities around the country. Local news can be a powerful antidote to the spread of disinformation. To fully combat the problems of disinformation, hate and other malign practices online, we must fund local journalism.

Finally, there are existing authorities that our federal agencies should leverage now to regulate the collection of personal data and investigate discriminatory practices on social media platforms. This includes the Federal Trade Commission’s (FTC) authority to investigate and take enforcement action against harms caused by abusive commercial data practices. The FTC also has the power to launch a rulemaking proceeding that would help create a public record of data abuses, minimize data collection, and hold companies liable for discriminatory algorithmic practices. At Free Press, we recently sent a letter pushing for more funding for the agency to do just this. Other existing agencies should take up these issues as well, but let’s start with the FTC.

The next 12 months are critical for our democracy. We will witness a redistricting process that will influence electoral maps for years to come, a midterm election that will influence the balance of power in Washington, an ongoing pandemic perpetuated by lies about vaccines and masking, and a fractured media failing to cover these inflection points in ways that empower communities with the news and information they need.

Leaders in Congress and the Biden administration must not rest on the past month of bombshell investigations. The time is now for policy leaders to socialize and refine ideas for a digital ecosystem that serves us all.
