Jeeyun (Sophia) Baik is a Postdoctoral Researcher at the University of California Berkeley Center for Long-Term Cybersecurity; Hamsini Sridharan is a Communication Ph.D. student at the University of Southern California.
The civil rights team at Meta (formerly Facebook) recently published a progress report responding to an independent civil rights audit of the platform’s practices between 2018 and 2020. The company wrote that the report marked the beginning of a “path to enhance protections for marginalized communities” and emphasized Meta’s commitment “to move toward increased equity, safety and dignity” on its platforms. These statements invoke the values of civil rights, which are meant to protect marginalized groups by guaranteeing equal opportunities and protections to everyone regardless of characteristics such as race, national origin, age, sex, religion, or disability.
Three days later, the Washington Post revealed that internal research conducted by Facebook in 2019—which demonstrated that hate speech on the platform disproportionately targeted minorities, especially people of color—was not shared with the civil rights auditors and was only partially addressed by company leadership.
This incident illustrates the questionable efficacy of self-regulation in addressing civil rights issues on digital platforms, and it validates long-standing concerns held by civil society about how genuinely committed Meta’s leadership is to upholding the civil rights of its users. Meta repeatedly prioritizes profits over user safety, as civil rights activists have argued for years, and as whistleblowers such as Sophie Zhang and Frances Haugen have verified. The company’s global scale and ever-expanding vision of building a “metaverse,” which will introduce new problems, make it hard to imagine meaningful accountability through an audit that has no binding force.
The Facebook civil rights audit was pushed by a coalition of civil and digital rights organizations that had criticized Facebook for numerous harms, including the doxxing of Black activists by White nationalists, surveillance of Black users, foreign electoral interference, and discriminatory ads. The audit was aimed at identifying civil rights problems on the platform, evaluating corporate practices in the issue areas identified, and reporting recommendations and any changes made.
However, our analysis of the audit, combined with interviews with civil society stakeholders, suggests that Meta cannot be relied on to protect minorities from discrimination and hate on its products without meaningful external accountability.
Facebook was not the first digital platform to undergo a civil rights audit; in fact, Airbnb conducted the first-ever civil rights audit in the tech industry, which ran from 2016 through 2019. At the time, Airbnb faced increasing public scrutiny due to anti-Black discrimination on its platform, which gave rise to the viral social media hashtag #AirbnbWhileBlack. In response, the company brought on civil rights leader Laura Murphy (who would later lead the Facebook audit) to audit its practices.
The civil society advocates we spoke with perceived the Airbnb and Facebook audits differently. In our research, we spoke with Murphy, as well as numerous civil rights and digital rights organizations involved in the process, including but not limited to Color of Change, the Brennan Center for Justice, and MediaJustice. Members of these organizations perceived the experience with Airbnb relatively positively, while criticizing Facebook. Why?
The difference between the two audits lies in their scope, scale, and the commitment of leadership. Compared to Airbnb, a business focused on housing and accommodations—a narrow scope connected to a clearly defined area of civil rights law—Facebook’s issue areas are wide-ranging, from disinformation and content moderation concerns related to elections, the census, and public health to algorithmic bias and advertising. With limited resources and access to proprietary data or algorithms, the auditors could not holistically scrutinize the full scope of Facebook’s impact. Moreover, advocates found that in their negotiations with Facebook, the company deployed a strategy of “listening” to their concerns and sought to defuse criticism while largely avoiding substantive change.
Worse, when Facebook initiated its civil rights audit in 2018, it also started an anti-conservative bias audit, positioning the civil rights audit as a partisan (liberal) political concern. Our interviewees criticized this “balance sheet” approach as undermining the legitimacy of civil rights concerns. One advocate noted to us about a meeting with Mark Zuckerberg, “He doesn’t understand civil rights. […] he literally sat in the meeting…with our coalition partners, like ‘thank you so much for explaining the nuances of what is hate or what’s white supremacy.’ And people were like, ‘They’re not nuances. You’re treating this like an academic exercise. These are our lives.’”
Also, while neither the Airbnb nor the Facebook audit covered non-U.S. contexts, Facebook’s audit did not even scrutinize the company’s own subsidiaries, such as Instagram and WhatsApp. Considering that civil rights harms are prevalent on Instagram and WhatsApp, including impacts on teen girls’ mental health and human trafficking that affect users worldwide, anti-discrimination efforts should account for platform operations at scale and across all products.
Insofar as the civil rights audit remains self-regulatory, its success ultimately depends on leadership commitment. Facebook has repeatedly shown its lack of commitment to civil rights. Even after hiring Laura Murphy as an auditor, Facebook was found in late 2018 to be paying a PR firm to conduct opposition research on civil rights groups such as Color of Change. When Facebook executives decided not to take action on Trump’s racist and incendiary posts in 2020, the resulting tension with civil society culminated in the #StopHateForProfit ad boycott against Facebook. The auditors also raised their concerns in the final audit report, writing, “these political speech exemptions […] call into question the company’s priorities.”
The same doubts and concerns over leadership commitment continue even as Facebook rebrands as Meta. The recent progress report is already attracting criticism, as the new civil rights team includes just nine people, compared to the 10,000 workers the company is reportedly recruiting to build the metaverse.
What can be done to hold Facebook and other companies accountable? One useful step might be a law that requires large tech companies to undergo independent civil rights audits on a regular basis, similar to how financial audits are legally mandated. There are a growing number of industry-led shareholder initiatives that seek to address social justice concerns as part of ESG (environmental, social, and governance) efforts. Civil society members are also offering additional frameworks for companies to carry out civil rights or racial equity audits. But without a legal mandate, the public cannot fully look under the hood of how these companies are making decisions that have civil rights implications.
A law requiring regular civil rights audits would also ensure that all companies are held to the same standard. Airbnb and Facebook agreed to conduct civil rights audits in response to public pressure, but many of the largest companies have not bothered, including Google, Apple, Amazon, and Twitter. Reining in Big Tech has become a major civil rights issue of the 21st century. If we let these companies continue to play fast and loose with civil rights, they will repeatedly discriminate against and endanger the most vulnerable groups in society, online and off.