Facebook, It Isn’t That Complicated
Jordan Kraemer / Sep 20, 2024

Facebook has notched up many notable accomplishments since its birth 20 years ago. But becoming the social media platform where Americans are harassed most is hopefully the one its founder and CEO Mark Zuckerberg is least proud of. According to ADL's fifth annual survey of online hate and harassment, Facebook and Instagram (now both part of Meta) remain the platforms where the most harassment occurs. Among American teens who were harassed online, nearly two-thirds experienced harassment on Facebook (61%), the platform that has made Zuckerberg one of the richest and most influential people in the world.
Probably few of Facebook's more than three billion monthly users worldwide are aware that the platform is a major purveyor of hate and harassment. The reason could be that the platform's "dark side" is not visible to all its users. Facebook has become more integrated into community life across the US than other platforms, emerging as an increasingly important place for local communities to gather. Especially since the pandemic, parent groups, swap groups, neighborhood discussion groups, and town pages have all become popular venues for vigorous discussion. And some of these groups and pages are open only to members.
Researchers at the ADL Center for Technology and Society recently found that a number of Facebook groups or pages in local communities have become toxic sites of harassment, particularly identity-based harassment against Jews, women, LGBTQ+ advocates, immigrants, and people of color. Targets of local harassment suffer in ways unlike targets of other forms of online harassment, such as trolling campaigns; they often interact with their harassers in everyday life, leading to emotional harm and potentially to physical violence and withdrawal from civic life.
Yet, when these incidents of hate and harassment—including very explicit antisemitism—were reported to Facebook, the platform routinely replied that the content didn’t violate community guidelines. In one instance, members of a neighborhood group in suburban New Jersey allegedly concerned with overdevelopment complained that Orthodox Jews were using zoning laws to evade taxes, and they characterized Jewish support at a town hall hearing as “disgusting.” It took much more explicit hate—and ultimately the community enlisting a state district attorney—for Facebook to remove an earlier iteration of this group. The hate didn’t only take place online; the same community experienced multiple violent incidents, including antisemitic vandalism and assaults.
Few members of the public are likely to have read Meta's statement of "vision and mission" that includes this commitment: "Our technologies connect people all around the world, creating opportunities and giving a voice to billions. We’re taking action to keep our platforms safe and inclusive for everyone.”
Meta should and could do much more to make good on this commitment. Its decision to prohibit uses of the term “Zionist” as a slur, ensuring it isn’t used as an antisemitic synonym or to incite violence, is a move in the right direction. But the Meta Oversight Board’s recent decision that “from the river to the sea” is not antisemitic disregards the experience of Jewish users who, since October 7th, have been harassed with that phrase on Meta's platforms.
To protect Jews and other marginalized groups more fully, the company should reinvest in Trust and Safety efforts (both human and automated) after heavily downsizing staff and resources last year. It should also use network analysis to evaluate the context of abuse reports from local groups, not just the content of individual posts or comments, and adopt ADL’s recommendations for abuse reporting, including real-time support for targets of severe harassment, batch reporting, and flagging related activity.
In addition, an internal audit would allow Meta to evaluate its antisemitism moderation practices, ensuring staff can recognize shifts in the language and tactics of hate and enforce Meta’s rules. An independent review by external experts would further communicate Meta’s willingness to curb antisemitic hate, if the company provides full access to internal data and implements the recommendations. Meta must follow through on these steps to make social media safe and equitable for local communities across the United States and beyond.