Meta’s Oversight Board Gives Hate a Pass
Dia Kayyali / Apr 28, 2025
Dia Kayyali is a fellow at Tech Policy Press and previously served as a Senior Case and Policy Officer at the Oversight Board from January 2023 to June 2024.

NEW YORK, NEW YORK—March 31, 2025: Person with a sign at an event on International Trans Day of Visibility in Manhattan. Shutterstock
A majority of Meta’s Oversight Board just made it clear that it will not take incitement to violence against transgender people seriously. Despite criticizing Meta’s recent policy changes that explicitly allow hate speech against trans people, a majority of the Board, in a split decision, upheld Meta’s decision to leave up two videos shared by Libs of TikTok. The account is perhaps the most influential anti-trans account on major social media platforms right now, and it has been linked to threats of violence. The videos in question depict identifiable transgender people: in one, a woman is harassed in a university bathroom; in the other, a girl wins a track race. The captions of both videos use the phrase “think he’s a girl.”
The Board was unable to reach a consensus on the case, so the decision includes portions attributed to a majority and a minority of the Board. The majority affirmed Meta’s decision to allow the videos to stay up. The case summary says “the majority of the Board found there was not enough of a link between restricting these posts and preventing harm to transgender people, with neither creating a likely or imminent risk of incitement to violence.” The decision at least recommends that Meta assess how the policy changes could impact the rights of LGBTQIA+ people, and that the platform should “adopt measures to prevent and/or mitigate these risks” and provide public updates. Only the minority opinion acknowledges the reams of evidence that this kind of content is quite clearly inciting violence against trans people.
The Board delayed its decision by nearly half a year. That’s perhaps because, as the Washington Post reported, while Meta’s head of global policy and former Republican operative Joel Kaplan and its former president of global affairs, Nick Clegg, “stopped short of telling the board how to rule, they said the cases were particularly sensitive given the fraught political debate about the rights of trans people in the United States.” When Meta founder and CEO Mark Zuckerberg announced Meta’s current “Hateful Conduct” policy on January 7, he made it clear where the platform stands in that debate, explaining that Meta would be collaborating with President Donald Trump to “fight censorship.”
The current policy allows calls for the exclusion of transgender people, as well as claims that they are mentally ill or predisposed to sexual violence. It also uses the term “transgenderism,” which as GLAAD explains “inaccurately and harmfully impl[ies] that being trans is a political ideology, rather than an authentic aspect of one’s personhood.” Previously, the policy prohibited denying the existence of transgender people or calling for the exclusion of people based on gender identity. It’s worth noting that the decision revealed a disturbing fact: “prior to January 7, there were exceptions under Meta’s internal guidance (not available publicly) to specifically allow calls for gender-based exclusion from sporting activities or specific sports, as well as from bathrooms.” Even with that internal, unpublished guidance already in place, the changes created a platform that is explicitly welcoming to incitement to violence against transgender people.
The Board was correct in finding that both videos would be allowed under these changes, but the majority and minority disagreed on how the content would have been treated under the bullying and harassment policy, the old hate speech policy—and how it should be treated under international human rights law.
One area of disagreement between the majority and minority was whether the teen athlete should be regarded as a “public figure”; the majority agreed with Meta that she was. Public figures receive less protection from bullying and harassment, which was key to the decision about the teen. The Board disagrees with Meta’s policy of applying public figure status to any athlete who receives more than a certain number of press mentions, but the majority then goes on to claim that this trans teen should have known “that her participation in this level of competition would attract attention because of her transgender identity.” Read another way: any child who belongs to a protected characteristic group that is currently the subject of political debate, and who attracts attention because of that identity, is not protected from bullying or harassment. As the minority points out, “This circular cruelty is not in the best interests of the child.”
It can’t be stated clearly enough that the most disturbing thing about the decision is the majority’s insistence that this content does not rise to the level of incitement to violence or discrimination. The Board chose not to reveal that it was Libs of TikTok that had posted the content. However, in reporting on the decision, the Washington Post made it clear that the account posted these videos. The Post article says that the owner of the account, Chaya Raichik, claims she is not responsible for threats against gender-affirming care providers. But the evidence is hard to ignore. As I stated in my public comment to the Board, Raichik:
...recently stated that "she’s proud of being called a stochastic terrorist — someone who inspires supporters to commit violence by demonizing a person or group." Media Matters documented "at least 48 instances of threats or harassment" against individuals and institutions targeted by Libs of TikTok posts. In 2022, after the account falsely claimed on Twitter that Boston Children’s Hospital performs hysterectomies on children, the hospital received a barrage of harassment including threats of violence. Later, when a woman was charged with making a bomb threat, both her lawyer and Boston Children's Hospital argued that she was directly influenced by Libs of TikTok while in a vulnerable mental state. Similarly, in May 2022, "FBI agents arrested a California man who had threatened to kill a staff member of a Wisconsin school district that was shamed by Libs of TikTok." And particularly relevant to this case, the day after the Libs of TikTok account reposted a video of a physical assault that allegedly took place in the women's bathroom at a high school in upstate New York, and claimed that the perpetrator was "a male student who identifies as a girl," the school received bomb threats.
The majority in this decision justifies the presence of these videos on Meta platforms by claiming that while the “pitched tenor” of current discussions about trans people “elevates risks for transgender people, it does not follow that posts discussing related policy issues, even when using coarse or insensitive language, will themselves incite discrimination or violence.” The majority also claims that removing the content from Meta will simply cause it to go elsewhere, without acknowledging the breadth and depth of Meta platforms’ reach. Nor does the majority address any of the research about Libs of TikTok.
Despite how disastrous this decision is, it does at least include recommendations. It recommends that Meta conduct human rights due diligence on the January 7 policy changes, take measures to mitigate risks and monitor their effectiveness, and publicly report on this due diligence. The decision also calls on Meta to remove the word “transgenderism” from the policy, although as GLAAD’s statement on the decision points out, it is ironic that “the issued decision permits two posts containing anti-trans content, while notifying Meta that it must also remove [the anti-trans term ‘transgenderism’] it added to its hate speech policy in January.” Whether Meta will actually respond to the Board’s recommendation to assess the impact of those changes... well, don’t hold your breath.
In 2021, before my time as a Case and Policy Officer at the Oversight Board, Jillian York and I wrote that the Board was making good decisions, but that if Meta did not implement those decisions, the Board should consider disbanding. Nearly five years later, my perspective on where the issues lie has shifted. Meta has made it clear that it is aligned with an authoritarian leader and his political movement, and it has indeed failed to implement some of the Board’s most important and impactful recommendations. But the Board itself is part of the problem. It has laid off staff, recommended leaving up content that is blatantly linked to offline harm, and produced an increasingly contradictory body of international human rights law analysis. Perhaps most importantly, it is clear the Board is more than willing to bend under political pressure.
In this case, the Board has bent towards transphobia.