A Perspective on Meta's Moderation of Palestinian Voices

Justin Hendrix / May 26, 2024

A conversation with Marwa Fatafta, who serves as MENA policy and advocacy director for the nonprofit Access Now, which has worked on digital civil rights, connectivity, and censorship issues for the past 15 years. Along with other groups, Access Now has engaged Meta in recent months over what it says is the “systematic censorship of Palestinian voices” amidst the Israel-Hamas war in Gaza.

What follows is a lightly edited transcript of the discussion. Some links have been placed in the text to help the reader navigate to relevant documents.

Marwa Fatafta:

My name is Marwa Fatafta. I work as the MENA Policy and Advocacy Director at Access Now.

Justin Hendrix:

Marwa, I'm glad that you can join me today. We're going to talk through some of the activism that you've been doing recently and some of the interactions in particular that you've had with Meta and its senior executives. I want to ask though, just to start, what has it been like to do your job in these last several months? Clearly, you've been up to this for a bit, and these issues have been at the fore for quite some time, but things have changed dramatically on the ground.

Marwa Fatafta:

Thank you, Justin. How would I describe the last seven months? It's been a roller coaster, quite honestly, and a frustrating journey as well, working with companies on issues pertaining to Palestine, to Israel-Palestine. I have been working on content moderation issues in Palestine, where I'm from, for almost a decade. And I know from experience that whenever violence escalates on the ground, we will see companies clamping down on Palestinian or Palestine-related content. It's as predictable as the sun rising and setting. October 7th took everyone by surprise, but the companies' reactions were not surprising: they have been disproportionately removing Palestinian content or Palestine-related content.

And I emphasize “Palestine-related” because we're not just talking about content coming from Palestine per se. The war in Gaza has triggered conversations around the world; it's felt in different spheres, and naturally people who are engaged on the issue are quite active online, sharing information and accessing information. And to that end, people are being impacted everywhere. So we have been meeting quite regularly with companies, especially in the very beginning, just to say, look, this is what we see. We've been escalating cases. We've also been demanding transparency and answers to some of the actions we've seen, for example, comments with a Palestinian flag being removed, or journalists being suspended. We've had photos of injured and dead people from Gaza removed on Instagram because the platform deemed them to be pictures of nude people, taking them down under its nudity and sexual activity policy, things like that.

But most importantly, because it's a structural issue, because it's been going on for quite a long time, we're talking about a systematic pattern of censorship. Access Now and many other organizations have been pointing Meta specifically to those patterns, and then there was, as you know, the BSR report on the events of May 2021, which showed clearly that Meta over-moderates Palestine-related content, especially in the Arabic language, while it under-moderates hate speech. And that's a whole other problem that we should absolutely discuss today. It's been frustrating because you feel that the company is, I don't know, taking a swim at the beach. They're super relaxed. They don't see the urgency, and they're not willing to address the issues at a structural level.

They are only happy to engage with civil society when we escalate a single case or a few cases, as opposed to really understanding that this is a big moment. We're talking about genocide and internet shutdowns. We're talking about foreign journalists and international media not being able to report from the ground in Gaza. So every piece of content is important, and they should allow people, or at least make exceptions and carve-outs as necessary to allow people, to express themselves freely and safely, which they've done in the context of Russia's invasion of Ukraine in 2022.

But yeah, Palestine unfortunately is not that important for a big company like Meta. Not financially, not politically, obviously. The consequences of their censorship, at least to them, are pretty much minimal. As I said, it has been a frustrating journey. But on the positive side, I see it as extremely important to do this job, to hold companies accountable, to scrutinize their policies, and also to document how discriminatory their content moderation systems are. Because, of course, they always gaslight civil society and their general audience that this is never intentional. As a matter of fact, if you look up Meta's blog post, or their statement in their newsroom, immediately following October 7th, they say that it's never their intention to silence a particular group of people. But de facto they are. And therefore, it's important to keep documenting and monitoring and showcasing how what they say and what they do are two extremely different things.

Justin Hendrix:

We have discussed this issue of the double standard in the region before on this podcast, around the time of the BSR report, and I do recall the basics of it. One of the things that seems to be apparent is a real incongruity in the resources that each side in this conflict has to report content, to flag content to the platform. Does that continue to be the case in the current situation, or have more activist supporters mobilized on either side and changed that balance at all?

Marwa Fatafta:

So what happened in 2021, for listeners who are not familiar with the events of May 2021: it mainly revolved around protests in East Jerusalem, in the neighborhood of Sheikh Jarrah, basically people mobilizing against the imminent eviction from their homes so that Israeli Jewish settlers could move in. And people were mobilizing online as well as on the streets. As I said, the moment people started mobilizing online, we noticed the automatic removal of hundreds upon hundreds of posts. That was a really egregious level of censorship, so pronounced that it was hard for Meta to manage its way around it with the usual shenanigans like, "Oh, it was a technical glitch." Which of course they did say. They always do. But after so much pressure from civil society, the Oversight Board looked into a case from that time and asked for an independent investigation into Meta's actions, and Meta contracted BSR, Business for Social Responsibility.

And the report came out very clear. One, there is discrimination, although the report claims it's not intentional. That's a conclusion I don't necessarily agree with, especially when you've been pointing the company to this discrimination for quite a long time and they continue to choose not to address it. They also found that Palestinian content is over-moderated, while Hebrew content, including hate speech, is under-moderated. They also found that the company's terrorism-related policy, which is called the Dangerous Individuals and Organizations policy, was disproportionately impacting Palestinian speech. And even though the company doesn't disclose who is designated as a terrorist, we know from a leaked version of the list that the majority of individuals and organizations are from Muslim-majority countries, many of them Arab countries. So it is no surprise that over-enforcement of that policy leads to the disproportionate censorship of Palestinians and other Arabic-speaking users.

Sorry, I digressed a bit, but what I wanted to say is that what's happening now is just a million times worse than what happened in May 2021. It's hard for me to explain. It's unprecedented, and so is the level of censorship and crackdown on speech. And here, to go more directly to your question, of course you see the asymmetries of power. You see the asymmetry of an occupying force that has dedicated resources to monitor content and submit requests for removal. We're talking about the Israeli Cyber Unit, which is housed within Israel's Attorney General's office and whose job is to report content to social media platforms on a voluntary basis, meaning that the content allegedly violates the terms of service of the companies themselves and not necessarily Israeli law. That skips the legal or judicial process that should take place for censorship, and it also means that those types of requests are not captured by companies' transparency reporting.

So effectively, content is being removed at the request of the Israeli government, and users don't know that they're being censored because of a government. And in this case, again, we're not just talking about people in Palestine or in Gaza. We're talking about people in the US, people in Europe, people from different parts of the world who are being censored and having their right to freedom of expression violated at the request of a foreign government. And the Cyber Unit, of course, never shies away from sharing how diligent they are at their work. So we know that they've submitted tens of thousands of requests to companies, and the compliance rate remains up in the 90s. I think in November they said that the compliance rate with their requests was up to 92% or 94%.

Justin Hendrix:

So let me ask you a clarifying question on that, because one might say folks can report, but the company has to decide whether content is violative of its policies or not. So if the rate of takedown is high, it would suggest on some level the company has determined that the content it was presented with did violate its policies. Is that what you're seeing in practice?

Marwa Fatafta:

So here's the thing. Companies push back against transparency because, at least for some, their line of thinking is: if we publish that type of information, it will encourage other governments to report to us, or it will trigger a backlash from governments, as in, "We're reporting content to you that violates your own terms of service. Why aren't you complying?" I think the question here is not whether the content itself violates the terms of service. The terms of service of those companies are themselves questionable, especially when it comes to Israel-Palestine, and we can talk about some of those discriminatory policies, so I'll park that issue for now. But I think as a user, you have the right to know why your content is being removed, especially if the request is coming from a government. Because governments dedicate a lot of resources to this, and at the end of the day we're talking about powerful actors.

If a government knocks on a company's door to say, "Hey, you need to take down that content," most likely they will. Another interesting revelation during this war is the level of organized, let's say non-state or state-affiliated, effort to report content. There have been a couple of revelations around Israeli-led efforts by individuals or groups that are state-affiliated or probably state-funded, who claim that they're using their connections within the companies to take down content. Which, again, emphasizes the shady nature of the relationship between the platforms and governments or government-affiliated individuals. So it becomes a question of access. If you have the resources and you have the access, then you can take down content. If you're an oppressed group or a marginalized group, then good luck. That's not how the system should work.

Justin Hendrix:

So you sent a formal letter two months ago, following a meeting with Meta executives in February. I understand that you raised three key issues. The first, which you've already mentioned, is systematic censorship and the silencing of Palestinian voices and Palestine-related content. The second is Meta's inconsistent response to armed conflict situations; you've already mentioned how this situation may differ from the response in Ukraine. And the third is the proliferation of hate speech, dehumanization, genocidal rhetoric, and incitement to violence against Palestinians, which, as you say in your letter, exacerbates the likelihood of offline violence. Can you speak a little bit about the hate speech piece especially? On the one hand, you're saying we're seeing more and more takedowns of content that is pro-Palestinian in some context or comes from Palestinian sources, but on the flip side, not enough takedowns of hate speech or dehumanizing content against Palestinians.

Marwa Fatafta:

Yeah, exactly. That's the over- and under-moderation dynamic in full action. It's really unconscionable to me that you see a Palestinian photojournalist documenting war crimes and potential war crimes, human rights abuses, and the killing of children and women being censored and banned, while you have accounts of settlers calling for the ethnic cleansing of Palestinians in the West Bank, or Israeli state officials just posting sheer genocidal rhetoric. Or even worse: one of our partners, 7Amleh, did a little experiment using targeted advertising. They ran a few ads through the system that called for wiping out Gaza or for a Holocaust of Palestinians, and those ads were approved.

They pulled the ads back before they were published, but it proved the point. After October 7th, we saw literally an explosion of genocidal rhetoric, hate speech, Islamophobia, and also anti-Semitism, which has real consequences. Here in the US, a mother and her child were stabbed multiple times by their landlord; the child was stabbed 26 times. Three Palestinian students here in the US were also shot, and luckily survived. And we're talking about a situation where there's a real risk of genocide. Of course, the court at the ICJ is looking into that case, but from my perspective, from where I stand, what I am watching in Gaza is no less than genocide. So things are extremely volatile.

And again, platforms are not making any effort to ensure that such content is removed, not to mention making any effort to understand whether any of the content they're hosting or allowing to be posted on their platforms is linked to international crimes like atrocity crimes or war crimes. I'll give you an example. If you scroll through the Israeli Army's page, the IDF page, whether on Twitter or on Facebook, and I've done that exercise myself, you see clear disinformation campaigns that aim to justify, for example, attacks on hospitals.

Take the attack on Al-Shifa Hospital, the biggest hospital or medical complex in Gaza, now completely annihilated, where mass graves have since been discovered. Really, people lived through horrors in that hospital. But before that attack, there were just piles and piles of content claiming that it was used as a command center by Hamas, that it therefore loses its protection under international humanitarian law, and that such attacks can hence be justified. Of course, a big question mark for me, and that's what we asked Meta and Nick Clegg specifically, is: are you doing your human rights due diligence? Are you looking into this type of content and understanding to what extent your services and platform and policies are contributing to these gross human rights abuses and atrocity crimes? And the answer, if you read the letter, does not exist. They just refer to some vague ongoing human rights due diligence.

We asked the same question especially after the ICJ ruling back in January of this year, where the International Court of Justice determined that what's happening in Gaza could indeed amount to a real risk of genocide, and ordered the state of Israel to take measures to prevent the crime of genocide, including stopping incitement to genocide, which, by the way, is a crime on its own. You don't need to commit genocide for incitement to genocide to be a crime; that on its own is a crime, and such content needs to be removed. The court also ordered Israel not to obstruct access to humanitarian aid and not to destroy any evidence of war crimes or other human rights abuses. And so we asked Meta, "What's your assessment of the situation?" If I were a lawyer at Meta, that's something I would want to understand: to what extent are they being complicit in, or aiding and abetting, the crime of genocide? But at least in their interactions with us, there's been no satisfactory answer at all.

Justin Hendrix:

So the company has replied, and it says that it is guided by core human rights principles, including respect for the right to life and security of person, protection of the dignity of victims, non-discrimination, and freedom of expression; that it, quote, "Looks to the UN guiding principles on business and human rights as enshrined in our corporate human rights policy to prioritize and mitigate the most salient human rights [risks]"; and that it, quote, "Also used international humanitarian law as an important reference in establishing its policy towards this situation." And they also say, quote, "Obviously in exceptional and fast moving situations like this one, no response can be perfect, lines are difficult to draw, and people and systems can and will make mistakes."

What has your continued back and forth with them been since this exchange? This is a very formal process, right? You've had a meeting, you've issued a letter, and the company has clearly taken you seriously, at least enough to produce a document in reply. There's various material they've provided in response, not necessarily in answer to all of your questions, but at least they appear to be in dialogue with you. Are you hopeful at this point that the situation will change, or do you suspect more of the same going forward? What are these interactions like at this point? Is it chilly, or do they pick up the phone when you call?

Marwa Fatafta:

They do. And to be fair to them, they're quite responsive and engaging. Since October 7th, we have been meeting with them on a regular basis, and by we I mean a coalition of civil society groups that formed the Stop Silencing Palestine campaign, which was launched in May 2021, and here we are again demanding the same things from Meta. But we quickly realized that those meetings are not leading to any substantial change. They're not leading to concrete actions, for example. So here's the thing. You have all the censorship and whatnot, and then you have the hate speech and genocidal rhetoric, and we were asking for specific policy changes. For example: look, you need to apply what you call a newsworthiness allowance. Especially for journalists from Gaza reporting from the ground, they will be reporting disturbing stuff, graphic stuff. Maybe some of it might violate your policies, but you need to make exceptions, because that reporting is so important.

Again, we're talking about an isolated, blockaded population, where there have been intermittent shutdowns around the clock, and where, again, international media doesn't have access. And quite honestly, journalists in Gaza have been doing God's work under unbelievable and inhuman circumstances, whether it be surviving, having their families attacked, or going through famine. They're doing everything in their power to show the world what's happening, and in a context of widespread disinformation and war propaganda.

And the last thing you want is, I don't know, stupid Instagram algorithms flagging your posts of dead children as nudity. It's just unbelievable. So the least that the company can do is have better due diligence, take extra care in assessing that content. But what have they done? Quite the opposite. They decided, for example, that for content coming from Palestine, and particularly comments, the confidence threshold at which their algorithms flag hate speech or inflammatory content should be reduced to 25%. And Meta knows that their algorithms are all messed up. They're all running amok. Again, the BSR report on May 2021 should have made that crystal clear to them, and they're still relying on those tools, tinkering with them to the end of suppressing content coming from Palestine. And that's why we wanted to meet with Nick Clegg. Because we thought, okay, their human rights teams, the regional folks, are all fantastic people. They listen, they want to engage with us. But the decisions need to be made at the top.

And we met with Nick Clegg. It was a very quick meeting. We were given what, 30 minutes? Literally 30 minutes to discuss all these very important issues that are life or death for some people. That's why we followed up with the letter, because we want answers. We are not there just to give the company a hard time. We really want serious answers about their human rights due diligence efforts. What are they doing to protect people? What are they doing to ensure that the issues we've been talking about for the last 5, 6, 7 years are not repeating themselves? And mind you, Justin, very quickly after October 7th, the company's reaction was geared towards removing terrorist content. So they've been removing content in bulk, and they've rolled out a few trust and safety measures. But if you check their website, the last time they updated their newsroom post about their crisis response was in December. And they also informed us in the letter that some of those safety measures they had taken were temporary and have been lifted. But the genocide is continuing. It's May.

So it seems to me that even their reaction to the October 7th attack was mainly geared towards protecting Israeli users, or focused on the Hamas attack per se, and not on what's happening in Gaza. I don't find any other explanation as to why they're not putting the same effort into that side of their crisis response. Even worse, and this feels really like a slap in the face: after all this time, all these months of campaigning and meeting with them in private and in public, the only action they came up with, and wanted to consult us on, was whether they should protect Zionists under their hate speech policy.

So again, it's clear that they are looking at their options, but only for one group over the other. And that just emphasizes the discriminatory nature of their response and their actions. Am I hopeful that the companies will correct course? Not really. But I also think that this is an uphill battle, and even though change is painfully slow, we can't let companies off the hook. We need to hold them accountable to their words. And I'm sorry, but human rights, the guiding principles on business and human rights, international law, all of that is not just a set of slogans that you can put on your website or in some glossy human rights reports. They're there for you to act on. And in the Palestine context, as we discussed for the past few minutes, that's definitely not the case.

Justin Hendrix:

One of your partners on this letter is Mnemonic, which is known for its documentation of war crimes, in particular in Syria with the Syrian Archive and other archives it has created around the world. I know it's also been active in documenting war crimes in Ukraine. What response have the platforms, or Meta in particular, given you on that? Are they being useful to efforts to preserve material that may be useful in war crimes proceedings in the future?

Marwa Fatafta:

Yeah, that's one of the campaign's main demands: that content needs to be preserved, and it needs to be accessible for future accountability and justice efforts. Meta, in their response to us, told us, I can't remember the exact wording they used, that they are going to preserve content and make it accessible. But they're not consulting with civil society on that point, so we don't know exactly what system they're using or whether it will be accessible to civil society. It's pretty much opaque, similar to their crisis response. No one knows what their crisis policy or crisis protocols look like. We just know that they launched them, I believe in August last year or sometime in 2023, but as civil society we don't have any insight into that, which doesn't inspire much confidence in the level of archiving that the company is doing.

Justin Hendrix:

So the company does say, quote, "We support justice for all international crimes," and it has publicly stated that it will work to develop an approach to allow international courts and accountability mechanisms to make requests, et cetera. But it points to complications, significant legal, privacy, and policy considerations inherent in that work, and suggests that the Oversight Board will continue to work on those matters. Have you had any interaction with the Oversight Board itself?

Marwa Fatafta:

We did. Yeah, we did. The Oversight Board looked into two cases, for the first time under, what was the word, an expedited review, an accelerated case review where they didn't ask for input from civil society: one case regarding Al-Shifa Hospital and one regarding Israeli hostages. And those decisions were very important, again pinpointing how the company's content moderation decisions have been flawed. Interestingly enough, for me, the two cases were treated separately. In the Israeli hostages case, Meta's reaction was that any content showing or identifying hostages, which would be a violation of international humanitarian law, should be removed and not allowed on the platform. But then they quickly realized that the families of those hostages and other advocates were using their pictures and their content to highlight their plight and also to fight against misinformation, et cetera. And so they made an allowance, an exception to that rule. Such exceptions have not been made for content coming from Palestine.

Now, the Board is looking into another interesting case on the use of the slogan "From the River to the Sea." For people following the implications of the Gaza war around the world, this slogan has been heavily charged, criminalized, and banned in multiple places, even in Germany, where I live: the Ministry of Justice declared a couple of days ago that this is a Hamas slogan and that any use of it will therefore be a criminal offense, even though multiple courts, including in Germany, had ruled that use of the slogan is protected by freedom of expression. So it's interesting now to see how the Board will look into that case and what they will decide. As you can imagine, there will be two polarized opinions on this.

And finally, I would say the Board's decision, their advisory opinion, on the moderation of the word 'shaheed' is a timely one, even though it took them a long time since they announced that case, in which Meta asked for their opinion on how to tackle the word. Shaheed in Arabic, I think the closest translation in English is 'martyr.' But in the Palestinian context, we literally refer to anyone who has been killed by the Israeli forces as a shaheed. We don't use the words kill or killed or dead, we just say shaheed. And this one word accounts for more content removals under Meta's community standards, across all of their content moderation policies, than any other. That is just shocking, but at the same time it really points to the disproportionate impact of Meta's policies on Palestinians and Arabic speakers at large.

And the Board in their decision made it very clear that Meta's current approach is disproportionate and rights-violating, that the blanket ban should be lifted, and that Meta should take a more nuanced approach to tackling that word. So yeah, I think the Board is taking on a few emblematic cases, which I hope will result in policy changes, not just in leaving individual pieces of content up or taking them down, but in addressing the systematic censorship and biased content moderation policies.

Justin Hendrix:

Well, we will see if that is the case. What is next in your campaign? What can we expect from Access Now in the next couple of months?

Marwa Fatafta:

We will continue with our advocacy, for sure, and we will continue with our documentation work. Currently, we're working on a case study that examines platforms' accountability under international humanitarian law and international criminal law. So we'll look, hopefully, at the implications of the ICJ ruling for platforms and what they need to address. And I'm personally really looking forward to the study coming out, because not only is there a gap in the normative framework around platforms operating in armed conflict, but we also need to look at the gaps in how companies may be complicit in multiple violations.

So there is that, and in a couple of days we will have a letter addressed to Google regarding Project Nimbus and the company's possible complicity in aiding or abetting atrocity crimes by providing its cloud computing and automation services to the Israeli Ministry of Defense and the Israeli government at large. I'm sure you've seen the very "tech from hell" types of reports showing how the Israelis are using AI to automatically generate targets and kill lists, which has led to wiping out families and decimating entire neighborhoods. We'll be knocking on Google's door very soon, demanding transparency and real disclosure about their services. We're also working on shutdowns, but I don't want to hold you for another hour talking about our future work. Those are some of the things we're working on at the moment.

Justin Hendrix:

Marwa Fatafta, I appreciate you taking the time to speak to me today.

Marwa Fatafta:

Yeah, of course. My pleasure. Thanks for reaching out.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
