UN human rights experts who chronicled Facebook's role in spreading hate speech in Myanmar concluded that it played a "determining role" in the genocide against the Rohingya people. Facebook's own investigation into the situation also found fault with the company's practices and made various recommendations for how it should develop a human rights strategy to prevent such things from happening again.
Today, we’re going to hear from a refugee from the violence, who is with other Rohingya refugees in a camp in Cox’s Bazar, Bangladesh, as well as three human rights advocates. We’ll learn about:
- A complaint filed by sixteen Rohingya youth to the Irish National Contact Point of the Organisation for Economic Co-operation and Development (OECD), which argues that Facebook violated the OECD Guidelines for Multinational Enterprises by allowing its platform to be used to incite violence against them and their community. The remedy these refugees seek is for Facebook to divert a portion of its 2017 profits (the year the genocide began) to remediation for the Rohingya community in the form of educational activities and facilities in Cox's Bazar.
- An investigation from Global Witness that found that Facebook still cannot detect hate speech capable of inciting violence and genocide against the Rohingya despite commitments to better detect Burmese language hate speech. In an experiment, Global Witness submitted explicit and violent ads containing examples of real Burmese language hate speech against Rohingya, and Facebook approved all eight ads. (Global Witness pulled the ads before they were published.)
- Further context from a representative of Avaaz, the global advocacy organization, which has helped to bring attention to the problem of hate speech against the Rohingya, and Facebook's role in it, as part of a broader campaign for more just practices by technology companies.
My guests include Maung Sawyeddollah (one of the complainants represented by Victim Advocates International, and Executive Director of the Rohingya Student Network), James Douglas (author of the complaint to the OECD and Legal Advisor at Victim Advocates International), Ava Lee (Digital Threats Campaign Leader at Global Witness), and Antonia Staats (Campaign Director at Avaaz).
The complaint filed with the OECD argues that Facebook violated the OECD Guidelines for Multinational Enterprises. Before filing the complaint, these Rohingya youth advocated for months for Facebook to support an education project in Cox’s Bazar, where there is a lack of access to formal education. After a meeting with the company, Facebook’s Director of Human Rights Miranda Sissons told the group that Facebook could not fund their request because it needed “a more direct link to [Facebook’s] products” (see Sissons’ letter here).
Please note that the connection to Cox's Bazar was not perfect, so if you have any trouble making out a word here or there, you can refer to the lightly edited transcript below.
Sawyeddollah, maybe we’ll start with you. Can you explain to me what’s happened that brought you into this case and led you to where you are at the moment?
Yes. Actually, to explain what happened, why we are taking this case, how we are taking it, and where we are now, to understand all these things, we must go back a little to our situation. It is a historical record that Rohingya people had been living in Arakan, in Rakhine State in Myanmar, together with all the other ethnic groups, living there peacefully. We really had a good time. We had peace. We lived there peacefully, but unfortunately, the minds of some very narrow-minded people changed against the Rohingya. There were some extremist politicians, and we have also collected who those people are. They started hating Rohingya people. It was their personal intention, because they did not want Rohingya people living in Myanmar. Then they started organizing and creating campaigns, but we were living peacefully, and all the other people living there just accepted those people.
But they took help from the Facebook platform. They spread their intentions, their motives, by sharing different kinds of false news, false information, and rumors against Rohingya people, like "These Rohingya people are very bad," and, "If these Muslim people keep living here in our country, then soon we will see our President with a beard." They posted different kinds of hate speech on Facebook against the Rohingya, and then all the other people who were good to us also changed their minds, because every day they were seeing the posts on Facebook against us. But Facebook failed to remove that content, the content that was spreading hate against the Rohingya. And finally, all the people changed their minds. They were hating us online, like they were seeing on Facebook, but it turned into offline life.
They started hating us even in our offline life, and finally, we left the country. Now we are surviving here in refugee life in Cox's Bazar. So we noticed that all of this was happening because of the contributions of Facebook, but we didn't know that we could take this step against Facebook, because we were thinking that Facebook is a really big, really powerful organization. We are nothing in front of Facebook, so maybe there is no way to go against Facebook, to fight for our own rights. Finally, we met with a big team at Victim Advocates International and some other international experts, and we explained our situation to them. We explained to them how we suffered, what we suffered.
We also came to understand, step by step, the situation and the systems, that there are ways for us to fight for justice. We could go after Facebook for helping to violate our human rights in Myanmar. Then we started taking this case, as Victim Advocates International showed us that there is a way to go against Facebook through the OECD. So we started with the OECD. I especially read all the guidelines, all the information contained in the OECD documents, and I realized that yes, there is a way to do something. So we started doing that, and we have already submitted our case to the national contact point (NCP) of the OECD in Ireland.
So we explained how we suffered from what Facebook did to us, and we demanded that Facebook engage with us and pay us. They must pay us as compensation for what was done to us, and we are asking Facebook to pay us $1 million. They earned more than $1 billion as they abused us, by contributing their business to the human rights violations we suffered. So we are asking for very little. We are asking for that money to rebuild our lives, to run education programs in the camp. The case is still in process. We are now waiting to hear back from the NCP. Thank you.
Thank you. And I may come back to you with a couple of additional questions, so keep an ear out. But James, can you tell me, maybe picking up from there, how you got involved, and what is the status of the case right now and the jurisdiction?
So this is how Victim Advocates International got involved. At first, we were not pursuing legal or quasi-judicial grievance mechanisms. We were assisting the victims we represent within Cox's Bazar to engage directly with Facebook in discussions about how they could make life better in the camp, given that Facebook had acknowledged that it had done too little to prevent the violence that happened in 2017. So it all started out with very polite email exchanges with Miranda Sissons, the Director of Human Rights at Facebook. We were listening to what was really needed in the camps, and education was something that many groups had pointed out was missing, like a curriculum that followed what they would've learned in Rakhine State in Myanmar. So we asked Facebook if they would be willing to fund these types of educational projects, and they said that it was something they would consider.
So after this, a group of students and youth groups within the camp spent three months devising an education proposal, which would've cost Facebook a total of $1 million, and they submitted it to Facebook. Facebook responded three months later saying that they do not typically engage in philanthropic activities, and if they do, it has to have more of a direct link to their products, such as digital empowerment or internet literacy. So it was at this stage that we shifted our focus from asking for what Facebook deemed philanthropy to asking for a remedy. Because Facebook contributed to the harms that were suffered in 2017, they owe. It's not philanthropy: they owe the Rohingya remediation in the form of an education.
So that's what brought us to this step. I had previously worked with these OECD complaint mechanisms, so I started looking into what was publicly available and researching whether Facebook's behavior, actions, and omissions around the 2017 operations would constitute a breach of the OECD Guidelines, which every state party to the OECD is bound to implement.
And these OECD Guidelines incorporate the United Nations Guiding Principles on Business and Human Rights. When I was looking into the corporate structure around Facebook, I saw that Ireland was very central to their international operations. Their Dublin office is called their international headquarters. The content moderators for 2017 were all located there. The data for Myanmar is stored in the Dublin offices, and the contracts for the military are governed by Irish law. So for all of Facebook's operations outside the US and Canada, it's their Dublin office that takes care of it, which is why we decided to file before the OECD national contact point in Ireland and not the United States.
So let me just come to you, Ava. How did Global Witness get involved in this?
So at Global Witness, we were very much looking on from the sidelines, really shocked to hear that Facebook was refusing to meet this wholly reasonable demand from a group of Rohingya living in a refugee camp who were just asking for contributions to their education. And yeah, we were actually appalled that Facebook simply refused to do that and was willing to fight against them.
From our perspective at Global Witness, we do investigations and we also advocate for systemic change of the big tech platforms within our Digital Threats campaign. So when we started to see the action that James was leading with Victim Advocates International and others, we were really interested in what we could do to investigate how much Facebook had really changed since the beginning of the genocide and the beginning of this violence, because they've said a lot about how much they've invested in their ability to detect Burmese language hate speech, which is what a lot of this hate speech was. They've really invested in content moderation, both automated and in terms of real people who have those language and jurisdiction skills. And unfortunately, when we started investigating, we found that, even with all of this supposed investment, they're still not able to detect really, really extreme hate speech that exists in real life. There are all these examples that we found.
And can you tell us about how you did the test? Tell us a little bit about the methodology and what you did.
So we collected a range of real-life hate speech in the Burmese language against the Rohingya. These were all examples from the United Nations Independent International Fact-Finding Mission on Myanmar, in its report to the Human Rights Council. We selected eight examples that were really, really horrible. We chose not to publish them in their entirety, but other journalists who reported on this did. And we created ads. We wanted to test Facebook's ability to detect the language without actually posting any of this content and continuing to spread hate that had already been spread, so we chose to submit ads, which had to be approved by Facebook and which we could then delete before they were published. We submitted all eight examples, and all eight were approved by Facebook and were ready for publication before we pulled them.
So we were really shocked. We didn't think that this would be the case. They were already known examples of hate speech that presumably Facebook would've been aware of; they're in a very widely available publication. And yeah, as I said, really, really nasty stuff, and pretty terrifying that four years after Facebook admitted it played a role in inciting the violence that led to the genocide, it still wasn't able to detect such extreme hate speech in Burmese when it says it has invested so much in being able to do so.
Antonia, how did Avaaz get involved?
So our starting point, in a way, was that what happens online doesn't stay online. We've already heard from Sawyeddollah about his experience, and what story after story over the past years has shown us is that social media platforms can cause real offline harm, can really wreak havoc in people's lives. And I think the Myanmar case is a really extreme example. The Biden Administration, not too long ago, determined that the violence in Myanmar amounted to genocide. And let's also not forget that Facebook itself admitted that the hate speech left to flourish on its platform played a role in inciting that violence. We've also recently heard from Facebook whistleblower Frances Haugen, who mentioned, for example, that engagement-based ranking was literally fanning ethnic violence in places like Ethiopia. So I think there's also an important aspect around marginalized communities, communities in the global south, communities in places and countries that maybe aren't the highest priority for a company like Facebook, really facing the sharp edge of some of these harmful consequences.
So that's a pretty horrendous situation as it is. What we're beginning to see is people fighting back and addressing this kind of harm, whether that's the OECD case in Ireland or the Rohingya lawsuits in the US and the UK. There is a budding movement of tech harm survivors whose experiences, even though they may be really different in terms of real-life effects, are tied together by something harmful happening online that has an impact offline. Just to broaden that scope out as well, I really think there is a thread running from the Rohingyas' horrendous situation, to doctors not being able to do their job because their medical advice gets drowned out by health misinformation going viral on social media, to parents who have lost children to online challenges those children encountered on social media, to Muslims in Assam in India who are facing hate speech and threats to their lives. And that list goes on and on and on.
We had the great privilege of gathering testimony from Sawyeddollah and some of his fellow complainants and turning it into a short film, to help bring these really important voices, which very often don't feature prominently in discussions about how to regulate big tech and what needs to change, into the conversation.
James, let me just come back to you on the OECD. Can you give us a sense of the timeline on this? I assume these types of inquiries aren't exactly hasty. They're not quick. What should we expect from the process over the next few months, and what has Facebook said?
So the first thing to note about the OECD process is that it's a quasi-judicial grievance mechanism, so it's different from litigation, and it only works insofar as both parties, the complainants and the respondent, who in this case is Meta/Facebook, are on board. Typically, it is estimated that the whole process should take around a year. The initial assessment phase is supposed to take around three months. That's when the NCP makes a determination as to whether our complaints, our allegations, are bona fide, and then the NCP will try to bring the complainants and Facebook together for good offices, which is essentially a mediation stage. So the resolution will be determined through negotiations between our complainants and Facebook. As you said, these things are very complicated and usually take much longer than what is stipulated in the procedural guidelines and practice documents. We are still in that initial assessment phase, waiting for the NCP to make a determination to proceed to the next stage.
We can't really speak specifically about any of the discussions that we've had during this phase, but all I can say is that we're still in the initial assessment phase, and we're hoping to receive a response from Facebook at some point.
And let me ask you this, just in terms of looking at this. I mean, you’ve essentially run a test that suggests that Facebook’s technical systems are perhaps not up to snuff with regard to looking at Burmese language content, especially advertising, but is there any sense that you have of the kind of human focus they have now on these matters? Do they seem to have more of an organization either in the region or in the country? Do you feel like they’ve made substantial investments or that you’re seeing the results of those investments? I don’t know if that’s a question maybe for Ava Lee or for James.
I think both of us can speak to different aspects of that, because some of it relates specifically to what we've alleged in the complaint. Not only did we allege that Facebook failed to conduct due diligence around its business operations before entering the telecommunications market in Myanmar, we've also said that the human rights policy it released last year does not meet human rights standards. This is because it places too much emphasis on content moderation and doesn't actually address what Antonia alluded to earlier: the engagement-based algorithms that amplify hate speech. Facebook has not done due diligence around the algorithms in and of themselves. If you hire more content moderators, you're only going to capture a fraction of what's being posted, and this is something that Frances Haugen has said explicitly.
It's just one tiny part, and without broader systemic due diligence around this whole algorithmic, data-driven business model, it's not going to have much of an impact. It also does a huge amount of harm to the content moderators, who suffer awful psychological trauma from having to moderate images of beheadings and genocide in real time. So I think that Facebook has used the language of human rights, but in a way that lets them control the narrative. They've acknowledged some wrongdoing, but not to the extent that would make them own responsibility and actually have to provide a remedy. They've just said that they did not do enough at the time, and we think it goes a lot deeper than just not hiring enough content moderators who were attuned to Burmese language and culture. They had two at the time, and as Ava has pointed out, they've since invested more money in Myanmar than in any other country because of what happened in 2017, and it's still not working.
I think it's clear that they need to do so much more about the algorithmically driven hate that we see across the platform and across countries around the world. I would say, though, that content moderation is important too. Many of us are fortunate to live in English-speaking countries, where what they do in content moderation is worlds apart from what they're doing for the rest of the world. We put our investigation to Facebook, and they didn't grace us with a response. But when the Associated Press wrote an article about our investigation, Facebook did respond to them, saying that they have invested heavily in Burmese language technology and built a team of Burmese speakers whose work is informed by feedback from experts, civil society organizations, and, specifically, the UN Fact-Finding Mission on Myanmar, which was exactly where we took our examples from.
But our investigation demonstrated that that just clearly isn't enough. And I think another key issue is the sheer lack of transparency about what they're doing in any of these jurisdictions. It's incredible that they only had two Burmese-speaking moderators at that time, but the reality is that we don't really know how many they have now. We don't really know what they've done with this technology beyond some very top-line reports. Elsewhere, in countries where we're seeing incitement to violence and real-life violence happening on the streets right now, places like Ethiopia and India, where there's a real risk of genocide happening too, again, there's zero transparency about what the platforms are doing in relation to content moderation.
And so, while I absolutely agree with James that a core part of the problem is what the algorithms are doing, until that's completely fixed, and even when it is completely fixed, if we can ever envisage that utopia ahead of us, there will need to be content moderation too. The people doing that work need to have their rights as workers protected. They need to not be completely traumatized by what they're seeing. They need to be paid properly. But they also need to be there, and we need to know how many people are doing this work, so that we can really be sure the platforms are taking something incredibly serious as seriously as it needs to be taken.
Sawyeddollah, could you perhaps explain, for some of my listeners who may not understand, the role that Facebook has played for people in Myanmar? What is its importance as a platform, and what was your experience with it?
Yes. Actually, when all those things were happening in Myanmar, when some people were spreading hate speech against the Rohingya and other people started following those posts and reacting to them, at that time, Rohingya people were not even legally allowed to use smartphones. Just very few people, like me, used a smartphone. I used a smartphone and used Facebook there. Basically, all the posts I saw against the Rohingya on the Facebook platform were written in Burmese, because people in Myanmar read information written in Burmese much more than information in English. You could also see there is a person we know by the name Wirathu. He's a Burmese religious leader and a monk. He's also a politician.
If we check his speech now, he is still continuously spreading hate speech against Muslims, against the Rohingya. Influential extremists like that in Myanmar usually use the Burmese language in their posts so that everyone in Myanmar can easily understand them, because almost all the people living in the north, in Rakhine, read Burmese more. I also made some reports to Facebook against some of those posts, but Facebook didn't respond to me on some posts, and sometimes they responded that the post I reported was not against community standards. That made me think that maybe Facebook does not even understand the Burmese language, because they were saying the post was not going against community standards.
But in my eyes, I could see that it was definitely going against community standards. Facebook was not understanding that. They were not removing those posts. Unfortunately, we were very few people, so I only reported one or two posts while all those things were happening in Myanmar, and I unfortunately did not succeed in getting even one post removed by Facebook. But now, after coming here to the camps, I have reported, I think, several posts, and I saw that one or two posts have already been taken down, removed by Facebook. So the language issue was also something like that, as I explained now.
So have you ever talked to anybody verbally at Facebook? Have you ever met anyone from Facebook?
Actually, I didn't talk in person with anyone from Facebook, but as I already mentioned, I reported some posts. After coming to Bangladesh, we sent a letter to Facebook, but they didn't respond. We had a meeting with Miranda, I think you already know Miranda, the Human Rights Director of Facebook. We had that meeting, but we didn't get any satisfactory feedback from Facebook. Then we sent a second letter to Facebook, and they diplomatically rejected our request. Then we started taking this step. Before taking this step, we were not telling Facebook that we would be taking a case against them; we were telling them that Facebook contributed its business to the human rights violations we suffered, and we were just asking for some help to rebuild our lives. But Facebook refused; they rejected us. And finally, now we are taking this step, because there is a process, there is a mechanism, that can hold Facebook to account for its violations against the Rohingya.
Got it. I want everybody to think about something they want to say in conclusion here, but Sawyeddollah, I might just ask you as a last point. If Mark Zuckerberg were to listen to this podcast, what would you hope he would understand about your situation there?
Yes. If Mark Zuckerberg is listening now to my voice, if he's listening to this podcast, to our voices, he must be very ashamed. If he were in front of me now, I would say something to him: I am very ashamed of him, because he thinks of himself as a very big man. He has a very big organization. He's leading the world. He's making his dream come true, but he doesn't care about the dreams of other people, because his dream is destroying the dreams of other people. He doesn't care about that. So after all, he's not a big man. He's not a big man. I see him like that because, you know, he sees himself as, 'I'm the very big one. I'm controlling this world. The world is in my hand.' Yes, it is true, the world is in his hand, but it is us, we are supporting it still, and that's why the world is in his hand. But if we stand together, if we go hand in hand, then he will be nothing. He will have to show his hand to all of us, and there will be nothing in his hand. We all are his power. We all make him the big one, but now he doesn't care that he makes us suffer.
I just want to place this in the context of everything that has come out about Facebook in the Facebook Files since the end of last year. It was brought to Facebook's attention multiple times, on numerous occasions, that its platform was being used to incite hatred against the Rohingya specifically, and it completely failed to act. So I want to place this in that broader context of Facebook knowing and failing to act. We hear a lot of news stories about Facebook and Instagram and the impact they're having on teenage girls, which is all very important, but I'd just like to reemphasize that what happened to the Rohingya in 2017, the escalation of violence, is really the worst thing that has happened as a result of Facebook's lack of due diligence on its business model.
So I think our investigation has added to the evidence that Facebook can't regulate itself, and everything that is going on with this complaint, and with the broader cases against Facebook in relation to the Rohingya genocide, really speaks to that. We've seen some really incredible, groundbreaking moves in Brussels with the Digital Services Act, which, for the first time, is bringing much more regulation to these big tech companies than we've ever seen before, so that Facebook would have to do a proper risk assessment of the ways its algorithm may be promoting hate, as it was in this context. And that's great. It's really important. But now we need to see more countries follow suit. In particular, we really need to see this type of regulation in the US, and we need to see the Digital Services Act properly upheld so it can start to have an impact. That's what Global Witness will be looking at over the next few years.
I have two main points to finish up. One, to any lawmakers and any social media platform employees listening: there's a really simple ask, which is to listen to people like Sawyeddollah, people whose lives have been really harmed by some of the things happening online, and to pay attention, so that experiences like the one we just heard about, where Sawyeddollah reports hate speech targeting his community in a context of genocidal violence and it just goes nowhere, stop happening. And secondly, yes, in Europe we've just seen some really important moves toward creating a framework for tackling tech harm with the Digital Services Act, which will force Facebook, Google, and TikTok to study the harm done through their platforms and be audited on how they have assessed these risks. The comparison that I really like is, in a way, to the Paris Agreement.
Just as the Paris Agreement addresses CO2 and pollution, we can think of this harmful content as polluting our information environment, and in the DSA we see a sort of Paris Agreement for the internet. We need transparency about what harm is actually happening and what the platforms are doing to address that harm, and then access to some of the data, so that organizations like Global Witness, like Avaaz, and like many others who have been doing excellent research in these past years don't have to scrabble around for it.
For understandable reasons, we've all spoken a lot about Facebook on this call, but the reality is this is happening across a number of the big tech platforms, and none of them are doing enough. We need to see really systemic change across all of them, so that the most egregious harms, like genocide, don't continue to be amplified and encouraged by the way these algorithms work, but also so that our democracy is no longer toxified and hate isn't spreading in the same huge way it is right now, so that we can start to see a more progressive politics that brings people together and helps us face the huge challenges we are definitely going to have over the next 10 years in relation to the climate emergency.
I would just like to say that these discussions around tech harms are often couched in very technical language, but as Sawyeddollah has shown us today, there is a real human impact, with real-life consequences. You see it in the 750,000 Rohingya whose lives were upended and who are now languishing in refugee camps in Cox's Bazar without an education. At the very least, there has to be some sort of obligation to remediate that, as well as fixing the business model going forward, which is very important. We also need to see concrete remediation for those who have already been impacted by the harm.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.