Checking on the Progress of Content Moderators in Africa

Justin Hendrix / Dec 3, 2023

Audio of this conversation is available via your favorite podcast service.

For the past two years, there has been a steady stream of news out of Kenya about the relationships between major tech firms – including Meta, TikTok and OpenAI – and outsourcing firms like Sama and Majorel that have employed content moderators on their behalf. In the spring of this year, more than 150 moderators announced the formation of the African Content Moderators Union, which advocates for better pay and working conditions, and a lawsuit against Meta is working its way through Kenya’s courts. This month will see an important ruling in that case.

To learn more about the situation on the ground and what it’s been like for the individuals involved in this fight while the legal process unfolds, I spoke to Njenga Kimani, a researcher at Siasa Place, a youth-led, pro-democracy NGO based in Nairobi, and three moderators who’ve worked on platforms including TikTok, Meta, and OpenAI: James Oyange Odhiambo, Sonia Kgomo, and Richard Mathenge.

What follows is a lightly edited transcript of the discussion.

Njenga Kimani:

My name is Njenga Kimani. I work at Siasa Place as a researcher. I've been working on identifying some of the issues that moderators are facing and creating advocacy strategies for voicing their concerns, ensuring that the plight of content moderators and all who do platform work is heard and that people get to know about it.

Richard Mathenge:

My name is Richard Mathenge, former ChatGPT content moderator at Sama and the current admin of the African Content Moderators Union. I was also listed by Time magazine among the 100 most influential individuals in AI around the world. This is a win for all content moderators, not just for me.

Sonia Kgomo:

My name is Sonia. I worked as a content moderator for just over two years, moderating content for Facebook under the outsourcing company Samasource, which is based in Nairobi, Kenya. Currently, I'm one of the petitioners in the ongoing litigation that is in court. And I'm also one of the people who are trying to form a union to advocate for content moderators in Africa.

James Oyange Odhiambo:

My name is James Oyange, commonly known as Mojez. I'm a former TikTok content moderator. Currently, I'm working as a union organizer, and in the near future I'll be working on a case against TikTok and its associate partner, Majorel.

Justin Hendrix:

Kimani, I might come to you and just ask if you'll maybe set the stage for us. What is it that's happening at the moment with regard to the effort to organize content moderators and with regard to these ongoing legal claims?

Njenga Kimani:

So Facebook, or Meta, as we all know, is one of the biggest tech companies, especially in the realm of social media. There is a need to regulate what is posted online in an effort to ensure that there is internet hygiene. Because of that very necessity, which is what the content moderators here used to do and some continue to do, they recruited about 260 content moderators from all over Africa to come to the country and help moderate what is posted online so that the internet space stays hygienic.

But when some of them took the job, they were not actually told what they were coming to do. They were told they were coming to serve as telephone assistants, but on arriving here, they found that they were doing the work of content moderation. Then, earlier in the year, the company declared all of them redundant and fired them without following due process. And all the while, the company did not accord them the necessary safeguards to protect them, given the amount of content that they consume, very bad content to say the least: people being raped, instances of murder. It is necessary that they see all of this so that it can be pulled down. So they were not given adequate mental wellness support to protect them from all of this, yet the company proceeded to declare them redundant without following due process.

So this was actually challenged before our courts here in Kenya. And the first response that Meta gave was that it can't be sued in Kenya because it's an international company and, therefore, Kenyan courts do not have the jurisdiction to entertain a case involving international companies. But the court said that Kenyan courts can have jurisdiction because they have actually been doing work here in Kenya; the agency they had subcontracted, Sama, was actually based in Kenya. And because of all that, the court assumed jurisdiction.

The other thing is that the moderators, all the petitioners, filed an interlocutory application asking that, pending the hearing and determination of the case, Majorel and Sama be compelled to continue paying salaries and not to proceed with the redundancy. And those orders were issued pending the hearing and determination of the case.

But what we saw is a lot of impunity, because the respondents, Sama and Meta, violated those court orders. They appealed the decision, and their appeal was rejected. And they still continued to violate court orders. Recently, there has been an application to find them in contempt. We are waiting for a ruling on the 15th to see whether they can be held in contempt for violating court orders, for refusing to pay the content moderators, and for continuing with the redundancy in stark contravention of what the court had directed.

So that's the base. That's where we are at right now. The ruling comes out on the 15th of December. We are waiting to see how that goes and the trajectory that could inform our advocacy moving forward.

Justin Hendrix:

I want to turn it over to you, Sonia, as one of those who have done the work of moderation and been in this situation. What's it been like for you as this has been playing out? How has work been, to the extent that you're able to work or are working? How have you been able to carry on since this saga started?

Sonia Kgomo:

Okay. So on our side, or my side, and also speaking on behalf of the rest of my former colleagues, the people I was working with: like Kimani said, he summarized it quite well, we are currently in ongoing litigation. The court did order that, as the case is still in court, Meta, Facebook, should continue paying salaries. So this is now month number eight, going on month number nine, in which the court order has just been ignored by these tech companies. They have not been paying us, the employees, our salaries as the court ordered.

So basically, that's the whole situation around the court. And it's been quite difficult for the petitioners in this case right now, because we find ourselves stuck in a position where our work permits, actually our immigration status in the country, have expired for most of us. It's a compromised situation where you cannot go home because, obviously, if you have not been paid for the past eight, nine months, there's a high possibility you don't even have money for an air ticket. So most of us find ourselves in a very compromised situation, struggling to make ends meet. And yes, the situation is not really nice.

And on a day-to-day basis in the office, basically speaking on content moderation, it's not your common, average, normal kind of job. It might sound nice and fancy, just like any other job, but it's not. There's a lot that goes on behind the screen to make sure that the user does not get to see what we see. It's more like you are the first line of defense. There's a lot that goes on. It is mentally draining, mentally killing, and there are a lot of things that tamper with your mental health, your mental wellbeing. On a day-to-day basis, you encounter things like beheadings, human beings being killed and slaughtered, live accidents, a lot of mutilated bodies, child sexual abuse, just a lot of gruesome stuff.

And these are things that as content moderators we have to deal with; it's a job, somebody has to do it, yes, we understand. So we do that for eight hours every day, five days a week, with little to no support for our mental health. Instead of offering proper mental health support, psychiatric or some sort of qualified psychological help, the only thing these companies were providing was a general counselor, like a church counselor, not somebody who's qualified and experienced to deal with such traumas. Because it is a very traumatic experience that you go through.

You'd find people screaming in the office on the production floor. As I say, there's a lot of darkness that goes on behind the screen of the person that protects the user. The biggest thing this company neglected to do was to prioritize its employees' mental health; instead, I would say they were prioritizing profits over the wellbeing and mental health of the employees. And unfortunately, when you focus more on the profit side, money and profit can always be made and lost. But once your mental health is tampered with, it's very dangerous, because once you lose that, there's a possibility of not coming back.

Justin Hendrix:

Richard, perhaps I'll come to you next. Sama was also, of course, working with OpenAI, which has been very much in the news over the last couple of weeks here in the United States with the coup against Sam Altman, the CEO who was fired and has now returned. I'm sure you've been paying attention to some of those headlines. What was your experience as a moderator working specifically on ChatGPT? It's almost a different kind of problem in a way: not managing so much the output of humans on a social media network, but rather managing the output of the machine.

Richard Mathenge:

My heart bleeds, especially when you are looking at things from the outside and thinking about the future, about prospective content moderators who are currently being outsourced or employed by organizations such as Sama and Majorel.

So, about my experience: we got employed during a pandemic-stricken period, during COVID-19. And when we started off, there was that euphoria that not so many people were getting opportunities like these, and so we were more blessed than the people outside. Because of this, there was that looking-forward attitude to engaging with the platform. But the anticipation and the actual experience were two different things.

Before we started engaging with the platform, there was a commitment by the human resources and counseling departments at Sama that the content moderators needed to commit themselves every now and then to counseling sessions. Once they engaged with the platform, it was the obligation of these content moderators to reach out to this department and ask for counseling sessions. So that was the commitment, that was the clarion call by the organization.

When we started production, as things started becoming serious and we were encountering this content as it is, we decided to reach out to this department and see whether we could get assistance. Unfortunately, there was no commitment from this department or from the management to mitigate or address the issue that we were grappling with. And so, because of that, as the designated team lead, because I was acting as a bridge between the organization and the team I was working with, I was forced to reach out to the management diplomatically. Unfortunately, there was no reaction or commitment from the management.

And so, I decided to take the bull by the horns. I went and knocked on the director's office and explained the situation, the things that the team was grappling with. And I made myself clear that if things did not change, then we would be forced to take matters into our own hands, because to me, mental capacity or mental health, just as Sonia alluded, is much more important than anything else. It should be prioritized and not taken as just an option.

There was still no commitment. This action was actually perceived by the managers, by the directors, as being arrogant. And so, fast-forward, my contract was terminated. Because of that I was put on the outside, yet the team was moved to function on a different project. But to me, even if the team was functioning on a different project, as I said earlier, my heart still bled, and it is still bleeding, because you have not addressed the issue as it is, you've only made things much worse. And so, those are the things that we need addressed. That was my experience as far as ChatGPT is concerned, and we need things to change.

Justin Hendrix:

And just so that my listeners understand the difference between the work you were doing moderating ChatGPT versus the work of moderating human posts on social media, can you describe the workflow a little bit, what it is that you were seeing? What was put in front of you on a day-to-day basis?

Richard Mathenge:

So basically what we were doing, Justin, was to train the chatbot to work with toxic pieces of text, with the objective of making the platform, ChatGPT, safe for the individuals who would come and interact with it in the future. Our job was actually to be gatekeepers, to protect future users of ChatGPT so that they could have a safe space in their interactions with the platform. In a nutshell, that was our daily assignment.

Justin Hendrix:

So did that involve giving ChatGPT itself adversarial prompts, things of that nature? So you're testing it in some cases for difficult or bad responses that it might give?

Richard Mathenge:

Absolutely. Yes.

Justin Hendrix:

Okay. I just want to ask you one other question as well. There was some reporting in Time that one of the things that your colleagues were asked to do was to collect sexual and violent images. Is that something that you were aware of or something that you saw happening in the work that was being done there?

Richard Mathenge:

Absolutely. And for me, the good thing that God has blessed me with is very strong interpersonal skills. When I started interacting with the team, there was a big difference between before the team started production and the second week of production. The first week there was that euphoria, just as I indicated, because they were looking forward to the end of the month and to working on the platform, maximizing the objective of the employer. But by the second week, you could see the individuals becoming very detached, very disturbed, and traumatized.

And so, I would reach out to the individuals, to the team members, late at night and ask them, "Today, I noticed this difference in your mannerisms. Is there anything you need to share?" And that way they would open up and explain to me the kind of content that they had encountered on the platform, which was very disturbing. For me, I believe in active leadership. Just watching people work on the platform is not enough, until and unless you get the experience on a personal level. At some point, actually, I was pushed by the directors to do so, but I didn't see the need. Later on, I felt that if these young individuals were actually doing it, then let me just throw myself into the deep sea as well, so that I could get that firsthand experience.

By the way, just as Kimani put it earlier on, the thing about Sama and Majorel is that they try as much as possible to employ young individuals who are vulnerable and who cannot question why things are being done the way they're being done. So once you are employed by Sama or Majorel, the possibility of raising your hand and saying, "I don't believe in this policy. I don't believe whatever you are saying," is quite low. And so, by this, they become more traumatized and more disturbed, all in the name of getting that paycheck at the end of the month.

Justin Hendrix:

James Oyange, I want to just ask you to describe your experience with TikTok. I know you've essentially sent a letter to TikTok via your law firm. You mentioned earlier that you're planning to move forward with this lawsuit.

James Oyange Odhiambo:

So for me, I'd say I really didn't know what I was getting myself into, because the contract that I signed and the work that I knew I was going to do was for a customer service representative role. It was only during my interview, and then during our training, that we were introduced to the realities of the job we were actually supposed to do. And that was content moderation for TikTok. We never knew the client, we never knew the nature of the work, we never knew anything.

And from then on, it's just been a working experience based on lies: being promised things that never materialize, asking questions that are never answered, questions like, "Why does our contract read customer service representative when this is the work that we're doing? Is that the reason why you're justified in paying us less?" When you want the outsourcing company to be accountable, they don't do that. So it's been that.

For example, in my case, even if you had given a hundred percent to the company, performing well, engaging well with other people, that also doesn't count in this industry, right? So for me, the working experience wasn't good. I'd say the people that I worked with, the agents, were good colleagues, but from the larger company side, it was based on lies and things that never materialize for the common agent.

Justin Hendrix:

In your case, does Majorel continue to moderate for TikTok? Is it still in the business?

James Oyange Odhiambo:

Yeah, Majorel is the only one, I think, right now. Because when Meta left Sama, Majorel is the one that took over moderating Meta content. And for TikTok, they're still doing it, and it's something that will continue for the foreseeable future.

Justin Hendrix:

And how are you making your life or your living at the moment? How are you getting by as this plays out?

James Oyange Odhiambo:

Yeah, so currently, it's a little bit challenging. Looking at the work that we did, you'd think that the trauma and all the other things would remain behind once you've left the job, but it's something that transitions with you even when you're doing other jobs out here. So, personally, a month after I left the job, I got a job at another place, but I didn't stay there for long because the vivid flashbacks, the PTSD, and the insomnia didn't make me a productive person at that company, so I left.

And then, you also have to make ends meet. So getting other work is a little bit challenging. Remember, in the country that we are in currently, there's a lot of corruption. So with the heat that is happening internationally around the work that we did as content moderators, the outsourcing companies came together to fight us, I'd say, the whistleblowers, the people who are advocating for proper working conditions for content moderators. They have ganged up to fight us through the government. Even right now, the government is in bed with the BPOs, the outsourcing companies. So it's very difficult, because our names, obviously, are out there. We're the ones who decided to take the bull by the horns, so our names are out there.

On the government's side, a time will come when they'll start looking for us. And like I said, with our names out there, even if you go and apply for jobs in other places, you find that the company will just do a basic name search, and then they generalize you as somebody who causes trouble.

So yeah, it's been very difficult trying to make ends meet over here. And these are things that we really didn't think would happen when we were doing the job. We just thought that even if we left the job, our lives would continue being good.

Justin Hendrix:

There are other efforts that you're leading, Kimani, to organize moderators generally. What is the state of the business of working with big tech firms on outsourcing contracts in Kenya right now? There's been a lot of publicity. Sama, of course, has moved on, it says, from this type of work; it's focusing on other tasks rather than content moderation, perhaps having decided that this is all too risky or too dirty a business to be in. What do you suspect will become of content moderation as a field or industry in Nairobi, in Kenya more broadly, perhaps even in neighboring countries?

Njenga Kimani:

I like the fact that you've mentioned that there has been a lot of conversation with respect to the state of big tech companies, especially in African countries. And one thing that we've noticed is the constant abuse and underpayment by these big tech companies. Look at what the moderators were earning: a mere $429. And those who were expatriates used to get an additional $200 on top of that, as some sort of allowance that was only paid three times. So if you look at it keenly, it tells you that the work that they do is not in any way commensurate with the kind of pay that they get at the end of the month. And this is something we've noticed with big tech companies operating in African countries: they underpay the people they employ.

Compared to what they earn on other continents, African content moderators are significantly underpaid (and expatriates only got an extra $50), yet they moderate the worst of the content. That tells you that, one, we have a lot of discrimination by big tech companies, probably because they employ the most vulnerable. And when they are asked, they say that the amount they pay is six times more than what the average Kenyan worker earns. But one of the principles of fair remuneration is equal pay for equal work. You really can't compare what the lowest earners make with what content moderation demands.

That brings me to the other question: what is the future of content moderation in the country and on the continent? I think it's important to note that content moderation is a profession. It's so important if we are to have hygiene on the internet, which means it shouldn't be treated with the contempt that we are seeing from these big tech companies. It's high time we recognized this as a profession that is profound and important, especially in an era where technology literally runs everything. So content moderation should indeed be recognized as a profession, which is what we are pushing for at Siasa Place.

Because if you look at people who do digital work and rely on a platform to earn a living, most of the time they're not given the necessary safeguards, as compared to traditional workers, people who, for example, have an office job. People who do platform work, like moderators, don't have transparent contracts. Some of them don't even have insurance that caters to the nature of the work that they do. Sonia mentioned that all they got was general counseling, but you see, that does not match the kind of work that they do, simply because big tech companies treat this profession with more contempt than it deserves.

So moving on, even as we anticipate the ruling on the 15th, I think we should recognize as a country that, one, we need adequate safeguards to protect all who do platform work. And content moderation should be recognized as a very important profession, just as you recognize a lawyer, a doctor, or a teacher, so that all these people can unionize. That's what we are also pushing for, together with the moderators, because when they unionize, they're better placed to advocate for policies that will safeguard them. So I look forward to the day that content moderation is recognized as an important profession and moderators are given the necessary safeguards to protect them while they do it.

Justin Hendrix:

Obviously, you're waiting to see how things turn out with this lawsuit, waiting to see what the court says in December, of course. But what about for you personally? What do you hope will happen next? Do you want to stay working in this world, or have you had enough? Do you see other opportunities in tech, or are you hoping to put this behind you?

Sonia Kgomo:

To be honest, Justin, the tech industry is not a bad industry to work in. It's just those few tech companies like the ones that I've worked with, your social media companies, the likes of Facebook, Meta, TikTok, and all those. But I'll speak on behalf of Meta, because that's the one I've been working for.

The tech industry as an industry is not a bad one; it's just this one side of the industry, the content moderation and social media side. Would I go back? I would, on the condition that things change in the industry. Because right now, the way things are in this industry, it's not something I would go back to, and it's not something I would advise anyone to go back to. Mainly, the reason is that it's not healthy, it is not. The $450 that we're getting, the $50 or $60 extra that we're getting as expatriates, that's our monthly salary. It's not enough to actually trade away your mental health and, basically, your whole life.

That's all the more reason why we are trying to unionize and change the industry: to do away with the exploitation, the failure to prioritize mental health, the salaries, and all those issues. We have to get the industry to change, or at least, if not a complete turnaround, at least get the industry, the employer and the employee, to meet each other halfway so that we can have a common understanding.

And I believe once that is done, once we get to a common understanding and the industry changes, yes, content moderation could be a viable job. Because still, no amount of artificial intelligence or computing can outsmart human intelligence. So there will still be a demand for human content moderators.

So would I still want to go back? I would go back on the condition that everything we are currently fighting for is implemented and the industry changes. Like Kimani and the others said, it's not your traditional job, it's not an everyday job, it's not a normal job. In a normal job, an employee works eight hours a day. When it comes to content moderation, you cannot have someone sit in a chair for eight hours a day watching all these graphic images and expect them to do that for one, two, three, four, five years. It's not a normal job. There's a lot that needs to change, and there are conversations going on. So yes, hopefully things are made better; it's an industry that just needs to be cleaned up.

And with regards to the ongoing case, the litigation that's in court: yes, we are anticipating the ruling. It's actually on the 7th, which is next week if I'm correct, next Thursday, that we are getting a ruling with regard to the contempt of court. A couple of weeks ago we were in court discussing the whole matter. The court was hearing the matter of contempt, the orders that the court issued and that Meta, Facebook, and Majorel did not follow.

So next week Thursday, the 7th of December, we are waiting to hear what the outcome, the ruling, is on that. And yes, we are anticipating it. Hopefully, in the Kenyan system and the Kenyan courts, justice prevails and they hold the tech companies accountable. Because it's been what, eight, nine months? It's too long. You cannot just decide not to follow a court order for eight or nine months with no accountability. So nine months later, we are hoping that something will be done, that they'll at least be held accountable.

Justin Hendrix:

If you could speak to the senior executives, either at the primary companies, OpenAI, TikTok, or Facebook, or at the outsourcers, Sama and Majorel, what would you want them to hear from you, perhaps on behalf of your colleagues, of course, but from you specifically? What would you hope would break through to them?

James Oyange Odhiambo:

One thing I'd like to say to the top management of the parent companies, TikTok, Meta, and OpenAI, is that they need to take this work very seriously. Content moderators play a very critical role in maintaining the integrity and safety of online platforms, often at the cost of their own wellbeing. We, content moderators, are the defenders of social media. Without our work, we would not have social media at all.

Every time a user logs into Facebook or TikTok or any other social media company, it is the moderators who are there protecting them 24 hours a day. So when someone posts a gruesome murder on Facebook or TikTok, it's a moderator who stands between the user and that content spreading.

Even here in Kenya, when the government says it wants to clean up TikTok, who do they expect to do that cleaning? It won't be IT people, and they won't send police officers to check the content that is being shared online. It is us, the content moderators, who do this.

So for all the multinational companies in this market, TikTok, Meta, OpenAI, YouTube, all of them: I'd like to urge them to take this work with the seriousness it deserves. They should stop outsourcing such delicate work, because an outsourcer is focused only on profit-making and won't take the work with the seriousness that it deserves.

Justin Hendrix:

Maybe, Richard, I'll come to you. What would your message be to Sam Altman or to the folks leading Sama now?

Richard Mathenge:

It's very simple. Right now there is a global conversation around regulation and policy frameworks. Everyone is talking about regulation. Just the other day, President Biden signed an executive order on regulation and policy frameworks. My clarion call to these big tech organizations, the directors and the people running the show, Sam Altman and the like, is to make them understand that they need to be part of this global conversation. They need to be part of the policy framework, the regulation, and not just make regulations and policies that are geared towards profit-making. No, be part of a policy framework that also accommodates human labor, because you're talking about young people from Africa, vulnerable people from Africa. Remember, these are young people who are breadwinners; their parents actually depend on them. So just be mindful of that. They need to be part of this regulatory framework and policymaking.

Justin Hendrix:

And Sonia, how about you?

Sonia Kgomo:

Let me just address this to the companies that I've been working with: Meta, Facebook, and Sama. I would say, basically, they shouldn't prioritize profit over people's lives, because currently, that's what's being done: they are prioritizing profit over people's lives. And when I talk of people's lives, I'm talking about mental health, because basically, my mental health is my whole life. If that gets messed up, you've just messed up my life. So my biggest plea and message to these companies would be: do not prioritize profit over people's lives. And basically, stop the exploitation. Treat content moderation as a job, not as some kind of ghost-worker job. So yeah, that's it from my side, as we anticipate and hope that the whole industry changes for the better.

Justin Hendrix:

Thank you. And Kimani, I'll come to you for maybe a last word on what to expect in December and what to expect going forward for the African Content Moderators Union and the other efforts that you are engaged in.

Njenga Kimani:

Thank you, Justin. First of all, with respect to the ruling on the 15th of December, I am really optimistic and hopeful that the court will rule in favor of the moderators so that we promote big tech accountability. That will send a message that it doesn't matter how much profit you're making or how much money you have; when it comes to the dignity of people, that ought to be respected. So I'm really hoping for a ruling in favor of the moderators, especially on the aspect of contempt, because that will compel the companies to pay the salaries that have fallen due for, I think, nine months now, and to stop the redundancy that is going on, so that these moderators can live with the dignity that they deserve as human beings.

Secondly, Siasa Place will continue to advocate for safeguards to protect the rights of digital workers and people who do platform work. We are pushing for legislation in Kenya that will ensure we have a policy framework treating this profession with the same level of importance as other professions, so that we don't have loopholes in the law that allow big tech companies to get away with their atrocities because they are multinational corporations. Under the principle of the horizontal application of human rights, they can also be held accountable for violations.

Lastly, I hope that justice will prevail for the moderators who've borne the brunt of content moderation. And I hope this will send a message and will be the first step that we are taking to ensure that there's accountability on our platforms and that we do all that it takes to ensure that people are treated with the dignity that they deserve.

Justin Hendrix:

I want to thank each of you for speaking to me today.

Richard Mathenge:

Thanks Justin.

Njenga Kimani:

Thank you very much.

Sonia Kgomo:

Thank you very much for your time. Thank you for listening.

Special thanks to Nerima Wako-Ojiwa.
