The Sunday Show: A Look Inside Facebook's African Sweatshop
Justin Hendrix / Feb 20, 2022
It has been ten years since a group of Facebook moderators working for an outsourcing company came forward to then-Gawker journalist Adrian Chen to tell the story of their experience. Chen recounted the complaints of a Moroccan man who said he was paid $1 an hour to screen illicit Facebook content.
In her 2014 dissertation on the subject, a work that would later become the book Behind the Screen: Content Moderation in the Shadows of Social Media, UCLA professor Sarah T. Roberts was among the first to produce scholarly work on this phenomenon, using an ethnographic approach to consider the labor issues involved in keeping the big tech platforms clean. She observed then that “Facebook benefits from the lack of accountability that comes with outsourcing, that is, introducing secondary and tertiary contracting firms into the cycle of production” and that “Workers engaged as moderators through digital piecework sites are isolated, with few (if any) options for connecting — for emotional support as well as for labor organizing — with other workers in similar conditions, and without any real connection to the original worksites from which the content emanates.”
Years later, after Facebook has spent billions of dollars on artificial intelligence and other efforts to shore up its content moderation, the picture has not changed. Today we hear from Billy Perrigo, a journalist at Time magazine, who tells us of the plight of outsourced content moderation workers in Kenya tasked with moderating the typical stream of gore and bigotry, as well as screening for hate speech and incitement to violence emanating from war-torn Ethiopia, all for less than $1.50 an hour.