How to Moderate Cleavage on Social Media?

Mona Elswah / Oct 29, 2024

A political map of the Maghreb region, also known as the Arab Maghreb or Northwest Africa, comprising Algeria, Libya, Morocco, Mauritania, Tunisia, Western Sahara, and the Spanish cities of Ceuta and Melilla. Shutterstock

How much cleavage is too much? This question has troubled countless women checking their outfits in the mirror before stepping out in public. It has also consistently puzzled policy teams at tech companies as they determine which images to keep on their platforms and which to remove. Tech companies serve users across the world, each with unique cultural and religious backgrounds; deciding where to draw the line on neckline placement is not just a matter of fashion but an example of the complex considerations that go into balancing universal principles with local norms on platforms used by billions of people.

As part of an investigation into content moderation bias across regions of Africa, Southeast Asia, and Latin America, I recently explored the content moderation systems of social media platforms operating in the Global South, focusing on the Maghreb region. My research illustrated how the structures these companies put in place profoundly shape the online experiences of most of the world’s population, and it revealed a stark disconnect between the companies that write content moderation policies, the outsourced moderators who enforce them, and social media users. The findings are striking: 58% of users surveyed said they do not trust social media companies with their content, and more than 62% were concerned about their posts being removed. Inadequate moderation has left regional civil society organizations to step in and escalate incidents; even then, tech companies can take months to respond.

While companies do employ, directly or indirectly, many individuals who live in these same regions, the moderators on the front lines of culturally and politically sensitive issues say their perspectives go unheard and undervalued by the tech giants. This lack of dialogue deprives tech companies of valuable insights that could help them globalize their policies while remaining sensitive to cultural norms.

Generally, tech companies moderate content online using either a global or a local approach. The global approach, used particularly by Western companies operating under Western laws and regulations, entails deploying one policy for all users regardless of location, religion, or culture. It ensures consistency and reduces the risk of external or governmental interference in policymaking, but it often overlooks significant cultural differences, potentially leaving users in some regions dissatisfied. The annual Zulu Reed Dance ceremony in South Africa, for instance, in which women sing and dance bare-breasted in traditional Zulu dress, violates many tech companies’ nudity policies despite being culturally acceptable.

The local approach, adopted by ByteDance’s TikTok, tailors important policies (particularly those concerning nudity, violence, and animal cruelty) to regional or even national standards. Seemingly trivial factors become critical: content moderators must assess whether content adheres to the cultural norms of a given country, a task that can overshadow more pressing safety concerns on the platform. One moderator who worked for a vendor reviewing Arabic content for TikTok told me that he had to take precise measurements, such as the number of centimeters of cleavage shown in a video.

This approach appears to respect the cultural nuances of different regions, but it can also significantly restrict freedom of expression, particularly when users in the Global South seek to push boundaries or challenge local norms on social media. It also demands extra work from moderators, who must distinguish among the content policies of various countries. TikTok, for instance, divides the Middle East and North Africa region into three sub-regions; moderators said they struggled to keep all of these distinct regional policies straight and even wrote an interpretation list to explain them to new hires.

Ultimately, my interviews with policy teams at social media companies and with content moderators in the Global South suggest that tech companies need an approach that recognizes platforms’ global reach while remaining sensitive enough to respect cultural differences and local norms. To accomplish that, and to respond to concerns that they are ignoring non-Western voices, tech companies should work more deliberately with their users in the Global South to get a fuller picture of their needs and concerns, and they should engage cultural and linguistic natives throughout the development and operation of moderation systems. Tech companies could, for instance, draw on the expertise of researchers local to the Maghreb region who have examined the unique characteristics of Maghrebi Arabic dialects. They could also train content moderators using regionally appropriate examples and prioritize transparency about content removal and demotion to help reduce bias and foster trust.

Tech was largely designed for and by white men, leaving the Global South as an afterthought for many tech companies. That structure does a disservice to users and platforms alike. Social media platforms that aspire to global status must take the needs of the Global South seriously, and that means putting Global South voices at the center of the content moderation process. Companies know exactly where to find these voices: on their platforms and within their content moderation vendors. It is time to take advantage of the wisdom and perspective already available to them.

Authors

Mona Elswah
Mona Elswah is a Project Fellow at the Center for Democracy and Technology, where she examines content moderation policies and measures in non-English languages, focusing on countries in the Global South. Mona is a research contractor with CDT and is also a Research Associate at Oxford University’s ...
