Can Big Tech Platforms Operate Responsibly on a Global Scale?

Justin Hendrix / Sep 18, 2022

Audio of this conversation is available via your favorite podcast service.

One of the documents disclosed by the Twitter whistleblower, former security chief Peiter Zatko, otherwise known as Mudge, was a report based on internal interviews and documents that assessed the company's ability to mitigate mis- and disinformation. The report plainly states that the company "lacks the organizational capacity in terms of staffing, functions, language, and cultural nuance to be able to operate in a global context." It found the company has a bias toward the English language and English-speaking countries, a problem that is particularly acute in Africa, Latin America, and Asia.

A year ago, Facebook whistleblower Frances Haugen brought forward documents from that company that painted a similar picture of neglect, including underinvestment in content moderation, particularly in languages other than English, and particularly outside of rich Western countries. Another Facebook whistleblower, Sophie Zhang, highlighted how the company fails to protect elections in some of the more fragile parts of the world, ignoring the proliferation of inauthentic accounts and behavior.

Whether these major tech platforms, not just Facebook and Twitter but also YouTube, TikTok, and others with similar weaknesses, will make the investments necessary to keep their products safe in an international context is difficult to say. For all the companies' crowing about their policies and investments in trust and safety, there is a great deal more to be done.

This reality is underscored in a series of reports published in June by ARTICLE 19, an international human rights organization that seeks to advance freedom of expression and freedom of information worldwide. Working with the United Nations Educational, Scientific and Cultural Organization (UNESCO), which promotes peace and cooperation, and with funding from the European Union, ARTICLE 19 studied three countries in particular (Bosnia and Herzegovina, Kenya, and Indonesia) to evaluate the disconnect between tech giants' content moderation practices and what is happening on the ground in the local communities where content is produced and distributed.

The reports ARTICLE 19 produced take an in-depth look at how the tech firms operate in each of these countries, documenting a lack of understanding of cultural nuances and local languages, insufficient mechanisms for users and civil society groups to engage on moderation, a lack of transparency, and a power asymmetry that leaves local actors feeling powerless.

To learn more about the project and its recommendations, I spoke to four individuals involved in the drafting of the reports:

  • Pierre François Docquir, Head of Media Freedom, ARTICLE 19, who led the project globally;
  • Roberta Taveri, an ARTICLE 19 program officer who played a role in delivering the research on Bosnia and Herzegovina;
  • Catherine Muya from ARTICLE 19 East Africa, who focused on Kenya; and
  • Sherly Haristya, PhD, an independent researcher who conducted the research on Indonesia.

A transcript of this discussion will be available later today.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
