When Data Bodies Become Targets: New Risks in a Networked World

Laila Shereen Sakr / Nov 4, 2024

Laila Shereen Sakr is a Public Voices Fellow of the OpEd Project. Known for developing R-Shief software and performing as VJ Um Amel, she is a professor and artist at the University of California, Santa Barbara, working on creative, critical AI; digital arts; and Middle East media.

Clarote & AI4Media / Better Images of AI / Power/Profit / CC-BY 4.0

We are all cyborgs. Our online footprints—the collection of our digital and real-world actions, traced, tracked, and recorded—turn individuals into data objects under surveillance. These data bodies govern our lives and can be subverted by those who want to control or even end them. Perhaps for no one is this reality more acute than for Arab people targeted by Israel in Gaza, Lebanon, and beyond.

In Palestine, where thousands of babies and children have been killed or maimed and millions of people displaced, artificial intelligence (AI) is being deployed to rapidly process massive amounts of data and generate thousands of potential targets for military strikes. One known example is Lavender, an AI system that uses personal data to decide who might be affiliated with Hamas and to track and target them and their families. The Israeli military has reportedly used it to identify and strike Palestinians despite its known error rate.

Or consider the terror when thousands of pagers and walkie-talkies blew up simultaneously across Lebanon, killing dozens and wounding or maiming thousands, including civilians. A few days later, people in Lebanon with phone numbers in the 70 area code received texts from Israel’s military warning them to flee imminent bombings, attacks that went on to kill or displace thousands of innocent people.

These pernicious acts are built on a long history of surveillance and data collection targeting Palestinians and their Arab neighbors. Recent Amnesty International reports reveal that Israel uses cameras installed across East Jerusalem, the West Bank, and Gaza that can classify individuals by race, age, and ethnicity and that are used to pick people out even in crowds, such as during protests. Other Israeli military technologies include the Red Wolf facial recognition system; cameras from Hikvision, whose facial recognition technology has also supported China’s persecution of Uighur Muslims; and Camero’s Xaver 1000, which uses an AI tracking algorithm to see through walls. American Big Tech firms also play a role. For instance, despite technical errors and failures, Google continues to offer the Israeli government access to Arab data bodies through its controversial Project Nimbus, while the Israeli army uses Amazon’s cloud services.

Taking back our data bodies

Data bodies include the records institutions keep on us to determine who has access—our credit scores, citizenship papers, health records, school grades, and the text and images we produce and share on social media. According to what is inscribed on our data bodies, we can move around the globe and afford a home mortgage or education, or not.

While state and institutional records have always existed, today’s data bodies gain more potency, virality, and impact than ever as AI and machine learning systems are trained on these vast information repositories. AI systems are thus fundamentally created and determined by data that is gathered, organized, and recorded through political processes. What does that mean for Indigenous data bodies, Black data bodies, or Arab data bodies?

Some argue that technology itself, in a void, has no politics. But in the real world, the politics of a technology depend on how it was designed, by whom, and for whom. There are many liberatory uses of technology. For instance, over the last two decades, organizations like Visualizing Palestine have used data visualization and infographics to educate the public about Palestinian life and struggle. Palestine Love, an initiative launched in October 2023, is an extensive digital archive spanning twelve social media platforms, a library, and a directory of publications and data. Tech for Palestine, a platform with over 4,000 activist volunteers, encompasses archiving and other technological initiatives supporting Palestinian rights and development, addressing digital infrastructure and online activism. Another initiative, Librarians and Archivists with Palestine, is more than ten years old.

Technological progress can be harnessed for progressive mobilization and a love of liberation. One need only glance at the social media landscape and across American university campuses to witness how young Palestinian voices and their allies are transforming political discourse, using digital platforms to demand a ceasefire and call out US complicity in a horrific genocide.

But the opposite is also true. Israel is setting a new precedent for surveillance and the data-driven slaughter of civilians. Who is to say it will not happen again and again? In the US, laws prohibiting abortion are already compromising the bodies and lives of women. Meanwhile, the three government agencies that use social media most heavily for monitoring, targeting, and information collection are the Department of Homeland Security, the Federal Bureau of Investigation, and the State Department. We must remain vigilant about the implications, especially for vulnerable populations.

What happens “over there” can happen here

The gnarly thing about data bodies is that real bodies suffer the consequences when data bodies are manipulated, even when that manipulation happens at a mathematical remove from any programmer. Physical and data bodies are connected socially and politically, and together they make up the contemporary individual living in a hyper-networked world governed by extractive systems of power. Our global activities are so networked that what happens in Palestine happens in part because of decisions made here in the US. Likewise, what happens there could be reproduced anywhere, including the use of surveillance and lethal technologies against innocent people. The digital choices we make, individually and collectively, will shape how we move forward.
