Afsaneh Rigot is a senior researcher at ARTICLE 19 and an affiliate at the Berkman Klein Center at Harvard University
How can something as fragile, complex, and private as sexual or gender identity be identified, let alone prosecuted in a court of law? In a context where identity is criminalized, who you are, who you’re friends with, and who you love can get you thrown in jail. Even the dating app you use, who you messaged last night, what name you saved your intimate partner under, and what photos you shared with your partner can be used to target, harass, arrest, and prosecute you.
This is the everyday reality for many LGBTQ people around the world, including in Egypt, Tunisia and Lebanon.
In studying the impact of technology on marginalized people, I spent the past two and a half years looking at how private digital data is weaponized against the LGBTQ community – especially its most at-risk members. Through in-depth legal and social reviews, an analysis of 29 legal case files, and 20 interviews with the main defense lawyers and a case worker, I pieced together a harrowing picture of injustice and abuse that the operators of social media and messaging platforms need to confront. There is a way to prosecute identity, and it’s based on a patchwork of information that can be extracted from your digital life.
For the purposes of prosecution, the digital traces of intimate connections among queer persons have become the scene of the crime. The implications go beyond this particular community in this particular geography – solutions could help preserve the safety, security, and liberty of people across the globe.
Old issues, new tactics
Persecution of LGBTQ people around the world, and in particular in the MENA region, is nothing new. Morality laws and a variety of colonial-era laws have ensured that, under deliberately vague and overbroad legislation, non-normative gender and sexual identities are criminalized. What is new – and increasing at a troubling speed – is the use of personal digital data as “evidence” to arrest and convict LGBTQ people under existing anti-queer laws.
In a new report, Digital Crime Scenes, I documented in detail how in Egypt, Tunisia, and Lebanon, the police and courts are increasingly using digital evidence – including selfies, sexts, texts, dating apps on devices, and in particular WhatsApp and Facebook Messenger chats – to target LGBTQ people. What emerged is a blend of manual, traditional policing and new technologies: a method that is often not only less resource-intensive, but also more effective in targeting and prosecuting highly marginalized people.
When it comes to proving a person’s identity, non-digital evidence tends to be circumstantial, such as locating clothes, condoms, or social connections. Digital evidence is different, the interviewees concur: it “provid[es] evidence for something that usually is very hard to find evidence for. […] at this stage digital evidence is actually providing new type of evidence and is providing evidence for something that was really impossible before,” one Lebanese lawyer said.
Police forcibly search people’s phones looking for ‘evidence’ of queerness. Of course, what qualifies as “too queer to be legal” is not defined; and if officers don’t find explicit content, they use innocuous material as the basis for charges. In one case, the prosecutor in Tunisia used a text that merely stated “I like you” as evidence. In another, in Lebanon, a contact saved as “Honey” was used as part of the case against an individual. In Egypt, something as simple as she/her pronouns used to refer to a trans woman was treated as evidence. Six people I interviewed said personal selfies were used to prove the “crime” of being queer. All of this is treated as “hard” evidence in these cases, rather than as circumstantial evidence – or as no crime at all.
In the report, I outline the fake and fraudulent evidence presented by the police (sometimes so fraudulent that even the adjudicating judge agreed with the defense). Fake-account tactics like those used by Egyptian police to entrap queer people in sting operations were also identified in Tunisia and Lebanon. Regardless of the resources poured into prosecuting these cases (the most in Egypt, the least in Lebanon), it is clear that it is an identity being prosecuted and not a physical “crime”. “It’s a violation of privacy… this is where you start prosecuting people based on their identity and not based on any evidence of criminal activity,” as one lawyer put it.
Whether each piece of evidence itself provides enough proof or not, inferences that can be made from all this deeply intimate and personal device data, when taken together, can serve as a “proof” of a person’s identity. In 26 out of 29 cases I reviewed, this digital evidence formed the backbone of the subsequent prosecution.
Most marginalized, most at risk
Perhaps not surprisingly, my research found that those most marginalized, including refugees, sex workers, trans women, ‘feminine’ gay men and low-income LGBTQ people, are most at risk from digital persecution – and prosecution. Their cases are often used to test the viability of new laws and/or new law enforcement procedures. This is especially visible with how police have been testing the use of digital evidence on sex workers and refugees, before applying it to the broader LGBTQ community.
Referring to sex workers, one lawyer I spoke with told me that the police “use their telephones to go through their messages because most of the time sex workers would actually meet clients through messages or apps. They just go through that to confirm whatever they are trying to arrest them for.”
This pattern of parallel prosecutions not only increases the chances of queer people being prosecuted, but also creates further barriers for rights groups trying to track LGBTQ cases in courts or gather statistics on them, because charges based on gender or sexual identity are often brought only after people come into contact with police for unrelated reasons. When LGBTQ people go to the police to report being victims of crime – including domestic violence, rape, and assault – the police, rather than investigating the crime, profile them, search their phones, and use the data they find to prosecute them for being queer instead. In Tunisia, one-third of the cases I reviewed were opportunistic in this way.
As a result, LGBTQ people are very unlikely to go to the police to report any crime against them. They are also less likely to take legal action against law enforcement, leaving police and prosecutors with full impunity. Given that queer people– especially the most marginalized– are more likely to be the victims of certain kinds of crime, this makes them even less likely to see justice done.
Copycat trends and mounting laws
In recent years, new trends have emerged in anti-queer policing in Egypt, where authorities are increasingly charging LGBTQ people under telecommunication and cybercrime laws. This shift enabled police and prosecutors to optimize their use of digital evidence in the pursuit of more stringent sentences.
Unlike the more traditionally used “morality” laws (which require prosecutors to prove that sexual activity has taken place), cyberlaws carry a lower standard of evidence: prosecutors only have to prove that someone has violated “any family principles or values in Egyptian society” or “misuse[d] telecommunications” – yet those laws impose higher penalties.
The cybercrimes departments that enforce these laws have more sophisticated and significant surveillance tools, which they use to actively monitor social media and dating apps. Police officers in Egypt also create fake profiles on queer dating apps to actively target and entrap people, while simultaneously building a portfolio of digital data to prosecute them.
Worryingly, evidence is already emerging that other countries in the region are starting to borrow from the Egyptian playbook when it comes to the digital targeting of queer people.
These types of cases are not in any way unique to these countries, however. I focused on these three countries because of my knowledge of their laws and my personal connections with the communities within them. They are also three countries that have laws which criminalize LGBTQ people, and they all have documented cases of police reliance on digital technologies to prosecute, arrest, or harass members of the queer community.
Tech needs to do more
In the course of my research, 100% of the interviewees mentioned data taken from WhatsApp, which is particularly ubiquitous in Egypt and Lebanon. Other platforms cited included Facebook Messenger, Snapchat, Grindr, and Hornet. In Egypt, 50% (6 out of 12) of the cases and more than half of interviewees mentioned WhosHere – an app that has been unresponsive to any calls for protective measures to limit its utility in Egyptian anti-queer sting operations.
These cases are not prosecuted on the basis of large-scale data gathering through the use of sophisticated tools, or through data requests made to private corporations. Instead, they are prosecuted using data taken from personal devices, which indicates that the tactics are more manual and based on traditional policing methods. These manual and traditional tactics historically affect marginalized groups the most. This is why the design process for personal digital tools and devices must take into account that these applications can be used in very different contexts than intended, up to and including being weaponized against those who use them.
Tech companies have a responsibility under international human rights law to “prevent, address and remedy human rights abuses”. Meta and other major social media and chat-based apps have committed to this, so they are bound to prevent and mitigate such abuses, minimize risks, and support users to stay safe– whoever and wherever they are.
It is therefore vital that these companies take immediate action to build in protections for their most marginalized and at-risk users. This includes prioritizing the implementation of security measures and reverse-engineering their systems to identify harm-reduction techniques. Many of these are outlined in the report, based on requests from frontline lawyers: layered PINs, self-destruct buttons, and in-app password-protected camera rolls, among others. These suggested measures (echoing previous research done directly with community members) can limit the increasing reliance on digital evidence in these prosecutions.
These are all simple fixes – yet for so many users, they can mean the difference between freedom and imprisonment.
Several tech companies have paid attention to previous research by ARTICLE 19, as well as more recent studies and advocacy. Based on this work, Grindr introduced vital security features in December 2019, such as “discreet app icons”, which allow the app icon to appear as an innocuous one – for instance, a calculator – to protect people’s privacy.
These changes work. In the research I’m leading at ARTICLE 19, we’re conducting large surveys in eight MENA countries. Though the data is due to be released next year, so far over 65% of our 8,000+ survey respondents say their choice of dating app depends on whether or not these features are available.
Following research set out in my report, WhatsApp has already introduced more disappearing messages options. ‘In line with Design From The Margins, we also believe that when you design with the most at-risk groups in mind, it benefits everyone,’ the company said, pointing out that it had also added ‘end-to-end encrypted backups for additional security’ and provided other safety tools to ‘make it easy for users to block and report.’
It’s important that tech companies abandon the damaging “move fast, break things” ethos, which can have harrowing consequences for marginalized communities. Instead, they should center their most at-risk users in product design, from ideation to production. I’m spending the foreseeable future working to make communications platforms and major tech players see that we need a radical change in how we design our tools. This research, and a volume of related work by others, shows the effects of these technologies in contexts they were not designed for, and the effects of Western-centrism on vulnerable and hard-to-reach communities. We need to adopt what I’m currently calling a ‘Design from the Margins’ methodology, which would result in better tech for all users.
Just like homophobia and transphobia, the tactics used by the police in the MENA region are not unique to these countries, and the solutions are not going to be either. By extension, addressing these issues will protect marginalized communities and, in effect, help ensure the safety and security of communities around the world.
– – –
Read Digital Crime Scenes: The Role of Digital Evidence in the Persecution of LGBTQ People in Egypt, Lebanon and Tunisia here.
The report is supported by international freedom of expression organization ARTICLE 19 as well as the Cyberlaw Clinic and the Berkman Klein Center for Internet and Society, both based at Harvard University.
Afsaneh Rigot is a senior researcher at ARTICLE 19 and an affiliate at the Berkman Klein Center at Harvard University, researching law, technology, human rights, and corporate responsibility. She’s the author of Digital Crime Scenes: The Role of Digital Evidence in the Persecution of LGBTQ People in Egypt, Lebanon and Tunisia.