Perspective

Surveillance Tech Heightens Chilling Effects for Disabled Protestors

Ariana Aboulafia / Dec 15, 2025

Ariana Aboulafia is a fellow at Tech Policy Press.

WASHINGTON, DC—APRIL 5, 2025: Demonstrators take part in the Hands Off! day of action against the Trump administration and Elon Musk on the National Mall. (Photo by Paul Morigi/Getty Images for Community Change Action)

For better and for worse, 2025 was a record-breaking year. Taylor Swift sold a record number of albums, tech companies marked their largest rounds of layoffs in history – and the United States may have seen more protests than ever before. Indeed, according to research published by the Crowd Counting Consortium (a joint project between the Harvard Kennedy School and the University of Connecticut), there were significantly more protest events in the first five months of the current Trump administration than in the same period of time during Trump’s previous administration, or that of former President Joe Biden. This trend likely continued with multiple iterations of the popular “No Kings” protests, with the October protests reportedly including 7 million participants at over 2,700 events across the world.

The right to peaceably assemble – to protest and engage in demonstrations – is one of the five freedoms guaranteed by the First Amendment, and is a vital way that people express dissent and press those in power to take action on social issues. But protest has never been equally available to all people.

Many of the most significant victories in the disability rights movement, including the passage of the Americans with Disabilities Act, occurred at least in part because of protests led by disabled people. The grassroots organization ADAPT has a long history of organizing protests that center people with disabilities and aim to advance disability justice. Even so, many demonstrations have been, and remain, largely inaccessible to people with disabilities. For example, there may be no seating or restrooms, no space for assistive devices or service animals, and loud crowds can pose barriers for those who communicate differently.

Being in a large crowd can also heighten the risk of being exposed to COVID-19 and other respiratory pathogens, to which many disabled people are particularly vulnerable. While wearing a mask can help mitigate that risk – and protect people from being surveilled while exercising their First Amendment-protected rights to protest – more than a dozen states have some kind of anti-mask law on the books. Some (though not all) of these laws do have exceptions for medical use, but the laws can still cause chilling effects that disproportionately affect disabled people who want to protest. Technologies, including those designed to surveil protestors, can make these chilling effects significantly worse for disabled people in particular.

Mask bans are on the books in several states and counties – though they notably do not seem to apply to ICE agents engaged in stops, arrests, and uses of force, where public accountability is most needed. Despite these bans, many protestors still wear masks, including disabled people who wish to participate in public demonstrations. And while law enforcement has used surveillance technology to identify and intimidate protestors for many years, a new set of tools is specifically designed to identify protestors who wear masks through the use of enhanced AI-enabled facial recognition and other biometric technologies.

Corsight AI, for example, advertises that its surveillance technology works regardless of masks, hoods, and sunglasses, according to reporting from Surveillance Watch. It is not clear how effective these tools are, in part because scanning part of one’s face is inherently less reliable than scanning the entire face. But, if these tools are effective, they would disproportionately impact disabled people who need to wear masks to be in crowds safely, as well as those who wear sunglasses to mitigate light or sensory sensitivity, including blind or low vision people. Less accurate tools would also introduce familiar risks associated with false identification.

Some tools, like NesherAI, also allege that they are able to identify people wearing masks using “behavioral and gait analysis,” presumably because standard facial recognition may not be as accurate on mask wearers and additional data points would theoretically be useful for identification. Again, there are questions as to how effective these tools actually are as a means of precise identification. But if they do work, they would work best on the more “extreme” cases – a dynamic that would disproportionately affect disabled protestors with gait differences or those who may “behave” in atypical ways as a result of their disabilities. These differences could make disabled people easier targets for identification by such tools.

Similarly, AI tools designed to identify conspicuous objects (like weapons, in theory) or to flag suspicious behavior in crowds can also have disproportionate impacts on disabled protestors. These systems could incorrectly flag disability-related behavior (like “erratic actions”) or even assistive devices as being suspicious or dangerous. Disabled travelers such as Emily Fogle have noted that their prosthetics are often flagged as suspicious when going through airport security, sometimes resulting in invasive additional screenings. It is both possible and reasonably foreseeable that surveillance tech used in protest settings could have similar deficiencies, disproportionately affecting disabled people.

There may not be extensive evidence at this time of the impacts that these tools are having on protestors in general, including protestors with disabilities. But, many people are unaware of the extent to which facial recognition or other biometric tools are being developed and deployed — due in large part to police efforts to keep the tech hidden — and of how those tools may have specific impacts on marginalized communities. This lack of transparency makes it difficult for both individual protestors and advocates to know when, for example, a surveillance tool was a contributing factor to an unjust arrest during a demonstration.

Furthermore, human rights risk assessments are expected to consider both current, actual impacts and potential future impacts, according to the UN Guiding Principles on Business and Human Rights – meaning they’re meant to be forward-looking, and to mitigate harms based on what is likely to happen. In the absence of federal privacy legislation regulating biometric privacy (and regardless of a growing number of state regulations on facial recognition and related tools), companies can – and should – account for these effects, as should any city or municipality that is weighing whether to adopt such technologies for law enforcement efforts.

The ability to exercise the right to peaceably assemble, for all people, is both vital to functional free expression and essential to a healthy democracy. Any chilling of that right, whether through the widespread use of surveillance technology or other means, is to our collective detriment.

Authors

Ariana Aboulafia
Ariana Aboulafia leads the Disability Rights in Technology Policy project at the Center for Democracy & Technology. Her work currently focuses on maximizing the benefits and minimizing the harms of technologies for people with disabilities, including through focusing on algorithmic bias and privacy ...
