Centering Disability Rights in US Tech Policy 35 Years After ADA

Ariana Aboulafia / Jul 24, 2025

Ariana Aboulafia is a fellow at Tech Policy Press. Audio of this conversation is available via your favorite podcast service.

This weekend, the Americans with Disabilities Act (ADA) turns 35. Signed into law on July 26, 1990, by then-President George H.W. Bush, the law provides broad anti-discrimination protections for people with disabilities in the US, and has shaped how people with disabilities interact with various technologies. To discuss how the law has aged and what the fight for equity and inclusion looks like going forward, I spoke with three leaders in the disability rights community—Maitreya Shah, Blake Reid, and Cynthia Bennett—working at the intersection of disability and technology.

  • Maitreya Shah is the tech policy director at the American Association of People with Disabilities.
  • Blake Reid is a professor at the University of Colorado.
  • Cynthia Bennett is a senior research scientist at Google.

During the conversation, we discussed the different roles that academia, civil society, and industry can play in enshrining disability inclusion in both technology and tech policy. As Maitreya Shah mentioned:

I feel like we are wearing several different hats, but I think the most important work is to really see that people with disabilities have a voice and a seat at the table where any decisions related to technology are being made, whether those are on the manufacturing or production side or on the government or regulation side.

We also discussed the impact of the Americans with Disabilities Act, and how advocates have continued to face barriers in ensuring that the vital and wide-reaching civil rights statute applies not only to the web, but also to AI tools and other emerging technologies. Blake Reid said:

One of the challenges that Congress faced in drafting the ADA was trying to think about how do we… define all the parts of the world that we want to be equitable, that we want to be accessible, that we want to be inclusive and articulate exactly what that means… We do all these things online now, but Congress was talking about all these places that have these roots in the physical world. So, what do we do about that?

Perhaps most importantly, we spoke about the importance of thinking about how technology and tech policy impact people with disabilities, even if tools or systems are accessible. Cynthia Bennett described this:

There can be kind of disproportionate impacts of technology on people with disabilities that are more far-reaching than addressing direct interactions… for example, the way generative AI generates text about disabilities. Some of it might have false or incorrect information. Representations of disability in text and images can often replicate stereotypes, or present people with disabilities in very negative ways.

As technologies become more prevalent, people with disabilities are impacted in every aspect of life—this conversation illustrates the importance of ensuring that people with disabilities are properly centered in every step of the creation and implementation of tools and policies.

What follows is a lightly edited version of the transcript of the discussion.

People on phones (portrait) by Jamillah Knowles & Reset.Tech Australia / Better Images of AI / CC by 4.0

Ariana Aboulafia:

My name is Ariana Aboulafia, and today's podcast, as a framing, is going to be about the upcoming 35th anniversary of the passage of the Americans with Disabilities Act. I have some great people here who work at the intersection of technology and tech policy and disability, and we're going to be talking a little bit about how those things intersect and how tech and tech policy issues impact people with disabilities. In light of that, I'm going to start with Cynthia. If you want to go ahead and introduce yourself, talk a little bit about what it is that you do, we'd love to hear it.

Cynthia Bennett:

Hi, my name is Cynthia Bennett. I'm a research scientist in Google Research, specifically in our human-centered AI organization. I am a researcher by training, which means I ask questions about the world and, luckily, cool people, often people with disabilities, have been generous to spend time with me to explore the research questions that I have. Currently, I am focused on two areas. One is understanding how generative AI represents people with disabilities and developing evaluations, so that AI technologies can be better about representing people with disabilities more respectfully. By generative AI, I'm referring to tools that automatically generate text or images, now we think video, even music, but specifically tools that generate content by answering a prompt that a user might type in or request. The second area of my research is novel-applied accessibility solutions, and I often work with different stakeholders to understand the impacts of novel accessibility solutions on them. For example, recently I worked with artists who have disabilities to understand how generative AI tools, like the ones I just described, could affect their workflow.

How could automation maybe enhance some of the accessibility or remove accessibility barriers? But also at the same time, generative AI is critiqued for copyright infringement and other ways of threatening artists' work. Those are just a few of the things I'm working on, but largely, AI ethics in disability.

Ariana Aboulafia:

Thank you so much. Maitreya, do you want to go next?

Maitreya Shah:

My name is Maitreya Shah. I'm an attorney and I currently work as the director of technology policy at the American Association of People with Disabilities, AAPD. AAPD is a national disability rights and cross-disability organization. We work in several different areas to increase the political and economic participation of people with disabilities. I currently lead AAPD's work on all things technology policy. That includes assistive technology, digital accessibility, broadband and telecom access, and AI and emerging technologies. Some of the work that we do includes collaborating with industry on ensuring that disability is included in their product development and design. We also do advocacy and policy interventions at both the federal and the state level, so we try to work on model governance instruments, legislation, bills, and other regulatory statements. We are also trying to create a research-to-policy pipeline. That means two things.

One, we are trying to start our own projects where we do our research and then see that translated into policy interventions, but we also partner with academia and research institutions to see that the research makes it to wherever it needs to go, either to the legislators or to the industry, and so on. We do work in several different areas and with many different stakeholders, but I feel like the most important work that we do on technology is to engage with the disability community and see that people with disabilities are both informed about technology and that their voices are heard.

Ariana Aboulafia:

Thanks so much, Maitreya. We'll kick it over to you, Blake.

Blake Reid:

Thanks, it's so nice to be here. Hi, everyone. My name's Blake Reid. I usually introduce myself as a failed computer scientist, and I should probably add failed lawyer to that as well. I'm a law professor, I teach at the University of Colorado. Had a short-lived career as a computer scientist and then a somewhat longer-lived career as a legal clinician, and I worked in tech policy clinics that focused on issues that start in telecommunications law and internet law and copyright law, but ended up spending a lot of my clinical career working with different disability communities on much of the same work that Maitreya just described. In particular, I spent a lot of time at the Federal Communications Commission on the implementation of the 21st Century Communications and Video Accessibility Act, which is focused on both the accessibility of video programming, the accessibility of communications technologies, and also did a fair amount of work on limitations and exceptions to copyright law to facilitate the transformation of materials from one format to another.

For example, the transformation of books into audio formats for accessibility purposes. These days, I've hung up both my coding and my lawyering spurs and I teach copyright and telecom law, but I also write a lot about the history of legal and policy interventions in technology accessibility. I spend a lot of time looking at, "How have laws like the Americans with Disabilities Act aged as technology has come on the scene and transformed different aspects of society that the ADA was meant to protect as areas of equality and inclusion? How has it fared over time, and what does that mean for the efforts that Maitreya is talking about going forward?" Looking forward to the conversation.

Ariana Aboulafia:

Thanks so much, Blake. We're going to talk quite a bit more about that, about the ADA, how it's aged, and what it means in the context of tech and so many emerging technologies that we're seeing. I want to talk a little bit about what Blake just mentioned. Everyone on this call knows, and I'm sure we'll talk about it more, but there is a rallying cry of the disability rights movement: "Nothing about us without us." That has been modified to mean, "Nothing without us." What that means is that, really, all issues are disability issues, and particularly with technology, right? All issues in tech and tech policy are issues about disability, not just issues of accessibility, although we'll talk a little bit about that, too. The Americans with Disabilities Act is a broad anti-discrimination statute, and it has led to very significant progress toward equality for people with disabilities, particularly in the United States, but tech tools pose potentially new challenges.

I use the term "tech-facilitated disability discrimination" to encompass the wide umbrella of ways in which people with disabilities may face discriminatory or disproportionately negative outcomes as a result of interacting with technologies, and that can be in employment or in education or when applying for benefits, and all sorts of things. I think that this is going to be a two-parter, and Blake, I'm going to kick it to you because you brought it up, but the question is, in what ways in your work do you see tech potentially contributing to discrimination against people with disabilities? But also, how does the ADA, in your opinion, age, and how can it be used to combat some of those harms, whether in practice or in theory? Blake, I'm going to start with you.

Blake Reid:

Well, maybe I'll choose an example that implicates the ADA directly, and that's the web. I know that's something that we'll talk about a lot today. When the ADA was being drafted back in the late '80s, it followed on both an earlier strain of disability law, the Rehab Act, which we've mentioned already, which is all about the accessibility of government services, and then it followed on... You can think about it as part of the broader outgrowth of the Civil Rights Movement and the Civil Rights Act in particular, which has a provision, Title II, that prevents discrimination in places of public accommodation. One of the challenges that Congress faced in drafting the ADA was trying to think about, "How do we take what Jacobus tenBroek called 'the right to live in the world' and articulate what the world means?" How do we concretize that to say, "We want to define all the parts of the world that we want to be equitable, that we want to be accessible, that we want to be inclusive, and articulate exactly what that means"?

It's difficult to just say "the whole world," because when we're thinking about enforcing a law, we want to get specific, we want people to understand what that means. The compromise that Congress settled on in drafting the ADA and thinking about, "What are the public places run by private entities," we call them "places of public accommodation," "that are covered under the ADA?" They define several categories. They say, "places of recreation and exercise," things like gymnasiums and golf courses and bowling alleys. They say, "things like restaurants and bars that serve food." Clothing stores and other places that sell goods. They talk about different kinds of service establishments and use similar sorts of examples. They start trying to paint the world both in broad strokes, by identifying all these categories, and then by providing all these really specific examples. The trouble is that the ADA is drafted in 1990, when the World Wide Web is just a glimmer in Tim Berners-Lee's eye.

We're just about to see the launch of the first website, it comes something like five months after the ADA is signed into law. Then, as the courts start to interpret the ADA, they take a look at websites, we start asking questions about, "Well, what does it mean for a website to be accessible? How do we construct the architecture in a way that it works with screen readers and so on?" We suddenly have to confront this really hard question and say, "All the examples that Congress talked about in the ADA are these physical locations. They're these types of places that we have always associated with physical locations, so how do we make sense of these new places that are in virtual reality that are, in many ways, transforming how society works and providing access for lots of people to things that they would not have had access to in the real world?"

This is the part where technology giveth, right? For people with physical and mobility disabilities for whom getting around the built world of a city is just incredibly time-consuming and challenging, because of all the barriers in the built world, suddenly, we can use a computer to shortcut past all of that. But now we've introduced this new digital architecture that also poses all of these barriers to the assistive technology that people use to operate computers, screen readers and so forth, which I know Cynthia and Maitreya will probably have a lot to say about. Then we confront this legal question, which is, "Are these new digital places the kinds of places that Congress meant to cover?" We see the courts really get stuck on that question.

They really get stuck on, "On the one hand, it has replaced all the things that we used to do at banks and stores and bowling alleys, we do all these things online now, but Congress was talking about all these places that have these roots in the physical world, so what do we do about that?" I think it illustrates the struggle that the law has with keeping up with, "How do we shoot broad? How do we embed this value of accessibility, of equitable access, of inclusion into the future while being specific enough to make that understandable, to make that concrete for people?" I think it's a real challenge and I'll leave it there.

Ariana Aboulafia:

Cynthia, I'm going to turn it to you.

Cynthia Bennett:

I don't have expertise in law, but I'll provide a couple of examples. Laws like the ADA, from my understanding, are aimed at ensuring that people with disabilities have equal opportunities, and this, to me, relates to direct engagements: someone using a piece of technology. That makes a lot of sense, but there can be disproportionate impacts of technology on people with disabilities that are more far-reaching than addressing direct interactions. For example, where I work with generative AI tools that are generating content, some of this content will be disability-related, and it may or may not be generated by someone who identifies or is perceived to have disabilities, but still, that topic can affect how people with disabilities are treated or conceived of. I understand the impacts of bias and technology-facilitated disability bias, as Ariana has articulated, to be much more wide-ranging than understanding and addressing people with disabilities' direct engagements with technology.

For example, the way generative AI generates text about disabilities: some of it might have false or incorrect information, and representations of disability in text and images can often replicate stereotypes or present people with disabilities in very negative ways that are not preferred by the people with disabilities I've worked with to evaluate these models. I would argue that is a way people with disabilities are being impacted that is much more widespread than the ways I often hear about laws like the ADA applying. The second example I'll give is, while there are web content accessibility guidelines, and many branches from there developed by various companies and organizations, those tend to direct website developers to, again, ensure that a person using an access technology, such as a screen reader or a keyboard or switches, can engage with a website.

Those guidelines have not yet caught up. Something that I often talk about is, again, when we talk about AI and interfaces where users can prompt something and there's an output, the user or even the website developer can't predict what that output will necessarily be, so we need to develop new guidance for how to ensure that these unpredictable or changing outputs are accessible to users. That is a way that I see guidelines really needing to be expanded, and one that I don't find is often covered.

Ariana Aboulafia:

Maitreya, I'm going to turn it to you as well, if you want to talk a little bit about how... Because I know you do work and AAPD does work both on the accessibility side, on the telecom side, on the assistive technology side, and on the algorithmic bias side. Maitreya, you truly do it all, so maybe you can talk a little bit more about how tech contributes to discrimination in your work and what you come across, and then any thoughts on the ADA and how it does or does not address some of those concerns?

Maitreya Shah:

I think I will build on what Blake and Cynthia shared. I completely agree on the limitations of the ADA as well as the numerous challenges we are facing, and those are some that we encounter on a daily basis in our work at AAPD. Just to talk about technology-based discrimination, I think the conversation has really been restricted to the idea of accessibility for many decades, as we all know, and we've not really gone beyond the web content accessibility guidelines, so websites and web applications, and so on, and realized that there are many, many other issues beyond just the accessibility of websites or web applications that affect people with disabilities. Some of the examples both of my co-speakers already shared, but there are others, such as... AAPD's president and CEO, Maria Town, recently wrote about this: how mammography machines, which are also a form of advanced technology, are usually not accessible for people with disabilities, depriving them of crucial cancer diagnoses.

The same exists with other medical devices as well. That's one piece of that larger work, but then in this new boom, this new hype of AI and algorithmic technologies that we are seeing, people with disabilities, I feel, are pushed or, I would say, further pushed to the margins, whether that be in employment, in education, in social security, or in immigration. There are many, many issues where people are either excluded, face inaccessibility, or face bias and discrimination. As far as the ADA is concerned, on top of what Blake already said, we have been told by the Department of Justice for so many years now that there will be a federal regulation on digital accessibility. We haven't seen that materialize yet. We just got the Title II web accessibility regulation last year for state and local governments, and that is also just limited to websites and web applications.

We have not even reached the stage where we can talk about the accessibility of AI systems or the content, as Cynthia shared, that is generated by AI systems, and so on. There are also other issues. For example, how much do the employment discrimination-related provisions in the ADA cover algorithms? The EEOC had published some guidance and had been trying to do some enforcement, until the previous administration at least, but there are many, many unanswered questions there as well. How could employers keep disability data protected when they are suddenly using so many algorithmic tools for recruitment and hiring? I know CDT has written extensively on this issue before, but there are many other issues as well, so I feel like there is this big issue of privacy and how the ADA is framed that is posing problems in the age of technology.

We often raise the issue that a lot of AI technologies are not trained on disability data, while on the other hand, the ADA treats disability as a, quote-unquote, "protected characteristic," meaning that disability information is legally protected. This poses a privacy conundrum, as Professor Jasmine Harris has written extensively about, where on the one hand, you are keeping disability data protected to prevent stigma in society, and on the other hand, you don't have enough disability data to train your AI systems, which, in a way, results in so much bias and discrimination against people with disabilities. I feel like there are many limitations to the ADA in different quarters, whether that be employment, whether we are thinking about tech development and design, or whether we are thinking about students with disabilities and how AI affects them, and so on.

Ariana Aboulafia:

Thanks, Maitreya. Yeah, I'm actually going to kick it right back to you to talk a little bit more about some of these things, but I'll set out some framing. As you all have said here, we still have quite a long way to go, even just when we're thinking about accessibility, web accessibility, app-based accessibility, and then, secondarily, things like the accessibility of AI systems and AI tools, and all of that. The accessibility of these tools, of the web, and of other emerging technologies is not a foregone conclusion, but a lot of the work that we do at CDT is to think about what concerns for people with disabilities when they interact with technologies look like when we're not thinking solely about accessibility.

The reason that we frame it in that way is partially because, when you think about the way that a lot of folks in the tech community may, as a reflex, think about disability and technology, they may think about it in a silo of accessibility and think that if, let's say, a product is accessible, then that checks every box when it comes to people with disabilities. But Maitreya, as you mentioned, there may be really intense privacy concerns, and those concerns may be exponentially worse for people with disabilities who may have to share, let's say, sensitive health or disability-related data to use a technology or an assistive tool or something like that. It's not just privacy; there are some tools or some technologies that may not work for people with certain disabilities. When we think about those tools being incorporated into, let's say, travel or the hiring process for employment, it's really difficult to see how those tools would not have a discriminatory impact on people with disabilities.

Maitreya, I want to kick it back to you to answer this question and talk a little bit more about, not to say that all of your work or all of any of our work is outside the context of accessibility, but what it means to work at the intersection of tech and disability in a way that breaks beyond solely thinking about tech and disability as meaning accessible technology.

Maitreya Shah:

I really like this question. I think all of our thoughts are quite aligned on some of these things, and I know that we share a lot of work in this area as well. I like the framing of thinking beyond accessibility, because to answer your question... I feel like when we talk about accessibility of technologies, we also start with many assumptions. One of the first assumptions that we are trying to address right now and tell the organizations and partners that we work with is that, when you just talk about accessibility, and particularly digital accessibility, you start with an assumption. An assumption that people with disabilities are going to use a website or a web application and have that access, and then they will face some accessibility barriers, because the website or the digital app, or something, was not designed in an accessible way, but one of the biggest assumptions that get shattered here is that so many people with disabilities in our country don't have access to the internet.

In some of the recent debates on the constitutionality of the Universal Service Fund, and a lot of issues with digital equity grants under the Federal Communications Commission and the BEAD program and other things, we have been trying to tell people that people with disabilities don't have access to the internet, and that is a massive digital divide that people with disabilities face. We have this large group of people who can't even get onto the internet or use a smartphone, so accessibility issues are secondary for them.

I think this conversation goes on to think about many, many other ways in which this intersection of technology and disability affects people. To talk about other forms of disabilities that we work with, there is this growing market of wearable healthcare technologies, especially things like diabetes monitors that are usually provided by doctors to people with disabilities who have diabetes, and the amount of health data these devices collect, how it is used as a disability proxy, and how much it can tell about a person is just so alarming and raises all the privacy concerns that you just talked about. If I give you another example of going beyond accessibility, there is an ongoing case called Mobley v. Workday, Inc. In this case, the plaintiff has submitted proof of how he had applied for about 100 jobs and got rejections from all of them. What was common was that there was a single platform that all the employers were using to hire applicants.

Here, the issue was not accessibility. Here, the issue was that the person was screened out of all his applications, sometimes receiving rejections at 12:00 AM, 1:00 AM, 2:00 AM, when you would not expect any humans to be working or present in the office reviewing applications. You would know that there's an algorithm screening these applications, one that has a potential disability bias. Just fixing accessibility in all these scenarios would not change the circumstances for people with disabilities. I think we'll have to start thinking broadly about access, about inclusion, about agency, about privacy, about epistemic justice, and many, many other intersecting themes.

Ariana Aboulafia:

Cynthia, do you want to jump in and add anything? Again, the framing question is, in your work and in general, how you think about how tech can impact people with disabilities outside the context of accessibility, or just thinking about accessible tech. I know some of your work has touched that.

Cynthia Bennett:

Just quickly, echoing what others have said and what I've said before, particularly thinking about the information disseminated by AI systems, there can be lots of different harms, and I think the lines would be fairly blurry: misinformation, or the inability of an AI system to provide relevant or accessible information that would be useful or correct for people with disabilities and their various accessibility needs. Then, as I mentioned, there are also the ways that information disseminated via AI systems is harmful, maybe not necessarily directly affecting people with disabilities in every moment, but contributing to a broader false narrative about disability.

Ariana Aboulafia:

Blake, do you want to add anything?

Blake Reid:

Yeah, I will add a thought on this. I think the way we frame things as advocates for disability rights or disability justice, and at what level of abstraction we frame the problems we're approaching, is really important. I think about the work of people like TL Lewis, who is focused on abolition and carcerality as a level of abstraction, for example. I think those choices matter a lot, right? It makes a big difference, as Maitreya is alluding to, when you frame the problem as, "Well, let's think about the particular structure of outputs of a large language model and whether they are reflecting some form of implicit bias." That's a really different problem, much harder in certain ways, much easier in certain ways, than, "We are trying to figure out how to help incarcerated disabled people get access to basic communications with the outside world."

Those are just worlds apart in terms of the legal surfaces that are involved and the technologies that are at issue. I think the question is really important, and I don't know if I have a good answer to it, but one thing I'll say is, as we have seen both technology evolve and the accessibility of technology evolve, we have complicated the political economy of advocating for improvements at all of these layers of abstraction. I'm being a little obtuse here, so I'll try to drive this point home with an example, which is, when you look at a lot of the early technology accessibility movements, there's real interest convergence with deeper forms of subordination and exclusion. I want to take closed captioning as an example. Where does closed captioning come from? It comes from the introduction of talkie movies.

Now, we can talk about closed captioning as an abstract technology that makes video programming accessible for folks who are deaf or hard of hearing, but when talkie movies came on the scene and disrupted the silent movie industry, it was a form of exclusion, both of labor and of social life. All of the deaf actors that worked in silent movies were essentially, overnight, kicked out of the industry. This is a disruption to livelihood, a disruption to the ability to create, to participate in a creative community. It took the biggest form of entertainment, which had been equitably accessible to the deaf community up until that point, and said, "The movies are no longer for you, and that social nexus of the movies is no longer for you." The approach to remedying all of that is the closed captioning movement, right?

Now we need to figure out ways to translate the audible forms of a soundtrack into visual depictions and display them on a screen, but there's real interest convergence with that in seizing back labor power, seizing back social inclusion as bigger ideas. Once you solve some of that problem, you let a lot of the air out of the balloon for advocating around it, and I think that's what's really hard about AI, because so much of the exclusion is difficult to find, right? It's difficult to make legible, it's difficult to communicate how it materializes in the world, and that makes it really difficult to advocate around. So I would just say this question of abstraction, and how we frame harms as being about technology accessibility versus something else, is a really, really important question, both for the strategy and tactics of movements and for just understanding the marginalization and the barriers that people face. "What are they at root?" is a really contestable question, so I think it's important we grapple with it.

Ariana Aboulafia:

One of the cool things about having you all on this podcast, although many of us wear many hats, is that we have representation from Blake, who is a professor, works in academia, and Cynthia, who works in the tech industry, and Maitreya who works in civil society, in the disability rights space in particular. Earlier this year, in March, CDT and AAPD together published a report where we thought about, "What would it take and what would it look like to create an ecosystem that would be inclusive of people with disabilities regarding the use of AI in particular?" One of the things that we thought about and grappled with is the different roles of civil society and academia and industry and also secondarily disability rights and justice advocates and people with disabilities and legal aid attorneys and government agencies, and all of the different pillars that would have a role in regulating or developing or using these technologies, and what would it look like for all of those players to work together to build a disability-inclusive AI ecosystem.

Since we have some representatives from some of those areas here with us, I wanted to ask the question of, what do you think is the role respectively of industry, if you're an industry or civil society, if that's where you are, academia, in creating a disability-inclusive tech or AI ecosystem? Cynthia, I'm going to start with you on that.

Cynthia Bennett:

I work in industry and I think my response is idealistic. I see the role of industry as being able to drive development forward, and I say that with a lot of caveats. In a magical industry company, the regulations are created by the people, and the company works with people with respect to those regulations, but doesn't necessarily own that or take charge of that. It can take charge of driving these development processes to provide scaled access and scaled innovation when that is deemed appropriate, again, by civil society and the people. I also see, a lot of times in industry, there's a focus on scaling, but I would also argue that in industry it is possible to have the resources to create, yes, maybe a centralized tool or experience, but also the customizations and individualized experiences that can be harder for smaller organizations just because of a lack of resources.

But when we have these larger concentrations of resources, there are opportunities to explore, "What are the unique user journeys and experiences that different people need?" And to enable that. I guess that's an idealistic view, and if others on the call in different sectors have suggestions for how industry can be better, I would love to hear them, but I feel fairly strongly that certain things, like data stewardship and ownership and setting regulations, I feel that, when industry is in charge of those, there can be conflicts of interest. I would really like to see industry be driving development that responds to the people.

Ariana Aboulafia:

Blake, I will kick it to you.

Blake Reid:

All right, I'm going to pick up the industry point and then I could say a word about academia, which is to say... One of the things I study is the curb-cut effect, which is the idea of technology interventions designed for accessibility purposes that have positive externalities, positive impacts for non-disabled people. I think, for better and for worse, that drives a lot of how the industry approaches accessibility, which is like, "If we make our technology more accessible for everyone, it will be better for everyone. It'll make our technology more valuable and we'll get the virtuous cycle of positive effects," and I think that could be true to an extent, but that's not the only set of problems and solutions that exist when we're talking about the impact of technology on people with disabilities, the role that people with disabilities have in the crafting of technology.

I think that's where, in the first order, law and policy have a really important role to play. Obviously, as academics, we have some role in outlining the world of law and policy and communicating about it and observing the patterns that have occurred over time, and all of that sort of thing, but I think the real story, as I look back at the history of this stuff, is that the most important thing is the political agency of disability communities, and thinking about how, in all of the silos that you've mentioned, Ariana, that becomes real, that traction begins to happen. Sometimes it can be in the law and sometimes it can be in industry and sometimes it can be in academia, but I think that's the core focus, right? It's all about, "How do communities' voices become concretized and matter and drive change, whether that is inside of a company in their design process when they're training LLMs, when they're designing services, when they're developing tools?" Whatever it is, when they're approaching data collection and storage and protection.

Does it happen in law and policy when we're crafting new legislation and rules? Does the community have a seat or maybe the biggest seat at the table? Are they in the agencies? Are they in the legislatures? Are they represented in Congress? I think it matters in academia as well, right? Are there ideas that are coming out that are representative of where the communities want things to head? That's what I think about when I think about this question of, "How do the different silos contribute?" I think there are lots of different ways, but that core goal of it all is building political capital and political agency and figuring out how to spend it in the right ways.

Ariana Aboulafia:

I agree. Maitreya, do you want to talk a little bit about being at a major disability rights organization and being in civil society?

Maitreya Shah:

At AAPD, I feel like we are quite uniquely positioned in the United States as the only disability rights organization that works or that focuses on cross-disability issues and has both the experience and the expertise of working on a wide range of technology issues. That means many different things. We obviously do a lot of advocacy with companies, push back on bad regulations, criticize problematic technologies, and many other things, but I feel like some of our most important roles include, one, to build awareness around the impact of technologies amongst the disability community. People with disabilities obviously use and are subjected to many different forms of technologies, particularly AI, and they're often not adequately informed about the risks, the harms, and the actual real implications of these technologies. As an organization that is deeply involved with the disability community, we try to build more awareness amongst people with disabilities, give them more information on what they need to know about a particular technology, and so on.

But we also work with governments and industry on the other side of this issue, and I think this partly answers the industry question that was raised. Something that I've seen over the years is that industry often develops products and then approaches people with disabilities to either test the products or to see how they're working or whether they're accessible or not, and I feel like a lot of that has come out of the compliance-related questions that the legal teams and the policy teams would be sitting together and thinking about, but they don't really solve the access, inclusion, and discrimination problems that we've all been talking about in this podcast.

At AAPD, we are also trying to work with the industry to see that disability gets included from the very beginning, to see that we work with the industry on approaches such as privacy by design and inclusion by design, to see that people with disabilities, disability data, and the harms that people with disabilities face are all included at every stage of a technology life cycle, whether that be the ideation phase, the design or development phase, when it comes to the market, or when we are thinking about regulation, whether that be self-regulation by the industry or state-sponsored regulations with the government. I feel like we are wearing several different hats, but I think the most important work is to really see that people with disabilities have a voice and a seat on the table where any decisions related to technology are being made, whether those are on the manufacturing or production side, or on the government or regulation side.

Ariana Aboulafia:

I'll just hammer in on that, too. I think so much of what we do here at CDT and so much of what we talk about is making sure that people with disabilities are centered both in tech, Maitreya, like you said, in the development and all of that, and the tech itself, but also in tech policy. I think it is very much the north star of the work that we do and the work that so many of us do. I know that we are coming up on time, I just want to thank all of our guests again for being here. Blake Reid, Cynthia Bennett, and Maitreya Shah. Thank you all so, so much, and thank you for all the work that you do to make sure that people with disabilities are being centered in tech and in tech policy. Thank you, all.

Cynthia Bennett:

Thank you.

Maitreya Shah:

Thank you.

Authors

Ariana Aboulafia
Ariana Aboulafia leads the Disability Rights in Technology Policy project at the Center for Democracy & Technology. Her work currently focuses on maximizing the benefits and minimizing the harms of technologies for people with disabilities, including through focusing on algorithmic bias and privacy ...
