Want to Know About the Future of AI Equity? Watch What's Happening in California

Sareeta Amrute / May 18, 2023

Sareeta Amrute is Associate Professor of Strategic Design at Parsons, The New School and Principal Researcher at the Data & Society Research Institute.

California State Capitol, Sacramento. Shutterstock.

Two bills currently making their way through the California legislature could make the state a leader on tech equity.

The first bill, SB-403, which passed the California Senate on May 5th, clarifies that discrimination on the basis of caste, defined as “an individual’s perceived position in a system of social stratification on the basis of inherited status,” is unlawful under California’s anti-discrimination laws. By explicitly adding caste to California’s Unruh Civil Rights Act and Fair Employment and Housing Act, the bill recognizes a type of discrimination that cuts across the countries and religions of South Asia (Pakistan, India, Sri Lanka, Bangladesh, Nepal, Afghanistan, and the Maldives), as well as forms of discrimination based on inherited identity categories found in Africa, Latin America, and Japan.

Including caste in California’s anti-discrimination laws better equips the state’s Civil Rights Department to investigate cases of caste-based discrimination, which is often hidden and not easily recognized. It also encourages private employers to comply with the law and to ensure that caste-based discrimination is not occurring in their workplaces. And because SB-403 sends a clear signal to companies that caste must be taken seriously as a category of workplace discrimination, it will make it easier for those experiencing caste discrimination to come forward, especially when they fear reprisals compounded by a temporary visa status.

Even though detractors argue that adding caste to the Unruh Act would enable racial or religious discrimination against members of the South Asian Hindu community, SB-403 supplements, rather than overrides, existing legislation: discrimination based on religion, race, or gender would remain illegal. Given the additive nature of the bill, SB-403 has found wide support across the South Asian community in the United States. According to the South Asian Bar Association, “discrimination in any form is unacceptable, and the decision to take action against caste discrimination is a positive step forward in the pursuit of social justice.”

The second bill, the Automated Decision Tools Act, AB-331, introduced by Assemblymember Rebecca Bauer-Kahan (D-CA16), would prohibit the use of automated tools that result in algorithmic discrimination, which the bill defines as “unjustified differential treatment or impacts disfavoring people based on . . . [any] classification protected by California law.” Specifically, it would require deployers and developers of automated decision tools, such as machine learning and AI-backed hiring, sentencing, and credit-assessment applications, to perform impact assessments on these tools. These assessments, which would be publicly available and submitted to the Civil Rights Department, would describe the technology’s stated purpose, its benefits, the data it collects, an analysis of potential adverse impacts, and safeguards to address reasonably foreseeable algorithmic discrimination. The bill would authorize state prosecutors to sue deployers and developers where an automated decision tool results in algorithmic discrimination, and it would authorize individuals to sue on their own behalf starting in 2026.
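To make the assessment requirement concrete, here is a minimal sketch, in Python, of the kinds of information such an impact assessment would have to capture, based on the description above. The structure, field names, and example values are my own invention for illustration; AB-331 specifies the content of assessments, not any particular format.

```python
# A schematic sketch of an AB-331-style impact assessment record.
# Structure and values are invented; the bill prescribes content, not format.
from dataclasses import dataclass, field

@dataclass
class ImpactAssessment:
    tool_name: str                 # the automated decision tool being assessed
    stated_purpose: str            # the technology's stated purpose
    benefits: list[str]            # claimed benefits of deploying the tool
    data_collected: list[str]      # categories of data the tool collects
    adverse_impact_analysis: str   # analysis of potential adverse impacts
    safeguards: list[str] = field(default_factory=list)  # mitigations for
                                   # reasonably foreseeable algorithmic discrimination

# Hypothetical example for an automated hiring screener.
assessment = ImpactAssessment(
    tool_name="resume-screener",
    stated_purpose="Rank applicants for open engineering roles",
    benefits=["faster initial screening"],
    data_collected=["education history", "employment history", "affiliations"],
    adverse_impact_analysis="Selection rates compared across race, sex, and caste",
    safeguards=["human review of automated rejections", "annual bias audit"],
)
```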

Here is where the two bills meet: once caste is explicitly added to the state’s anti-discrimination laws, caste-based algorithmic discrimination also becomes unlawful under the new Automated Decision Tools Act. If these laws are enacted, California will become a leader in developing ways to test whether automation is producing discriminatory results across a range of products and across intersecting categories of ascription, including caste, race, religion, ability, sex, and immigration status.

As studies of automated decision-making have repeatedly demonstrated, the use of statistical markers of success or failure tends to reproduce historical patterns that are themselves discriminatory. Automated hiring tools, for instance, tend to filter for traits found in workers who currently hold positions, regardless of whether those traits relate to the job being filled. Many automated hiring tools scan resumes for particular colleges, hobbies, and affiliations, screening out candidates who do not match what a company or job category already contains. As such, automated tools can work as anti-diversity filters that discriminate against candidates based on proxies for race (such as which club a person joins or which colleges and universities they attended) or gender (such as which hobbies or volunteer services they pursue), as the sketch below illustrates.
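The following toy example shows this proxy mechanism at its simplest. It is not any vendor’s actual system; it merely scores applicants by how closely their resume traits match those of invented incumbent employees, which is enough to show how a filter can discriminate without ever seeing a protected attribute.

```python
# Toy illustration of proxy-based filtering in automated hiring.
# All names and data are invented; real screening tools are more complex,
# but the failure mode shown here is the one described in the text.
from collections import Counter

# Hypothetical incumbent workforce: the traits the tool treats as markers
# of "success", whether or not they relate to doing the job.
incumbents = [
    {"college": "Elite U", "hobby": "golf", "club": "Rowing Society"},
    {"college": "Elite U", "hobby": "golf", "club": "Alumni Network"},
    {"college": "Elite U", "hobby": "sailing", "club": "Rowing Society"},
]

# Count how often each (field, value) trait appears among incumbents.
trait_counts = Counter(
    (f, v) for person in incumbents for f, v in person.items()
)

def score(applicant: dict) -> int:
    """Score an applicant by resemblance to the current workforce."""
    return sum(trait_counts[(f, v)] for f, v in applicant.items())

applicants = [
    {"college": "Elite U", "hobby": "golf", "club": "Rowing Society"},
    {"college": "State College", "hobby": "mentoring", "club": "First-Gen Network"},
]

for a in applicants:
    print(a["college"], "->", score(a))
# Output: the first applicant scores 7, the second scores 0. College,
# hobby, and club membership act as proxies here: the filter reproduces
# the incumbent profile without ever seeing race, gender, or caste.
```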

These same patterns of discrimination may also apply to caste, since profiles might be matched to particular educational institutions or organizations that have historically excluded oppressed-caste peoples. In housing markets, tenant screeners that claim to predict rates of eviction are used with little oversight or assessment, leading to predictably biased outcomes. Here too, the use of proxies such as language and area of residence as measures of rental-worthiness may encode caste discrimination.

The fact is, there is little research on the effects of automation on caste bias in hiring, housing, or any other area of business and public life. Taken together, these two bills would lay the foundation for discovering, and then remedying, how all kinds of discrimination, caste discrimination included, affect life prospects as automated, AI-backed systems increasingly govern access to goods, opportunities, and services worldwide.
