U.S. Senator Ron Wyden (D-OR), Senator Cory Booker (D-NJ) and Representative Yvette Clarke (D-NY-9) today introduced the Algorithmic Accountability Act of 2022. An announcement referred to it as “a landmark bill to bring new transparency and oversight of software, algorithms and other automated systems that are used to make critical decisions about nearly every aspect of Americans’ lives.” The bill is co-sponsored by Democratic Senators Brian Schatz (D-HI), Mazie Hirono (D-HI), Ben Ray Luján (D-NM), Tammy Baldwin (D-WI), Bob Casey (D-PA), and Martin Heinrich (D-NM).
The bill would empower the FTC to enforce violations of the legislation as unfair or deceptive practices.
This newly proposed legislation is broader in scope than the Algorithmic Justice and Online Transparency Act, proposed by Senator Edward Markey (D-MA) and Congresswoman Doris Matsui (CA-06) last year, as well as an earlier version of a bill with the same name from Sen. Wyden and Rep. Clarke introduced in 2019.
“When algorithms determine who goes to college, who gets healthcare, who gets a home, and even who goes to prison, algorithmic discrimination must be treated as the highly significant issue that it is. These large and impactful decisions, which have become increasingly void of human input, are forming the foundation of our American society that generations to come will build upon. And yet, they are subject to a wide range of flaws from programming bias to faulty datasets that can reinforce broader societal discrimination, particularly against women and people of color. It is long past time Congress act to hold companies and software developers accountable for their discrimination by automation,” said Rep. Clarke in a statement.
The legislation directs the FTC to create regulations that would require companies meeting certain size thresholds, such as those with revenues greater than $50 million or more than $250 million in market capitalization, to conduct impact assessments of how automated decision-making systems affect the lives of consumers. Companies would be required to produce documentation and submit it to the FTC, and to show diligence in producing the assessment, such as consulting with internal ethics and “responsible technology teams” and external “representatives of and advocates for impacted groups, civil society and advocates, and technology experts” in order to “attempt to eliminate or mitigate” negative impacts.
The legislation would also require companies to assess the privacy and security risks of automated systems. It would additionally require companies to “support and perform ongoing training and education for all relevant employees, contractors, or other agents regarding any documented material negative impacts on consumers from similar automated decision making systems” in order to facilitate best practices and the adoption of proposals from “advocates, journalists, and academics.” The FTC would also publish a repository of summary information on the assessments and update it on a quarterly basis.
The statement from Wyden’s office notes that the legislation is endorsed by groups and individuals including “Access Now, Accountable Tech, Aerica Shimizu Banks, Brandie Nonnecke, PhD, Center for Democracy and Technology (CDT), Color of Change, Consumer Reports, Credo AI, EPIC, Fight for the Future, IEEE, JustFix, Montreal AI Ethics Institute, OpenMined, Parity AI and US PIRG.”
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.