US Congress Must Restore Fairness Protections to Privacy Bill

Laura MacCleery / Jun 26, 2024

June 13, 2024 - The United States Capitol. Justin Hendrix/Tech Policy Press

Until this week, the US Congress had been considering the most promising bipartisan privacy bill in several years—the American Privacy Rights Act (APRA). The bill is currently scheduled for a markup on June 27 in the House Energy and Commerce Committee, but many civil society groups, including UnidosUS, are calling for a delay because a new draft strips key consumer safeguards from the bill and opens regrettable loopholes for data stored on “devices” such as mobile phones.

In a few key areas—including housing, employment, and credit—laws already bar discrimination, but even there, the standards are poorly enforced, and securing any accountability is a practical challenge for those affected. Many other consequential areas that involve data-driven decision-making—including education, workplace surveillance, and policing and sentencing, among dozens of others—remain a Wild West and lack even rudimentary protections for fairness.

The artificial intelligence (AI) explosion means that these sorts of complex decision models will drive opportunities across nearly every conceivable area of our lives. It was therefore fitting that the prior version of the bill barred use of sensitive personal data to discriminate against consumers and required tech companies to test algorithms and submit data showing that outcomes are fair.

In fact, it’s both naïve and uninformed to assert, as Sen. Ted Cruz (R-TX) has, that the prior version of the bill was “woke” merely because it included such assurances of fairness. Because AI systems form predictions based on patterns, bias is a major concern—but it can be surprisingly specific, and ample evidence shows these systems can discriminate against anyone, regardless of race or ethnicity.

For example, a 2024 Bloomberg investigation of an employment hiring system based on ChatGPT showed it would problematically favor certain demographic groups for specific roles—such as recommending Asian women as top hires for investment jobs and disfavoring white men—and that men in all four demographic groups studied were down-ranked for human resources positions, regardless of race or ethnicity. For every role examined, at least one demographic group would have had an actionable claim under federal employment law. Another example of these systems’ inherent unpredictability is a hiring algorithm that became infamous for preferring applicants named Jared who played high school lacrosse.

At their core, AI models match patterns found in vast volumes of data, “learning” inferences along the way. Basic checks for fairness help ensure that mistakes are not made and that bias in any direction, even an unanticipated one, does not affect the outcome. Without such checks, as AI is deployed across many areas of life, we will be unable to ascertain whether it is treating applicants, students, or patients fairly.
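To make the idea concrete, the sketch below is a purely hypothetical illustration of one very simple kind of fairness check an auditor might run: comparing selection rates across demographic groups against the “four-fifths” rule of thumb used in US employment law. The bill does not prescribe any particular test, and the group labels and outcomes here are invented for demonstration.

```python
# Purely hypothetical sketch: a minimal selection-rate audit of the kind an
# algorithmic fairness assessment might include. Group labels and outcomes
# are invented for illustration; real audits would use real decision logs.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, was_selected) pairs."""
    totals, picked = defaultdict(int), defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        if was_selected:
            picked[group] += 1
    return {group: picked[group] / totals[group] for group in totals}

def four_fifths_check(rates, threshold=0.8):
    """Flag any group whose selection rate is below 80% of the best group's rate."""
    best = max(rates.values())
    return {group: (rate / best) >= threshold for group, rate in rates.items()}

# Hypothetical hiring outcomes: (demographic group, hired?)
outcomes = [
    ("Group A", True), ("Group A", True), ("Group A", False),
    ("Group B", True), ("Group B", False), ("Group B", False),
    ("Group C", True), ("Group C", True), ("Group C", True),
]
rates = selection_rates(outcomes)   # Group A ~0.67, Group B ~0.33, Group C 1.0
print(four_fifths_check(rates))     # Groups A and B fall below the 80% threshold
```

A real assessment of the kind the earlier draft contemplated would go much further, testing many decision contexts and sensitive attributes, but even this crude first pass shows how a disparity can be surfaced before a model is deployed.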

As a blog from the Federal Trade Commission warned last week, AI “avatars and bots can collect or infer a lot of intensely personal information.” Now more than ever, we urgently need a national privacy law. And support for the bill has been bipartisan: Rep. Gus Bilirakis (R-FL), Chair of the Energy and Commerce Committee’s Innovation, Data, and Commerce Subcommittee, praised the bill for giving “Americans the right to control their personal information, including how and where it is being used, collected, and stored,” and for creating a national framework for “consistent rights, protections, and obligations” for all consumers. At the subcommittee’s markup of the bill, he also lauded the algorithmic assessments as a positive step to “prevent manipulation of Americans.”

Lawmakers’ sense of urgency is responsive to public worries. A 2023 Pew survey found that 81% of Americans are concerned about how companies use the data they collect, and that 68% of Republicans and 78% of Democrats support regulation of what companies can do with their personal information. The same polling shows that the more people know, the more anxious they become. The lack of basic privacy protections in the US is also a sore point for voters—both Democrats and Republicans support more rules of the road for use of consumer data. A 2023 UnidosUS poll found Hispanic voters’ top concern about AI is that it would reduce personal privacy.

Yet moving forward without baseline protections could permit discriminatory data practices to remain undetected and unchallenged. APRA will be a major step forward only if it includes actionable new safeguards against unfair forms of bias while preventing collection of our personal data without consent. Lawmakers’ failure to recognize the need for baseline fairness is short-sighted and—as we move to integrate AI into everything—could harm any one of us.

To be clear, APRA is far from perfect—for example, the bill side-steps hard questions about unwarranted surveillance by governments. Such risks will need to be addressed in future legislation. And remedies for people harmed in states with existing privacy safeguards should remain available to them, as the bill currently provides. In addition to the flaws above, the latest version also opens a massive loophole for data stored on our devices, and lawmakers will need to eliminate or substantially narrow that exemption to accomplish anything real on privacy.

Still, a version of APRA with the fairness checks restored would be the best vehicle federal lawmakers are likely to have for the foreseeable future to deliver on consumer privacy—a hot-button electoral concern for millions of Americans across the political spectrum. Policymakers should not walk back this chance at progress or neuter its provisions out of ignorance about how AI works and the risks it entails for everyone. They should also keep their eyes on the real target: creating recourse for consumers to stop the constant extraction and sale of highly personal data without our consent.

Authors

Laura MacCleery
Laura MacCleery is Senior Director for Policy and Advocacy at UnidosUS, the nation’s largest Latino civil rights and advocacy organization. She has deep expertise in regulatory design guided by public interest principles and has advocated for more than 20 years for changes that benefit human lives a...
