
State Bills Aren’t Enough: The Case for National Legislation on Data Privacy and Civil Rights

Amanda Beckham / May 19, 2023

Amanda Beckham is the government relations manager for Free Press Action.

The breadth of data private companies have access to — about all of us — is virtually limitless. These entities know where you went to school, what religion you practice, your medical history, your political opinions — even what your family life is like.

These are concrete data points that are easily accessible from search engines and social media. And there are many other kinds of information companies collect about our behavior online. We are constantly generating data that reveal all of the above and much more about ourselves — and most of the time, we do this willingly for the convenience that many applications offer.

The problem is that we don’t know what happens to this information once it’s created — or who has access to it. And all of this data, when aggregated, represents the power to influence, manipulate and discriminate. Demographic and behavioral data that platforms collect could fall into the hands of advertisers that want to serve up fairly benign targeted ads. Or it could fall instead into the hands of those who wish to undermine our democratic processes by sowing mistrust in our institutions — or who routinely discriminate against people in digital markets.

Years of research, investigative journalism, activism, and whistleblower revelations have shown the harms caused by the companies collecting and misusing all of this data. On Facebook, for example, bad actors have exploited demographic data to discourage Latinx individuals from completing the 2020 census and to target communities of color with lies about the electoral process. Ironically, the U.S. government does billions of dollars in business with data brokers trafficking in this same information, in order to get around a 50-year-old law that prevents federal agencies from collecting it themselves. Without safeguards in place to protect our data from those who wish to harm us, both individuals and our national security are extremely vulnerable.

The best way to temper these threats is to target the source: the collection of sensitive data that companies may then sell or misuse. Comprehensive national standards for data privacy and protection are critical because gaps in protection anywhere undermine efforts to protect individual privacy everywhere.

But this has become complicated in the U.S. legislative landscape, as state and federal lawmakers often disagree on what approach to take. According to the App Association’s tracking of privacy legislation, 289 data privacy and protection bills have been introduced at the state level across the country. Clearly, those in power at all levels of government recognize that more needs to be done to protect our sensitive data. Still, a national landscape where data-privacy legislation varies from state to state leaves people unprotected in states without good laws.

An application or platform could have business practices that violate one state’s data-protection laws while complying with those of another, meaning people in the state with weaker protections invariably lose out. Industry representatives say that the clarity and uniformity of federal legislation would allow them to comply more easily with laws protecting our data-privacy rights.

For its part, the Federal Trade Commission (FTC) is attempting to protect consumers from data abuses through rulemaking. It has taken action against violators like BetterHelp and GoodRx for sharing users' health data with advertisers. But the agency's enforcement and rulemaking mechanisms are limited, and many large corporations treat FTC fines and settlements as just a cost of doing business. The FTC chair and majority commissioners have said that a new law augmenting the agency's authority would be the best way to protect U.S. consumers.

Existing national laws to protect consumer privacy don't account for our increasingly interconnected data ecosystem. These include the Fair Credit Reporting Act of 1970, the Health Insurance Portability and Accountability Act (HIPAA) of 1996, the Children's Online Privacy Protection Act (COPPA) of 1998 and the Gramm-Leach-Bliley Act (GLBA) of 1999. These laws don't cover areas of business where some of the biggest threats to online privacy exist today. And targeting individual platforms is akin to fighting a hydra: You can chop off one head, but it will keep growing back if you don't kill the beast at its core.

Take health-care data, some of the most sensitive and valuable data on the market. We're all familiar with apps that track our exercise regimen and sleep habits, manage prescriptions and supplements, and help manage mental-health conditions. These applications all generate sensitive health-care data. And HIPAA protects none of it: the law applies only to health-care and insurance providers, even though these apps collect much of the same information. If that information is shared via an app instead of with a provider, it falls outside the scope and protections of the law.

It’s especially critical to have comprehensive laws in place to protect the personal information of minors, which is extremely valuable to advertisers and often-unscrupulous data brokers. Protecting a minor’s data means very little when information about their parents, guardians, and other adults in their household is vulnerable.

Some lawmakers have fixated recently on TikTok and the potential national-security threat it poses, with its links to a nation with a poor human-rights record and an often-adversarial relationship with the United States. However, banning TikTok — or any app — is an authoritarian tactic that would limit the freedom of expression of millions of people. And such a ban wouldn’t address the larger issue: the need for a federal data-privacy standard to protect people regardless of the apps they use. Other apps, platforms and devices gather the same data as TikTok to inform their algorithms. And data brokers aren’t averse to selling this information to the highest bidders, including those with ties to other countries.

The burgeoning field of artificial intelligence (AI) demonstrates the need for federal privacy protections too. Following this week's Senate Judiciary grilling of OpenAI CEO Sam Altman, Senate Majority Leader Chuck Schumer reportedly plans to host a series of bipartisan meetings among lawmakers to study the potential abuses of this relatively new technology. But whatever they do after taking stock of it, Congress must prioritize laws that provide meaningful protections for sensitive personal data across all online platforms and technologies.

The national-security and civil-rights implications of data privacy and protection are so significant that Congress must respond. Data privacy is the right of every individual, and that level of protection shouldn’t depend on the state someone lives in. And the burden of protecting consumer privacy must fall on the companies that violate it, not on the individuals using their products.

The American Data Privacy and Protection Act (ADPPA), which passed the House Energy and Commerce Committee on an overwhelmingly bipartisan vote last year, would set the national standard that both consumers and companies are asking for. It would sharply restrict companies' collection of sensitive data, like geolocation and health information. It would also strengthen the FTC's ability to bring enforcement actions against violators. And time is of the essence: sensitive information can be gleaned from Google searches about health care, as well as from apps that track reproductive health. With states across the country banning or severely restricting abortions, the ADPPA would be a crucial tool in protecting access to abortion care.

There have been calls from across the political spectrum for the ADPPA to be reintroduced in the current Congress. We need our leaders to heed that call, create a stronger national standard and safeguard our digital civil rights.
