Congress's New Privacy Bill Is Built on Empty Promises
Eric Null / Apr 23, 2026
House Energy and Commerce Committee Chair Brett Guthrie (R-Ky.) and Ranking Member Frank Pallone (D-N.J.) preside over a markup on Capitol Hill on March 5, 2026. (Francis Chung/POLITICO via AP Images)
More than a year after first seeking comment on a federal privacy framework, House Energy & Commerce Republicans have introduced their draft legislation, the SECURE Data Act, reflecting Republicans' agreement on their preferred approach to privacy. Unfortunately, this draft is a major step back in the privacy debate given the progress made in prior federal privacy bills. Without significant improvements, the Act would fail to protect people's privacy while giving companies a free pass to continue engaging in the same data practices consumers have grown to hate.
Generally, the bill is modeled after the weak privacy laws some states have passed in recent years, like Kentucky and Virginia. Those laws are industry-friendly: they largely let companies set their own rules and continue to place inordinate privacy burdens on individuals. Most troublingly, they rely on user consent, perpetuating the fiction that users are capable of reading endless privacy policies, consenting to privacy practices, and protecting their privacy interests all on their own, rather than establishing baseline protections that users can rely on. The Act adopts these fallacies and scaffolds on top of this very flimsy foundation.
Like some of the weakest state privacy laws, the SECURE Data Act’s data minimization provision is completely ineffectual. The bill requires data to be minimized to whatever purposes the company discloses to the consumer, generally in a privacy policy. To be clear: that’s already the state of the law today. Under the FTC Act and state equivalents, companies already have to abide by whatever commitments they make in their (usually long, vague, and hard to read) terms of service. Consumers repeatedly complain about this status quo, expressing a strong desire for better privacy protections. This draft solves none of their concerns. It doesn’t even ban dark patterns that manipulate how consumers can express their preferences. If this bill passes, the same privacy harms that existed before will continue to exist, but this time those harms will effectively be blessed, instead of addressed, by Congress.
Unsurprisingly, and unfortunately, this draft also lacks meaningful civil rights protections, which were central to prior iterations of bipartisan federal privacy bills (namely the American Data Privacy and Protection Act of 2022 and the American Privacy Rights Act of 2024). We know that data is used in discriminatory ways, and that current law is insufficient to address this issue. The Act simply reiterates that illegal discrimination is still illegal, a woefully inadequate punt, especially given the Trump administration’s ongoing efforts to undermine civil rights laws by removing disparate impact as a way of proving discrimination. Companies should not use data and artificial intelligence systems to deny people a loan, employment, or other opportunities because of their skin color, their gender, where they’re from, or any other protected characteristic.
The bill also lacks a private right of action. This omission is unsurprising, as industry's biggest concern for any privacy law is its teeth. A privacy law is only as effective as its enforcement, making these provisions perhaps the most essential component of any bill. Just look at Texas: its law is relatively weak on substance, but a motivated attorney general's office and a large budget have made the state one of the preeminent privacy enforcers in this country. California built its own privacy agency, showing a clear priority for protecting privacy with savvy, technical enforcement expertise, and it too has enforced its privacy law repeatedly. In previous years, the bipartisan privacy bills considered in the House included private rights of action, tailored specifically to address industry concerns about an overactive plaintiffs' bar and consequences for small business. Yet the SECURE Data Act rejects that bipartisan compromise, instead omitting any private right of action at all. At a time when the Federal Trade Commission has been politicized and undermined, the consequences for meaningful enforcement are dire.
Adding insult to injury, the bill also takes aim at states seeking to establish stronger privacy safeguards for their residents. The bill includes broad preemption, providing that no state may pass or enforce any law or standard that "relates to the provisions of" the Act. Such sweeping preemption should be reserved for only the strongest, most future-proof federal laws, those unlikely to become obsolete over time. This draft is already obsolete. Congress should not prevent states like Maryland, Virginia, Oregon, and others from enacting stronger protections than exist in this bill. Even Kentucky's privacy law, on which the bill was based, included impact assessments, which are absent from this draft. Worse, such broad preemption could wipe out every state civil rights law, Texas' biometric privacy law, Illinois' Biometric Information Privacy Act, and potentially California's AI training dataset transparency law.
As if this weren’t enough, the bill includes a long list of exemptions, some of which would neutralize its protections entirely. If a company processes data for a product or service specifically requested by a consumer, or processes data pursuant to a contract, those activities are exempt. Especially in the AI era, companies are likely to claim that all their data practices are in response to consumer requests to provide the product or service, or that the processing is pursuant to a contract (which may include a privacy policy). Further, data processed for conducting “internal research to develop, improve, or repair a product, service, or technology” is exempt, which means any data processed for AI training (which might as well mean most data) is also exempt from the bill. With these exemptions, it’s as if the rest of the bill simply vanishes.
The SECURE Data Act marks a significant step backward in the privacy debate. We can hope to see improvements to the bill in the future. But it’s time for Congress to get serious about adopting meaningful protections that will serve Americans for the long term.