Perspective

How Europe’s “Chat Control” Regulation Could Compromise American Communications

Darío Maestro / Feb 3, 2026

Darío Maestro is the Legal Director at the Surveillance Technology Oversight Project.

For over a decade, Europe has lectured America on surveillance. The Court of Justice of the European Union (CJEU) struck down Safe Harbor in 2015 and Privacy Shield in 2020. These were the frameworks that allowed American companies to receive and process Europeans’ personal data in the US. Each time, the Court concluded that US intelligence programs failed to provide protections equivalent to those afforded by EU law. The current EU-US Data Privacy Framework, negotiated in 2022 and implemented through Executive Order 14086, represents the third attempt to stabilize this relationship. But it may not survive a threat from an unexpected direction: Europe itself.

The source of this threat is the European Union’s proposed Regulation to Prevent and Combat Child Sexual Abuse—known colloquially as “Chat Control.” If adopted, the regulation would pressure online communication services to use client-side scanning technologies that analyze the content of users’ messages before they are encrypted. Its implications extend far beyond Europe. This measure would create a legal and technical architecture that systematically compromises the communications of American citizens and residents whose data transits through or is stored in European jurisdictions. The result would be a remarkable inversion of the famous Schrems litigations: a credible argument that European surveillance frameworks now fail to provide adequate protection for American personal data.

The Schrems precedents

Understanding why Chat Control threatens the Data Privacy Framework requires understanding what Max Schrems accomplished by suing over government surveillance. In Schrems I, the CJEU invalidated the Safe Harbor framework on the ground that US surveillance programs permitted government access to personal data “beyond what was strictly necessary” and without effective judicial remedies for EU data subjects. Five years later, in Schrems II, the Court struck down Privacy Shield for substantially similar reasons, finding that Section 702 of the Foreign Intelligence Surveillance Act and Executive Order 12333 authorized bulk collection lacking adequate limitations and that the Privacy Shield’s Ombudsperson mechanism failed to constitute an effective remedy.

The logic underlying both decisions is straightforward. When personal data flows from the EU to a third country, that country must provide protections “essentially equivalent” to those guaranteed under EU law. If the receiving country’s legal framework permits governmental access to personal data on terms that would violate the EU Charter of Fundamental Rights, the transfer mechanism is invalid.

This adequacy framework extends well beyond the United States. The EU maintains similar arrangements with the United Kingdom, Switzerland, Japan, and other jurisdictions—each premised on the assumption that EU law itself provides the baseline against which third-country protections are measured. European data protection operates extraterritorially through this mechanism. Data may leave the Union only if it remains subject to equivalent safeguards abroad.

This principle is bidirectional in theory. The Data Privacy Framework rests on European recognition that US law now provides adequate safeguards. But it equally assumes that the EU itself maintains the fundamental rights protections against which third countries are measured. So what happens when that assumption falters?

What “Chat Control” would actually do

The Chat Control proposal, now proceeding to trilogue negotiations between the European Commission, European Parliament, and Council, would establish a detection framework requiring platforms to scan communications for child sexual abuse material (CSAM). Crucially, this is a regulation, not a directive. It would take immediate, binding effect across all twenty-seven member states without requiring national implementing legislation. Every platform operating in Europe would face identical obligations from day one. The most technically significant provision involves client-side scanning: technology that analyzes content on user devices before end-to-end encryption engages.

To understand why this matters, consider how the technical architecture would actually work. Platforms that employ end-to-end encryption—including WhatsApp, Facebook Messenger, and other popular services built on the Signal Protocol—would be required to insert a pre-encryption scanning layer. Every time a user composes a message or attaches media, the content would first be analyzed against CSAM databases and AI detection models before the encryption protocol activates. If the system flags the content, it would be transmitted to EU authorities. If not, the content would proceed through standard encrypted transmission.
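
To make that sequence concrete, here is a minimal sketch in Python of where a client-side scanning hook would sit relative to the encryption step. Everything in it is hypothetical: the function names (scan_before_encryption, report_to_authority, e2e_encrypt_and_send) are placeholders rather than any platform’s actual API, and real detection would rely on hash-matching against CSAM databases and machine-learning classifiers whose internals are not public.

from dataclasses import dataclass


@dataclass
class ScanResult:
    flagged: bool
    detail: str = ""


def scan_before_encryption(plaintext: bytes) -> ScanResult:
    # Placeholder for the on-device detection step: in a real deployment this
    # would compare the content against CSAM hash databases and run AI
    # detection models before any encryption occurs.
    return ScanResult(flagged=False)


def report_to_authority(plaintext: bytes, recipient: str, result: ScanResult) -> None:
    # Placeholder: flagged content leaves the device outside the encrypted channel.
    print(f"flagged message to {recipient} reported ({result.detail})")


def e2e_encrypt_and_send(plaintext: bytes, recipient: str) -> None:
    # Placeholder for the normal end-to-end encrypt-then-transmit path.
    print(f"encrypted and sent {len(plaintext)} bytes to {recipient}")


def send_message(plaintext: bytes, recipient: str) -> None:
    # 1. The scan runs on the plaintext BEFORE the encryption protocol engages.
    result = scan_before_encryption(plaintext)
    # 2. Flagged content would be transmitted to authorities.
    if result.flagged:
        report_to_authority(plaintext, recipient, result)
    # 3. Standard end-to-end encrypted transmission follows.
    e2e_encrypt_and_send(plaintext, recipient)


if __name__ == "__main__":
    send_message(b"hello from New York", "friend-in-paris")

The point the sketch illustrates is structural, not cryptographic: the detection logic sees the plaintext before the encryption protocol ever runs, which is why the debate below turns on what end-to-end encryption is supposed to guarantee rather than on whether the ciphertext itself is weakened.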

Proponents of Chat Control emphasize that the encryption itself remains mathematically intact. But critics rightly counter that this framing misses the point entirely. End-to-end encryption derives its security value from the assurance that only the communicating parties can access the content of their messages. No intermediate party, including the service provider itself, possesses that capability. Client-side scanning destroys this assurance by creating precisely the kind of third-party content analysis that encryption is designed to prevent. The cryptography may remain intact; the privacy guarantee would be functionally eliminated.

The proposal’s legislative trajectory has been contentious but persistent. The November 2025 Council position dropped mandatory detection orders—but not permanently. The text removes the operative provisions while adding a review clause that requires the Commission to assess “the necessity and feasibility of including detection obligations” within three years. Moreover, it would make permanent the temporary ePrivacy Directive derogation (set to expire this April) that permits platforms to voluntarily scan communications for CSAM without violating EU privacy law, while creating regulatory pressure for platforms to actually use it.

Specifically, the regulation could compel high-risk providers to “contribute to the development of technologies” to combat abuse, while requiring all platforms to adopt “all reasonable mitigation measures” from a list that explicitly includes “voluntary” scanning. But scanning is not voluntary if refusing to do it counts against you in a regulatory assessment. Finally, the text embraces age verification on messaging services where regulators identify a risk of child solicitation, which would force encrypted platforms to collect user data that law enforcement could compel them to hand over.

Critics have argued that this amounts to mandatory scanning with extra steps. Former MEP Patrick Breyer has characterized the Council position as “cementing ‘voluntary’ mass scanning” and “legitimizing the warrantless, error-prone mass surveillance of millions of Europeans.” The European Parliament’s 2023 position, however, rejected indiscriminate scanning outright; the two bodies will have to reconcile fundamentally different visions in the trilogue negotiations scheduled for February, May, and June 2026. Advocates expect a final deal by mid-2026, after which both Parliament and Council must formally approve the text before it becomes law. Privacy-rights groups warn that even without explicit mass scanning, the framework could normalize surveillance-by-default if risk-mitigation rules are interpreted broadly.

The US compliance dilemma

If Chat Control becomes law, American technology companies will face an impossible choice. Maintaining separate security architectures for different jurisdictions is technically difficult and economically ruinous. Security modifications implemented for European compliance would inevitably affect American users.

The US government has already taken notice. In August 2025, FTC Chairman Andrew Ferguson sent letters to major technology companies warning them of precisely this dilemma. “If a company promises consumers that it encrypts or otherwise keeps secure online communications but adopts weaker security due to the actions of a foreign government,” he wrote, “such conduct may deceive consumers” in violation of Section 5 of the FTC Act. The implication was unmistakable: companies could face liability for unfair or deceptive practices if European compliance degrades the security they have promised American consumers.

This puts American companies in an awkward position. On one side, European law would require (or strongly incentivize) scanning; on the other, American consumer protection law would potentially prohibit the security degradation resulting from that scanning. No company can serve two masters. And the conflicts extend beyond the FTC. The Electronic Communications Privacy Act restricts the interception of communications; the Stored Communications Act limits their disclosure; state consumer protection laws add further layers of obligations related to online messaging. Each of these frameworks could be triggered by European-mandated scanning.

Schrems in reverse

The Schrems litigations established that transatlantic data transfers are invalid if the receiving jurisdiction permits surveillance incompatible with fundamental rights. But that logic cuts both ways.

The Data Privacy Framework permits American companies to receive European personal data on the assumption that those companies will handle it consistently with European fundamental rights standards. But what about American personal data transmitted to or through Europe? If Chat Control mandates systematic pre-encryption analysis of communications—including those initiated by or involving American users—then the EU would be implementing precisely the kind of surveillance architecture that the CJEU found objectionable when it was in American hands.

Consider what this means in practice. An American citizen messaging a friend in Paris on WhatsApp would have their message scanned on their phone before it is ever encrypted for transmission—and if the system flags it, that content would be sent to European law enforcement. The scanning would occur without individualized suspicion, without judicial authorization for the specific surveillance of that user, and without effective remedies for false positives. By any reasonable definition, this is mass surveillance.

An American litigant or regulator could argue that the Data Privacy Framework’s adequacy determination should be revisited—not because American law has changed, but because European law no longer provides adequate protection for American personal data transferred to or processed in the EU. The EU spent a decade telling America its surveillance was unacceptable. Now America could return the favor.

Where would such a challenge proceed?

The Schrems cases proceeded before the CJEU because they challenged Commission adequacy decisions under EU law. An American challenge, by contrast, would require different forums and legal theories.

Several pathways are available. The FTC could take enforcement action against companies that degrade American consumers’ security to comply with European surveillance mandates—precisely the threat that Chairman Ferguson’s letters foreshadowed. The Department of Commerce, which administers the Data Privacy Framework, could conclude that continued certification is inappropriate given the change in European law. Congress could restrict data transfers to jurisdictions that implement mass surveillance. And private litigation under state consumer protection laws could challenge companies’ security representations in court.

To be sure, none of these mechanisms would directly invalidate the European Commission’s adequacy decision—that remains a matter of EU law. But they could effectively render the Data Privacy Framework inoperative from the American side by making European compliance legally hazardous for US companies. The practical result would be the same as Schrems I and II: disruption of transatlantic data flows affecting thousands of companies and millions of users.

The deeper irony

For years, the European Union has positioned itself as the global standard-bearer for data protection and privacy rights. The GDPR’s extraterritorial reach reflects a confident judgment that European standards should govern the processing of European personal data regardless of where that processing occurs. And the Schrems litigations reflected a conviction that American surveillance practices were fundamentally incompatible with those standards.

Chat Control would undermine all of that. The European Data Protection Board itself has warned that preventing the use of encryption or weakening the effectiveness of the protection it provides would have a severe impact on the respect for private life and confidentiality of users. If the EU proceeds despite this warning, it will have embraced precisely the surveillance architecture it spent a decade condemning in American hands.

This hypocrisy would carry punishing legal consequences. The adequacy framework depends on mutual recognition that both parties maintain fundamental rights protections. If Europe abandons those protections while continuing to demand them from others, the entire structure becomes incoherent. And the damage would not stop at the Atlantic. Every adequacy decision the EU maintains rests on the same premise: that European law provides the gold standard against which data protections are measured. If Europe itself implements mass surveillance, what remains of that standard?

What comes next

The trilogue negotiations beginning in early 2026 will determine Chat Control’s final form. The European Parliament’s more protective position may yet prevail, or a compromise may emerge that mitigates the most intrusive elements. But if something resembling the Council’s November 2025 mandate becomes law, the consequences will reverberate far beyond Europe’s borders.

US companies would face conflicting legal obligations that they cannot simultaneously satisfy. US users would find their communications subject to systematic European surveillance. And the Data Privacy Framework—negotiated with such difficulty after two CJEU invalidations—would face a challenge from a direction no one anticipated.

The Schrems litigations taught us that mass surveillance is incompatible with fundamental rights, and therefore with stable transatlantic data flows. Chat Control would teach us that lesson again. Only this time, with the roles reversed.

Authors

Darío Maestro
Darío Maestro is the Legal Director at the Surveillance Technology Oversight Project, where he litigates and advocates on issues at the intersection of technology and civil rights, including government surveillance, consumer privacy, AI safety, biometric identification, location tracking,...
