Perspective

Disinformation on Private Messaging Platforms Requires New Regulatory Approach

Katharina Zuegel, Mariana Olaizola Rosenblat / Mar 10, 2026

For more than a decade, policymakers have focused significant attention on the harms of open social media platforms. Meanwhile, some of the most effective disinformation campaigns today are unfolding elsewhere: inside private messaging platforms.

Evidence from recent elections and crises makes this clear. During Brazil’s 2024 municipal elections, manipulated political content circulated widely through WhatsApp groups. In Ukraine, as its authorities have explained, Telegram has functioned simultaneously as a lifeline for emergency communication and as a conduit for Russian disinformation during wartime. Similar dynamics have been documented in Lebanon and elsewhere. Yet despite these realities, private messaging platforms remain only partially addressed by existing regulatory frameworks.

A new report by the Forum on Information and Democracy examines why this gap persists and proposes a new approach to governing information integrity on private messaging platforms. The report is the culmination of a year-long process co-chaired by the governments of Luxembourg and Ukraine and supported by the NYU Stern Center for Business and Human Rights, bringing together public authorities, civil society organizations, and researchers.

The regulatory challenge of private messaging platforms

Many governance frameworks rely on a distinction between “public” and “private” communication that no longer reflects how most messaging platforms operate in practice. While these services were originally designed for one-to-one or small-group exchanges, they now include broadcast channels, large group chats numbering in the thousands, business messaging systems, advertising and, increasingly, AI-powered features. These functionalities allow information to circulate widely while retaining the perceived trust of private communication.

Most regulations addressing disinformation, and even non-binding frameworks, fail to account for the hybrid nature of messaging spaces and generally exclude them from their scope. Platform regulations tend either to focus on strictly illegal content, such as terrorist content and child sexual abuse material, or to address disinformation only on platforms deemed “public.” In practice, this means that private communications are often explicitly or implicitly excluded from regulations governing disinformation.

According to our survey of twelve jurisdictions, the UK Online Safety Act (OSA), which regulates all “user-to-user services” and imposes duties to address “foreign interference” and “false information” that could cause physical or psychological harm, is one of the few exceptions. Yet even the OSA leaves significant ambiguity as to how user-to-user services that include encrypted messaging functionalities ought to comply with these obligations without compromising encryption. A central challenge emerges: how can binding and non-binding frameworks address disinformation on platforms that include messaging services without undermining encryption?

Our report lays out several key recommendations.

First, lawmakers should design rules that attach obligations to specific platform features rather than categorizing entire platforms as one type of service.

Our central recommendation is to reframe how regulation approaches these hybrid platforms. Rather than classifying entire services as public or private, we propose attaching regulatory obligations to specific features based on their reach, discoverability, access controls, and capacity for amplification. A one-to-one encrypted chat does not pose the same systemic risks as a searchable broadcast channel or a mass-forwarding feature. Treating them differently is not an erosion of privacy; it is a prerequisite for proportionate governance. The European Commission’s recent designation of WhatsApp Channels as a very large online platform (VLOP) under the Digital Services Act is a first step in this direction.

Platforms, in turn, can reduce systemic risks by clearly separating private messaging from broadcasting and AI-driven functions, and by giving users more transparency and control over their choices among these features.

Second, lawmakers should safeguard encryption in private communications.

This feature-based approach also helps clarify debates around encryption. End-to-end encryption is essential for privacy, freedom of expression, and security, particularly for journalists, activists, and users in conflict or repressive environments. Protecting encryption does not require abandoning oversight altogether. Instead, obligations related to transparency, risk mitigation, or content governance should be limited to non-encrypted or public-facing functionalities, where platforms already exercise control and moderation. Private communications, on the other hand, should explicitly be excluded from any regulatory obligations that could render encryption legally or technically untenable.

Third, governments should undertake initiatives that strengthen societal resilience and encourage platforms to build in features that empower users.

Governments should support and implement measures that empower users and build societal resilience. This includes sustained media literacy campaigns such as Ukraine’s “Filter” project, which integrates formal education with fact-checking partnerships, and Ireland’s Media Literacy Ireland Network, introduced by Coimisiún na Meán, which provides a platform for multistakeholder coordination linking broadcasters, NGOs, and regulators.

Platforms, for their part, should also continue to develop and integrate in-app tools that serve to enhance user awareness and agency, including access to “tiplines” and other fact-checking affordances.

Conclusion

Looking ahead, promoting information integrity on private messaging platforms is a shared responsibility. Governments can provide greater regulatory clarity by defining what constitutes public, semi-public, and private communication online, and by aligning obligations with specific platform features. Depending on the context, this may mean clarifying how existing regulations apply to different messaging features, or adopting new regulations that account for the evolving nature of modern communication technologies. Given the global nature of these platforms, greater international cooperation is also needed.

Platforms, in turn, are called upon to be clearer about their different features, to enable meaningful user choices, to strengthen anti-abuse safeguards, and to enable independent research.

Private messaging platforms have become central to democratic discourse. Governing them effectively does not require importing social media regulation wholesale, nor does it require weakening encryption. It requires regulatory frameworks that reflect how these platforms are designed and actually used today.

Authors

Katharina Zuegel
Katharina Zuegel is a Policy Director at the Forum on Information and Democracy, where she develops policy recommendations on information integrity, media freedom, and artificial intelligence. She holds a master’s degree in Cultural Studies from SOAS.
Mariana Olaizola Rosenblat
Mariana Olaizola Rosenblat is a policy advisor on technology and law at the NYU Stern Center for Business and Human Rights. Previously, she served as a Lecturer-in-Law at the University of Chicago Law School. Mariana received her JD from Yale Law School and her BA in Political Theory from Princeton.
