What is Secure? An Analysis of Popular Messaging Apps

Justin Hendrix, Caroline Sinders, Cooper Quintin, Leila Wylie Wagner, Tim Bernard, Ami Mehta / Jun 20, 2023

In a world where privacy and security are increasingly under threat, particularly in countries swept up in a global wave of autocratization and erosion of rights, encrypted messaging apps are an increasingly popular—and necessary—way to share information, organize and engage with one another, and do business. But while the promise of secure messaging is private communication and user control over the spread of personal or group information, the reality is often more complicated, particularly in the age of surveillance capitalism. An overlapping, interconnected set of engineering, design, and system factors, coupled with varied user behaviors and shifting policy environments, has created conditions in which individuals may subvert their own interests or those of their communities while using encrypted messaging apps.

From September 2022 through May 2023, we analyzed popular messaging apps (including Signal, WhatsApp, Telegram, Messages by Google, Apple Messages, and Meta’s Messenger) across a range of dimensions, including technical security, user experience, how the apps engage with users and developers, and their policies, terms and conditions. We sought to understand how people form mental models of their own individual or group digital security and corresponding threats, ways in which the technical and design decisions that the developers of encrypted messaging apps make can leave users vulnerable, and potential solutions that encompass technical, design, and policy adjustments.

To answer these questions, we adopted principles from frameworks such as Privacy by Design and Design from the Margins. We completed a technical review of selected apps; a detailed user experience and user interaction design analysis; and a comprehensive policy review. We interviewed a range of experts, and conducted field work with at-risk users including abortion rights activists in New Orleans, Louisiana and journalists in Delhi, India.

The full 86-page report PDF is available for download here.

Key findings and recommendations include:

1. Users are too often flying blind. Even those most concerned about privacy rarely have sufficient information to make decisions that are in their own best interest. There is a substantial gap between the promise of encryption and the reality of threats to secure messaging in practice. We encountered various forms of “security folklore” that inform user decisions in place of information grounded in fact, as well as “security nihilism,” a debilitating sense among some that there is no way to communicate securely.

2. An app’s cryptographic security doesn’t mean it is secure. Implementation is everything. The failure to implement end-to-end encryption by default, such as on Telegram and Meta’s Messenger, illustrates this point. Users may not understand the distinction when presented with confusing options like “secret chat” and “private chat.” And few users understand design distinctions, such as different colors for messages in Apple’s iMessage and Google Messages, that are intended to communicate different types of messages (SMS or encrypted), and thus different levels of security.

3. Follow Signal’s lead and encrypt or don’t store metadata. Signal is the only app that has taken steps to hide users’ profiles, contacts, group metadata, and even message sender information. Other developers need to follow Signal’s example and hide user metadata by keeping it encrypted with the user’s account key and only handling unencrypted versions in secure enclaves (see the sketch after this list).

4. Let users decide which features should be on or off. Companies need to allow any feature that impacts privacy or security to be turned on and off, and to explore and implement more granular settings that allow users, especially high-risk users, to tailor the service to their needs, including when it comes to disappearing messages, link previews, storing and deleting call logs, and interaction history.

5. Close technical and design ‘loopholes’ that betray privacy. From unencrypted backups of messages and the use of phone numbers as identifiers to flaws in how deleted messages are handled, confusing naming conventions for certain features, and poor interface design for some options, there is a range of technical and design issues that the makers of messaging apps need to address urgently.

6. Beware the bloat. Especially when it comes to apps that are connected to or are trying to emulate some aspects of social media platforms, including Meta’s Messenger, Telegram and increasingly WhatsApp, there is evidence of feature bloat and connections to other apps and services that may create new privacy concerns. The incentives of surveillance capitalism are privacy and safety’s worst enemy, particularly when developers deploy deceptive design patterns.

7. Encryption must be defended. Governments around the world, including in democracies, are threatening encryption with a range of new regulations and laws that will effectively break the model of apps like Signal and WhatsApp. It is crucial that policymakers, industry voices, and activists who understand the value of encryption speak up in its defense.
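
To make the metadata recommendation in finding 3 concrete, below is a minimal sketch, in Python, of the general approach: metadata is encrypted on the user’s device with an account key the service never sees, so the provider stores only ciphertext. The package and names used here (the Python cryptography package’s Fernet construction, a HypotheticalClient class) are illustrative assumptions, not how Signal or any other app actually implements this.

```python
# A minimal, hypothetical sketch of client-side metadata encryption.
# This is NOT Signal's actual implementation (which relies on techniques
# such as sealed sender and private contact discovery); it only illustrates
# the idea that the service stores metadata it cannot read.

import json
from cryptography.fernet import Fernet


class HypotheticalClient:
    """A messaging client that encrypts its own metadata before upload."""

    def __init__(self) -> None:
        # The account key is generated and kept on the user's device.
        self.account_key = Fernet.generate_key()
        self._cipher = Fernet(self.account_key)

    def encrypt_metadata(self, contacts: list[str], groups: list[str]) -> bytes:
        """Serialize contact and group metadata, then encrypt it locally."""
        blob = json.dumps({"contacts": contacts, "groups": groups}).encode()
        return self._cipher.encrypt(blob)

    def decrypt_metadata(self, ciphertext: bytes) -> dict:
        """Only a device holding the account key can recover the metadata."""
        return json.loads(self._cipher.decrypt(ciphertext))


# The server in this sketch only ever stores an opaque blob.
client = HypotheticalClient()
stored_on_server = client.encrypt_metadata(
    contacts=["+15550100", "+15550199"], groups=["book club"]
)
print(client.decrypt_metadata(stored_on_server))
```

The design point is simply that decryption requires a key held only by the user; a real deployment would also have to address key backup, multi-device sync, and any server-side processing (for example, in secure enclaves), which this sketch omits.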

This research, conducted by Convocation Research & Design and Tech Policy Press, was supported with funding from a program at Omidyar Network focused on private and trustworthy messaging.

The full 86-page report is available here.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
Caroline Sinders
Caroline Sinders is a critical designer and artist. For the past few years, she has been examining the intersections of artificial intelligence, abuse, and politics in digital conversational spaces. She has worked with the United Nations, Amnesty International, IBM Watson, the Wikimedia Foundation a...
Cooper Quintin
Cooper Quintin is a security researcher and senior public interest technologist with the EFF Threat Lab, and board member of Open Archive. He has worked on projects including Privacy Badger, Canary Watch, and analysis of state sponsored malware campaigns such as Dark Caracal. Cooper has given talks ...
Leila Wylie Wagner
Leila Wylie Wagner is a project manager and disaster response professional based in New Orleans, Louisiana. Her work for the past two years has focused on designing and managing programs to combat COVID-19 misinformation and disinformation and to decrease vaccine hesitancy in Southeast Louisiana. He...
Tim Bernard
Tim Bernard is a tech policy analyst and writer, specializing in trust & safety and content moderation. He completed an MBA at Cornell Tech and previously led the content moderation team at Seeking Alpha, as well as working in various capacities in the education sector. His prior academic work inclu...
Ami Mehta
Ami Mehta (she/her) is a creative technologist, researcher, and artist based in Brooklyn. As a Postdoc Fellow at NYU’s Interactive Telecommunications Program, Ami is exploring ethnographic approaches to XR design practices. Her research interests include issues related to safety, privacy, self-ident...
