A Critical Look at Apple's Privacy Record
Laboni Bayen / Oct 27, 2023
Laboni Bayen is a researcher for the Tech Transparency Project.
Apple CEO Tim Cook has said protecting people’s privacy is “one of the most essential battles of our time.” The iPhone giant has built a well-oiled PR campaign around its privacy agenda, setting itself apart from data-hungry tech rivals like Google and Meta. From billboards to punchy ad campaigns, Apple assures its customers that their data is safe—and that privacy is a human right.
But a closer look at Apple’s record shows that the company’s carefully cultivated reputation as a privacy champion doesn’t always stand up to scrutiny.
Take Advanced Data Protection, the new security feature that Apple launched in the US late last year. Dubbed a “Digital Fort Knox,” it applies end-to-end encryption to messages, photos, and documents stored on Apple’s cloud platform, iCloud. That’s considered the gold standard for encryption, shielding data from hackers, law enforcement, and even Apple itself. Some privacy experts welcomed Advanced Data Protection, calling it an important step toward protecting users from mass surveillance and data leaks. But the security feature—and Apple’s rollout of it—have some major deficiencies:
- Advanced Data Protection isn’t automatically turned on, and Apple took no apparent steps to alert users to the new feature through icons (as it does for software updates) or pop-up messages (as it does with app tracking requests). That means many people may be unaware that it even exists.
- For those motivated to turn on Advanced Data Protection, it may not be easy to find and activate. Apple buried it deep in the iCloud settings, and it requires a multi-step process to set up—factors that could dissuade some users.
- Not all Apple customers can use Advanced Data Protection. Older models of iPhones, iPads, Apple Watches, and Mac computers don’t have access to it, and if just one of a user’s devices is an older model, they are unable to switch it on. That could shut out people who can’t afford to upgrade to newer devices.
- Advanced Data Protection does not cover all types of data on an Apple device. The fine print of Apple's policies shows that a user's Contacts, Calendar, and iCloud email are not covered by the encryption feature.
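To make the underlying distinction concrete, the following is a minimal conceptual sketch in Swift using CryptoKit. The types and functions are invented for illustration and are not Apple's actual APIs or architecture; the point is simply that whoever holds a copy of the encryption key can decrypt the data, which is what separates standard cloud storage from end-to-end encryption.

```swift
import CryptoKit
import Foundation

// Illustrative model only; the names are invented, not Apple's real design.
struct CloudRecord {
    let ciphertext: AES.GCM.SealedBox
    let providerHeldKey: SymmetricKey?   // nil when end-to-end encryption is on
}

/// Encrypts data on the device before upload. With end-to-end protection
/// the provider never receives the key, so it cannot decrypt the record
/// for itself, for hackers, or in response to a warrant.
func upload(_ data: Data, endToEndProtected: Bool) throws -> CloudRecord {
    let deviceKey = SymmetricKey(size: .bits256)
    let sealed = try AES.GCM.seal(data, using: deviceKey)
    return CloudRecord(
        ciphertext: sealed,
        providerHeldKey: endToEndProtected ? nil : deviceKey
    )
}

/// Whether the provider could produce readable data for a third party.
func providerCanDecrypt(_ record: CloudRecord) -> Bool {
    record.providerHeldKey != nil
}
```

In this toy model the only variable is who holds the key; the deficiencies listed above are about how rarely the end-to-end option actually gets switched on.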
This is part of a pattern for Apple: the company’s privacy performance has often been inconsistent with its soaring rhetoric on the topic.
Apple cemented its privacy reputation in the minds of many Americans when it refused to help the FBI unlock an iPhone belonging to one of the shooters in the 2015 San Bernardino mass shooting, leading to a highly publicized court battle with the US government. Apple took a lofty stance, painting itself as a champion of civil liberties by refusing the FBI's request to write special software that would bypass the iPhone's security protections and allow investigators to unlock the shooter's device.
The company said the US government wanted it to create the equivalent of a “master key,” which it warned could be exploited by hostile governments and hackers to gain access to anyone’s iPhone. In a public customer letter, Cook said the government’s request was not only “unlawful, but it puts the vast majority of good and law abiding citizens, who rely on iPhone to protect their most personal and important data, at risk.”
This stance represented a major policy shift for Apple, which, according to federal prosecutors, had previously been quite cooperative when it came to unlocking iPhones. In 2015, prosecutors in a drug trafficking case detailed how Apple had an “established track record” of extracting data from passcode-locked iPhones going all the way back to 2008, not long after the first iPhone was introduced. According to these officials, Apple even advised the government on what specific language it should use in court orders. (The government later dropped the effort to force Apple to help, saying it had found another way to unlock the iPhone in question.)
But while Apple used the San Bernardino case to stake out a new, public position as a privacy champion, the company was at the same time building a service that gave law enforcement another major access point for iPhone user data: iCloud.
Since introducing iCloud in 2011, Apple has marketed the cloud storage service as a safe and convenient place for users to store their personal data. “iCloud is built into every Apple device,” reads one typical promotion. “That means all your stuff – photos, files, notes and more – is safe, up to date, and available wherever you are.” By 2018, three years after San Bernardino, analysts estimated there were 850 million iCloud users.
But the emphasis on convenience masked an inconvenient truth for privacy-promoting Apple: Turning on iCloud made almost all user data—including messages, photos, documents, maps data, browsing history, and more—available to law enforcement via a search warrant. That meant police could access the contents of someone’s iPhone without the need for a passcode. In September 2015, a few months before the San Bernardino shooting, Apple reportedly turned on iCloud by default for iPhone users as part of a software update. (That remains the case today, and users must opt out if they don’t want the cloud backup.)
A review of hundreds of pages of legal filings and a 2021 FBI training document reveals that law enforcement has made heavy use of iCloud search warrants over the years. In one drug smuggling case in 2020, an Immigration and Customs Enforcement agent explained that Apple users are “automatically” set up with iCloud accounts and only “sophisticated users” know how to opt out of the service:
Unless disabled by a sophisticated user, iCloud accounts are automatically created and information automatically backed up. Information on someone's iPhone is automatically backed up to their iCloud account.
Similarly, a representative of PenLink, a Nevada-based law enforcement contractor, called iCloud search warrants “phenomenal” for tracking suspects. “If you did something bad, I bet you I could find it on that backup,” the contractor told a National Sheriffs’ Association conference, according to a report last year in Forbes.
Apple’s instant messaging platform, iMessage, provides a good illustration of how Apple’s privacy promises failed to hold up in the era of iCloud. Apple has described, and continues to describe, iMessages as end-to-end encrypted, meaning no one but the user, not even Apple, can read the messages. But in 2021, media reports citing an FBI document highlighted that if users back up their data to iCloud, Apple stores the encryption keys for iMessages in iCloud. That means an iCloud search warrant can yield the keys needed to decrypt iMessages, a major loophole.
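A short, hedged sketch of that loophole, again in Swift with CryptoKit and using invented names rather than Apple's real iMessage or iCloud implementation: once the decryption key is stored alongside the backup, anyone who obtains the backup, including a warrant recipient, can read the "end-to-end encrypted" messages without ever touching the user's device.

```swift
import CryptoKit
import Foundation

// Toy sketch only; the structure and names are invented, not Apple's design.
// A message encrypted end-to-end on the device...
let messageKey = SymmetricKey(size: .bits256)
let sealedMessage = try! AES.GCM.seal(Data("see you at 6".utf8), using: messageKey)

// ...stays unreadable only while the key stays on the device. If the cloud
// backup also carries the key, the backup alone is enough to decrypt it.
struct Backup {
    let sealedMessage: AES.GCM.SealedBox
    let escrowedKey: SymmetricKey   // the loophole: the key rides along with the backup
}
let backup = Backup(sealedMessage: sealedMessage, escrowedKey: messageKey)

// A party holding the backup (for example, via a search warrant) recovers the plaintext.
let plaintext = try! AES.GCM.open(backup.sealedMessage, using: backup.escrowedKey)
print(String(data: plaintext, encoding: .utf8) ?? "")   // "see you at 6"
```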
All this shows that Apple, while still coasting on the privacy praise generated by its stance in the San Bernardino case, put troves of user data within reach of law enforcement, and potentially governments and hackers, via iCloud. This situation continued for years, with Apple at one point dropping plans to fully encrypt iCloud backups after the FBI complained it would harm investigations, Reuters reported in 2020. Privacy experts called iCloud the company’s “most alarming privacy failing.”
In theory, Apple’s new Advanced Data Protection should patch these loopholes and protect iCloud data. But Apple’s lackluster promotion of the feature means that many people will likely miss it, leaving the state of iCloud data security relatively unchanged. Without Advanced Data Protection switched on, user data is still accessible to prosecutors wielding search warrants.
One sign of that: Court filings from a 2023 criminal case reveal that Apple provided Department of Homeland Security (DHS) agents with GPS data, chat conversations, images, and more stored in up to six different iCloud accounts, months after Advanced Data Protection was introduced. The agents, who were investigating an alleged drug trafficking organization, had obtained an iCloud search warrant for the contents of those accounts, and it appears that Apple complied within 24 hours.
The likelihood that many Apple users won’t activate iCloud end-to-end encryption leaves open the question of how Apple will respond to law enforcement data requests related to newly enacted state laws restricting abortion and transgender rights.
A California law enacted last year prohibits companies based in the state from complying with out-of-state search warrants unless the warrant includes an attestation that the investigation is not related to abortion. That may give Apple legal cover to steer clear of abortion-related warrants, though some experts say the law is likely to face a court challenge, and its ultimate fate is unclear.
Another California law bars state health care providers, law enforcement, and courts from aiding in out-of-state investigations of minors receiving gender-affirming care, but it does not apply to tech platforms. Apple and its CEO, Tim Cook, have publicly criticized anti-trans legislation, but the company hasn’t said how it will handle law enforcement data requests related to these laws, which are advancing across the country.
Public interest advocates have warned that digital data obtained through search warrants poses a major threat to people targeted under anti-trans laws. In fact, private messages and search histories have already been used to prosecute women in abortion-related cases. This fall, a Nebraska woman was sentenced to two years in prison for helping her daughter obtain an illegal abortion in a case built on private Facebook messages.
Apple says privacy is a “fundamental human right” and one of its “core values,” but it is clear from the company’s policies and practices that its users’ data isn’t always protected. While Apple’s new Advanced Data Protection has received deserved praise, the company has done a poor job of informing people about the new encryption feature, which could deprive many people of its benefits.