Health and fitness devices and apps are proliferating faster than laws and regulations to protect people from privacy abuses related to them, write Christine Bannan and Andi Wilson Thompson.
The COVID-19 pandemic has dramatically changed the lives of hundreds of millions of people around the world. Among the changes to daily routines, our relationship with fitness and health has been disrupted. Some have become passionate about virtual yoga and interactive spin bikes. At the same time, many have refrained from visiting the doctor unless terribly ill, while many more have watched their health more carefully than ever for signs that they may have been infected with the coronavirus.
As part of this transition, internet of things (IoT) devices and health apps have become even more deeply intertwined with the way many individuals monitor their own health. However, users don’t necessarily consider the privacy risks these largely unregulated technologies pose. Fitness trackers and other connected devices can play a valuable role in providing data for medical care, but doctors, patients, and policymakers must take steps to protect the highly personal data these devices create.
The role that connected technology can play in individual health tracking is nothing new. Two of the most popular brands, Garmin and Fitbit, introduced their first wearable connected trackers in 2003 and 2009 respectively. Since then the market has grown dramatically—a 2019 Pew survey found that 21% of American adults regularly wear a smartwatch or fitness tracker.
As wearable technology has advanced, data collection has expanded beyond basic fitness metrics to data that more closely resembles a combination of medical records and high-tech surveillance. Fitness trackers can now collect much more than steps and speed, with more advanced models measuring heart rate, respiration, temperature, pulse oxygen, and GPS location. Some even integrate with smartphones to measure body composition, and use microphones to interpret the emotions conveyed by the wearer’s tone of voice. Watches and wristbands are probably what come to mind when most people hear “wearable,” but the market also includes many other form factors, including rings that record body temperature and headbands that track sleep patterns and meditation.
These aren’t just fun toys or a new piece of fitness equipment. The data IoT devices collect is increasingly used for healthcare purposes, both for individual treatment and for medical research. Especially now, when users are both more isolated and more conscious of their health, the data devices collect can be a useful tool for medical providers who have been mostly limited to virtual care. Some products, like Apple’s HealthKit and Google Fit, allow developers to integrate medical records, lab results, and medications to help patients and their doctors analyze personal health data.
Doctors at Royal Prince Alfred Hospital in New South Wales, Australia, have even created an innovative “virtual hospital” (RPA Virtual) that allows them to manage COVID patients either at home or at special accommodation hotels. Doctors send patients home with the same brands of IoT pulse oximeter and wearable temperature sensor that any consumer can buy online, which allow patients to track their symptoms and relay that information back to health care workers at RPA Virtual. During the COVID pandemic, researchers have taken advantage of the wealth of data collected by devices like the Kinsa smart thermometer and the Oura ring, which can help warn individuals who may be developing subtle symptoms as well as provide aggregate research data that can help track virus patterns among users. There was a run on home pulse oximeters earlier in the pandemic, when reports indicated that low blood oxygen levels might be an early warning sign that someone should seek care. Last year, Apple released its first watch that monitors blood oxygen levels, both on demand and automatically throughout the day and night; the feature is part of the Apple Watch Series 6, along with the FDA-cleared ECG app (which measures heart rate and flags irregular rhythms) that debuted in an earlier model.
Companies originally marketed these tools to consumers as something people could use to monitor their own health. Then, tech giants seized on the opportunity to enter the healthcare sector, creating corporate partnerships with health insurance companies and healthcare providers that further blur the line between medical and consumer data. Although data can be helpful to medical providers, there are also real privacy concerns posed by the proliferation of wearable health tech, especially the overlap between tracking personal fitness progress and collecting data for medical providers or scientific studies.
Company wellness programs are also taking advantage of the data wearables collect. Apple has a partnership with insurance company John Hancock that incentivizes customers to share their health data by offering cheaper rates. Fitbit’s corporate wellness programs already allow employers to monitor employees’ real-time health data, for example to reward weight loss or to monitor post-surgery recovery and determine when an employee will return to work. Companies that manufacture wearables are also increasingly developing data-sharing partnerships with third parties, which could create further privacy concerns. Amazon advertises that its new Halo fitness tracker integrates with programs from Weight Watchers, and the company has teamed up with corporate partners like Headspace and Orangetheory Fitness to produce exclusive content for Halo users.
There is a widespread misconception that the federal law that protects medical records—the Health Insurance Portability and Accountability Act (HIPAA)—protects all health information. However, the law only applies to health insurers, healthcare providers, and related entities. Although the information your fitness tracker collects to monitor COVID symptoms may be the same type of data your doctor would collect, what matters for determining whether the law applies is who collects the data, not the type of data collected. And because the United States lacks a comprehensive federal privacy law, there are few legal restrictions on what a company can do with the data of U.S. users. For example, the company behind the same wearable temperature sensor used at RPA Virtual in Australia also markets a clinician-facing mobile app for use by U.S. medical providers. The clinician app is HIPAA-compliant, and the temperature sensor has been cleared through the FDA’s review process, but only when used by doctors. Consumers can buy the exact same sensor for home use, but that data isn’t subject to the same privacy protections, nor is the data collected by many other consumer-directed health wearables.
Unfortunately, the United States still lacks a consumer privacy law that would protect personal data regardless of whether it is collected by a healthcare provider or an app. At present, consumers are left to make judgments about privacy on their own. The FTC recommends comparing privacy options and taking control of your sensitive information when choosing a fitness tracker. However, without any legal requirements for companies to follow, there are limitations on both the privacy-protective options available to consumers and the access consumers have to privacy information.
Here are some of the questions consumers should ask themselves when weighing the privacy risks of the latest fitness wearable or a new piece of connected exercise equipment:
1) What data does the product collect? Is it collecting personal contact information, information that users input into an app or platform, sensor data from a camera, microphone, or biometric sensor, or several of these types of data?
2) Who does the product share your data with? For example, is the data used for research? Is it shared with insurers or employers through wellness programs, or with advertisers?
3) Does the product use basic security features, like strong passwords or encryption, to keep the information it collects safe?

The Digital Standard addresses some of these concerns, such as data collection and sharing, but it is also important to consider privacy concerns specific to each user. For example, what arrangement has their employer set up for accessing and protecting any data that employees share?
Ultimately, health data is sensitive—enough so that strict laws govern what health insurers, healthcare providers, and related entities can do with it. But much of that sensitive health data is no longer being collected by those regulated entities; it’s being collected by the wristband someone got for Christmas. The companies that make these products have every incentive to share data with other companies for targeted advertising, partner with organizations you might not want to share your detailed health information with, or use it for market research. Legislation hasn’t caught up with this reality. When Congress passed HIPAA back in 1996, no one could have envisioned how drastically the landscape of health data would shift, or the proliferation of consumer-grade tools that collect data once limited to doctors’ offices.
Until privacy laws address these changes, users unfortunately need to look out for their own privacy. Research any product you are interested in using, ask a few questions about what data it collects and how that data is used, and look for expert reviews, evaluations, or ratings that could help you make informed choices. After all, the cool new toy can definitely help you get more active and monitor your health—but it can also use a microphone to tell you whether you sound stressed. That’s information you might not want to share with your employer or insurance company.