Unpacking the Privacy Implications of Extended Reality
Daniel Berrick, Jameson Spivack / Apr 4, 2023

Daniel Berrick, JD, is Policy Counsel and Jameson Spivack is Senior Policy Analyst, Immersive Technologies, at the Future of Privacy Forum.
It wasn’t long ago that the “metaverse” was the buzzword of the year. Although the hype cycle has moved on to generative AI and the uses of ChatGPT, major companies, universities – even fashion brands – continue to invest in immersive projects and platforms. But what does that mean for the average consumer?
What people call the “metaverse” today is actually a collection of technologies, including but not limited to extended reality (XR)—an umbrella term for virtual reality (VR), augmented reality (AR), and mixed reality (MR) tools. XR provides new ways for people of all ages to engage with content, not only for gaming, but also for education, health, productivity, and socializing. These applications have real potential to change the way individuals go about their daily lives. But before people make significant investments in personal XR devices, it is important for them to understand what data these devices and applications collect, how they use this data, and what it all means for privacy. Beyond this clarity and transparency, there is a strong case for regulatory safeguards that ensure privacy protections for everyone in the US.
The Future of Privacy Forum’s recently published infographic identifies what data is collected and how, where it is used, and the risks it may raise. XR relies on—and even requires—large volumes and varieties of data, typically sourced from the user, their devices, and their surroundings. This data includes the movement and pupil dilation of a user’s eyes, the movement of their body, their position in a room, and their voice. Particularly when combined with other data, these data points can be used to make inferences about people’s interests, behaviors, and physical and mental characteristics, which, without safeguards, may facilitate harmful or discriminatory practices.
People who have spent time in VR may have had the opportunity to attend a concert or sporting event, or to watch a stand-up comedian. These environments illustrate well how data gets processed and used in XR. Outward-facing cameras track the performers’ and audience members’ movements in the physical world to place them within the virtual scene, while inward-facing cameras can capture their facial expressions to render more realistic virtual avatars, mirroring the give-and-take of social interactions in the physical world. Hand tracking or controllers allow for clapping or holding up a sign for your favorite team. Meanwhile, microphones capture the audio of expressions and emotions like shock or laughter. Altogether, this data is used to give the user a sense of embodiment, as if they were really at the venue alongside friends, family, and strangers.
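To make the scale of this collection concrete, here is a minimal sketch of the kinds of per-frame signals an XR runtime might gather during such an event. The field names, units, and sampling rate are illustrative assumptions, not any particular vendor’s actual API.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of per-frame sensor data an XR runtime
# might collect to render an embodied avatar. Field names and units
# are assumptions for illustration, not an actual device API.

@dataclass
class XRFrameSample:
    timestamp_ms: int
    head_position: tuple[float, float, float]      # position in room (meters)
    head_rotation: tuple[float, float, float]      # orientation (radians)
    gaze_direction: tuple[float, float]            # inward-facing eye tracking
    pupil_diameter_mm: float                       # pupil dilation
    hand_joints: list[tuple[float, float, float]]  # hand tracking / controllers
    facial_blendshapes: dict[str, float]           # expression weights for avatars
    mic_level_db: float                            # microphone audio level

@dataclass
class XRSession:
    user_id: str
    samples: list[XRFrameSample] = field(default_factory=list)

# At typical tracking rates (on the order of 60-90 samples per second),
# even a short session yields tens of thousands of body, eye, and voice
# measurements.
if __name__ == "__main__":
    print(f"Samples in a 10-minute session at 72 Hz: {10 * 60 * 72}")
```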
But this data collection also creates privacy and data protection risks. Data about a user’s body, movement, and surroundings could be used to infer information that people may not want to share, such as their gender or disability status. Tracking a user’s body could also allow individuals to be personally identified, undermining any expectation of anonymity in XR environments. Even bystanders—those who are not using an XR device but are in close proximity to someone who is—may have their data collected, often without their awareness.
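As a loose illustration of why body tracking undermines anonymity, stable physical measurements recoverable from tracking data, such as standing height or interpupillary distance, can act like a fingerprint across sessions. The profiles, values, and matching rule in the sketch below are hypothetical.

```python
import math

# Hypothetical illustration: stable body measurements recoverable from
# XR tracking data can re-identify a user across sessions, even with no
# account or login. All values and the matching rule are assumptions.

# (standing height in meters, interpupillary distance in meters)
profiles = {
    "user_a": (1.62, 0.061),
    "user_b": (1.85, 0.066),
}

def identify(height_m: float, ipd_m: float) -> str:
    """Nearest-neighbor match of a new session against stored profiles."""
    def distance(profile: tuple[float, float]) -> float:
        # Scale IPD so both measurements contribute comparably.
        return math.hypot(profile[0] - height_m, (profile[1] - ipd_m) * 20)
    return min(profiles, key=lambda name: distance(profiles[name]))

# A "fresh" session still matches user_b from physical traits alone.
print(identify(1.84, 0.0655))  # -> user_b
```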
Despite these risks, XR technologies can benefit individuals and society. Their integration into personal and professional interactions may improve the quality and accessibility of health care, education, and entertainment. However, organizations should implement appropriate safeguards to protect users and bystanders. Uncertainty about how well current legal protections address the privacy risks of certain XR processing activities makes these safeguards all the more important.
Currently, it is unclear how privacy laws apply to certain XR data, and the way US law regulates biometric data varies by state. Given this legal uncertainty, appropriate safeguards at the technical, policy, and legal levels are necessary to protect against potential harms and to guarantee users safe, enjoyable experiences in XR. While no one-size-fits-all approach can eliminate the hazards of extensive data processing and use, a combination of risk-management strategies may prevent or mitigate some of the negative effects.
Such measures include on-device processing and storage, which keep data in the user’s hands and inaccessible to others, as well as limiting data collection, processing, and use to particular, specified purposes. Concerns about data sharing or secondary use could be addressed through privacy-enhancing technologies (PETs), such as end-to-end encryption, differential privacy, and synthetic data sets, and by requiring third-party developers to comply with established data protection practices. To protect bystanders, platforms and developers can implement features such as automatically blurring bystanders’ faces.
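To illustrate one of the PETs mentioned above, the sketch below adds calibrated Laplace noise to a sensitive on-device measurement (here, pupil diameter) before it is shared, in the spirit of local differential privacy. The epsilon and sensitivity values are illustrative assumptions, not recommendations.

```python
import math
import random

# Sketch of local differential privacy: the device perturbs a sensitive
# measurement with Laplace noise before sharing it, so no exact value
# ever leaves the headset. Parameter values are illustrative only.

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def privatize(value: float, sensitivity: float, epsilon: float) -> float:
    """Smaller epsilon means stronger privacy and noisier reported data."""
    return value + laplace_noise(sensitivity / epsilon)

if __name__ == "__main__":
    raw_pupil_mm = 4.2  # measured on-device by inward-facing cameras
    reported = privatize(raw_pupil_mm, sensitivity=1.0, epsilon=0.5)
    print(f"raw={raw_pupil_mm:.2f} mm, reported={reported:.2f} mm")
```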
XR technologies can benefit individuals and society when organizations design them appropriately. However, the significant data collection and processing they require in order to function may raise privacy and data protection concerns related to sensitive inferences, loss of anonymity, and bystander privacy. As XR devices come into wider use, it is important for people to understand these devices’ privacy implications, for companies to develop and implement them in ways that minimize risks, and for the regulatory environment to keep up by pursuing comprehensive privacy safeguards.