
by Olivia Figueira, Privacy Engineering Intern

Imagine a world, perhaps in the not-so-distant future, in which virtual reality (VR) devices become commonplace for day-to-day activities such as working, socializing, shopping, entertainment, and gaming. Considering the swaths of behavioral and sensory data collected by VR devices, privacy issues that already exist outside of VR, such as location tracking, will only be amplified in the metaverse.

Why Is Location Data Privacy Important?

The impact of location data on privacy has been studied in other technologies. For example, researchers studied over a year of human mobility data for 1.5 million individuals, derived from their mobile phone network usage, and found that, given an anonymized data set of spatial and temporal points, an attacker needs to know only four of those points to uniquely reidentify 95% of the individuals in the data set. Because most people follow regular routines, such as traveling between home and work, their mobility traces are inherently quite unique and can reveal sensitive information, such as whether they have visited a Planned Parenthood clinic or an A.A. meeting. Location data poses great privacy risks, whether anonymized or not, yet it continues to be quite useful—and profitable—in the ecosystem of advertising and tracking.
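
To make the mechanics concrete, here is a toy sketch in Python of the kind of uniqueness test the researchers ran. The data and function names are invented for illustration (this is not the study's code or data): given an "anonymized" table of (place, time) points, it checks how often a handful of known points pins down exactly one person. Real traces are more structured than random points, but the study found that four points sufficed for 95% of people anyway.

```python
import random

# Toy dataset: user_id -> set of (cell_tower_id, hour) points.
# Real mobility traces have far more points and far more structure.
def build_toy_traces(num_users=10_000, points_per_user=200,
                     num_cells=2_000, num_hours=24 * 30):
    return {
        user: {(random.randrange(num_cells), random.randrange(num_hours))
               for _ in range(points_per_user)}
        for user in range(num_users)
    }

def matching_users(traces, known_points):
    """Return every user whose trace contains all the known points."""
    return [u for u, pts in traces.items() if known_points <= pts]

def uniqueness_rate(traces, k=4, trials=200):
    """Fraction of sampled users uniquely pinned down by k random points."""
    unique = 0
    for _ in range(trials):
        user = random.choice(list(traces))
        known = set(random.sample(sorted(traces[user]), k))
        if matching_users(traces, known) == [user]:
            unique += 1
    return unique / trials

if __name__ == "__main__":
    traces = build_toy_traces()
    print(f"Users uniquely identified by 4 points: {uniqueness_rate(traces):.1%}")
```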

A 2018 New York Times investigation found that several smartphone applications that asked for access to users' location sold precise location data, in an anonymized format, to advertisers and other third parties, such as retail outlets and hedge funds. These third parties use the data to serve advertisements tied to consumer behaviors. The location-based advertising market, which involves selling advertisements to users based on their location, is also quite lucrative: in the U.S., it was estimated at $22.8 billion in 2021. Though advertisers say they're interested in the behaviors of the users they're tracking rather than their identities, we already know that location data can be used to uniquely identify people. And in the wrong hands, it could be used for malicious purposes, such as stalking and harassment. To make matters worse, the same investigation found that many mobile applications prompt users to allow location permissions without fully explaining how the data will be used. For example, a prompt will explain that location data is needed to provide weather or traffic updates, but leave out that the data will also be shared with, and sold to, advertisers and trackers.

The same location privacy concerns we have in the physical world will also exist in VR as more users adopt it for more activities in which other users, both virtual friends and strangers, can observe their presence, activity, and behavior. Malicious users may be able to stalk and surveil in virtual spaces as they do in the real world, and trackers and advertisers may be able to collect virtual location data to serve advertisements and influence consumer behaviors. As with all privacy issues, the risks are greater for children and vulnerable users, who may not be able to distinguish between safe and unsafe interactions with other users, or may not recognize what counts as an advertisement in VR spaces.

Surveillance Capitalism Meets Virtual Reality

Advertising in VR is growing continuously and expanding beyond traditional display advertisements. For example, Meta's VR application Horizon Worlds has a space called "Wendyverse," a virtual Wendy's restaurant that allows users to "interact with the company, its food, and even each other." Meta has also begun experimenting with virtual advertisements within VR applications and video games, such as virtual billboards and entire virtual rooms that serve as advertisements. In the retail space, H&M has opened a "metaverse concept store" on the CEEK VR platform, and other brands, including Nike, Adidas, and Zara, are reported to be following suit. The VR platform Decentraland hosted a "Metaverse Fashion Week," with several brands putting on fashion shows, afterparties, showrooms, talks, and stores in which users could buy and virtually wear the clothing seen on avatars in the shows. It is already difficult for children to differentiate advertisements from regular content online, and it will be just as difficult, if not more so, for them to recognize such novel, interactive advertisement formats in VR.

It is unclear what data is collected from user interactions with VR advertisements. If a user's virtual location, inferred from their proximity to an advertisement in a VR space, can be monitored and collected, it constitutes a data type similar to physical location data. Nor is it known whether VR developers currently share virtual location data with third parties. Developers already use positional tracking data, such as where the user is looking and the position and movement of their body, to render the VR space and virtual interactions correctly. Considering the potential for a virtual location data market for advertising and tracking, developers could begin sharing such data with third parties.
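
To illustrate how virtual location could fall out of data developers already collect, here is a hypothetical sketch; every name and field in it is invented, not drawn from any real VR platform. It logs a spatiotemporal record whenever a user's tracked position comes within a set radius of an ad placement, exactly the kind of record a third party could accumulate:

```python
import math
import time
from dataclasses import dataclass

@dataclass
class AdPlacement:
    ad_id: str
    x: float  # position in the virtual scene's coordinate system
    y: float
    z: float
    radius: float  # how close counts as an "impression"

def proximity_events(user_id, position, placements):
    """Turn one positional-tracking sample into ad-proximity records.

    `position` is the (x, y, z) the engine already tracks to render
    the scene; here it doubles as virtual location data.
    """
    events = []
    for ad in placements:
        dist = math.dist(position, (ad.x, ad.y, ad.z))
        if dist <= ad.radius:
            events.append({
                "user_id": user_id,       # stable identifier
                "ad_id": ad.ad_id,
                "distance": round(dist, 2),
                "timestamp": time.time(), # the "temporal" half of the trace
            })
    return events

# Example: a user walking past a virtual billboard.
billboard = AdPlacement("billboard_42", x=10.0, y=0.0, z=5.0, radius=3.0)
print(proximity_events("user_123", (11.0, 0.0, 4.0), [billboard]))
```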

Whether it is shared by application developers or collected directly through advertisements, trackers could cultivate rich data sets of users' virtual spatiotemporal data (where they are in time and VR space), their virtual proximity to and/or interactions with advertisements (intentional or not), and their behaviors when using VR, such as when they play, how long they play, and whom they interact with in different virtual locations. This data can contain unique traces of a user's virtual location, and users can be further singled out when the data is combined with the biometric and sensory data collected by VR devices.

Altogether, advertisers and trackers could gain access to, and exploit, extremely comprehensive data about users' virtual locations and behavior. The result is, in essence, what professor and author Shoshana Zuboff has termed "surveillance capitalism," transposed into VR. According to Zuboff, surveillance capitalism "claims private human experience as free raw material for translation into behavioral predictions that are bought and sold in a new kind of private marketplace," and this is precisely what could occur in the virtual space with such detailed location and behavioral information. Advertisers and trackers aren't the only ones who may exploit location data, however; bad actors can also track location to stalk and harass other users in virtual spaces.

The Interpersonal Dangers of VR

Just as in the physical world, bad actors exist in VR spaces, where they can stalk and harass other users. Reports of unsafe interactions in VR include groping, racist comments, stalking, and sexual harassment, which can be extremely traumatizing for victims despite happening virtually, given how immersive VR feels. According to a 2018 survey of more than 600 users of social VR platforms, 49% of women reported at least one experience of sexual harassment, 30% of men reported racist or homophobic comments, and 20% of men had experienced violence, through either comments or threats. After such reports surfaced early in VR's history, developers created safety mechanisms such as safety bubbles and the ability to block or mute users, but those may not be enough to protect users.

Malicious users can target others simply by remembering their screen names or avatar appearances, but poor development practices in the applications themselves can also expose information that can be used to track users. Network traffic transmitted by VR applications can contain metadata about the users present in a VR space, such as their user IDs, specifics of their avatar customizations, the type of device they're running the application on, and more. Additionally, many VR applications are accessible on devices beyond VR headsets, such as mobile devices and web browsers, where network traffic is often easier to inspect. This means attackers with less technical know-how may be able to access a VR application's network traffic without even owning a VR device. Even if users change their virtual appearance, such as their screen name and avatar, in an attempt to thwart stalking, a malicious user with technical knowledge could abuse the data in the network traffic, keeping track of user IDs and other metadata to identify and harass them anyway.
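
To make that risk concrete, the following sketch is purely hypothetical (the message format and field names are invented, not taken from any real application) and shows how an observer of such traffic could link one user across sessions via a stable user ID, even after the user changes their screen name and avatar:

```python
import json
from collections import defaultdict

# Hypothetical room-state messages of the kind a VR app might broadcast
# to every client, or expose through its web or mobile versions.
sample_messages = [
    '{"user_id": "u-9f3a", "screen_name": "StarFox", "avatar": "red_robot", "device": "headset"}',
    '{"user_id": "u-9f3a", "screen_name": "Anon42", "avatar": "blue_cat", "device": "browser"}',
]

sightings = defaultdict(list)

for raw in sample_messages:
    meta = json.loads(raw)
    # The screen name and avatar changed between sessions, but the
    # user_id did not: an observer keying on user_id follows the user anyway.
    sightings[meta["user_id"]].append(
        (meta["screen_name"], meta["avatar"], meta["device"])
    )

for user_id, appearances in sightings.items():
    print(f"{user_id} seen as: {appearances}")
```

Cosmetic changes made in-app accomplish nothing if a stable identifier rides along with every message, which is why obscuring or rotating IDs in network traffic matters.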

Furthermore, users can protect themselves only as much as an application's settings allow, such as through safety bubbles and blocking or muting other users. Not every VR application has the same protection settings, and poor development practices may still leave network traffic carrying metadata about the users present in a space, potentially rendering those settings useless. Since every application is developed by different engineers and contains different settings and functionality, users—particularly children and vulnerable people—could be at risk across different VR spaces. Such dangerous interactions in VR raise further questions about what degree of privacy and protection users should expect in a metaverse that attempts to replicate public, real-world spaces. Is VR surveillance the next step?

Cameras on the Virtual Street

Tracking location data in the metaverse is akin to physical surveillance cameras and to efforts such as Google's Street View, which builds online maps from panoramic pictures taken on public streets. Hypothetical VR surveillance and virtual "cameras" introduce new legal and consent questions, because virtual spaces hosted on privately owned devices and networks are not exactly the same as public places in the physical world.

Consider the following scenario: What if someone in a public VR space is considered to be "disturbing the virtual peace"? It isn't clear how to handle this situation in VR, just as it is unclear how to define VR crimes. We will have to imagine what VR surveillance might look like, how users would consent to it, and whether users might be willing to give up a level of privacy to fight and prevent potential VR crimes through surveillance, as we do in public spaces. The metaverse may also enable new virtual crimes we have not yet thought of that could necessitate more serious intervention and regulation.

While it is important to think about future privacy issues as VR continues to grow, let's take a step back to the current state of VR and ask whether, and how, it can be made safer and more private for all users.

Can We Even Have Private VR?

VR devices inherently need to collect a certain level of information to function and render the VR space to the user. However, users and developers can do more to make VR a safer place, especially as VR develops into more comprehensive metaverses.

The first step for users is to learn about the privacy risks of VR devices and applications before deciding to use them. Users can consult the Common Sense Privacy Evaluations to make an informed decision about using VR in the first place. When using any VR device or application, users should adjust the permission settings to be as privacy-preserving as possible within the provided options, which may still be quite limited. Many social VR applications offer protections against harassment and abuse, such as personal space bubbles, blocking, and muting, which users should take advantage of. However, users can only do as much as the devices and applications allow, which means VR developers need to build in more privacy- and safety-enhancing features.

The most effective step developers can take is to make privacy the default and lift the burden from users. In a privacy-by-default model, users would have to opt in to things like location data collection, data sharing with third parties, and making their profiles and avatars public to other users. Consent for location data collection should not be a "take it or leave it" proposition: users should be able to give or revoke permission without being blocked from using the application. In addition, users should be able to change their privacy settings during use, rather than only at signup or login; if a user sees a threat, they may want to respond by invoking certain privacy settings while still in the VR space. If the metaverse comes to serve a wide variety of public functions, such as education, politics, and daily shopping, users should not be locked out of it entirely, and participation in these public spaces should not be conditioned on agreeing to location and other data collection permissions buried in the terms and conditions.
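
One way to picture such a model is as a settings object whose data-sharing fields all start in the most protective state, with explicit, revocable opt-ins layered on top. The sketch below is illustrative only; no real platform's API is implied:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Every data-sharing setting starts in the most protective state;
    # nothing is collected or exposed until the user explicitly opts in.
    share_location_data: bool = False
    share_with_third_parties: bool = False
    public_profile: bool = False
    public_avatar: bool = False

    def _check(self, setting: str) -> None:
        if not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")

    def opt_in(self, setting: str) -> None:
        self._check(setting)
        setattr(self, setting, True)

    def revoke(self, setting: str) -> None:
        # Revocation is always allowed and never locks the user out.
        self._check(setting)
        setattr(self, setting, False)

settings = PrivacySettings()       # protective by default
settings.opt_in("public_avatar")   # explicit, granular consent
settings.revoke("public_avatar")   # changeable mid-session, not just at signup
```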

Other safety features that developers should implement include more granular audio settings, such as muting or modulating a user's voice for specific users to mask their true voice; randomization of avatar appearances and screen names so users aren't recognizable every time they enter an app or space; and customizable personal space bubbles that block specific users at different distances. These settings should also be the default for kids and other vulnerable populations to protect them throughout all their VR interactions. Developers should likewise adopt better programming practices and obscure user metadata in network traffic to block attackers from abusing that data for malicious purposes.
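
As a sketch of what a customizable bubble might look like under the hood (illustrative only; the class and method names are invented), the following lets a user assign a different minimum distance per user, so a known harasser can be kept farther away than a stranger:

```python
import math

class PersonalSpaceBubble:
    """Per-user blocking distances; an illustrative sketch, not a real API."""

    def __init__(self, default_radius=1.0):
        self.default_radius = default_radius
        self.per_user_radius = {}  # user_id -> custom minimum distance

    def set_distance(self, user_id, radius):
        """Keep a specific user at least `radius` meters away."""
        self.per_user_radius[user_id] = radius

    def should_block(self, user_id, my_pos, their_pos):
        """Hide or repel `user_id` if they breach their assigned radius."""
        radius = self.per_user_radius.get(user_id, self.default_radius)
        return math.dist(my_pos, their_pos) < radius

bubble = PersonalSpaceBubble(default_radius=1.0)
bubble.set_distance("known_harasser", 10.0)  # keep this user far away
print(bubble.should_block("known_harasser", (0, 0, 0), (5, 0, 0)))  # True
print(bubble.should_block("stranger", (0, 0, 0), (5, 0, 0)))        # False
```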

The FTC recently announced that it "is exploring rules to crack down on harmful commercial surveillance and lax data security." Given all of the current and hypothetical privacy and safety issues discussed here—and there are certainly more, beyond location privacy—the VR space desperately needs regulation, and quickly. As for developers, they need to do better and build more privacy-protecting and safety-preserving features into their applications and devices. While users remain largely unprotected in the current state of VR, they should educate themselves about the existing privacy risks and learn what they can do to protect themselves, their kids, and other vulnerable users if they decide to use VR.

Jill Bronfman

Jill Bronfman served as Privacy Counsel for Common Sense. She has taught law, graduate, and undergraduate students.