Virginia Tech leads research to expose new privacy concerns with virtual reality

Virtual and augmented reality (VR/AR) isn’t just for gamers anymore. Immersive 3D environments are becoming more common in education, health care, engineering, aerospace, and other important industries.
As emerging technologies make VR/AR systems faster and more efficient, they become even more useful — and more risky.
Eye tracking by VR/AR headsets produces what’s called “gaze data,” or information gleaned from where your eyes focus attention. How it is captured and used — and by whom — could pose significant security and privacy challenges. Department of Computer Science faculty members Brendan David-John and Bo Ji are leading a three-year, $600,000 U.S. National Science Foundation project to better secure these systems.
“We know this vulnerability exists, although it has not been exploited by malicious users — yet,” Ji said. “So we want to make the research community and general public aware of this novel vulnerability, and also, of course, we want to propose effective defense mechanisms.”
Gaze data can reveal personal details such as age and gender that could have nefarious implications in the wrong hands.
Securing the side door
To increase speed and efficiency and enrich the visual experience of VR/AR, “foveated systems” track the movement of the fovea, a small part of the retina that’s responsible for sharp, detailed vision.
These technologies follow the fovea as it moves around the screen, sending the highest concentration of computer processing power to those parts of the image. This allows systems to render finely detailed 3D experiences with fewer computing resources. It also boosts realism, not just for entertainment purposes like gaming, but also for training simulations used in critical industries.
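For readers unfamiliar with the technique, the sketch below shows the core idea in Python: each screen region is assigned a render quality based on its angular distance from the tracked gaze point. The thresholds, function name, and detail levels are illustrative assumptions, not the parameters of any particular headset.

```python
import math

# Hypothetical foveation thresholds (degrees of visual angle from the gaze point).
# Real headsets tune these per display; the numbers here are illustrative only.
FOVEAL_RADIUS_DEG = 5.0       # rendered at full resolution
PERIPHERAL_RADIUS_DEG = 20.0  # rendered at medium resolution; beyond this, low

def detail_level(gaze_deg, region_center_deg):
    """Pick a render quality for a screen region from its angular
    distance to the tracked gaze point (both given in degrees)."""
    dx = region_center_deg[0] - gaze_deg[0]
    dy = region_center_deg[1] - gaze_deg[1]
    eccentricity = math.hypot(dx, dy)
    if eccentricity <= FOVEAL_RADIUS_DEG:
        return "full"    # fovea: sharpest vision, so spend the most GPU effort here
    if eccentricity <= PERIPHERAL_RADIUS_DEG:
        return "medium"
    return "low"         # far periphery: coarse shading is rarely noticed

# Example: with the gaze at screen center, a nearby tile renders at full
# detail while a tile roughly 30 degrees away drops to the lowest level.
print(detail_level((0.0, 0.0), (3.0, 2.0)))    # -> "full"
print(detail_level((0.0, 0.0), (25.0, 18.0)))  # -> "low"
```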
Most eye-tracking systems offer users the option of protecting their personal data. But malicious actors may be able to circumvent these protections using what are known as “side-channel attacks” to capture and exploit information.
“Gaze data is very sensitive. Your eye movements can reveal a lot, like age, gender, ethnicity, what particular ad you are looking at on a computer screen — that sort of detail. It can even be used to extrapolate a user’s unconscious biases,” David-John said. “Because of that, best practices are to not share gaze data freely with other people unless we have to.”
Retaining gaze data locally on a device like a computer can be a smart choice. But it’s still possible someone could code a “trap door” into a virtual environment that is triggered when a user looks at it, or develop some other mechanism for capturing and extrapolating sensitive information.
Such side-channel attacks can be hard to guard against.
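As a rough sketch of that kind of leak, the Python example below imagines hypothetical “probe” content embedded in a scene. The probe never reads the gaze stream; it only counts how often the foveated renderer draws it at full detail, and those counts alone reveal which object the user dwelled on. The class names, thresholds, and scene layout are assumptions for illustration, not the attack mechanisms the Virginia Tech team is developing.

```python
import math

FOVEAL_RADIUS_DEG = 5.0  # hypothetical size of the full-detail foveal zone

def rendered_at_full_detail(gaze_deg, position_deg):
    """True when a scene region falls inside the high-detail foveal zone."""
    return math.hypot(position_deg[0] - gaze_deg[0],
                      position_deg[1] - gaze_deg[1]) <= FOVEAL_RADIUS_DEG

class GazeProbe:
    """Embedded content that counts how often it was drawn at full detail."""
    def __init__(self, name, position_deg):
        self.name = name
        self.position_deg = position_deg  # angular position in the scene
        self.high_detail_frames = 0

    def observe(self, drawn_at_full_detail):
        # The probe sees only the quality level it was assigned, never gaze data.
        if drawn_at_full_detail:
            self.high_detail_frames += 1

probes = [
    GazeProbe("poster_left",  (-15.0, 0.0)),
    GazeProbe("poster_right", ( 15.0, 0.0)),
]

# Simulated frames in which the (hidden) gaze dwells near the right-hand poster.
gaze_samples = [(14.0, 1.0)] * 50 + [(-2.0, 0.0)] * 10

for gaze in gaze_samples:
    for probe in probes:
        # The foveated renderer, not the probe, knows the gaze point; the probe
        # only learns the quality level it ends up being drawn at.
        probe.observe(rendered_at_full_detail(gaze, probe.position_deg))

# The per-probe counts alone reveal which content held the user's attention.
for probe in probes:
    print(probe.name, probe.high_detail_frames)
```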

“Side-channel attacks themselves are not new. You basically infer some information through a secondary channel. There are many different known side-channel attacks,” Ji said. “But we haven't seen anything in the literature before about side-channel attacks against gaze data in foveated systems. It’s new, and poses some significant risks.”
In fact, predicting the way hackers might attack and exploit VR/AR headsets with foveated rendering is a big part of the project. The research team, including computer science doctoral student Paul Maynard and electrical and computer engineering undergraduate Evan Hess, will develop a set of novel attack mechanisms to test the vulnerability of these foveated systems. They will also work on ways to neutralize those threats.
“We really need to get ahead of it before these systems have eye tracking fully integrated and running all the time, and to have the testing infrastructure to be prepared for newer VR systems and foveated rendering approaches,” David-John said.
The data set and methods will be made public to educate VR/AR developers about existing security and privacy risks and provide a testbed to help thwart future attacks.