Researchers at Rutgers University-New Brunswick have published what they claim is the first work to examine major privacy leakages using virtual reality (VR) headsets.
The research found that VR headsets with built-in motion sensors can be targeted by eavesdropping attacks that record subtle, speech-associated facial dynamics to steal sensitive user information such as credit card data and passwords.
The researchers tested popular VR headsets, including the Oculus Quest 2, HTC Vive Pro and PlayStation VR.
To test the security vulnerability, the researchers developed an eavesdropping attack targeting augmented reality (AR) and VR headsets, known as “Face-Mic.”
“Face-Mic is the first work that infers private and sensitive information by leveraging the facial dynamics associated with live human speech while using face-mounted AR/VR devices,” said Yingying Chen, associate director of WINLAB and director of electrical and computer engineering at Rutgers. “Our research demonstrates that Face-Mic can derive the headset wearer’s sensitive information with four mainstream AR/VR headsets, including the most popular ones: Oculus Quest and HTC Vive Pro.”
How they did it
Researchers studied three types of vibrations captured by AR/VR headset motion sensors: speech-associated facial movements; bone-borne vibrations; and airborne vibrations. Bone-borne vibrations are encoded with detailed gender, identity and speech information.
While vendors have policies governing access to the microphones built into headsets, Rutgers found that the built-in motion sensors — such as the accelerometer and gyroscope — inside VR headsets can be accessed without any permission. Hackers intent on committing eavesdropping attacks can exploit this gap.
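Because the motion-sensor streams require no permission, an attacker who can read them only needs to separate the faint speech-band vibrations from ordinary head motion. A minimal sketch of that idea in Python, on a synthetic signal with an FFT-based high-pass filter — the sample rate, frequencies, amplitudes and cutoff here are illustrative assumptions, not values from the paper:

```python
import numpy as np

FS = 1000          # hypothetical sensor sampling rate (Hz)
t = np.arange(0, 1, 1 / FS)

# Illustrative synthetic accelerometer trace: slow head motion (2 Hz)
# plus a much fainter speech-associated facial vibration (150 Hz).
head_motion = np.sin(2 * np.pi * 2 * t)
speech_vibration = 0.05 * np.sin(2 * np.pi * 150 * t)
raw = head_motion + speech_vibration

# Crude high-pass filter in the frequency domain: keep only bins
# above an assumed 80 Hz cutoff, discarding gross head movement.
spectrum = np.fft.rfft(raw)
freqs = np.fft.rfftfreq(len(raw), 1 / FS)
spectrum[freqs < 80] = 0
recovered = np.fft.irfft(spectrum, n=len(raw))

# The filtered trace tracks the speech component almost exactly,
# even though it is ~20x weaker than the head motion.
corr = np.corrcoef(recovered, speech_vibration)[0, 1]
print(round(corr, 3))  # → 1.0
```

A real attack would of course face noisy, lower-rate sensor data and would need learned models rather than a fixed cutoff, but the sketch shows why unrestricted accelerometer/gyroscope access is enough raw material to start from.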
Even simple speech content, such as spoken digits and words, can reveal sensitive information including credit card numbers, Social Security numbers, phone numbers, PINs, transactions, birth dates and passwords. This could lead to identity theft, fraud or leakage of healthcare information.
Once a hacker has identified a user, further details can be exploited, such as AR/VR travel histories, game and video preferences, and shopping habits — tracking information that could be lucrative to unscrupulous companies.
“Given our findings, manufacturers of VR headsets should consider additional security measures, such as adding ductile materials in the foam replacement cover and the headband, which may attenuate the speech-associated facial vibrations that would be captured by the built-in accelerometer/gyroscope,” Chen said.
The researchers’ next steps include examining how facial vibration information could be used to authenticate users and improve security in these headsets, and how the devices could unobtrusively capture a user’s breathing and heart rate to gauge well-being and mood.
The full research can be found in the Proceedings of the 27th Annual International Conference on Mobile Computing and Networking.