wiki:Other/Summer/2024/pLS

Version 12 (modified by Emily, 3 days ago)

Privacy Leakage Study and Protection for Virtual Reality Devices

Advisor: Dr. Yingying (Jennifer) Chen

Mentors: Changming Li (GR), Honglu Li (GR), Tianfang Zhang (GR)

Team: Dirk Catpo Risco (GR), Brody Vallier (HS), Emily Yao (HS)

Final Poster with Results

Project Overview

Augmented reality/virtual reality (AR/VR) devices are used for many purposes, ranging from communication and tourism to healthcare. Accessing their built-in motion sensors does not require user permission, since most VR applications need this information to function. However, this introduces a privacy vulnerability: zero-permission motion sensors can be used to infer live speech, which is a problem when that speech includes sensitive information.

Project Goal

The purpose of this project is to extract motion data from the inertial measurement unit (IMU) of an AR/VR device and then feed this data to a large language model (LLM) to predict the user's activity.

Weekly Updates

Week 1

Progress

  • Read research paper [1] regarding an eavesdropping attack called Face-Mic

Next Week Goals

  • We plan to meet with our mentors and get more information on the duties and expectations of our project

Week 2

Progress

  • Read research paper [2] regarding LLMs comprehending the physical world
  • Connected the paper's findings to the privacy concerns of AR/VR devices

Next Week Goals

  • Get familiar with AR/VR device:
    • Meta Quest
    • How to use device
    • Configure settings on host computer
  • Extract motion data from IMU
    • Connect to the motion sensor application programming interface (API) to access data
    • Data processing method
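As a starting point for the data processing method, the sketch below parses logged IMU samples into numpy arrays. The CSV layout (timestamp plus three accelerometer axes) is a hypothetical example of how a headset app might log its readings, not the actual format used.

```python
import numpy as np

def parse_imu_log(lines):
    """Parse hypothetical CSV lines of 'timestamp,ax,ay,az' into arrays.

    Returns (t, acc) where t has shape (N,) and acc has shape (N, 3).
    """
    rows = [list(map(float, ln.strip().split(","))) for ln in lines if ln.strip()]
    data = np.asarray(rows)
    return data[:, 0], data[:, 1:4]

# Example: three fake samples at a 50 Hz rate
log = [
    "0.00,0.01,9.81,0.02",
    "0.02,0.03,9.79,0.01",
    "0.04,0.02,9.80,0.00",
]
t, acc = parse_imu_log(log)
```

Once the samples are in array form, the smoothing and peak-finding steps from later weeks can operate on each axis directly.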

Week 3

Progress

  • Set up host computer and Android Studio environment
  • Started extracting data from the inertial measurement unit (IMU)
  • Recorded and ran trials of varying head motions

Next Week Goals

  • Run more tests to collect more data
  • Design more motions for data collection
  • Different head motions
    • Rotational
    • Linear
  • Combinations of head motions
    • Looking left and then right
    • Looking up and then down

Week 4

Progress

  • Designed and collected more motion data
    • Looking up then back middle
    • Looking right then back middle
    • Moving head around in a shape
    • Moving head back and forward
  • Used MATLAB noise removing functions to clean graphs
    • smooth
    • lowpass
    • findpeaks
  • Created a 3D visualization of acceleration to show time and position
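The MATLAB cleaning steps above have rough numpy equivalents; the sketch below shows minimal stand-ins for smooth and findpeaks (a lowpass step would typically use something like scipy.signal.butter instead). The noisy sine here is only a stand-in for one accelerometer axis.

```python
import numpy as np

def smooth(x, span=5):
    """Moving-average smoothing, similar in spirit to MATLAB's smooth()."""
    kernel = np.ones(span) / span
    return np.convolve(x, kernel, mode="same")

def find_peaks(x):
    """Indices of simple local maxima, a minimal stand-in for findpeaks()."""
    return np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1

# Noisy sine as a stand-in for one accelerometer axis
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
signal = np.sin(3 * t) + 0.1 * rng.standard_normal(200)
clean = smooth(signal, span=7)
peaks = find_peaks(clean)
```

Smoothing first makes the peak detector far less sensitive to sensor noise, which is the same reason the MATLAB pipeline cleans the graphs before locating peaks.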

Next Week Goals

  • Find a way to get hand motion data using Android Studio
  • Work on fixed prompts to get accurate LLM results using ChatGPT 4o and ChatGPT 4

Week 5

Progress

  • Enabled hand motion data collection using the VR device
    • Utilized Android Studio and VrApi to access the VR controller interface and extract motion data
  • Conducted additional motion experiments to gather comprehensive data sets
    • Motion data with both head and hand activities
  • Implemented 3D plots to visualize and analyze hand motion data for accuracy

Next Week Goals

  • Utilize motion research paper [3] to model more motion activities
  • Start building a CNN model that can recognize activity based on motion data
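A CNN for this task would be trained in a framework such as PyTorch; the numpy-only forward pass below is just a shape sketch of how a window of multi-channel IMU data could flow through a conv layer, pooling, and a dense layer to activity scores. All layer sizes and the four-class output are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d(x, w):
    """Valid 1-D convolution: x (C_in, T), w (C_out, C_in, K) -> (C_out, T-K+1)."""
    c_out, c_in, k = w.shape
    t_out = x.shape[1] - k + 1
    out = np.zeros((c_out, t_out))
    for i in range(t_out):
        out[:, i] = np.tensordot(w, x[:, i:i + k], axes=([1, 2], [0, 1]))
    return out

# Toy forward pass: 6 IMU channels (accel + gyro), 100-sample window, 4 activities
x = rng.standard_normal((6, 100))
w1 = rng.standard_normal((16, 6, 5)) * 0.1     # conv filters
h = np.maximum(conv1d(x, w1), 0.0)             # ReLU -> (16, 96)
h = h.reshape(16, 32, 3).mean(axis=2)          # crude average pooling -> (16, 32)
w2 = rng.standard_normal((4, 16 * 32)) * 0.1   # dense layer to 4 activity scores
scores = w2 @ h.reshape(-1)
```

The key design point is that 1-D convolutions slide over time while treating the IMU axes as channels, so the same filters can pick up a motion pattern wherever it occurs in the window.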

Week 6

Progress

  • Made a list of different motion data to capture and train a convolutional neural network (CNN)
    • Researched previous work based on raw motion data and CNNs
  • Specifics of the motion data
    • Samples
    • Users
    • Data
  • Design prompt for LLM
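A fixed prompt for the LLM could be assembled from per-axis summary statistics of a motion segment, as in the sketch below. The template wording, field names, and activity list are illustrative placeholders, not the team's actual prompt.

```python
# Hypothetical activity labels for the classification prompt
ACTIVITIES = ["looking left then right", "looking up then down",
              "drawing a shape", "nodding"]

def build_prompt(stats):
    """Format per-axis summary statistics of an IMU segment into a fixed prompt."""
    lines = [f"- {axis}: mean={m:.3f}, std={s:.3f}" for axis, (m, s) in stats.items()]
    return (
        "You are given summary statistics of head-motion IMU data from a VR headset.\n"
        + "\n".join(lines)
        + "\nWhich of these activities best matches the data? "
        + ", ".join(ACTIVITIES)
    )

prompt = build_prompt({"accel_x": (0.01, 0.42),
                       "accel_y": (9.78, 0.05),
                       "gyro_z": (0.00, 1.10)})
```

Keeping the template fixed and varying only the statistics makes responses from different prompt structures easier to compare side by side, which is the plan for the following week.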

Next Week Goals

  • Start getting more motion data
  • Start using LLM to analyze the collected data
    • Use the designed prompts
    • Design and try more prompt structures and compare LLM responses

Week 7

Progress

  • Placeholder

Next Week Goals

  • Placeholder

Week 8

Progress

  • Placeholder

Next Week Goals

  • Placeholder

Week 9

Progress

  • Placeholder

Next Week Goals

  • Placeholder

Week 10

Progress

  • Placeholder

Next Week Goals

  • Placeholder

Links to Presentations

Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7 Week 8 Week 9 Week 10 Final Presentation

References

[1] Shi, C., Xu, X., Zhang, T., Walker, P., Wu, Y., Liu, J., Saxena, N., Chen, Y. and Yu, J., 2021, October. Face-Mic: inferring live speech and speaker identity via subtle facial dynamics captured by AR/VR motion sensors. In Proceedings of the 27th Annual International Conference on Mobile Computing and Networking (pp. 478-490).

[2] Xu, H., Han, L., Yang, Q., Li, M. and Srivastava, M., 2024, February. Penetrative ai: Making llms comprehend the physical world. In Proceedings of the 25th International Workshop on Mobile Computing Systems and Applications (pp. 1-7).

[3] Garcia, M., Ronfard, R. and Cani, M.P., 2019, October. Spatial motion doodles: Sketching animation in vr using hand gestures and laban motion analysis. In Proceedings of the 12th ACM SIGGRAPH Conference on Motion, Interaction and Games (pp. 1-10).
