Using Unity to obtain Eye-tracking data from the VIVE Pro Eye headset

Presenter(s): Zachary Hoffman—Human Physiology

Faculty Mentor(s): Kate Spitzley

Session: Prerecorded Poster Presentation

Recent developments in Virtual Reality (VR) technology have created new opportunities for the use of VR in biomechanics research. Current VR devices allow tracking of head and limb movements through sensors on the head-mounted display (HMD) and handheld or attachable controllers. VR headsets with built-in eye-tracking cameras are a relatively new technology, and little research has been conducted on manipulating a virtual environment to analyze a subject's gaze. The aims of this project were to create a virtual test environment containing a moving object and to develop eye-tracking code that compared the subject's gaze to the position of the moving object at each frame. An HTC VIVE Pro Eye headset was used, and a virtual environment consisting of a gray room with a ball moving along one wall was created in Unity. At each frame, a text file (readable in Excel) was updated with the time since program initiation, the gaze direction of the subject, the position of the ball, and the difference between the ball's position and the gaze position. The aims of this project were successfully completed: the virtual environment was created, and we were able to export data comparing subject gaze position and ball position. This work provides evidence that this technology can be used in future VR eye-tracking research.
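The per-frame logging described above could be sketched roughly as follows. This is a minimal Python sketch, not the authors' Unity/C# implementation; the field layout, the function and variable names, and the use of an angular difference between the gaze ray and the head-to-ball direction are all assumptions, since the abstract does not specify the exact comparison or file format beyond "readable in Excel" (tab-separated values open directly in Excel):

```python
import io
import math

def log_frame(out, t, gaze_dir, ball_pos, head_pos=(0.0, 0.0, 0.0)):
    """Append one tab-separated record for a single frame:
    elapsed time, gaze direction vector, ball position, and the
    angle (degrees) between the gaze ray and the head-to-ball
    direction. All names and the angular metric are assumptions."""
    def normalize(v):
        mag = math.sqrt(sum(c * c for c in v))
        return tuple(c / mag for c in v)

    # Direction from the subject's head to the ball.
    to_ball = normalize(tuple(b - h for b, h in zip(ball_pos, head_pos)))
    gaze = normalize(gaze_dir)

    # Angle between the two unit vectors, clamped for float safety.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(gaze, to_ball))))
    angle_deg = math.degrees(math.acos(dot))

    fields = [f"{t:.4f}",
              *[f"{c:.4f}" for c in gaze_dir],
              *[f"{c:.4f}" for c in ball_pos],
              f"{angle_deg:.2f}"]
    out.write("\t".join(fields) + "\n")

# Example: gaze pointing straight at the ball gives a 0.00-degree error.
buf = io.StringIO()
log_frame(buf, 0.0333, gaze_dir=(0.0, 0.0, 1.0), ball_pos=(0.0, 0.0, 2.0))
```

In the actual project this logic would live in a Unity `Update()` callback, with the gaze direction supplied by the VIVE Pro Eye's eye-tracking SDK and the elapsed time taken from the engine clock.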
