The role of limb dominance in visuoproprioceptive tasks

Presenter(s): Kieley Trempy

Faculty Mentor(s): Kate Spitzley & Andy Karduna

Poster 30

Session: Sciences

Movement is the product of sensory input, mainly from vision and proprioception, and motor output. Vision is the sense of the surrounding space, and proprioception is the sense of the body's position in space. Joint position sense (JPS) is commonly used as a measure of proprioception. JPS of the dominant and nondominant shoulder was measured in healthy subjects to quantify error in a JPS task with and without visual information. Previous studies have examined sensory differences in limb dominance with conflicting results. Some have shown that no differences exist, while others show that movements with the dominant arm rely more on visual information and movements with the nondominant arm rely more on proprioceptive information. The latter theory is illustrated in activities of daily living, such as preparing food, where the dominant arm uses a knife by viewing the movement while the nondominant arm guides the food by feeling the movement. It was hypothesized that in a JPS task, the dominant arm would have less error with visual information, whereas the nondominant arm would have less error without visual information. Subjects wore a virtual reality headset with a tracker on their arm while performing a JPS task. Using the headset, subjects were presented with either a visual representation of their arm location or no visual information about arm location. No difference was found between sides. However, a difference was seen between the vision and no-vision conditions regardless of limb dominance. Higher error with no vision indicates that proprioception alone is not as effective in driving accurate movements as the combination of vision and proprioception. Future studies analyzing the contributions of vision and proprioception to movement may rule out variation associated with limb dominance.

Using Unity to obtain eye-tracking data from the VIVE Pro Eye headset

Presenter(s): Zachary Hoffman—Human Physiology

Faculty Mentor(s): Kate Spitzley

Session: Prerecorded Poster Presentation

Recent developments in virtual reality (VR) technology have created new opportunities for the use of VR in biomechanics research. Current VR devices allow tracking of head and limb movements through sensors on the head-mounted display (HMD) and handheld or attachable controllers. VR headsets with built-in eye-tracking cameras are a relatively new technology, and little research has been conducted on manipulating a virtual environment to analyze a subject's gaze. The aims of this project were to create a virtual test environment with a moving object and to write eye-tracking code comparing the subject's gaze to the position of the moving object at each frame. An HTC VIVE Pro Eye headset was used, and a virtual environment consisting of a gray room with a ball moving along one wall was created in Unity. At each frame, a text file (readable in Excel) was updated with the time since program initiation, the subject's gaze direction, the position of the ball, and the difference between the ball position and the gaze position. Both aims were completed: the virtual environment was created, and we were able to export data comparing subject gaze position and ball position. This work provides evidence that this technology can be used in future VR eye-tracking research.
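The per-frame comparison described above can be sketched in simplified form. The actual project was implemented in Unity (C#); the Python sketch below only illustrates the logging logic under assumed conventions: gaze is a 3D direction vector from the headset, the ball and head are 3D positions, and the "difference" is taken here as the angle between the gaze direction and the direction from head to ball. The function names (`angular_error`, `log_frame`) and the tab-separated row format are hypothetical, not from the original code.

```python
import math

def angular_error(gaze_dir, target_dir):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
    norm = (math.sqrt(sum(g * g for g in gaze_dir))
            * math.sqrt(sum(t * t for t in target_dir)))
    # Clamp to guard against floating-point drift outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def log_frame(f, t, gaze_dir, ball_pos, head_pos):
    """Append one tab-separated row: time, gaze direction, ball position, error."""
    to_ball = [b - h for b, h in zip(ball_pos, head_pos)]  # direction head -> ball
    err = angular_error(gaze_dir, to_ball)
    f.write(f"{t:.3f}\t{gaze_dir}\t{ball_pos}\t{err:.2f}\n")
```

In a Unity implementation the equivalent of `log_frame` would run once per rendered frame (e.g. in an `Update` callback), which is what yields the Excel-readable text file described in the abstract.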