Analysis of Kinematic Data in the Perturbation of Virtual Arm Position in Primates
Student:
Justin Pettit
Mentors:
Dr. Christopher Buneo, PhD - Arizona State University, SBHSE
Dr. Stephen Helms Tillery, PhD - Arizona State University, SBHSE
Dr. Marco Santello, PhD - Arizona State University, SBHSE
YouTube Link:
View the video link below before joining the Zoom meeting
Zoom Link:
https://asu.zoom.us/j/82759847201
Abstract:
In the realm of multisensory integration, in which the nervous system combines sensory information from multiple modalities, vision strongly influences how a person perceives the world and attempts to perform a movement. This goes hand in hand with proprioception, the body's ability to sense the location and movement of its limbs in space. Through the use of virtual reality, the relationship between visual and proprioceptive signals during movement can be tested. In this study, we hypothesize that increasing the size of a visual perturbation of arm position will systematically influence the accuracy and precision of movement trajectories in a reaching task performed by non-human primates. The visual perturbation is created in a semi-immersive virtual reality environment in which the primates perform a reaching task under two conditions, one with tactile feedback regarding their starting position and one without. In separate blocks of trials, the arm is visually perturbed from its veridical position at one of four distances. The impact of this perturbation will be assessed with circular statistical techniques, which yield the circular mean and confidence intervals for each trial block. The results of this study will provide a better understanding of the relationship between vision and proprioception and have potential implications for the development of neural prostheses.
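As one concrete illustration of the circular analysis mentioned above, the sketch below computes the circular mean of a set of reach directions and a bootstrap confidence interval around it. This is a minimal sketch only: the variable names, the example data, and the use of a percentile bootstrap for the confidence interval are assumptions for illustration, not the study's actual analysis pipeline.

import numpy as np

def circular_mean(angles):
    # Circular mean of angles (in radians) via the mean resultant vector.
    return np.arctan2(np.mean(np.sin(angles)), np.mean(np.cos(angles)))

def bootstrap_ci(angles, n_boot=10000, alpha=0.05, rng=None):
    # Percentile bootstrap confidence interval for the circular mean.
    # Deviations are measured relative to the observed mean so the
    # interval wraps correctly around the circle.
    rng = np.random.default_rng() if rng is None else rng
    obs = circular_mean(angles)
    boots = np.empty(n_boot)
    for i in range(n_boot):
        resample = rng.choice(angles, size=len(angles), replace=True)
        # Signed angular deviation of each bootstrap mean from the observed mean.
        boots[i] = np.angle(np.exp(1j * (circular_mean(resample) - obs)))
    lo, hi = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
    return obs + lo, obs + hi

# Hypothetical reach directions (degrees) from one trial block.
angles = np.deg2rad([12.0, 8.5, 15.2, 10.1, 9.7, 13.4])
print("circular mean (deg):", np.rad2deg(circular_mean(angles)))
print("95% CI (deg):", np.rad2deg(bootstrap_ci(angles)))

In practice, a library routine such as scipy.stats.circmean could replace the hand-written mean; the bootstrap is shown here only as one common way to obtain confidence intervals for circular data.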