Multisensory Perception: Eye Movement and Auditory Perception in Immersive Virtual Environments
Student:
Daniel Bhella
Mentors:
Dr. Christopher Buneo, PhD – Arizona State University, SBHSE
Dr. Ayoub Daliri, PhD – Arizona State University, SBHSE
Dr. Yi Zhou, PhD – Arizona State University, SBHSE
YouTube Link:
View the video at the link below before joining the Zoom meeting
Zoom Link:
https://asu.zoom.us/j/7715217209
Abstract:
Perceiving our environment through multisensory stimuli is a task that our brains perform regularly in everyday life. The main goal of this study is to understand how visual stimulation can improve or degrade auditory perception and sound localization when an audiovisual stimulus is presented. We also seek to understand what eye movement patterns occur during sound localization in people with normal and impaired hearing. We hypothesize that these eye movement patterns contain information about how the brain encodes interactions between multisensory signals and about which sensory modality dominates in a given context. We have developed a series of virtual environments (ranging in difficulty), displayed on a 48-inch curved monitor and used in tandem with eye-tracking software, to challenge the subjects’ audiovisual perception. Subjects with varying levels of hearing impairment will be tasked with identifying the on-screen character who is speaking. In some trials, the subject will receive visual signals that conflict with the audio signals (the auditory stimulus arriving from a different direction than the visual signal on screen) to determine which sensory modality dominates. The results from this study will help us understand how the brain adapts its multisensory perception when one sensory modality is compromised.
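To make the conflict manipulation concrete, below is a minimal Python sketch of how congruent and incongruent audiovisual trials might be generated and how a subject's localization response could be classified by dominant modality. The speaker positions, the congruent/incongruent split, and all names (Trial, make_trial, score_response) are illustrative assumptions, not the study's actual design parameters.

```python
"""Sketch of an audiovisual conflict paradigm; all values are hypothetical."""
import random
from dataclasses import dataclass

# Hypothetical azimuths (degrees) of on-screen characters on the curved monitor.
SPEAKER_AZIMUTHS = [-45, -15, 15, 45]


@dataclass
class Trial:
    audio_azimuth: int   # direction the auditory stimulus is presented from
    visual_azimuth: int  # position of the visually cued (talking) character
    congruent: bool      # True when the audio and visual directions match


def make_trial(p_congruent: float = 0.5) -> Trial:
    """Pair an auditory source with a visual cue that may or may not match it."""
    audio = random.choice(SPEAKER_AZIMUTHS)
    if random.random() < p_congruent:
        visual = audio
    else:
        visual = random.choice([a for a in SPEAKER_AZIMUTHS if a != audio])
    return Trial(audio, visual, audio == visual)


def score_response(trial: Trial, reported_azimuth: int) -> str:
    """Classify which modality the subject's localization response followed."""
    if reported_azimuth == trial.audio_azimuth:
        return "auditory-dominant"
    if reported_azimuth == trial.visual_azimuth:
        return "visual-dominant"
    return "neither"


if __name__ == "__main__":
    for t in (make_trial() for _ in range(10)):
        # In the real experiment the response would come from the subject;
        # here the visual cue is echoed back just to exercise the scoring logic.
        print(t, score_response(t, t.visual_azimuth))
```

In incongruent trials, responses matching the visual azimuth would suggest visual dominance, while responses matching the auditory azimuth would suggest auditory dominance; the actual analysis would also incorporate the recorded eye-tracking data.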