Action & Brain Lab
At the Action & Brain Lab, we use neural and behavioral science to study the embodied aspects of sign language.
Through EEG and other methods, we examine the function of sensorimotor, language, and attentional networks of the brain in relation to sign language. Research areas include the role of mirroring systems, spatial perception, and how action experience and action processing affect one another. We also leverage this work to build new systems for learning signed language.
Our postdoctoral, graduate, and undergraduate trainees work primarily on two arms of research. First, we study how knowledge of signed languages (e.g., American Sign Language) changes visual perception skills, cognitive processing, and spatial cognition. We conduct both EEG and large-scale online behavioral studies to examine these questions.
Second, alongside a team from the Motion Light Lab, we study how to develop high-quality signing virtual humans (i.e., signing avatars) and examine how people respond to them. We also lead an ongoing project to develop an immersive virtual reality (VR) ASL-learning game, which includes sign language recognition in VR.
Our lab is always interested in connecting with potential collaborators and trainees! Please reach out if you would like to connect.