Language and Cognition
Language Acquisition
Sign Language
Gesture
Spatial Development
Mathematical Development
How do gestures and sign languages shape the way we think, learn, and remember?
My research explores how visual communication, such as hand gestures and sign languages, supports learning and cognitive development. When we speak, we often move our hands in meaningful ways. In sign languages, meaning is expressed entirely through visual-spatial forms. Many of these forms are iconic: they visually resemble what they represent (for example, a sign that mimics turning a key). I study how these visual similarities help children and adults understand and organize information.
My work focuses on four main areas. First, I examine how gestures and sign language support spatial thinking, such as understanding left and right.
Second, I study vocabulary learning, showing that visually meaningful (iconic) signs can make word learning easier, especially for children who experience delayed language exposure. Some of this work includes developing and testing game-based digital learning tools.
Third, I explore how gesture and sign language shape emotional development and memory: how we remember emotionally meaningful events, and how caregivers and children regulate emotions together. Finally, I investigate how parents’ use of number-related gestures supports children’s early mathematical development.
Across all of these areas, I examine how the timing of language exposure matters. Comparing hearing children and adults with deaf individuals who learned sign language at different ages allows us to better understand how early experience shapes development.
Overall, my research highlights that gestures and visual language are not “extra” to communication; they are central tools that help us learn, think, and connect with others.