#BreGroMM+
Linguistics & Multimodality - The Way to Go?!
26 April 2024, 1.00-3.00pm, M.0074 (Noordamzaal)
Asli Özyürek
MPI Nijmegen, The Netherlands
Languages can be expressed and perceived not only through speech or written text but also through visible body expressions (hands, body, and face) or visual graphics (e.g., drawings, emojis, etc.). All spoken languages use gestures along with speech, and in deaf communities all aspects of language can be expressed through the visible body in sign language. Expressing language through visual as well as other modalities is thus an inherent property of language. In recent years, a growing body of studies and theoretical frameworks has sought to understand how expressions in different modalities and representational formats can be combined to convey an integrated message at the levels of speech acts, grammar, discourse, and semantics, both in individuals and in interactional contexts. In this talk I will present studies at these different levels and discuss different proposals for how visual features of language, whether accompanying speech or within sign languages, constitute a fundamental aspect of the human language capacity, contributing to its uniquely flexible and adaptive nature in adults and children.
Hagoort, P. and Özyürek, A. (2024), Extending the Architecture of Language From a Multimodal Perspective. Top. Cogn. Sci. https://doi.org/10.1111/tops.12728
Asli Özyürek received a joint PhD in Linguistics and Psychology from the University of Chicago. She is currently the Director of the Multimodal Language Department at the Max Planck Institute for Psycholinguistics and a Professor at Radboud University. She is also a PI at the Donders Institute for Brain, Cognition and Behaviour and an elected member of Academia Europaea. She has received many prestigious grants from the NSF, NIH, Dutch Science Foundation, ERC, and Turkish Science Foundation. Özyürek investigates the inherently and universally multimodal nature of the human language capacity as one of its adaptive design features. To do so, she studies how the brain supports multimodal language, how typologically different spoken and signed languages pattern their structures given their multimodal diversity, and how the learning constraints and communicative pressures of interaction shape multimodal language, its acquisition, and its evolution as an adaptive system. She uses a variety of methodologies, such as behavioral and kinematic analyses of multimodal linguistic structures, eye tracking, machine learning, and brain imaging, to understand the complex multimodal nature of the human language capacity.
https://www.mpi.nl/people/ozyurek-asli