The forms and functions of primate multimodal communication

While language is often considered uniquely human, many of its multimodal components are shared with other species, providing a window onto the origins of language. This symposium focuses on recent advances in describing the forms of multimodal communication in primates and the ways in which signalers use and combine sensory modalities within and across signals to achieve different communicative functions. Form encompasses the redundancy and complementarity of the information conveyed through different modalities at the prosodic and syntactic levels, as well as their linear or synchronous combination into a single stream. Functional explanations are varied and include detectability, cross-modal inputs in learning and development, the disambiguation or addition of information, rhythmicity and turn-taking in conversations, and the maintenance of recipient attention. Our aim is to address how the recent drive to employ a multimodal approach articulates with existing theoretical models of language evolution, extends them, or calls for new theoretical developments.

Do infants perceive prosody as a multimodal event?

Bahia Guellaï, Laboratoire Ethologie, Cognition, Développement (LECD), Université Paris Nanterre, France

Speech is a multimodal experience: it is perceived by both the ears and the eyes. While studies have shown that seeing someone talking influences adults' and infants' perception of speech, it is not clear whether prosody is perceived both auditorily and visually. Here, we investigated whether infants are sensitive to the match between the auditory and visual correlates of infant-directed speech prosody from birth, and how this capacity develops during the first year of postnatal life. Results suggest that the prosodic structure of speech, as conveyed by voice and head kinematics, is perceived as highly connected from the earliest stages of development.

Multimodal and multicomponent lipsmacking in crested macaques (Macaca nigra)

Jérôme Micheletta1, Antje Engelhardt2, Lee Matthews1, Muhammad Agil3 & Bridget M. Waller1

1Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth, UK, 2School of Natural Sciences and Psychology, Liverpool John Moores University, Liverpool, UK, 3Faculty of Veterinary Medicine, Bogor Agricultural University, Bogor, Indonesia

Primates’ communicative signals are often dynamic and composed of multiple components, frequently drawn from different sensory modalities. We investigated whether the composition of the lipsmack (a display used mainly in affiliative interactions) influenced the outcome of social interactions in crested macaques. Multimodal lipsmacks increased the probability of affiliative contact, suggesting that they have enhanced signal value. While the total number of visual components in the display had no effect, some visual components appeared more influential than others. These results highlight the importance of a multimodal approach to the study of primate communication for a better understanding of the evolution of complex communication.

Production and reception of unimodal and multimodal signals in wild chimpanzees (Pan troglodytes schweinfurthii)

Disruption of social attention and the tactical use of sensory modalities in primate infant signaling

Marie Bourjade, CLLE, Université de Toulouse, CNRS, Toulouse, France

While intentional multimodal communication may well be a necessary condition for the emergence of a multimodal language, the debate on language origins has largely revolved around the primary gestural or vocal roots of modern language. I will present an original hypothesis that intentional multimodal communication evolved under the selective pressure of repairing visual attentional and communicative breakdowns. I will show how changes in mothers’ attention may prompt the use of different signal modalities by their infants, and then draw on empirical evidence from baboon, chimpanzee, and human communication to generate testable predictions of this hypothesis.

Combining signals can provide functional specificity: evidence from great ape vocal-gestural combinations

Emilie Genty1 & Catherine Hobaiter2, 1IPTO, Faculty of Science, University of Neuchâtel, Switzerland, 2Origins of Mind, University of St Andrews, St Andrews, Scotland

The ability to combine categories of signals (vocal, gestural, facial) is a universal and important feature of human language shared with great apes, but it has rarely been investigated. We will present findings on bonobo and chimpanzee vocal-gestural combinations, revealing that some combinations are specific to certain social functions and provide specificity about signalers’ intended goals. We will discuss how this approach can inform the debate over the mechanisms underpinning the learning of successful signal production. We argue that a holistic approach is key to understanding the evolution of the combinatorial characteristics of human communication.

Gesture is the natural modality for language creation: Evidence from two extreme experiments

Nicolas Fay, School of Psychological Science, University of Western Australia

How does modality affect language creation? We compared the efficacy of gesture with that of non-linguistic vocalization across two extreme experiments. In each, we recorded participants communicating a range of meanings by gesture only and by vocalization only. The recordings were later played back to interpreters who tried to guess each meaning. Experiment 1 was a cross-cultural study, with producers sampled from Australia and Vanuatu (N=60). In Experiment 2, producers were severely vision-impaired or unimpaired (N=20). In both experiments, communication using gesture was more successful than communication using non-linguistic vocalization, indicating that gesture is the natural modality for language creation.

Marie Bourjade is a Lecturer in Psychology at the University of Toulouse in France. Her main interests lie in psychology, animal and human behavior, and cognition. Her research focuses on the evolution of intentional multimodal communication, social attention, and coordination in primates, with the aim of further documenting the evolution of human language. She studies multimodal signaling in diverse social contexts such as dyadic and triadic social interactions, face-to-face communication, and cooperative situations.

Catherine Hobaiter is a Lecturer in Primate Behavior in the Origins of Mind group in the Department of Psychology and Neuroscience at the University of St Andrews in St Andrews, Scotland. She has spent the past 14 years studying wild primates across Africa, in particular the chimpanzees of the Budongo Forest in Uganda. Through long-term field studies she explores what the behavior of great apes living in their natural environment tells us about their minds, and about the evolutionary origins of our own behavior. Her research interests include communication, in particular gestural communication, and social learning. She recently established the Great Ape Dictionary, an online site allowing scientists from around the world to access video archives of wild ape behavior, and citizen scientists to participate in online research. She is the co-director of the Bugoma Primate Conservation Project and the Vice President of Communications for the International Primatological Society.

Sponsoring Institutions