Program

Date: September 5, 2022

Time (JST) | Time (CEST) | Presenter & Title

13:30-13:35 | 06:30-06:35

Welcome message and introduction to the workshop

13:35-13:50 | 06:35-06:50

Cas Coopmans, Karthikeya Kaushik, & Andrea E. Martin

Max Planck Institute for Psycholinguistics


Hierarchical structure in language and action

Abstract

It has been argued that language and action are analogous in their hierarchical organization. To investigate this claim formally, we present a formalism that describes the algebraic structure corresponding to the set of hierarchical structures underlying sentences. When this formalism is applied to actions, it appears to be both too strong and too weak. One of our critical results is that the formal model for language is based on a type of compositionality that does not seem to be found in action structures. More specifically, in contrast to language, the meaning of action sequences is not based on their hierarchical structure. We conclude that the notion of structure in actions is fundamentally weaker than what we find in language.

13:50-14:05 | 06:50-07:05

Giorgio Papitto, Angela D. Friederici, & Emiliano Zaccarella

Max Planck Institute for Human Cognitive and Brain Sciences


Neurobiology of the relationship between action and language processing

Abstract

Similarities between action and language have usually been inferred either (i) from formal comparisons between the abstract structures of language and action or (ii) from the observation that both recruit partially overlapping neural resources, especially in the inferior frontal cortex. Recent work has called these assumptions into question, on both formal and neuroanatomical grounds. In this talk, I will discuss the similarities and differences between language and action, focusing mainly on their differential recruitment of Brodmann Area (BA) 44 in the left Inferior Frontal Gyrus (IFG) and on what we can infer from this functional segregation.

14:05-14:20 | 07:05-07:20

Emiliano Zaccarella, Patrick C. Trettenbrein, & Angela D. Friederici

Max Planck Institute for Human Cognitive and Brain Sciences


Neurobiology of spoken, written, and sign language processing

Abstract

The human capacity for language is best described as a biologically determined computational mechanism yielding an unbounded array of hierarchically structured expressions, and should not be conflated with notions of “speech” or “communication”. The neurobiological basis of this cognitive mechanism has been localized primarily to the inferior frontal and posterior temporal cortices of the left hemisphere. This functional and structural network subserves the dynamic interaction of semantic and syntactic aspects during spoken, written, and sign language processing. In addition, other left perisylvian, right-hemispheric, and subcortical regions have also been implicated in different aspects of language processing. Against this background, we will argue that the brain regions involved in language processing can conceptually be segregated into a “core” and an “extended” language network: the former is assumed to be specialized for processing linguistic information, whereas the latter may also be recruited by other cognitive functions.

14:20-14:35 | 07:20-07:35

Simon Thibault, Raphaël Py, Angelo Mattia Gervasi, Martin Lövden, Véronique Boulenger, Alice C. Roy, & Claudio Brozzoli

University of Lyon


Neural correlates of action and language processing

Abstract

Does tool use share syntactic processes with language? Acting with a tool is thought to add a hierarchical level to the motor plan. In the linguistic domain, syntax is the cognitive function that handles interdependent elements. Using functional magnetic resonance imaging, we detected common neurofunctional substrates in the basal ganglia subserving both tool use and syntax in language. Consistent with the existence of shared neural resources, we observed bidirectional behavioral enhancement of tool-use and syntactic skills, such that training one function improves performance in the other. This reveals supramodal syntactic processes common to tool use and language.

14:35-14:50 | 07:35-07:50

Yannick Becker, Siham Bouziane, Eloise Disarbois, Amelie Picchiottino, Solène Brunschvig, Romane Phelipon, & Adrien Meguerditchian

Aix Marseille Université


Evolutionary perspective on the neurobiology of language and action

Abstract

In this talk, I will focus on the structural lateralisation of key perisylvian structures that have historically been shown to be crucial for language processing in the human brain: Broca’s area, the Planum Temporale, and their white matter fibre connection, the Arcuate Fasciculus. The central questions are: Is this structural hemispheric organisation shared between humans and monkeys? When does it emerge in development, and does it predict later asymmetric behaviours? If so, for what function did it evolve? My results support a possible gestural origin of language lateralisation, dating back to the common ancestor of humans and baboons 25-35 million years ago.

14:50-15:00 | 07:50-08:00

Presentation of the following E-Posters

Axel G. Ekström

KTH Speech, Music & Hearing


Reaching for speech: The role of dopamine

Abstract

The motor constellation (MC) theory of infant phonological ontogenesis is presented. In vocal learners, auditory feedback is necessary for matching explorative vocal output against intended sound, and motor learning is facilitated by dopaminergic circuitry involving the basal ganglia and cerebellum. Phonetic fragments can thus be mapped onto corresponding motor activity constellations in laryngeal-orosensory space, enabling replicated matching over repeated vocalizations across time.

The MC theory poses a set of assumptions: (1) human infants are born with the instinct to explore orosensory space through tactile sensorimotor behavior; (2) infant babble gives rise to emergent pseudo-segmental phonetic properties; (3) invariance in speech signal attributes further facilitates the acquisition of language-specific phonemic repertoires and gives rise to phonemes proper, defined as invariants in cognitive-orosensory space; (4) babble is thus gradually replaced by elective values in sound space, perceptual invariants selected via interaction with ingroup members; (5) achieving a desired phonological target is reinforced via dopaminergic signaling, which instantiates the encoding of combinations of motor-sensory and auditory-perceptual features.

Complex motor learning, which underlies vocal learning, is contingent on sensory feedback. More broadly, basal ganglia function likely underlies learning by parsing successful from unsuccessful motor behavior relative to desired outcomes. The basal ganglia, as part of the neural dopaminergic circuitry, provide the necessary emphasis for mapping speech sounds, once achieved, to their corresponding places in orosensory space, facilitating repetition across continuous interaction. Neurologically, internally guided vocal explorative behavior and imitation are likely enabled by common ventral tegmental area-basal ganglia circuitry and guided via cortical-basal ganglia circuitry. Recent work has also suggested that the human laryngeal motor cortex, the center for motor control of the larynx, may be modulated by dopamine through its place in the vocal basal ganglia circuitry. Any achieved combinatory pattern thus becomes more easily repeatable through continuous reinstatement. A novel method for conceptualizing speech development is described, in which emergent speech capacities are characterized as the cognitive mapping of coordinates in laryngeal-orosensory space. Future directions are discussed.

Elliot Howard-Spink, Misato Hayashi, Tetsuro Matsuzawa, Daniel Schofield, Thibaud Gruber, & Dora Biro

University of Oxford


Syntactic complexity of chimpanzee action in a natural tool-use task

Abstract

Hierarchical sequencing underlies many complex human behaviours, including language and tool use. However, much uncertainty remains surrounding the origins of the cognitive machinery that supports hierarchical behaviour. Several hypotheses have proposed that the cognition supporting hierarchical syntax in language may have evolved via the exaptation of pre-existing cognitive machinery supporting hierarchical sequencing in non-linguistic domains (e.g. action, or more specifically in some hypotheses, tool use). Thus, it is of interest to determine whether hierarchical action sequencing could pre-date the emergence of hierarchical syntax in human language. As close evolutionary relatives of our species, non-human primates (particularly chimpanzees) may provide insight into possible homologous origins of hierarchically organised human behaviours, specifically given their use of both tools and vocal sequences in the wild. Current discussion in the field finds little evidence of hierarchical organisation in the vocal sequences of chimpanzees; however, fewer data are available for judging the hierarchical complexity of chimpanzee action during tool use. In the present study, we evaluated the structural organisation of sequential action used by West African chimpanzees (Pan troglodytes verus) during natural nut-cracking behaviours at an outdoor laboratory in Bossou, Guinea. We recorded over 8000 actions used by 8 chimpanzees during nut-cracking with stone tools, and employed models of mutual information (MI) decay to assess whether chimpanzees relied on Markovian, hierarchical, or composite structuring strategies to sequence actions during nut-cracking. For all individuals, MI decay was best approximated by a composite model, which reflects the production of shorter Markovian sequences (2-5 elements long) that are then hierarchically organised to produce extended sequences of action.
Our findings therefore offer evidence for a hierarchical system of action organisation in a natural form of chimpanzee tool use. In light of the limited hierarchical complexity of non-human primate communication systems, we discuss our findings in relation to possible paths to the evolution of hierarchical syntax in human language through action.

Misaki Tsuboya, Tetsuya Yasuda, & Harumi Kobayashi

Graduate School of Tokyo Denki University


Symmetry bias in dogs

Abstract

Symmetry bias refers to the expectation that, having been exposed to two successive stimuli (A→B), one will also expect (B→A) to hold. It has been argued to be crucial for language development, in particular for word learning, and possibly important for language evolution. It is often stated that symmetry bias does not exist in non-human animals. However, Lionello-DeNolf et al. (2021) reported that symmetry bias was confirmed in approximately 30% of the non-human animals examined. It has been suggested that self-domestication, or becoming less aggressive and more prosocial (Range et al., 2022), may be a preadaptation to language; dogs have a long history of cooperation with humans and have developed such traits. In addition, dogs show contextual considerations similar to those of human infants, unlike trained wolves (Topál et al., 2009). In this preliminary study, we tested dogs for symmetry bias, as a basis of word learning, using behavioral measures.

In the experiment, we first trained three dogs (poodles) to make associations between toys and sounds. Two kinds of toys were prepared: one that can be thrown (Toy A) and another that can be pulled (Toy B). Each toy was paired with one of two clearly distinct sounds, Sound A and Sound B (not the sounds these toys make), played from a cellphone. In the training with Toy A, after the dog looked at the toy for 3 seconds, the experimenter played Sound A and simultaneously threw the toy. The dog fetched the toy and the experimenter praised the dog. This activity was repeated for 5 minutes. Then another session with Toy B started. This training, 10 minutes in total, was done once a day for a month. An association test (MTS: matching-to-sample test) was done once a week: two toys were placed in front of the dog, and either Sound A or Sound B was played. The dog's responses were recorded. After one of the dogs successfully passed the MTS test, a symmetry bias test was conducted, in which the sounds and toys were presented in the reverse order (i.e., Sound → Toy). In the inconsistent condition, Sound A was played and only Toy B was placed in front of the dog. In the consistent condition, Sound A was played and only Toy A was similarly placed. All sessions were recorded on video and the dog's behavior was coded frame by frame.

The reaction time was 1.6 sec in the inconsistent condition and 1.1 sec in the consistent condition; thus, the dog responded more slowly to the toy in the inconsistent condition. The behavioral measures also suggested that, in the inconsistent condition, the dog seemed hesitant to take the toy that it eventually took, for example by looking around the toy rather than taking it. In the consistent condition, the dog went to the toy directly, without hesitation. These results suggest that dogs may have symmetry bias. Why, then, might dogs have developed symmetry bias? We speculate that cooperation with humans, who naturally assume symmetry bias, may have promoted symmetry bias in dogs.

15:00-15:30 | 08:00-08:30

General discussion with all the presenters