Abstract
Autism is a neurodevelopmental condition characterized by persistent communication challenges, often compounded by social anxiety. Although differences in perspective-taking, cognitive flexibility, and social motivation have been implicated in these challenges, their influence on live interpersonal interactions remains unclear. In this study, we quantitatively examined how autistic and non-autistic individuals with varying levels of social anxiety adapted their communication during experimentally controlled interactions with two ostensibly distinct partners (a child and an adult), both portrayed by the same role-blind confederate. Autistic participants were as motivated and capable as non-autistic participants in adjusting their communication to stereotypical assumptions about a partner's abilities, spontaneously using greater emphasis when addressing the presumed less capable child. However, they were less likely to modify these stereotype-driven behaviors in response to interaction-based evidence of partners' equal competence. While non-autistic participants dynamically adapted their communication to treat both partners equivalently, autistic participants maintained their stereotype-driven adjustments throughout the interaction. Preregistered analyses further linked non-autistic individuals' adaptive responses to early social exposure, a developmental factor not observed in autistic participants. Together, these findings highlight a core interactional capacity, shaped by early social experiences and operating on interaction-based evidence, as central to understanding communication challenges in autism.
Abstract
Despite large inter-individual differences in experience and conceptual structures, humans converge on referents largely underspecified by the signals exchanged during communication. Many neural networks have become extremely sensitive to context-dependent relationships between signals, but remain relatively blind to their referents outside the signal space. Here, we study how human interlocutors dynamically coordinate both their signal and referential spaces over extended communicative interactions. We identify a latent control parameter, representational complexity, that may regulate referential coordination in human communication. Using a custom hierarchical Transformer model, we generate movement- and interaction-level embeddings from neurotypical (NT) and autistic (ASC) dyads engaged in an experimental semiotic task. By leveraging ASC-related communicative variance, we identify changes in parameters that track the representational dimensionality of signals and referents as communication unfolds. Movement-level embeddings (within-trial dependencies) could not differentiate the two groups, indicating comparable communicative behaviors. In contrast, interaction-level embeddings (across-trial dependencies) distinguished ASC from NT dyads with high accuracy. Crucially, representational complexity, i.e., dyadic alignment in the interaction-level intrinsic dimensionality used to encode communicative histories, tracked referential coordination demands, with greater misalignment in ASC dyads under referential volatility. These findings suggest that referential alignment is an interaction-level process, driven by the dynamic adaptation of representational complexity, rather than statistical relationships between signals alone.
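The notion of dyadic alignment in intrinsic dimensionality can be made concrete with a toy sketch. The snippet below is purely illustrative and is not the study's actual pipeline: it estimates the intrinsic dimensionality of two simulated partners' embedding sets with the PCA participation ratio (one common estimator among several) and quantifies dyadic misalignment as the absolute difference between the two estimates. All names and parameters are hypothetical.

```python
import numpy as np

def participation_ratio(embeddings):
    """Estimate the intrinsic dimensionality of a set of embedding
    vectors via the participation ratio of the PCA eigenvalues."""
    X = embeddings - embeddings.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))
    eigvals = np.clip(eigvals, 0.0, None)
    return eigvals.sum() ** 2 / (eigvals ** 2).sum()

def dyadic_misalignment(emb_a, emb_b):
    """Absolute difference in intrinsic dimensionality between the
    embedding sets of the two members of a dyad."""
    return abs(participation_ratio(emb_a) - participation_ratio(emb_b))

rng = np.random.default_rng(0)
# Partner A: variance spread over many directions (high dimensionality)
a = rng.normal(size=(200, 16))
# Partner B: variance concentrated in two directions (low dimensionality)
b = rng.normal(size=(200, 16)) * np.array([4.0] * 2 + [0.2] * 14)
print(dyadic_misalignment(a, b))
```

A dyad whose members compress their communicative histories into spaces of similar dimensionality would score low on this misalignment measure; the simulated dyad above scores high by construction.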
Abstract
While individuals with autism often face challenges in everyday social interactions, they may demonstrate proficiency in structured theory of mind (ToM) tasks that assess their ability to infer others' mental states. Using functional magnetic resonance imaging and pupillometry, we investigated whether these discrepancies stem from diminished spontaneous mentalizing or broader difficulties in unstructured contexts.
Fifty-two adults diagnosed with autism and 52 neurotypical control participants viewed Partly Cloudy, a nonverbal animated short film with a dynamic social narrative known to engage the ToM brain network during specific scenes. Analysis focused on comparing brain and pupil responses to these ToM events. Additionally, dynamic intersubject correlations were used to explore the variability of these responses throughout the film.
Both groups showed similar brain and pupil responses to ToM events and provided comparable descriptions of the characters' mental states. However, participants with autism exhibited significantly stronger correlations in their responses across the film's social narrative, indicating reduced interindividual variability. This distinct pattern emerged well before any ToM events and involved brain regions beyond the ToM network.
Our findings provide functional evidence of spontaneous mentalizing in autism, demonstrating this capacity in a context that affords but does not require mentalizing. Rather than responses to ToM events, a novel neurocognitive signature differentiated neurotypical individuals from individuals with autism: interindividual variability in brain and pupil responses to evolving social narratives. These results suggest that idiosyncratic narrative processing in unstructured settings, a common element of everyday social interactions, may offer a more sensitive scenario for understanding the autistic mind.
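Dynamic intersubject correlation, as used in this study, tracks the moment-to-moment similarity of responses across individuals with a sliding window. A minimal illustrative sketch follows; it is not the study's exact pipeline, and the window and step sizes are arbitrary.

```python
import numpy as np

def dynamic_isc(ts_a, ts_b, win=30, step=5):
    """Sliding-window intersubject correlation between two subjects'
    response time courses (e.g., a regional BOLD signal or pupil size).
    Returns one Pearson correlation per window."""
    starts = range(0, len(ts_a) - win + 1, step)
    return np.array([np.corrcoef(ts_a[s:s + win], ts_b[s:s + win])[0, 1]
                     for s in starts])

rng = np.random.default_rng(3)
# Shared stimulus-driven component plus subject-specific noise
narrative = np.sin(np.linspace(0, 8 * np.pi, 300))
subj1 = narrative + 0.5 * rng.normal(size=300)
subj2 = narrative + 0.5 * rng.normal(size=300)
isc = dynamic_isc(subj1, subj2)
print(isc.mean())
```

In this framing, reduced interindividual variability (as reported for the autism group) corresponds to systematically higher windowed correlations across pairs of participants.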
Abstract
Stereotypes can exert a powerful influence on our interactions with others, potentially leading to prejudice when factual evidence is ignored. Here, we identify neuroanatomical and developmental factors that influence the real-time integration of stereotypes and factual evidence during live social interactions. The study uses precisely quantified communicative exchanges in a longitudinal cohort of seventeen-year-olds followed since infancy, testing their ability to moderate stereotype tendencies toward children as contrary evidence accumulates. Our results indicate that the impact of stereotypes on communicative behavior is linked to individual variations in gray matter density and cortical thickness in the right anterior cingulate gyrus. In contrast, the ability to moderate stereotype tendencies is influenced by developmental exposure to social interactions during the initial years of life, beyond the effects of familial environment and later experiences. These findings pinpoint a key brain structure underlying stereotype tendencies and suggest that early-life social experiences have lasting consequences on how individuals integrate factual evidence in interpersonal communication.
Abstract
For over a century, psychology has focused on uncovering mental processes of a single individual. However, humans rarely navigate the world in isolation. The most important determinants of successful development, mental health, and our individual traits and preferences arise from interacting with other individuals. Social interaction underpins who we are, how we think, and how we behave. Here we discuss the key methodological challenges that have limited progress in establishing a robust science of how minds interact and the new tools that are beginning to overcome these challenges. A deep understanding of the human mind requires studying the context within which it originates and exists: social interaction.
Abstract
Since the second half of the twentieth century, intracranial electroencephalography (iEEG), including both electrocorticography (ECoG) and stereo-electroencephalography (sEEG), has provided an intimate view into the human brain. At the interface between fundamental research and the clinic, iEEG provides both high temporal resolution and high spatial specificity but comes with constraints, such as the individual's tailored sparsity of electrode sampling. Over the years, researchers in neuroscience have developed practices to make the most of the iEEG approach. Here we offer a critical review of iEEG research practices in a didactic framework for newcomers, as well as addressing issues encountered by proficient researchers. The scope is threefold: (i) review common practices in iEEG research, (ii) suggest potential guidelines for working with iEEG data and answer frequently asked questions based on the most widespread practices, and (iii) based on current neurophysiological knowledge and methodologies, pave the way to good practice standards in iEEG research. The organization of this paper follows the steps of iEEG data processing. The first section contextualizes iEEG data collection. The second section focuses on localization of intracranial electrodes. The third section highlights the main pre-processing steps. The fourth section presents iEEG signal analysis methods. The fifth section discusses statistical approaches. The sixth section draws some unique perspectives on iEEG research. Finally, to ensure a consistent nomenclature throughout the manuscript and to align with other guidelines, e.g., the Brain Imaging Data Structure (BIDS) and the OHBM Committee on Best Practices in Data Analysis and Sharing (COBIDAS), we provide a glossary to disambiguate terms related to iEEG research.
Abstract
Shared attention experiments examine how function or behavior differs when stimuli are experienced alone versus in the presence of others who are simultaneously attending to the same stimulus. Previous work has found enhanced reactions to emotional stimuli in social situations, yet these changes might serve communicative or motivational purposes. This study examines whether viewing emotional stimuli in the presence of another person influences attention to, or memory for, the stimulus. Participants passively viewed emotionally valenced stimuli while completing another task (counting flowers). Each participant performed this task both alone and in a shared attention condition (simultaneously with another person in the same room) while EEG signals were measured. Recognition of the emotional pictures was measured later. A significant shared attention behavioral effect was found in the attention task but not in the recognition task. Compared to event-related potential responses for neutral pictures, we found a higher P3b response for task-relevant stimuli (flowers) and higher Late Positive Potential (LPP) responses for emotional stimuli. However, no main effect of shared attention was found between presence conditions. Shared attention may therefore have a more limited effect on cognitive processes than previously suggested.
Abstract
This study uses electrocorticography in humans to assess how alpha- and beta-band rhythms modulate excitability of the sensorimotor cortex during psychophysically controlled movement imagery. Both rhythms displayed effector-specific modulations, tracked spectral markers of action potentials in the local neuronal population, and showed spatially systematic phase relationships (traveling waves). Yet, alpha- and beta-band rhythms differed in their anatomical and functional properties, were weakly correlated, and traveled along opposite directions across the sensorimotor cortex. Increased alpha-band power in the somatosensory cortex ipsilateral to the selected arm was associated with spatially unspecific inhibition. Decreased beta-band power over contralateral motor cortex was associated with a focal shift from relative inhibition to excitation. These observations indicate the relevance of both inhibition and disinhibition mechanisms for precise spatiotemporal coordination of movement-related neuronal populations, and illustrate how those mechanisms are implemented through the substantially different neurophysiological properties of sensorimotor alpha- and beta-band rhythms.
Abstract
As scientists, we brainstorm and develop experimental designs with our colleagues and students. Paradoxically, this teamwork has produced a field focused nearly exclusively on mapping the brain as if it evolved in isolation. Here, we discuss promises and challenges in advancing our understanding of how human minds connect during social interaction.
Abstract
The Brain Imaging Data Structure (BIDS) is a community-driven specification for organizing neuroscience data and metadata with the aim of making datasets more transparent, reusable, and reproducible. Intracranial electroencephalography (iEEG) data offer a unique combination of high spatial and temporal resolution measurements of the living human brain. To improve internal (re)use and external sharing of these unique data, we present a specification for storing and sharing iEEG data: iEEG-BIDS.
Abstract
Communication deficits are a defining feature of Autism Spectrum Disorder (ASD), manifesting during social interactions. Previous studies investigating communicative deficits have largely focused on the perceptual biases, social motivation, cognitive flexibility, or mentalizing abilities of isolated individuals. By embedding autistic individuals in live non-verbal interactions, we characterized a novel cause of their communication deficits. Adults with ASD matched neurotypical individuals in their ability and propensity to generate and modify intelligible behaviors for a communicative partner. However, they struggled to align the meaning of those behaviors with their partner when meaning required referencing their recent communicative history. This communicative misalignment explains why autistic individuals are vulnerable in everyday interactions, which entail fleeting ambiguities, but succeed in social cognition tests involving stereotyped contextual cues. These findings illustrate the cognitive and clinical importance of considering social interaction as a communicative alignment challenge, and how ineffective human communication is without this key interactional ingredient.
Abstract
Human orbitofrontal cortex (OFC) has long been implicated in value-based decision-making. In recent years, convergent evidence from humans and model organisms has further elucidated its role in representing reward-related computations underlying decision-making. However, a detailed description of these processes remains elusive, due in part to (i) limitations in our ability to observe human OFC neural dynamics at the timescale of decision processes, and (ii) methodological and interspecies differences that make it challenging to connect human and animal findings or to resolve discrepancies when they arise. Here we sought to address these challenges by conducting multi-electrode electrocorticography (ECoG) recordings in neurosurgical patients during economic decision-making to elucidate the electrophysiological signature, sub-second temporal profile, and anatomical distribution of reward-related computations within human OFC. We found that high frequency activity (HFA, 70–200 Hz) reflected multiple valuation components grouped in two classes of valuation signals that were dissociable in temporal profile and information content: (i) fast, transient responses reflecting signals associated with choice and outcome-processing, including anticipated risk and outcome regret, and (ii) sustained responses explicitly encoding what happened in the immediately preceding trial. Anatomically, these responses were widely distributed in partially overlapping networks, including regions in the central OFC (Brodmann Areas 11 and 13) which have been consistently implicated in reward processing in animal single unit studies. Together, these results integrate insights drawn from human and animal studies and provide evidence for a role of human OFC in representing multiple reward computations.
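High frequency activity of this kind is conventionally extracted by band-pass filtering the intracranial trace and taking its amplitude envelope. The sketch below illustrates that standard approach on simulated data, assuming a Butterworth filter and a Hilbert envelope rather than the authors' exact pipeline; the sampling rate and burst parameters are invented for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def hfa_envelope(signal, fs, band=(70.0, 200.0), order=4):
    """Band-pass filter a trace and return the amplitude envelope of
    high-frequency activity (HFA) via the Hilbert transform."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, signal)  # zero-phase filtering
    return np.abs(hilbert(filtered))

fs = 1000.0
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(2)
# Background noise with a 120 Hz burst in the second half of the trace
trace = rng.normal(scale=0.1, size=t.size)
trace[t >= 1.0] += np.sin(2 * np.pi * 120 * t[t >= 1.0])
env = hfa_envelope(trace, fs)
print(env[t >= 1.2].mean() > env[t < 0.8].mean())
```

The envelope rises during the simulated burst, which is the property exploited when HFA is used as a proxy for local population activity.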
Abstract
One of the main symptoms of Autism Spectrum Conditions (ASC) is cognitive inflexibility when adjustments of behaviour are required. While this so-called behavioural rigidity is broadly recognised in ASC, finding evidence for the underlying neurocognitive mechanisms remains challenging. In this electroencephalographic (EEG) study, participants with ASC and matched controls were instructed to choose between two cognitive tasks in each trial, and to respond to the subsequently presented target stimulus according to their task choice. While doing so, we tracked the frontally distributed contingent negative variation (CNV) during the task preparation interval as a measure of intentional control, and the posteriorly measured P3 during the task execution interval to monitor the translation of intentions into actions. The results support the notion of intentional control difficulties in ASC, where the CNV was attenuated in the ASC group compared to the control group. Furthermore, the CNV differentiated between tasks and transition types in the control group only, suggesting that the ASC group fine-tuned the required amount of intentional control to contextual circumstances less closely. In contrast, the P3 showed no significant differences between the groups. Together, these findings highlight the importance of intentional control mechanisms as a crucial future route for a better understanding of cognitive flexibility and behavioural rigidity in ASC.
Abstract
Human intracranial electroencephalography (iEEG) recordings provide data with much greater spatiotemporal precision than is possible from data obtained using scalp EEG, magnetoencephalography (MEG), or functional MRI. Until recently, the fusion of anatomical data (MRI and computed tomography (CT) images) with electrophysiological data and their subsequent analysis have required the use of technologically and conceptually challenging combinations of software. Here, we describe a comprehensive protocol that enables complex raw human iEEG data to be converted into more readily comprehensible illustrative representations. The protocol uses an open-source toolbox for electrophysiological data analysis (FieldTrip). This allows iEEG researchers to build on a continuously growing body of scriptable and reproducible analysis methods that, over the past decade, have been developed and used by a large research community. In this protocol, we describe how to analyze complex iEEG datasets by providing an intuitive and rapid approach that can handle both neuroanatomical information and large electrophysiological datasets. We provide a worked example using a sample dataset. We also explain how to automate the protocol and adjust the settings to enable analysis of iEEG datasets with other characteristics. The protocol can be implemented by a graduate student or postdoctoral fellow with minimal MATLAB experience and takes approximately an hour to execute, excluding the automated cortical surface extraction.
Abstract
Oxytocin is a neuropeptide known to influence how humans share material resources. Here we explore whether oxytocin influences how we share knowledge. We focus on two distinguishing features of human communication, namely the ability to select communicative signals that disambiguate the many-to-many mappings that exist between a signal’s form and meaning, and adjustments of those signals to the presumed cognitive characteristics of the addressee (“audience design”). Fifty-five males participated in a randomized, double-blind, placebo-controlled experiment involving the intranasal administration of oxytocin. The participants produced novel non-verbal communicative signals towards two different addressees, an adult or a child, in an experimentally controlled live interactive setting. We found that oxytocin administration drives participants to generate signals of higher referential quality, i.e. signals that disambiguate more communicative problems; and to rapidly adjust those communicative signals to what the addressee understands. The combined effects of oxytocin on referential quality and audience design fit with the notion that oxytocin administration leads participants to explore more pervasively both behaviors that can convey their intention and diverse models of the addressees. These findings suggest that, besides affecting prosocial drive and salience of social cues, oxytocin influences how we share knowledge by promoting cognitive exploration.
Abstract
Referential pointing is a characteristically human behavior, which involves moving a finger through space to direct an addressee towards a desired mental state. Planning this type of action requires an interface between sensorimotor and conceptual abilities. A simple interface could supplement spatially-guided motor routines with communicative-ostensive cues. For instance, a pointing finger held still for an extended period of time could aid the addressee’s understanding, without altering the movement’s trajectory. A more complex interface would entail communicative knowledge penetrating the sensorimotor system and directly affecting pointing trajectories. We compare these two possibilities using motion analyses of referential pointing during multi-agent interactions. We observed that communicators produced ostensive cues that were sensitive to the communicative context. Crucially, we also observed pervasive adaptations to the pointing trajectories: they were tailored to the communicative context and to partner-specific information. These findings indicate that human referential pointing is planned and controlled on the basis of partner-specific knowledge, over and above the tagging of motor routines with ostensive cues.
Abstract
Listeners interpret utterances by integrating information from multiple sources, including word-level semantics and world knowledge. When the semantics of an expression is inconsistent with their knowledge about the world, the listener may have to search through the conceptual space for alternative possible world scenarios that can make the expression more acceptable. Such cognitive exploration requires considerable computational resources and might depend on motivational factors. This study explores whether and how oxytocin, a neuropeptide known to influence social motivation by reducing social anxiety and enhancing affiliative tendencies, can modulate the integration of world knowledge and sentence meanings. The study used a between-participant, double-blind, randomized, placebo-controlled design. Semantic integration, indexed with magnetoencephalography through the N400m marker, was quantified while 45 healthy male participants listened to sentences that were either congruent or incongruent with facts of the world, after receiving intranasally delivered oxytocin or placebo. Compared with congruent sentences, world-knowledge-incongruent sentences elicited a stronger N400m signal from the left inferior frontal and anterior temporal regions and medial pFC (the N400m effect) in the placebo group. Oxytocin administration significantly attenuated the N400m effect at both sensor and cortical source levels throughout the experiment, in a state-like manner. Additional electrophysiological markers suggest that the absence of the N400m effect in the oxytocin group is unlikely due to the lack of early sensory or semantic processing or a general downregulation of attention. These findings suggest that oxytocin drives listeners to resolve challenges of semantic integration, possibly by promoting the cognitive exploration of alternative possible world scenarios.
Abstract
To select a movement, specific neuronal populations controlling particular features of that movement need to be activated, whereas other populations are downregulated. The selective (dis)inhibition of cortical sensorimotor populations is governed by rhythmic neural activity in the alpha (8–12 Hz) and beta (15–25 Hz) frequency range. However, it is unclear whether and how these rhythms contribute independently to motor behavior. Building on a recent dissociation of the sensorimotor alpha- and beta-band rhythms, we test the hypothesis that the beta-band rhythm governs the disinhibition of task-relevant neuronal populations, whereas the alpha-band rhythm suppresses neurons that may interfere with task performance. Cortical alpha- and beta-band rhythms were manipulated with transcranial alternating current stimulation (tACS) while human participants selected how to grasp an object. Stimulation was applied at either 10 or 20 Hz and was imposed on the sensorimotor cortex contralaterally or ipsilaterally to the grasping hand. In line with task-induced changes in endogenous spectral power, the effect of the tACS intervention depended on the frequency and site of stimulation. Whereas tACS generally increased movement selection times, 10 Hz stimulation led to relatively faster selection times when applied to the hemisphere ipsilateral to the grasping hand, compared with other stimulation conditions. These effects occurred selectively when multiple movements were considered. These observations functionally differentiate the causal contribution of alpha- and beta-band oscillations to movement selection. The findings suggest that sensorimotor beta-band rhythms disinhibit task-relevant populations, whereas alpha-band rhythms inhibit neuronal populations that could interfere with movement selection.
Abstract
We share our thoughts with other minds, but we do not understand how. Having a common language certainly helps, but infants’ and tourists’ communicative success clearly illustrates that sharing thoughts does not require signals with a pre-assigned meaning. In fact, human communicators jointly build a fleeting conceptual space in which signals are a means to seek and provide evidence for mutual understanding. Recent work has started to capture the neural mechanisms supporting those fleeting conceptual alignments. The evidence suggests that communicators and addressees achieve mutual understanding by using the same computational procedures, implemented in the same neuronal substrate, and operating over temporal scales independent from the signals’ occurrences.
Abstract
Damage to the human ventromedial prefrontal cortex (vmPFC) leads to profound changes in everyday social interactions [1, 2]. Yet, in the lab, vmPFC patients show surprising proficiency in reasoning about other agents [3, 4, 5, 6, 7, 8]. These conflicting observations suggest that what vmPFC patients lack in everyday social interactions might be the ability to guide their decisions with knowledge about a social partner [9, 10, 11, 12, 13], despite preserved access to that knowledge [2, 14]. Quantification of socially relevant decisions during live interaction with different partners offers the possibility of testing this hypothesis. Eight patients with vmPFC damage, eight patients with brain damage elsewhere, and 15 healthy participants were asked to communicate non-verbally with two different addressees, an adult or a child, in an experimentally controlled interactive setting [15, 16]. In reality, a confederate blindly performed the role of both adult and child addressee, with matched performance and response times, such that the two addressees differed only in terms of the communicator’s beliefs. Patients with vmPFC damage were able—and motivated—to generate communicatively effective behaviors. However, unlike patient and healthy controls, vmPFC patients failed to adjust their communicative decisions to the presumed abilities of their addressee. These findings indicate that the human vmPFC is necessarily involved in social interactions, insofar as those interactions need to be tailored toward knowledge about a social partner. In this perspective, the known contribution of this region to disparate domains like value-based decision-making [17, 18, 19], schema-based memory-processing [20, 21, 22], and person-specific mentalizing [11, 12, 13] might be instances of decisions based on contingently updated conceptual knowledge.
Abstract
How can we understand each other during communicative interactions? An influential suggestion holds that communicators are primed by each other’s behaviors, with associative mechanisms automatically coordinating the production of communicative signals and the comprehension of their meanings. An alternative suggestion posits that mutual understanding requires shared conceptualizations of a signal’s use, i.e., “conceptual pacts” that are abstracted away from specific experiences. Both accounts predict coherent neural dynamics across communicators, aligned either to the occurrence of a signal or to the dynamics of conceptual pacts. Using coherence spectral-density analysis of cerebral activity simultaneously measured in pairs of communicators, this study shows that establishing mutual understanding of novel signals synchronizes cerebral dynamics across communicators’ right temporal lobes. This interpersonal cerebral coherence occurred only within pairs with a shared communicative history, and at temporal scales independent from signals’ occurrences. These findings favor the notion that meaning emerges from shared conceptualizations of a signal’s use.
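Magnitude-squared coherence, the quantity underlying coherence spectral-density analysis, measures frequency-specific coupling between two signals. The toy simulation below, unrelated to the study's actual data, shows how two noisy signals sharing a slow common component yield elevated coherence at the shared frequency; the sampling rate and component frequency are invented for the example.

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0  # sampling rate (Hz); illustrative value
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)

# Two simulated signals: a shared slow component (standing in for
# common conceptual dynamics) plus independent noise per "partner"
shared = np.sin(2 * np.pi * 0.2 * t)
partner_a = shared + rng.normal(scale=1.0, size=t.size)
partner_b = shared + rng.normal(scale=1.0, size=t.size)

# Magnitude-squared coherence across the pair, per frequency
f, cxy = coherence(partner_a, partner_b, fs=fs, nperseg=int(fs * 20))
print(f[np.argmax(cxy)])
```

Note that coherence at 0.2 Hz reflects coupling at a timescale much slower than any individual event in the signals, which is the sense in which interpersonal coherence can be independent of signals' occurrences.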
Abstract
The capacity for mutual understanding, often metaphorically expressed as “being in sync,” is one of the great scientific enigmas (Levinson, 2006). How can we understand what another is thinking or feeling just by observing their actions? For example, how does my friend know I am suggesting we enter a pub when I point toward a nearby bicycle that both of us know belongs to his girlfriend?
Abstract
Rhythmic neural activity within the alpha (8–12 Hz) and beta (15–25 Hz) frequency bands is modulated during actual and imagined movements. Changes in these rhythms provide a mechanism to select relevant neuronal populations, although the relative contributions of these rhythms remain unclear. Here we use MEG to investigate changes in oscillatory power while healthy human participants imagined grasping a cylinder oriented at different angles. This paradigm allowed us to study the neural signals involved in the simulation of a movement in the absence of signals related to motor execution and sensory reafference. Movement selection demands were manipulated by exploiting the fact that some object orientations evoke consistent grasping movements, whereas others are compatible with both overhand and underhand grasping. By modulating task demands, we show a functional dissociation of the alpha- and beta-band rhythms. As movement selection demands increased, alpha-band oscillatory power increased in the sensorimotor cortex ipsilateral to the arm used for imagery, whereas beta-band power concurrently decreased in the contralateral sensorimotor cortex. The same pattern emerged when motor imagery trials were compared with a control condition, providing converging evidence for the functional dissociation of the two rhythms. These observations call for a re-evaluation of the role of sensorimotor rhythms. We propose that neural oscillations in the alpha-band mediate the allocation of computational resources by disengaging task-irrelevant cortical regions. In contrast, the reduction of neural oscillations in the beta-band is directly related to the disinhibition of neuronal populations involved in the computations of movement parameters.
Abstract
Despite the ambiguity inherent in human communication, people are remarkably efficient in establishing mutual understanding. Studying how people communicate in novel settings provides a window into the mechanisms supporting the human competence to rapidly generate and understand novel shared symbols, a fundamental property of human communication. Previous work indicates that the right posterior superior temporal sulcus (pSTS) is involved when people understand the intended meaning of novel communicative actions. Here, we set out to test whether normal functioning of this cerebral structure is required for understanding novel communicative actions using inhibitory low-frequency repetitive transcranial magnetic stimulation (rTMS). A factorial experimental design contrasted two tightly matched stimulation sites (right pSTS vs left MT+, i.e., a contiguous homotopic task-relevant region) and tasks (a communicative task vs a visual tracking task that used the same sequences of stimuli). Overall task performance was not affected by rTMS, whereas changes in task performance over time were disrupted according to TMS site and task combinations. Namely, rTMS over pSTS led to a diminished ability to improve action understanding on the basis of recent communicative history, while rTMS over MT+ perturbed improvement in visual tracking over trials. These findings qualify the contributions of the right pSTS to human communicative abilities, showing that this region might be necessary for incorporating previous knowledge, accumulated during interactions with a communicative partner, to constrain the inferential process that leads to action understanding.
Abstract
Human referential communication is often thought of as the coding and decoding of a set of symbols, neglecting that establishing shared meanings requires a computational mechanism powerful enough to mutually negotiate them. Sharing the meaning of a novel symbol might rely on similar conceptual inferences across communicators or on statistical similarities in their sensorimotor behaviors. Using magnetoencephalography, we assess spectral, temporal, and spatial characteristics of neural activity evoked when people generate and understand novel shared symbols during live communicative interactions. Solving those communicative problems induced comparable changes in the spectral profile of neural activity of both communicators and addressees. This shared neuronal up-regulation was spatially localized to the right temporal lobe and the ventromedial prefrontal cortex and emerged already before the occurrence of a specific communicative problem. Communicative innovation relies on neuronal computations that are shared across generating and understanding novel shared symbols, operating over temporal scales independent from transient sensorimotor behavior.
Abstract
A large body of work has focused on children’s ability to attribute mental states to other people, and whether these abilities are influenced by the extent and nature of children’s social interactions. However, it remains largely unknown which developmental factors shape children’s ability to influence the mental states of others. Building on the suggestion that collaborative experiences early in life might be crucial for the emergence of mental coordination abilities, here we assess the relative contribution of social exposure to familial and non-familial agents on children’s communicative adjustments to their mental model of an addressee (‘audience design’). During an online interactive game, five-year-olds spontaneously organized their non-verbal communicative behaviors according to their beliefs about an interlocutor. The magnitude of these communicative adjustments was predicted by the time spent at daycare, from birth until four years of age, over and above effects of familial social environment. These results suggest that the degree of non-familial social interaction early in life modulates the influence that children’s beliefs have on their referential communicative behavior.
Abstract
Magnetoencephalography (MEG) is measured above the head, which makes it sensitive to variations of the head position with respect to the sensors. Head movements blur the topography of the neuronal sources of the MEG signal, increase localization errors, and reduce statistical sensitivity. Here we describe two novel and readily applicable methods that compensate for the detrimental effects of head motion on the statistical sensitivity of MEG experiments. First, we introduce an online procedure that continuously monitors head position. Second, we describe an offline analysis method that takes into account the head position time-series. We quantify the performance of these methods in the context of three different experimental settings, involving somatosensory, visual and auditory stimuli, assessing both individual and group-level statistics. The online head localization procedure allowed for optimal repositioning of the subjects over multiple sessions, resulting in a 28% reduction of the variance in dipole position and an improvement of up to 15% in statistical sensitivity. Offline incorporation of the head position time-series into the general linear model resulted in improvements of group-level statistical sensitivity between 15% and 29%. These tools can substantially reduce the influence of head movement within and between sessions, increasing the sensitivity of many cognitive neuroscience experiments.
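The offline method described above regresses the head-position time-series out of the data within a general linear model. The following is a minimal sketch of that idea on simulated toy data (all variable names and the single-axis drift are illustrative assumptions, not the published pipeline): adding a head-position nuisance regressor to an ordinary-least-squares fit reduces residual variance, which is what improves statistical sensitivity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: a single-channel MEG time-series containing a
# stimulus-evoked response plus a slow drift caused by head movement.
n = 500
stim = (np.arange(n) % 50 < 10).astype(float)   # block-design stimulus regressor
head_z = np.cumsum(rng.normal(0, 0.05, n))      # slow head-position drift (one axis)
signal = 2.0 * stim + 1.5 * head_z + rng.normal(0, 1.0, n)

def glm_fit(y, regressors):
    """Ordinary least squares: return betas and residual variance."""
    X = np.column_stack([np.ones(len(y))] + regressors)  # prepend intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, resid.var()

# Fit the GLM without vs. with the head-position time-series as a nuisance regressor.
_, var_plain = glm_fit(signal, [stim])
beta, var_hm = glm_fit(signal, [stim, head_z])

print(f"residual variance without head regressor: {var_plain:.2f}")
print(f"residual variance with head regressor:    {var_hm:.2f}")
```

Because the head-position regressor absorbs the movement-related drift, the residual variance drops and the estimate of the stimulus effect (`beta[1]`) becomes more precise, mirroring the sensitivity gains reported above.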
Abstract
Humans have a remarkable capacity for tuning their communicative behaviors to different addressees, a phenomenon also known as recipient design. It remains unclear how this tuning of communicative behavior is implemented during live human interactions. Classical theories of communication postulate that recipient design involves perspective taking, i.e., the communicator selects her behavior based on her hypotheses about beliefs and knowledge of the recipient. More recently, researchers have argued that perspective taking is computationally too costly to be a plausible mechanism in everyday human communication. These researchers propose that computationally simple mechanisms, or heuristics, are exploited to perform recipient design. Such heuristics may be able to adapt communicative behavior to an addressee with no consideration for the addressee's beliefs and knowledge. To test whether the simpler of the two mechanisms is sufficient for explaining the “how” of recipient design we studied communicators' behaviors in the context of a non-verbal communicative task (the Tacit Communication Game, TCG). We found that the specificity of the observed trial-by-trial adjustments made by communicators is parsimoniously explained by perspective taking, but not by simple heuristics. This finding is important as it suggests that humans do have a computationally efficient way of taking beliefs and knowledge of a recipient into account.
Abstract
How do infants acquire their first words without any prior knowledge of language? And how do they later use language so effectively, despite considerable differences in individual experience and expertise? This article argues that language acquisition draws on the same foundational capacity that supports adult language use: the ability to construct shared frames of reference with others. We suggest that this capacity, essential for transforming behavior into communicative acts, begins to emerge before speech. We further propose that it not only supports the social acquisition of language but also helps explain language’s very existence, as a tool for efficiently aligning understanding and navigating increasingly complex social landscapes.
Abstract
Preparing intracranial electroencephalography (iEEG) datasets for analysis presents a unique set of methodological challenges that are absent in non-invasive investigative techniques. Because iEEG is primarily used in epilepsy patients with varying brain pathologies, the main challenges pertain to variability in electrode coverage and therefore the regions of the brain from which electrophysiological recordings can be obtained. In this chapter, we outline how to efficiently integrate the raw anatomical images and electrophysiological recordings during preprocessing, allowing iEEG datasets to be analyzed in an anatomically precise and consistent way.
Abstract
This contribution argues that a common language and its statistics do not explain how people overcome fundamental communicative obstacles. We introduce joint epistemic engineering, a neurosemiotic account of how asymmetric interlocutors can communicate effectively despite using ambiguous signals that are referentially contingent on the current communicative circumstances. The basic insight is that a communicative signal contains a multiplicity of functions and that interlocutors use those multi-layered signals to simultaneously coordinate a space of possible interpretations, declare a communicative intent, and reduce uncertainty over the identity of a referent.
Abstract
A common way to understand memory structures in the cognitive sciences is as a cognitive map. Cognitive maps are representational systems organized by dimensions shared with physical space. The appeal to these maps begins literally: as an account of how spatial information is represented and used to inform spatial navigation. Invocations of cognitive maps, however, are often more ambitious; cognitive maps are meant to scale up and provide the basis for our more sophisticated memory capacities. The extension is not meant to be metaphorical, but the way in which these richer mental structures are supposed to remain map-like is rarely made explicit. Here we investigate this missing link, asking How do cognitive maps represent non-spatial information? We begin with a survey of foundational work on spatial cognitive maps and then provide a comparative review of alternative, non-spatial representational structures. We then turn to several cutting-edge projects that are engaged in the task of scaling up cognitive maps so as to accommodate non-spatial information: first, on the spatial-isometric approach, encoding content that is non-spatial but in some sense isomorphic to spatial content; second, on the abstraction approach, encoding content that is an abstraction over first-order spatial information; and third, on the embedding approach, embedding non-spatial information within a spatial context, a prominent example being the Method-of-Loci. Putting these cases alongside one another reveals the variety of options available for building cognitive maps, and the distinctive limitations of each. We conclude by reflecting on where these results take us in terms of understanding the place of cognitive maps in memory.
Abstract
As members of the next generation of leaders, we aim to promote an ethical, inclusive, humanistic, and curiosity-driven code of conduct within cognitive neuroscience. We critically examine the current state of the field and discuss issues that remain to be addressed. In particular, we highlight the need for cultural change, including a greater focus on integrity and accountability (i.e., addressing sexual harassment), as well as promoting diversity. Further, we highlight current methodological considerations, such as big data and deep data, replicability, and reproducibility. We consider some of the unresolved controversies within cognitive neuroscience and where the research should go from here. Finally, we delve into how neuroscience research is presented to the public and how science communication can benefit society. This chapter presents a call to action not only to further scientific improvements but also to address biases and behaviors that have the potential to hinder scientific advancements.
The power of early diversity, Psychology Today
Childhood social interactions can combat stereotypes, Dartmouth News
How stereotypes impact our social interactions, Psychology Today
The hidden act in everyday conversation, Psychology Today
Why don't we say what we really mean?, Het Talige Brein
What large pupils have to do with smooth conversation, De Standaard
How we understand each other - or sometimes don't, Kennislink
A computer game provides invaluable information on autism spectrum disorder, CORDIS
How we understand each other in a shared cognitive space, Donders Newsletter
A novel cause for communication deficits in autism, Language In Interaction
Will computers ever truly understand what we’re saying?, UC Berkeley
Will computers ever understand us?, Donders Institute
Why are computers so bad at communicating with humans?, Motherboard
Why your computer doesn't understand you, Kennislink
Frontal lobe damage alters communication, Donders Institute
Communication lacks nuance after frontal brain damage, Radboud University
Synchrony in communicating brains, Donders Institute
Dogs listen just like humans, Volkskrant
The “Just-Yo” app and mutual understanding, Donders Wonders
Understanding each other takes more than words, Kennislink
Understanding is possible without language, Volkskrant
A rough guide to mind-reading, The Guardian
Daycare makes children more social, Volkskrant
Daycare strengthens empathy, Kennislink
Children who go to daycare may benefit from a wider variety of social situations, ScienceDaily
Non-verbal communication, Het Talige Brein