Visual experience shapes bodily representation of emotion

Philosophers and experimentalists have long debated whether the bodily representation of emotion is grounded in our sensory experience. Indeed, we are used to observing emotional reactions expressed through the bodies of others, yet it is still unknown whether this observation influences how we experience affective states in our own bodies. To address this question, we developed a naturalistic haptic task and asked groups of early blind (n = 20), late blind (n = 20), and sighted (n = 20) individuals to indicate where in the body they perceive changes associated with affective states. Our results show that visual experience shapes the bodily representation of emotion. Blind and sighted individuals attribute different importance to body regions in relation to specific emotional states: sighted people focus more on visceral sensations, while blind individuals report the mouth and hand areas as more relevant. We also observe differences in the coherence of bodily maps of specific emotions, such as aggressiveness, for which early and late blind participants homogeneously report the mouth, while sighted subjects show a scattered pattern of activation across the body. Finally, our findings show that blind people rely on a different organization of affect, as only sighted individuals categorize bodily maps of emotion along the valence and arousal dimensions. In summary, we demonstrate that sensory experience impacts the bodily representation of affect by modulating the relevance of different body parts in emotional reactions, by modifying the weights attributed to interoceptive and exteroceptive signals, and by changing how emotions are conceptualized in the body.
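
For illustration only, the sketch below (hypothetical Python, not the code used in the study) shows one way the within-group coherence of bodily maps reported above could be quantified: each subject's map is treated as a vector of weights over body regions, and coherence is the mean pairwise Spearman correlation across subjects. All array names and data here are synthetic assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

def map_coherence(maps):
    """maps: (n_subjects, n_regions) array of body-map weights.
    Returns the mean pairwise Spearman correlation across subjects."""
    n = maps.shape[0]
    rhos = [spearmanr(maps[i], maps[j])[0]
            for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(rhos))

# Synthetic data for illustration: 20 subjects x 50 body regions per group.
rng = np.random.default_rng(0)
early_blind_maps = rng.random((20, 50))
sighted_maps = rng.random((20, 50))
print(map_coherence(early_blind_maps), map_coherence(sighted_maps))
```

A higher score for one group would indicate that its members agree more on where in the body a given emotion is felt, as reported for aggressiveness in the blind groups.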

HABEMO: haptic bodily maps of emotions

The body serves as a foundational element not only for expressing emotions but also for experiencing them. Importantly, alterations occurring within both the internal and external environment of the body, during and after an emotional event, carry significant weight. Here, we developed an innovative haptic tool to investigate the association between emotional states and bodily reactions. Our new paradigm is extremely intuitive to use and does not require vision, precise motor control, or familiarity with technology, making it particularly suited to testing populations with visual or motor impairments or people less acquainted with technology. At the same time, it guarantees the acquisition of bodily maps as accurate as those obtained with traditional vision-based tasks. Through the integration of motion tracking and a 3D human representation, our system captures in a naturalistic manner where individuals sense affective and cognitive states within their bodies. Indeed, we developed a human body model that participants can easily touch and use to report the association between emotional experiences and bodily sensations. To confirm the effectiveness of this method, we conducted a haptic and a visual version of the same task and revealed a strong correlation between the two modalities in how they capture individuals’ representation of emotional states in their bodies. Our novel behavioral paradigm based on haptic exploration therefore allows the mapping of emotions in the body in an intuitive way, offering a more inclusive and versatile method for exploring how people connect their emotions to their physical experiences. More importantly, the haptic version of our task makes it possible to investigate whether and how the representation of affect is maintained in individuals with limited or no visual capability, in those with limited hand dexterity, and in those less familiar with technology.
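
A minimal sketch of how such a pipeline could work, under the assumption that tracked touch coordinates are assigned to the nearest vertex of the 3D body model and accumulated into a per-region map; the names here (touches_to_map, body_vertices, vertex_region) are hypothetical and do not reflect the actual HABEMO implementation. The resulting haptic and visual maps can then be compared with a simple correlation, as in the validation described above.

```python
import numpy as np
from scipy.stats import pearsonr

def touches_to_map(touches, body_vertices, vertex_region, n_regions):
    """touches: (n_touches, 3) xyz contact points from the motion tracker;
    body_vertices: (n_vertices, 3) vertices of the 3D body model;
    vertex_region: (n_vertices,) integer region label for each vertex."""
    heat = np.zeros(n_regions)
    for p in touches:
        # Assign each touch to the body region of the closest mesh vertex.
        nearest = np.argmin(np.linalg.norm(body_vertices - p, axis=1))
        heat[vertex_region[nearest]] += 1
    return heat / max(heat.sum(), 1)  # normalize to a relative map

# Cross-modal agreement on synthetic data, for one subject and one emotion.
rng = np.random.default_rng(1)
verts = rng.random((500, 3))
labels = rng.integers(0, 20, size=500)
haptic_map = touches_to_map(rng.random((100, 3)), verts, labels, 20)
visual_map = touches_to_map(rng.random((100, 3)), verts, labels, 20)
r, _ = pearsonr(haptic_map, visual_map)
```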

The coding of affect in the brain

Emotion and perception are tightly intertwined, as affective experiences often arise from the appraisal of sensory information. Nonetheless, it remains unclear whether the brain encodes emotional instances using a sensory-specific code or in a more abstract manner. Here, we answer this question by measuring the association between emotion ratings collected during a unisensory or multisensory presentation of a full-length movie and brain activity recorded in typically developed, congenitally blind, and congenitally deaf participants. Emotional instances are encoded in a vast network encompassing sensory, prefrontal, and temporal cortices. Within this network, the ventromedial prefrontal cortex stores a categorical representation of emotion independent of modality and experience, and the posterior superior temporal cortex maps valence using an abstract code. Sensory experience, more than modality, impacts how the brain organizes emotional information outside supramodal regions, suggesting the existence of a scaffold for the representation of emotional states whose functioning is shaped by sensory inputs during development.
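
A minimal sketch of how such an association could be measured, assuming a standard cross-validated encoding-model analysis rather than the authors' exact pipeline: time-resolved emotion ratings serve as regressors to predict voxel-wise activity, and the per-voxel prediction correlation indexes how well a region encodes affect. All names and array shapes are assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

def encoding_scores(ratings, bold, n_splits=5):
    """ratings: (n_timepoints, n_emotions) emotion-rating regressors;
    bold: (n_timepoints, n_voxels) recorded brain activity.
    Returns the cross-validated prediction correlation per voxel."""
    scores = np.zeros(bold.shape[1])
    for train, test in KFold(n_splits=n_splits).split(ratings):
        # Fit a ridge model on training timepoints, predict held-out ones.
        model = Ridge(alpha=1.0).fit(ratings[train], bold[train])
        pred = model.predict(ratings[test])
        for v in range(bold.shape[1]):
            scores[v] += np.corrcoef(pred[:, v], bold[test, v])[0, 1]
    return scores / n_splits
```

Under this kind of analysis, comparing scores across participant groups and presentation modalities would distinguish regions whose emotion coding is supramodal from those shaped by sensory experience.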