Scientific axes


You can see a lot of things at the same time. In fact, the visual field is made up of hundreds of thousands of small segments that come together as a whole. Attention can be focused on a particular detail, but the parts of the whole are not in rivalry. They coexist in space. For sound, on the contrary, part of the acoustic field masks the rest. The closest visual equivalent would be glare. A line of light erases everything else.
John Hull, Touching the Rock: An Experience of Blindness (1990)

John Hull became blind in 1983, in his late forties, and wrote a touching and fascinating autobiographical account of blindness. He highlights features specific to hearing, such as the superposition and even destructive interaction of several sound sources (in the quote above), or the fact that acoustic space is imposed on us: we cannot turn our ears away from a distracting sound; instead we must attend to one sound over another, possibly learning to distinguish between sounds and to form memory templates of important ones.

In this context, we seek the neural basis of auditory cognition. Our goal is to understand how contextual factors interact intimately with auditory processing at the cortical level. We tackle this problem from the different conceptual angles detailed below, using two main types of neural recording techniques, both focused on the mesoscopic scale: electrophysiological recordings of populations of neurons, and large-scale functional UltraSound (fUS) neuroimaging.

ATTENTION

Our perception can change dramatically with attentional focus over the course of a few hundred milliseconds. We study how top-down attention modulates auditory processing, and how the flexible attention involved in task switching changes the population coding of task-relevant sensory dimensions. To do so, we deploy population-level decoding techniques to read out the activity of hundreds of neurons at multiple stages of the cortical hierarchy. We also dissect inter-area interactions using a combination of electrophysiology and functional UltraSound neuroimaging. Close interactions with theoreticians provide testable predictions about the underlying mechanisms.
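As a toy illustration of the idea behind population-level decoding (not our actual analysis pipeline), the sketch below simulates the responses of a few hundred neurons to two stimuli and reads them out with a simple nearest-centroid linear decoder trained on half the trials. All numbers (population size, tuning strength, trial counts) are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_trials = 200, 100  # hypothetical population size / trials per stimulus

# Simulated tuning: each neuron weakly prefers stimulus A or B
tuning = rng.normal(0, 1, n_neurons)
resp_a = 0.5 * tuning[None, :] + rng.normal(0, 1, (n_trials, n_neurons))
resp_b = -0.5 * tuning[None, :] + rng.normal(0, 1, (n_trials, n_neurons))

# Nearest-centroid linear readout, trained on the first half of the trials
train_a, test_a = resp_a[:50], resp_a[50:]
train_b, test_b = resp_b[:50], resp_b[50:]
w = train_a.mean(0) - train_b.mean(0)            # decoding axis
b = w @ (train_a.mean(0) + train_b.mean(0)) / 2  # decision threshold

# Decoding accuracy on held-out trials
pred_a = test_a @ w > b
pred_b = test_b @ w > b
accuracy = (pred_a.mean() + (1 - pred_b.mean())) / 2
print(f"decoding accuracy: {accuracy:.2f}")
```

Even when each neuron carries only weak stimulus information, pooling across the population yields a reliable readout, which is what makes such decoders useful for tracking attentional modulation of population codes.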

LEARNING

Learning shapes our auditory perception at multiple stages of our lives, and it unfolds over multiple time scales. We study how exposure to natural sounds early in life profoundly influences the way auditory cortex processes speech and music sounds. We also track the learning dynamics triggered in artificial and cortical networks by sound exposure or task learning.
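A minimal sketch of what tracking learning dynamics in an artificial network can mean, with entirely invented "sound feature" inputs: a single linear unit is trained by gradient descent on a toy two-class discrimination, and the loss curve records how learning unfolds over training epochs.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_samples = 50, 200  # hypothetical feature dimension / number of examples

# Toy sound-like inputs: two classes weakly separated along a random direction
true_axis = rng.normal(0, 1, n_in)
labels = rng.integers(0, 2, n_samples)
x = rng.normal(0, 1, (n_samples, n_in)) + 0.3 * np.outer(2 * labels - 1, true_axis)

# One linear unit trained by gradient descent on the logistic loss
w = np.zeros(n_in)
loss_curve = []
for epoch in range(50):
    p = 1 / (1 + np.exp(-(x @ w)))  # predicted probability of class 1
    loss_curve.append(-np.mean(labels * np.log(p + 1e-9)
                               + (1 - labels) * np.log(1 - p + 1e-9)))
    w -= 0.1 * (x.T @ (p - labels)) / n_samples  # gradient step
print(f"loss: {loss_curve[0]:.2f} -> {loss_curve[-1]:.2f}")
```

The shape of such a loss (or accuracy) trajectory is the kind of learning dynamics that can then be compared between artificial networks and cortical recordings.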

MEMORY

Exposure to sounds can leave long-term traces in auditory cortex, possibly sculpting the spontaneous activity that recurs across the neural network. We image the memory signatures left in auditory cortex by short- and long-term sensory history, in order to understand how auditory cortex dynamically keeps a record of the past.