Speech processing and reading

In a pioneering study (Buiatti et al., 2009), we used frequency tagging to identify the 'high-level' neural correlate of word parsing in speech processing, and to dissociate it from the 'low-level' sensory processing of single syllables. An unexpected outcome of this analysis was the inhibition of single-syllable processing when subjects are aware of the presence of words in the speech stream, even when they are unable to identify them.

Power spectrum of the EEG signal at a central midline electrode of one participant, computed over the whole 9-min exposure to artificial speech in the four conditions, illustrating the method and the main results. Power bars at the target frequency bins are colored blue (one-syllable frequency bin, ≈ 4.2 Hz), black (two-syllable frequency bin, ≈ 2.1 Hz) and red (three-syllable frequency bin, ≈ 1.4 Hz). At the one-syllable frequency, power peaks clearly emerge in both Random conditions (top row), whereas they disappear in both Word conditions (bottom row). Conversely, a peak at the three-syllable frequency is clearly visible only in the Word condition with pauses (bottom right-hand panel). From Buiatti et al. (2009).
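As a rough sketch of the frequency-tagging idea (illustrative Python/NumPy code on simulated data, not the analysis pipeline of the study; sampling rate, durations and noise level are invented, and the tag frequencies are only approximate):

```python
import numpy as np

def tag_power(signal, fs, tag_freq):
    """Power at the frequency bin closest to tag_freq, from one long FFT
    (as in a whole-exposure power spectrum)."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / n
    k = np.argmin(np.abs(freqs - tag_freq))
    return freqs[k], power[k]

# Toy "EEG": a syllable-rate response plus a weaker word-rate response in noise.
fs = 250.0
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = (np.sin(2 * np.pi * 4.2 * t)           # syllable-rate response (~4.2 Hz)
       + 0.5 * np.sin(2 * np.pi * 1.4 * t)   # word-rate response (~1.4 Hz)
       + rng.normal(0, 1.0, t.size))         # background noise

f_syll, p_syll = tag_power(eeg, fs, 4.2)
f_word, p_word = tag_power(eeg, fs, 1.4)
f_ctrl, p_ctrl = tag_power(eeg, fs, 3.0)     # control bin with no tagged response
```

With a long recording, the tagged responses concentrate in single frequency bins and stand far above the broadband noise floor, which is what makes the power peaks in the figure visible condition by condition.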

In a follow-up study (Kabdebon et al., 2015), we used the same experimental paradigm to investigate word segmentation in 8-month-old infants. To overcome the higher level of noise and endogenous activity in infant EEG data, we developed a more sensitive method that tracks the brain response at the tag frequency using a measure of phase locking at that frequency (rather than the power-spectrum peak). Using this method, we were able to infer from the infants' brain responses their ability to segment words within a continuous speech stream - in other words, to compute the statistical structure of language.

Frequency tagging during the learning stream. A: Schematic representation of the expected brain activity in response to the stimulation at syllabic and word frequencies. The oscillatory activity is phase-locked to the onset of syllables (blue) and words (red). B: Phase-locking values at the syllabic, bi-syllabic and word rates in the real data (first column) and in surrogate data (second column), with their difference in the third column. Electrodes showing a significant difference are highlighted. From Kabdebon et al. (2015).
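The phase-locking measure can be illustrated in the same spirit: for each epoch, the Fourier phase at the tag frequency is extracted, and phase consistency across epochs is summarized by a value between 0 (random phase) and 1 (perfect locking). A minimal sketch on simulated data (all parameters illustrative; this is not the study's code):

```python
import numpy as np

def plv_at_freq(epochs, fs, freq):
    """Phase-locking value at `freq`: phase of each epoch's Fourier
    coefficient at that frequency, then phase consistency across epochs."""
    n = epochs.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k = np.argmin(np.abs(freqs - freq))
    phases = np.angle(np.fft.rfft(epochs, axis=1)[:, k])
    return np.abs(np.mean(np.exp(1j * phases)))

fs = 250.0
n_epochs, n_samp = 60, 500                 # 2-s epochs (invented numbers)
t = np.arange(n_samp) / fs
rng = np.random.default_rng(1)

# Stimulus-locked condition: same phase in every epoch, buried in noise.
locked = np.stack([np.sin(2 * np.pi * 1.5 * t) + rng.normal(0, 2, n_samp)
                   for _ in range(n_epochs)])
# Non-locked condition: random phase per epoch (endogenous-like activity).
unlocked = np.stack([np.sin(2 * np.pi * 1.5 * t + rng.uniform(0, 2 * np.pi))
                     + rng.normal(0, 2, n_samp) for _ in range(n_epochs)])

plv_locked = plv_at_freq(locked, fs, 1.5)
plv_unlocked = plv_at_freq(unlocked, fs, 1.5)
```

Because endogenous activity has random phase with respect to the stimulus, it averages out in the phase-locking value even when its power is large, which is what makes this measure better suited to noisy infant EEG than a raw power peak.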

In another experiment (Forget et al., 2010), I used phase locking at the stimulation frequency as a tag to track the first steps of visual processing in word identification. This analysis, combined with a classical ERP analysis of temporal integration for word identification, shows that the brain integrates successively presented word parts to correctly read the word, even though the visual system processes each part separately.

Phase locking of the evoked voltages at the frequency of stimulus alternation. For each SOA, a topographical plot shows the scalp map of the difference in Phase Locking Factor (PLF) at the corresponding frequency (f = 1000/SOA) during stimulation versus baseline. Channels forming statistically significant clusters are marked with black points (p < .02 for SOA = 50 msec; p < .001 for all other SOAs). An occipital cluster is present at all SOAs, indicating that the alternating stimuli always enter occipital visual cortex, even when subjects report perceiving the integrated string. Within this occipital cluster, the bar graphs in the second row show the PLF difference at the corresponding SOA frequency (red bars) and at the other SOA frequencies (blue bars). From Forget et al. (2010).
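The PLF contrast can be sketched as follows (the f = 1000/SOA mapping follows the caption; window length, noise level and trial count are invented for illustration, and the baseline is modeled as noise only):

```python
import numpy as np

def plf(trials, fs, freq):
    """Phase Locking Factor across trials at `freq` (Fourier phase per trial)."""
    n = trials.shape[1]
    k = np.argmin(np.abs(np.fft.rfftfreq(n, 1.0 / fs) - freq))
    phases = np.angle(np.fft.rfft(trials, axis=1)[:, k])
    return np.abs(np.mean(np.exp(1j * phases)))

fs, n_trials = 500.0, 40
n = int(fs * 0.4)                      # 400-ms analysis window (illustrative)
t = np.arange(n) / fs

plf_diff = {}
for soa_ms in (50, 100, 200):
    f_tag = 1000.0 / soa_ms            # alternation frequency for this SOA
    rng = np.random.default_rng(soa_ms)
    # Stimulation: response locked to the alternation, plus noise.
    stim = np.stack([np.sin(2 * np.pi * f_tag * t) + rng.normal(0, 1.5, n)
                     for _ in range(n_trials)])
    # Baseline: noise only, no stimulus-locked component.
    base = rng.normal(0, 1.5, (n_trials, n))
    plf_diff[soa_ms] = plf(stim, fs, f_tag) - plf(base, fs, f_tag)
```

A positive stimulation-minus-baseline PLF difference at f = 1000/SOA is the signature, per channel, that the alternating stimuli reach that cortical region, which is how the occipital cluster in the figure is identified.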

References:

Buiatti M, Pena M, Dehaene-Lambertz G, Investigating the neural correlates of continuous speech computation with frequency-tagged neuroelectric responses, NeuroImage 44, 509-519 (2009).

Kabdebon C, Pena M, Buiatti M, Dehaene-Lambertz G, Electrophysiological evidence of statistical learning of long-distance dependencies in 8-month-old preterm and full-term infants, Brain and Language 148, 25-36 (2015).

Forget J, Buiatti M, Dehaene S, Temporal integration in visual word recognition, Journal of Cognitive Neuroscience 22(5), 1054-1068 (2010).