My research aims to unravel the neural bases underlying the representation and retrieval of conceptual-semantic knowledge in the human brain, mainly using functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS). I am particularly interested in the tremendous flexibility of the conceptual system to adapt to different tasks and contexts, as well as to disruptions through non-invasive brain stimulation or actual brain damage.
Whereas traditional amodal theories propose that concepts comprise amodal symbols represented outside the modality-specific perceptual-motor systems, grounded cognition theories posit that concepts consist of perceptual-motor features represented in modality-specific perceptual-motor brain regions.
The results of our studies support a hybrid account, where conceptual processing involves both modality-specific perceptual-motor regions and more abstract cross-modal convergence zones.
Our own model of the conceptual system assumes a hierarchical neural architecture from modality-specific to multimodal to amodal brain regions (Figure 1; Kuhnke et al., 2020a, Kuhnke et al., 2021, Kuhnke et al., 2023).
Moreover, our studies indicate a strong task-dependent flexibility of the conceptual system. Specifically, perceptual-motor features of concepts seem to be selectively retrieved when they are task-relevant. This flexibility can manifest itself in a task-dependent modulation of neural activity (Kuhnke et al., 2020a), functional connectivity (Kuhnke et al., 2021), and causal relevance of brain structures (Kuhnke et al., 2020b).
Figure 1. Our hierarchical model of the conceptual system.
The following sections present brief summaries of the key results of each of our studies.
In this paper, we report a large-scale meta-analysis of 212 neuroimaging studies on conceptual-semantic processing related to seven perceptual-motor modalities (action, sound, visual shape, motion, color, olfaction-gustation, and emotion).
In line with grounded cognition theories, we found that conceptual processing consistently engages brain regions that are also activated during real perceptual-motor experience of the same modalities. For example, action-related conceptual processing robustly recruits somatomotor regions engaged in real action execution (Figure 2).
These perceptual-motor areas generally showed high modality specificity: activation likelihood was significant for conceptual processing related to the relevant modality, and higher than for the other modalities (Figure 3). Interestingly, conceptual processing mainly involved high-level (e.g. secondary), not low-level (e.g. primary) regions of the modality-specific systems.
In addition to modality-specific areas, we identified several multimodal convergence zones that are recruited for multiple modalities (Figure 4). In particular, the left inferior parietal lobe (IPL) and posterior middle temporal gyrus (pMTG) are engaged for three modalities: action, motion, and sound. These “trimodal” regions are surrounded by “bimodal” regions engaged for two modalities.
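The bimodal/trimodal overlap logic can be sketched in a few lines, assuming each modality's thresholded meta-analytic map is represented as a set of significant voxels (all maps and voxel IDs below are made up for illustration):

```python
from collections import Counter

# Hypothetical thresholded meta-analytic maps: each modality maps to the set
# of voxels with significant activation likelihood (voxel IDs are made up).
maps = {
    "action": {1, 2, 3, 4, 5},
    "sound":  {3, 4, 5, 6},
    "motion": {4, 5, 7},
}

def overlap_degree(maps):
    """Count, for every voxel, how many modality maps contain it."""
    return Counter(v for voxels in maps.values() for v in voxels)

counts = overlap_degree(maps)
trimodal = {v for v, n in counts.items() if n >= 3}  # engaged by all three modalities
bimodal  = {v for v, n in counts.items() if n == 2}  # engaged by exactly two
print(trimodal)  # {4, 5}
print(bimodal)   # {3}
```

In the actual meta-analysis, such overlaps were computed as conjunctions of whole-brain maps rather than toy voxel sets, but the classification into unimodal, bimodal, and trimodal territory follows the same counting principle.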
Our findings support a new model of the conceptual system, according to which conceptual processing relies on a hierarchical neural architecture from modality-specific to multimodal (i.e., bimodal and trimodal) areas up to an amodal hub (Figure 1).
Read the full paper here.
Figure 2. Action feature retrieval overlaps with real action execution.
Figure 3. Action-specific brain regions.
Figure 4. Overlap between modalities.
This fMRI activation study systematically tested the task dependency of conceptual knowledge retrieval. Specifically, we asked to what extent the retrieval of sound and action features of concepts, and the resulting engagement of auditory and motor brain regions, depends on the task. Forty healthy human participants performed three different tasks (lexical decision, sound judgment, and action judgment) on the same words with a high or low association to sounds and actions.
We found that the retrieval of sound and action features, and the engagement of modality-specific areas, strongly depended on the task: Selectively during sound judgments, auditory-related regions showed increased functional activation for sound features of concepts (Figure 5). Selectively during action judgments, somatomotor regions exhibited an increased response to action features of concepts (Figure 6).
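This task-by-feature logic amounts to an interaction contrast: the feature effect (high > low feature words) is present when the feature is task-relevant and absent otherwise. A toy sketch (all condition labels and activation values below are hypothetical, chosen only to illustrate the contrast):

```python
# Hypothetical mean activation (arbitrary units) of an auditory-related region,
# indexed by (task, word type). Numbers are made up for illustration.
activation = {
    ("sound_judgment",  "high_sound"): 1.2,
    ("sound_judgment",  "low_sound"):  0.4,
    ("action_judgment", "high_sound"): 0.5,
    ("action_judgment", "low_sound"):  0.5,
}

def feature_effect(task):
    """Sound-feature effect (high > low sound words) within one task."""
    return activation[(task, "high_sound")] - activation[(task, "low_sound")]

# Task-dependent retrieval shows up as an interaction: the feature effect
# exists during sound judgments but vanishes during action judgments.
interaction = feature_effect("sound_judgment") - feature_effect("action_judgment")
print(round(interaction, 2))  # 0.8
```

A non-zero interaction of this form, rather than a main effect of word type alone, is what distinguishes task-dependent from automatic feature retrieval.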
Importantly, several regions (e.g. left posterior IPL) were engaged for both sound and action features when these were task-relevant, responding to sound features during sound judgments and to action features during action judgments (Figure 7). We therefore propose that these regions are "multimodal" convergence zones which retain modality-specific information.
In contrast, the anterior temporal lobe (ATL) seems to be "amodal" (i.e. modality-invariant), as it responded to general conceptual information (words > pseudowords) but not to modality-specific features.
Based on these findings, we proposed a new model of the neural architecture underlying conceptual processing. According to our model, conceptual processing relies on a representational hierarchy from modality-specific perceptual-motor regions to multimodal convergence zones (e.g. left pIPL) up to an amodal hub in the ATL. Crucially, we assume this hierarchical system to be flexible, with different regions being engaged in a task-dependent fashion: Regions representing a certain conceptual feature are selectively engaged when that feature is task-relevant.
Read the full paper here.
Figure 5. Sound feature retrieval is task-dependent and overlaps with real auditory perception.
Figure 6. Action feature retrieval is task-dependent and overlaps with real somatomotor action.
Figure 7. Multimodal conceptual regions engaged for both sound and action features in a task-dependent fashion.
This fMRI connectivity study investigated the functional interaction between modality-specific and multimodal regions during conceptual knowledge retrieval. Specifically, we asked (1) whether modality-specific and multimodal areas are functionally coupled during sound and action feature retrieval, (2) whether their coupling depends on the task, (3) whether information flows bottom-up, top-down, or both, and (4) whether their coupling is relevant for behavior. In a two-stage analysis approach, we combined whole-brain psychophysiological interaction (PPI) analyses with dynamic causal modeling (DCM).
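The core ingredient of a PPI analysis is an interaction regressor built from the seed region's timecourse and the task regressor. A minimal sketch (real PPI pipelines additionally deconvolve the seed signal to the neural level before forming the product; all values below are made up):

```python
# Minimal PPI interaction regressor: element-wise product of the seed-region
# timecourse and the mean-centered task regressor. Hypothetical values only;
# actual pipelines work on deconvolved, HRF-aware signals.
seed = [0.2, 0.5, 0.9, 0.4, 0.1, 0.3]   # hypothetical seed timecourse (e.g. PPC)
task = [0, 0, 1, 1, 0, 0]               # 1 = feature-judgment blocks

mean_task = sum(task) / len(task)
task_centered = [t - mean_task for t in task]  # center to separate PPI from main effects

ppi = [s * t for s, t in zip(seed, task_centered)]
print([round(x, 3) for x in ppi])
```

Regressing a target region's timecourse on this PPI term (alongside the seed and task main effects) then tests whether seed-target coupling changes with the task, which is exactly the task-dependent coupling question asked here.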
We found that functional coupling between modality-specific and multimodal areas strongly depended on the task: Selectively during action judgments, action feature retrieval (high > low action words) increased coupling between the multimodal region in left posterior parietal cortex (PPC) and left primary motor/somatosensory cortex (M1/S1). Conversely, selectively during sound judgments, sound feature retrieval (high > low sound words) increased coupling between multimodal PPC and left auditory association cortex (AAC).
DCM analyses revealed both top-down and bottom-up information flow between multimodal and modality-specific nodes (Figure 8): Multimodal PPC was bidirectionally coupled with left AAC, and sound knowledge modulated both the top-down and bottom-up connections. In contrast, left M1/S1 was unidirectionally connected to multimodal PPC, and action knowledge specifically modulated this bottom-up connection.
Crucially, functional coupling between multimodal and modality-specific cortices predicted behavior in a modality-specific fashion (Figure 9): Individual coupling strength between multimodal PPC and M1/S1 was associated with participants' individual action, but not sound associations. Conversely, coupling between multimodal PPC and AAC predicted participants' sound, but not action associations. These results indicate that flexible coupling between multimodal and modality-specific areas is relevant for conceptually-guided behavior.
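At its core, this brain-behavior analysis correlates, across participants, individual coupling strength with individual feature ratings. A toy version with made-up per-participant values (the variable names and numbers below are hypothetical):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-participant values: PPC-M1/S1 coupling strength and mean
# action-association ratings for the word set (numbers made up for illustration).
coupling_ppc_m1s1 = [0.1, 0.3, 0.2, 0.5, 0.4]
action_ratings    = [2.1, 3.0, 2.4, 3.8, 3.3]

print(round(pearson_r(coupling_ppc_m1s1, action_ratings), 2))
```

The modality specificity of the real finding corresponds to a double dissociation in such correlations: coupling with M1/S1 tracks action ratings but not sound ratings, and coupling with AAC shows the reverse pattern.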
The results of this study allowed us to refine our model of the conceptual system (Figure 10). As we found that functional coupling involved not only high-level (e.g. AAC) but also low-level perceptual-motor areas (e.g. M1/S1), we subdivided modality-specific regions into low-level areas and "unimodal convergence zones". Moreover, we extended our model with information on functional interactions (Figure 10). This new model illustrates that functional coupling during conceptual processing is extensive, reciprocal, and task-dependent: Somatomotor regions selectively come into play when action knowledge is task-relevant, and auditory regions when sound knowledge is task-relevant. The multimodal region in left PPC seems to act as a functional coupling "switchboard" which dynamically adapts its connectivity profile to task-relevant modality-specific nodes.
Read the full paper here.
Figure 8. Dynamic causal modeling of the task-dependent effective connectivity between multimodal and modality-specific brain regions.
Figure 9. Correlations between functional coupling and behavior.
Figure 10. Our refined model of the conceptual system, incorporating task-dependent functional and effective connectivity.
This TMS study tested the causal role of the left posterior inferior parietal lobe (pIPL) as a multimodal conceptual brain region.
Our previous fMRI studies (Kuhnke et al., 2020a, Kuhnke et al., 2021) suggested a key role of the left pIPL as a multimodal convergence zone (or "hub") for conceptual knowledge. However, as fMRI is correlational, it remained unknown whether left pIPL plays a causal role as a multimodal conceptual hub.
Here, we transiently disrupted left pIPL using TMS to test its causal relevance for processing action and sound knowledge. We compared effective TMS over left pIPL with sham TMS, while 26 new participants performed the three tasks (lexical decision, sound judgment, and action judgment) on words with a high or low association to sounds and actions.
We found that pIPL-TMS selectively impaired action judgments on low sound-low action words, as compared to sham stimulation (Figure 11). Bayesian analyses provided evidence for a null effect on sound judgments and lexical decisions.
For the first time, we directly related computational simulations of the TMS-induced electrical field to behavioral performance. These simulations revealed that stronger stimulation of left pIPL (but not the superior parietal lobe, SPL) was associated with worse performance on action (but not sound) judgments (Figure 12).
These results indicate that left pIPL causally supports conceptual processing when action knowledge is task-relevant and cannot be compensated for by sound knowledge. Our findings suggest that left pIPL is specialized for action knowledge, challenging the view of left pIPL as a multimodal conceptual hub.
Read the full paper here.
Figure 11. Behavioral effects of TMS over left pIPL.
Figure 12. Computational simulations of the TMS-induced electrical field.