Brain structures



The process of identifying the parts of the brain that are involved in language began in 1861, when Paul Broca, a French physician, examined the brain of a recently deceased patient who had had an unusual disorder. Though the patient had been able to understand spoken language and had no motor impairments of the mouth or tongue that might have affected his ability to speak, he could neither speak a complete sentence nor express his thoughts in writing. The only articulate sound he could make was the syllable “tan”, which had come to be used as his name.

Paul Broca

Tan’s brain

When Broca autopsied Tan’s brain, he found a sizable lesion in the left inferior frontal cortex. Subsequently, Broca studied eight other patients, all of whom had similar language deficits along with lesions in the frontal lobe of the left hemisphere. This led him to make his famous statement that “we speak with the left hemisphere” and to identify, for the first time, the existence of a “language centre” in the posterior portion of the frontal lobe of this hemisphere. Now known as Broca’s area, this was in fact the first area of the brain to be associated with a specific function—in this case, language.

Ten years later, Carl Wernicke, a German neurologist, discovered another part of the brain, this one involved in understanding language, in the posterior portion of the left temporal lobe. People who had a lesion at this location could speak, but their speech was often incoherent and made no sense.

Carl Wernicke

Brain with a lesion causing Wernicke’s aphasia

Wernicke's observations have been confirmed many times since. Neuroscientists now agree that running around the lateral sulcus (also known as the fissure of Sylvius) in the left hemisphere of the brain, there is a sort of neural loop that is involved both in understanding and in producing spoken language. At the frontal end of this loop lies Broca's area, which is usually associated with the production of language, or language outputs. At the other end (more specifically, in the superior posterior temporal lobe) lies Wernicke's area, which is associated with the processing of words that we hear being spoken, or language inputs. Broca's area and Wernicke's area are connected by a large bundle of nerve fibres called the arcuate fasciculus.

This language loop is found in the left hemisphere in about 90% of right-handed persons and 70% of left-handed persons, language being one of the functions that is performed asymmetrically in the brain. Surprisingly, this loop is also found at the same location in deaf persons who use sign language. This loop would therefore not appear to be specific to heard or spoken language, but rather to be more broadly associated with whatever the individual’s primary language modality happens to be.

In addition to Broca’s and Wernicke’s areas, a third area of importance for language, located in the temporal cortex, has been described more recently.

A general problem encountered in any attempt to determine the locations of brain functions is that every brain is unique. Just as every normal human hand has five fingers, but everyone’s hands are different, all human brains have the same major structures, but the size and shape of these structures can vary from one individual to another—by as much as several millimetres. Average measurements can be used, of course, in studying the brain, but the fact remains that the same type of lesion will not always cause exactly the same type of deficit in several different individuals.

With functional brain maps standardized for the sizes of various brains, we obtain a reference that is useful but does not really correspond to the brain of any one particular individual.




A first model of the general organization of language functions in the brain was proposed by American neurologist Norman Geschwind in the 1960s and 1970s. This “connectionist” model drew on the lesion studies done by Wernicke and his successors and is now known as the Wernicke-Geschwind model. According to this model, each of the various characteristics of language (perception, comprehension, production, etc.) is managed by a distinct functional module in the brain, and each of these modules is linked to the others by a very specific set of serial connections. The central hypothesis of this model is that language disorders arise from breakdowns in this network of connections between these modules.

According to this model, when you hear a word spoken, this auditory signal is processed first in your brain’s primary auditory cortex, which then sends it on to the neighbouring Wernicke’s area. Wernicke’s area associates the structure of this signal with the representation of a word stored in your memory, thus enabling you to retrieve the meaning of the particular word.

In contrast, when you read a word out loud, the information is perceived first by your visual cortex, which then transfers it to the angular gyrus, from which it is sent on to Wernicke’s area.

Whether you hear someone else speak a word or you read the word yourself, it is the mental lexicon in Wernicke’s area that recognizes this word and correctly interprets it according to the context. For you then to pronounce this word yourself, this information must be transmitted via the arcuate fasciculus to a destination in Broca’s area, which plans the pronunciation process. Lastly, this information is routed to the motor cortex, which controls the muscles that you use to pronounce the word.
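The strictly serial flow that this model describes can be caricatured as a processing pipeline. The sketch below is purely illustrative: the function names stand for the model's stages, and the toy "mental lexicon" is an invented placeholder, not a claim about actual neural representations.

```python
# A deliberately simplified sketch of the Wernicke-Geschwind serial pipeline.
# Stage names and the toy LEXICON are illustrative assumptions only.

LEXICON = {"tan": "a syllable", "heart": "organ that pumps blood"}  # toy mental lexicon

def primary_auditory_cortex(sound):
    return sound.lower()                        # raw acoustic signal -> word form

def angular_gyrus(visual_word):
    return visual_word.lower()                  # written form -> word form

def wernickes_area(word_form):
    return (word_form, LEXICON.get(word_form))  # look up the word's meaning

def brocas_area(word_and_meaning):
    word, _meaning = word_and_meaning
    return list(word)                           # plan articulation, phoneme by phoneme

def motor_cortex(articulation_plan):
    return "".join(articulation_plan)           # execute the plan: the spoken word

def hear_and_repeat(sound):
    # Serial path: auditory cortex -> Wernicke -> (arcuate fasciculus) -> Broca -> motor cortex
    return motor_cortex(brocas_area(wernickes_area(primary_auditory_cortex(sound))))

def read_aloud(text):
    # Reading inserts the angular gyrus before Wernicke's area.
    return motor_cortex(brocas_area(wernickes_area(angular_gyrus(text))))

print(hear_and_repeat("Tan"))   # tan
print(read_aloud("HEART"))      # heart
```

Note that each function must finish before the next one begins: this captures exactly the strictly serial assumption for which the model has been criticized.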

The Wernicke-Geschwind model is thus based on the anatomical locations of areas of the brain that have distinct functions. On the whole, this model provides a good understanding of the primary language disorders, such as Broca’s aphasia and Wernicke’s aphasia. But it also has its limitations. For one thing, its assumption that the various areas involved in processing speech are connected in series implies that one step must be completed before the next one can begin, which is not always actually the case. Because this model also fails to explain certain partial language disorders, other models have been proposed to address these shortcomings.

In addition to semantic memory, which lets us retain the various meanings of words, we must use other specialized forms of memory in order to speak. For example, to pronounce any given phoneme of a language that you know how to speak, you must place your tongue and mouth in a particular position. They assume this position unconsciously, but obviously you must have stored it in memory somewhere in your brain.

In some languages, such as Spanish, the relationship between spelling and pronunciation is fairly straightforward, so it is fairly easy to retrieve the pronunciation of a word when you read it. But in other languages, the exact same string of letters may be pronounced very different ways in different words—for instance, the “ough” in “thought”, “tough”, “through” and “though”, in English, or the “ars” in “jars”, “mars”, and “gars”, in French. These arbitrary variations must be memorized as such, with no logical rules to help.
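In computational terms, regular spellings can be derived by rule, while irregular ones must be stored as explicit exceptions. The tiny "rule" and exception table below are invented for illustration and grossly oversimplify real phonology:

```python
# Illustrative sketch: regular grapheme-to-phoneme mappings can be computed
# by rule, but irregular words must simply be memorized as exceptions.
# Both the fallback "rule" and the pseudo-phonetic spellings are assumptions.

# Memorized exceptions: the same letters "ough" map to different sounds.
EXCEPTIONS = {
    "thought": "THAWT",
    "tough":   "TUHF",
    "through": "THROO",
    "though":  "THOH",
}

def pronounce(word):
    word = word.lower()
    if word in EXCEPTIONS:      # irregular: retrieved from memory, not derived
        return EXCEPTIONS[word]
    return word.upper()         # "regular" fallback rule (grossly simplified)

print(pronounce("through"))  # THROO  (memorized exception)
print(pronounce("casa"))     # CASA   (rule-based, as in Spanish)
```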




Perhaps the most striking anatomical characteristic of the human brain is that it is divided into two hemispheres, so that it has two of almost every structure: one on the left side and one on the right. But these paired structures are not exactly symmetrical and often differ in their size, form, and function. This phenomenon is called brain lateralization.

The two most lateralized functions in the human brain are motor control and language. When a function is lateralized, this often means that one side of the brain exerts more control over this function than the other does. The side that exerts more control is often called the “dominant hemisphere” for this function, but this expression can be somewhat misleading (see sidebar).

Lateralization of motor control is what determines whether someone is right-handed or left-handed. When someone is ambidextrous—when they can use either hand as easily as the other—it means that their brain is only partly lateralized or not at all lateralized for motor control.

In right-handed people, the “dominant” hemisphere for motor control is the left, while in left-handed people, it is the right. The reason for this inversion is that the motor pathways of the nervous system cross over to the other side of the body as they go down the spinal cord. Thus the movements of one side of the body are controlled by the hemisphere on the opposite side.

About 9 out of 10 adults are right-handed. This proportion seems to have remained stable over many thousands of years and in all cultures in which it has been studied (see sidebar).

Now, what about language—what is its “dominant” hemisphere? And is there any correlation between handedness and language lateralization? Considering how easily we can determine whether someone is right-handed or left-handed, if there were such a correlation, it might prove very useful for research. And indeed, this correlation does exist, but it is not perfect. In the vast majority of right-handed people, language abilities are localized in the left hemisphere. But contrary to what you might expect, the opposite is not true among left-handed people, for whom the picture is less clear. Many “lefties” show a specialization for language in the left hemisphere, but some show one in the right, while for still others, both hemispheres contribute just about equally to language.

Though handedness does influence the brain hemisphere that people use to speak, the left hemisphere does seem to have a natural predisposition for language, and this predisposition is reflected anatomically.


Verbal language is not the only way that two people communicate with each other. Even before they open their mouths, they are already communicating through various non-verbal mechanisms.

First of all, their physical appearance, the way they dress, the way they carry themselves, and their general attitude all form a context that lends a particular coloration to their verbal messages. Next, the particular position of their bodies during conversation, the way their eyes move, the gestures they make, and the ways they mimic each other will also impart a certain emotional charge to what they say. There is also what is often called the music of language—the variations in tone, rhythm, and inflection that alter the meanings of words.

When we are talking about language, it is therefore useful to distinguish between verbal language—the literal meaning of the words—and everything that surrounds these words and gives them a particular connotation. That is the big difference between denoting and connoting: the message that is perceived never depends solely on what is said, but always on how it is said as well.

Another good reason to distinguish between the denotative and connotative aspects of language is that they call on different parts of the brain. In the great majority of people, it is the left hemisphere that formulates and understands the meaning of words and sentences, while the right hemisphere interprets the emotional connotation of these words.

For example, if you ask someone who has right hemisphere damage to tell you which of the two pictures here best portrays the expression “She has a heavy heart”, that person will point to the woman with the big heart on her sweater rather than to the woman in tears. Similarly, if you remarked in a sarcastic tone that someone was a really nice guy, a person with right-hemisphere damage would think you really meant it.

When scientists first began to investigate what functions are performed by the parts of the right hemisphere that are homologous to the language areas of the left hemisphere, most of their initial findings came from studying people who had lesions in these parts of the right hemisphere.

Because the sign language used by the deaf involves so many visual and spatial tasks, you might expect it to be controlled by the right hemisphere. But in fact, the proportion of people who are left-lateralized for language is just as high among deaf people who use sign language as it is among people with normal hearing.

The Evolution of Broca's Area


Discussions about language origins are polarized along two lines of thought. One is based on Chomsky’s (1978) premise that the fundamental rules governing human languages are genetically embedded in human brains as a result of rapid mutation(s). According to this view, human language has little, if anything, to do with nonhuman primate vocal communication systems or their neurological underpinnings. The second hypothesis proposes that human language evolved gradually from Darwinian natural selection on primate-like vocalizations and their associated genetic and neurological substrates (Falk 2004a, b; 2007). Assessment of the relative merits of these two views lends itself to hypothesis testing: If the first view is correct, human language areas did not emerge from gradual elaborations of incipient (primitive) cortical areas that subserved vocal communication in their nonhuman primate ancestors, and comparative neurological studies on living primates should confirm the uniqueness of language areas in human brains. If the continuity hypothesis is correct, however, homologous language areas should be present in the brains of humans and nonhuman anthropoid primates (Preuss 2000).

Broca’s speech area of humans classically consists of Brodmann’s areas 44 (pars opercularis) and 45 (pars triangularis) in the left hemisphere, while the receptive language area (Wernicke’s area) is traditionally located in that same hemisphere on the planum temporale (PT, located within the depths of the Sylvian fissure), and caudally in area Tpt (temporoparietal) and Brodmann’s area 40 (supramarginal gyrus) (Figure 1). (Some workers would also include part of area 37.) Studies on the cytoarchitectonics of macaque brains suggest that the inferior limb of the arcuate sulcus contains homologs of areas 44 and 45 (Galaburda and Pandya 1982; Deacon 1992; Preuss 2000), and that regions of the superior temporal and inferior parietal lobes are homologous with the human posterior receptive language areas (Galaburda and Pandya 1982; Preuss 2000) (Figure 1). Wernicke’s area (narrowly defined) is fundamentally involved with the comprehension of human speech. Furthermore, like humans, macaques are thought to be left-hemisphere dominant for comprehending certain socially meaningful vocalizations (Rauschecker et al. 1995; Petersen et al. 1978, 1984; Heffner and Heffner 1984, 1986).

Figure 1: Gross language areas in humans and their proposed homologs in macaques and common chimpanzees. In the left hemispheres of humans, Brodmann’s areas 45 (pars triangularis) and 44 comprise Broca’s speech area, while areas Tpt (temporoparietal), PT (planum temporale, buried within the floor of the Sylvian fissure and located behind Heschl’s gyrus), and Brodmann’s area 40 are parts of Wernicke’s receptive language area. Human area 40, macaque area 7b, and chimpanzee area PF/PG are proposed homologs, as are human and macaque areas Tpt and chimpanzee area TA; fo, fronto-orbital sulcus of chimpanzee. Identifications are based on cytoarchitectonic and functional similarities and should be viewed as tentative. Data from Preuss (2000); Amunts et al. (1999); Gannon et al. (1998); Aboitiz and García (1997); Galaburda and Pandya (1982); Crosby, Humphrey and Lauer (1962); Jackson et al. (1969); Bailey et al. (1950); von Bonin (1949). (Figure reproduced from Falk 2007: Fig. 9.4.)

Although less experimental work has been done on chimpanzees, early comparative anatomists speculated about how a chimpanzee-like frontal lobe in human ancestors could have evolved into Broca’s area. As discussed elsewhere (Falk 2007), these efforts were hampered by a lack of consensus about the identities of homologous gyri and sulci in apes and humans, which were often proposed on the basis of relative positions of sulci rather than on cytoarchitectonic grounds. A distinguishing feature of chimpanzee (and other great ape) brains is the fronto-orbital sulcus (fo), which incises the lateral border of the dorsal frontal lobe and then courses caudally on the orbital surface to the temporal pole (Connolly 1950). The lower portion of fo provides the anterior limiting sulcus of the insula, which can be seen peeking out near the rostral end of the Sylvian fissure in many specimens (because chimpanzees usually lack all but the rudiment of a frontal operculum (Connolly 1950)). The bulge delimited by fo in chimpanzees, or orbital cap, represents Brodmann’s area 44 (Bailey 1948; Bailey et al. 1950; Connolly 1950; Jackson et al. 1969) (Figure 1) and may incorporate part of area 45 (Sherwood et al. 2003). Some workers have suggested that, as discussed below for humans (Amunts et al. 1999), area 44 is larger in the left than in the right hemisphere of chimpanzees (Cantalupo and Hopkins 2001), but this hypothesis has recently been challenged (Sherwood et al. 2003). In any event, it is important for paleoneurologists to realize that the bulge that appears at the level of the temporal pole in humans, the so-called “Broca’s cap”, is not homologous to the orbital cap of chimpanzees, because the human cap contains areas 45 and 47, whereas the chimpanzee orbital cap contains area 44 (Connolly 1950) and sometimes part of area 45 (Sherwood et al. 2003) (Figure 1).
With respect to chimpanzee homologs of Wernicke’s area, on the other hand, the planum temporale is reported to be larger on the left than the right in chimpanzees (Gannon et al. 1998), as is well known to be the case for humans (Geschwind and Levitsky 1968; Amunts et al. 1999).

Wernicke’s area has been interpreted as a more ancient structure than Broca’s area (Aboitiz and García 1997), in part because area Tpt appears in some prosimians (Preuss and Goldman-Rakic 1991). Aboitiz and García propose that Tpt first differentiated in primates as a bottleneck where cross-modal associations acquired phonological correlates, and that Wernicke’s area eventually “originated as a place in which multimodal representations or concepts obtained a linguistic dimension by being mapped into simple phonological sequences” (Aboitiz and García 1997:390). According to the authors, during primate evolution area Tpt became increasingly connected with inferoparietal regions, and these contributed to a link between the auditory system and a parieto-premotor circuit that included an incipient Broca’s area. A second parallel pathway may also have evolved directly between the precursor of Wernicke’s area and the prefrontal cortex. Hypothetically, Broca’s area developed, in part, as a phonological rehearsal device involved in generating complex vocalizations. Eventually, an evolving parieto-premotor circuit contributed to the origin of a lexicon (perhaps at the level mastered by apes schooled in American Sign Language). Syntax and the generation of discourse, however, emerged only later, in conjunction with further elaboration of these circuits (Aboitiz and García 1997). Because Aboitiz and García’s model is well reasoned and based on comparative and experimental evidence, their observations warrant serious consideration.

The anatomical arrangement of the language areas fits this large-scale cortico-cortico network and can be described as part of it. In this sense, the neural architecture involved in language is embedded in a complex system of large-scale connectivity that is the hallmark of the primate brain, and therefore should not be considered as an isolated system working independently of similarly organized cortico-cortico networks (Aboitiz and García 1997:388).

The evolution of Broca’s area

The inferior frontal convolution consists of cytoarchitectonic areas located in the pars opercularis (area 44), pars triangularis (area 45) and pars orbitalis (area 47) (Brodmann 1909). These areas are referred to as the frontal operculum (‘cover’) because (along with the parietal and temporal opercula) they form the walls of the anterior Sylvian fissure that cover the insula (Crosby et al. 1962:344). Broca’s pioneering work on the brains of aphasics (Broca 1861) revealed that areas 44 and 45 on the left side of the brain are involved in the articulatory aspects of language – hence the name “Broca’s speech area” or “Broca’s area”. Neurons in Broca’s area are activated by simple movements of the mouth and hands, and similar movements in monkeys activate the ventral premotor cortex (Colebatch et al. 1991; Gallese et al. 1996; Petersen et al. 1988; Rizzolatti et al. 1996). These activated neurons are called ‘mirror neurons’ because they also discharge when similar actions are observed in others (Rizzolatti et al. 1996). Mirror neurons are hypothesized to be part of an action-perception network that facilitates gestural (manual and orofacial) communication in apes and humans as well as linguistic communication in the latter (discussed in Falk 2004c).

Recent imaging studies have more precisely delineated specific functions that are facilitated in and around classical Broca’s area. Thus, area 44 tends to be activated during phonetic and phonological tasks that entail coordination of lip and tongue movements, but these do not necessarily entail speech per se. For example, area 44 is recruited during rehearsal strategies that rely on silent ‘inner speech’, which is consistent with the ‘motor’ theory of speech perception (Gabrieli et al. 1998; Démonet et al. 1996; Liberman and Mattingly 1985). Area 44 is also involved in nonlinguistic sequencing of orofacial and tongue movements, while the more rostrally located area 47 facilitates semantic aspects of speech and is recruited during verb-generation tasks (Petersen et al. 1988; Démonet and Thierry 2001). Parts of Broca’s area are also involved in nonlinguistic tasks such as observation and imitation of finger movements (Binkofski et al. 2000; Heiser et al. 2003) and recognition of manual gestures (Rizzolatti and Arbib 1998), functions attributed to mirror neurons (Rizzolatti et al. 1996). Thus, more than a century after Broca’s area was identified, we recognize that it has certain nonlinguistic functions and that the act of speech activates wider areas of the cerebral cortex (Falk 2007). Nevertheless, the importance of this area for speech cannot be denied, and the question of its phylogenetic development with respect to language evolution remains important (Sherwood et al. 2003; Holloway et al. 2004).

Amunts et al. (1999) mapped areas 44 and 45 in ten human brains in order to establish more precise cytoarchitectonic borders and their relationships to neighboring sulci. Intersubject variability was high for the volumes and cytoarchitecture of both areas, although areas 44 and 45 tended to resemble each other within individuals. Unfortunately for those who interpret hominin endocasts, the borders of the two areas failed to coincide consistently with superficial sulci or with locations buried within their walls. For example, in some brains area 45 extended beyond the pars triangularis onto the orbital surface, or spilled over caudally behind the anterior ascending branch of the Sylvian fissure. Despite these reservations, however, the authors concluded that the free surfaces of the human triangular and opercular parts of the inferior frontal gyrus are highly likely to represent areas 45 and 44, respectively. The relationship of sulci to cytoarchitectural areas 44 and 45 was recently explored in brains from five adult chimpanzees (Sherwood et al. 2003). Just as the border between cytoarchitectonic areas 44 and 45 of humans may not correspond precisely with the anterior ascending branch of the Sylvian fissure (Amunts et al. 1999), the border between the two areas in chimpanzees does not always coincide with the surface of the fronto-orbital sulcus. As is true for humans, intersubject variability is high for chimpanzees, and area 45 may spill over caudally into the presumed domain of area 44 (Sherwood et al. 2003).

Amunts et al. also found that the volume of area 44 was larger on the left in all ten brains (but the favored direction of volumetric asymmetry of area 45 varied from brain to brain), and eight of the ten brains also had higher cell densities on the left side. Other workers have observed a similar volumetric asymmetry favoring the left pars triangularis in patients who were known to be left-hemisphere dominant for speech and language (Foundas et al. 1996), and Amunts et al. suggest that the morphological asymmetries in area 44 provide a basis for functional lateralization of speech. The take-home message is that humans have a unique sulcal pattern in the region of Broca’s area which appears to correlate with speech functions. In other words, if a hominin endocast reproduces a pars triangularis and general humanlike shape of the orbital and frontal opercular cortex in contrast to the simpler morphology seen in chimpanzees, one may reasonably conclude that cortical reorganization had, at least to some degree, occurred in the inferior prefrontal cortex of that individual (Holloway et al. 2004).

What remains mysterious in the comparative study of the inferior frontal convolution is the rostral extent of area 45 in chimpanzees and details about area 47 in both species. Because area 47 is important for processing the semantic aspects of human speech, more research would be desirable on this area.

Area 47 is located ventral to areas 44 and 45. However, area 47 does not seem to be homogeneous, because it consists of several sub-areas. These sub-areas are arranged in a latero-orbital sequence (Kononova 1935). By contrast, Economo and Koskinas (1925) found a series of cytoarchitectonic areas and their transitional forms in this region to be arranged in a caudorostral sequence. The mapping of the ventral-orbital surface outside Broca’s region clearly needs further attention.

Memory involves the linking of various areas of the brain and changes in the properties of synaptic connections. It is mediated to a large extent by the Papez circuit, shown below.

Complex information from all parts of the nervous system enters the subiculum of the hippocampus from the parahippocampal gyrus and entorhinal cortex. The hippocampus itself is responsible for the formation of the new circuits. This is believed to occur through upregulation of glutamate receptors (glutamate is an excitatory neurotransmitter), so that a new or modified ability of one neuron to act on another is created. This information is then relayed to the cingulate gyrus, which is responsible for much of the emotional significance or emotional content of memory.

Note: Different forms of memory are believed to be formed in different ways. For example, the cerebellum plays a large role in the memory for fine procedural coordination (motor memory).

Memory can be altered in various ways clinically.
1) Simple amnestic seizure. In very rare cases, a seizure in the hippocampus on the dominant side can cause amnesia for the period during which the seizure is occurring, with no other symptoms.

2) Transient global amnesia. This is a rare syndrome in which a patient loses the ability to form new memories for several hours. The patient often repeats questions over and over and then recovers completely. Some of these cases are believed to be due to poor venous drainage from the hippocampus.

3) Korsakoff’s syndrome. This is caused by vitamin B1 (thiamine) deficiency, often seen in alcoholics, which leads to bilateral hemorrhage and destruction of the mammillary bodies (see diagram above). The result is a permanent inability to form new memories (similar to the protagonist in the movie Memento).

4) In the past, it was common to perform a temporal lobectomy (removal of the temporal lobe) in patients with seizure disorders to control seizures. This would very commonly lead to severe defects in spatial memory.

5) In Alzheimer’s dementia, amyloid plaques accumulate between neurons, leading to neuronal death, particularly in the temporal and frontal lobes. This initially causes short-term memory difficulties and ultimately profound amnesia.