Speaking is easy, reading and writing are the great challenges
1 Parts of this text were based on my book O Cérebro Aprendiz (The Apprentice Brain, 138 pp.), published in 2019 by Editora Atheneu, Rio de Janeiro, Brazil.
Language is an innate ability of the human brain that distinguishes it from those of all other animals. It is made possible by a network of areas in the cerebral cortex that connects thoughts, feelings, and memory with the vocal organs and the auditory system. Reading and writing, on the other hand, are a product of culture, acquired too recently to be explainable by evolution. These latter functions are imprinted in brain areas biologically prepared to serve other functions, such as face and object recognition. Preparing the brain to decode the graphic symbols of any of the world’s many languages is a task of education. This is why it is so important to understand how literacy occurs during children’s development and beyond. This knowledge is of utmost importance to win the battle against illiteracy in many countries.
The biological underpinnings of language
One of the most evident functional networks of the human brain is that of verbal language. We are born with a brain “prebuilt” to speak, and we acquire this ability with no need for any specific training, much earlier than we learn how to read, write, or button a shirt. Verbal language is considered a biological feature of humans, organized structurally in the brain despite the cultural modulation that has produced the more than six thousand languages still spoken in the world today. This feature represents an impressive evolutionary advantage of the human brain with no equivalent in other species, and its genetic/evolutionary determination is a subject of great debate among scientists1.
The first identification of the brain areas involved with verbal language came from the observation of patients with lesions in the cerebral cortex by 19th-century neurologists such as the Frenchman Pierre-Paul Broca (1824-1880) and the German Carl Wernicke (1848-1905). Broca discovered that the “speech area,” as he called it, was located in a specific, though ill-defined, region in the frontal cortex of the left hemisphere that later became known as Broca’s area (see Figure 1A). Patients with lesions in that region were unable to speak. Wernicke discovered another area further back in the brain, also in the left hemisphere, at the posterior margin of the lateral sulcus, responsible for speech understanding (see Figure 1A). It became known as Wernicke’s area, and patients who suffered lesions in it failed to understand the speech of others.
Like all neurologists of his time, Wernicke was a dedicated neuroanatomist. By dissecting human brains, he rediscovered a fiber tract interconnecting his area with Broca’s: the so-called arcuate fasciculus (see Figure 1A). This fiber bundle turned out to be a typically human structure, absent or small in other primates. All seemed to make sense. Anyone hearing someone else speak would use Wernicke’s area to understand the meaning and, if an answer was needed, would simply activate Broca’s area in sequence by way of the arcuate fasciculus. This interesting, though simplistic, scheme became known as the classical model of language, and it often carries the names of its authors2.
With the advance of neuroscience in recent decades, together with the development of techniques capable of recording brain activity dynamically, the scenario of language processing by the brain changed markedly3. Understanding anything we hear—and replying to maintain the conversation—is a task of utmost complexity, not restricted to identifying successive phonemes (the sounds of language) and grouping them into words and sentences. It is also essential to scrutinize our brain lexicons (mental dictionaries) in search of semantic memories to grasp the content of what was said, syntactic rules to understand the meaning of sentences, and phonetic features to discern the nuances of pronunciation that add subtle cognitive and emotional tones to the literal meaning of a sentence. Semantic lexicons mobilize an extensive area that occupies most of the ventral surface of the temporal lobe (see Figure 1B), where our familiar faces are stored, as well as the streets of our neighborhood, domestic utensils, fruit names, and the hundreds of thousands of meanings we learn throughout life. Syntactic lexicons are represented in the frontal lobe, right next to the classical Broca’s area (Figure 1B). And finally, the phonological lexicons lie near Wernicke’s area, somewhere between the temporal, occipital, and parietal lobes (Figure 1B).
Notice that to understand what your interlocutor is telling you, an ascending chain of numerous subcortical areas (mostly sensory) is mobilized first; then great extensions of the left temporal lobe (behind the ear), an important sector of the frontal lobe (at the forehead), and regions of the parietal lobe (at the side of the head) are recruited in sequence. But the process does not stop there5: To understand what you are hearing, you must pay attention to your interlocutor, inhibiting perception of all other environmental sounds. Then you should pause, and certainly wait for him/her to finish speaking before producing an answer. This is called executive control, a function involving regions at the very front of the brain, behind the eyes (prefrontal cortex).
And more! The words your interlocutor is saying may arrive embedded in emotions: irony, anger, friendship, love, whatever… So, you must understand not only the overt meaning of what you have heard, but also the covert content that reveals intentions and feelings. This aspect of language is called prosody6. It is very elaborate in the human species and multiplies the richness of verbal language. This evolutionary acquisition is so important that it became lateralized in the right hemisphere (see Figure 1B), in the regions homologous to those just described for language in the left hemisphere.
Together with the involvement of these multiple areas, relevant connections were identified between them: For instance, the arcuate fasciculus was found to be more complex than previously thought, with branches connecting the classical Wernicke’s area with the many other regions of the left hemisphere that perform the additional functions just mentioned. In addition, the corpus callosum (see Figure 1B)—a gigantic bundle of nerve fibers connecting both hemispheres—was found to be necessary for coordination between the right and left hemispheres in language functions.
How culture changes the brain for reading
If brain networks of verbal language are innate in human beings, then interesting questions arise about other, highly complex networks of cultural origin, such as those for reading and writing. Brain networks related to these elaborate functions are particularly intriguing, because they cannot be explained by natural selection: Written language is a recent acquisition of human culture (around four to five thousand years ago), and mathematics is even more recent (not more than three thousand years). These times are too short for their biological substrates to have been influenced by evolutionary pressures. For this reason, they are considered products of culture and, therefore, sculpted in the brains of children by parents and teachers at home and/or at school.
In adulthood, a set of cortical regions in the left hemisphere is consistently activated when someone reads while undergoing simultaneous brain imaging or electrophysiological recording. This is often called the reading network7. Notably, the reading network is consistent across different cultures, with similar topography in the brains of Hebrew, Chinese, or Roman-alphabet readers8. As mentioned before, it should be emphasized that reading is a complex function that involves not only a perceptual component specialized in the identification of graphemes (letters) and their correlation with phonemes (the sounds of letters), but also other features, such as coordinated eye movements, attentional focusing, comprehension, imagination, and memorization. For this reason, we observe many brain areas activated simultaneously when a reading task is proposed to someone whose brain is being dynamically recorded. The architecture of this network is similar, but not identical, to that of verbal language, given the similarities and differences between one function and the other (see Figure 2). Moreover, evidence indicates that verbal language, with its biological substrates, serves as a base to host the written-language network as it becomes established during childhood, mediated by culture and education. It is as if the reading network took advantage of the regions previously connected for verbal language in order to develop and operate.
In addition to the various functions necessary for reading, another set is involved in writing9. Once we know the symbols of our language and how to connect them with the corresponding sounds, we must learn how to draw them on a piece of paper, or to type them in sequence on a computer keyboard. This is the function of writing, the generation of reading material for others. Writing is a complex motor behavior employing mainly the hands, one or both. But of course, not only the hands. After our thoughts determine what we want to write (executive control, memory, emotions, etc.), a motor program must be put into action by the sectors of the cerebral cortex that control movement, organizing the right movements in the right sequence and thereby transforming our thoughts into visually recognizable symbols.
The key property of the brain that allows all these changes in the internal structure underpinning reading and writing is neuroplasticity. Neuroplasticity is the ability of the brain to undergo changes under the influence of the environment. Reading and writing are perfect examples of this property, called by some neuroscientists “neuronal recycling,” meaning that established brain areas (and their neurons, of course) change their workings to host newly learned abilities10. Learning to read and write, then, recycles brain areas previously devoted to other functions. It is interesting that, when literacy does not take place, some components of the reading network remain dedicated to other functions (e.g., face recognition instead of word recognition11).
The reading network
To identify letters, syllables, words, and sentences, our visual system needs to direct the eyes to a given target, that is, to position these symbols right at the center of the retina, where acuity is greatest and capable of distinguishing the fine details that differentiate an e from an o, or a p from a q. In the case of those who are blind, written information enters through tactile channels, usually via the system of signs developed by Louis Braille (1809-1852). Brain pathways, in both cases, are very similar. After transduction by sensory receptors, either visual or tactile, information is coded into nerve impulses. It then follows pathways specific for pattern identification until reaching the cerebral cortex, where a set of areas is in charge of the different perceptual aspects (form, color or texture, movement, and others). Each of these aspects is interpreted by different regions and subregions of the cortex, after which the interpretations proceed to areas of higher complexity (see Figure 2).
One of these complex areas in the cerebral cortex was identified by the French neuroscientist Stanislas Dehaene and his collaborators, who called it the visual word form area12. This area is situated on the ventral surface of the temporal lobe, in a region behind the ear where we have a bony bump called the mastoid. It is activated whenever literate individuals are exposed to written words, but much less or not at all when the stimuli are different: faces, objects, and even numerals. This area is so efficient that it “recognizes” words presented for just a few milliseconds—a truly subliminal exposure. People with lesions restricted to this area have a condition called pure alexia—i.e., the inability to read while other functions remain unaltered (including speaking, understanding speech, and even writing).
But reading does not end at the visual recognition of words. It is much more complex and richer than just this “simple” perceptual stage. It is necessary to understand each word, place it in the context of each sentence, the sentence in the context of the paragraph, and the paragraph in the context of the whole story. These functions are all included in what are called, more technically, orthographic, phonological, syntactic, and semantic processes, some of which are represented in Figure 2.
Reading, like many complex cognitive processes, also requires focusing attention on the act itself. This includes partially inhibiting behaviors other than moving the eyes along the lines of text and moving the hands to turn the pages of physical or digital books. In addition, one has to check a special type of memory (referred to by scientists as working memory) that makes it possible to connect the successive pieces of information arising from the act of reading. This is necessary to provide continuity to the flow of ideas conveyed by the text. Another kind of memory is also mobilized—declarative memory—which lasts longer than working memory and involves the consolidation of learning. It permits us to associate new ideas with those stored in our brain, and therefore to understand the content of what has been read. Finally, during reading we may be touched emotionally by what we read, bringing a whole set of other brain areas into play. This short description illustrates how complex it is to read, and how many areas of intricate networks are mobilized to perform that function.
Learning to read: About literacy and illiteracy
The main issue concerning the repercussions of literacy on the brain is to unravel which differences appear in illiterate brains after they learn how to read and write. A first comment is that although literacy acquisition is much easier in children, it may occur in adulthood as well. This relates to the concept of a sensitive period—a window during which the brain is more prone to change. Sensitive periods do exist, but this does not mean that after a sensitive period the brain becomes unchangeable.
Neuroscientists have employed different techniques to investigate brain changes during and after the process of literacy14. Functional magnetic resonance imaging (fMRI) is perhaps the most productive of them, but electrophysiological recording of brain waves (electroencephalography, or EEG) has also been widely used, giving a more dynamic picture of functional brain phenomena.
From this work, the following picture emerges15. Literacy begins by increasing the activity of the visual areas of the cerebral cortex bilaterally in response to various items: letters, faces, objects. This means that an enhancement of visual perception occurs. In preschoolers learning to read, electrophysiological evidence has shown that reading is first represented bilaterally in the brain, becoming lateralized during literacy, usually to the left hemisphere16. The next step after visual perception of letters and words is an increase of activation in the left inferior temporal area mentioned above (see Figure 2), where one sector decreases its preference for face recognition and increases its accuracy for the recognition of letters—the visual word form area12. Face recognition becomes somewhat displaced from the left to the right hemisphere; in illiterate adults, by contrast, this same region of the left hemisphere remains specialized for identifying faces. The gradual improvement of the ventral temporal cortex in recognizing letters and words entails the loss of an important property of the visual system—object invariance. This perceptual feature guarantees that we are able to recognize our mom’s picture taken frontally, from the left, or from the right. We can also recognize the same chair seen from the right, turned 180° to the left, or half covered by another object. However, object invariance is not helpful for recognizing letters, because we would not be able to differentiate a p from a q or a b from a d. So, during literacy acquisition, object invariance is lost on the left, reading side of the visual brain, and maintained only on the right side.
At about six years old, neural activity in the left inferior temporal cortex is already increased in a child who is learning to read, and it becomes clearly stronger by age nine. Interestingly, ideographic languages, such as Chinese, provoke an inverse pattern of lateralization: The right side becomes specialized in recognizing the language symbols, as compared with the left side. Notably, the same region becomes more active in blind persons when they learn to read through the tactile recognition of Braille symbols.
At the same time as the visual word form area flourishes with reading acquisition, other regions connected to it also increase their activity: The auditory areas, for instance, become able to relate graphemes with phonemes. In parallel, an increase in connectivity of the branch of the arcuate fasciculus linking these two sectors of the cortex is observed17. Not only that: the corpus callosum, that great bundle of fibers connecting the two sides of the brain, becomes more specialized during reading acquisition. This is evidence that the lateral specialization of function between the two hemispheres has to be managed dynamically to allow the perfect perception of letters, words, and sentences during the delicate movements of the eyes from one side to the other.
Knowledge about the neurobiological mechanisms underpinning the human acquisition of reading and writing is of utmost importance for creating evidence-based interventions to improve these processes in schools. One such intervention is the so-called phonic method, an alternative to the frequently used global method. The former makes use of scientific knowledge about how we learn to read, as briefly described above: We first associate sounds with the visual patterns of letters and words, and then become prepared to extract meaning from these symbols. By following this natural sequence, at least in transparent languages—those in which only one or a few graphemes represent each phoneme (Italian, Portuguese, and other alphabetic languages)—it is possible to accelerate learning to read and thus benefit children and the educational system in general. It is not an exaggeration to say that illiteracy is still a great plague in many countries.
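The notion of a “transparent” language can be made concrete with a toy sketch (the mini grapheme inventory below is hypothetical, not drawn from any real orthography): when each written symbol corresponds to roughly one sound, decoding a word reduces to a letter-by-letter lookup.

```python
# Toy sketch of grapheme-to-phoneme decoding in a perfectly
# "transparent" orthography: each grapheme maps to exactly one
# phoneme, so reading aloud is a simple letter-by-letter lookup.
# The inventory below is hypothetical and purely illustrative.
GRAPHEME_TO_PHONEME = {
    "a": "a", "e": "e", "i": "i", "o": "o", "u": "u",
    "l": "l", "m": "m", "n": "n", "s": "s", "t": "t",
}

def decode(word):
    """Map each grapheme of a written word to its phoneme."""
    return [GRAPHEME_TO_PHONEME[g] for g in word]

print(decode("luna"))  # → ['l', 'u', 'n', 'a']
```

In opaque orthographies such as English, no such one-to-one table exists (the same grapheme sequence may decode in several ways), which is part of why phonic instruction pays off most quickly in transparent languages.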
- Michon, M.; López, B.; Aboitiz, F. (2019). Origin and evolution of human speech: Emergence from a trimodal auditory, visual and vocal network. Progress in Brain Research, 250:345-371.
- Henderson, V.W. (2019). Alexia and agraphia from 1861 to 1965. Frontiers of Neurology and Neuroscience, 44:39-52.
- Corballis, M.C. (2015). What’s left in language? Beyond the classical model. Annals of the New York Academy of Sciences, 1359:14-29.
- Lent, R. (2010). One Hundred Billion Neurons? Fundamental Concepts of Neuroscience (in Portuguese), Rio de Janeiro, Ed. Atheneu, 765 pp.
- Barker, M.S.; Nelson, N.L.; Robinson, G.A. (2019). Idea formulation for spoken language production: The interface of cognition and language. Journal of the International Neuropsychological Society, 15:1-15.
- Kreiner, H.; Eviatar, Z. (2014). The missing link in the embodiment of syntax: Prosody. Brain and Language, 137:91-102.
- Dehaene, S. (2009). Reading in the Brain, Penguin Viking.
- Nakamura, K.; Kuo, W.J.; Pegado, F.; Cohen, L.; Tzeng, O.J.; Dehaene, S. (2012). Universal brain systems for recognizing word shapes and handwriting gestures during reading. Proceedings of the National Academy of Sciences, USA, 109:20762-20767.
- Palmis, S.; Danna, J.; Velay, J.L.; Longcamp, M. (2017). Motor control of handwriting in the developing brain: A review. Cognitive Neuropsychology, 34:187-204.
- Dehaene, S.; Cohen, L. (2006). Cultural recycling of cortical maps. Neuron, 56:384-398.
- Dehaene, S. et al. (2010). How learning to read changes the cortical networks for vision and language. Science, 330:1359-1364.
- Dehaene, S.; Cohen, L. (2011). The unique role of the visual word form area in reading. Trends in Cognitive Sciences, 15:254-262.
- Lent, R. (2019). The Apprentice Brain (in Portuguese), Rio de Janeiro, Ed. Atheneu, 135 pp.
- Monzalvo, K.; Dehaene-Lambertz, G. (2013). How reading acquisition changes children’s spoken language network. Brain and Language, 127:356-365.
- Dehaene, S.; Cohen, L.; Morais, J.; Kolinsky, R. (2015). Illiterate to literate: Behavioral and cerebral changes induced by reading acquisition. Nature Reviews Neuroscience, 16:234-244.
- Vogel, A.C.; Church, J.A.; Power, J.D.; Miezin, F.M.; Petersen, S.E.; Schlaggar, B.L. (2013). Functional network architecture of reading-related regions across development. Brain and Language, 125:231-243.
- Thiebaut de Schotten, M.; Cohen, L.; Amemiya, E.; Braga, L.W.; Dehaene, S. (2012). Learning to read improves the structure of the arcuate fasciculus. Cerebral Cortex, 24:989-995.