easier to see how a primitive gestural language could have evolved into speech—an idea that many classical psycholinguists find unappealing.

As a concrete example, consider the phrase “come hither.” Notice that you gesture this idea by holding your palm up and flexing your fingers toward yourself as if to touch the lower part of the palm. Amazingly, your tongue makes a very similar movement as it curls back to touch the palate to utter “hither” or “here”—examples of synkinesia. “Go” involves pouting the lips outward, whereas “come” involves drawing the lips together inward. (In the Indian Dravidian language Tamil—unrelated to English—the word for go is “po”).

Obviously, whatever the original language was back in the Stone Age, it has since been embellished and transformed beyond reckoning, so that today we have languages as diverse as English, Japanese, !Kung, and Cherokee. Language, after all, evolves with incredible rapidity; sometimes just two hundred years is enough to alter a language to the point where a young speaker would be barely able to communicate with her great-great-grandmother. By this token, once the juggernaut of full linguistic competence arose in the human mind and culture, the original synkinetic correspondences were probably lost or blended beyond recognition. But in my account, synkinesia sowed the initial seeds of lexicon, helping to form the original vocabulary base on which subsequent linguistic elaboration was built.

Synkinesia and other allied attributes, such as mimicry of other people’s movements and extraction of commonalities between vision and hearing (bouba-kiki), may all rely on computations analogous to what mirror neurons are supposed to do: link concepts across brain maps. These sorts of linkages remind us again of their potential role in the evolution of protolanguage. This hypothesis may seem speculative to orthodox cognitive psychologists, but it provides a window of opportunity—indeed, the only one we have to date—for exploring the actual neural mechanisms of language. And that’s a big step forward. We will pick up the threads of this argument later in this chapter.

We also need to ask how gesturing evolved in the first place.2 At least for verbs like “come” or “go,” it may have emerged through the ritualization of movements that were once used for performing those actions. For instance, you may actually pull someone toward you by flexing your fingers and elbow toward you while grabbing the person. So the movement itself (even if divorced from the actual physical object) became a means of communicating intent. The result is a gesture. You can see how the same argument applies to “push,” “eat,” “throw,” and other basic verbs. And once you have a vocabulary of gestures in place, it becomes easier for corresponding vocalizations to evolve, given the preexisting hardwired translation produced by synkinesia. (The ritualization and reading of gestures may, in turn, have involved mirror neurons, as alluded to in previous chapters.)

So we now have three types of map-to-map resonance going on in the early hominin brain: mapping between visual and auditory maps (bouba-kiki); mapping between the auditory and visual sensory maps and the motor vocalization maps in Broca’s area; and mapping between Broca’s area and the motor areas controlling manual gestures. Bear in mind that each of these biases was probably very small, but acting in conjunction they could have progressively bootstrapped each other, creating the snowball effect that culminated in modern language.

IS THERE ANY neurological evidence for the ideas discussed so far? Recall that many neurons in a monkey’s frontal lobe (in the same region that appears to have become Broca’s area in us) fire when the animal performs a highly specific action like reaching for a peanut, and that a subset of these neurons also fires when the monkey watches another monkey grab a peanut. To do this, the neuron (by which I really mean “the network of which the neuron is a part”) has to compute the abstract similarity between the command signals specifying muscle contraction sequences and the visual appearance of peanut reaching seen from the other monkey’s vantage point. So the neuron is effectively reading the other individual’s intention and could, in theory, also understand a ritualized gesture that resembles the real action. It struck me that the bouba-kiki effect provides an effective bridge between these mirror neurons and the ideas about synesthetic bootstrapping I have presented so far. I considered this argument briefly in an earlier chapter; let me elaborate it now to make the case for its relevance to the evolution of protolanguage.

The bouba-kiki effect requires a built-in translation between visual appearance, sound representation in the auditory cortex, and sequences of muscle twitches in Broca’s area. Performing this translation almost certainly involves the activation of circuits with mirror-neuron-like properties, mapping one dimension onto another. The inferior parietal lobule (IPL), rich in mirror neurons, is ideally suited for this role. Perhaps the IPL serves as a facilitator for all such types of abstraction. I emphasize, again, that these three features (visual shape, sound inflections, and lip and tongue contour) have absolutely nothing in common except the abstract property of, say, jaggedness or roundness. So what we are seeing here is the rudiments—and perhaps relics of the origins—of the process called abstraction that we humans excel at, namely, the ability to extract the common denominator between entities that are otherwise utterly dissimilar. From being able to extract the jaggedness of the broken glass shape and the sound kiki to seeing the “fiveness” of five pigs, five donkeys, or five chirps may have been a short step in evolution but a giant step for humankind.
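The kind of cross-modal abstraction described here can be caricatured in a few lines of code. The sketch below is a toy illustration only, not a neural model: the signals, function name, and numbers are all invented for the purpose. It applies one and the same “jaggedness” detector to a visual contour and to a sound waveform, showing how a single abstract property can be read off two utterly dissimilar inputs.

```python
# Toy illustration of cross-modal abstraction (not a neural model).
# One detector, blind to modality, extracts the same abstract property
# -- jaggedness -- from a visual contour and from a sound waveform.
# All signals and names here are invented for illustration.

def jaggedness(samples):
    """Fraction of direction reversals: high for spiky signals, low for smooth."""
    diffs = [samples[i + 1] - samples[i] for i in range(len(samples) - 1)]
    flips = sum(1 for a, b in zip(diffs, diffs[1:]) if a * b < 0)
    return flips / max(len(diffs) - 1, 1)

# A spiky "kiki" outline vs. a smooth "bouba" outline (heights along a shape)
kiki_contour  = [0, 5, 0, 5, 0, 5, 0, 5, 0]
bouba_contour = [0, 1, 2, 3, 4, 3, 2, 1, 0]

# A harsh, crackly waveform vs. a rounded one (toy amplitude samples)
kiki_sound  = [0, 3, -3, 3, -3, 3, -3, 3, 0]
bouba_sound = [0, 1, 2, 2, 1, 0, -1, -2, 0]

# The same detector, applied to vision and to sound, gives the same verdict:
print(jaggedness(kiki_contour), jaggedness(kiki_sound))    # both high
print(jaggedness(bouba_contour), jaggedness(bouba_sound))  # both low
```

The point of the caricature is that the detector knows nothing about retinas or cochleas; it measures only direction reversals, the one property the two signals share, which is exactly the sense in which “jaggedness” is an abstraction over modalities.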

I HAVE ARGUED, so far, that the bouba-kiki effect may have fueled the emergence of protowords and a rudimentary lexicon. This was an important step, but language isn’t just words. There are two other important aspects to consider: syntax and semantics. How are these represented in the brain and how did they evolve? The fact that these two functions are at least partially autonomous is well illustrated by Broca’s and Wernicke’s aphasias. As we have seen, a patient with the latter syndrome produces elaborate, smoothly articulated, grammatically flawless sentences that convey no meaning whatsoever. The Chomskian “syntax box” in the intact Broca’s area goes “open loop” and produces well-formed sentences, but without Wernicke’s area to inform it with cultivated content, the sentences are gibberish. It’s as though Broca’s area on its own can juggle the words with the correct rules of grammar—just like a computer program might—without any awareness of meaning. (Whether it is capable of more complex rules such as recursion remains to be seen; it’s something we are currently studying.)

We’ll come back to syntax, but first let’s look at semantics (again, roughly speaking, the meaning of a sentence). What exactly is meaning? It’s a word that conceals vast depths of ignorance. Although we know that Wernicke’s area and parts of the temporo-parieto-occipital (TPO) junction, including the angular gyrus (Figure 6.2), are critically involved, we have no idea how neurons in these areas actually do their job. Indeed, the manner in which neural circuitry embodies meaning is one of the great unsolved mysteries of neuroscience. But if you allow that abstraction is an important step in the genesis of meaning, then our bouba-kiki example might once again provide the clue. As already noted, the sound kiki and the jagged drawing would seem to have nothing in common. One is a one-dimensional, time-varying pattern on the sound receptors in your ear, whereas the other is a two-dimensional pattern of light arriving on your retina all in one instant. Yet your brain has no difficulty in abstracting the property of jaggedness from both signals. As we have seen, there are strong hints that the angular gyrus is involved in this remarkable ability we call cross-modal abstraction.

FIGURE 6.2 A schematic depiction of resonance between brain areas that may have accelerated the evolution of protolanguage. Abbreviations: B, Broca’s area (for speech and syntactic structure). A, auditory cortex (hearing). W, Wernicke’s area for language comprehension (semantics). AG, angular gyrus for cross-modal abstraction. H, hand area of the motor cortex, which sends motor commands to the hand (compare with Penfield’s sensory cortical map in Figure 1.2). F, face area of the motor cortex (which sends command messages to the facial muscles, including lips and tongue). IT, the inferotemporal cortex/fusiform area, which represents visual shapes. Arrows depict two-way interactions that may have emerged in human evolution: 1, connections between the

You are reading The Tell-Tale Brain