Computer-programming languages admit no such imprecision; they have ways of representing formal quantifiers (DO THIS THING REPEATEDLY UNTIL EVERY DATABASE RECORD HAS BEEN EXAMINED) but no way of expressing generics at all. Human languages are idiosyncratic — and verging on redundant — inasmuch as they routinely exploit both systems, generics and the more formal quantifiers.
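To make the contrast concrete, here is a minimal sketch in Python (the records and the 'lays_eggs' field are invented for illustration): the formal quantifiers 'all' and 'some' are easy to express, while the exception-tolerating generic has no direct counterpart.

    # Formal quantifier: examine every database record, without exception.
    records = [
        {"name": "hen", "lays_eggs": True},
        {"name": "drake", "lays_eggs": False},
    ]

    # "ALL records lay eggs" is simply true or false, no exceptions allowed.
    all_lay_eggs = all(r["lays_eggs"] for r in records)   # False

    # "SOME record lays eggs" is the other formal quantifier.
    some_lay_eggs = any(r["lays_eggs"] for r in records)  # True

    # The generic "Ducks lay eggs" -- which speakers accept even though
    # male ducks never do -- has no built-in representation at all.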

Why do we have both systems? Sarah-Jane Leslie, a young Princeton philosopher, has suggested one possible answer. The split between generics and quantifiers may reflect the divide in our reasoning capacity, between a sort of fast, automatic system on the one hand and a more formal, deliberative system on the other. Formal quantifiers rely on our deliberative system (which, when we are being careful, allows us to reason logically), while generics draw on our ancestral reflexive system. Generics are, she argues, essentially a linguistic realization of our older, less formal cognitive systems. Intriguingly, our sense of generics is 'loose' in a second way: we are prepared to accept as true generics like Sharks attack bathers or Pit bulls maul children even though the circumstances they describe are statistically very rare, provided that they are vivid or salient — just the kind of response we might expect from our automatic, less deliberative system.

Leslie further suggests that generics seem to be learned first in childhood, before formal quantifiers; moreover, they may have emerged earlier in the development of language. At least one contemporary language (Pirahã, spoken in the Amazon Basin) appears to employ generics but not formal quantifiers. All of this suggests one more way in which the particular details of human languages depend on the idiosyncrasies of how our mind evolved.

For all that, I doubt many linguists would be convinced that language is truly a kluge. Words are one thing, sentences another; even if words are clumsy, what linguists really want to know about is syntax, the glue that binds words together. Could it be that words are a mess, but grammar is different, a 'near-perfect' or 'optimal' system for connecting sound and meaning?

In the past several years, Noam Chomsky, the founder and leader of modern linguistics, has taken to arguing just that. In particular, Chomsky has wondered aloud whether language (by which he means mainly the syntax of sentences) might come close 'to what some super-engineer would construct, given the conditions that the language faculty must satisfy.' As linguists like Tom Wasow and Shalom Lappin have pointed out, there is considerable ambiguity in Chomsky's suggestion. What would it mean for a language to be perfect or optimal? That one could express anything one might wish to say? That language is the most efficient possible means for obtaining what one wants? Or that language was the most logical system for communication anyone could possibly imagine? It's hard to see how language, as it now stands, can lay claim to such grand credentials. The ambiguity of language, for example, seems unnecessary (as computers have shown), and language works in ways neither logical nor efficient (just think of how much extra effort is often required in order to clarify what our words mean). If language were a perfect vehicle for communication, infinitely efficient and expressive, I don't think we would so often need 'paralinguistic' information, like that provided by gestures, to get our meaning across.

As it turns out, Chomsky actually has something different in mind. He certainly doesn't think language is a perfect tool for communication; to the contrary, he has argued that it is a mistake to think of language as having evolved 'for' the purposes of communication at all. Rather, when Chomsky says that language is nearly optimal, he seems to mean that its formal structure is surprisingly elegant, in the same sense that string theory is. Just as string theorists conjecture that the complexity of physics can be captured by a small set of basic laws, Chomsky has, since the early 1990s, been trying to capture what he sees as the superficial complexity of language with a small set of laws.* Building on that idea, Chomsky and his collaborators have gone so far as to suggest that language might be a kind of 'optimal solution . . . [to] the problem of linking the sensory-motor and conceptual-intentional systems' (or, roughly, connecting sound and meaning). They suggest that language, despite its obvious complexity, might have required only a single evolutionary advance beyond our inheritance from ancestral primates, namely, the introduction of a device known as 'recursion.'

Recursion is a way of building larger structures out of smaller structures. Like mathematics, language is a potentially infinite system. Just as you can always make a number bigger by adding one (a trillion plus one, a googolplex plus one, and so forth), you can always make a sentence longer by adding a new clause. My favorite example comes from Maxwell Smart on the old Mel Brooks TV show Get Smart: 'Would you believe that I know that you know that I know that you know where the bomb is hidden?' Each additional clause requires another round of recursion.
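A minimal sketch in Python (the function and its wording are mine, not the book's) shows how each round of recursion wraps the sentence in one more clause:

    def embed(clause, speakers):
        # Base case: no speakers left, so return the clause as it stands.
        if not speakers:
            return clause
        # Recursive step: the first speaker embeds everything that follows
        # as a single "that ..." clause -- one more round of recursion.
        return speakers[0] + " that " + embed(clause, speakers[1:])

    print("Would you believe that "
          + embed("you know where the bomb is hidden",
                  ["I know", "you know", "I know"])
          + "?")
    # Would you believe that I know that you know that I know
    # that you know where the bomb is hidden?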

There's no doubt that recursion — or something like it — is central to human language. The fact that we can put together one small bit of structure (the man) with another (who went up the hill) to form a more complex bit of structure (the man who went up the hill) allows us to create arbitrarily complex sentences with terrific precision (The man with the gun is the man who went up the hill, not the man who drove the getaway car).

* Although I have long been a huge fan of Chomsky's contributions to linguistics, I have serious reservations about this particular line of work. I'm not sure that elegance really works in physics (see Lee Smolin's recent book The Trouble with Physics), and in any case, what works for physics may well not work for linguistics. Language, after all, is a property of biology — the biology of the human brain — and as the late Francis Crick once put it, 'In physics, they have laws; in biology, we have gadgets.' So far as we know, the laws of physics have never changed, from the moment of the big bang onward, whereas the details of biology are constantly in flux, evolving as climates, predators, and resources change. As we have seen so many times, evolution is often more about alighting on something that happens to work than about what might in principle work best or most elegantly; it would be surprising if language, among evolution's most recent innovations, were any different.

Chomsky and his colleagues have even suggested that recursion might be 'the only uniquely human component of the faculty of language.'

A number of scholars have been highly critical of that radical idea. Steven Pinker and the linguist Ray Jackendoff have argued that recursion might actually be found in other aspects of the mind (such as the process by which we recognize complex objects as being composed of recognizable subparts). The primatologist David Premack, meanwhile, has suggested that although recursion is a hallmark of human language, it is scarcely the only thing separating human language from other forms of communication. As Premack has noted, it's not as if chimpanzees can speak an otherwise humanlike language that lacks recursion (which might consist of language minus complexities such as embedded clauses).* I'd like to go even further, though, and take what we've learned about the nature of evolution and humans to turn the whole argument on its head.

The sticking point is what linguists call syntactic trees, diagrams like this:
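The diagram itself does not survive in this copy; a labelled bracketing of a simple sentence (my own stand-in, not the original figure) conveys the same nested structure:

    [S [NP [Det the] [N man]]
       [VP [V went]
           [PP [P up] [NP [Det the] [N hill]]]]]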

*In a hypothetical recursion-free language, you might, for example, be able to say 'Give me the fruit' and 'The fruit is on the tree,' but not the more complex expression 'Give me the fruit that is hanging on the tree that is missing a branch.' The words 'that is hanging on the tree that is missing a branch' represent an embedded clause itself containing an embedded clause.

Small elements can be combined to form larger elements, which in turn can be combined into still larger elements. There's no problem in principle with building such things — computers use trees, for example, in representing the directory (or 'folder') structures on a hard drive.
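A small Python sketch (the folder names are invented) shows the same trick a file system uses, with smaller elements nested inside larger ones to any depth:

    # A directory tree as nested dictionaries: each folder may contain
    # further folders, giving the same small-inside-large structure
    # that a syntactic tree has.
    hard_drive = {
        "Documents": {
            "Taxes": {"return.pdf": None},
            "letter.txt": None,
        },
        "Photos": {"beach.jpg": None},
    }

    def depth(tree):
        # Recursively measure how deeply the folders are nested.
        if not isinstance(tree, dict) or not tree:
            return 0
        return 1 + max(depth(child) for child in tree.values())

    print(depth(hard_drive))  # 3 levels of nesting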

But, as we have seen time and again, what is natural for computers isn't always natural for the human brain: building a tree would require a precision in memory that humans just don't appear to have. Building a tree structure
