Computer-programming languages admit no such imprecision; they have ways of representing formal quantifiers ([DO THIS THING REPEATEDLY UNTIL EVERY DATABASE RECORD HAS BEEN EXAMINED]) but no way of expressing generics at all. Human languages are idiosyncratic — and verging on redundant — inasmuch as they routinely exploit both systems, generics and the more formal quantifiers.
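To make the contrast concrete, here is a small Python sketch; the records and field names are invented purely for illustration. Formal quantifiers such as "all" and "some" are easy to write as code, but there is no construct for a generic claim.

```python
# "Do this thing repeatedly until every database record has been examined":
# a formal quantifier, spelled out as a loop over a (made-up) set of records.
records = [
    {"animal": "duck", "lays_eggs": True},
    {"animal": "drake", "lays_eggs": False},
]

every_one_lays_eggs = True
for record in records:
    if not record["lays_eggs"]:
        every_one_lays_eggs = False  # a single counterexample falsifies "all"
        break

# The same quantifiers written with built-in constructs:
all_lay = all(r["lays_eggs"] for r in records)   # universal: "for ALL records ..."
some_lay = any(r["lays_eggs"] for r in records)  # existential: "there EXISTS a record ..."

# What the language cannot express is the generic "ducks lay eggs": a claim we
# accept as true even though it holds for no drake and not even for every duck.
```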
Why do we have both systems? Sarah-Jane Leslie, a young Princeton philosopher, has suggested one possible answer. The split between generics and quantifiers may reflect the divide in our reasoning capacity, between a sort of fast, automatic system on the one hand and a more formal, deliberative system on the other. Formal quantifiers rely on our deliberative system (which, when we are being careful, allows us to reason logically), while generics draw on our ancestral reflexive system. Generics are, she argues, essentially a linguistic realization of our older, less formal cognitive systems. Intriguingly, our sense of generics is 'loose' in a second way: we are prepared to accept as true generics like 'Mosquitoes carry the West Nile virus,' even though only a tiny fraction of mosquitoes actually carry it.
Leslie further suggests that generics seem to be learned first in childhood, before formal quantifiers; moreover, they may have emerged earlier in the development of language. At least one contemporary language (Pirahã, spoken in the Amazon Basin) appears to employ generics but not formal quantifiers. All of this suggests one more way in which the particular details of human languages depend on the idiosyncrasies of how our mind evolved.
For all that, I doubt many linguists would be convinced that language is truly a kluge. Words are one thing, sentences another; even if words are clumsy, what linguists really want to know about is syntax, the machinery by which words are assembled into sentences. Might that machinery, at least, come close to perfection?
In the past several years, Noam Chomsky, the founder and leader of modern linguistics, has taken to arguing just that. In particular, Chomsky has wondered aloud whether language (by which he means mainly the syntax of sentences) might come close 'to what some super-engineer would construct, given the conditions that the language faculty must satisfy.' As linguists like Tom Wasow and Shalom Lappin have pointed out, there is considerable ambiguity in Chomsky's suggestion. What would it mean for a language to be perfect or optimal? That one could express anything one might wish to say? That language is the most efficient possible means for obtaining what one wants? Or that language is the most logical system for communication anyone could possibly imagine? It's hard to see how language, as it now stands, can lay claim to such grand credentials. The ambiguity of language, for example, seems unnecessary (as computers have shown), and language works in ways neither logical nor efficient (just think of how much extra effort is often required to clarify what our words mean). If language were a perfect vehicle for communication, infinitely efficient and expressive, I don't think we would so often need 'paralinguistic' information, like that provided by gestures, to get our meaning across.
As it turns out, Chomsky actually has something different in mind. He certainly doesn't think language is a perfect tool for communication; to the contrary, he has argued that it is a mistake to think of language as having evolved 'for' the purposes of communication at all. Rather, when Chomsky says that language is nearly optimal, he seems to mean that its formal structure is surprisingly elegant, in something like the way physicists hope their deepest theories will turn out to be elegant.*
Recursion is a way of building larger structures out of smaller structures. Like mathematics, language is a potentially infinite system. Just as you can always make a number bigger by adding one (a trillion plus one, a googolplex plus one, and so forth), you can always make a sentence longer by adding a new clause. My favorite example comes from Maxwell Smart on the old Mel Brooks TV show Get Smart.
There's no doubt that recursion, the embedding of structures inside other structures of the same kind, is a striking feature of human language. Chomsky, together with Marc Hauser and Tecumseh Fitch, has gone further, suggesting that recursion may be the only component of the language faculty that is uniquely human.
* Although I have long been a huge fan of Chomsky's contributions to linguistics, I have serious reservations about this particular line of work. I'm not sure that elegance really works in physics (see Lee Smolin's recent book The Trouble with Physics), and I'm even less sure that it works in biology.
A number of scholars have been highly critical of that radical idea. Steven Pinker and the linguist Ray Jackendoff have argued that recursion might actually be found in other aspects of the mind (such as the process by which we recognize complex objects as being composed of recognizable subparts). The primatologist David Premack, meanwhile, has suggested that although recursion is a hallmark of human language, it is scarcely the only thing that distinguishes human cognition from that of other animals.
The sticking point is what linguists call syntactic trees: branching diagrams that show how a sentence is built up from phrases nested inside other phrases.
* In a hypothetical recursion-free language, you might, for example, be able to say 'Give me the fruit' and 'The fruit is on the tree,' but not the more complex expression 'Give me the fruit that is hanging on the tree that is missing a branch.' The words 'that is hanging on the tree that is missing a branch' represent an embedded clause itself containing an embedded clause.
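The self-embedding in the footnote's example is easy to mimic with a recursive procedure. Here is a minimal sketch in Python; the miniature grammar is invented purely for illustration and stands in for no real linguistic theory.

```python
# Recursion: a rule that can invoke itself builds ever-larger structures,
# so there is no longest sentence -- you can always embed one more clause.

def noun_phrase(depth: int) -> str:
    """Build a noun phrase containing `depth` embedded relative clauses."""
    if depth == 0:
        return "the tree that is missing a branch"  # base case: stop embedding
    # Recursive case: a noun phrase whose relative clause contains
    # another noun phrase built by the very same rule.
    return f"the fruit that is hanging on {noun_phrase(depth - 1)}"

for d in range(3):
    print(f"Give me {noun_phrase(d)}.")
# Give me the tree that is missing a branch.
# Give me the fruit that is hanging on the tree that is missing a branch.
# Give me the fruit that is hanging on the fruit that is hanging on the tree that is missing a branch.
```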
Small elements can be combined to form larger elements, which in turn can be combined into still larger elements. There's no problem getting a computer to build and store such trees; for a machine, they are perfectly natural.
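As a rough sketch of what that looks like inside a machine, here is a tiny Python example; the node labels and the miniature constituent structure are invented for illustration and are not any particular parser's output.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node of a syntactic tree: a label plus its smaller constituents."""
    label: str
    children: list = field(default_factory=list)

# "Give me the fruit" as a simplified, made-up constituent structure:
# a sentence containing a verb phrase, which contains noun phrases, and so on.
sentence = Node("S", [
    Node("VP", [
        Node("V", [Node("Give")]),
        Node("NP", [Node("Pron", [Node("me")])]),
        Node("NP", [Node("Det", [Node("the")]), Node("N", [Node("fruit")])]),
    ]),
])

def show(node: Node, indent: int = 0) -> None:
    """Print the tree, with indentation standing in for the branches of a diagram."""
    print("  " * indent + node.label)
    for child in node.children:
        show(child, indent + 1)

show(sentence)
```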
But, as we have seen time and again, what is natural for computers isn't always natural for the human brain: building a tree would require a precision in memory that humans just don't appear to have. Building a tree structure means keeping exact track of which element attaches to which, and that is precisely the sort of location-based bookkeeping that our contextually driven memory is poorly equipped to do.