The trouble kicks in when we start to believe things that we don’t directly observe. And in the modern world, much of what we believe is not directly or readily observable. Our capacity to acquire new beliefs vicariously — from friends, teachers, or the media, without direct experience — is a key to what allows humans to build cultures and technologies of fabulous complexity. My canine friend Ari learns whatever he learns primarily through trial and error; I learn what I learn mainly through books, magazines, and the Internet. I may cast some skepticism on what I read. (Did journalist-investigator Seymour Hersh really have a well-placed, anonymous source? Did movie reviewer Anthony Lane really even see Clerks II?) But largely, for better or worse, I tend to believe what I read, and I learn much of what I know through that medium. Ari (also for better or worse) knows only what he sees, hears, feels, tastes, or smells.

In the early 1990s, the psychologist Daniel Gilbert, now well known for his work on happiness, tested a theory that he traced back to the seventeenth-century philosopher Baruch de Spinoza. Spinoza’s idea was that “all information is [initially] accepted during comprehension and… false information… [is] unaccepted [only later].” As a test of Spinoza’s hypothesis, Gilbert presented subjects with true and false propositions — sometimes interrupting them with a brief, distracting tone (which required them to press a button). Just as Spinoza might have predicted, interruptions increased the chance that subjects would believe the false propositions;[20] other studies showed that people are more likely to accept falsehoods if they are distracted or put under time pressure. The ideas we encounter are, other things being equal, automatically believed — unless and until there is a chance to properly evaluate them.

This difference in order (between hearing, accepting, and evaluating versus hearing, evaluating, and then accepting) might initially seem trivial, but it has serious consequences. Take, for example, a case that was recently described on Ira Glass’s weekly radio show This American Life. A lifelong political activist who was the leading candidate for chair of New Hampshire’s Democratic Party was accused of possessing substantial amounts of child pornography. Even though his accuser, a Republican state representative, offered no proof, the accused was forced to step down, his political career essentially ruined. A two-month investigation ultimately found no evidence, but the damage was done — our legal system may be designed around the principle of “innocent until proven guilty,” but our mind is not.

Indeed, as every good lawyer knows intuitively, just asking about some possibility can increase the chance that someone will believe it. (“Isn’t it true you’ve been reading pornographic magazines since you were twelve?” “Objection — irrelevant!”) Experimental evidence bears this out: merely hearing something in the form of a question — rather than a declarative statement — is often enough to induce belief.

Why do we humans so often accept uncritically what we hear? Because of the way in which belief evolved: from machinery first used in the service of perception. And in perception, a high percentage of what we see is true (or at least it was before the era of television and Photoshop). When we see something, it’s usually safe to believe it. The cycle of belief works in the same way — we gather some bit of information, either directly through our senses or, perhaps more often, indirectly through language and communication. Either way, we tend to immediately believe it and only later, if at all, consider its veracity.

The trouble with extending this “Shoot first, ask questions later” approach to belief is that the linguistic world is much less trustworthy than the visual world. If something looks like a duck and quacks like a duck, we are licensed to think it’s a duck. But if some guy in a trenchcoat tells us he wants to sell us a duck, that’s a different story. Especially in this era of blogs, focus groups, and spin doctors, language is not always a reliable source of truth. In an ideal world, the basic logic of perception (gather information, assume true, then evaluate if there is time) would be inverted for explicit, linguistically transmitted beliefs; but instead, as is often the case, evolution took the lazy way out, building belief out of a progressive overlay of technologies, consequences be damned. Our tendency to accept what we hear and read with far too little skepticism is but one more consequence.

Yogi Berra once said that 90 percent of the game of baseball was half mental; I say, 90 percent of what we believe is half cooked. Our beliefs are contaminated by the tricks of memory, by emotion, and by the vagaries of a perceptual system that really ought to be fully separate — not to mention a logic and inference system that is as yet, in the early twenty-first century, far from fully hatched.

The dictionary defines the act of believing both as “accepting something as true” and as “being of the opinion that something exists, especially when there is no absolute proof.” Is belief about what we know to be true or what we want to be true? That it is so often difficult for members of our species to tell the difference is a pointed reminder of our origins.

Evolved from creatures that were often forced to act rather than think, Homo sapiens simply never evolved a proper system for keeping track of what we know and how we’ve come to know it, uncontaminated by what we simply wish were so.

4. CHOICE

People behave sometimes as if they had two selves, one who wants clean lungs and long life and another who adores tobacco, one who yearns to improve himself by reading Adam Smith on self-command (in The Theory of Moral Sentiments) and another who would rather watch an old movie on television. The two are in continual contest for control.

— THOMAS SCHELLING

IN THE LATE 1960s and early 1970s, in the midst of the craze for the TV show Candid Camera (forerunner of YouTube, reality TV, and shows like America’s Funniest Home Videos), the psychologist Walter Mischel offered four-year-old preschoolers a choice: a marshmallow now, or two marshmallows if they could wait until he returned. And then, cruelly, he left them alone with nothing more than themselves, the single marshmallow, a hidden camera, and no indication of when he would return. A few of the kids ate the oh-so-tempting marshmallow the minute he left the room. But most kids wanted the bigger bonus and endeavored to wait. So they tried. Hard. But with nothing else to do in the room, the torture was visible. The kids did just about anything they could to distract themselves from the tempting marshmallow that stood before them: they talked to themselves, bounced up and down, covered their eyes, sat on their hands — strategies that more than a few adults might on occasion profitably adopt. Even so, for about half the kids, the 15 or 20 minutes until Mischel returned was just too long to wait.

Giving up after 15 minutes is a choice that could only really make sense under two circumstances: (1) the kids were so hungry that having the marshmallow now could stave off true starvation or (2) their prospects for a long and healthy life were so remote that the 20-minute future versions of themselves, which would get the two marshmallows, simply weren’t worth planning for. Barring these rather remote possibilities, the children who gave in were behaving in an entirely irrational fashion.

Toddlers, of course, aren’t the only humans who melt in the face of temptation. Teenagers often drive at speeds that would be unsafe even on the autobahn, and people of all ages have been known to engage in unprotected sex with strangers, even when they are aware of the risks. The preschoolers’ marshmallows have a counterpart in my raspberry cheesecake, which I know I’ll regret later but nevertheless want desperately now. If you ask people whether they’d rather have a certified check for $100 that they can cash now, or a check for twice as much that they can’t cash for three years, more than half will take the $100 now. (Curiously — and I will come back to this later — most people’s preferences reverse when the time horizon is lengthened: $200 in nine years is preferred to $100 in six years.) Then there are the daily uncontrollable choices made by alcoholics, drug addicts, and compulsive gamblers. Not to mention the Rhode Island convict who attempted to escape from jail on day 89 of a 90-day prison sentence.
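
That reversal is easiest to see with a quick calculation. What follows is a minimal sketch in Python, not anything taken from the original studies: it assumes the hyperbolic discount curve (value = amount / (1 + k × delay)) that behavioral economists commonly use to model impulsive choice, with an illustrative discount rate of k = 1 per year.

    # Minimal sketch of hyperbolic discounting; the rate k = 1.0 per year is
    # an illustrative assumption, not a figure from the text.
    def hyperbolic_value(amount, delay_years, k=1.0):
        """Subjective value of a delayed reward under hyperbolic discounting."""
        return amount / (1 + k * delay_years)

    # Near-term choice: $100 now versus $200 in three years.
    now_100 = hyperbolic_value(100, 0)    # 100.0
    soon_200 = hyperbolic_value(200, 3)   # 50.0  -> the $100 now wins

    # The same choice pushed six years out: $100 in six years versus $200 in nine.
    far_100 = hyperbolic_value(100, 6)    # ~14.3
    far_200 = hyperbolic_value(200, 9)    # 20.0  -> now the $200 wins

    print(now_100 > soon_200, far_200 > far_100)  # True True: the preference flips

With a constant exponential discount rate, by contrast, pushing both options six years further into the future multiplies both values by the same factor, so the ranking could never flip; the flip is the signature of the hyperbolic curve, which discounts the near future far more steeply than the distant future.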

Collectively, the tendencies I just described exemplify what philosophers call “weakness of the will,” and they are our first hint that the brain mechanisms that govern our everyday choices might be just as klugey as those that govern memory and belief.

Wikipedia defines Homo economicus, or Economic man, as the assumption, popular in many economic theories, that man is “a rational and self-interested actor who desires wealth, avoids
