straight out of Pollyanna: people who expect to play with the perfect-game-playing confederate are impressed; the guy must be great, they think. People who expect to play against the confederate are dismissive; they attribute his good performance to luck rather than skill. Same data, different interpretation: both groups of subjects observe someone play a perfect game, but what they make of that observation depends on the role they expect the observed man to play in their own life.

In a similar study, a group of college students viewed videos of three people having a conversation; they were asked to judge how likable each of the three was. The subjects were also told (prior to watching the video) that they would be going out on a date with one of those three people (selected at random for each subject). Inevitably, subjects tended to give their highest rating to the person they were told they would be dating — another illustration of how easily our beliefs (in this case, about someone’s likability) can be contaminated by what we wish to believe. In the words of a musical I loved as a child, Harry Nilsson’s The Point!, “You see what you want to see, and you hear what you want to hear. Dig?”

Our tendency to accept what we wish to believe (what we are motivated to believe) with much less scrutiny than what we don’t want to believe is a bias known as “motivated reasoning,” a kind of flip side to confirmation bias. Whereas confirmation bias is an automatic tendency to notice data that fit with our beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more carefully if we don’t like them than if we do. Take, for example, a study in which Kunda asked subjects, half men, half women, to read an article claiming that caffeine was risky for women. In line with the notion that our beliefs — and reasoning — are contaminated by motivation, women who were heavy caffeine drinkers were more likely to doubt the conclusion than were women who were light caffeine drinkers; meanwhile, men, who thought they had nothing at stake, exhibited no such effect.

The same thing happens all the time in the real world. Indeed, one of the first scientific illustrations of motivated reasoning was not a laboratory experiment but a clever bit of real-world fieldwork conducted in 1964, just after the publication of the first Surgeon General’s report on smoking and lung cancer. The Surgeon General’s conclusion — that smoking appears to cause lung cancer — would hardly seem like news today, but at the time it was a huge deal, covered widely by the media. Two enterprising scientists went out and interviewed people, asking them to evaluate the Surgeon General’s conclusion. Sure enough, smokers were less persuaded by the report than were nonsmokers, who pretty much accepted what the Surgeon General had to say. Smokers, meanwhile, came up with all kinds of dubious counterarguments: “many smokers live a long time” (which ignored the statistical evidence that was presented), “lots of things are hazardous” (a red herring), “smoking is better than excessive eating or drinking” (again irrelevant), or “smoking is better than being a nervous wreck” (an assertion that was typically not supported by any evidence).

The reality is that we are just not born to reason in balanced ways; even sophisticated undergraduates at elite universities tend to fall prey to this weakness. One famous study, for example, asked students at Stanford University to evaluate a set of studies on the effectiveness of capital punishment. Some of the students had prior beliefs in favor of capital punishment, some against. Students readily found holes in studies that challenged what they believed but often missed equally serious problems with studies that led to conclusions that they were predisposed to agree with.

Put the contamination of belief, confirmation bias, and motivated reasoning together, and you wind up with a species prepared to believe, well, just about anything. Historically, our species has believed in a flat earth (despite evidence to the contrary), ghosts, witches, astrology, animal spirits, and the benefits of self-flagellation and bloodletting. Most of those particular beliefs are, mercifully, gone today, but some people still pay hard-earned money for psychic readings and seances, and even I sometimes hesitate before walking under a ladder. Or, to take a political example, some 18 months after the 2003 invasion of Iraq, 58 percent of people who voted for George W. Bush still believed there were weapons of mass destruction in Iraq, despite the evidence to the contrary.

And then there is President George W. Bush himself, who reportedly believes that he has a personal and direct line of communication with an omniscient being. Which, as far as his getting elected was concerned, was a good thing; according to a February 2007 Pew Research Center survey, 63 percent of Americans would be reluctant to vote for anyone who doesn’t believe in God.

To critics like Sam Harris (author of the book The End of Faith), that sort of thing seems downright absurd:

To see how much our culture currently partakes of… irrationality… just substitute the name of your favorite Olympian for “God” wherever this word appears in public discourse. Imagine President Bush addressing the National Prayer Breakfast in these terms: “Behind all of life and all history there is a dedication and a purpose, set by the hand of a just and faithful Zeus.” Imagine his speech to Congress (September 20, 2001) containing the sentence “Freedom and fear, justice and cruelty have always been at war and we know that Apollo is not neutral between them.”

Religion in particular enjoys the sway that it does in part because people want it to be true; among other things, religion gives people a sense that the world is just and that hard work will be rewarded. Such faith provides a sense of purpose and belonging, in both the personal and the cosmic realms; there can be no doubt that the desire to believe contributes to the capacity to do so. But none of that explains how people manage to cling to religious beliefs despite the manifest lack of direct evidence.[18] For that we must turn to the fact that evolution left us with the capacity to fool ourselves into believing what we want to believe. (If we pray and something good happens, we notice it; if nothing happens, we fail to notice the non-coincidence.) Without motivated reasoning and confirmation bias, the world might be a very different place.

As one can see in the study of cigarette smokers, biased reasoning has at least one benefit. It can help protect our self-esteem. (Of course it’s not just smokers; I’ve seen scientists do much the same thing, nitpicking desperately at studies that challenge beliefs to which they’re attached.)

The trouble, of course, is that self-deception often costs us down the road. When we fool ourselves with motivated reasoning, we may hold on to beliefs that are misguided or even delusional. They can cause social friction (when we abruptly dismiss the views of others), they can lead to self-destruction (when smokers dismiss the risks of their habit), and they can lead to scientific blunders (when scientists refuse to recognize data challenging their theories).

When people in power indulge in motivated reasoning, dismissing important signs of their own error, the results can be catastrophic. Such was probably the case, for example, in one of the great blunders in modern military history, in the spring of 1944, when Hitler, on the advice of his leading field marshal, Gerd von Rundstedt, chose to protect Calais rather than Normandy, despite the prescient lobbying of a lesser-ranked general, Erwin Rommel. Von Rundstedt’s bad advice, born of undue attachment to his own plans, cost Hitler France, and possibly the entire Western Front.[19]

Why does motivated reasoning exist in the first place? Here, the problem is not one of evolutionary inertia but a simple lack of foresight. While evolution gave us the gift of deliberate reasoning, it lacked the vision to make sure we used it wisely: nothing forces us to be evenhanded because there was no one there to foresee the dangers inherent in pairing powerful tools of reasoning with the risky temptations of self-deception. In consequence, by leaving it up to our conscious self to decide how much to use our mechanism of deliberate reasoning, evolution freed us — for better or for worse — to be as biased as we want to be.

Even when we have little at stake, what we already know — or think we know — often further contaminates our capacity to reason and form new beliefs. Take, for example, the classic form of logic known as the syllogism: a formal deductive argument consisting of major premise, minor premise, and conclusion — as stylized as a sonnet:

All men are mortal.

Socrates was a man.

Therefore, Socrates was mortal.
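One way to make the syllogism’s rigid structure explicit (an illustrative rendering, not part of the original text) is to write it in the notation of first-order predicate logic, where the validity of the argument depends only on its form, not on what is said about Socrates:

```latex
\forall x\,\bigl(\mathrm{Man}(x) \rightarrow \mathrm{Mortal}(x)\bigr)
  && \text{(major premise)} \\
\mathrm{Man}(\mathrm{Socrates})
  && \text{(minor premise)} \\
\therefore\ \mathrm{Mortal}(\mathrm{Socrates})
  && \text{(conclusion)}
```

Written this way, the conclusion follows mechanically by instantiating the universal premise at Socrates and applying modus ponens, which is precisely why the form is so stylized.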
