the less-than-objective ancestral system. We can reason as carefully as we like, but, as they say in computer science jargon, 'garbage in, garbage out.' There's no guarantee that the ancestral system will pass along a balanced set of data. Worse, when we are stressed, tired, or distracted, our deliberative system tends to be the first thing to go, leaving us at the mercy of our lower-tech reflexive system — just when we might need our deliberative system the most.
The unconscious influence of our ancestral system is so strong that when our conscious mind tries to get control of the situation, the effort sometimes backfires. For example, in one study, people were put under time pressure and asked to make rapid judgments. Those who were told to (deliberately) suppress sexist thoughts (themselves presumably the product of the ancestral reflexive system) actually became
No matter what we humans think about, we tend to pay more attention to stuff that fits in with our beliefs than stuff that might challenge them. Psychologists call this 'confirmation bias.' When we have embraced a theory, large or small, we tend to be better at noticing evidence that supports it than evidence that might run counter to it.
Consider the quasi-astrological description that opened this chapter. A person who wants to believe in astrology might notice the parts that seem true ('you have a need for other people to like and admire you') and ignore the parts that don't (maybe from the outside you don't really look so disciplined after all). A person who wishes to believe in horoscopes may notice the one time that their reading seems dead-on and ignore (or rationalize) the thousands of times when their horoscopes are worded so ambiguously that they could mean anything. That's confirmation bias.
Take, for example, an early experiment conducted by the British psychologist Peter Wason. Wason presented his subjects with a triplet of distinct numbers (for example, 2-4-6) and asked them to guess what rule might have generated it. Subjects were then asked to create new sequences and received feedback as to whether their new sequences conformed to the rule. A typical subject might guess '4-6-8,' be told yes, and proceed to try '8-10-12' and again be told yes; the subject might then conclude that the rule was something like 'sequences of three even numbers with two added each time.' What most people failed to do, however, was consider potentially
In a later, less benign study, two different groups of people saw a videotape of a child taking an academic test. One group of viewers was led to believe that the child came from a socioeconomically privileged background, the other to believe that the child came from a socioeconomically impoverished background. Those who thought the child was wealthier reported that the child was doing well and performing above grade level; the other group guessed that the child was performing below grade level.
Confirmation bias might be an inevitable consequence of contextually driven memory. Because we retrieve memory not by systematically searching for all relevant data (as computers do) but by finding things that
To consider something
The same, of course, goes for scientists. The aim of science is to take a balanced approach to evidence, but
In 1913 Eleanor Porter wrote one of the more influential children's novels of the twentieth century,
Consider the following study, conducted by the late Ziva Kunda. A group of subjects comes into the lab. They are told they'll be playing a trivia game; before they play, they get to watch someone else, who, they are told, will play either on their team (half the subjects hear this) or on the opposite team (that's what the other half are told). Unbeknownst to the subjects, the game is rigged; the person they're watching proceeds to play a perfect game, getting every question right. The researchers want to know whether each subject is impressed by this. The result is straight out of
In a similar study, a bunch of college students viewed videos of three people having a conversation; they were asked to judge how likable each of the three was. The subjects were also told (prior to watching the video) that they would be going out on a date with one of those three people (selected at random for each subject). Inevitably, subjects tended to give their highest rating to the person they were told they would be dating — another illustration of how easily our beliefs (in this case, about someone's likability) can be contaminated by what we
Our tendency to accept what we wish to believe (what we are motivated to believe) with much less scrutiny than what we don't want to believe is a bias known as 'motivated reasoning,' a kind of flip side to confirmation bias. Whereas confirmation bias is an automatic tendency to notice data that fit with our beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more carefully if we don't like them than if we do. Take, for example, a study in which Kunda asked subjects, half men, half women, to read an article claiming that caffeine