comeback. Every primary and caucus after that, we convinced ourselves we still had it. As the weeks went by, as the sinking feeling got stronger that we would lose to John Kerry, we got hungrier and hungrier for any poll that would give us even a slim chance of winning.
If, a month later, you had polled the staff to ask who would win the Wisconsin primary—our line in the sand—we’d have told you it was Howard Dean. And we’d believe, out of desperation, anything that told us we were right.
We came in third.
Reality Dysmorphia
In eating disorder treatment centers, a physician will often ask the patient to draw an outline of her own body on a large chalkboard. Then the doctor asks the patient to stand with her back against the wall and traces the actual outline of her body. For many patients, the outlines they draw are quite exaggerated, sometimes twice as large as their actual bodies.
It’s a phenomenon called body dysmorphia: an inability to see your own body as it really is.
During the Dean campaign, the delusion that resulted from my poor information diet was a cognitive version of this disease: reality dysmorphia. I haven’t met a single campaign operative here in Washington, D.C., on either side, who didn’t have at least a mild case of it.
This kind of delusion comes from psychological phenomena like heuristics.
It turns out our brains are remarkable energy consumers. Though it typically represents only 2% of the human body’s weight, the brain consumes about 20% of the body’s energy.[40] As a result, we’ve evolved to rely on shortcuts, both to conserve the brain’s energy and to aid our social survival, so that we can handle more complex thoughts.
Think of a heuristic as one of those shortcuts: a rule of thumb the brain uses to reach a judgment quickly instead of reasoning through every option from scratch.
Heuristics have a dark side, though: they give us unconscious biases toward whatever we’re already familiar with, and they lead us to keep doing what we’ve always done rather than try something new that might be more efficient.
They cause us to make logical leaps that take us to false conclusions. For instance, these mental shortcuts underpin our capacity for racism, sexism, and other forms of discrimination.
One such nefarious heuristic is called motivated reasoning: the tendency to process new information in whatever way protects the conclusions we already hold.
In 2005, Emory University professor Drew Westen and his colleagues recruited 15 self-described strong Democrats and 15 strong Republicans for a sophisticated test. They used a functional magnetic resonance imaging (fMRI) machine to study how partisan voters reacted to negative remarks about their party or candidate. Westen and his colleagues found that when these subjects processed “emotionally threatening information” about their preferred candidates, the parts of the brain associated with reasoning shut down and the parts responsible for emotions flared up.[41] Westen’s research indicates that once we grow biased enough, we lose our capacity to change our minds.
Following Westen’s study, social scientists Brendan Nyhan and Jason Reifler conducted a new test,[42] and discovered what they believe is a “backfire effect.”
Nyhan and Reifler provided the subjects with sample articles in which President Bush claimed that tax cuts would create so much economic growth that they would actually increase government revenues. The same articles included corrective statements from the 2003 Economic Report of the President and various other official sources explaining that this was implausible. The researchers then showed the students the actual tax revenues as a proportion of GDP, which had fallen after the cuts took effect.
The results were fascinating: after reading the article, the conservatives in the study were still more inclined to believe that tax cuts increase government revenues; if anything, the correction strengthened that belief rather than weakening it.
We already know that things like confirmation bias make us seek out information that we agree with. But it’s also the case that once we’re entrenched in a belief, the facts will not change our minds.
Politics is the area in which scientists have studied the psychological causes of bias the most. It’s easy to get people to self-identify, and universities tend to have more of an interest in political science than in other realms of social studies. But you can also see the results of this kind of bias in areas other than politics: talk to a Red Sox fan about whether or not the Yankees are the best team in baseball’s history, and you’ll see strong bias come out. Talk to MacBook owners about the latest version of Windows and you may see the same phenomenon.
We’ve likely evolved this way because it’s safer. Forming a heuristic means survival: watching your caveman friend eat some berries and die doesn’t make you want to conduct a test to see if those berries kill people. It makes you want to not eat those berries anymore, and to tell your friends not to eat those berries either.
Cognitive scientists Hugo Mercier and Dan Sperber took reasoning and turned it on its head. After all, if all the evidence around reasoning shows that we’re actually pretty bad at using it to make better choices, then maybe that’s not reason’s primary function. In their paper “Why do humans reason?,”[43] they argue instead that “reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.” Mercier and Sperber argue that our minds may have evolved to value persuasion over truth. It certainly is plausible—human beings are social animals, and persuasion is a form of social power.
The seeds of opinion can be dangerous things. Once we begin to be persuaded of something, we not only seek out confirmation for it, but we also reject contrary facts even in the face of incontrovertible evidence. With confirmation bias and Nyhan and Reifler’s backfire effect in full force, we find ourselves both addicted to more information and vulnerable to misinformation for the sake of our egos.
This MSNBC Is Going Straight to My Amygdala
Neuroscience is the new outer space. It’s a vacuum of promise and fantasy waiting to be filled with science and data. There’s no greater, no more mysterious, no more misunderstood organ in our bodies than the brain. If one weighed the pages of mythology written about the brain against the pages of every scientific paper ever written about it, the scale would likely tip towards myth.
The fields of psychology and neuroscience are filled with misinformation, disagreement, untested hypotheses, and the occasional consensus-based, verifiable, and repeatably tested theory. And so it’s a struggle for me: on one hand, I’m preaching about information diets, but—in trying to synthesize my own research in the field—I run the risk of accidentally feeding you junk information myself. On the other hand, so much of both fields is applicable to an information diet that it’s impossible not to draw on them.
Banting had an advantage over me. When he wrote his