explanation that we like and that sounds reasonable enough to believe. And when the story portrays us in a more glowing and positive light, so much the better.
Cheating Myself
In a commencement speech at Caltech in 1974, the physicist Richard Feynman told graduates, “The first principle is that you must not fool yourself—and you are the easiest person to fool.” As we have seen so far, we human beings are torn by a fundamental conflict—our deeply ingrained propensity to lie to ourselves and to others, and the desire to think of ourselves as good and honest people. So we justify our dishonesty by telling ourselves stories about why our actions are acceptable and sometimes even admirable. Indeed, we’re pretty skilled at pulling the wool over our own eyes.
Before we examine in more detail what makes us so good at weaving self-glorifying tales, allow me to tell you a little story about how I once (very happily) cheated myself. Quite a few years ago (when I was thirty), I decided that I needed to trade in my motorcycle for a car. I was trying to decide which car would be the perfect one for me. The Internet was just starting to boom with what I’ll politely call “decision aids,” and to my delight I found a website that provided advice for purchasing cars. The website was based on an interview procedure, and it presented me with a lot of questions that ranged from preferences for price and safety to what kind of headlights and brakes I wanted.
It took about twenty minutes to answer all the questions. Each time I completed a page of answers, I could see the progress bar indicating that I was that much closer to discovering my personalized dream car. I finished the final page of questions and eagerly clicked the “Submit” button. In just a few seconds I got my answer. What was the perfect car for me? According to this finely tuned website, the car for me was … drum roll, please … a Ford Taurus!
I confess that I don’t know much about cars. In fact, I know very little about them. But I certainly knew that I did not want a Ford Taurus.*
I’m not sure what you would do in such a situation, but I did what any creative person might do: I went back into the program and “fixed” my previous answers. From time to time I checked to see how different answers translated into different car recommendations. I kept this up until the program was kind enough to recommend a small convertible—surely the right car for me. I followed that sage advice, and that’s how I became the proud owner of a convertible (which, by the way, has served me loyally for many years).
This experience taught me that sometimes (perhaps often) we don’t make choices based on our explicit preferences. Instead, we have a gut feeling about what we want, and we go through a process of mental gymnastics, applying all kinds of justifications to manipulate the criteria. That way, we can get what we really want, but at the same time keep up the appearance—to ourselves and to others—that we are acting in accordance with our rational and well-reasoned preferences.
Coin Logic
If we accept that we frequently make decisions in this way, perhaps we can make the process of rationalization more efficient and less time-consuming. Here’s how: Imagine that you’re choosing between two digital cameras. Camera A has a nice zoom and a hefty battery, while camera B is lighter and has a snazzier shape. You’re not sure which one to get. You think that camera A is of better quality, but camera B will make you happier because you like how it looks. What should you do? Here is my advice: Pull a quarter out of your pocket and say to yourself, “Camera A is heads, camera B is tails.” Then toss the coin. If the coin comes up heads and camera A is the one you wanted, good for you, go buy it. But if you’re not happy with the outcome, start the process again, saying to yourself, “The next toss is for real.” Do this until the coin gives you tails. You’ll not only get camera B, which you really wanted all along, but you can justify your decision because you only followed the “advice” of the coin. (You could also replace the coin with your friends and consult them until one of them gives you the advice you want.)
Perhaps that was the real function of the car recommendation software I used to get my convertible. Maybe it was designed not only to help me make a better decision but to create a process that would allow me to justify the choice I really wanted to make. If that is the case, I think it would be useful to develop many more of these handy applications for many other areas of life.
The Liar’s Brain
Most of us think that some people are especially good (or bad) at deception. If this is indeed the case, what characteristics distinguish them? A team of researchers led by Yaling Yang (a postdoc at the University of California, Los Angeles) tried to answer this question by studying pathological liars—that is, people who lie compulsively and indiscriminately.
To find participants for their study, Yang and her colleagues went to a Los Angeles temporary employment agency. They figured that at least a few of those who were without permanent employment would have had difficulty holding a job because they were pathological liars. (Obviously, this doesn’t apply to all temps.)
The researchers then gave 108 job seekers a battery of psychological tests and conducted several one-on-one interviews with them, their coworkers, and their family members in order to identify major discrepancies that might reveal the pathological liars. In this group, they found twelve people who had pervasive inconsistencies in the stories they told about their work, schooling, crimes committed, and family background. They were also the same individuals who frequently engaged in malingering, or pretending that they were sick in order to get sickness benefits.
Next, the team put the twelve pathological liars—plus twenty-one people who were not pathological liars and were in the same pool of job seekers (the control group)—through a brain scanner to explore each person’s brain structure. The researchers focused on the prefrontal cortex, a part of the brain that sits just behind our foreheads and is considered to be in charge of higher-order thinking, such as planning our daily schedule and deciding how to deal with temptations around us. It’s also the part of the brain that we depend on for our moral judgments and decision making. In short, it’s a kind of control tower for thinking, reasoning, and morality.
In general, there are two types of matter that fill our brains: gray and white. Gray matter is just another name for the collections of neurons that make up the bulk of our brains, the stuff that powers our thinking. White matter is the wiring that connects those brain cells. We all have both gray and white matter, but Yang and her collaborators were particularly interested in the relative amounts of the two types in the participants’ prefrontal cortices. They found that pathological liars had 14 percent less gray matter than the control group, a common finding for many psychologically impaired individuals. What could this mean? One possibility is that since pathological liars have fewer brain cells (the gray matter) fueling their prefrontal cortex (an area crucial to distinguishing between right and wrong), they find it harder to take morality into account, making it easier for them to lie.
But that’s not all. You might wonder about the extra space that pathological liars must have in their skulls since they have so much less gray matter. Yang and her colleagues also found that pathological liars had 22 to 26 percent more white matter in the prefrontal cortex than non–pathological liars. With more white matter (remember, this is what links the gray matter), pathological liars are likely able to make more connections between different memories and ideas, and this increased connectivity and access to the world of associations stored in their gray matter might be the secret ingredient that makes them natural liars.
If we extrapolate these findings to the general population, we might say that higher brain connectivity could