they would perform on the second phase of the test showed that participants not only used the answer key in the first phase to exaggerate their score, but had very quickly convinced themselves that they truly earned that score. Basically, those who had a chance to check their answers in the first phase (and cheated) started believing that their exaggerated performance was a reflection of their true skill.
But what would happen if we paid participants to predict their score accurately in the second phase? With money on the line, maybe our participants wouldn’t so patently ignore the fact that in phase one they had used the answer key to improve their scores. To that end, we repeated the same experiment with a new group of participants, this time offering them up to $20 if they correctly predicted their performance on the second test. Even with a financial incentive to be accurate, they still tended to take full credit for their scores and overestimate their abilities. Despite having a strong motivation to be accurate, self-deception ruled the day.
I KNEW IT ALL ALONG
I give a considerable number of lectures about my research to different groups, from academics to industry types. When I started giving talks, I would often describe an experiment, the results, and finally what I thought we could learn from it. But I often found that some people were rather unsurprised by the results and were eager to tell me so. I found this puzzling because, as the person who actually carried out the research, I’d often been surprised by the outcomes myself. I wondered, were the people in the audience really that insightful? How did they know the results sooner than I did? Or was it just an ex post facto feeling of intuition?
Eventually I discovered a way to combat this “I knew it all along” feeling. I started asking the audience to predict the results of the experiments. After I finished describing the setup and what we measured, I gave them a few seconds to think about it. Then I would ask them to vote on the outcome or write their prediction down. Only once they committed to their answer would I provide the results. The good news is that this approach works. Using this ask-first method, I rarely receive the “I knew it all along” response.
In honor of our natural tendency to convince ourselves that we knew the right answers all along, I call my research center at Duke University “The Center for Advanced Hindsight.”
Our Love of Exaggeration
Once upon a time—back in the early 1990s—the acclaimed movie director Stanley Kubrick began hearing stories through his assistant about a man who was pretending to be him. The man-who-would-be-Kubrick (whose real name was Alan Conway and who looked nothing like the dark-bearded director) went around London telling people who he famously was(n’t). Since the real Stanley Kubrick was a very private person who shunned the paparazzi, not many people had any idea of what he looked like. So a lot of gullible people, thrilled to “know” the famous director personally, eagerly took Conway’s bait. Warner Bros., which financed and distributed Kubrick’s films, began calling Kubrick’s office practically every day with new complaints from people who could not understand why “Stanley” would not get back to them. After all, they had treated him to drinks and dinner and paid for his cab, and he had promised them a part in his next film!
One day, Frank Rich (the former theater critic and op-ed columnist of the New York Times) had his own encounter with the counterfeit Kubrick.
Very shortly after this encounter, things began to unravel for Conway as it dawned on Rich and others that they’d been conned. Eventually the truth came out when Conway began selling his story to journalists. He claimed to be a recovering victim of a mental disorder (“It was uncanny. Kubrick just took me over. I really did believe I was him!”). In the end Conway died a penniless alcoholic, just four months before Kubrick.*
Although this story is rather extreme, Conway may well have believed that he was Kubrick when he was parading around in disguise, which raises the question of whether some of us are more prone to believe our own fibs than others. To examine this possibility, we set up an experiment that repeated the basic self-deception task, but this time we also measured participants’ general tendency to turn a blind eye to their own failures. To measure this tendency, we asked participants to agree or disagree with a few statements, such as “My first impressions of people are usually right” and “I never cover up my mistakes.” We wanted to see whether people who answered “yes” to more of these questions also had a higher tendency for self-deception in our experiment.
Just as before, we saw that those in the answer-key condition cheated and got higher scores. Again, they predicted that they would correctly answer more questions in the following test. And once more, they lost money because they exaggerated their scores and overpredicted their ability. And what about those who answered “yes” to more of the statements about their own propensities? There were many of them, and they were the ones who predicted that they would do best on our second-phase test.
HEROIC VETERANS?
In 1959, America’s “last surviving Civil War veteran,” Walter Williams, died. He was given a princely funeral, including a parade that tens of thousands gathered to see, and an official week of mourning. Many years later, however, a journalist named William Marvel discovered that Williams had been only five years old when the war began, which meant he would never have been old enough to serve in the military in any capacity. It gets worse, though. The title that Walter Williams falsely bore to the grave had been passed to him from a man named John Salling, who, as Marvel discovered, had also falsely called himself the oldest Civil War veteran. In fact, Marvel claims that the last dozen of the so-called oldest Civil War veterans were all phony.
There are countless other stories like these, even in recent wars, where one might think it would be more difficult to make up and sustain such claims. In one example, Sergeant Thomas Larez received multiple gunshot wounds fighting the Taliban in Afghanistan while helping an injured soldier to safety. Not only did he save his friend’s life, but he rallied from his own wounds and killed seven Taliban fighters. So went the reporting of Larez’s exploits aired by a Dallas news channel, which later had to run a retraction when it turned out that although Larez was indeed a marine, he had never been anywhere near Afghanistan—the entire story was a lie.
Journalists often uncover such false claims. But once in a while, it’s the journalist who’s the fibber. With teary eyes and a shaky voice, the longtime journalist Dan Rather described his own career in the marines, even though he had never made it out of basic training. Apparently, he had come to believe that his involvement was far more significant than it actually was.1
THERE ARE PROBABLY many reasons why people exaggerate their service records. But the frequency of stories about people lying on their resumes, diplomas, and personal histories brings up a few interesting questions: Could it be that when we lie publicly, the recorded lie acts as an achievement marker that “reminds” us of our false achievement and helps cement the fiction into the fabric of our lives? So if a trophy, ribbon, or certificate recognizes something that we never achieved, would the achievement marker help us hold on to false beliefs about our own ability? Would such certificates increase our capacity for self-deception?
BEFORE I TELL you about our experiments on this question I should point out that I proudly hang two diplomas on my office wall. One is an “MIT Bachelor of Science in Charm,” and the other is a “PhD in Charm,” also from MIT. I was awarded these diplomas by the Charm School, which is an activity that takes place at