Then a pair of graduate students at the University of California, Berkeley noticed something strange. David Broockman and Joshua Kalla had wanted to run their own study, building on LaCour’s impressive analysis. ‘The most important paper of the year. No doubt,’ Broockman had told a journalist after the Science paper was published. But when they looked at LaCour’s dataset, it seemed far too pristine; it was almost as if someone had simulated the data rather than collecting it.[58] In May 2015, the pair contacted Green with their concerns. When questioned, LaCour denied making up the data, but couldn’t produce the original files. A few days later, Green – who said he’d been unaware of the problems until that point – asked Science to retract the paper. It wasn’t clear exactly what had happened, but it was clear that LaCour hadn’t run the study he said he had. The scandal came as a huge disappointment to the Los Angeles LGBT Center. ‘It felt like a big punch to our collective gut,’ said Laura Gardiner, one of their organisers, after the problems emerged.[59]
Media outlets quickly added corrections to their earlier stories, but perhaps journalists – and the scientific journal – should have been more sceptical in the first place. ‘What interests me is the repeated insistence on how unexpected and unprecedented this result was,’ wrote statistician Andrew Gelman after the paper was retracted. Gelman pointed out that this seems to happen a lot in psychological science. ‘People argue simultaneously that a result is completely surprising and that it makes complete sense.’[60] Although the backfire effect had been widely cited as a major hurdle to persuasion, here was a study claiming it could be cleared in one short conversation.
The media has a strong appetite for concise yet counter-intuitive insights. This encourages researchers to publicise results that show how ‘one simple idea’ can explain everything. In some cases, the desire for surprising-yet-simple conclusions can lead apparent experts to contradict their own source of expertise. Antonio García Martínez, who spent two years working in Facebook’s ads team, recalled such a situation in his book Chaos Monkeys. García Martínez tells the story of a senior manager who built a reputation with pithy, memorable insights about social influence. Unfortunately for the manager, these claims were undermined by research from his company’s own data science team, whose rigorous analysis had shown something different.
In reality, it’s very difficult to find simple laws that apply in all situations. If we have a promising theory, we therefore need to seek out examples that don’t fit. We need to work out where its limits are and what exceptions there might be, because even widely reported theories might not be as conclusive as they seem. Take the backfire effect. After reading about the idea, Thomas Wood and Ethan Porter, two graduate students at the University of Chicago, set out to see how common it might actually be. ‘Were the backfire effect to be observed across a population, the implications for democracy would be dire,’ they wrote.[61] Whereas Nyhan and Reifler had focused on three main misconceptions, Wood and Porter tested thirty-six beliefs across 8,100 participants. They found that although it can be tough to convince people they’re wrong, an attempted correction doesn’t necessarily make their existing belief stronger. In fact, only one correction backfired in the study: the false claim about weapons of mass destruction in Iraq. ‘By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments,’ they concluded.
Even in their original study, Nyhan and Reifler found that the backfire effect is not guaranteed. During the 2004 presidential campaign, Democrats claimed that George Bush had banned stem cell research, whereas in reality, he’d limited funding for certain aspects of it.[62] When Nyhan and Reifler corrected this belief among liberals, the information was often ignored, but didn’t backfire. ‘The backfire effect finding got a lot of attention because it was so surprising,’ Nyhan later said.[63] ‘Encouragingly, it seems to be quite rare.’ Nyhan, Reifler, Wood and Porter have since teamed up to explore the topic further. For example, in 2019 they reported that providing fact-checks during Donald Trump’s election speeches had changed people’s beliefs about his specific claims, but not their overall opinion of the candidate.[64] It seems some aspects of people’s political beliefs are harder to alter than others. ‘We have a lot more to learn,’ Nyhan said.
When examining beliefs, we also need to be careful about what we mean by a backfire. Nyhan has noted that there can be confusion between the backfire effect and a related psychological quirk known as ‘disconfirmation bias’.[65] This is when we give more scrutiny to arguments that contradict our existing beliefs than to those we agree with. Whereas the backfire effect implies that people ignore opposing arguments and strengthen their existing beliefs, disconfirmation bias simply means they tend to dismiss arguments they view as weak.
It might seem like a subtle difference, but it’s a crucial one. If the backfire effect is common, it implies that we can’t persuade people with conflicting opinions to change their stance. No matter how convincing our arguments, they will only retreat further into their beliefs. Debate becomes hopeless and evidence worthless. In contrast, if people suffer from disconfirmation bias, it means their views could change, given compelling enough arguments. This creates a more optimistic outlook. Persuading people may still be challenging, but it is worth trying.
A lot rides on how we