The science of social contagion has come a long way in the past decade, but there is still much more to discover. Not least because it’s often difficult to establish whether something is contagious in the first place. In many cases, we can’t deliberately change people’s behaviour, so we have to rely on observational data, as Christakis and Fowler did with the Framingham study. However, another approach is emerging. Researchers are increasingly turning to ‘natural experiments’ to examine social contagion.[51] Rather than imposing behavioural change, they wait for nature to do it for them. For example, a runner in Oregon might change their routine when the weather is bad; if their friend in California changes their behaviour too, it could suggest social contagion is responsible. When researchers at MIT looked at data from digital fitness trackers, which included a social network linking users, they found that the weather could indeed reveal patterns of contagion. However, some people were more likely to catch the running bug than others. Over a five-year period, the behaviour of less active runners tended to influence more active runners, but not the other way around. This suggests that keen runners don’t want to be outdone by their less energetic friends.
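The weather-as-nudge logic can be made concrete. Below is a minimal, hypothetical simulation, not taken from the MIT study itself, with all numbers and variable names invented, showing how a ‘natural experiment’ works: friends are similar to begin with, which inflates a naive comparison of their running, but the friend’s local weather only reaches me through the friend, so it can be used to isolate genuine contagion.

```python
import numpy as np

# Hypothetical sketch of a 'natural experiment' (all values invented).
# Homophily: friends share a trait that drives both people's running,
# so a naive correlation overstates contagion. Weather affects my running
# only via my friend, so it isolates the genuinely contagious part.
rng = np.random.default_rng(0)
n = 100_000

keenness = rng.normal(size=n)                # shared trait between friends
friend_weather = rng.normal(size=n)          # rain/shine where the friend lives
friend_km = 5 - friend_weather + keenness + rng.normal(size=n)

true_contagion = 0.3                         # how much their running shifts mine
my_km = 3 + true_contagion * friend_km + keenness + rng.normal(size=n)

# Naive estimate: regress my running on my friend's (biased by homophily)
naive = np.polyfit(friend_km, my_km, 1)[0]

# Natural-experiment estimate: keep only the part of the friend's running
# that is driven by their weather (two-stage least squares by hand)
stage1 = np.polyfit(friend_weather, friend_km, 1)
friend_km_hat = np.polyval(stage1, friend_weather)
iv = np.polyfit(friend_km_hat, my_km, 1)[0]

print(f"naive estimate: {naive:.2f}")        # inflated, roughly 0.6
print(f"weather-based estimate: {iv:.2f}")   # close to the true 0.3
```

In this toy example the naive comparison overstates the effect because similar people befriend each other, while the weather-based estimate recovers the true value; that separation is the point of using natural experiments to study contagion.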
Behavioural nudges like changes in weather are a useful tool for studying contagion, but they do have limits. A rainy day might alter someone’s running patterns, but it’s unlikely to affect other, more fundamental behaviours like their marital choices or political views. Dean Eckles points out there can be a big gap between what is easily changed and what we ideally want to study. ‘A lot of the behaviours we care the most about are not so easy to nudge people to do.’
In November 2008, Californians voted to ban same-sex marriage. The result came as a shock to those who’d campaigned for marriage equality, especially as pre-vote polls had appeared to be in their favour. Explanations and excuses soon began to emerge. Dave Fleischer, director of the Los Angeles LGBT Center, noticed that several misconceptions about the result were becoming popular. One was that the people who voted for the ban must have hated the LGBT community. Fleischer disagreed with this idea. ‘The dictionary defines “hate” as extreme aversion or hostility,’ he wrote after the vote. ‘This does not describe most who voted against us.’[52]
To find out why so many people were against same-sex marriage, the LGBT Center spent the next few years conducting thousands of face-to-face interviews. Canvassers used most of this time to listen to voters, a method known as ‘deep canvassing’.[53] They encouraged people to talk about their lives, and reflect on their own experiences of prejudice. As they conducted these interviews, the LGBT Center realised that deep canvassing wasn’t just providing information; it appeared to be changing voters’ attitudes. If so, this would make it a powerful canvassing method. But was it really as effective as it seemed?
If people are rational, we might expect them to update their beliefs when presented with new information. In scientific research this approach is known as ‘Bayesian reasoning’. Named after the eighteenth-century statistician Thomas Bayes, it treats knowledge as a belief we hold with a certain level of confidence. For example, suppose you are strongly considering marrying someone, having thought carefully about the relationship. In this situation, it would take a very good reason for you to change your mind. However, if you’re not totally sure about the relationship, you might be persuaded against marriage more easily. Something that might seem trivial to the infatuated may be enough to tip a wavering mind towards a break-up. The same logic applies to other situations. If you start with a firm belief, you’ll generally need strong evidence to overcome it; if you are unsure at first, it might not take much for you to change your opinion. Your belief after exposure to new information therefore depends on two things: the strength of your initial belief and the strength of the new evidence.[54] This concept is at the heart of Bayesian reasoning – and much of modern statistics.
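In symbols (a standard formulation, not spelled out in the text above), this updating rule is Bayes’ theorem; in its ‘odds’ form it says exactly what the paragraph describes, that the belief you end up with is your initial belief multiplied by the strength of the new evidence.

```latex
% Bayes' theorem: updated belief from initial belief and new evidence
\[
P(\text{belief} \mid \text{evidence})
  = \frac{P(\text{evidence} \mid \text{belief})\, P(\text{belief})}{P(\text{evidence})}
\]
% Equivalently, in odds form (B = belief, E = evidence):
\[
\underbrace{\frac{P(B \mid E)}{P(\neg B \mid E)}}_{\text{belief afterwards}}
  \;=\;
\underbrace{\frac{P(B)}{P(\neg B)}}_{\text{initial belief}}
  \times
\underbrace{\frac{P(E \mid B)}{P(E \mid \neg B)}}_{\text{strength of evidence}}
\]
```

A firm initial belief corresponds to lopsided prior odds, which only very strong evidence can overturn; a wavering belief corresponds to odds near even, which even modest evidence can tip.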
Yet there are suggestions that people don’t absorb information in this way, especially if it goes against their existing views. In 2008, political scientists Brendan Nyhan and Jason Reifler proposed that persuasion can suffer from a ‘backfire effect’. They’d presented people with information that conflicted with their political ideology, such as the lack of weapons of mass destruction in Iraq before the 2003 war, or the decline in revenues following President Bush’s tax cuts. But it didn’t seem to convince many of them. Worse, some people appeared to become more confident in their existing beliefs after seeing the new information.[55] Similar effects had come up in other psychological studies over the years. Experiments had tried to persuade people of one thing, only for them to end up believing something else.[56]
If the backfire effect is common, it doesn’t bode well for canvassers hoping to convince people to change their minds about issues like same-sex marriage. The Los Angeles LGBT Center thought they had a method that worked, but it needed to be evaluated properly. In early 2013, Dave Fleischer had lunch with Donald Green, a political scientist at Columbia University. Green introduced Fleischer to Michael LaCour, a graduate student at UCLA, who agreed to run a scientific study testing the effectiveness of deep canvassing. The aim was to carry out a randomised controlled trial. After recruiting voters to participate in a series of surveys, LaCour would randomly split the group. Some would get visits from a canvasser; others, acting as a control group, would have conversations about recycling.
What happened next would reveal a lot about how beliefs change, just not quite in the way anyone had expected.