guns answering the rattle of his typewriter’. Other journalists, not wanting to be left out, arrive and concoct similar stories. Before long, stocks plummet and the country suffers an economic crash, leading to a state of emergency and finally a revolution.

Waugh’s tale was fictional, but the underlying news feedback he describes still occurs. However, modern information differs from that of Waugh’s era in some major ways. One is the speed with which it can spread: within hours, something can grow from a fringe meme into a mainstream talking point.[108] Another difference is the cost of producing contagion. Bots and fake accounts are fairly cheap to create, and mass amplification by politicians or news sources is essentially free. In some cases, popular false articles can even make money by bringing in advertising revenue. Then there’s the potential for ‘algorithmic manipulation’: if a group can use fake accounts to manufacture the sort of reactions that are valued by social media algorithms – such as comments and likes – they may be able to get a topic trending even if few people are actually talking about it.
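To see why manufactured reactions can be so effective, consider a toy ‘trending’ score of the kind the passage alludes to. The formula below is invented purely for illustration – real platform ranking algorithms are proprietary – but it captures the basic vulnerability: if an algorithm weights comments and shares more heavily than passive views, a small coordinated group can outrank a much larger organic audience.

```python
# Hypothetical sketch: a simplified engagement-based trending score.
# The weights are invented for illustration; real platform
# algorithms are proprietary and far more complex.
def trending_score(likes: int, comments: int, shares: int) -> int:
    # Weight comments and shares more heavily, as signals of
    # active discussion rather than passive approval.
    return likes + 3 * comments + 5 * shares

# A genuinely popular post with mostly passive engagement:
organic = trending_score(likes=200, comments=10, shares=5)

# A coordinated campaign: a few dozen fake accounts, each
# commenting and resharing, with no real audience at all:
manufactured = trending_score(likes=0, comments=100, shares=20)

print(organic, manufactured, manufactured > organic)
```

Under these assumed weights, the fake campaign scores higher than the organic post despite representing far fewer real people – which is exactly the loophole the text describes.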

Given these new tools, what sort of things have people tried to make popular? Since 2016, ‘fake news’ has become a common term to describe manipulative online information. However, it’s not a particularly helpful phrase. Technology researcher Renée DiResta has pointed out that ‘fake news’ can actually refer to several different types of content, including clickbait, conspiracy theories, misinformation, and disinformation. As we’ve seen, clickbait simply tries to entice people to visit a page; the links will often lead to real news articles. In contrast, conspiracy theories tweak real-life stories to include a ‘secret truth’, which may become more exaggerated or elaborate as the theory grows. Then we have misinformation, which DiResta defines as false content that is generally shared by accident. This can include hoaxes and practical jokes, which are created to be deliberately false but are then inadvertently spread by people who believe them to be true.

Finally, we have the most dangerous form of fake news: disinformation. A common view of disinformation is that it’s there to make you believe something false. However, the reality is subtler than this. When the KGB trained their foreign agents during the Cold War, they taught them how to create contradictions in public opinion and undermine confidence in accurate news.[109] This is what disinformation means. It’s not there to persuade you that false stories are true, but to make you doubt the very notion of truth. The aim is to shift facts around, making reality difficult to pin down. And the KGB wasn’t just good at seeding disinformation; they knew how to get it amplified. ‘In the quaint old days when KGB spies deployed the tactic, the goal was pickup by a major media property,’ as DiResta put it, ‘because that provided legitimization and took care of distribution.’[110]

In the past decade or so, a handful of online communities have been particularly successful at getting their messages picked up. One early example emerged in September 2008, when a user posted on the Oprah Winfrey Show’s online message board. The user claimed to represent a massive paedophile network, with over 9,000 members. But the post wasn’t quite what it seemed: the phrase ‘over 9,000’ – a reference to a fighter shouting about their opponent’s power level in the cartoon Dragon Ball Z – was actually a meme from 4chan, an anonymous online message board popular with trolls. To the delight of 4chan users, Winfrey took the paedophilia claim seriously and read out the phrase on air.[111]

Online forums like 4chan – and others such as Reddit and Gab – in effect act as incubators for contagious memes. When users post images and slogans, it can spark large numbers of new variants. These newly mutated memes spread and compete on the forums, with the most contagious ones surviving and the weaker ones disappearing. It’s a case of ‘survival of the fittest’, the same sort of process that occurs in biological evolution.[112] Although it isn’t anything like the millennia-long timescales that pathogens have had, this crowd-sourced evolution can still give online content a major advantage.
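The selection process described above can be sketched as a toy simulation. Everything here is an illustrative assumption rather than a model from the book: each meme is reduced to a single ‘contagiousness’ score, sharing copies memes in proportion to that score, and users’ tweaks appear as small random mutations. Even in this crude setup, average contagiousness ratchets upward over repeated rounds.

```python
# Illustrative sketch (not from the book): a toy model of memes
# mutating and competing on a forum. The population size, mutation
# scale, and number of rounds are all invented for illustration.
import random

def evolve_memes(rounds: int = 50, population: int = 100,
                 seed: int = 42) -> float:
    rng = random.Random(seed)
    # Each meme is reduced to a single 'contagiousness' score in [0, 1].
    memes = [rng.random() for _ in range(population)]
    for _ in range(rounds):
        # Sharing: memes are copied in proportion to contagiousness,
        # so the most contagious variants dominate the next round...
        copied = rng.choices(memes, weights=memes, k=population)
        # ...while users' tweaks act like small random mutations.
        memes = [min(1.0, max(0.0, m + rng.gauss(0, 0.05)))
                 for m in copied]
    return sum(memes) / len(memes)

# Initial memes average about 0.5; after selection and mutation,
# the surviving population is far more contagious.
print(round(evolve_memes(), 2))
```

The design mirrors the ‘survival of the fittest’ dynamic in the text: selection (`random.choices` with fitness weights) does the filtering, and mutation supplies the stream of new variants for it to act on.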

One of the most successful evolutionary tricks honed by trolls has been to make memes absurd or extreme, so it’s unclear whether they are serious or not. This veneer of irony can help unpleasant views spread further than they would otherwise. If users take offence, the creator of the meme can claim it was a joke; if users assume it was a joke, the meme goes uncriticised. White supremacist groups have also adopted this tactic. A leaked style guide for the Daily Stormer website advised its writers to keep things light to avoid putting off readers: ‘generally, when using racial slurs, it should come across as half-joking.’[113]

As memes rise in prominence, they can become an effective resource for media-savvy politicians. In October 2018, Donald Trump adopted the slogan ‘Jobs Not Mobs’, claiming that Republicans favoured the economy over immigration. When journalists traced the idea to its source, they found that the meme had probably originated on Twitter. It had then spent time evolving on Reddit forums, becoming catchier in the process, before spreading more widely.[114]

It’s not just politicians who can pick up on fringe content. Online rumours and misinformation have spurred attacks on minority groups in Sri Lanka and Myanmar, as well as outbreaks of violence in Mexico and India. At the same time, disinformation campaigns have worked to stir up both sides of a dispute. During 2016 and 2017, Russian troll groups reportedly created multiple Facebook events, with the aim of getting opposing crowds to organise far-right protests and counter-protests.[115] Disinformation around specific topics like vaccination can also feed into wider social unrest; mistrust of science tends to be associated with mistrust in government and the justice system.[116]

The spread of harmful information is not a new problem. Even the term ‘fake news’ has emerged before, briefly becoming popular in the late 1930s.[117] But the structure of online networks has made the issue
