Vaccinating against viruses of the mind
David Robson on psychological efforts to achieve ‘herd immunity’ against the spread of misinformation in pandemic times.
15 May 2020
There are multiple pandemics sweeping the globe in 2020.
The first, of course, is Covid-19 itself. But hot on its heels we have seen conspiracy theories and false information spreading across the globe. There was the idea that the pandemic was a hoax, or that the virus was being spread by the new 5G networks – a belief that led vigilante gangs to attack mobile masts. Then there were the claims of miracle cures, such as the idea that drinking methanol could kill the virus, a belief that caused hundreds of deaths in Iran.
The sources are varied – from misguided health gurus to anti-vaccination campaigners and even foreign governments who may wish to sow discord for political gain – but the total inundation of misinformation may be unlike anything we've ever seen before. 'All the stories I'm seeing suggest that it's gotten much worse under the current pandemic,' says Jay Van Bavel, a social neuroscientist at New York University, who recently co-authored a paper on the ways that behavioural sciences can help with the pandemic response (Van Bavel et al., 2020).
With so much uncertainty around the coronavirus pandemic already, the misinformation threatens to confuse people's understanding of the disease and the best ways to protect themselves and their loved ones. It is, Van Bavel says, 'the perfect recipe for disaster. This is not misinformation about the normal types of political debates. We're talking about potentially the worst pandemic in 100 years.'
While there is no single panacea, the latest psychological research might help us to stem the spread of false claims using a form of inoculation. When that strategy is combined with health campaigns that use the cutting-edge psychology of persuasion, some scientists even hope that we may reach a kind of 'herd immunity' against misinformation.
Mental antibodies
In the same way that a regular vaccination uses a weakened or inert form of the pathogen to prime the immune system for the real thing, a fake news inoculation requires people to be exposed to the threat in a safe environment (where the claims are easily debunked). This then heightens our awareness of misinformation in the real world – activating so-called 'mental antibodies' that help us detect unverified claims in the future.
The concept originated with the American social psychologist William McGuire in the 1960s. Amid the political tensions of the Cold War, McGuire was concerned about the potential of foreign propaganda to brainwash US citizens, and began to look for ways to combat misinformation. He realised that many people have the knowledge and intelligence to rationally appraise a false claim – if they pay enough attention. But most simply don't engage those skills, allowing their opinions to be swayed by the propaganda (Pratkanis, 2011).
To avoid that outcome, McGuire suspected you needed to make someone aware of their own vulnerability to the lies; only then would they be mentally engaged enough to resist persuasion.
In his initial studies, he examined participants' susceptibility to dubious health claims – like the idea that teeth brushing is bad for dental health. As he'd hoped, first warning people about the potential threat of misinformation, and then providing them with examples of the fallacious arguments, provided the necessary shock to the system – leaving them more sceptical of the material when they encountered it again later (McGuire & Papageorgis, 1961, 1962).
The idea never really took off, however, until around 50 years later, when a PhD student called Sander Van der Linden came across one of McGuire's papers one day in the library of the London School of Economics.
He says that he was immediately struck by McGuire's prescience. 'He wrote the paper long before the internet, long before we knew that misinformation spreads through a network, in a way that is very analogous to how a virus replicates in infected hosts.' With the spread of misinformation online only increasing, it seemed like the perfect time to resuscitate McGuire's ideas.
His first experiment asked whether an inoculation could stem the spread of false information around global warming. For around a decade, climate change deniers have been attempting to question the scientific consensus with the so-called 'Oregon Petition' – a website that claimed to have the signatures of more than 31,000 American scientists who believed 'there is no scientific evidence that the human release of carbon dioxide will, in the foreseeable future, cause catastrophic heating of the Earth's atmosphere'. In reality, fewer than 1 per cent of the signatories had a background in climate science, and many of the signatures were clearly fabricated. (Charles Darwin and 'Dr' Geri Halliwell are among the signatories.)
The website is surprisingly convincing: around 10 per cent of people change their opinion about the existence of a scientific consensus after seeing the petition – a huge effect for a single piece of misinformation. It's powerful enough to completely 'wipe out' any benefits from traditional educational campaigns, says Van der Linden, who is now based at the University of Cambridge.
To see if an inoculation might prevent people from being swayed, Van der Linden offered a group of participants one of two warnings before they saw the Oregon Petition. The first was a general message that 'some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists' – followed by a reiteration of the actual climate science. Not only did it protect against the misinformation; after seeing the petition, belief in the scientific consensus was actually around 6.5 per cent higher than before the intervention.
The second form of inoculation was an even more detailed description of the petition itself and its fabricated signatures. This proved even more effective, boosting acceptance of the scientific consensus by nearly 13 per cent. Both inoculations worked better than a third intervention – simply arming the participants with the facts about climate change, without any specific warning about misinformation on the topic – which had barely any effect (van der Linden et al., 2017).
Following the controversies over the US presidential election in 2016, Van der Linden next attempted to protect people against political fake news more generally. In this study, the inoculation was a board game – in which participants were encouraged to create a viral article on a controversial topic like the refugee crisis. Afterwards, they were significantly less likely to be persuaded by actual fake news articles on the same issue, compared to a control group who had not played the game (Roozenbeek & van der Linden, 2019a).
Van der Linden's greatest success is an online game (called Bad News) that simulates sites such as Twitter, allowing users to build followers by employing misinformation techniques like impersonating or delegitimising official accounts, appealing to partisan divides, or creating a conspiracy theory (Roozenbeek & van der Linden, 2019b). Once again, the inoculation worked – and proved so popular that it has attracted around a million users to date. The UK Foreign Office was quick to note its success and has now translated the game into 15 languages to help combat misinformation worldwide.
Van der Linden is now updating the Bad News game to specifically target misinformation around Covid-19. It draws particular attention, for example, to the use of 'fake experts' (without any real qualifications) to question medical advice – one of the most common strategies being used at the moment. But Van der Linden hopes that more organisations will take note of the idea of inoculation, to pre-emptively warn people about the strategies that may be used to spread misinformation around the pandemic. 'We're trying to get the policy conversation going around this idea of pre-bunking rather than debunking.' This will be especially important as we gear up for the potential release of a real Covid-19 vaccine, he says, since we are likely to see an explosion of fake news around its safety and effectiveness.
Research from the University of Regina in Canada suggests that even small nudges can activate our 'mental antibodies'. In a recent pre-print, Gordon Pennycook examined people's tendency to share fake coronavirus news. Just as McGuire had observed decades earlier, Pennycook found that many people sharing misinformation simply weren't questioning its accuracy before deciding to pass it on. Merely asking participants to rate the reliability of a single headline primed them to think more carefully about the messages they were reading, and subsequently reduced their willingness to share a host of other fake stories (Pennycook et al., 2020).
Pennycook argues that the social media companies themselves would need to find the best ways to implement this, but he could see the advantage of occasionally prompting users to rate the accuracy of what they are seeing. 'They could gain information about the stuff that is being spread around their platform, and at the same time, get people to think about the accuracy [of what they are sharing],' he told me.
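To make the idea concrete, here is a minimal sketch of how such an accuracy prompt might be interleaved into a feed. Everything in it – the prompt rate, the function names, the interaction style – is a hypothetical illustration, not any platform's actual implementation.

```python
import random

# Hypothetical sketch of an 'accuracy nudge': before showing some posts,
# ask the user to rate the accuracy of a neutral headline. The point is
# not the rating itself but shifting the user's attention towards
# accuracy before they decide what to share next.

PROMPT_PROBABILITY = 0.05  # assumed rate: nudge on roughly 1 in 20 posts

def render_feed(posts, neutral_headlines):
    for post in posts:
        if random.random() < PROMPT_PROBABILITY:
            headline = random.choice(neutral_headlines)
            # The platform could log this rating as crowd-sourced signal,
            # as Pennycook suggests, while the prompt itself does the work.
            input(f"How accurate is this headline, from 1 to 5? {headline!r} ")
        print(post)

render_feed(
    ["Post A", "Post B", "Post C"],
    ["Scientists publish new study on sleep and memory"],
)
```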
Anti-viral measures
Clearly, prevention will be the best medicine. But what can be done when the message is already out there?
Many previous campaigns – such as the widespread attempts to debunk anti-vaxxer myths – may have failed because they struggled to present the information effectively. It was common, for instance, for myth-busting articles to repeat the false claims, often in the headline. Unfortunately, that allows the lie to take root, so that days or weeks later it is the false claim, rather than the debunking, that people remember (Pluviano et al., 2017). Some campaigns also suffered from overloading the reader with statistics – which confused rather than persuaded.
Fortunately, the latest research suggests some principles that should make any message stick. These centre on fluency: how easily the brain can process a claim. As Norbert Schwarz and Eryn Newman wrote in a recent review of the best ways to combat misinformation, 'when thoughts flow smoothly, people nod along' (Schwarz et al., 2016). Fluency can come from vivid storytelling (with the use of colourful anecdotes, say) and slick presentation (such as setting out the key facts in a bold, easy-to-read font). A single image can increase a statement's persuasiveness, since it makes the central concept easier to visualise, even if it offers no actual proof of the claim.
The purveyors of misinformation will already be using these strategies, and any individual, media outlet or health organisation hoping to combat their claims will have to pay equal attention to these factors. Where possible, they should avoid repeating the false claims and instead build their campaigns around the true facts. A campaign about the best ways to reduce coronavirus transmission, which tacitly deals with some of the fake news claims without restating them, would therefore be better than an article on the '10 coronavirus myths'.
If repeating the claim is unavoidable, they should make sure the truth is more salient than the lie – both in the presentation and in the storytelling. And, following the principle of inoculation, they should warn people that the claim is untrue before presenting it (rather than, say, placing the myth upfront in a heading), and explain how and why readers are being misled.
Whether we are official spokespeople or simply individuals hoping to make a small contribution, we can all use these principles to fight the spread of misinformation. (See 'How to clean up your Facebook page', below, for ways to deal with misinformation in your own social network.) 'They are often small effects, but it all adds up,' says Van Bavel.
Like a real virus, the survival of misinformation relies on its R (reproduction) number – essentially, whether each person passes it on to more than one other person. If enough people are educated about misinformation – and sceptical enough not to pass it on – could we ever reach a state of 'herd immunity'? At first, that might mean immunity against misinformation on a single topic (like the coronavirus), but the dream would be sufficient scepticism to radically reduce the spread of lies across multiple domains.
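For readers who want the arithmetic behind the metaphor: in the simplest epidemiological models, a contagion stops spreading once the immune fraction of the population exceeds 1 - 1/R. Here is a minimal sketch, with purely illustrative R values (nobody has measured an R number for misinformation):

```python
# Herd immunity threshold in the simplest (SIR-style) model: a contagion
# with reproduction number R dies out once more than 1 - 1/R of the
# population is immune. The R values below are illustrative assumptions,
# not measured rates of misinformation spread.

def herd_immunity_threshold(r: float) -> float:
    """Fraction of the population that must be 'inoculated' for the
    contagion to stop spreading, under the standard simple model."""
    return 1 - 1 / r

for r in (1.5, 2.0, 2.5, 3.3):
    print(f"R = {r}: threshold = {herd_immunity_threshold(r):.0%}")
# R = 1.5: threshold = 33%
# R = 2.0: threshold = 50%
# R = 2.5: threshold = 60%
# R = 3.3: threshold = 70%
```

On this simple model, the '50, 60 or 70 per cent' Van der Linden mentions below would correspond to R numbers of roughly 2 to 3.3.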
There is already some evidence that knowledge of misinformation in one domain can inoculate you in another (Cook et al., 2017), and Van der Linden's new research programme is currently investigating how to maximise the effects. 'People can raise their eyebrows at that idea, but we think it's possible and worth pursuing,' says Van der Linden. 'For me, the whole point of the "inoculation" metaphor is to achieve herd immunity.' One strategy would be to ensure that the inoculations – such as his Bad News game – themselves go viral, so that they reach the majority of the population and train them to think more critically about what they are consuming.
He readily admits you are not going to convince the most hardcore conspiracy theorists. 'But that's not necessary, right? You only need a critical part of the population to be vaccinated. We don't know what that number is – whether it's 50, 60 or 70 per cent – but that's what we're working towards.' It is wonderful to imagine that, by the time the real Covid-19 pandemic is under control, we may also have a working treatment for the viruses of the mind.
David Robson is the author of The Intelligence Trap: Revolutionise Your Thinking and Make Wiser Decisions, out now in paperback (Hodder and Stoughton). He is @d_a_robson on Twitter.
How to clean up your Facebook page
Besides guiding official responses to the misinformation surrounding coronavirus, psychological principles can also guide our own behaviour on social media.
The first step is to avoid repeating the false claims themselves and only share material you know to be factually accurate. The research shows that even intelligent people can be easily fooled by fake claims thanks to their 'cognitive miserliness', so try to verify information against a reliable source before sharing it.
When you do attempt to debunk a false claim, try not to be too emotional – particularly if it relates to a politically sensitive issue, says Jay Van Bavel. Keep your tone measured, and 'people who disagree with you will be much more likely to share it'. His work shows that moderating your language can increase the reach of your message to the people who need to hear it, rather than simply preaching to your own echo chamber.
If you get into an active disagreement with someone sharing a conspiracy theory, try to ask questions rather than simply presenting them with facts. When doing so, ask them to explain how this elaborate conspiracy works, rather than why they believe it. Experiments show that this shift in focus forces the person to confront the fact that they don't understand the details of the issue as well as they think – the illusion of explanatory depth – which ultimately causes them to question their beliefs (Johnson, 2017).
References
Bright, J., Au, H., Bailey, H., Elswah, M., Schliebs, M., Marchal, N., ... & Howard, P. N. (2020). Coronavirus Coverage by State-Backed English-Language News Sources. Oxford Internet Institute Data Memo. https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2020/04/Coronavirus-Coverage-by-State-Backed-English-Language-News-Sources.pdf
Cook, J., Lewandowsky, S., & Ecker, U. K. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS One, 12(5).
Johnson, D. R. (2017). Improving Skeptics' Reasoning When Evaluating Climate Change Material: A Cognitive Intervention. Ecopsychology, 9(3), 130-142.
Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological science in the public interest, 13(3), 106-131.
McGuire, W. J., & Papageorgis, D. (1961). The relative efficacy of various types of prior belief-defense in producing immunity against persuasion. The Journal of Abnormal and Social Psychology, 62(2), 327.
McGuire, W. J., & Papageorgis, D. (1962). Effectiveness of forewarning in developing resistance to persuasion. Public Opinion Quarterly, 26(1), 24-34.
Pennycook, G., McPhetres, J., Zhang, Y., & Rand, D. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy nudge intervention. https://psyarxiv.com/uhbk9/
Pluviano, S., Watt, C., & Della Sala, S. (2017). Misinformation lingers in memory: failure of three pro-vaccination strategies. PLoS One, 12(7).
Pratkanis, A. R. (Ed.). (2011). The science of social influence: Advances and future progress (p. 86). Psychology Press.
Roozenbeek, J., & van der Linden, S. (2019a). The fake news game: actively inoculating against the risk of misinformation. Journal of Risk Research, 22(5), 570-580.
Roozenbeek, J., & van der Linden, S. (2019b). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1), 1-10.
Schwarz, N., Newman, E., & Leach, W. (2016). Making the truth stick & the myths fade: Lessons from cognitive psychology. Behavioral Science & Policy, 2(1), 85-95.
Van Bavel, J. J., Baicker, K., Boggio, P. S., Capraro, V., Cichocka, A., Cikara, M., ... & Drury, J. (2020). Using social and behavioural science to support COVID-19 pandemic response. Nature Human Behaviour, 1-12.
van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), 1600008.