The “Backfire Effect”: Correcting false beliefs about vaccines can be surprisingly counterproductive
Nearly half of the US population wrongly believes the flu vaccine can give you flu, but correcting this error has the opposite of the desired effect.
23 February 2015
By guest blogger Simon Oxenham
According to a new study, 43 per cent of the US population wrongly believes that the flu vaccine can give you flu. In fact, any adverse reaction to the vaccine, beyond a brief temperature and aching muscles, is rare. It stands to reason that correcting this misconception would be a good move for public health, but the study, by Brendan Nyhan and Jason Reifler and published in Vaccine, found that debunking this false belief had a seriously counterproductive effect.
The researchers surveyed 822 US adults, selected to reflect the general population in their mix of age, gender, race and education. About a quarter of this sample were unduly concerned about the side effects of the flu vaccine, and it was among these individuals that attempting to correct the myth that the flu vaccine gives you flu backfired. The researchers showed participants information from the Centers for Disease Control and Prevention (CDC) designed to debunk the myth that the flu vaccine can give you flu. This did reduce people's false beliefs but, among those concerned about vaccine side effects, it also produced a paradoxical decline in their intentions to actually get vaccinated, from 46 per cent to 28 per cent. The intervention had no effect on intentions to get vaccinated among people who did not have high levels of concern about vaccine side effects in the first place.
Why is it that as false beliefs went down, so did intentions to vaccinate? The explanation suggested by the researchers is that the participants who had "high concerns about vaccine side effects brought other concerns to mind in an attempt to maintain their prior attitude when presented with corrective information". A psychological principle that might explain this behaviour is motivated reasoning: we are often open to persuasion by information that fits with our beliefs, while we are more critical of, or even outright reject, information that contradicts our world view.
This is not the first time that vaccine safety information has been found to backfire. Last year the same team of researchers conducted a randomised controlled trial comparing CDC messages aiming to promote the measles, mumps and rubella (MMR) vaccine. The researchers found that debunking myths about MMR and autism had a similarly counterproductive result, reducing some false beliefs but also ironically reducing intentions to vaccinate.
Taken together, the results suggest that in terms of directly improving vaccination rates, we may be better off doing nothing than using the current boilerplate CDC information on misconceptions about vaccines to debunk false beliefs. If this is the case then the ramifications for public health are huge, but before we can decide whether this conclusion is accurate we'll have to wait to see if the finding can be replicated elsewhere. History has taught us that when it comes to vaccines, acting on scant evidence can have catastrophic consequences.
The studies do have their limitations: both looked at intentions to vaccinate rather than actual vaccination rates, which may be different in practice. Furthermore, in both sets of experiments, only the official US CDC vaccine safety messages were used. It is possible that if the experiments were repeated with other wordings, perhaps those used by the NHS in the UK for example, we would see different results.
If the backfire effect is replicated in future studies, how are we to proceed? Research into the backfire effect offers some tentative suggestions. To begin with, we should probably avoid restating myths wherever possible, and when we must restate a myth, we should precede it with a warning that misleading information is coming up. This can help prevent myths from growing in our minds through mere familiarity. When we debunk myths we should also try to offer an alternative explanation for the false belief, to fill the gap left by the misinformation. And we should try to keep our explanations brief, which can help counter the imbalance that often arises between simple, memorable myths and the more complicated reality. What is clear from the recent findings on beliefs about vaccines, and from the recent outbreaks of vaccine-preventable diseases in the UK, the US and elsewhere, is that what we are currently doing to convince people to get vaccinated may no longer be working.
Further reading
Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), 459-464. DOI: 10.1016/j.vaccine.2014.11.017
Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4). DOI: 10.1542/peds.2013-2365d
Lewandowsky, S., Ecker, U., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. DOI: 10.1177/1529100612451018
About the author
Post written by Simon Oxenham for the BPS Research Digest. Simon Oxenham covers the best and the worst of the world of psychology and neuroscience on his Neurobonkers blog at the Big Think. Follow @Neurobonkers on Twitter, Facebook, Google+, RSS or join the mailing list.