Psychologists show it’s possible to fix misleading press releases – without harming their news value
Corrected press releases led to more accurate news, without any dip in the quantity of coverage, according to a new study by Adams et al (2019).
17 June 2019
By Jesse Singal
There are many reasons why media outlets report scientifically misleading information. But one key place where this sort of misunderstanding takes root is the press releases that universities issue when one of their researchers has published something with a chance of garnering attention. A new open-access study in BMC Medicine attempts to change this by intervening in the process directly.
Press releases can mislead in many different ways, but a common flaw is their tendency to confuse causal and correlational claims. For example, an observational study might find (to take a hypothetical) that the more wine people drink, the more likely they are to be diagnosed with cancer over a given period.
A study like this, reported accurately, doesn't show that drinking wine causes cancer – it shows that wine consumption is associated with cancer diagnoses. Some other factor or factors could be responsible for the link: perhaps people who drink more wine also engage in other behaviours that themselves increase the risk of cancer.
Experimental studies, on the other hand, allow causal inferences to be drawn more confidently. If (again hypothetically) you took two otherwise equivalent groups, assigned one to make no lifestyle changes other than beginning to drink more wine while the other changed nothing, and then tracked differences in long-term cancer diagnoses, it's more likely that any observed group differences were caused by the experimental intervention.
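Neither the study nor this article includes any code, but a quick simulation can make the confounding logic concrete. In this minimal sketch (the variable names, effect sizes and "lifestyle" confounder are all invented for illustration), wine intake has no causal effect on cancer at all, yet the two still correlate in the observational data – while randomly assigning wine intake, as an experiment would, breaks the association:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder: a general "risky lifestyle" score that
# independently raises both wine consumption and cancer risk.
lifestyle = rng.normal(size=n)

# Wine intake tracks lifestyle, but has NO causal effect on cancer here.
wine = 2 + 0.8 * lifestyle + rng.normal(size=n)

# Cancer risk depends only on the confounder, not on wine.
p_cancer = 1 / (1 + np.exp(-(-3 + 1.0 * lifestyle)))
cancer = rng.random(n) < p_cancer

# Observational analysis: wine and cancer are correlated anyway.
print("correlation(observed wine, cancer):",
      np.corrcoef(wine, cancer)[0, 1])

# Randomised version: wine intake is assigned by coin flip, independent
# of lifestyle, so the spurious association disappears.
wine_assigned = rng.choice([1.0, 4.0], size=n)
print("correlation(assigned wine, cancer):",
      np.corrcoef(wine_assigned, cancer)[0, 1])
```

Running this prints a clearly positive correlation for the observational version and a near-zero one for the randomised version, which is exactly why "associated with" and "causes" are not interchangeable.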
The way that health press releases often present purely correlational evidence as though it is causal is, to an extent, understandable: "Wine Causes Cancer" is more eye-catching than "Researchers Uncover A Correlation Between Wine Consumption And Cancer That May Or May Not Be Causal." And because journalists often write stories based entirely on press releases, the end result is that news coverage often lacks nuance.
For the new research, a team of psychologists led by Rachel Adams at Cardiff University and including her colleague Christopher Chambers, the author of "The Seven Deadly Sins of Psychology: A Manifesto for Reforming the Culture of Scientific Practice", asked a bunch of university press offices to participate.
The press offices sent the team hundreds of their biomedical and health-related press releases before they went out to the public, and the researchers randomly assigned them to different conditions: some they didn't touch regardless of their contents or accuracy (the control group), whereas for the others they proposed edits that "aligned" the press release's headline and content with the nature of the evidence (with experimental evidence allowing for stronger causal claims, and purely correlational evidence presented with cautious language). Then they watched the press releases go out into the wild, evaluating how often their more careful approach carried over into any ensuing national and international news stories – and whether the toned-down releases led to less media coverage.
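For readers who think in code, here is a toy sketch of that two-arm assignment logic. It is not the team's actual procedure – the record fields, the `align_claims` helper and the crude wording substitution are all hypothetical stand-ins for the trial's editing step:

```python
import random

# Hypothetical press-release records; field names are invented for
# illustration and do not come from the Adams et al study materials.
releases = [
    {"id": 1, "headline": "Wine causes cancer", "evidence": "correlational"},
    {"id": 2, "headline": "New drug halves relapse rates", "evidence": "experimental"},
]

def align_claims(release):
    """Toy stand-in for the trial's editing step: soften causal wording
    when the underlying evidence is only correlational."""
    edited = dict(release)
    if release["evidence"] == "correlational":
        edited["headline"] = release["headline"].replace("causes", "is linked to")
    return edited

random.seed(42)
for release in releases:
    # Random assignment to arms, as in a standard two-arm trial.
    arm = random.choice(["control", "aligned"])
    sent = release if arm == "control" else align_claims(release)
    print(arm, "->", sent["headline"])
```

The point of the randomisation is the same as in the wine example above: because press releases land in each arm by chance, any later difference in headline accuracy or media uptake can be attributed to the edits rather than to the kinds of studies being publicised.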
The most important takeaways are that news headlines were more accurate when they were based on more accurate press releases (suggesting that journalists really are relying heavily on press releases rather than reading the studies themselves), and that, judged by the amount of media coverage each press release generated, there was "no evidence of reduced news uptake for press releases whose headlines and main claims aligned to evidence."
Now, press releases should be accurate for accuracy's own sake, but this does offer some evidence that honest press-release writers won't be punished with reduced media coverage for doing the right thing. This suggests, as Chambers and his team write in their abstract, that "[c]autious claims and explicit caveats about correlational findings may penetrate into news without harming news interest."
As the research team further point out, what this study can't answer is the actual effect of misleading versus appropriately hedged media coverage on news consumers themselves – that is, whether people's behaviour changes depending on whether they are reading accurate health coverage. That's a question for future research.
It would be fascinating and important to conduct a study like this on press releases related to psychological findings – another area where it's been fairly standard for a while for weak or conflicting findings to be presented in a much stronger, more attention-getting manner in press releases, potentially causing harm to readers.
Further reading
—Claims of causality in health news: a randomised trial
About the author
Jesse Singal (@JesseSingal) is a contributing writer at BPS Research Digest and New York Magazine, and he publishes his own newsletter featuring behavioral-science talk. He is also working on a book for Farrar, Straus and Giroux about why shoddy behavioral-science claims sometimes go viral.