
Lies, damned lies and statistics (about press releases)

by on 2014/12/19

According to the famous saying, there are three types of lies: lies, damned lies and statistics. Earlier this month, we had some of the latter, though interestingly the statistics concerned the extent to which researchers exaggerate their findings in press releases, and how that exaggeration filters through to media reports. The Independent’s headline suggested “Bad science reporting blamed on exaggerations in university press releases”, so just what have we researchers been up to?

What did the research say?

Professor Petroc Sumner and colleagues from universities in Wales and Australia were interested in how the content of press releases associated with research studies influenced the way in which these findings were then handled in the media. This is an important topic as the public understanding of research can be influenced by media representations. However, the process has a number of weak points where misunderstandings can creep in (intentionally or otherwise). Misleading messages have the potential to cause harm, or at least affect the perception of research in general.

In their study, the researchers focussed on the press releases that universities and institutions often distribute alongside new research. They gathered the 2011 press releases from the top 20 universities in the UK for research that related to studies with “possible relevance to human health”. They then tracked down the original research paper and the media coverage associated with each release. Each of these sources was rated on a number of criteria (more below), and a batch was double rated to check the agreement between different raters. In total they rated 462 press releases and research papers, which generated 668 news stories.
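
For the statistically minded, a common way to check consistency between two raters coding the same items is Cohen’s kappa. The snippet below is only a generic illustration with made-up codes, not the authors’ analysis code (the rating categories and agreement checks actually used are described in the paper itself).

```python
# Illustrative inter-rater agreement check (hypothetical data, not from the study).
# Two coders independently rate the same batch of press releases, e.g. for how
# explicit any advice is (0 = none, 1 = implicit, 2 = explicit to the reader).
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 2, 1, 0, 2, 2, 1, 0, 1, 2]  # hypothetical codes from rater A
rater_b = [0, 2, 1, 1, 2, 2, 1, 0, 0, 2]  # hypothetical codes from rater B

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance-level agreement
```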

The sources were rated for how explicit any advice given was, whether causal statements were made from correlational results, whether caveats and justifications were present, whether details of the research methods appeared, and whether animal studies were discussed in terms of their implications for humans. The study, published in the British Medical Journal, is freely accessible, and the coding is described in detail there.

The study reported a number of findings, including that 40% of press releases contained more direct advice than the original research paper, and that 33% of correlational results were described in more causal terms in the press release. Importantly, media coverage was 6.5 times more likely to contain exaggeration when the press release was itself exaggerated. The findings for correlations described causally were even stronger: when a press release misrepresented a correlation in causal terms, the media coverage was 20 times more likely to report a similarly causal interpretation.
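
To see how a “6.5 times more likely” figure can arise, here is a toy calculation with entirely hypothetical counts (the paper reports the actual figures and uses a more formal statistical model); it simply compares the rate of exaggerated news stories arising from exaggerated versus non-exaggerated press releases.

```python
# Toy calculation with hypothetical counts, illustrating how a "6.5 times more
# likely" comparison can be formed; these are NOT the study's numbers.
stories_from_exaggerated_pr = 200   # hypothetical: stories traced to exaggerated releases
exaggerated_stories = 117           # hypothetical: of those, how many also exaggerate

stories_from_accurate_pr = 200      # hypothetical: stories traced to non-exaggerated releases
exaggerated_stories_accurate = 18   # hypothetical: of those, how many exaggerate anyway

rate_when_pr_exaggerated = exaggerated_stories / stories_from_exaggerated_pr        # 0.585
rate_when_pr_accurate = exaggerated_stories_accurate / stories_from_accurate_pr     # 0.09

print(f"Ratio of rates: {rate_when_pr_exaggerated / rate_when_pr_accurate:.1f}x")   # 6.5x
```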

So, while exaggeration did appear in some media reports based on non-exaggerated press releases, the researchers concluded “that most of the inflation detected in our study did not occur de novo in the media but was already present in the text of the press releases produced by academics and their establishments”.

What did the media say?

The study has been well reported in the media, not only in The Independent quoted above but also in TIME, The Atlantic and The Guardian, to name a few. There’s also a really good piece from NHS Choices’ Behind the Headlines, and an editorial accompanying the study in the BMJ from Ben Goldacre, an advocate of clear and constructive communication between researchers and the media.

Given the findings though, what can be done? It’s clear that in many cases, press releases have been (and are) exaggerated. The simplest response would be for researchers and their institutions to take responsibility and be more cautious in their press releases. That might seem straightforward, but the pressure to publish and to have that research disseminated can sometimes result in an over-egged pudding (although that is no excuse).

Of course, the press release is only one part of the process. In our How to “Research the Headlines” series, we cautioned against an over-reliance on press releases as the primary source of a news story for exactly this reason: they can “overemphasise certain aspects of the findings”. But going back to the original research assumes an idealised situation in which journalists have both the time and the specialised expertise to get the facts straight. That is increasingly uncommon given the reduction in specialised health or research correspondents at many media outlets, and another of our How to “Research the Headlines” pieces suggested that as researchers “we need to be aware of this when compiling our press releases, and ensure we carefully choose our words (and ensure they’re not lost in edits by university or journal press offices)”.

The bottom line.

Just a few weeks ago, I wrote about my most recent experience of going from a published paper to a press release, and the media reporting of that, in “Read all about it: from ‘bench to newsstand’”. The paper and press releases are all freely available, so have a look and see how well my colleagues and I did in keeping our enthusiasm in check…

Sumner, P., Vivian-Griffiths, S., Boivin, J., Williams, A., Venetis, C. A., Davies, A., Ogden, J., Whelan, L., Hughes, B., Dalton, B., Boy, F., & Chambers, C. D. (2014). The association between exaggeration in health related science news and academic press releases: retrospective observational study. BMJ, 349, g7015. doi:10.1136/bmj.g7015
