HT10: Exaggeration and opinion versus research evidence
We’ve now reached the final post in our How to “Research the Headlines” series. Our last tip concerns news stories that appear to be based on evidence from a research study or studies, but on closer inspection range from simple exaggeration to unsupported opinions of the writer or some other source. This shaky factual basis can easily arise when a researcher speaking about their recent findings goes beyond what the data can actually support, often simply through over-enthusiasm for their latest piece of work. It is more serious, however, when there is genuinely no research basis for a story that’s being reported as such. So here are a few pointers to help you sort the fact (all we mean by “fact” in this context, of course, is that the story is based on some published research; the nature of a fact is beyond our scope today!) from the fiction (opinion or hyperbole masquerading as research).
In previous posts (many of which we call upon in this final part of How to “Research the Headlines”), we’ve suggested looking for details in the news report that might link back to the research the story is based on. For example, does the media article cite a published paper, or give details of those who conducted the research, or even include some quotes from the team? That can be a pretty good clue that the story being reported has some basis in research. It doesn’t mean the research is faultless, of course, but at least some verifiable publication exists that can be checked.
With a verifiable primary source, it is possible to see how closely the reporting matches the research. That said, many research papers sit behind a subscription paywall, or are hard to penetrate because of their specialised language and analysis, so checking their content isn’t always possible. If in doubt about a story, do have a look at the excellent websites Behind the Headlines from NHS Choices and Sense about Science, or similar, which provide expert critiques of both the reporting and the research behind some of the latest headline-grabbing stories. We hope we’re contributing our voice to that agenda too; get in touch if you’d like us to look into a story, and check out the full How to “Research the Headlines” series for tips on critically consuming research in the media.
There are a number of mechanisms through which research reporting can stray from the primary source, and therefore introduce inaccuracies. These might range from the innocuous to the downright dangerous, and we’ve arranged the examples below very roughly in order of their likelihood to create misleading or damaging coverage.
Researchers sometimes say things off the cuff
Anecdotally, a fear that many researchers have about talking to the media directly is “What if I say something stupid?!”. It’s easily done; under pressure of a deadline from a journalist or press office, and taking a call between other things, a word or two can easily get mixed up. To err (and um and ah, and “hang on, no, what I meant to say was…”) is human. That fear usually subsides with experience, and we would definitely encourage researchers to engage as fully as possible with the media, and to take up any media training offered by their research centres or universities.
However, that’s not the only reason why researchers can sometimes be reluctant to speak directly to the media. In April, we wrote about a case where a researcher, Dr. Rebecca Sear, discussed her research with a journalist face-to-face and the interview was not recorded. Dr. Sear then found that the report incorporated, as direct quotes, snippets she had said in quite different contexts, and felt that it misrepresented what she had said. In our coverage, we discussed the importance of researchers providing clear sound-bites and avoiding, or terminating, conversations about research implications they are not comfortable with. To avoid such confusion, many researchers choose to communicate only via a carefully worded press release (which is why you might find the same quotes liberally sprinkled across different news stories).
Going beyond the data
Errors of expression aside, it’s not unknown for a researcher to be asked a leading question that, while usually related to their recent work, goes beyond what their specific study was looking at: “So, given your recent work on X, what do you think about the state of Y?”. Most researchers, being painfully polite individuals, will answer in terms of “Well, we’d need to look at that, but I would assume…”, but that answer can then be reported alongside their confident discussion of their recent findings, and the two become conflated and given equal footing.
…but other researchers can help keep that in check
It’s not easy to spot those little errors of expression, or instances of a researcher going slightly off-topic (although direct overselling is usually more apparent), without being able to access the primary research. But if the piece has been well reported, someone independent of the research team may have been asked to comment. Inclusion of those external voices can be really helpful: on the whole, they temper the enthusiasm and point out how the specific research fits within a wider context, or needs to be repeated and extended before it might have more general applications. We highlighted an excellent example of this in our piece on whether autism can be identified as young as 2 months old. Most of the reporting (for example the BBC) included experts such as academics with expertise in the area or a representative from the National Autistic Society, who raised caution about the small sample size and emphasised the need for further research to replicate the results before concrete conclusions could be drawn.
Another ‘fact versus fiction’ aspect to look out for is when the headline promises the Earth, and the article just can’t deliver. You often see this in health reporting. For example, a headline might state “Chocolate as new cure for Alzheimer’s”, but the article itself might not actually be about Alzheimer’s, only some possible marker of it. And it might not be a study in humans, but a small animal study (small in that it uses small animals, more often than not rats or mice, and/or that the animal sample is small in number). And maybe there was nothing about chocolate in the study either; it might just have been some compound often found in chocolate, which in the context of a chocolate bar could have quite different effects (because of all the other good and bad things chocolate contains). Sounds like a bit of a red herring, and possibly a rather stretched example? Sadly not. In our posts we often highlight instances where recognisable terms in the headline distort what the research was actually about (“Cocoa and the brain” and “Young blood and old brains” highlight a couple of those errors). And as we said in our first How to “Research the Headlines” post: don’t stop at the headline. Headline hyperbole is pretty common, and the particularly frustrating thing about it is that in many cases the article beneath is a good summary of the research.
No research behind the story
Every so often, a news story comes along that’s based on nothing. Surprising, we know. Column inches need to be filled, and sometimes stories are padded out, or regurgitated from earlier pieces. Those situations can be relatively harmless (if misleading), but when they affect decisions around health, social policy, or other important issues, they become more problematic. We can’t say why, but every so often such fabrications are constructed entirely in the newsroom. One of our Research the Headlines contributors detailed just such an occasion in “No science behind the story”, and was only aware of it because it linked back to her own, legitimate research.
So a genuine researcher, university or group being named in the article isn’t always a failsafe sign that the story is reporting some recent findings. Again, the crucial thing is to check whether the story links to a specific publication that you can access.
If it sounds too good to be true...
Well, it usually is. As with headline hyperbole, the latest miracle cure is often found lacking on closer scrutiny, or sometimes it might be somewhat premature to describe it as such. That’s not to say major breakthroughs don’t happen, but when they do, they’re usually covered very widely across a number of news and other media outlets in detail, rather than one or two splashes to generate website traffic or an extra sale (“2 minutes exercise will stop you ageing”, for example!). As with the consumption of any media, an optimistically cynical disposition is probably to be encouraged: optimistic that we will make advances for the greater good, while retaining our critical faculties in appraising those.
“I’m a scientist, get my opinion out there”
The final thing to look out for is when a researcher is quoted in an article and something they say suggests it’s based on evidence, when in fact it’s simply their belief that something is the case. There are many examples of this, but one of the more famous cases from a few years ago involved Baroness Susan Greenfield, scientist, writer and broadcaster. In some interviews, she suggested a link between computer games and the development of ‘dementia’ in children. The statement was not based on specific research, but rather on Greenfield’s assumptions regarding the possible harm computer games might cause. Given her status, these thoughts and musings were sometimes mistakenly reported as evidence. At the time, Ben Goldacre in his Bad Science blog, and high-profile academics including Professor Dorothy Bishop, were quick to point out the lack of research behind the claims. Thankfully there are a lot of good researchers out there keeping others in check, but sometimes the damage has already been done before the counter-arguments can be heard, and the latter are sadly often less well reported.
Of course, people have opinions, and it is right that they are allowed to express them. Those in ‘expert’ positions have an even greater duty of care when expressing theirs in the media. But as with all opinions, context is key: if there’s no basis in research evidence, that needs to be made clear. At their most banal, unsupported opinions can be distracting or simply generate confusion; at worst, they can be dangerous. So, once again, check whether the news story links back to a specific piece of research. If it doesn’t, Ask for Evidence before making your mind up.