A heuristic for sorting science stories in the news

Dominic Lawson's article in the Sunday Times today [paywall] quotes me as having the rather cynical heuristic: "the very fact that a piece of health research appears in the papers indicates that it is nonsense." I stand by this, but after a bit more consideration I would like to suggest a slightly more refined version for dealing with science stories in the news, particularly medical ones.

"Ask yourself: if the study had come up with a negative result, would I be hearing about it? If NO, then don't bother to read or listen to the story"

The immediate impulse behind Lawson's article was a spate of studies claiming associations between ordinary daily habits and future bad outcomes: eating a lot of white bread with becoming obese, being cynical with getting dementia, light bedrooms with obesity (again). All these stories associate mundane exposures with the later development of dreaded outcomes, i.e. the classic 'cats cause cancer' type. My argument is that, since we would not be reading about a study in which these associations had not been found, we should take no notice of these claims.

Why my cynicism? There has been a lot of public discussion of potential biases in the published scientific literature – see for example, commentaries in the Economist and Forbes magazine. The general idea is that by the time research has been selected to be submitted, and then selected for publication, there is a good chance the results are false positives: for a good review of the evidence for this see ‘A summary of the evidence that most published research is false’. There is also an excellent blog by Dorothy Bishop on why so much research goes unpublished.
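To see why the published record can tilt towards false positives, it helps to run the arithmetic. The sketch below is my own illustration, not taken from any of the papers cited above, and the numbers (power, significance level, prior) are assumptions chosen for the sake of example. It computes the proportion of 'positive' findings that reflect a real effect, if only positive results ever get written up:

```python
# Illustrative arithmetic behind 'most published research is false'.
# All numbers are assumptions for the sake of example.
def ppv(prior, power=0.8, alpha=0.05):
    """Proportion of positive results that reflect a true effect
    (the positive predictive value), if only positives are reported."""
    true_positives = prior * power          # real effects correctly detected
    false_positives = (1 - prior) * alpha   # null effects wrongly 'detected'
    return true_positives / (true_positives + false_positives)

# If only 1 in 20 of the hypotheses tested is actually true,
# under half of the positive findings are real:
print(round(ppv(prior=0.05), 2))
```

With a higher prior, say well-motivated hypotheses rather than data trawls, the picture improves sharply: `ppv(0.5)` comes out at about 0.94.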

The point of this blog is to argue that such selection bias is as nothing compared to the hurdles overcome by stories that are not only published, but publicised. For a study to be publicised, it must have

• Been considered worthwhile to write up and submit to a journal or other outlet
• Been accepted for publication by the referees and editors
• Been considered ‘newsworthy’ enough to deserve a press release
• Been sexy enough to attract a journalist’s interest
• Got past an editor of a newspaper or newsroom.

Anything that gets through all these hurdles stands a huge chance of being a freak finding. In fact, if the coverage is on the radio, I recommend sticking your fingers in your ears and loudly saying ‘la-la-la’ to yourself.
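The cumulative effect of these hurdles can be mimicked with a toy simulation, which is entirely my own sketch with arbitrary numbers. A thousand studies of a non-existent effect are run; only the 'significant' ones get submitted, and only the most striking of those are sexy enough to reach the news, so the publicised estimates are freaks of sampling noise:

```python
import random

# Toy simulation of the publicity filter (illustrative numbers only).
random.seed(1)

# 1000 studies of an effect whose true size is zero; each study
# reports a noisy standardised estimate.
estimates = [random.gauss(0, 1) for _ in range(1000)]

# Hurdle: only 'statistically significant' results get written up...
submitted = [e for e in estimates if abs(e) > 1.96]

# ...and only the few sexiest of those survive the press release,
# the journalist and the news editor to be publicised.
publicised = sorted(submitted, key=abs, reverse=True)[:5]

print(f"{len(submitted)} of 1000 null studies looked significant")
print("publicised estimates:", [round(e, 2) for e in publicised])
```

Every publicised estimate here wildly overstates a true effect of exactly zero, which is the 'freak finding' in miniature.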

The crucial idea is that since there is an unknown amount of evidence that I am not hearing about and that would contradict this story, there is no point in paying attention to whatever it is claiming. It is like watching a video of a football team scoring goals, and then suddenly realising that you are only being shown the 'successes' and not the ones they let in: the evidence just shows that they are capable of scoring, but not whether they score more than they concede. So, if you're interested in assessing the quality of the team, stop watching the video [of course if you just enjoy the spectacle, carry on].

The heuristic is even more appropriate when you hear or read of any survey by any organisation, particularly charities.

This all may seem rather cynical, but bear in mind that I am a grumpy old git (although one now trying to avoid cynicism, as I have no wish to become demented). And just think of the time you can save.

[Added 2nd June 2014: I should have made clear that I am only talking about single studies: proper reviews of the totality of evidence should be listened to. So this is not an excuse to ignore evidence connecting smoking and lung cancer.]

PS A recent study argues that newspapers preferentially cover medical research with weaker methodology. However, I must apply my own heuristic to this: would I have heard about it if the researchers had found the opposite? And you should ask yourself, would I be telling you about it?

PPS I have been struggling to find a suitable name for this heuristic, perhaps with some literary or classical allusion to someone who was misled by only being told selected items of information. Perhaps the ‘Siddhartha’ heuristic? Siddhārtha Gautama was a prince who was only told good news, and protected from seeing suffering and death. But he finally realised that he was not seeing the world as it really was, and so he left his palace, first to take up the life of a wandering ascetic, and eventually to become the Buddha.

Comments

Keith Grimaldi's picture

Re the white bread story - it's even worse when a conference presentation is publicised: it hasn't even been peer reviewed yet, and no-one can access the data.

Re: ‘most published research is false’ - the original Ioannidis paper is often used to state this. I have a question - does it really refer to ALL research or only that research which relies on statistical association (like the white bread)? If the latter this distinction is extremely important and should always be made. If the former - well, just about all of my published laboratory experiments have been reproduced many times by others, am I an outlier? Don't think so!

So is it really true that "most published research is false" - is this really the impression we want to give?

RJStephens's picture

Is it useful to apply the rule of thumb that if a positive piece of research (or indeed a negative piece of research) results from a pre-stated (and reasonably logical) hypothesis it is much more likely to be true than if it comes from trawling the data?

Back Alley's picture

Too many people shy away from publishing results that they think failed. There was an organisation that looked at this, and it found that a great proportion of the studies published were 'successful' ones. (I can't remember the percentage, but it was something really high.)
-The Back Alley: http://www.backalleywebzine.com/how-to-write-a-story-and-touch-your-audi...