Response to Simon Jenkins

As of the 23rd May 2022 this website is archived and will receive no further updates.

understandinguncertainty.org was produced by the Winton Programme for the Public Understanding of Risk, based in the Statistical Laboratory at the University of Cambridge. The aim was to help improve the way that uncertainty and risk are discussed in society, and to show how probability and statistics can be both useful and entertaining.

Many of the animations were produced using Flash and will no longer work.

DJS, Guardian, 1st August 2009

Simon Jenkins’ tirade against weather forecasters (“The Met Office thinks August will be wet. Buy futures in ice cream now”, 31st July 2009) shows a misunderstanding of what science can deliver in the face of extreme complexity. He contrasts “scientists who lecture ministers on the exactitude of their calling” with “public predictions so smothered in caveats and qualifiers as to be drained of significance”. He seems to expect precise predictions of the future, in spite of deriding any such claims in the light of “the irreducible probabilistic nature of life”. But there is a middle way between a demand for certainty and fatalistic resignation. I am a member of a rich community, proudly including insurers, statisticians, doctors and bookies, who use probability theory for prediction; its language may be unfamiliar, but it is not the “pseudo-science” claimed by Jenkins.

Jenkins’ view that predictions should be left to “astrologers, ball-gazers and seaweed” was, at least in the medieval period, very respectable: it was not until the 17th century that gamblers and life-insurers realised that they could make more money if they could put a number on the odds of winning or dying. Weather forecasters now routinely qualify their forecasts with probabilities, at least in private: the fact that a 65% chance of above-average temperatures - clearly of meagre informational value - was fed to the public as a promise of a “barbecue summer” is presumably the fault of an over-enthusiastic (to be generous) Met Office press department.

Perhaps they feel they have to indulge the unwillingness of people like Jenkins to deal with probabilities, since he dislikes the use of qualifiers such as “66% certain”, saying “the information is useless without knowing the likelihood of the ‘66%’ being correct”. But here I must admit he has made an excellent point. It is clear when an unqualified prediction is wrong, but how can we say that a probability is wrong? Fortunately this has been closely studied by weather forecasters, who strive to produce ‘reliable’ probabilities: of all the occasions on which they say ‘there is a 60% chance of rain’, it should rain on 60% of them. Reliable probabilities are essential if they are to be of use - if a bookie’s odds did not reflect the proportion of winners, he would lose money. If I had a seriously ill relative, I would want a reliable assessment of their chances of survival - not some vague reassuring platitude. It would be bad if the Met Office’s probabilities were truly unreliable, but the current lack of a barbecue summer is not sufficient evidence to conclude this.
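The idea of ‘reliability’ above can be made concrete with a small sketch: group a forecaster’s stated probabilities and compare each stated chance with the observed frequency of the event. This is an illustrative example with made-up data, not the Met Office’s actual verification procedure.

```python
# Minimal sketch of a forecast reliability (calibration) check:
# of all occasions with a stated 60% chance of rain, did it rain
# on roughly 60% of them? The data below are illustrative only.
from collections import defaultdict

def reliability(forecasts, outcomes):
    """Group outcomes by the stated probability and return, for each
    stated probability, the observed frequency of the event."""
    groups = defaultdict(list)
    for p, rained in zip(forecasts, outcomes):
        groups[p].append(rained)
    return {p: sum(obs) / len(obs) for p, obs in sorted(groups.items())}

# Ten days on which '60% chance of rain' was forecast; a reliable
# forecaster would see rain on about six of them (1 = rain, 0 = dry).
stated = [0.6] * 10
observed = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
print(reliability(stated, observed))  # {0.6: 0.6} - perfectly reliable here
```

With more data one would bin nearby probabilities together and plot stated chance against observed frequency; a reliable forecaster’s points lie near the diagonal.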

Jenkins does seem happy to put numbers on the chance of a British soldier being a casualty in Helmand (“in the order of 1 in 10”) and, rather remarkably, has pre-empted NICE by calculating in his head that the millions spent on swine flu would be better spent elsewhere. His conclusions about swine flu may or may not be wrong (although I suspect they are), but at least he should be congratulated for suggesting that quantitative analysis is needed in balancing the potential benefits and harms of policy decisions. Such analyses cannot tell us what to do, since there are always uncertainties, moral ambiguities and political pressures, but they make explicit the evidence being used and the judgements being made. Which is presumably why they are all too rarely applied.
