Why live interviews are a particular challenge for statisticians.

As of the 23rd May 2022 this website is archived and will receive no further updates.

understandinguncertainty.org was produced by the Winton programme for the public understanding of risk based in the Statistical Laboratory in the University of Cambridge. The aim was to help improve the way that uncertainty and risk are discussed in society, and show how probability and statistics can be both useful and entertaining.


I like doing live interviews for radio or TV – it’s exciting and they can’t edit what you say. The programme is almost inevitably running late, so last Saturday morning when I did an interview for Radio 4’s Today I remembered my media training and had prepared carefully to get my points over before they cut me off.

But it was a nightmare. Well maybe that’s an exaggeration, but the start went very badly. The topic was pre-election polls, and the researcher had assured me that they would ask me about the stuff in my recent blog about what went wrong and what might be done in future. So what was the first question from Sarah Montague? “Do the polls influence voting behaviour?”

This is a very fair question, and topical in the light of Lynton Crosby’s comments arguing that publishing polls should be banned in the three weeks before an election. The problem is that I had not given this issue any prior thought, had not studied the evidence, and had not constructed a coherent opinion about the influence of polls. Even as she was asking the question I could feel rising anxiety, and I am embarrassed to admit I just blathered on until I could turn the discussion back to what I had prepared for. After that it went quite well, although I spoke too fast, but afterwards I cycled home from the Cambridge studios, cursing myself loudly for my incompetence.

My partner had heard the interview and, as usual, had already diagnosed my problem: I did not have a standard response ready for when I was asked about something outside my expertise. So if I were not so slow-witted, I could perhaps have said that it was a very good question, that my gut feeling was that the polls might have some effect on behaviour, but that it was not my area of work and I did not know what real evidence there was. But instead I essentially panicked.

When I calmed down I remembered that this had happened before: a few years ago I did another Today interview when I was supposed to talk about assessing flood risks, but John Humphrys immediately asked me ‘Why do people stay in their homes when a flood is predicted?’ So I blathered. I now realise there is a pattern: I have got myself all prepared to talk about how numbers are constructed and how reliable they are, but the interviewer is not very interested in this – they want to ask about how people react to the numbers.

I have a similar experience when I talk about the statistics in my new book Sex by Numbers (available from all good, and a few not-so-good, book-sellers), as the inevitable question comes up – why do people act in that way? To which my only answer can be: I don’t know, and I don’t want to just come up with a fairly ill-informed opinion.

The common factor is this: the interviewer wants to turn the discussion either to

  • the effect the numbers have on people, or
  • why people affect the numbers.

I cannot criticise them – it is entirely understandable that they are interested in the human story around the stats, and it probably reflects what the audience would ask. But all this is generally outside the expertise of the statistician.

It doesn’t seem fair: astronomers don’t expect to be asked about the effect their discoveries might have on people. But we statisticians clearly have to be ready.

The first tactic, of course, is to have done your homework: spend some time examining the human context of the numbers, and at least be ready to summarise what social scientists have said about people’s behaviour. This is good professional practice. But it will not always work: however much you feel you should have a nice neat answer to whatever you are asked, it is impossible to be properly informed about everything, and so unanswerable questions outside your comfort zone will always come up.

So the second tactic is (unlike me on Saturday morning) to not only be ready for the question that you are not qualified to answer, but to positively welcome it. It gives a chance to explain that (to parody an old cliché) science means not having to say you know. It is OK not to have opinions about things until you have studied the evidence, and even then the conclusions may not be clear.

Perhaps an easier challenge is being asked about a pure matter of opinion, such as whether polls should be banned in the period before an election. I think I would have been better able to bat this aside, as it is certainly not part of my job as a statistician to have opinions about such a policy. But I probably would have made a mess of that as well.

[You can hear the interview here at 1:45, but I would rather you didn’t.]