20130502_NS

Source: BBC Radio 4: Today Programme

URL: http://audioboo.fm/boos/1365961-nate-silver-on-today-programme-pundits

Date: 02/05/2013

Event: Nate Silver: "people are overconfident when they make predictions"

Attribution: BBC Radio 4

People:

  • Evan Davis: Presenter, BBC Radio 4: Today Programme
  • Nate Silver: Author and statistician

Evan Davis: On this programme each morning you hear a lot of predictions - weather, racing tips, election forecasts, economic news - so, how seriously should you take them? Well, in Britain at the moment is a man who has thought hard about predictions - Nate Silver is his name. He has a good forecasting record, having famously called the results of the last US election in all 50 states - he got 49 out of 50 in the election before. He has written a book called The Signal and the Noise - Why So Many Predictions Fail but Some Don't, and I sat down with him earlier this week. And I thought it might be useful if he gave you a guide as to how to consume all those predictions we're bombarded with. So, what is the first thing to look out for, when hearing a prediction on this programme?

Nate Silver: One question I would ask is: how confident does the person seem to be in their prediction? And the more confident they are, the more suspect you should regard it, actually. There's a lot of evidence that people are overconfident when they make predictions. Every now and then there's a real cause to put your neck on the line and be very, very sure of yourself, but we're dealing with a complex society, a complex universe. And in general, people who are able to understand what the dissenting arguments are, weigh that information carefully and make a thoughtful reply are more useful than the pundits.

Evan Davis: How much weight should we give to the political pundits who we have on our programme? They often make predictions about who's up, who's going to be up, who's down, what the - what's going to happen at the next election, what's going to happen to the Coalition, this and that...

Nate Silver: So, the short answer is zero. [They laugh.] Over time, I've tracked how well their predictions do, and they're fifty-fifty. So they do exactly as well - no better, no worse - than flipping a coin.

Evan Davis: So political pundits are not strong. Of course, we have a lot of economic and business forecasts, I'd say, quite a few forecasts of what the next quarter's GDP figure is going to be. Normally, you know that one within a relatively small range - it's not going to be plus ten or minus ten...

Nate Silver: Yeah, but I would say there's uncertainty in calculating GDP - not just predicting it, but knowing what it is right now. People were relieved, for example, that the UK did not go into a triple dip, but the growth rate was so anaemic and so slow that when that figure is revised it might turn out to be a recessionary number after all. What studies have found, and this has been true for years and years, is that forecasting the economy more than three to six months in advance is nearly impossible. The way we interact today is so much more complex than it used to be, and so dynamic, with consumer expectations figuring in, that to know what the UK economy will look like in 2015, when you guys have your next election, is really kind of a fool's errand to guess at.

Evan Davis: I mean, there's a very interesting section in the book - very relevant, I think, to our programme - about foxes and hedgehogs. I mean, I think maybe you should describe it, rather than me, but in a way it tells us that we should try and pick more foxes as guests, rather than hedgehogs. But you explain...

Nate Silver: Yes, so, a fox - to simplify a bit - is someone who knows a lot of little things. They kind of scavenge for different pieces of information and data, and put it together. Whereas a hedgehog is someone who has one really big idea, and usually the people with one really gigantic idea make for better guests on radio programmes and television. They tell a very cohesive and coherent narrative. But the real world is messy, right? And people who can scrounge through that mess and say, "This piece of data is very relevant, this is somewhat relevant, this is a distraction, here's a bright shiny object that I want to ignore" - that fox-like personality has been shown to yield better forecasts.

Evan Davis: Right, so basically we tend to put on the radio hedgehogs who've got a very simple story...

Nate Silver: There's a study by this economist in the United States, who looked at how well economists and political scientists forecast over a period of 20 years, and he found that the more people were in the media, the worse their forecasts tended to be. [They laugh.]

Evan Davis: Oh dear. Well, that's one for us to ponder on. Let me just ask you something completely different, because of course statistics do play a large part in all of this. We've had this enormously big spat, in the last couple of weeks, about a very influential economic paper on austerity and the level of debt that causes catastrophic declines in growth and output. Reinhart and Rogoff wrote the paper, it had a big effect and then it turned out there was an Excel spreadsheet error in the original paper.

Nate Silver: Yeah.

Evan Davis: I mean, I suppose that should just make everybody more humble, on all sides of all debates.

Nate Silver: I mean, that's kind of my general universal contention, right - you're dumber than you think, and everyone else is also dumber than they think, so it kind of works out. But they made a very simple Excel coding error, and they also made a lot of assumptions that are debatable. And I read a lot of academic papers, and, look, there are problems here too. Academics also want to provide false certainty, and they tend to be very skilled at obfuscating how messy their conclusions are by using very cumbersome technical language. It's hard to parse, and so you assume that if someone uses a lot of jargon they must know what they're talking about - and I think the reverse is true. Because the world is complicated, and we human beings are a small part of it, our understanding of it is relatively simple, and so if you stick to the basics and can explain something to a wide audience, I think that's usually a sign of credibility. If you have some complex model that you barely understand, then who knows - your heart might be in the right place, but your head is probably all spun around.

Evan Davis: Okay, one last question, then. Are you dumber than you think? Because you've had a fantastically good record -

Nate Silver [laughing]: I'm sure I will -

Evan Davis: - you've got to, surely, have a fall coming, no?

Nate Silver: I'm pretty sure of it. People love to - especially in the news media - build someone up and then knock them down. And we know that we're overperforming relative to how well we're supposed to do, right? You're not supposed to get all 50 states right in 2008 - or over, it was about 46 - so I can guarantee you, if you make enough 80-20 bets - and you'd much rather make an 80-20 bet than a 55-45 bet - you're going to get them wrong 20% of the time. Do enough elections and, you know, I will be unravelled at some point, I'm sure.

Evan Davis: Nate Silver, lovely talking to you - and the book, I should say, is The Signal and the Noise: The Art and Science of Prediction. Thanks for talking to us.

Nate Silver: Thank you.