20170102_NW

Source: BBC Radio 4: The New World

URL: http://www.bbc.co.uk/programmes/b086nzlg

Date: 02/01/2017

Event: The New World: Nothing but the Truth

Credit: BBC Radio 4

People:

    • Dick Cheney: Vice President of the United States, 2001-2009
    • Jo Fidgen: BBC journalist
    • Professor Stephan Lewandowsky: Cognitive psychologist
    • Professor Brendan Nyhan: Professor of Government, Dartmouth College

[This is a partial transcript of the BBC Radio 4 programme The New World: Nothing but the Truth.]

Jo Fidgen: First question: if this is a post-truth era, when did it start?

Dick Cheney: Simply stated, there is no doubt that Saddam Hussein now has weapons of mass destruction. There is no doubt that he is amassing them against our friends, against our allies and against us.

Jo Fidgen: The US Vice President Dick Cheney, speaking in 2002, a year that saw the truth being spun and sexed up as never before. Cognitive scientist Stephan Lewandowsky was paying close attention.

Stephan Lewandowsky: You may remember these weapons of mass destruction that we were told about, at the time - that they were being stockpiled in Iraq. And then, as it turns out, after the invasion, none of them were found. To my mind, I would see that as the first clear instance of a lot of information being out there that then, later on, turned out to be without any basis in fact.

Jo Fidgen: Professor Lewandowsky's professional curiosity was piqued. He's an expert in the workings of the brain. I should mention that - we're going to be hearing from a lot of experts in this programme. Old habits, and all that. I promise to address that, later. Anyway, back to the Prof.

Stephan Lewandowsky: What fascinates me, as a cognitive scientist, is to see how people respond when they develop the expectation that something should be the case, and then it turns out it's not the case. What happens?

Jo Fidgen: He started designing a series of experiments to explore our relationship with the truth, with some pretty disconcerting results, as we'll hear. Around the same time, over in the States, Brendan Nyhan was having a similar thought. He went on to become Professor of Government at Dartmouth College but back in his youth in 2001, he'd been a fact-checking pioneer, setting up a website called Spinsanity.

Brendan Nyhan: And I saw how difficult it was to change people's minds about political facts that ran contrary to their political predispositions.

Jo Fidgen: He suspected there was something particular in the way we process facts, when they're coloured by ideology.

Brendan Nyhan: So when I entered academia, we did a number of experiments with mock news articles, in which claims by prominent political figures were directly contradicted.

Jo Fidgen: One was from the year after the Iraq War and quoted President Bush making the claim that there had been a real risk Saddam Hussein would pass on weapons of mass destruction to terrorists. A second article made it clear that the claim was false and that there was no evidence of WMDs in Iraq. Brendan Nyhan showed them to people, then asked who still believed that Iraq had WMDs.

Brendan Nyhan: And what we found was: at least in some cases, it was possible for the people who were most vulnerable to those false claims - the ideological group, in this case conservatives, because we're talking about George W Bush - to come to believe in them even more strongly, if they were exposed to corrective information.

Jo Fidgen: That is so extraordinary that it's worth saying again. The people who were ideologically invested in believing that there were weapons of mass destruction in Iraq - that George W Bush didn't lie - not only tended to resist the evidence that there were no such weapons, but some became even more convinced that there were. It's called the "backfire effect".

Stephan Lewandowsky: Now the process by which that happens is probably something that we call "counter-arguing". So when people are confronted with information that is challenging to their world views, then while they're listening to you telling them something, they will be sitting there thinking "Well, but actually no", and they will think of arguments why that might not be the case. But developing these counterarguments - what can happen is that people become more entrenched in their false beliefs than they were to begin with.

Jo Fidgen: What, they persuade themselves, effectively?

Stephan Lewandowsky: They persuade - exactly.

Brendan Nyhan: There's growing evidence that what we think of as rationality is often us rationalising instincts, intuitions and feelings that we've already had, after the fact. And people can confabulate all sorts of rationalisations for beliefs or opinions they hold, but those are often retrospective justifications that we construct.

Jo Fidgen: In other words, we're inclined to put our feelings first, and make the facts fit. If a fact challenges how we see the world, we just don't believe it. We think we're reasoning but we're actually rationalising.

Brendan Nyhan: We have to be honest with ourselves about this - one thing it's important for your listeners to think about is the extent to which this is not a problem that someone else has. We all are vulnerable to this. We all have these predispositions, and in fact, the people who are more knowledgeable or sophisticated, in some cases, are better at this kind of rationalisation, they're better at filtering the information that they hear and selectively accepting information that's consistent with their point of view. So, BBC listener - that means you [Jo is laughing] - it could be you.

[To be continued.]