
(Humanities) Chilcot: The question the report raises: why don't we listen to the truth?

밝은하늘孤舟獨釣 2016. 7. 15. 22:53

출처: http://www.bbc.com/news/magazine-36744911


밝은 하늘: I learned about "cognitive dissonance" through this article. As I understand it from the article below, cognitive dissonance is the act, and the attitude, of endlessly rationalizing one's beliefs by distorting whatever new facts or evidence come along. Tony Blair's decision to join the Iraq War is a prime example: he carries the psychological problem of cognitive dissonance. There are people in Korea with this same problem, and their faces come to mind, but I will not name them here.


Chilcot: Why we cover our ears to the facts


Image caption: Tony Blair speaking at a press conference after the Chilcot report publication

Do people moderate their views when presented with overwhelming evidence to the contrary? Not necessarily, writes Matthew Syed.

We like to think that we apportion our beliefs to the evidence. After all, isn't this the hallmark of rationality? When information comes along which reveals we should change our minds, we do. (apportion A to B: to divide or allot; to fit A to B) (hallmark: a defining feature)

Or do we?

Consider an experiment, where two groups were recruited. One group was adamantly in favour of capital punishment. They had strong feelings on the issue, and had publicly backed the policy. The other group were adamantly against, horrified by "state-sanctioned murder". (capital punishment: the death penalty) (state-sanctioned: authorized by the state) (state-sanctioned murder: i.e. the death penalty)

These groups were then shown two dossiers. Each of these dossiers was impressive. They marshalled well-researched evidence. But here's the thing. The first dossier collated all the evidence in favour of capital punishment. The second collated all the evidence against. (dossier: a complete file of documents on a subject)


Now you might suppose that, confronted by this contradictory evidence, the two groups would have concluded that capital punishment is a complex subject with arguments on both sides. You might have expected them to have moved a little closer in their views. In fact, the opposite happened - they became more polarised.

When asked about their attitudes afterwards, those in favour of capital punishment said they were impressed with the dossier citing evidence in line with their views. The data was rigorous, they said. It was extensive. It was robust. As for the other dossier - well, it was full of holes, shoddy, weak points everywhere.

The opposite conclusions were drawn by those against capital punishment. It was not just that they disagreed with the conclusions. They also found the (neutral) statistics unimpressive. From reading precisely the same material, they became even more entrenched in their positions.

What this (and dozens of other experiments) reveal is the way we filter new information when it challenges our strongly-held beliefs or judgements. We use a series of post hoc manoeuvres to reframe anything inconvenient to our original position. We question the probity of the evidence, or the credentials of the people who discovered it, or their motives, or whatever. The more information that emerges to challenge our perspective, the more creatively we search for new justifications, and the more entrenched we become in our prior view. (entrenched: dug in behind trenches; firmly fixed)

This tendency is called "cognitive dissonance".


Image caption: When beliefs are challenged by evidence, people may become more entrenched in those beliefs

You can see the hallmarks of cognitive dissonance in the build-up to and aftermath of the Iraq War. The Chilcot report made pointed criticisms over the legal advice, lack of cabinet oversight and post-war planning and policy. But let us focus on the way the primary evidence used to justify war - namely, the existence of WMD - was serially reframed.

On 24 September 2002, before the conflict, Tony Blair made a speech where he emphatically stated: "His [Saddam Hussein's] WMD programme is active, detailed and growing… he has existing plans for the use of weapons, which could be activated in 45 minutes…"

The problem with this claim is that Saddam's troops didn't use such weapons to repel Western forces, and the initial search for WMD drew a conspicuous blank. And yet, as the social psychologists Jeff Stone and Nicholas Fernandez have pointed out in an essay on the Iraq conflict, Blair didn't amend his view - he reframed the evidence. In a speech to the House of Commons, he said: "There are literally thousands of sites... but it is only now that the Iraq Survey Group has been put together that a dedicated team of people… will be able to do the job properly… I have no doubt that they will find the clearest possible evidence of WMD."




So, to Blair, the lack of WMD didn't show that they were not actually there. Rather, it showed that inspectors hadn't been looking hard enough. Moreover, he had become more convinced of the existence of WMD, not less so.

Twelve months later, when the Iraq Survey Group couldn't find the weapons either, Blair still couldn't accept that WMD were not there. Instead, he changed tack again, arguing in a speech that "they could have been removed, they could have been hidden, they could have been destroyed".

So now, the lack of evidence for WMD in Iraq was no longer because troops hadn't had enough time to find them, or because of the inadequacy of the inspectors, but because Iraqi troops had spirited them out of existence.

Image caption: Tony Blair in Iraq after the 2003 invasion - failure to find WMDs did not change his mind

But this stance soon became untenable, too. As the search continued in a state of desperation, it became clear that not only were there no WMD, but there were no remnants of them, either. Iraqi troops could not have spirited them away.

And yet Blair now reached for a new justification for the decision to go to war. "The problem is that I can apologise for the information that turned out to be wrong, but I can't, sincerely at least, apologise for removing Saddam," he said in a speech. "The world is a better place with Saddam in prison."

This is not intended as an argument against Blair - rather, as an illustration of the reach of cognitive dissonance. Indeed, when you read the Chilcot report, this tendency - not just with regard to WMD - peppers almost every page.

Image caption: Chilcot report: Illustrations of cognitive dissonance "pepper" nearly every page

Science has changed the world because it prioritises evidence over conviction. Judgements are subservient to what the data tells us. The problem is that in many areas of our world, evidence is revised to fit with prior assumptions - and the tragedy is that we are often unaware of this process because it happens subconsciously. It is noteworthy, for example, that the Chilcot report nowhere states that Blair was actively deceitful.

The good news is that we can combat this tendency, and measurably improve our judgements, when we become alert to it. Indeed, the hallmark of pioneering institutions is that they deal with cognitive dissonance not by reframing inconvenient evidence, but by creating systems that learn from it (and thus avoid related biases such as "group think"). This should be the most important lesson of Chilcot.

When so-called Islamic State launched a major offensive in Iraq in 2014, and the country was on the brink of a civil war - which some commentators linked to the 2003 invasion - Blair found another avenue of justification.

He pointed to the policy of non-intervention in Syria, which had descended into its own civil war. In an article written for his personal website, he said: "In Syria we called for the regime to change, took no action and it is in the worst state of all." In other words he might be suggesting: "If things look bad in Iraq now, they would have been even more awful if we had not invaded in 2003."

For our purposes, the most important thing is not whether Blair was right or wrong on this point, one which he re-affirmed this week. The vital thing to realise is that had non-intervention in Syria achieved peace, Blair would likely still have found a way to interpret that evidence through the lens of the rightness of his decision to invade Iraq. In fact, he would probably have become more convinced of its rightness, not less so. (rightness: correctness, justifiability)

And this is why the Chilcot report, despite its mammoth detail, will have little effect on the core judgements of those involved with the Iraq War. As with everything else, it will simply be reframed.