Richard Wilson's blog

richardcameronwilson AT yahoo dot co dot UK

Archive for the ‘groupthink’ Category

Groupthink, Self-serving bias, Space Cadets and the Stanford Prison Experiment – “Don’t Get Fooled Again” at the Beyond Words book festival


Earlier this week I had a very enjoyable afternoon discussing “Don’t Get Fooled Again” at University College School’s “Beyond Words” book festival, and looking at some of the more eye-catching bits of psychological research that I came across while writing the book. The audience had some challenging and wide-ranging questions, and I thought it might be interesting to post some more background links here.

I was about halfway through my research when I realised that a great many of the delusions I was looking at seemed to come down, at least in part, to what might once have been called “excessive vanity”: the conspiracy theorists who think that they alone are part of some special group with special knowledge about AIDS, the Illuminati, 9/11 or the “great asbestos scam”, while everyone else muddles along in their brainwashed ignorance. Or the crank who’s convinced that, just by using his uncommon “common sense”, he’s managed to find a fatal flaw in a well-established scientific theory that generations of world-renowned biologists have somehow contrived to miss.

But what I hadn’t known was the degree to which almost all of us seem to over-rate ourselves and our own abilities, at least to some degree. The average driver thinks that they’re a better driver than the average driver – and reason dictates that we can’t all be above average. Most people also seem to rate themselves at least slightly better than others in intelligence, and significantly better in warmth, generosity and – my personal favourite – likelihood of leaving the largest slice of pizza for someone else when there are only two slices left. The research link for that particular claim can be found here – and for a more general overview I’d recommend Cordelia Fine’s excellent book “A Mind of Its Own: How your brain distorts and deceives”.

Somewhat less academic but still very interesting was the case of a reality TV show called “Space Cadets” where a Channel Four production company managed to convince a group of contestants that they were being sent into space and put into orbit around the earth. In fact they were sitting in a Hollywood space shuttle simulator at a disused military airbase in Suffolk.

The programme-makers had set out explicitly to recreate a psychological phenomenon known as “groupthink”, where members of a close-knit social group become so focussed on a particular group belief or objective that they lose their ability to think critically. But what hadn’t been predicted was the effect that the experience would have on the actors who were in on the hoax, and whose job it was to pretend to be ordinary members of the group.

“My poor brain is a scramble of half-truths, astronomical lies and unbridled lunacy”, wrote Charlie Skelton, one of the moles, in the Guardian shortly after the hoax was finally revealed.

“I’ve just scribbled a list of what I know for sure: I’ve been a mole on a fake reality show called Space Cadets; I have a Russian doll in my hand luggage; I’ve just spent the past five days in a flight simulator in a hangar on the Suffolk coast; and – last but by no means least – I’ve just spent the past five days in space.

My default brain position aboard Earth Orbiter One was that we were 200 kilometres up, travelling at about seven kilometres per second. Too many things were telling me that for me to think otherwise.”

The psychological manipulation had been so strong that even though Skelton knew, rationally, that the whole thing was a hoax, he found himself believing it anyway.

The third case study I looked at was the notorious Stanford Prison Experiment, which took place in the early 1970s. Researcher Philip Zimbardo constructed a model prison in the basement of the Stanford Psychology Department, and got a group of students to play the roles of prison guards and prisoners. Within days, a significant proportion of the guards, who just days previously had been seemingly normal college students, had been transformed into brutal sadists who relished the power they’d been given and the opportunities for abuse that came with it. In the end, the experiment had to be terminated early. Full details about the experiment and its wider implications can be found at Zimbardo’s excellent Stanford Prison Experiment website.

Interestingly, when the soldiers implicated in the horrific Abu Ghraib prison abuse scandal were put on trial, Philip Zimbardo was one of the key witnesses for the defence, arguing that the “situational pressures” on the guards, stemming from the way the prison had been mismanaged, made such abuses entirely predictable.

In “Don’t Get Fooled Again” I argue that human beings are rather more vulnerable to delusion and manipulation than we are usually prepared to admit – but that confronting these vulnerabilities, and doing our best to understand them, is crucial in reducing our risk of being fooled in future.

Don’t Get Fooled Again reviewed by Tom Cunliffe


From A Common Reader

Scepticism about media, politics and finances comes naturally to most of us these days, particularly when people who should know better have brought the world to a state of economic crisis (did our rulers really not know that unfettered greed is no basis for an economic world-order?). It is refreshing to read a book like Don’t Get Fooled Again, which takes our vague feeling that “things aren’t quite right” and shows us that gut instincts are often quite correct, and we really shouldn’t believe the utterances of any institution or public figure without first submitting them to some pretty stringent tests.

Richard Wilson puts forward a good case for scepticism, reminding his readers that humanity has a long history of “meekly engaging in depraved acts of inhumanity on the basis of ideas that turned out to be total gibberish”.

Much of his book focuses on the public relations industry, citing a number of case studies to show how opinion can be manipulated. He devotes a whole chapter to the way tobacco companies in the 1950s manipulated news organisations to question the increasingly obvious link between smoking and lung cancer. The strategy consisted of getting an influential academic on-side (geneticist Clarence Cook Little in this case), and using him to question every scrap of evidence which research scientists gathered supporting the need for anti-smoking legislation.

Little insisted that it was not enough to show that lung cancer victims were smokers: until the cause of the link could be demonstrated under laboratory conditions, the link was irrelevant. Tests showing that mice contracted cancer when exposed to cigarette smoke were contested, while animal tests favourable to the tobacco industry were heavily publicised. Wilson shows that the genius of the PR campaign lay in capitalising on the media’s love of “debate”.

A story really takes off when two sides are seen in opposition, even when it is obvious that the alleged “controversy” is falsely based. This can be observed every day on programmes like BBC Radio 4’s Today programme, when even the most blindingly obvious truth has to be contested by a protagonist with opposing views, with the result that equal weight is given to both nonsense and fact. One million people walked the streets of London to protest against the US/GB invasion of Iraq, but this had no effect on those who wanted, for a variety of reasons, to believe the fantastic reports about Iraq’s offensive capability.

Wilson warns of the dangers of pseudo-science, and its ability to influence government and other decision-makers. He traces this back to Trofim Lysenko, Stalin’s favourite scientist, whose wrong-headed ideas about agronomy led to mass starvation throughout the Soviet Union. Even worse, Lysenko’s ideas were taken up by Chairman Mao and his followers, whose Lysenko-inspired agrarian reforms led to the worst man-made famine in history, with the loss of 30 million lives.

The chapter on “groupthink” describes the way in which a closed group of people can adopt a false belief and then reinforce one another in perpetuating it, despite mounting evidence of its falsity. I found myself thinking again of the decision to invade Iraq taken by Tony Blair’s cabinet when I read Richard Wilson’s list of symptoms of groupthink:

  1. Invulnerability – everything is going to work out right because we are a special group
  2. Rationalisation – explaining away warnings that challenge the group’s assumptions
  3. Unquestioning belief in the morality of the group and ignoring moral consequences of the group’s decisions
  4. Stereotyping those who oppose the group’s view as weak, evil, impotent or stupid
  5. Direct pressure being placed on any member who questions the group, couched in terms of “disloyalty”
  6. Self-censorship of ideas that stray from the consensus
  7. The illusion of unanimity among group members with silence being viewed as agreement.

I have worked on many large I.T. projects and have seen these processes at work when projects have begun to fail and careers and reputations are at risk. Project teams easily convince themselves of the need to plough on despite all warning signals to the contrary, until finally the project is abandoned far too late for anyone to recover any benefit from it.

Wilson goes on to consider the HIV/AIDS denial movement, which began in America and went on to influence the thinking of the South African government, where “AIDS dissidents” have had a malign effect on public policy, leading to the denial of effective treatment for many. President Thabo Mbeki immersed himself in AIDS denial literature and invited American AIDS dissidents to join a presidential advisory panel on AIDS and HIV, one of whose aims was to investigate “whether there’s this thing called AIDS . . . whether HIV leads to AIDS, whether there’s something called HIV”. By 2005, more than 5.5 million South Africans were infected with HIV and 1,000 were dying each day from AIDS.

In his concluding chapter, Richard Wilson lists the common threads which run through false and illusory belief systems: fundamentalism, relativism, conspiracy theories, pseudo-scholarship, pseudo-news, wishful thinking, over-idealisation, demonisation of perceived enemies, groupthink. While many of the ideas in this book are nothing new in themselves, Wilson has gathered them together, with many fascinating examples from recent history, to provide a very useful handbook for people who know that things they read in the paper or hear on the television are “not quite right” and need to be challenged.

I was pleased to find that Richard Wilson has a blog, Don’t Get Fooled Again, in which he reports on many of the topics covered in his book.