Review: “Mistakes Were Made (But Not By Me)” by Carol Tavris and Elliot Aronson
I’m just back from a very relaxing break in which I re-read the absolutely outstanding book “Mistakes Were Made (But Not By Me)” by psychologists Carol Tavris and Elliot Aronson. It has the rare merit of being accessible, well-written, solidly backed up and jaw-droppingly interesting, all at the same time.
Elliot Aronson was one of the leading pioneers of cognitive dissonance theory, which holds that a prime motivation for human beings in forming new ideas is to justify and defend our pre-existing beliefs, even where this means torturing logic and evidence to breaking point. This, in turn, very often entails (more dangerously) justifying at all costs the decisions we have previously taken, more or less regardless of whether those decisions were right. Dissonance theory predicts that, in general, the more serious and irrevocable a particular decision is, the harder we will work to defend it – and (more dangerously still) to dismiss any evidence that the decision may have been a really bad one. Most dangerously of all, our need to defend a bad decision made on the basis of a flawed principle may lead us into making yet more bad decisions in future: The cost of giving up our flawed principle, and choosing a different path next time, would be to admit that we were wrong and accept responsibility for the damage done by our initial mistake. For most human beings this is an incredibly difficult thing to do. So instead we stick to our guns, and compound our flawed decision with further flawed decisions, regardless of all evidence.
There’s some particularly compelling stuff in the book about the damage done by Freudian pseudo-psychology in the 80s and 90s, when dozens of people were wrongly imprisoned on the say-so of psychologists who believed in the now almost wholly discredited notion that child abuse victims would routinely repress and forget traumatic memories, which could later be recovered by hypnosis. In fact, according to Tavris and Aronson there is no reliable evidence that traumatic memories are repressed in this way – if anything, traumatic memories tend to be so clear and intense that the victim is regularly and painfully reminded of them even when they’d rather not be. Worse still, there is strong evidence that hypnosis and the use of leading questions can lead to the creation of ‘memories’ that, whilst compelling and ‘true’ to the person experiencing them, are wholly false. The authors recount a litany of cases where parents and teachers were imprisoned solely on the basis of such ‘recovered’ memories, only later to be exonerated after evidence emerged of just how flawed and unscientific the methods of ‘recovered memory’ specialists actually were.
I found this account especially striking for two reasons: Firstly, I was not aware of just how weak the evidential basis is for so many of the Freudian concepts – like ‘repression’ – which still hold such sway within our culture. Secondly, and more tragically, Tavris and Aronson show how, just as dissonance theory would predict, many of the leading lights of the ‘recovered memory’ movement – including some of those who were later found guilty of professional misconduct for their flawed testimony in court cases – continue to insist that they were right all along. To do otherwise would be to admit to themselves that their incompetence helped imprison dozens of innocent people, and tear apart countless families.
One of the most disconcerting implications of the book is, of course, that dissonance theory applies to all of us, and nobody is immune – the authors give a couple of examples from their own lives to drive this point home. But the good news is that by becoming more aware of our enormous drive towards self-justification, we can – at least on occasion – catch ourselves before we’ve gone too far, and try to steer ourselves back to reality.
A key point that emerged for me – and this is something that I’ve really seen a lot of since I began researching “Don’t Get Fooled Again” – is the extent to which so many of the most dangerous delusions seem to revolve around an over-attachment to our own egos. The quacks and cranks who desperately need to convince the world that they possess a special, earth-shattering insight that the scientific community has somehow missed – or is conspiring to suppress. The corporate executives who have persuaded each other that their cleverly-rebranded variation on the age-old “Ponzi scheme” is in fact a revolutionary “new paradigm” to which the basic laws of economics simply do not apply. The conspiracy theorists who believe that they uniquely can see through the ‘official story’ that everyone else is too stupid or cowardly to question. The antidote, it seems to me, may have something to do with re-emphasising one of the key elements of scepticism (albeit one that often seems to get forgotten) – intellectual humility:
Having a consciousness of the limits of one’s knowledge, including a sensitivity to circumstances in which one’s native egocentrism is likely to function self-deceptively; sensitivity to bias, prejudice and limitations of one’s viewpoint. Intellectual humility depends on recognizing that one should not claim more than one actually knows. It does not imply spinelessness or submissiveness. It implies the lack of intellectual pretentiousness, boastfulness, or conceit, combined with insight into the logical foundations, or lack of such foundations, of one’s beliefs.
This, of course, is easier said than done…