Richard Wilson's blog


Archive for the ‘Bogus experts’ Category

Utter humiliation: Old media meets new media, old media wins…


Utter humiliation: Bloggers beware…

See also: “Your Freedom” – Hit, Miss or Maybe?, and Memo to John O’Farrell

I was on BBC Newsnight on Thursday, talking about this and this, though you probably would not have guessed it from the way the issue was presented.

The programme ‘hook’ was the launch of a new UK government website, “Your Freedom”, through which members of the public are invited to nominate laws that need to be scrapped in order to reverse the erosion of civil liberties that took place under the last government. The discussion was broadly in line with most BBC coverage of anything to do with “people from the internet”, and I am ashamed and embarrassed to have been associated with it. Given the BBC’s track record, I really should have seen this coming…

I’d been contacted by a Newsnight researcher who’d seen my blog post suggesting we bin the vaguely-defined crime of “aiding and abetting misconduct in public office”, which has been used in some disturbing court cases against people who receive information from government whistleblowers.

The format was as follows: five people who’d each submitted an idea to “Your Freedom” got to speak for about 25 seconds about the law they’d nominated, and why they felt it should be scrapped.

Then there was a much longer discussion between the presenter and two ‘experts’ – a guy named Andy Williamson from something called the Hansard Society, and a writer called John O’Farrell. Although Newsnight chose not to disclose this, O’Farrell had been actively involved with the government which did so much to attack civil liberties, and whose authoritarian laws the public is now being invited to review.

The last thing I did for TV about the online media (a piece for Al Jazeera on Trafigura/Carter-Ruck) had been quite a positive experience, which was partly why I agreed to do this one. The Al Jazeera feature is still online here, and it makes for an interesting comparison with the BBC’s loud-and-proud “old media” approach.

O’Farrell and the Hansard guy (along with the presenter Gavin Esler) seemed to be competing with each other to flaunt their contempt for the public in general, and the online world in particular. According to O’Farrell, the five ‘bad law’ examples that had just been featured were nothing more than “single issue obsessions”. According to Williamson, most of the ideas the public had submitted to the new website were “utterly stupid” or from “single issue fanatics”. All the while, the five token faces of Joe Public adorned the wall behind them.

The overarching message seemed to be that the very idea of asking ordinary people to participate in an online policy discussion was completely absurd, and we really ought to leave the thinktankery to people like Andy Williamson and John O’Farrell.

The Hansard Society modestly purports to be “universally recognised as the independent and non-partisan authority on Parliament and democracy”. I beg to differ (and these guys don’t seem too impressed either).

The Newsnight discussion might at least have been enlightening if O’Farrell had been bumped, and the guy from Hansard put up against someone who actually understood the relationship between democracy and the web, such as the people behind the excellent website www.theyworkforyou.com.

As things stood, it was like watching a gruesome TV re-enactment of Radio 4’s “The Moral Maze”, with my own face superimposed in the background the whole time. Utter humiliation…

*UPDATE* – The Hansard Society’s corporate donor list makes for interesting reading. Turns out that this “authority on Parliament and democracy” is bankrolled by MBDA, a missile manufacturer part-owned by the arms company BAE Systems, whose relationship with British democracy is, let’s say, somewhat questionable…

Other donors include the Rio Tinto mining group, who are notorious for their alleged complicity in serious human rights abuses around the world, the lobbying firm Ellwood and Atfield, and, perhaps inevitably, BP. Yet again, the BBC gifts air-time to a corporate front-group without any disclosure of its affiliations…

“He never has to know the actual facts of any issue; instead he’s equipped himself with a persuasive ploy which enables him to make non-experts believe he knows more than experts.”


Here’s Plato’s take on experts, evidence, and evidence of expertise. These words were first written more than 2,000 years ago – it is both intriguing and a little depressing that they still have so much currency today.

The text below is from a dialogue between Socrates and Gorgias, a well-known ‘sophist’ who made his living from teaching the art of persuasion – aka “rhetoric”. The word ‘sophistry’ is today synonymous with arguments that are superficially plausible, yet nonetheless bogus…

From Plato’s Gorgias

Socrates: …You claim to be able to train up as a rhetorician anyone who’s prepared to listen to your teaching on the subject. Yes?

Gorgias: Yes.

Socrates: And you’ll teach him all he needs to know to persuade a crowd of people – not to make them understand, but to win them over. Is that right?

Gorgias: Yes.

Socrates: Now you claimed a little while back that a rhetorician would be more persuasive than a doctor even when the issue was health.

Gorgias: Yes I did, as long as he’s speaking in front of a crowd.

Socrates: By ‘in front of a crowd’ you mean ‘in front of non-experts’, don’t you? I mean, a rhetorician wouldn’t be more persuasive than a doctor in front of an audience of experts, of course.

Gorgias: True.

Socrates: Now, if he’s more persuasive than a doctor, then he’s more persuasive than an expert, isn’t he?

Gorgias: Yes.

Socrates: When he isn’t actually a doctor himself. Yes?

Gorgias: Yes.

Socrates: And a person who isn’t a doctor is ignorant, of course, about the things which a doctor knows.

Gorgias: Obviously.

Socrates: So any case of a rhetorician being more persuasive than a doctor is a case of a non-expert being more persuasive than an expert in front of an audience of non-experts. Isn’t that what we have to conclude?

Gorgias: Yes, in this instance, anyway.

Socrates: But isn’t a practitioner of rhetoric in the same situation whatever the area of expertise? He never has to know the actual facts of any issue; instead he’s equipped himself with a persuasive ploy which enables him to make non-experts believe he knows more than experts.

Gorgias: Doesn’t that simplify things, Socrates? Rhetoric is the only area of expertise you need to learn. You can ignore all the rest and still get the better of the professionals!

Written by Richard Wilson

February 7, 2010 at 2:19 pm

Review: “Mistakes Were Made (But Not By Me)” by Carol Tavris and Elliot Aronson


I’m just back from a very relaxing break in which I re-read the absolutely outstanding book “Mistakes Were Made (But Not By Me)” by psychologists Carol Tavris and Elliot Aronson. It has the rare merit of being accessible, well-written, solidly backed-up and jaw-droppingly interesting, all at the same time.

Elliot Aronson was one of the leading pioneers of cognitive dissonance theory, which holds that a prime motivation for human beings in forming new ideas is to justify and defend our pre-existing beliefs, even where this means torturing logic and evidence to breaking point. More dangerously, this very often entails justifying at all costs the decisions we have previously taken, more or less regardless of whether those decisions were right.

Dissonance theory predicts that, in general, the more serious and irrevocable a particular decision is, the harder we will work to defend it – and (more dangerously still) to dismiss any evidence that the decision may have been a really bad one. Most dangerously of all, our need to defend a bad decision made on the basis of a flawed principle may lead us into making yet more bad decisions in future: the cost of giving up our flawed principle, and choosing a different path next time, would be to admit that we were wrong and accept responsibility for the damage done by our initial mistake. For most human beings this is an incredibly difficult thing to do. So instead we stick to our guns, and compound our flawed decision with further flawed decisions, regardless of all evidence.

There’s some particularly compelling stuff in the book about the damage done by Freudian pseudo-psychology in the 1980s and 90s, when dozens of people were wrongly imprisoned on the say-so of psychologists who believed in the now almost wholly discredited notion that child abuse victims would routinely repress and forget traumatic memories, which could later be recovered by hypnosis. In fact, according to Tavris and Aronson there is no reliable evidence that traumatic memories are repressed in this way – if anything, traumatic memories tend to be so clear and intense that the victim is regularly and painfully reminded of them even when they’d rather not be. Worse still, there is strong evidence that hypnosis and the use of leading questions can lead to the creation of ‘memories’ that, whilst compelling and ‘true’ to the person experiencing them, are wholly false. The authors recount a litany of cases where parents and teachers were imprisoned solely on the basis of such ‘recovered’ memories, only later to be exonerated after evidence emerged of just how flawed and unscientific the methods of ‘recovered memory’ specialists actually were.

I found this account especially striking for two reasons: Firstly, I was not aware of just how weak the evidential basis is for so many of the Freudian concepts – like ‘repression’ – which still hold such sway within our culture. Secondly, and more tragically, Tavris and Aronson show how, just as dissonance theory would predict, many of the leading lights of the ‘recovered memory’ movement – including some of those who were later found guilty of professional misconduct for their flawed testimony in court cases – continue to insist that they were right all along. To do otherwise would be to admit to themselves that their incompetence helped imprison dozens of innocent people, and tear apart countless families.

One of the most disconcerting implications of the book is, of course, that dissonance theory applies to all of us, and nobody is immune – the authors give a couple of examples from their own lives to drive this point home. But the good news is that by becoming more aware of our enormous drive towards self-justification, we can – at least on occasion – catch ourselves before we’ve gone too far, and try to steer ourselves back to reality.

A key point that emerged for me – and this is something that I’ve really seen a lot of since I began researching “Don’t Get Fooled Again” – is the extent to which so many of the most dangerous delusions seem to revolve around an over-attachment to our own egos. The quacks and cranks who desperately need to convince the world that they possess a special, earth-shattering insight that the scientific community has somehow missed – or is conspiring to suppress. The corporate executives who have persuaded each other that their cleverly-rebranded variation on the age-old “Ponzi scheme” is in fact a revolutionary “new paradigm” to which the basic laws of economics simply do not apply. The conspiracy theorists who believe that they uniquely can see through the ‘official story’ that everyone else is too stupid or cowardly to question. The antidote, it seems to me, may have something to do with re-emphasising one of the key elements of scepticism (albeit one that often seems to get forgotten) – intellectual humility:

Having a consciousness of the limits of one’s knowledge, including a sensitivity to circumstances in which one’s native egocentrism is likely to function self-deceptively; sensitivity to bias, prejudice and limitations of one’s viewpoint. Intellectual humility depends on recognizing that one should not claim more than one actually knows. It does not imply spinelessness or submissiveness. It implies the lack of intellectual pretentiousness, boastfulness, or conceit, combined with insight into the logical foundations, or lack of such foundations, of one’s beliefs.

This, of course, is easier said than done…

Written by Richard Wilson

July 13, 2009 at 6:23 pm

Pocket journalism: Telegraph hack Con Coughlin goes into bat for torture


In the early days of the Iraq war, Telegraph columnist Con Coughlin was famously obliging in disseminating the bogus claims of the US and UK governments about Weapons of Mass Destruction and a supposed link between Iraq and Al Qaeda.

When the focus of US “public diplomacy” shifted towards the clamour for military action against Iran, Coughlin was equally helpful in promoting unsubstantiated claims about a link between Al Qaeda and the Iranian government.

Amid growing evidence that many of the false (yet politically useful) intelligence claims used to justify the Iraq war came from confessions extracted through torture, one might think that Coughlin, and the Telegraph, would now treat the assertions of the security services with a little more scepticism.

Instead, Coughlin seems to have gone the other way, cautioning Barack Obama not to “pick a fight with Dick Cheney”, asserting, without offering any evidence, that “We know that at least two major terrorist attacks against the UK were avoided thanks to vital intelligence provided to MI6 and MI5 by the CIA”, and suggesting that “There are always two sides to a story”.

“Are interrogation methods like waterboarding justified if they save lives”, Coughlin asks, “or should we respect the detainees’ human rights, thereby enabling the terror attacks to take place and claim innocent lives? I know which option I’d go for.”

Daily Mail gets fooled again by Booker’s quack-journalism


Another corking piece of journo-quackery from Christopher Booker, this time in the Daily Mail. All the usual elements are there, including Booker’s oft-repeated claims about Creutzfeldt-Jakob Disease not being linked to BSE, and about a supposed scientific “confusion” about the health risks of asbestos “costing literally hundreds of billions of pounds”. Sam Wong on “Just a Theory” does an excellent debunking of the rest.

Written by Richard Wilson

May 2, 2009 at 11:16 am

Skeptics in the Pub – evidence-based-policy-making versus policy-based-evidence-making


Monday’s book talk at Skeptics in the Pub certainly wasn’t my best, though things warmed up a bit with the Q&A discussion at the end.

My main focus was on the value of scepticism in, and about, politics – and I put forward three key examples to try to illustrate this: the case of the Soviet pseudo-scientist Trofim Lysenko, the UK government’s misleading statements about Iraq’s “WMD”, and the South African authorities’ embrace of “AIDS denialism” in the year 2000.

All three of these cases arguably involved costly government decisions being made on the basis of bad evidence that had not been properly scrutinised.

Lysenko’s theories about agriculture were far-fetched and unworkable, but they were ideologically agreeable to the Communist regime, and after he rose to prominence the totalitarian nature of the Soviet system made it very difficult for anyone to challenge his authority. When Lysenko’s ideas were implemented in China, they contributed to a famine that is believed to have claimed up to 30 million lives.

The evidence cited by the UK government in support of its view that Iraq possessed chemical weapons was famously “dodgy”. It’s widely believed that the Prime Minister at the time, Tony Blair, lied about the strength of that evidence, and about the views of his own experts (many of whom, it later transpired, had grave doubts about the claims being made), not only to the public at large and the UK’s Parliament, but also to many members of his own cabinet. One ex-minister, Clare Short, has suggested that Blair believed he was engaging in an “honourable deception” for the greater good. But whatever his motives, in lying to his own cabinet and Parliament, Blair was effectively shutting out of the decision-making process the very people whose job it is to scrutinise the evidence on which government policies are based. John Williams, one of the spin doctors involved in drawing up the famous “dodgy dossier” – which at the time the government insisted was the unvarnished view of the intelligence services – later admitted that “in hindsight we could have done with a heavy dose of scepticism” (though it should be said that some of his statements raise more questions than they answer).

In South Africa in the early part of this decade, President Thabo Mbeki chose to believe the unsubstantiated claims of fringe scientists and conspiracy theorists over those of established AIDS researchers – including members of South Africa’s own scientific community. Under the influence of denialists who insist that HIV is not the cause of AIDS, and that AIDS deaths are in fact caused by the lifesaving medicines given to people with HIV, Mbeki’s government chose to block the availability of anti-retroviral drugs in South Africa – even after the pharmaceutical companies had been shamed into slashing their prices and international donors were offering to fund the distribution. It was only after a series of court cases by the indefatigable Treatment Action Campaign that, in 2004, the authorities began to change their position. A recent study by Harvard University concluded that the deliberate obstruction of the roll-out of lifesaving drugs may have cost more than 300,000 lives.

The broad conclusion I think all of this points to is that the truth matters more in politics than ever before. Because of the power and influence that governments now hold, the consequences of a bad policy implemented on the basis of bad evidence can adversely affect millions.

In an ideal world governments would be engaging in evidence-based-policy-making: deciding policy on the basis of the best available evidence – rather than policy-based-evidence-making: cherry-picking or concocting evidence to support a decision that has already been made. But obviously this doesn’t always happen, and as a result wholly preventable mistakes continue to be made.

“Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”


From The American Psychological Association

People tend to hold overly favorable views of their abilities in many social and intellectual domains. The authors suggest that this overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it.

In 1995, McArthur Wheeler walked into two Pittsburgh banks and robbed them in broad daylight, with no visible attempt at disguise. He was arrested later that night, less than an hour after videotapes of him taken from surveillance cameras were broadcast on the 11 o’clock news. When police later showed him the surveillance tapes, Mr. Wheeler stared in incredulity. “But I wore the juice,” he mumbled. Apparently, Mr. Wheeler was under the impression that rubbing one’s face with lemon juice rendered it invisible to videotape cameras…

Written by Richard Wilson

February 18, 2009 at 12:34 am