To point out that experts are wrong, however, is to misunderstand their purpose. Their function is not to provide knowledge, still less clear thinking, but to provide certainty. People hate dissonance, doubt and uncertainty; experts help dispel these. So Paul Britton’s function was to tell the police that they had the right man, whilst economic forecasters’ job is to give the impression that the future is knowable; no-one wants to hear about standard errors, parameter uncertainty or the Lucas critique.
What’s so pernicious here, though, is that people have ways of achieving an illusory certainty anyway. As Sir Harry Ognall - the judge who acquitted Stagg - says: “The police closed their minds to any other possibility than that of his guilt.”
There are several ways in which the police came by these closed minds. All have analogues in corporate planning and financial trading.
1. The confirmation bias. Having acquired the belief that Stagg was guilty - he fitted the profile, was on the scene and a bit of a weirdo - subsequent evidence was interpreted as corroborating this. So, the fact that he looked shifty in interview was seen as evidence of guilt, not as the sign of an innocent man nervous of being fitted up.
A similar thing happens in economic forecasting. If you thought last week that we’re heading for a very deep recession, you put great weight on Wednesday’s jobless numbers and find ways of dismissing yesterday’s retail sales figures. If you thought the recession would be mild, you do the opposite.
2. The halo effect. In their book, Mistakes Were Made, Carol Tavris and Elliot Aronson describe how policemen believe “I couldn’t have been wrong because I’m a good guy.” Even if we grant the premise, the error here lies in believing that good qualities - moral rectitude and cognitive skills - must be correlated. They are not. Of course, coppers are not unique in thinking this.
3. Groupthink. If our colleagues agree with us, our confidence in our judgment rises, especially if we like them.
This error arises in part because we fail to see that correlated data points add little to certainty. If our colleagues have the same training and evidence as us, and are also prone to groupthink, their beliefs will be correlated with ours, and so will not be new evidence - no more than a second copy of the Daily Mail corroborates the stories in the first. But we interpret them as if they were.
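The point about correlated opinions can be made precise with a standard bit of statistics (not in the original, just a toy sketch): the variance of the average of n estimates, each with variance sigma² and pairwise correlation rho, is sigma² × ((1 − rho)/n + rho). When rho = 0, more opinions steadily shrink our uncertainty; when rho = 1, a thousand like-minded colleagues tell us no more than one.

```python
def variance_of_average(n, sigma2=1.0, rho=0.0):
    """Variance of the mean of n equally correlated estimates,
    each with variance sigma2 and pairwise correlation rho."""
    return sigma2 * ((1 - rho) / n + rho)

# Independent colleagues: ten opinions cut the variance tenfold.
print(variance_of_average(10, rho=0.0))   # 0.1
# Perfectly correlated colleagues: ten opinions are worth one.
print(variance_of_average(10, rho=1.0))   # 1.0
# Even modest correlation puts a floor under our uncertainty:
# with rho = 0.5, no number of colleagues gets us below 0.5.
print(variance_of_average(1000, rho=0.5))  # just above 0.5
```

The floor at rho × sigma² is the groupthink trap in miniature: shared training and shared evidence mean the consensus can be confidently, collectively wrong.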
4. Ego-involvement. Admitting that we are wrong means more than just fessing up to narrow technical error. We interpret it as a blow to our ego - a sign that we are not the infallible, uber-competent professionals we think. We’ll do anything to squirm out of facing this. Hence the failure of the police, until yesterday, to apologize to Colin Stagg, and the failure of many bank bosses, Tom McKillop excepted, to apologize for their errors.
And herein lies the purpose of experts. It’s to reinforce these mechanisms, to help people avoid the uncomfortable facts that the world is uncertain, that mistakes are inevitable, and that we are not as in control of things as we think.
Blaming experts for being wrong is like complaining that the economy is not yellow. It’s a category error so howling as to be nonsensical.