Have I been wrong for years? What I mean is that I have often attributed ideas I believe to be mistaken to cognitive biases. But as Dan Kahan points out, this might not be the case. Thinking, he says, is not always a means of reaching accurate perceptions of the world. It can also be a way of protecting our status and (self) image - of expressing who we are.
Take, for example, a classic experiment (pdf) by Lord, Ross and Lepper in the late 70s. They showed a group of people with strong views for and against capital punishment some empirical studies of the issue. They found that this led to greater polarization. This is inconsistent with conventional Bayesian reasoning. But it is entirely consistent with the theory that people form beliefs to maintain a coherent and consistent self-image. As Sherman and Cohen say (pdf):
People defensively distort, deny, and misrepresent reality in a manner that protects self-integrity.
Professor Kahan says this is not necessarily irrational. For most people, there is no cost to having erroneous beliefs about political issues, simply because their individual vote doesn't affect the outcome of elections. But there is a cost to being a "good" Bayesian: you'll feel uncomfortable if you come to doubt your long-held beliefs, and you might even lose friends if you don't fit in. It can, therefore, be rational to hold views which are mistaken from a Bayesian perspective*.
One fact is consistent with this. Professor Kahan points out that people with greater cognitive skills tend to be more partisan than others. They have the skill to be what Glaeser and Sunstein call asymmetric Bayesians: they are capable of discrediting evidence against their beliefs whilst seeing the scientific merits of corroborative evidence.
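To see the mechanism, here is a minimal sketch (my own toy numbers and `discount` parameter, not anything from Kahan or from Glaeser and Sunstein): a fair Bayesian reading perfectly mixed evidence ends up where she started, whereas an asymmetric Bayesian who shrinks uncongenial evidence towards irrelevance becomes more extreme - polarization from identical evidence, just as Lord, Ross and Lepper found.

```python
# Illustrative sketch only: two readers update beliefs about a
# proposition after a perfectly mixed batch of studies. A fair
# Bayesian weighs pro and con studies equally; an "asymmetric
# Bayesian" shrinks the force of studies that cut against her
# current belief (discount < 1 pulls hostile likelihood ratios
# towards 1, i.e. towards irrelevance).

def update(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * LR."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

def read_studies(prior, studies, discount=1.0):
    """Update on each study's likelihood ratio in turn, discounting
    any study that points away from the reader's current belief."""
    belief = prior
    for lr in studies:
        hostile = (lr > 1) != (belief > 0.5)   # cuts against current view?
        belief = update(belief, lr ** discount if hostile else lr)
    return belief

studies = [2.0, 0.5, 2.0, 0.5]   # perfectly mixed: LRs of 2 for, 2 against

for label, discount in [("fair", 1.0), ("asymmetric", 0.3)]:
    pro = read_studies(0.8, studies, discount)    # starts pro
    anti = read_studies(0.2, studies, discount)   # starts anti
    print(f"{label}: pro-believer {pro:.2f}, anti-believer {anti:.2f}")
```

Run it and the fair readers finish exactly where they began (0.80 and 0.20), while the asymmetric readers drift apart to roughly 0.91 and 0.09 - more polarized after seeing the very same evidence.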
Perhaps, then, I have often been guilty of a category error. Beliefs that I have attributed to irrational biases might instead be the product of a different form of rationality - expressive rationality.
One should ask of any theory in the social sciences: how widely does it apply? Many things are true of some people, but not of all. For example, I don't think expressive rationality is very relevant in my day job. When retail investors make losing investments, it is often because of cold cognitive errors rather than expressive rationality**. There are, though, domains where it might be applicable:
- Stephan Lewandowsky says climate change denialism is motivated by “identity-protective cognition” - a desire of free-marketeers to reject interventions in the market.
- "Very Serious People" believe things which uphold their self-image as sober and serious people, regardless of their empirical validity. This, I suspect, is why professional investors prefer futurological judgments to the much more empirically grounded "sell on May Day, buy on Halloween rule".
- "No platforming" and "safe spaces" are absurd from a Bayesian point of view; they entail being wilfully blind to dissonant ideas. But they make sense from the point of view of a desire to maintain a coherent self-image. When rationalists such as Nick Cohen and Richard Dawkins criticise such policies, they might be committing the same sort of category error of which I might have often been guilty.
- Professional trolls such as Toby Young and Katie Hopkins hold beliefs which are wrong by our standards. But they are rational from their point of view, because they have based a career upon them. Newspaper and TV editors don't want columnists and guests who say "we don't know: the evidence is missing and ambiguous and the world's a complex place", even though this is often true.
You'll notice that these examples come from across the political spectrum. Insofar as expressive rationality has eclipsed Bayesian rationality, it is not a partisan phenomenon.
* Of course, if everyone is expressively rational there's a massive cost in terms of bad policy and terrible political discourse. This is another example of how individual rationality can conflict with collective well-being.
** Herding is an ambiguous case. People can jump onto bandwagons and buy at the peak of bubbles because of cognitive error - a mistaken belief in the heuristic of social proof - or because of a desire to be part of a group, to do what others are doing.
Well, obviously.
The perfect example of this is religious belief, especially Fundamentalism.
What is more likely - that someone who finds evolution difficult to believe adopts Creationism, and as a result accepts misogynistic and homophobic beliefs (even if they don't want to) - or that people inclined towards misogyny and homophobia find Creationism a highly convenient way of validating their beliefs in a way that cannot be argued with (because religion)?
This is a major problem for New Atheists - an inability to identify as anything other than irrationality what is really expressive rationality.
Posted by: Rodney | October 28, 2015 at 02:32 PM
I was thinking something along these lines when I saw people pointing to Corbyn's A-levels and asking if he's smart enough. Smart is great if you happen to be smart and right, but there's plenty of smart-and-wrong about, and these people could be worse than the not-so-smart and more malleable.
Also, I am not really seeing the distinction between a cognitive bias and habits of thought aimed at protecting our status and (self) image.
Posted by: Luis Enrique | October 28, 2015 at 02:49 PM
@Luis, a cognitive bias is an error of judgement; expressive rationality is a matter of affinity. This (I think) is what Chris means by a category error.
Posted by: Dave Timoney | October 28, 2015 at 03:14 PM
Sounds like confirmation bias to me, just adding the idea that confirmation bias is self-serving.
Posted by: Luis Enrique | October 28, 2015 at 03:42 PM
“Professor Kahan says this is not necessarily irrational”
Irrationality explicable as non-consciously motivated irrationality is still irrationality in my book.
Anyway, although I expect the “protecting self-integrity” explanation is often the correct one, as Jaynes has pointed out¹, such phenomena aren't necessarily inconsistent with (valid) Bayesian reasoning.
¹ http://www-biba.inrialpes.fr/Jaynes/cc05e.pdf
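A toy illustration of that Jaynes point (my numbers, not his): two exact Bayesians hear the same source assert a claim S, but differ over whether the source is truthful or a liar. The same assertion then rationally moves them in opposite directions:

```python
# Toy numbers, purely illustrative: a "truthful" source asserts S
# iff S is true; a "liar" asserts S iff S is false. What the same
# assertion does to your belief in S depends on your prior trust.

def posterior_S(prior_S, trust):
    """P(S | source asserts S), where trust = P(source is truthful)."""
    p_assert_if_S = trust          # only a truthful source asserts a true S
    p_assert_if_not_S = 1 - trust  # only a liar asserts a false S
    num = prior_S * p_assert_if_S
    return num / (num + (1 - prior_S) * p_assert_if_not_S)

for name, trust in [("trusting reader", 0.9), ("distrustful reader", 0.2)]:
    print(f"{name}: belief in S moves 0.50 -> {posterior_S(0.5, trust):.2f}")
# trusting reader: 0.50 -> 0.90 (the assertion persuades)
# distrustful reader: 0.50 -> 0.20 (the same assertion backfires)
```

No irrationality needed: divergence falls straight out of Bayes' theorem once priors about the source differ.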
Posted by: phayes | October 28, 2015 at 04:06 PM
It is difficult to advocate listening to feelings or understanding emotion in a society that prizes the rational man above all other things. The problem with over-rational thinking is that it fosters blinkered thoughts.
In order to think, you have to be prepared to let in your feelings. After all, even Descartes stated "I doubt, I think, therefore I am" (interesting that the line is often misquoted, in line with our "rational thinking is the only way of thinking" project).
Posted by: TowerBridge | October 28, 2015 at 05:08 PM
You're missing a whopper - religion. A person's religious beliefs are hard to criticize because they are so wrapped up in a person's sense of identity and purpose.
Of course this is not the whole deal with conflicts between religion and empirical validity - by its very nature it prizes revelation and faith over proof - but it's a big part.
Posted by: Steven Clarke | October 28, 2015 at 11:01 PM
Ideologues frequently assume that anybody who does not agree with them is an ideologue on the Other Side. If an ideologue on the Other Side shows you an inconclusive study, it's plausible that such a study was the best he could do and that, in turn, implies that the Other Side has no conclusive studies.
Posted by: Joseph Hertzlinger | October 29, 2015 at 08:37 PM