Rationalism and rationality, wrote Deirdre McCloskey, “are not often found in the same company.”* Robert Wright’s piece on the myth of perfectly rational thought reminded me of that saying. People who claim to be rationalists are often not wholly rational. As Robert says:
Cognitive biases are so pervasive and subtle that it’s hubristic to ever claim we’ve escaped them entirely.
Such hubris is, of course, itself that most ubiquitous of biases – overconfidence.
I’d add one thing to his examples: ego-involvement. The moment we’ve expressed a view, we feel we own it and have to defend it: this is a form of the endowment effect. We thus look for confirmatory evidence and downplay dissonant evidence. The more public your writings, or the more tied up your ideas are with your sense of identity, the greater these dangers will be**. A recent tweet by Jo Wolff captured a too-rare reaction to this danger:
Just remembered a pre-Dropbox story of colleague who had laptop and hard drives stolen, containing the only copies of five book manuscripts that had been in progress for decades. Claimed it was a great relief - now freed up to think about new things.
What’s more, like Michael Oakeshott, I’m not sure that rationalism (at least in some senses) is appropriate for human affairs. We cannot treat these as mere operations of logic, in part because there are too many plausible premises to start from. Niels Bohr’s saying fits the social sciences well: the opposite of a great truth is another great truth. I find that the appropriate response to very many claims is “yes, but.” The crooked timber of humanity does not fit logical schemas.
Instead, our thinking should be a movement – which for me is incomplete – towards what Rawls called reflective equilibrium. We should toggle between general principles and specific judgments and empirical facts until they are more or less consonant.
What matters, I suspect, is not so much that one be rational as reasonable. There are at least three things one can do here:
- Check for one’s own cognitive biases. In particular, be awake to the confirmation bias, groupthink and overconfidence. It’s far easier to spot others’ biases than one’s own. At least try to lean against that tendency.
- Look for missing evidence and – better still – uncomfortable truths. The latter are not things with which to taunt opponents, but facts that make you yourself uncomfortable. For me, these include: the fact that there’s little popular demand for worker democracy; the possibility that a high citizens’ income would lead to a drop in labour supply or that it is ill-suited for the disabled or those with high housing costs; or the possibility that open borders would indeed lead to overcrowding. And so on.
- Construct your own counter-arguments. Our opponents don’t usually help us improve our thoughts. Too often they merely preach to their own side: who’s persuaded by talk of gammons or remoaners? Or they build straw men: the left want to turn us into Venezuela and the right want to destroy the NHS. Or they are just too obscure: Jordan Peterson for example seems to me to be emulating the worst of the post-modernist left. To avoid this, try to build your own arguments against your own beliefs. I’ve tried this here and here, for example.
Of course, I wouldn’t pretend for a moment that I always follow this advice. Although I might in moments of high arrogance consider myself intelligent or well-read or educated, I’d never describe myself as rational.
Here, though, are two problems.
One is that there are pitifully few role models for this sort of thinking. The confident assertions of newspaper columnists are pretty much the antithesis of what I have in mind – just as Brian Cox’s “isn’t science great?” schtick is a lousy advert for the scientific method.
Secondly, let’s suppose – contrary to the vast bulk of evidence – that somebody were wholly reasonable and free of cognitive biases. Such a person would, I fear, never be invited onto the BBC’s political discussion shows. And their political influence would probably be minimal in the face of the overconfident untruths of their more passionate and less reasonable opponents. In the marketplace of ideas there is adverse selection. There is a sharp trade-off between intellectual virtues and effectiveness.
* Knowledge and Persuasion in Economics, p. 323 in my copy.
** One reason I like writing for the IC is that the views I express there are distant from my identity: my idea of who I am doesn’t depend much upon my view of whether stock markets are efficient or not. Even then, however, I’m not immune from ego-involvement.
Wright's piece, like a couple of his recent Bloggingheads shows on the IDW, is haunted by the irony of its own tendentiousness. There are things he needs to believe are not true, and you can read and hear the strain in his words and voice.
This post of yours, however, is excellent.
Posted by: Handy Mike | May 23, 2018 at 09:49 PM
This post is very idealistic, and there is my usual point: material interests may matter rather more than cognitive biases in explaining why people make some arguments. Cognitive biases are indeed important, but self-interest (even when misperceived) is also quite important.
Posted by: Blissex | May 23, 2018 at 10:13 PM
@blissex
As the truism says, "It's difficult to make a man understand something that his salary depends on him not understanding"
But how do you differentiate between A. people pretending not to understand things and B. people actually not understanding them due to cognitive biases?
Posted by: D | May 24, 2018 at 08:53 AM
«how do you differentiate between A. people pretending not to understand things and B. people actually not understanding them due to cognitive biases?»
Depends on whether one is in a court of law or in Parliament, or not. If not, all that is needed is a judgement call based on the context, and an acceptable error rate.
Posted by: Blissex | May 24, 2018 at 07:39 PM
Re overconfidence, there is a big incentive to be that way. As Eric Hoffer wrote in the preface to The True Believer: "The book passes no judgments, and expresses no preferences. It merely tries to explain; and the explanations - all of them theories - are in the nature of suggestions and arguments even when they are stated in what seems a categorical tone".
He went on to quote Montaigne:
"All I say is by way of discourse, and nothing by way of advice. I should not speak so boldly if it were my due to be believed."
It would be nice to think that much modern overconfidence is motivated by such reasoning. Nice, but naive.
Posted by: GeorgeCostanzaIrl | May 24, 2018 at 10:50 PM
"Lord, enlighten thou our enemies. Sharpen their wits, give acuteness to their perceptions, and consecutiveness and clearness to their reasoning powers: we are in danger from their folly, not from their wisdom."
- JS Mill "Essay on Coleridge"
It seems to me Chris is advocating we all try to be our own worst enemy, in the Millian sense. A noble aspiration, perhaps, but one doomed to fail. The dialectic with your enemies works better in the long run.
Posted by: derrida derider | May 25, 2018 at 02:42 AM