One of the best-attested cognitive biases - which has effects in boardrooms (pdf) and stock markets - is overconfidence. A new paper (pdf) by Pietro Ortoleva and Erik Snowberg shows that it also has political effects.
They estimated the overconfidence of almost 3000 American voters by asking them what the latest unemployment and inflation rates were, and getting them to say how confident they were of these numbers. They then found that overconfidence on this measure was strongly correlated with ideological extremism, especially on the right. What's more, such overconfidence and extremism were both correlated with the propensity to vote.
In this sense, the shrill partisanship of American politics is rooted in a cognitive bias.
But here's the puzzle. There should in theory be a negative correlation between confidence and extremism; note the tagline of my blog. People should think: "most people don't hold my views; it's possible therefore that they know something I don't". But few people seem to think this way. Why?
One reason lies in the asymmetric Bayesianism proposed by Glaeser and Sunstein; we are more sceptical about information that disconfirms our priors than that which corroborates them.
Another reason, say Ortoleva and Snowberg, is that "it is very difficult to persuade overconfident citizens that their prior is incorrect as they will tend to attribute contradictory information to others' biases." (I might be guilty of this!)
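To see how this asymmetry can entrench a wrong belief, here's a toy simulation - my own sketch, not Ortoleva and Snowberg's model, and the prior, learning rate and discount factor are all illustrative assumptions. Two learners start from the same wrong prior and see the same kind of noisy signals about the truth; the one who discounts disconfirming signals never gets there:

```python
import random

random.seed(0)

TRUE_VALUE = 0.0   # the quantity being estimated, e.g. the true inflation rate
PRIOR = 5.0        # both learners start from the same wrong prior (illustrative)
RATE = 0.05        # base learning rate (illustrative)
STEPS = 2000       # noisy signals seen by each learner
TRIALS = 200       # independent runs, averaged to smooth out noise

def final_belief(disconfirm_weight):
    """Run one learner; disconfirm_weight in (0, 1] is the multiplier
    applied to the learning rate whenever a signal contradicts the
    current view. 1.0 reproduces symmetric (unbiased) updating."""
    belief = PRIOR
    for _ in range(STEPS):
        signal = random.gauss(TRUE_VALUE, 1.0)
        weight = RATE
        if (signal - belief) * belief < 0:   # signal pulls the view towards zero...
            weight *= disconfirm_weight      # ...so treat it sceptically
        belief += weight * (signal - belief)
    return belief

def mean_final_belief(disconfirm_weight):
    return sum(final_belief(disconfirm_weight) for _ in range(TRIALS)) / TRIALS

print("symmetric updater :", round(mean_final_belief(1.0), 2))  # ends near the truth, 0.0
print("asymmetric updater:", round(mean_final_belief(0.2), 2))  # settles well above it
```

Note that the asymmetric updater isn't short of evidence - it sees exactly as many signals as the symmetric one. It simply refuses to weight the inconvenient ones, which is the sense in which an overconfident prior is hard to dislodge.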
I'd add that these mechanisms are supported by two others.
One is that politicians are themselves overconfident; you wouldn't enter a career as precarious as politics if you weren't. Leaders thus tend to buttress their followers' confidence (at least in public).
Secondly, in politics, the cost of being wrong might be large for society, but it is small for the individual. The equity investor who thinks a £5 share is worth £10 will, sometimes, learn the hard way not to be so overconfident. But political partisans don't bear a personal cost for being wrong.
At risk of being epically self-unaware, this corroborates one of my priors - that overconfidence, far from being selected against, can actually be magnified by politics.
You might be interested in what Iain Couzin has shown about animal herd behaviour and confidence - paper here http://www.pnas.org/content/early/2011/12/13/1118318108.abstract and (very entertaining) 1h lecture here http://m.youtube.com/playlist?list=PL7D69D1FFE63DA81A#/watch?v=YzvTMwBD0ZA&feature=plpp
Posted by: Tournyol | July 30, 2013 at 01:47 AM
"But here's the puzzle. There should in theory be a negative correlation between confidence and extremism; note the tagline of my blog. People should think: "most people don't hold my views; it's possible therefore that they know something I don't"."
I disagree, and I think this is wrong because it's looking at it the wrong way round. I suggest that you don't start with a set of views and then decide how confident you are about them. You start with an inherent level of self-confidence and you develop your views accordingly.
So if you are a naturally self-questioning sort of person, you will be continually checking your initial views against reality and will eventually arrive at a position approximating consensus reality. Maybe you start off with a completely wrong estimate of unemployment, but your level of confidence in it is low and so you'll be continually seeking out new sources of information, which will make you less wrong.
But if you're naturally unquestioning, you're pretty much stuck with whatever initial views happen to have seeded themselves into your head, and a lot of the time those will be wrong and extreme views.
Posted by: ajay | July 30, 2013 at 11:35 AM
Fascinating.
But perhaps the more time invested in forming a worldview or political opinion, the higher the bar the individual sets for arguments and evidence which might challenge it?
After all, if you've spent time reading Marx, Locke, Mill, Hayek, Nozick et al and come to the conclusion after many years that, on balance, you are a Classical Liberal, isn't that investment depreciated if you reassess your worldview in the face of every opposing opinion? Especially if you've heard it before and dismissed it previously to your satisfaction?
For example, one of my objections to Marx is the labour theory of value. You may think you have an amazing defence of it, but the chances are I've heard it before, it hasn't convinced me, and I can't be bothered to go through it all again either in a pub or on a blog.
Perhaps a more useful frame is around where political opinions come from. Not everyone gets theirs from reading lots of books and academic study. Some are 'inherited' from parents. Others through identity or the desire to affiliate with a community or lifestyle. Some are based on cynical self-interest (I'm wealthy and therefore like low inflation). Some come from emotion (fear of change or violence).
My observation is that where political views are incorporated strongly into a person's identity, they're unlikely to consider counter-arguments no matter how empirical - because the benefit to their identity of retaining their view is greater than the benefit of holding a different view, which might have a better empirical basis but might also challenge the individual's notion of his constructed identity and how settled he feels.
Holding a view which the evidence doesn't support may be irrational in and of itself, but the costs of changing that view may make it rational for the individual to persist with it.
Thoughts?
Posted by: Staberinde | July 30, 2013 at 12:20 PM
All true enough. I think the second point, about cost to society versus cost to the individual, is key.
I can remember back to the Macmillan government and no government then or since has proved successful, all end in failure, screwup after screwup. So just suppose failure in politics were punished by the sack and a rotten, unavoidable CV - well, no-one would go into the trade. Ipso facto we must just put up with the overconfident.
A rotten job but someone has to do it.
Posted by: rogerh | July 30, 2013 at 02:08 PM
Of course, under dictatorial forms of government political failure can be punished by death or exile. But that does not make the system work better at promoting human happiness either, as the more authoritarian the system, the more rigid the adherence to bad policy.
Staberinde may be right that people embrace political positions for reasons of a non-rational kind. This is Sartre's argument, I believe. But Sartre, as a philosopher, still thinks you can decide certain positions are wrong. Presumably, though, you have to have a very good French philosophical education to do so.
Posted by: Keith | July 30, 2013 at 07:15 PM
"I can remember back to the Macmillan government and no government then or since has proved successful, all end in failure, screwup after screwup."
If this is just another way of saying that governments end when they lose elections, then it's incredibly banal. If it's another way of saying that they lose elections when they get too many things wrong, it's equally banal. If it's trying to say that no government since Macmillan has ever been successful at anything, then it's nuts.
Posted by: ajay | July 31, 2013 at 02:04 PM