It is a truth universally acknowledged that the best lack all conviction whilst the worst are full of passionate intensity. Which is another way of describing the Dunning-Kruger effect - incompetent people don't know they are incompetent. They are overconfident.
However, a new paper by Peter Schanbacher suggests that, in many cases, what looks like overconfidence might in fact be quite rational.
To see his point, take a different case. Imagine a coin is biased to land on heads 80% of the time and tails 20%. Predict the next 10 tosses.
A good answer to this question would simply be 10 heads. This will be more likely to be right than any random-looking sequence of eight heads and two tails. But this answer is biased - it overpredicts heads.
The lesson here is that there can be a trade-off between bias and variance. The "10 heads" answer is biased, but it has less variance (is less likely to be wrong) than (say) HHHTHHHTHH.
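To put numbers on that claim, here is a quick Python check (mine, not from the paper) of the exact-match probabilities and of the expected number of individual tosses each prediction gets right; the mixed sequence is just the example above:

```python
# Chance of heads on each toss of the biased coin.
p = 0.8

# Probability that each prediction matches the realised 10-toss sequence exactly.
p_all_heads = p ** 10                 # predict HHHHHHHHHH    -> ~0.107
p_mixed     = p ** 8 * (1 - p) ** 2   # predict e.g. HHHTHHHTHH -> ~0.007

# Expected number of individual tosses called correctly.
hits_all_heads = 10 * p               # 8.0 correct calls on average
hits_mixed     = 8 * p + 2 * (1 - p)  # 6.8 correct calls on average

print(p_all_heads, p_mixed, hits_all_heads, hits_mixed)
```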
A similar problem faces the ignorant individual when asked about his chances of solving a problem correctly. The same lack of expertise that makes someone ignorant can also mean there is large variance around his estimate of his competence. One rational solution to this, says Schanbacher, is for him to give a biased answer - to over-estimate his competence - just as a biased answer to our coin-toss question is reasonable.
In this sense, overconfidence might be a good answer to a bias-variance trade-off.
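Here is a minimal simulation of that trade-off, sketched under assumptions of my own rather than Schanbacher's actual model: true skill varies across people (drawn here from a Beta(8, 2) distribution), each person judges their own skill from only five of their own attempts, and we compare reporting the raw success rate with reporting an estimate shrunk towards the population average:

```python
import numpy as np

rng = np.random.default_rng(0)
N_PEOPLE, N_TRIALS = 100_000, 5   # few trials => noisy self-knowledge
A, B = 8, 2                       # assumed skill distribution: Beta(8, 2), mean 0.8

# True (unknown) competence of each person and their observed successes.
p_true = rng.beta(A, B, N_PEOPLE)
wins = rng.binomial(N_TRIALS, p_true)

# Two candidate self-reports of one's own competence:
raw    = wins / N_TRIALS                  # unbiased, high variance
shrunk = (wins + A) / (N_TRIALS + A + B)  # biased towards 0.8, low variance

print("MSE raw   :", np.mean((raw    - p_true) ** 2))
print("MSE shrunk:", np.mean((shrunk - p_true) ** 2))

# The least-skilled quartile overestimate themselves under the shrunk report.
low = p_true < np.quantile(p_true, 0.25)
print("average bias of shrunk report among least skilled:",
      np.mean(shrunk[low] - p_true[low]))
```

On these (assumed) numbers the shrunk report has a noticeably lower mean squared error overall, even though the least skilled systematically overstate their competence - which is the sense in which "overconfidence" can be the error-minimizing answer.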
This is not the only way in which overconfidence might be rational:
- Overconfidence might be the only way of steeling oneself to take risky decisions, such as to start one's own business.
- Overconfidence can be mistaken for genuine skill. Being overconfident, then, is a good career strategy.
All this raises a distinction which is often overlooked.
I suspect that the large majority of people who are overconfident have not solved a bias-variance trade-off or chosen a career-enhancing strategy. They are just mistaken. They are therefore irrational, in the sense of holding a belief for which they have no evidence. However, they act as if they were rational, in the sense of doing something that maximizes their chances of success.
In other words, rational behaviour is more common than rational thinking.
"This is not the only way in which overconfidence might be rational:
- Overconfidence might be the only way of steeling oneself to take risky decisions, such as to start one's own business.
- Overconfidence can be mistaken for genuine skill. Being overconfident, then, is a good career strategy."
The second point is entirely correct, and represents a common form of bias in our assessment of others' competence. We believe in braggarts. The first point rests on an unfounded tacit assumption: starting one's own business must be an inherently good thing, with that goodness under-recognized, for deceiving oneself into taking the risk of starting one to count as "rational".
Posted by: kharris | July 09, 2012 at 03:07 PM
So, an incompetent person has the confidence to believe that they have the skills to ensure that they are never irrational or mistaken… I think I know that person… in fact I think I know quite a lot of those people.
Posted by: Edward Harkins | July 09, 2012 at 07:03 PM
Hi Chris
I really enjoy your blog. Therefore, I must point out that "The "10 heads" answer is biased, but it has less variance (is less likely to be wrong) than (say) HHHTHHHTHH" is not true. P(10 heads) = 0.8^10 = 0.107. P(HHHTHHHTHH) = 0.8*0.8*0.8*0.2*0.8*0.8*0.8*0.2*0.8*0.8 = 0.0067. Maybe I misunderstand your phrase "has less variance"?
Anyway, it probably doesn't affect your conclusion, which seems very very sensible.
Posted by: Phil | July 11, 2012 at 01:40 AM
Many apologies, I misread your article! Ignore me ;-)
Posted by: Phil | July 11, 2012 at 01:42 AM
You are playing fast and loose with terms like bias here. Ten heads is by far the most likely individual outcome.
Precisely what do you mean when you say that answer has "bias"? Compared to what other model? The mean squared error, calculated how? Remember, if you answer, that we were asked for a specific sequence prediction.
This bears no relation to the notion of bias/variance trade-off discussed in your (excruciating) first link. And your (Bayesian, and therefore intelligent) second-link author explains why that trade-off is bollocks anyway.
Posted by: Andrew | July 11, 2012 at 10:00 PM
My God that paper is mind-numbing! Do you actually read this stuff for pleasure? Gibberish!:
"The correct answer can be regarded as a random variable Y with Y = 1 if the
subject has choosen the correct answer."
Typical mainstream economics/social science. Submerge total muddy-headed and ill-defined confusion under thirty-three metric shitloads of equations and hope nobody will notice.
For the record, when he says that it can be "rational" to be overconfident he doesn't mean anything of the kind.
He means that, when a subject is asked to estimate their probability of success, giving a number above the mean actual probability may in certain situations produce a lower mean squared error. That is called bias. But it isn't overconfidence. It is the most reliable answer, defined in the experimenter's own terms of MSE, that the subject can give.
Posted by: Andrew | July 11, 2012 at 10:15 PM
The only real point here is that the mean squared error uses... the mean average.
And that there are some distributions where you get a lower MSE if you bias your estimates away from the mean average, given certain assumptions about that average (perhaps towards the median or mode).
That IS NOT the same as over- or underconfidence, despite it being your pet cognitive bias.
Posted by: Andrew | July 11, 2012 at 10:24 PM