More information can lead to worse decision-making. This study of Italian investors finds that "investors who acquire more information attain lower returns per unit of risk."
The reason is over-confidence: people rely too heavily upon facts that have little value in predicting outcomes, and as a result take on too much risk.
This is not just a problem for people daft enough to be stock-pickers. If this is true of investment decisions, isn't it more likely to be true of political beliefs? I say so for four reasons:
1. Incentives. If I make a bad investment decision, I lose money. If I hold bad political beliefs ("you do" you cry) I lose nothing.
2. Feedback. If I pick bad stocks, I lose money. End of, no excuses. But if my favoured policies have adverse effects, I can hide behind any number of immunizing strategies: the policy was badly implemented; there were other causes; the problems are only short-term, and so on.
3. Biased information. The "facts" that form political beliefs are more likely to be a biased sample than those which form stock-pickers' opinions, because they are disproportionately drawn from friends, bloggers and newspapers who already share our views.
4. Groupthink. Stock-picking tends to be a private activity, whereas political beliefs are shared with our friends, fellow bloggers and party colleagues. This can lead to over-confidence in our beliefs, as we think "he's a good guy and he agrees with me - I must be right." Significantly, it seems that when stock-picking is done by groups (pdf), performance worsens.
For these reasons, Mick Hume is right to complain about politicians' reliance upon scientists. The danger is that, as politicians get evidence from scientists, their confidence in their policies will rise by more than the evidence actually warrants.