Back in 2016 I wrote that “immigration controls will hurt decent people…because they are the softest targets.” Paul Cotterill interprets this as meaning that I saw the Windrush scandal coming. I think he’s being too kind, but this illustrates an under-appreciated point – that good predictions can be “thin” in the sense of containing little data.
I’ll give you two other examples from my day job.
One is that back in 2007 I pointed to record foreign buying of US shares as a warning that the All-Share index would fall sharply over the following 12 months. Which it did.
Then last February I forecast that sterling would rise. Which it has – by 15% against the US$.
Now, I don’t say this to toot my own horn. (There's a reason why these examples are ten years apart!) I do so to point to something these calls have in common: both are thin on description. The forecast that share prices would fall in 2008 makes no reference to the financial crisis. And my forecast for sterling ignored most of the things that are usually thought (perhaps wrongly!) to determine exchange rates, such as growth or interest rates.
Instead, both these calls rested upon single facts and single theories – that foreign buying of US equities was a measure of irrational exuberance (it still is); and that exchange rates overshoot (pdf).
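To see how thin the second of these is, here is a minimal sketch in Python of overshooting read as a mean-reversion signal: a rate far below its long-run average is expected to rise. The GBP/USD figures and the 1.5-standard-deviation trigger are invented for illustration – they are not the data or the method behind the actual call.

```python
# A crude sketch (not the actual method behind the call) of overshooting
# as a mean-reversion signal: a rate far from its long-run average is
# expected to move back toward it. All figures are invented.

from statistics import mean, stdev

def reversion_signal(history, latest, threshold=1.5):
    """Flag an expected reversion when the latest rate sits more than
    `threshold` standard deviations from the historical mean."""
    mu, sigma = mean(history), stdev(history)
    z = (latest - mu) / sigma
    if z < -threshold:
        return "undershot: expect a rise"
    if z > threshold:
        return "overshot: expect a fall"
    return "near fair value: no signal"

# Hypothetical GBP/USD history and a post-slump reading.
gbp_usd_history = [1.55, 1.60, 1.52, 1.58, 1.48, 1.54, 1.61, 1.50]
print(reversion_signal(gbp_usd_history, 1.21))   # undershot: expect a rise
```

Note that the rule says nothing about why the rate should rise; it is a lead indicator, not an explanation.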
These are not isolated examples. A single simple fact – is the yield curve inverted or not? – has done a better (pdf) job of predicting recessions than the more data-intensive forecasts of macroeconomists, most of whom have missed (pdf) almost all recessions.
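To show just how little data such a rule needs, here is a hedged sketch of the yield-curve indicator, with hypothetical yields standing in for real Treasury data:

```python
# A minimal sketch of the yield-curve rule: flag a recession warning
# whenever the 10-year yield falls below the 3-month yield. The sample
# yields are hypothetical; a serious version would use actual data.

def curve_inverted(ten_year, three_month):
    """True when the curve is inverted (long yield below short yield)."""
    return ten_year < three_month

# Hypothetical (quarter, 10-year %, 3-month %) observations.
observations = [
    ("2006Q2", 5.1, 4.8),
    ("2006Q4", 4.6, 5.0),   # the curve inverts here
    ("2007Q2", 4.9, 4.7),
]

for quarter, y10, y3m in observations:
    signal = "recession warning" if curve_inverted(y10, y3m) else "no signal"
    print(f"{quarter}: 10y {y10:.1f}%, 3m {y3m:.1f}% -> {signal}")
```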
What this tells us is that sometimes we don’t need big models – micro-founded or not – to predict things. One strong fact sometimes beats lots of weak ones*. In these cases we have lead indicators, but no detailed explanation.
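Why might averaging many weak facts fail to beat one strong one? One possibility – and this is a toy simulation of my own, not anything from the forecasting literature – is that the weak signals share common noise, which averaging cannot remove:

```python
# A toy simulation of one way a single strong fact can beat many weak
# ones: if the weak indicators share common noise, averaging them does
# not average the noise away. All parameters are arbitrary.

import random
import statistics

random.seed(42)
N = 5_000   # simulated periods
K = 20      # number of weak indicators

def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

target = [random.gauss(0, 1) for _ in range(N)]

# One strong indicator: the target plus a little idiosyncratic noise.
strong = [t + random.gauss(0, 0.5) for t in target]

# The average of K weak indicators, each of which inherits the same
# common noise term each period on top of its own idiosyncratic noise.
weak_avg = [
    t + random.gauss(0, 2) + statistics.mean(random.gauss(0, 2) for _ in range(K))
    for t in target
]

print(f"one strong indicator:     corr = {corr(strong, target):.2f}")
print(f"average of {K} weak ones: corr = {corr(weak_avg, target):.2f}")
```

With these numbers the strong indicator should correlate with the target at roughly 0.9 and the averaged weak ones at under 0.5: the common noise survives the averaging.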
In saying this I am echoing Jon Elster:
Sometimes we can explain without being able to predict, and sometimes predict without being able to explain. True, in many cases one and the same theory will enable us to do both, but I believe that in the social sciences this is the exception rather than the rule. (Nuts and Bolts for the Social Sciences, p8)
You might think that this contradicts the message of Tetlock and Gardner’s Superforecasting. They say that foxes (who know many things) make better forecasters than hedgehogs, who know one thing.
I’m not sure there’s a contradiction here. For one thing, many of the cases they consider are one-off political events – sometimes quite obscure ones – where success requires careful data-gathering. In the cases I’m considering, however, data is abundant and the trick is to distinguish between signal and noise, to decide what to ignore. Occasionally, one fact sends a stronger signal than many. Which corroborates Tetlock and Gardner’s point that “in most cases, statistical algorithms beat subjective judgment”: a one-fact rule is, after all, just a very simple statistical algorithm.
This is not, of course, to say that simple lead indicators always work. Obviously, they don’t. My point is merely that forecasting is not the same as modelling, nor the same as telling a good story.
* Certainly, rules of thumb often beat attempts to forecast particular events, but that's a different story.
We could save the intellects of a lot of prospective economists from serious damage by making that particular book of Elster’s compulsory reading for all undergraduates in the subject. Then again, why limit it to prospective economists? Plenty of professionals could learn something too.
Posted by: rjw | April 18, 2018 at 06:20 PM
Do you ever invest on the basis of these forecasts?
Of course the economists whom we see on TV prognosticating and/or justifying the failure of their last prognostications are completely different from the vast majority of serious academic economists.
The Nobel Prize is not awarded to "forecasters".
Posted by: cjcjc | April 19, 2018 at 08:17 AM
Yes, sometimes a strong intuition beats a formal data-driven model. And sometimes the reverse.
The difficulty is obviously knowing which "sometime" dominates in today's decision. Superforecasting's point is simply that knowledge (aka a data-driven perspective) gets a few more "sometimes" right than intuition does.
Posted by: derrida derider | April 20, 2018 at 03:03 AM