"Private firms don't hire people to make DSGE models" says Noah Smith. We should read this alongside Less Wrong's endorsement of Fermi estimates. Both, in different ways, tell us about the virtues of imprecision.

The problem with precision is that it is very often wrong. In economics, this is generally for two reasons: model assumptions don't match reality, which is true by definition of the word "model"; and data are subject to revision or sampling error.

This means there is, as Thomas Mayer said, a trade-off between truth and precision. For example, the statement "GDP has been more or less flat (after seasonal adjustment!) recently" is imprecise, but quite possibly truer than the claim "GDP fell by 0.3% in Q4". And in early 2008 it was more useful to have a rough feeling that we were heading for trouble than it was to have a precise solution to a DSGE model which predicted no recession. It's better to be roughly right than precisely wrong.

Of course, economic forecasters provide precise numbers. But we shouldn't read them as being precise. The guy who's forecasting 1.2% growth this year isn't really saying GDP will rise precisely this amount. He's saying there are reasons to be slightly more optimistic than the consensus (pdf), which is forecasting 0.9% growth.

In other contexts, a rough and ready estimate is mostly good enough, at least as a starting point. Here are three examples:

1. How much does welfare scrounging cost the economy? Guesstimate the number of scroungers. Guesstimate the value-added they'd contribute if they were working. Express as a proportion of GDP. For plausible values, it's a small number.

2. What impact will the small uprating in the minimum wage have on jobs? The adult rate will rise by 1.9%. Economists forecast inflation this year of 2.5%, so this is roughly a 0.6% real fall. Let's call the price-elasticity of demand for labour 1.5. The Low Pay Commission estimates (pdf) that 5.3% of jobs are at or around the minimum wage. Multiply these three numbers together and we get 0.048%. Multiply by the number of jobs in the economy (29.73m) and we have roughly 14,000. That's roughly one-eleventh of the sampling variability of the employment figures.

3. How risky are shares? The long-term standard deviation of UK annual returns has been 20 percentage points. If we assume average annual real returns of 5%, this implies there's a one-in-six chance of losing 15% or more in a 12-month period. If we assume returns are serially uncorrelated, this implies a one-in-six chance of a weekly fall of 2.7% or more, and of a daily fall of 1.2%. What are the chances of a crash - say a 10% fall in a day? That is an 8.3 standard deviation event, which a normal distribution tells us is vanishingly unlikely. However, we know that a cubic power law fits the data better. And this tells us we should expect such a move once every 3173 days, or about once every 12 years. This isn't intended as a precise number, but an illustration of the sort of risk involved.
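The arithmetic in examples 2 and 3 fits in a few lines of code. This is a back-of-the-envelope sketch using the rough figures above; the calibration of the power law (anchoring a one-in-six daily fall at one standard deviation) is my own assumption, so read the outputs as orders of magnitude, not forecasts.

```python
# Example 2: jobs effect of the minimum wage uprating.
# All inputs are the rough figures from the post; nothing here is precise.
nominal_rise = 0.019        # adult rate rises 1.9%
inflation = 0.025           # forecast inflation this year
real_fall = inflation - nominal_rise    # ~0.6% real fall in the wage
elasticity = 1.5            # assumed price-elasticity of demand for labour
share_min_wage = 0.053      # share of jobs at around the minimum wage (LPC)
total_jobs = 29.73e6        # jobs in the economy

jobs_effect = real_fall * elasticity * share_min_wage * total_jobs  # ~14,000
print(f"Jobs effect: roughly {jobs_effect:,.0f}")

# Example 3: chance of a 10% one-day fall in shares.
sd_annual = 20.0            # annual standard deviation, percentage points
trading_days = 260
sd_daily = sd_annual / trading_days ** 0.5  # ~1.2% if returns are uncorrelated
sigmas = 10.0 / sd_daily                    # a 10% fall is ~8 standard deviations

# Cubic power law: P(fall > x) shrinks with the cube of x. Calibrate so a
# one-standard-deviation daily fall has probability ~1/6 (my assumption).
p_crash = (1 / 6) * (sd_daily / 10.0) ** 3
days_between = 1 / p_crash
print(f"Daily sd ~{sd_daily:.1f}%; a 10% fall is ~{sigmas:.1f} sd")
print(f"Cubic law: one such fall every ~{days_between:,.0f} days, "
      f"~{days_between / trading_days:.0f} years")
```

Slightly different inputs (e.g. 250 trading days, or rounding the daily standard deviation to 1.2%) move the answers around, which is exactly the point: the conclusions - small, small, big - don't change.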

You can quibble with all these numbers. But what would be the point? The answers to my three questions would still be: small, small, big. And the implications for economic policy or for your personal financial planning wouldn't change much either. Indeed, we know from the crisis of 2008 that attempts at more precise risk management did not yield better results than would the back of a beermat.

My point here should be a trivial one - that rough numbers and a feel for magnitudes can get us a long way. This shouldn't need saying. But I fear it does, as it's a correction to two vices. One is a statistical fetishism which thinks that numbers must somehow be precise, and which uses them not to illuminate the truth but to hide it. The other is an academic rigour which inculcates in students an ability to give precise answers to irrelevant questions.

Two responses:

One: Alternatively, you could say that point estimates (predictions) should come with error bars, and we should care greatly about how wide they are. Are there any economic forecasters who do this? I'd be surprised if there weren't.

Two: sometimes you need a numerical estimate/prediction to feed into some other process (e.g. you need to predict demand for something so you can order stock). Here, you need a numerical estimate, regardless of precision. You accept the error (i.e. eat the loss caused by it, or force your supplier to eat it for you).

Perhaps you're really ranting against the tiny up/down flickers of GDP that are freighted with so much import; but that issue is manufactured by Osborne (in fact, the only manufacture that has increased under his tenure as Chancellor).

Posted by: william | April 15, 2013 at 03:35 PM

@ William - the Bank of England routinely provides error bars. I suspect they're implicit in most other forecasts. But it is often criticised by non-economists for not being more precise; there's a tradeoff between precision and accountability.

I'm not sure how many people in practice base inventories on forecasts. I suspect it's more often a rule of thumb (buy x when inventories hit y), adjusted for e.g. seasonal variations.

Posted by: chris | April 15, 2013 at 05:44 PM

Mechanisms rather than models.

An approximate answer to the right question is always better than a precise answer to the wrong question.

Posted by: rogerh | April 16, 2013 at 06:47 AM

The word you are looking for is "accuracy".

Your post is about the well-understood ( at least in science) trade-off between precision and accuracy.

Posted by: Andrew | April 17, 2013 at 01:37 PM