A generation of youngsters may be developing a skewed view of sex from pornography, a court has heard, after a 12-year-old schoolboy raped and sexually assaulted a younger girl after copying a hardcore film he watched on the internet.
He's right to object. This is an example of what we might call the "journalist's fallacy" - though in this case it's perpetrated by someone from a profession even more ignorant of statistics than journalists. I mean by this the tendency to draw strong inferences from one or two observations, without asking: are these observations a representative sample? In this example, we have a good reason to suspect not; 12-year-old rapists are rare, whilst internet porn is ubiquitous.
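To see why the base rates matter here, consider a back-of-the-envelope calculation. The numbers below are purely illustrative assumptions, not data; the point is only the arithmetic:

```python
# Base-rate sketch with made-up, illustrative shares. If porn-watching is
# near-ubiquitous and offending is vanishingly rare, then even if every
# offender had watched porn, the share of porn-watchers who offend is tiny -
# so one offender tells us almost nothing about porn's effects.

porn_watchers = 0.9    # assumed share of 12-year-old boys who've seen internet porn
offenders = 0.0001     # assumed share of 12-year-old boys who commit such an offence

# Upper bound on P(offend | watched porn), reached if all offenders watched porn:
p_offend_given_porn = offenders / porn_watchers
print(f"{p_offend_given_porn:.6f}")  # prints 0.000111
```

A single vivid case sits in the tiny numerator; the ubiquitous exposure sits in the large denominator, which is exactly what the anecdote-driven inference ignores.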
I call this the journalist's fallacy (though it's related to the availability heuristic) because journalists are especially prone to it - perhaps because they know that a human interest story or lively image makes for "better writing" than statistical evidence, and they mistake good writing for good thinking*.
Take just three examples.
In the Times, Anushka Asthana writes of "women friends who say they felt they had to slip their engagement rings into their pockets before job interviews". But she doesn't say that, among people of fianceable age (22-29), women actually earn more than men, suggesting either that such behaviour is unnecessary or that employers are easily fooled.
Zoe Williams writes: "My son and his friends are constantly worrying about sugar, how it makes you fat, how you can find it even in places you don't expect." But she doesn't tell us whether these children are representative or not.
Joan Smith writes:
The problem is how many adults ignore health advice altogether. Alcoholism is a huge social problem and so is obesity. A couple of days ago, I walked past a shop where a hugely overweight assistant had slipped outside for a quick cigarette, and I couldn't believe that someone who already had a life-shortening condition was blithely risking lung cancer as well.
She fails to see that one lardy doesn't provide evidence of "many".
I mention these not because they are especially bad examples - I'm sure you could find much worse - but simply because they are recent ones, taken from three different papers today and yesterday. That they are so easy to find might be a sign of how common the error is.
Not only is the error (I suspect) common, it might also be costly, in three related ways:
- When we rely so much upon personal experience and anecdote, other information gets downgraded. That information is the hard-found evidence provided by serious statistical research. The journalist's fallacy can easily lead to an anti-intellectualism in which personal, biased and limited experience is prioritized over proper social science. One example of this could be Mary Ann Sieghart's assertion that grammar schools would increase social mobility, oblivious to the fact that the hard research on this issue is, ahem, more ambiguous.
- Ignorance of serious social science breeds overconfidence. The message of lots of research is that facts are hard to come by, exceptions are common, and evidence is ambiguous or missing. Such warnings rarely surround anecdotal evidence.
- The combination of overconfidence and generalization from what are often vivid and extreme cases (our 12-year-old alleged rapist) can lead to demands that "something must be done". It might be no accident that the increased influence of columnists upon politicians (if not the public!) under the last government coincided with legislative hyper-activism.
* It does not, of course, follow that bad writing is a mark of good thinking; this is the sociologist's fallacy.