Pi Attitude Zone: Ethics & Altruism

How Do I Know? An InfoGraphic Told Me

Are they serious?  Does that scary graph really mean what I think it means?

A recent bulletin from Britain’s National Union of Journalists prompts Pi to remind this blog’s readers what a tricky area statistics-based charting is.  In the era of visual aids, you need to be constantly on the lookout for truth-bending graphic presentations of numerical “facts”.

It’s a constant danger with market research, especially when the data is in the hands of people with a point to prove.  Pi sees with depressing frequency the kind of charts – and headlines inspired by charts – that twist the truth into pretzel shapes.  For instance, look carefully at the date labels on timeline charts: quite often the intervals between years get shorter as the timeline advances, which compresses the later data horizontally and can give the false impression that a wiggly-line trend is rapidly accelerating when in fact no such thing is happening.  Similarly, bar charts can be made to seem far more dramatic by truncating the scale on the vertical axis so the bars start from some convenient number rather than from zero.  Viewer beware.
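The axis trick is pure arithmetic. A minimal sketch, with made-up sales figures, showing how starting the bars at 90 instead of zero turns a 5% difference into an apparent 50% one:

```python
# Hypothetical bar-chart values: 100 vs 105 units -- a 5% real difference.
values = [100, 105]

# Honest chart: bars drawn from zero.
honest_ratio = values[1] / values[0]              # 1.05 -> bars look 5% apart

# Truncated chart: bars drawn from a baseline of 90.
baseline = 90
truncated_ratio = (values[1] - baseline) / (values[0] - baseline)  # 1.5 -> second bar looks 50% taller

print(f"honest: {honest_ratio:.2f}x, truncated: {truncated_ratio:.2f}x")
```

The underlying numbers never change; only the portion of each bar the viewer gets to see does.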

Then there are all those false-use-of-averages charts, and the silly pronouncements based on them.  Britain’s all-wise, all-knowing BBC recently ran a news story saying that – shock, horror! – “49% of internet users get less than the national average broadband speed”.  Okaaaayyyy....  since the whole statement hinges on an average, for heaven’s sake, you would expect roughly half of people to be below it and the other half above, no?  (Strictly speaking, it’s the median that splits the population exactly in half, but for most distributions the mean lands nearby.)  Yet the story is positioned as if it were a national scandal that “they” ought to do something about.
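In fact, with the kind of right-skewed distribution broadband speeds typically follow, well over half of users can sit below the mean, which makes the headline even less remarkable. A quick sketch with made-up speeds for ten users:

```python
import statistics

# Hypothetical broadband speeds in Mbit/s -- numbers invented for illustration.
speeds = [2, 3, 4, 5, 6, 7, 8, 10, 20, 55]

mean = statistics.mean(speeds)      # 12.0 -- dragged upward by the two fast connections
median = statistics.median(speeds)  # 6.5  -- exactly half the users sit on either side

below_mean = sum(s < mean for s in speeds)
print(f"mean={mean}, median={median}, below the mean: {below_mean} of {len(speeds)}")
```

Here eight of ten users are “below average”, and nothing scandalous has happened: two fast connections pulled the mean above the typical user.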

Then there was the story in a British tabloid newspaper, headlined “Daily Fry-Up Boosts Cancer Risk By 20%”.  The story postulated that people who eat lots of fried food have a “significantly” increased chance of contracting pancreatic cancer.  If you took the trouble to delve deeper, however, you would realize that five strict vegetarians per thousand die from the disease, while the rate for people who eat bacon fry-ups is – wait for it – six people per thousand.  The numerical basis for the “20% increased cancer risk” story was therefore a difference of one person per thousand: an absolute increase of 0.1 percentage points.  And this is their justification for starting yet another national health scare?
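The trick here is quoting relative risk when the absolute risk is tiny. The arithmetic behind both framings, using the per-thousand rates from the story:

```python
# Per-thousand mortality rates quoted in the story.
baseline = 5 / 1000   # strict vegetarians
fry_up   = 6 / 1000   # daily fry-up eaters

relative_increase = (fry_up - baseline) / baseline   # 0.20 -> the "20%" headline
absolute_increase = fry_up - baseline                # 0.001 -> 0.1 percentage points

print(f"relative: {relative_increase:.0%}, absolute: {absolute_increase:.1%}")
```

Both numbers are “true”; the headline simply picks the one that sounds two hundred times scarier.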

Then there are the cases where a “statistical result” trumpeted in the news media is actually smaller than the margin of statistical error (the half-width of the survey’s “confidence interval”).  Example:  a headline saying “Employment shock: jobless up 38,000” was based on a randomized sampling of local job-center figures, with a built-in error potential of plus or minus 87,000.  The reality could have been that the jobless number fell by up to 49,000, or rose by as much as 125,000.  But they don’t tell you that, do they?
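The plausible range is just the point estimate plus or minus the quoted sampling error, which here swallows the headline figure whole:

```python
point_estimate = 38_000    # headline change in the jobless count
margin_of_error = 87_000   # plus-or-minus sampling error quoted for the survey

low = point_estimate - margin_of_error    # -49,000: jobless may actually have fallen
high = point_estimate + margin_of_error   # +125,000: or risen far more than reported

print(f"plausible range: {low:+,} to {high:+,}")
```

When the margin of error is more than twice the headline number, the honest summary is that the survey cannot even tell you the direction of the change.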

If they did, the headline would have read “Jobless figures probably up, or maybe down”.  As an eyeball-grabber, it doesn’t quite have the same punch, does it?

*Sigh*  Pi wonders whether all journalists and propagators of press releases shouldn’t be forced to take a basic course in statistics, and sign a pledge not to use numbers to mislead people.

Zone: Ethics & Altruism Country: Europe Product – Communications