Columbia Journalism Review lowers journalistic standards while lecturing on said standards: Part I – Suppressing Uncertainty

Isn’t Columbia University supposed to be good at journalism?  And isn’t Columbia Journalism Review considered to be a decent publication?

It sure doesn’t look that way based on this remarkably poor article by Pierre Bienaimé, which turns out to be right up my alley.

The article is an embarrassment of riches for the blog, since it spreads numerous falsehoods about the Iraq conflict while simultaneously bidding for a record for the number of journalism traps that can be packed into a single little article.

So this is a teaching moment.  But I’ll have to spread the material out over several posts.

Where to start?

I have a huge amount of respect for Beth Daponte, and she has quite a nice quote in the article:

When asked about journalistic responsibility, Daponte explains that the press must “take each study and really look at what it does say and what it doesn’t say. This is where journalists really do need to have at least a rudimentary understanding of statistics and confidence intervals, and what sampling really means.”

Bienaimé immediately botches the explanation of what a confidence interval is, surely leading some readers to conclude that Daponte herself does not know what she’s doing.

A confidence interval is the degree of certainty researchers attain that their estimates—in this case, of the death toll—fall within a certain range.

That was CJR, not Beth Daponte, explaining confidence intervals.  For the record, the 95% describes the procedure, not a single result: build intervals this way from repeated samples and about 95% of them will contain the true value.  It is not a degree of certainty that researchers “attain” for one particular range.

Given that he does not know what a confidence interval is, it is not surprising that Bienaimé reports only central estimates, with no quantification whatsoever of the uncertainty surrounding them.  This is a cardinal sin when reporting on statistical estimates.
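To make the distinction concrete, here is a minimal simulation sketch of my own.  The true mean, noise level, and sample size are made-up illustrations with nothing to do with either Iraq study; the point is only that “95%” describes how often the interval-building procedure captures the true value over repeated sampling, not a certainty attached to any single published interval.

```python
# Minimal sketch: what "95% confidence" actually refers to.
# All numbers here are arbitrary illustrations, not data from either study.
import numpy as np

rng = np.random.default_rng(0)
true_mean = 100.0          # the quantity we pretend to be estimating
n, trials = 50, 10_000     # sample size per survey, number of repeated surveys
covered = 0

for _ in range(trials):
    sample = rng.normal(loc=true_mean, scale=20.0, size=n)
    est = sample.mean()
    se = sample.std(ddof=1) / np.sqrt(n)
    lo, hi = est - 1.96 * se, est + 1.96 * se   # normal-approximation 95% CI
    covered += lo <= true_mean <= hi

# Roughly 95% of the intervals contain the true value; any single interval
# either does or does not, which is the sense in which "95%" applies.
print(f"coverage over {trials} repetitions: {covered / trials:.3f}")
```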

Bienaimé compounds the error by reporting central estimates to a preposterous number of significant digits, creating a spurious sense of diamond-sharp precision.

The 2006 study estimated that the war had caused roughly 654,965 excess deaths….

A study published in PLOS Medicine in 2013, applying the same surveying methods used in the Lancet studies on a broader scale, estimated 461,000 excess deaths throughout the war from 2003 to 2011.

Maybe the 2006 study was a little off, with the true number being 654,963?

In fairness, Bienaimé just reproduces the quantitatively ignorant presentation of the original paper, but at least the original gives a confidence interval.  And perhaps one can make a case for carrying non-zero digits to the thousands place, as the second study does.

Totally suppressing all uncertainty is unconscionable.

For your convenience, the 95% uncertainty interval for:

1.  the 2006 study is 392,979 to 942,636 (with no information provided on the digits in the tenths place, although I suspect the top should really go up to 942,636.199);

2.  the 2013 study is 48,000 to 751,000.
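For illustration only, here is a small sketch of how those numbers could be reported with the uncertainty attached.  The interval endpoints are the ones quoted above; the choice to round everything to the nearest thousand is my own display decision, not something prescribed by either study.

```python
# Sketch: report central estimates together with their 95% intervals,
# rounded to a precision the data can plausibly support (here: thousands).
def report(label: str, estimate: float, lo: float, hi: float, to: int = 1_000) -> str:
    r = lambda x: int(round(x / to) * to)
    return f"{label}: roughly {r(estimate):,} excess deaths (95% CI {r(lo):,} to {r(hi):,})"

print(report("Lancet 2006", 654_965, 392_979, 942_636))
print(report("PLOS Medicine 2013", 461_000, 48_000, 751_000))
# Lancet 2006: roughly 655,000 excess deaths (95% CI 393,000 to 943,000)
# PLOS Medicine 2013: roughly 461,000 excess deaths (95% CI 48,000 to 751,000)
```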


3 thoughts on “Columbia Journalism Review lowers journalistic standards while lecturing on said standards: Part I – Suppressing Uncertainty”

  1. Good point on the CI’s. It seems to be a common habit for some to drop these and operate as if you just have a single solid number. Although with this 461,000 number there actually isn’t a proper CI. This was sort of a speculative fudge that pushed the number from 405,000 up to 461,000 and then no CI is given. I suppose you could just carry the original CI over but that wouldn’t really be right. If you tried to somehow make a CI for the fudged up number it would probably have to be even wider than the original.
