Lies, Damn Lies, and New York Times Graphics

The New York Times has a policy on anonymous sources. Great! But do they have a policy on statistics? They certainly need it. I mean, take a look at the graphic from an article on women and smart phones:
[New York Times graphic from the article, captioned "women are more concerned with price, size and design"]

I mean, sure, the numbers on the right-hand side could be used to support the caption "women are more concerned with price, size and design" (and keyboards, by the same reasoning), but the hell if I'm going to believe that without knowing a reasonable estimate of the margin of error in the survey being reported on. Without such data, the graphic has absolutely no content (and if the margin of error is plus or minus three points, you're still damn safe ignoring the results). How can a major media outlet, and one that is constantly bashed as being intellectually elitist, get away with this kind of garbage? (Oh, right: they're entertainment, and numbers are only entertaining if you don't have to actually think about what they mean.)
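
For a sense of scale, here's roughly how the 95% margin of error on a reported percentage gets estimated; the sample size and the 30% figure below are made-up assumptions, since the article gives no methodology at all:

    import math

    def margin_of_error(p, n, z=1.96):
        # Approximate 95% margin of error for a sample proportion p from n respondents.
        return z * math.sqrt(p * (1 - p) / n)

    # Hypothetical numbers: 30% of respondents picked "price" in a survey of 400 women.
    p, n = 0.30, 400
    print(f"{p:.0%} +/- {margin_of_error(p, n):.1%}")  # about 30% +/- 4.5%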

Yep. There's not much in the way of serious mainstream media anymore. It's not that there's a 'liberal bias'; it's that there's a bias toward making money, and that means entertaining instead of enlightening.

The sum of the numbers in the graphic is 78% for the women and 76% for the men. Which suggests that (a) they probably asked "which factor is most important," ignoring that most respondents would consider more than one factor important, and (b) for both genders, more respondents chose "none of the above" than chose any of the listed options. So in addition to your concerns about the margin of error, there is the problem that the survey was badly designed to begin with. I've seen more informative "statistical" graphics than this one on the pages of the Onion.
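
To make the arithmetic behind that reading explicit, here's a quick check with hypothetical per-option numbers (the individual values from the graphic aren't reproduced here; only the 78% total is):

    # Hypothetical breakdown for the women's column; only the 78% total comes from the graphic.
    options = {"price": 21, "size": 20, "design": 19, "keyboard": 18}
    listed_total = sum(options.values())        # 78
    none_of_the_above = 100 - listed_total      # 22
    # If no single listed option tops 22%, "none of the above" is the plurality response.
    print(none_of_the_above > max(options.values()))  # True for these assumed numbers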

By Eric Lund (not verified) on 11 Jun 2008 #permalink

I am constantly running into this problem trying to read the news and such... what is the methodology? I recently heard an interesting statistic quoted from The Omnivore's Dilemma that 33% of Americans eat at least one fast food meal a day. In trying to parse that, I was struck with so many questions that the statistic's interest quickly faded. What is fast food? What is the margin of error? Are weekends counted? Is each participant in the study (of unknown size) tracked and required to eat fast food all seven days of the week, or is it that on any given day, E[# of Americans eating fast food today]/# of Americans = 0.33?
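
Those really are two different claims. A toy simulation (every number and the habit model below are invented purely for illustration) shows how "a third of Americans eat fast food on any given day" can coexist with almost nobody eating it every single day:

    import random

    random.seed(0)
    # Assumed habit model: each simulated person eats fast food on any given day
    # with a personal probability drawn from Beta(2, 4), whose mean is about 0.33.
    people = [random.betavariate(2, 4) for _ in range(100_000)]

    days = 7
    eats_today = sum(random.random() < p for p in people) / len(people)
    eats_all_week = sum(all(random.random() < p for _ in range(days)) for p in people) / len(people)

    print(f"fraction eating fast food on a given day: {eats_today:.2f}")    # about 0.33
    print(f"fraction eating fast food all {days} days:   {eats_all_week:.2f}")  # about 0.01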

I guess learning some rudimentary statistics prohibits you from ever seeing factoids the same way again. I think that's a good thing...

By Chris Granade (not verified) on 11 Jun 2008 #permalink

In Israel, the law stipulates that you have to report the sample size and standard error with every published statistic. I'm surprised you don't have an equivalent law in the States.

By Eyal Ben David (not verified) on 11 Jun 2008 #permalink

Come on. You're a quantum scientist. Use your quantum consciousness to quantumly manifest a quantum margin of error. Well, quantumly speaking.

By JohnQPublic (not verified) on 11 Jun 2008 #permalink

Unfortunately, JohnQ, "quantum" is not entirely analogous to "bullshit" even if *you* can't tell the difference.

Great print journalism is still around. Unfortunately, it seems as though it is getting harder and harder to find. Thankfully, we now have the internet to compensate.

Sorry, my poor attempt at humor was misunderstood. I was poking fun at how often the word "quantum" gets picked up by the public ("Quantum Healing", "Quantum Wellness", "Quantum Consciousness", etc.). I meant in no way to equate it with bullshit. (I certainly do not think that.) And I certainly didn't mean to disparage anyone in the field.

By JohnQPublic (not verified) on 11 Jun 2008 #permalink

I was far more concerned with my phone's wifi capabilities than the price, Kea.

Yeah, that lot of stats is pretty bunk.

Visual Revelations: "Improving Data Displays: Ours and the Media's," by Howard Wainer, Chance (Summer 2007).

[Chance is a publication of the American Statistical Association (ASA), a scientific and educational society founded in 1839 with the following mission: to promote excellence in the application of statistical science across the wealth of human endeavor.]

"In the transactions between scientists and the media, influence flows in both directions. About 25 years ago I wrote an oft-cited article with the ironic title: 'How to Display Data Badly.' In it I chose a dozen or so examples of flawed displays and suggested paths toward improvement. Two major newspapers, The New York Times and The Washington Post, were the source of most of my examples, which were drawn over a remarkably short period of time. It wasn't hard to find examples of bad graphs."

"Happily in the intervening years, the same papers have become increasingly aware of the canons of good practice and improved their data displays profoundly. Indeed, when one considers both the complexity of the data often displayed and the short time intervals permitted for their preparation, the results are frequently remarkable.... Graphical practices in scientific journals have not evolved as fast as the mass media. It is time we learned from their example...."

Damn right. That's why I don't believe any numbers unless they have a p-value next to them.

By Josh Schraiber (not verified) on 12 Jun 2008 #permalink