A poll of 1,200 undergrads at 100 colleges in the United States found that 73% of the students think iPods are “in”. One tenth of all old people know that “in” means “hip”. Half of all old people think “hip” means “the thing I just got replaced”. Drinking beer and
stalking Facebook tied for second most “in” thing — scoring affirmative amongst “71%” of the students. Sorry, got a little bit too aggressive with the quotes; I promise it won’t happen again. Given my infatuation with alcohol, I figured this problem needed to be addressed. By problem, I mean the 29% who don’t think beer is the shit.
Please bear with me as I take a diversion into nerdiness (some will wonder whether I really need to divert from anything to approach nerdiness). The article notes, “The margin of error is plus or minus 2.3 percentage points.” I’ve played around with different values, and I can’t figure out what the hell they are referring to as the “margin of error”. (Those quotes were necessary.) The 95% confidence intervals for the two results, iPods and beer, are +/-2.5% and +/-2.6%, respectively. Did they just divide one side of the C.I. in half to get the margin of error?
I also remembered (from intro stats) that many surveys assume that the probability of success is 0.5 (rather than the observed values of 0.73 and 0.71) to maximize the error (making any conclusions highly conservative). If we do that, we get a 95% C.I. of +/-2.8%. Once again, there is no obvious way to get from that C.I. to the reported 2.3% margin of error. Anyone have any clue what the hell the margin of error means? Drop a note in the comments if you’ve got an idea.
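For anyone who wants to check the arithmetic, those half-widths fall out of the standard Wald formula, z * sqrt(p * (1 - p) / n). Here’s a quick sketch (n = 1,200 and z = 1.96 are the only inputs; the function name is mine):

```python
import math

def margin(p, n=1200, z=1.96):
    """Half-width of a Wald (normal-approximation) confidence interval
    for a proportion p from a sample of size n; z = 1.96 gives 95%."""
    return z * math.sqrt(p * (1 - p) / n)

for label, p in [("iPods", 0.73), ("beer", 0.71), ("conservative p=0.5", 0.50)]:
    print(f"{label}: +/- {100 * margin(p):.1f}%")
# iPods:              +/- 2.5%
# beer:               +/- 2.6%
# conservative p=0.5: +/- 2.8%
```

None of the three lands anywhere near 2.3%, which is the whole mystery.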
It’s important to note that iPods are not significantly more popular than beer (p=0.8). Thank god. A significant difference would provide irrefutable evidence that college students are absolute tools. Please follow that link. It was the number one Google Image result for the search string beer+ipod.
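The article doesn’t say what test produced that p-value, so here’s one plausible candidate: a pooled two-proportion z-test that (probably wrongly) treats the two questions as independent samples of 1,200 each. It lands closer to p ≈ 0.28 than 0.8, but either way the difference is nowhere near significant at the usual 0.05 level:

```python
import math

# Assumed inputs: 73% vs. 71%, treated as two independent samples of
# n = 1200 each -- this likely misstates the actual survey design,
# since both questions went to the same respondents.
p1, p2, n = 0.73, 0.71, 1200
p_pool = (p1 + p2) / 2
se = math.sqrt(p_pool * (1 - p_pool) * (2 / n))
z = (p1 - p2) / se

# Two-sided p-value from the standard normal CDF (via the error function).
p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
print(f"z = {z:.2f}, p = {p_value:.2f}")
```

Whatever the authors actually ran, the conclusion survives: no significant iPod-over-beer effect.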