Are bloggers better writers, or do they just like to listen to themselves talk?

Today's analysis of the Blogger SAT Challenge results is the one I've been looking forward to the most. After subjecting 109 people to a sample question from the SAT writing test, we've learned that bloggers are dumber than high school kids (though there's some reason to question that analysis): our participants, most of them bloggers, didn't fare nearly as well as the high schoolers did.

But bloggers have all sorts of excuses to explain their poor results: They were multitasking at the time; they hadn't spent 18 months in an SAT prep course like the high schoolers; the judges don't "get" sarcasm. Fine. Let's compare the bloggers to another group with all the same disadvantages: the non-bloggers who participated in the challenge. Of the 109 participants, only 63 were genuine bloggers with listings in Technorati. So, how did they compare to the non-bloggers in the group?

Actually, not too badly:

The bloggers' average grade on the test was 3.13, while non-bloggers averaged just 2.60. Bloggers still didn't score better than high schoolers: their 5.74 average SAT-format grade was much lower than the high schoolers' 7.2 average. Below the fold, I'll present a graph of three different comparisons between bloggers and non-bloggers.

[Figure: bar graph of three blogger vs. non-blogger comparisons: average grade, average reader rating, and average most popular rating.]

As you can see, bloggers did better on all three measures (though only the "average most popular rating" difference was actually statistically significant).
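For readers who want to check a group comparison like this themselves, a two-sample t-test is the standard tool. Here's a minimal sketch in Python; the score lists are made-up stand-ins, since the raw per-participant grades aren't published with this post.

```python
# A sketch of the blogger vs. non-blogger comparison as a two-sample
# t-test. These score lists are hypothetical stand-ins; the actual
# per-participant grades aren't published with this post.
import numpy as np
from scipy import stats

blogger_grades = np.array([4, 3, 5, 2, 3, 4, 3, 2, 4, 3])     # hypothetical
nonblogger_grades = np.array([2, 3, 2, 3, 1, 3, 2, 4, 2, 3])  # hypothetical

t_stat, p_value = stats.ttest_ind(blogger_grades, nonblogger_grades)
print(f"blogger mean:     {blogger_grades.mean():.2f}")
print(f"non-blogger mean: {nonblogger_grades.mean():.2f}")
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.3f}")
```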

But this brings up another question: can we find scoring differences among the bloggers as well? I looked up the Technorati rankings for each of the 63 participating blogs and recorded the number of blogs linking to each one. Here's a chart showing the relationship between the number of links to each participating blog and that blogger's score on the Challenge.

[Figure: scatter plot of the number of blogs linking to each participating blog vs. that blogger's Challenge grade, with trendline.]

The trendline shows that scores do appear to be higher when blogs have more links to them. Since the number of blogs linking to a particular blog is an accepted measure of a blog's popularity, we can say that more popular bloggers tend to score higher on the SAT test. Indeed, the number of links to a blog is positively correlated with grade, with a correlation coefficient of .36 -- a statistically significant correlation!
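If you'd like to reproduce this kind of analysis on your own data, here's a minimal sketch of the correlation computation. The link counts and grades below are invented for illustration; the r = .36 reported above comes from the real 63-blog dataset.

```python
# A sketch of the links-vs-grade correlation. Both arrays are invented
# for illustration; the r = .36 reported above comes from the real
# 63-blog dataset.
import numpy as np
from scipy import stats

inbound_links = np.array([5, 12, 40, 7, 150, 3, 60, 25, 9, 300])  # hypothetical
grades = np.array([2, 3, 4, 2, 5, 2, 4, 3, 3, 6])                 # hypothetical

r, p_value = stats.pearsonr(inbound_links, grades)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")

# Link counts are usually heavily skewed, so a rank-based alternative
# is worth checking too:
rho, p_rho = stats.spearmanr(inbound_links, grades)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.3f}")
```

One design note: because inbound-link counts are typically so skewed, some analysts would prefer the rank-based Spearman measure here; treat the Pearson version as the simplest option, not the only one.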

The chart of average reader ratings tells a similar story:

[Figure: scatter plot of the number of blogs linking to each participating blog vs. the average reader rating of that blogger's essay, with trendline.]

Once again, there's a significant positive correlation (r = .32) between the number of links to a blog and the quality of the blog author's SAT Challenge essay.
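(A quick aside on what "significant" means here: given only the reported summary figures, r = .32 and n = 63, you can check the significance yourself with the standard t-approximation for a Pearson correlation coefficient. A sketch:)

```python
# Checking significance from the reported summary statistics alone:
# with r = .32 and n = 63, the usual t-approximation for a Pearson
# correlation gives a two-tailed p-value of about .01.
import math
from scipy import stats

r, n = 0.32, 63
t = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)
p_value = 2 * stats.t.sf(abs(t), df=n - 2)
print(f"t = {t:.2f}, two-tailed p = {p_value:.3f}")  # roughly t = 2.64, p = .011
```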

What this data doesn't tell us is whether blogging makes you a better writer. It might simply be that better writers are more likely to take up blogging in the first place. Nonetheless, these results may be something of a vindication for the bloggers. They may not be as good as high schoolers, but at least they're better than all those other schmucks out there!


How about the correlation with time since they started blogging?

Blogging time should correlate with writing ability, simply because practice makes perfect. But it should also correlate with popularity, since it takes time to build an audience, and since becoming a better writer over time should also mean more enjoyment for the reader.

By that token, a "naive" blogger, the non-bloggers, could perhaps fit into that data as the close-to-zero writing experience group?

It would also be interesting to look at frequency of posts (along with the "practice makes perfect" theme) and the content of posts. Personal ramblings on MySpace would correlate with a different score, I think, than an academic blog.

I couldn't get my brain going to even think about the graphs. How about this: good blogs are good to read if you like the subject they're talking about, and blogs are boring when you're not interested because the subject isn't important to you.

ck and Janne: I think both frequency of posting and length of blogging career are probably related quite reliably to blog popularity, so we may have those issues covered.

ck: Hmmm, yes, a content-based analysis might be interesting, but I think I've about hit my limit on this subject.

Pete: Yes, clearly writing isn't among your interests. Perhaps you could take up fishing.

I don't know whether I am a good writer. I do know that I'm a better writer than I used to be. I also know that the main thing that improved my writing was getting into email flame-wars with my fellow grad students back in the day. There's nothing like having your ego converted into text and put on the chopping block for immediate feedback in front of an audience to really condition you into being (more) worth reading (than you were before).

Why shouldn't the same be true of blogs?

Could it be that beyond a point, better writing (as judged by an expert's score) turns out to be negatively correlated with a blog's popularity? Perhaps because it's too difficult, or too different from what their audience can relate to?

The second figure is not really convincing. What happens to the trendline and correlation if you remove the point with a grade of 6?

Lylebot, the correlation diminishes substantially (to r=0.21) if you remove the outlier, but in principle, I find little justification for doing so. The same essay was rated highly by both the expert graders and by the blog readers. *Someone* has to score the highest, and if that person also has the most popular blog, doesn't that say something about the relationship between blog popularity and writing ability?

The big problem with this, of course, is that the essay in question was written by me.

That said, I think a case could also be made for removing a second outlier, the essay from the fourth-most-popular blog which received a score of 1. If you remove both my essay and that one from the analysis, the correlation moves back up to a fairly respectable level (r=0.28), still a significant correlation.
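For anyone who wants to poke at outlier sensitivity themselves, a quick leave-one-out check is easy to run. Here's a sketch with hypothetical data (the real link counts and grades aren't published here):

```python
# A leave-one-out sensitivity check for Pearson's r: drop each point in
# turn and see how much the correlation moves. The data here are
# hypothetical; in the real dataset, removing the top-scoring essay
# drops r from .36 to .21.
import numpy as np
from scipy import stats

links = np.array([5, 12, 40, 7, 150, 3, 60, 25, 9, 300])  # hypothetical
grades = np.array([2, 3, 4, 2, 5, 2, 4, 3, 3, 6])         # hypothetical

r_full, _ = stats.pearsonr(links, grades)
print(f"all points: r = {r_full:.2f}")

for i in range(len(links)):
    r_i, _ = stats.pearsonr(np.delete(links, i), np.delete(grades, i))
    if abs(r_i - r_full) > 0.05:  # flag the influential points
        print(f"without point {i}: r = {r_i:.2f}")
```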