We received an astonishing number of responses to last week’s Casual Fridays study, which claimed to be able to identify what makes a good writer in just a few minutes.
Of course, I wasn’t very confident that a brief survey could actually identify the factors that make a good writer. But I did have a hunch that certain traits would be more likely to be associated with good writing.
Was there a trick to the study?
Some respondents had a hunch that writing wasn’t the only thing we were interested in. You were right: we were also studying a completely unrelated phenomenon. More on that later.
But we did want to know about your writing as well, so let’s start with that. The study asked a few questions about writing ability: how much writing you do for work/study, how easily writing comes to you, whether you’ve been published, and so on. Then there was a surprise writing test: 3 minutes to write as much as you can on any topic, to be judged for coherence but not content. Finally, a few more questions.
This week’s study asked more of our readers than we usually do, so we expected that we wouldn’t get as many responses as usual. We were wrong about that: over 1,400 responded to the survey, and over 800 wrote an essay response. The average response length was 133 words — quite impressive for a three-minute time limit!
Many of the essays were skeptical that any human would actually read them, but I read every single one. I wanted to get a rough sense of the quality of the essays, so I assigned each a “grade.” To get an A, you had to be coherent for the entire essay, and not switch topics. Just writing complete sentences and only switching topics once or twice earned a B. A semi-random string of sentences earned a C. Incoherent drivel got a D, or in rare cases, an F. These were converted to a 4-point grade scale (where A=4 and F=0). This graph shows the distribution of grades:
As you can see, B was by far the most common grade, with very few Ds and Fs. There were some great little stories, including several I wish the writers had had time to finish. Lots about babies and cats. But did the questions we asked shed any light on what makes a good writer?
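The letter-to-number conversion described above (A=4 through F=0) is the standard GPA mapping. As a minimal sketch, here is how the grades might be tallied; the grade values come from the post, but the sample list of grades below is purely hypothetical:

```python
# 4-point scale from the post: A=4, B=3, C=2, D=1, F=0
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def mean_grade(letters):
    """Average a list of letter grades on the 4-point scale."""
    return sum(GRADE_POINTS[g] for g in letters) / len(letters)

# Hypothetical grades, not the study's actual data
print(mean_grade(["A", "B", "B", "C"]))  # 3.0
```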
For the most part, the results confirmed what you might expect. People who said they were good writers were more likely to write better essays. People who said writing came easily to them tended to write more. People who wrote more for work or study also wrote more in our test:
We asked respondents to estimate how much serious writing they did each week, whether for work or for study. This graph plots the number of pages they said they wrote against what they actually wrote in our test. As you can see, people who write more for work/study also tended to write more here, and the correlation was significant.
If English wasn’t your native language, if you didn’t keep a blog, or if you weren’t participating in NaNoWriMo, you were likely to write less and get a lower grade. People who had read a novel more recently tended to write more (although there was no correlation between recent reading and the grade received). If you typed faster, you were likely to write more and get a better grade.
These, of course, are all just correlations. We can’t say whether any of these things cause you to be a better writer, or whether the reverse is true. You could, for example, have been motivated to learn to type by your interest in and aptitude for writing.
So, what’s the catch?
As I mentioned, there was one additional twist to this study: a genuine experiment. It was motivated by the fact that my family participated in the Arbitron radio ratings this past week. It’s quite a bit of work: every member of the family has to record every single radio station they listen to for an entire week. So how does Arbitron get people to do it? They use several tricks. They send a couple dollars with each family member’s diary. They call several times to make sure you received their materials. But one thing I noticed is that at the end of every call, the interviewer was always sure to ask, “Can I count on you to return your surveys?” I wondered if that language could be used to motivate people to write.
So in our study, we divided respondents into three groups. Before they started their essays, everyone received the same instructions:
The next part of the study is a brief writing test. It will require you to be totally focused on your writing, but it only takes three minutes.
But one group was specifically asked “Can we count on you to devote your full attention to this section?”, one was asked “will you do it?”, and one group was asked no question at all, and instead proceeded directly to the essay test. Did people write more based on the question they were asked? Here are the results:
Those who were asked either question were significantly more likely to actually complete the essay, and wrote significantly longer essays, than those who weren’t asked at all. However, there was no significant difference between the two question wordings: while more people completed the essay after being asked “Can we count on you” than after “Will you do it,” that difference wasn’t statistically significant. What about writing quality? Here are those results:
This time, there was no significant difference in grades among the three groups. That also means there was no speed-accuracy trade-off: even though those who were asked whether they would respond wrote more, their quality was no worse than that of those who weren’t asked. Even though writing is a difficult, even painful activity for many people, simply asking them whether they will do it motivates them to write more.
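The three-condition design described above can be sketched in a few lines. The condition labels come from the post; the assignment mechanism (uniform random choice per respondent, seeded for reproducibility) is an assumption for illustration:

```python
import random

# Condition labels from the post; the assignment method is assumed.
CONDITIONS = [
    'Asked: "Can we count on you to devote your full attention?"',
    'Asked: "Will you do it?"',
    "No question (straight to the essay test)",
]

def assign(respondent_ids, seed=None):
    """Randomly assign each respondent to one of the three conditions."""
    rng = random.Random(seed)
    return {rid: rng.choice(CONDITIONS) for rid in respondent_ids}

groups = assign(range(6), seed=1)
```

Random assignment is what makes this part a genuine experiment: any difference in essay length between groups can then be attributed to the question asked rather than to who chose to respond.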
I have to say, some of the writing produced for this test was very entertaining. Unfortunately, a lot of it is personal and there may be copyright issues, so I’m not going to share any of it with you. This was a fascinating study, and well worth the time it took me to grade.
(Just a reminder: All Casual Fridays studies are non-scientific. This doesn’t mean we can’t use scientific principles to assess what’s going on, but we can’t make general claims based on the results.)