Casual Fridays: Is test-prepping worth it?

Last week I created a survey that was truly humbling. The goal was to find out if time and money invested in preparing for the SAT and other standardized college admission tests is worth it. The first thing I learned from the study was that Cognitive Daily readers are incredibly smart -- much smarter than me, for example. Take a look at this graph of high-school class ranking among survey respondents:

[Graph: distribution of high-school class rank among survey respondents]

As you can see, nearly half of survey respondents were in the top 4 percent of their high school class, and over 70 percent were in the top 10 percent. Only 15 percent of respondents ranked lower than me.

Among those who gave their SAT or ACT scores (the two major college admissions tests in the U.S.), the average percentile ranking was 93, which means they scored better than 93 percent of those taking the test.

The other reason this study was humbling for me is the massive quantity of data it generated. What was I thinking? (Clearly I wasn't thinking very hard, because as I've pointed out, my readers are much smarter than I am.)

But, undaunted, I've spent all day sorting through the data, and I have found some interesting results. The primary question we wanted to address is whether test-preparation helps. But we were interested in more than just whether it raises your scores -- we wanted to know if that increase (if it exists) actually helps you get into a better college or university, and whether you in turn end up with a more satisfying career.

First things first: How does test-prepping affect your scores? Take a look at these two graphs:

[Graphs: test scores vs. time spent preparing and money spent on preparation]

There's no relationship between scoring well on the tests and either how much time was spent preparing, or how much money was spent on preparation. This doesn't necessarily mean that people don't improve through preparation, though. People who initially scored lower on the tests or in practice might be more motivated to study and spend money to get better. In fact, there is a significant positive correlation between improvement on the SAT after the first time you take the test and both how much money and time you spend preparing for the test. But the correlation is small: r = .11 for our sample of 582 test-takers.
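
(For the statistically inclined: with a sample this large, even a small correlation is reliably different from zero. Here's a minimal sketch, in Python, of the standard t-test for a Pearson correlation, plugging in only the r = .11 and n = 582 figures reported above -- the code is purely illustrative, not the analysis software I actually used.)

    import math
    from scipy import stats

    r, n = 0.11, 582  # correlation and sample size reported above

    # t statistic for testing whether a Pearson correlation differs from zero
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)
    p = 2 * stats.t.sf(abs(t), df=n - 2)  # two-tailed p-value

    print(f"t = {t:.2f}, p = {p:.4f}")      # roughly t = 2.7, p = .008
    print(f"shared variance = {r**2:.1%}")  # only about 1 percent of the variance

In other words, the relationship is statistically reliable, but time and money spent on prep account for only about 1 percent of the variance in score improvement.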

To find a larger effect, we need to take a look at how much people believe they improved:

[Graph: perceived score improvement vs. time and money spent preparing]

The blue lines show the relationship between respondents' ratings of how much their scores improved as a result of their preparation, and how much time and money they spent preparing. In both cases, the relationship between preparation and perceived improvement was significant and positive, with rs above .2.

But as prep-time and money increased, so did both stress in taking the test and pressure from parents and teachers to do well. Stress about the test was in turn correlated with lower scores, r = -.18. Not a pretty picture.

And how did all this stress and pressure pan out in the long run? Not so well. Take a look at this graph of job satisfaction compared to test scores:

[Graph: job satisfaction vs. test scores]

There's no relationship. There's also no relationship between job satisfaction and how much time or money you spent on test-prep. Nor does spending more money and time on test-prep make you any more likely to get into your first-choice college. One factor that might be confounding all this is that more recent high-school grads have been spending more time and money on test-prep, and they're also less likely to be happy with their jobs, possibly just because they haven't yet reached their career goals and are still in near-entry-level positions.
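
(If you'd like to check for that kind of age confound in your own data, the usual tool is a partial correlation: the relationship between two variables with a third held constant. The Python sketch below uses made-up numbers -- not our survey data -- just to show the mechanics and why a raw correlation can mislead.)

    import numpy as np

    def partial_corr(x, y, z):
        # Pearson correlation between x and y, controlling for z
        r_xy = np.corrcoef(x, y)[0, 1]
        r_xz = np.corrcoef(x, z)[0, 1]
        r_yz = np.corrcoef(y, z)[0, 1]
        return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

    # Made-up example: recent grads prepped more and are (so far) less satisfied
    rng = np.random.default_rng(0)
    years_out = rng.uniform(1, 30, 500)                          # years since high school
    prep_hours = 60 - 1.5 * years_out + rng.normal(0, 10, 500)   # recent grads prepped more
    satisfaction = 2 + 0.15 * years_out + rng.normal(0, 1, 500)  # satisfaction grows with career stage

    print(np.corrcoef(prep_hours, satisfaction)[0, 1])        # raw correlation: spuriously negative
    print(partial_corr(prep_hours, satisfaction, years_out))  # near zero once years-out is controlled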

So what good is test-prep? It does mean one thing: The more hours you spent on test-prep when you were in high school, the less likely you are to be unemployed right now.

[Graph: current employment status vs. hours spent on test-prep in high school]

Why? It's hard to say, but one guess is that high-schoolers who spent more time on test-prep are just harder workers, which is one thing that may help you hang onto a job during difficult economic times.

Eh. I was interested in the results, which is why I participated. However, given your enormously skewed distribution, there's not much room for improvement. I.e., if a person's score on first taking the test was at the 93rd percentile, there aren't that many points available to gain no matter how much time/money they spend on studying. It might be more useful to break out the lower portion of the distribution, where there's room for improvement.

By fizzchick (not verified) on 17 Apr 2009 #permalink

It's true that it's tough to improve from the 93rd percentile; this doesn't seem to be a simple random sample -- if it were, your respondents would have started around the 50th percentile on average. Also, what were these students doing to prepare? For the most part, the test prep industry is pretty worthless, so I'm curious to hear what they did to "prepare."

Thanks for this!

I think part of the problem with this is that people's backgrounds before the exams also vary. I missed the original survey, but I think there's one data point you could have collected that you didn't. I suspect most of the survey respondents also took the PSAT or an earlier test with minimal or no practice. Measuring the score jump is a much more stable measure (and what the testing companies usually tout). The trouble with the prep companies' claims is that most scores will go up on the second testing, but it's unclear how much is due to a bit more experience and how much is due to the duration/quality of training.

I took a test prep course, but I didn't need to learn a shred of new material. The course just made it so that I felt completely comfortable with the format and could pretty much answer SAT-style questions in my sleep. If I remember correctly, my SAT score jumped around 250 points from an already respectable PSAT score. I suspect about half of that jump could have been achieved by taking a few practice tests on my own, but part of the increase was definitely due to the prep.

Interesting analyses. Does controlling for age affect any of the relationships?

No doubt Cog Daily readers are smart, but I am skeptical that 70 percent were in the top 10% of their graduating classes. I wonder how much is due to inflation in the recall of academic history. I certainly didn't recall my rank, and I graduated in 2001. I guessed, but I'll bet my guess was inflated from my actual rank.

Significance of class rank is totally dependent on the school attended. I was 26th in a class of 104 in a public high school, but 102 of 104 went on to higher education. My SAT score was 1522 (without any prep). I was accepted at every college I applied to and went to a top-rated northeastern SLAC.

I wonder how many people were from very small schools. The top 10% of your class is much easier to achieve in a class of 20 or 200 than of 2000.

By hypatia cade (not verified) on 17 Apr 2009 #permalink

I wonder if anyone other than me had problems recalling class rank or percentile on tests. I know my GPA in high school and college, and my specific scores on the SAT and ACT, but I honestly have no clue what my class rank was or what percentile I was in on anything. You seem to rely a lot on that data, so if anyone other than me totally estimated, your data would be thrown off.

You wrote:

The first thing I learned from the study was that Cognitive Daily readers are incredibly smart -- much smarter than me, for example.

I'm sure you meant to say, "... much smarter than I," or "... much smarter than I am."

I think it does help to have prepared for the test some. But in my experience with my children, you don't have to pay a fortune. You can even just buy a book, and if you bother to go through it, you come out ahead.

One of the most interesting things I encountered was an English teacher who had a list of novels that had a lot of SAT words. The winner was "Gone with the Wind." So I bribed both my kids to read it. Maybe they even picked up a little Civil War too.

Why would it be easier to achieve the top 10 percent in a smaller school? The smaller the school, the fewer actual slots there are that fall within the top 10 percent. If a school had only 10 students, only one person gets to achieve this feat; in a school of 100, 10 people get a chance. (I think that's why it's a percentage, which makes the size of the school irrelevant.) (Now, the quality of a school, that's a different story.)

By Andy Esquivel (not verified) on 18 Apr 2009 #permalink

Dave--
If you're looking for something to study, I'd suggest the visual presentation of data. Without your commentary, I would never have understood those graphs.

The only test of this sort that I prepped for (with a book) was the GMAT. The only way I think it helped was that I was ready for questions that were in a bizarre format that I would have wasted considerable test time trying to understand. Since I have an M.S. and an MBA with distinction, I don't think my lack of intelligence is the issue.

I think restriction-of-range limits the usefulness of your study to an extreme degree.

By Tom Ballard (not verified) on 18 Apr 2009 #permalink

I think that preparation depends on the person. I've always been good at standardized tests (much better than my actual grades, etc.), so any preparation probably only boosted my score by a point or two. However, my brother had a lot of trouble with the English and Reading sections. For him, a Kaplan class did nothing to improve his math scores, which were already good, but it did bring his lower scores up by probably 5 points each.

cc: Thanks for the "advice." The graphs weren't intended to be understood without reading the accompanying text. But I'll try to do better next time, for those who can't be bothered to read along.

Your score booster: You're right -- I always get that one wrong. Sorry.

Sorry to be so negative in the comments on this post, but I thought I'd respond to the commenters claiming there's some kind of ceiling effect going on. You're wrong. There is no ceiling effect. Test prep DOES improve your scores, as I said in the post. What we don't have evidence for is the idea that it makes you better than people who don't test prep. That may well be due to a ceiling effect, but that's precisely the point: it may be that it's nearly impossible to test-prep yourself above other equally qualified applicants.

All that time and money spent test-prepping also doesn't appear to be associated with better long-term career satisfaction -- and there's no ceiling effect there; there's quite a wide distribution in job-satisfaction ratings. There's also no association between test-prepping and getting into the school of your choice.
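
That said, for anyone who wants to see how much restriction of range can matter in principle, here's a quick Python simulation sketch; the numbers are invented, not taken from our survey. If prep effort and scores were modestly correlated in the full population, but we only looked at people near the top of the score distribution (as this sample largely is), the observed correlation would shrink considerably:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Hypothetical population in which prep effort and scores correlate at about r = .3
    prep = rng.normal(0, 1, n)
    scores = 0.3 * prep + np.sqrt(1 - 0.3**2) * rng.normal(0, 1, n)

    full_r = np.corrcoef(prep, scores)[0, 1]

    # Keep only people above the 90th percentile of scores, roughly like this sample
    top = scores > np.quantile(scores, 0.90)
    restricted_r = np.corrcoef(prep[top], scores[top])[0, 1]

    print(f"full-range r = {full_r:.2f}, top-decile r = {restricted_r:.2f}")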

While the lack of correlation between the amount of money spent and exam scores can be explained relatively easily, I can hardly understand why there is no correlation between the time you spend studying and the scores you get.
For the first example (money vs. scores) I can say:
- not all tutors are worth the money they get,
- you can pay somebody to explain difficult material, but you can't buy the motivation to learn it,
- etc.
For the second (time vs. scores) I can think of:
- you can study even 20 hours per day, but if most of it is daydreaming it doesn't really count,
- you've got a lot to make up for, so it's not preparing for the exams but learning the basics,
- you can overestimate the time you spent studying.
But none of these can explain why clever people don't profit from the time devoted to studying.

Of course there are a lot of other factors, such as the respondents' backgrounds, skills, etc. (I hope the researcher took care of them). But still, the results given above are rather upsetting.

You also have to take into account differences in tests and curves, and the fact that the SAT makers themselves claim there's no real difference between test-takers whose scores are within 100 points of each other (I hope that made sense).

By Woot Wooters (not verified) on 19 Apr 2009 #permalink

This is interesting, but my personal experience is a bit different. When I went to take the GRE for the third time in my life, I began systematic test prep on my own. I decided what to study and practice, and then, under real test conditions, I took real GRE tests (using the Big Book, with the computer prep tests saved for the end). Over time, I was able to chart my progress. The more I studied and practiced, the better the running average of my scores became. The key turned out to be the ability to take actual GRE exams, in my opinion. Personally, the gain for me was significant, but it took considerable effort. What is my point? I already scored very highly on standardized tests, and I already knew the ins and outs that are the bread and butter of test prep schools. If you are already at the 90+ percentile, I believe scores can be improved, but improving them requires work much different from what commercial test prep likely offers. Thus, I'm not surprised that you did not find an effect given your sample. Standard test prep is likely to do little to aid these already good scores.

I must be my own data point, because my test score DROPPED about 110 points after taking a test prep course! And since I didn't make their guaranteed 100-point increase, they let me take the course again -- for free! The second time around I did make the 100-point increase. But I did much better on the SAT II's (no prep) than on the SAT I's (two prep courses).

[I wonder how many people were from very small schools. The top 10% of your class is much easier to achieve in a class of 20 or 200 than of 2000.]

Huh? How does this work? It's still a percentage of the class, so it should be the same. It might actually be more difficult, as some of the more competitive high schools are the smaller ones. I was in a graduating class of 150, where virtually everyone went on to higher education and at least the top 5-6% were admitted to Ivies.

I don't think people are necessarily lying or misremembering -- my guess to account for the skew is self-selection bias: only people who did pretty well on the tests wanted to take the survey.