Over at Open Left, jeffbinnc pithily summarizes all of the metrics of which educational 'reformers' are fond:
Then, to illustrate just how the focus on more and better tests is going to be raised to the levels of panacea, the CAP rolled out a new report last week that based just about everything on the notion that test scores are the be-all and the end-all of education attainment in our country.
As I noted in a Quick Hit here on Open Left, CAP's analysis of school performance "relied on the results of 2008 state reading and math assessments in fourth grade, eighth grade, and high school" along with "spending data from the 2008 school year" to calculate a Return on Educational Investment for nearly every major school district in America.
So now this new ROEI metric joins all the other test-based metrics driving school improvement efforts - including VAM to measure teacher effectiveness, AYP (or whatever its new derivative will be) to measure school effectiveness, and HST to determine an individual student's progress in learning.
Unfortunately, the test-based alphabet soup that Obama and the neoliberal ideologues will serve up to the American public on January 25 is in reality a shit sandwich.
Yummy educational goodness! In a slightly more serious vein...
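To see how reductive a metric like ROEI is, it helps to write it down. This is a deliberately simplified toy version, not CAP's actual methodology (their report adjusts spending for cost of living and student need); the district names and numbers below are invented:

```python
# Toy "return on educational investment": test-score points per dollar.
# NOT CAP's actual formula -- a simplified illustration with made-up data.

districts = {
    # name: (average assessment score index, per-pupil spending in dollars)
    "District A": (72.0, 11_000),
    "District B": (68.0, 9_000),
    "District C": (80.0, 15_000),
}

def naive_roei(score_index: float, per_pupil_spending: float) -> float:
    """Test-score points 'bought' per thousand dollars of spending."""
    return score_index / (per_pupil_spending / 1000)

for name, (score, spending) in districts.items():
    print(f"{name}: {naive_roei(score, spending):.2f} points per $1,000")
```

Everything a school does gets collapsed into one ratio of one test score to one spending figure, which is precisely the problem.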
Regular readers of this blog will know that I do think numerical analysis is critical for understanding educational outcomes and performance. But you have to understand the limitations of your methods (and, of course, use them correctly). As Rick Ayers notes, you have to make sure your measurements shed light on important information rather than obscure it (italics mine):
This occurred to me while watching the baseball playoffs and thinking about Moneyball, Michael Lewis' brilliant exploration of Billy Beane and his strategy for turning around the Oakland A's even as a small market, low-budget team. Beane's management team was decried by the baseball traditionalists as a group of pinheads and bean-counters -- basically MBAs.
But what they came up with was extraordinary. Instead of using the traditional narrow measurements of a player's value, such as batting average or runs batted in, they examined the many ways a player can add value to a team -- the slugging percentage (which accounts for multiple base hits), on-base percentage (which includes times one was walked or hit by a pitch), etc. Moving away from narrow measures, they did a more complex analysis. As assessment data was more democratized and as more people could use computers to broaden their analysis, these small teams could go around the traditional hierarchies that had dominated player hiring and promotion. And, using their more complex analytical system, the Oakland A's were able to recruit AAA players who were low cost (maybe their batting average was not stellar) but high value to the team.
But here's the take-home point. The upstart MBAs in the Billy Beane mold were broadening their metrics. They were choosing more and more data points, more information, more nuanced understanding of performance, to reward value. But our education MBAs have taken the lazy route. They should be broadening assessments to understand what students know and are able to do -- looking at qualitative evaluations, performance-, portfolio-, and project-based assessments, and learning in multiple modes that include creative and arts fields. Instead, they have narrowed the assessment -- really to only one measure, the standardized test. And that way lies disaster. Because, as anyone in business can tell you, a single metric bends all the efforts to polishing up that one measure, to the detriment of other important factors and even to the derailing of the whole enterprise.
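The sabermetric statistics Ayers mentions are simple to compute, which is part of his point: richer metrics than batting average were always within reach. These are the standard formulas; the season line in the example is invented for illustration:

```python
def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sac_flies):
    """OBP: how often a batter reaches base, crediting walks and HBP."""
    return (hits + walks + hit_by_pitch) / (at_bats + walks + hit_by_pitch + sac_flies)

def slugging_percentage(singles, doubles, triples, home_runs, at_bats):
    """SLG: total bases per at-bat, weighting extra-base hits."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * home_runs
    return total_bases / at_bats

# Hypothetical season: 150 hits (100 singles, 30 doubles, 5 triples,
# 15 home runs), 70 walks, 5 hit-by-pitch, 500 at-bats, 5 sac flies.
obp = on_base_percentage(150, 70, 5, 500, 5)
slg = slugging_percentage(100, 30, 5, 15, 500)
print(f"OBP {obp:.3f}, SLG {slg:.3f}")
```

Note that this hypothetical player's batting average is an unremarkable .300 (150/500), while the broader measures reveal considerably more value at the plate.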
We are going to see an incredible amount of faith-based 'reasoning' regarding education during the next few years, and it's going to come from progressives (as opposed to the Left) as much as it will from conservatives.
Suffer the little children....
Because, as anyone in business can tell you, a single metric bends all the efforts to polishing up that one measure, to the detriment of other important factors and even to the derailing of the whole enterprise.
I can attest to that. I teach chemistry, which is taught after biology in my county of Maryland. The students are less able to think, less able to learn, than before the state standardized test. They are definitely not dumber kids; they have just been taught to parrot back answers to particular questions. "If the question says blah-blah, the answer is 'ribosomes'."
Really, Really, REALLY basic management principle: you get what you measure. If you measure lines of code per day, you get some seriously verbose code -- and generally poor quality. Measure defects and you get very low defect rates. The software may not do much, but whatever it does, it does without errors. Etc.
If what you want is to graduate young adults prepared for the lives that they want to lead, measure that -- and then correlate it to earlier metrics that can be corrected more rapidly.
We haven't done part 1, so for everything else we're either guessing or hoping to get lucky.