But let's start with the positive. There are several groups of categories that are based on value-added data. The state's Value-Added Measure (VAM) is notoriously opaque because it remains a proprietary calculation. So it's hard to figure out the metrics that go into the calculation. However, value-added measures hold much more hope for true analysis of achievement than straight test scores. On Performance Index scores, for example, I can predict what three out of four districts would get. Not as easy to do with VAM.
VAM looks at what the expected learning growth would be for a particular child or cohort, then sees what the growth actually was. If the child exceeds expectations, the VAM score is good. If they fall short, then it's bad. That's really oversimplified, but in a nutshell that's how VAM works.
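The expected-versus-actual idea can be sketched in a few lines of code. The state's real VAM is proprietary and far more sophisticated; here "expected" is just a simple least-squares prediction from the prior-year score, fit across a cohort -- purely illustrative, with made-up numbers:

```python
# Toy illustration of a value-added calculation. The state's actual VAM is
# proprietary; here "expected growth" is a deliberately simple linear
# prediction from the prior-year score. Illustrative numbers only.

def fit_expected(prior_scores, current_scores):
    """Least-squares line predicting this year's score from last year's."""
    n = len(prior_scores)
    mean_x = sum(prior_scores) / n
    mean_y = sum(current_scores) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(prior_scores, current_scores))
    var = sum((x - mean_x) ** 2 for x in prior_scores)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return lambda x: intercept + slope * x

def value_added(prior, current, expected):
    """Positive = the child beat expectations; negative = fell short."""
    return current - expected(prior)

# Hypothetical cohort used to establish the expected trajectory.
prior = [40, 50, 60, 70, 80]
actual = [45, 54, 63, 72, 81]
expected = fit_expected(prior, actual)

# A child who started at 50 was expected to score 54; scoring 60 is +6.
print(round(value_added(50, 60, expected), 1))  # → 6.0
```

The sign of that residual, aggregated across students, is essentially what the letter grade is trying to summarize.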
What we've seen around the state on the new report card is that VAM scores were really low in some traditionally high-performing districts -- Hudson, for example, got a D on the VAM score for its bottom performers. Meanwhile, traditionally poorer-performing districts, like Barberton, scored As in that category. Same with the measures for Gifted, Special Education and the other VAM categories.
What I'm hoping is that this new look at growth, rather than score, will help traditionally ridiculed districts start to demonstrate what is almost certainly true -- many are performing miracles in incredibly difficult and trying situations.
However, we should be careful not to jump too quickly down the throats of wealthy, suburban districts that aren't seeing great growth among the most vulnerable students attending their schools. That's because the calculation is based on an examination of the bottom 20% of scores, which in Hudson are probably much higher, on the whole, than in Barberton. So while Barberton receives high marks (and rightly so) for growth, perhaps Hudson's growth is equally impressive, even though it may not be as great a raw improvement.
Is growing a child from 40 to 60 more impressive than growing one from 70 to 80? I don't know. But that is a nuance that the state should try to account for in future calculations. And frankly, using sophisticated statistical analysis, it's possible to do that.
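One way to account for that nuance -- my own illustration, not anything the state actually computes -- is to measure growth as a fraction of the room a student had left to grow. On that view, the two gains above come out identical:

```python
# Illustrative only, not the state's methodology: express a score gain as a
# fraction of the headroom remaining, i.e. gain / (ceiling - start).

def normalized_growth(start, end, ceiling=100):
    """Share of remaining headroom the student actually closed."""
    return (end - start) / (ceiling - start)

print(round(normalized_growth(40, 60), 3))  # 20 points of a possible 60 → 0.333
print(round(normalized_growth(70, 80), 3))  # 10 points of a possible 30 → 0.333
```

By this (admittedly crude) measure, 40-to-60 and 70-to-80 represent the same share of the distance to a perfect score -- exactly the kind of adjustment a more sophisticated statistical model could build in.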
Now for the problem, and it's a problem nearly all metrics have: they are based on standardized tests taken in a few courses in a few grades. And they give nowhere near a complete picture of a child's educational experience. We need to be measuring more meaningful things, such as critical thinking, creativity, love of learning, etc. But all we do is a few core subjects.
But at least with VAM, we blunt somewhat the utter dependence test scores have traditionally had on a child's home life.
The other amazing thing about the new report card is that even though the worst 25% of Charter Schools are no longer included with the other Charter Schools, Charters' overall performance remains far worse, on the whole, than traditional public schools. I blogged about this today over at Innovation Ohio.
In short, here's a graphic to help illustrate the point:
The bottom line is this: far too many Charters are doing worse, costing more and draining far too many resources from the traditional public schools. If 60% of your grades are D and F, isn't it time to completely re-think what you're doing? Just saying.