Saturday, September 13, 2014

Report Card: Charter Value Added Grades Not Much Better

One of the few areas where Charter Schools have been able to make an argument for their success is value-added measures (VAM). VAM is, essentially, a growth measure: it compares actual student academic growth with expected student growth. I've always had questions about how the expected growth is calculated, and Ohio's VAM formula is notoriously proprietary (so it can't be reproduced). But that's the idea.
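
Since the state's actual model can't be reproduced, here is a toy sketch of the underlying idea, with a simple regression standing in for the proprietary expectation and every number invented for illustration:

```python
# A toy sketch of the value-added idea -- emphatically NOT Ohio's
# proprietary formula. Expected scores come from a least-squares fit on a
# statewide reference sample; a school's "value added" is its students'
# average actual-minus-expected score. All numbers are invented.

def fit_expectation(prior, actual):
    """Fit a least-squares line predicting this year's score from last year's."""
    n = len(prior)
    mx, my = sum(prior) / n, sum(actual) / n
    sxx = sum((x - mx) ** 2 for x in prior)
    sxy = sum((x - mx) * (y - my) for x, y in zip(prior, actual))
    slope = sxy / sxx
    return slope, my - slope * mx

def school_value_added(slope, intercept, prior, actual):
    """Mean of (actual - expected) across one school's students."""
    gaps = [y - (slope * x + intercept) for x, y in zip(prior, actual)]
    return sum(gaps) / len(gaps)

# A statewide reference sample sets the expectation...
slope, intercept = fit_expectation([600, 650, 700, 750, 800],
                                   [620, 660, 705, 745, 790])

# ...and one school's students are measured against it.
print(school_value_added(slope, intercept, [640, 700, 720],
                         [665, 715, 735]))  # ~ +8.3 points above expectation
```

A positive result means a school's students beat the expectation set by the statewide fit; a negative one means they fell short. The state's real model is far more elaborate, which is exactly why the questions about how the expectation gets calculated matter.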

I'm more intrigued by VAM at the macro, district level, because its calculation breaks down as you drill into classrooms and, especially, individual teachers. For example, there have been many incidents around the country where a VAM determined a teacher was the worst in the district one year, then the best the next -- and vice versa. But at the district level, VAM holds more promise, is less swayed by demographics than raw test scores, and is philosophically sounder, though it still needs a lot of work.

Anyway, I broke down Ohio's report card, which was released yesterday, on the VAM categories the state measures, then compared Charter and district performance. Specifically, the state issues VAM grades for overall value added, the lowest 20% (how well the lowest performers grow), and students with disabilities. The state also grades gifted VAM, but only a handful of Charters have enough gifted kids to qualify, so that comparison is pretty meaningless.
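
For anyone who wants to replicate the breakdown, the mechanics are simple; here's a minimal sketch assuming a hypothetical CSV export of building grades (the file name and column names are my stand-ins, not ODE's actual fields):

```python
# Sketch of the Charter-vs-district grade comparison, assuming a hypothetical
# CSV with one row per school and columns "sector" ("Charter" or "District")
# and "overall_va_grade" (A-F). File and column names are illustrative.
import csv
from collections import Counter

def grade_distribution(path, sector):
    """Percentage of schools in a sector earning each value-added grade."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["sector"] == sector:
                counts[row["overall_va_grade"].strip()] += 1
    total = sum(counts.values())
    return {g: round(100 * counts[g] / total, 1) for g in "ABCDF"}

# grade_distribution("report_card_2014.csv", "Charter")
# grade_distribution("report_card_2014.csv", "District")
```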

The results are below. As you can see, Charters do a little bit better than their raw scores would indicate. But it's still nothing to write home about.

[Table: Percentages of Charters and districts earning each grade, A through F, on overall value added, lowest 20% value added, and students-with-disabilities value added.]

Districts still earn higher percentages of As and Bs in every value-added category, while Charters earn higher percentages of Ds and Fs. On overall value added, for example, a higher percentage of districts earn As alone than Charters earn As and Bs combined, even though that is the category where Charters post their highest percentage of As.

And even when Charters do earn a higher percentage of As, as they do in the lowest 20% category, districts so outperform them on Bs that a higher percentage of districts still earn As and Bs combined. Meanwhile, Charters earn higher percentages of Ds and Fs even in this category.

Remember that nearly every Ohio school district lost money and children to Charter Schools last year; only Ohio's tiny Lake Erie island districts did not. Even Charters in the urban core receive a significant number of kids from outside that core. Perhaps the most famous example is Columbus Preparatory Academy, run by the for-profit Mosaica Education, Inc. The school posts one of the highest performance index scores in the state, yet about half its kids don't come from Columbus City Schools. So is it fair to judge Columbus based on this school's performance? Yet it always is.

Overall, we know that about half of the kids in Ohio Charter Schools do not come from the state's urban core districts -- the original site of Ohio's Charter School experiment. We spent $914 million on Charters last school year, and about the only positive thing we can say is that in one value-added category they earned 1% more As than districts did. What we can also say is that they fail at significantly higher rates in all of these categories than the districts from which they receive their children and money.

And on perhaps the most bottom-line measure there is -- graduation rates -- the difference is stunning:

[Table: Charter versus district four-year graduation rates.]

What these value-added data demonstrate to me is this: Ohio's Charter Schools perform marginally better on value-added measures than they do on more traditional ones. But the question I ask is this: does this marginal improvement justify kids in Columbus losing $1,063 every year because the Charter School deduction is so huge? Or kids in Boardman losing $1 million? Or kids in Brooklyn losing more than 60% of their state revenue?

Yes, there is a small handful of Charter Schools doing the innovative things Charter Schools were originally intended to do. But these tiny pockets of success could be achieved with far less damage to the remaining 90% of kids in Ohio's school districts, who, overall, attend higher-performing options. Instead, those kids lose about 7% of their state revenue because Ohio's General Assembly pays Charter Schools more than double the state money it pays the child's district of residence, and it has cared little about these schools' performance.
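
To see how the deduction arithmetic punishes the kids who stay, here's a back-of-the-envelope sketch with entirely hypothetical numbers:

```python
# Back-of-the-envelope sketch of how the Charter deduction hits the kids who
# remain in the district. All dollar figures are hypothetical, chosen only to
# show the mechanism: when the per-pupil deduction exceeds per-pupil state
# aid, every non-Charter student's share shrinks.

def remaining_per_pupil(resident_pupils, per_pupil_aid,
                        charter_pupils, charter_deduction):
    """State aid left per pupil after the Charter deduction comes off the top."""
    total_aid = resident_pupils * per_pupil_aid
    deducted = charter_pupils * charter_deduction
    return (total_aid - deducted) / (resident_pupils - charter_pupils)

# Hypothetical district: 10,000 resident pupils, $3,000/pupil in state aid,
# 500 of those pupils in Charters, and a $6,000/pupil deduction (double the aid).
print(remaining_per_pupil(10000, 3000, 500, 6000))  # ~$2,842, a ~5% loss
```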

Over the next few weeks, Charter School proponents will probably be able to drill down into the VAM data enough to show that Charters in some area outperform districts on a measure or two. But should it be this hard to show Charter School success after $7.4 billion spent since 1998? It's not that hard in other states, but it is here in Ohio. The Stanford CREDO study found that Ohio is one of only four states whose Charters' performance slipped between 2009 and 2013, and that the average Ohio Charter student loses a full marking period of learning in math, and a third of one in reading, to their public school counterparts.

Given the amount the state has poured into Charter Schools, which is now more than the state spends in a year on kids in school districts through the state's funding formula, you would think the evidence would be overwhelming that Charter Schools are superior. But instead, we have to spend weeks with algorithms and sophisticated statistical tests to find some permutation that shows Charters may be slightly better at one tiny thing the state measures. On all the big, obvious measures, Ohio's Charter Schools just don't cut it overall.

Charter School quality, not quantity, must fuel this debate from now on. Whether a school succeeds should be paramount, not whether it simply exists. We're beyond the point of asking whether we should have school choice. Fifteen years and $7.4 billion in, it's safe to say that choice is here to stay. Now we need to ask two very simple questions: "What should those choice options look like?" and, more importantly, "Should they be any good?"