Steep spikes and drops on standardized test scores, a pattern that has indicated cheating in Atlanta and other cities across the nation, have occurred in hundreds of school districts and charter schools across Ohio in the past seven years, a Dayton Daily News analysis found. While quick to point out that the pattern doesn't prove cheating, the DDN was equally quick to point out that a similar analysis done a few years ago by the AJC led to an investigation that infamously exposed massive cheating throughout Atlanta Public Schools. The methodology, as described by the DDN, was this:
The newspapers used a statistical model on year-to-year changes in reading and math test scores at the school level for third- through eighth-grade students during seven years, beginning in 2005.
The analysis flagged as suspicious any score change that had less than a 5 percent probability of occurring by chance based on all the other scores on that test in that state.
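The DDN's exact model isn't published, but the flagging rule it describes can be sketched roughly as follows. This is a minimal illustration, assuming a normal distribution fit to all statewide year-to-year changes for a given test; the function name and data shapes are hypothetical:

```python
from statistics import NormalDist, mean, stdev

def flag_suspicious(changes, alpha=0.05):
    """changes: {school: year-to-year score change} for one test statewide.
    Flags any change whose two-tailed probability, under a normal model
    fit to all the statewide changes, is below alpha (here 5 percent).
    A simplified sketch -- the DDN's actual model is not public."""
    values = list(changes.values())
    dist = NormalDist(mean(values), stdev(values))
    flagged = {}
    for school, delta in changes.items():
        # two-tailed p-value for this school's change under the fitted model
        p = 2 * min(dist.cdf(delta), 1 - dist.cdf(delta))
        if p < alpha:
            flagged[school] = (delta, p)
    return flagged
```

On a set of small, ordinary year-to-year changes plus one 30-point jump, only the jump would clear the 5 percent bar; the rule says nothing about *why* the jump occurred.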
Both the Ohio Department of Education and the Ohio Education Association called the methodology used by the papers suspect and questionable. And, in an apparent acknowledgment of the methodology's shakiness, the DDN declined to name any of the suspect schools (both traditional and charter) that turned up in its analysis.
I, too, have questions about the methodology. It seems to me the papers took an important first step toward learning something, but they needed to take more. Chance may be an unlikely explanation for a flagged score change, but that doesn't mean cheating is the right one; nor does it rule chance out. They needed to do additional analysis on their suspicious districts.
As anyone who has taught knows, one year's group of students may be more successful than another's, which helps explain some variation; this is why it is important to use multi-year rolling averages, not year-to-year figures, when using test scores in teacher evaluations. Some years, districts may choose to focus more on test-taking techniques to drive up scores in anticipation of a levy, which may also explain variation. There may be cheating scandals, yet if there were, you would expect the cheating to be more sustained and not as variable as the DDN piece suggests. A district that chose to cheat one year would presumably keep cheating, so you wouldn't see scores yo-yo between years. I would want to look at schools that had one huge jump in scores, then kept those higher scores in subsequent years.
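The smoothing effect of a trailing multi-year average, as suggested above, can be seen in a quick sketch (the three-year window and function name are illustrative choices, not anything from the DDN analysis):

```python
def rolling_average(scores, window=3):
    """Trailing multi-year average of yearly test scores.
    Early years, before a full window has accumulated, average
    whatever years are available so far."""
    out = []
    for i in range(len(scores)):
        # average the current year with up to window-1 prior years
        chunk = scores[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

A single-cohort spike of 15 points in an otherwise flat series shows up in the rolling series as a bump of only 5, which is the point: one unusual class no longer looks like a sudden district-wide change.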
One thing that makes the rampant cheating argument tougher to make, at least in Ohio, is that a regression analysis of Performance Index scores shows such a strong correlation between demographics and test scores that it is difficult to imagine cheating in about 75 percent of districts, because their outcomes can be predicted from their demographic makeup.
One could use that analysis to take the districts whose scores were higher than expected and examine them more closely, looking at erasure marks (as USA Today did with Washington, D.C.'s scores) and other methods to determine whether cheating is indeed as widespread as the DDN is intimating. The papers could also have run a regression as a backup to their test data, to determine whether districts' and charters' scores were in line with how they had scored previously, or whether they scored about where you would expect them to statewide. For instance, if you would predict a building to rank 2,750th based on its demographics, and it instead ranked 2,735th, you wouldn't think as much about its sudden jump in scores, because it continues to rank about where you'd expect statewide.
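One simple way to run that check is an ordinary least-squares fit of scores against a demographic measure, then ranking districts by residual. A sketch, assuming a single hypothetical predictor (say, the percentage of economically disadvantaged students; a real analysis would use several):

```python
from statistics import mean

def residuals_vs_demographics(poverty_rate, scores):
    """Fit a simple least-squares line predicting each district's score
    from one demographic measure (a hypothetical poverty rate), and
    return each district's residual: actual minus predicted score.
    Large positive residuals mark districts scoring above expectation."""
    x_bar, y_bar = mean(poverty_rate), mean(scores)
    slope = (sum((x - x_bar) * (y - y_bar)
                 for x, y in zip(poverty_rate, scores))
             / sum((x - x_bar) ** 2 for x in poverty_rate))
    intercept = y_bar - slope * x_bar
    # residual = how far each district sits above or below the trend line
    return [y - (intercept + slope * x)
            for x, y in zip(poverty_rate, scores)]
```

Districts sitting on the demographic trend line get residuals near zero; a district scoring well above the line is exactly the kind of outlier worth a closer look, such as an erasure analysis.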
I'm not saying any of these examples proves or disproves the DDN story, or that these are the only questions that need to be answered. What I am saying is that answers to these questions would have made it a much stronger story.
If you're going to write a story that suggests massive, statewide (and in the AJC's case, national) cheating on standardized tests, you'd better be prepared to name the offenders and feel solid enough in your methodology to refute the state's education agency and largest teachers union, both of which knocked the papers' methods. If you have to spend a large chunk of your story having competing experts defend and knock your statistical analysis, you need to redo the analysis. That said, it showed integrity for the paper to include those critical comments in the story.
As a former reporter, I can say these issues would invariably pop up before big stories ran. Sometimes that means delaying a story for a day or two, or in a few cases, never running it at all. As a journalist, you cannot, as a general rule, spend any time in your story defending your story. If you have to, it means you don't have it nailed down yet; it needs more time in the oven.
However, the part of the DDN story that's not flawed (and didn't require a single bit of statistical analysis) is this: Ohio (and, I would assume, other states) pays very little attention to cheating, and that's a problem. Ohio only requires test vendors to spend $17,000 of a $39 million contract (less than 0.05 percent) on it. This matters more as greater and greater emphasis is placed on testing. Under the state's Model Teacher Evaluation System, for example, 50 percent of a teacher's evaluation will be based on test scores.
As the stakes go higher, the potential for mischief grows as well. Is this to say that with higher-stakes testing, teachers and administrators will cheat more? Not necessarily. But shouldn't the state pay more attention now that tests are assuming not just a position of parochial pride, but are determining where money goes? Remember that in addition to the new evaluation system, the state provided a bonus for highly rated schools in this year's budget and allowed charters to open in any district that scored in the bottom 5 percent of school districts on the Performance Index, potentially removing additional millions in state dollars from districts that scored that low and tend to be more dependent on state dollars.
The DDN and AJC stories raise a very important question: as testing determines more and more in education funding, shouldn't those overseeing tax dollars be more vigilant about potential problems, especially in light of the huge cheating scandals that have happened around the country?
To that question, the answer is obvious.
Whether there's rampant cheating in Ohio and across the country remains to be seen. The DDN and AJC stories, unfortunately, raise more questions than they answer. It will be interesting to see whether they lead to greater vigilance and analysis of what's going on in today's world of high-stakes testing.