Recent reports have flagged yet another possible explanation for this year’s dramatic statewide improvement in English test scores: a drop in the number of questions students had to answer correctly in order to pass.
The New York Post reported Friday that students needed fewer raw score points to pass the exams this year. In 2015, for example, third-grade students needed 34 out of 55 points to pass the English test, while in 2016 they needed 28 out of 47 points. Leonie Haimson, executive director of Class Size Matters, called the overall changes “very fishy” on her blog.
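In percentage terms, the shift in the third-grade English cut is modest. A quick calculation on the figures cited above (the raw cut scores come from the reporting; the arithmetic is simple division):

```python
# Share of available points needed to pass third-grade English,
# using the raw cut scores cited above.
for year, needed, total in [(2015, 34, 55), (2016, 28, 47)]:
    print(f"{year}: {needed}/{total} = {needed / total:.1%}")
# 2015: 34/55 = 61.8%
# 2016: 28/47 = 59.6%
```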
But does the scoring mean the test was easier this year? Not necessarily, said both state officials and outside researchers we consulted.
A spokesperson for the state education department said that each year’s questions are, on average, slightly easier or harder than the previous year’s. To keep the tests comparable, the number of questions students must answer correctly can shift from year to year.
“This analysis is flawed, irresponsible and misleading,” said Emily DeSantis, spokesperson for the New York State Education Department, about the New York Post story. “This year’s tests were just as rigorous as those in the past and the cut scores are the same that have been used since 2013.”
Some researchers agreed that slight fluctuations in the required raw scores don’t automatically mean the test was easier to pass. Recalibration is part of a normal process the state undergoes each year to determine the number of “raw score” points a student needs to earn to be considered proficient, said Jennifer Jennings, a New York University professor and testing expert.
If the test questions are slightly harder, the number of raw score points a student needs to earn could drop — that’s part of the process for determining scores.
“That’s like gravity. That’s just something that has to happen in this world. It’s just a principle of assessing,” she said. “Just eyeballing those numbers [for third grade English], that just looks like year-to-year variation to me.”
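Here is a minimal sketch of the equating principle Jennings describes, assuming a simple Rasch-style item model. The proficiency cut and item difficulties below are invented for illustration and are not the state’s actual parameters:

```python
from math import exp

def expected_raw_score(ability, difficulties):
    """Expected points earned under a simple logistic (Rasch-style) model:
    P(correct) = 1 / (1 + exp(difficulty - ability))."""
    return sum(1 / (1 + exp(d - ability)) for d in difficulties)

just_proficient = 0.5       # fixed proficiency standard on the ability scale
easier_form = [0.0] * 47    # 47 one-point questions of average difficulty
harder_form = [0.3] * 47    # same length, but slightly harder questions

print(round(expected_raw_score(just_proficient, easier_form), 1))  # 29.3
print(round(expected_raw_score(just_proficient, harder_form), 1))  # 25.8
# The same just-proficient student earns fewer expected points on the
# harder form, so holding the standard fixed means setting the raw
# passing score lower.
```

In this toy model the passing standard never moves; only the raw score that corresponds to it does, which is the point Jennings makes.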
Researchers will be better able to gauge the difficulty of this year’s exam after the state releases a technical report, which typically comes out a year after exam results are released, said Aaron Pallas, a professor of sociology and education at Teachers College, Columbia University.
New York State Allies for Public Education, leaders of the statewide opt-out movement, put out their own analysis Friday, finding that the raw scores needed to pass dropped this year on 11 of the 12 exams. (According to that analysis, the percentage of raw score points required to pass fell even more sharply on the math tests than on the English tests, yet proficiency rates did not rise as steeply in math as they did in English.)
Whether or not this year’s test proves comparable to last year’s, advocates’ fears are grounded in a long history of questionable scoring. In 2010, the state was forced to acknowledge score inflation and make the tests harder.
Haimson said she had “15 years of skepticism” about the tests. “I don’t think we should be so innocent to allow these manipulations,” she said. “There is no reason to trust these new tests.”
Meanwhile, the researchers we spoke with said any comparison between this year’s tests and last year’s is inherently flawed. The state acknowledged Friday that the tests are not directly comparable because, among other changes, this year’s tests were shorter and untimed.
“The state said in their release that the tests are not comparable,” said Dan Koretz, a professor of education at Harvard. “That’s sort of the end of the story. They’re not comparable.”