Oversight of Regents scoring has serious flaws, state audit finds

The New York State Education Department (NYSED) is failing to ensure that Regents tests are properly scored, according to an audit published today by the state comptroller’s office.

The exams are given to high school students, who must pass five exams in different subject areas to receive a Regents diploma. Teachers normally administer and score the tests under the supervision of each school’s principal, and the school district is responsible for reporting scores to the state.

The audit focused on the review process the state uses to ensure that scoring is accurate and consistent. In these reviews, a group of teachers and NYSED officials re-scores a random selection of exams and compares the new scores with the original ones to judge accuracy. The review team then makes recommendations to the state and to schools about how to improve the scoring process.

In the most recent review, completed in 2005, the scores awarded by schools were routinely higher than the scores given by the reviewers, and reviewers reported that school scorers frequently assigned full credit to student answers that were “vague, incomplete, inaccurate or insufficiently detailed.”

But auditors found little to suggest that the state followed up to improve the process, the report says.

“For example, we found no evidence actions were taken to implement the Review team’s recommendations to improve scoring training and enhance quality control during the scoring process. We also found no evidence actions were taken to bring about improvements at particular schools,” the auditors write.

Moreover, auditors found that when school districts did not submit their scored exams for review, NYSED regularly failed to follow up with the districts to obtain and review the tests.

“We recommend follow up take place because there is considerable risk that the failure to submit scored exams may be a willful attempt to avoid scrutiny of scoring accuracy,” the report states.

The audit also reported flaws in the state’s investigations of complaints about Regents exam scoring practices. Of 13 complaints logged in recent years, auditors found documentation that an investigation had occurred in only one case. They did, however, find evidence of investigations into complaints that never appeared in the state’s log, raising questions about the education department’s record-keeping.

Auditors reviewed the state’s oversight process between 2006 and 2009, both in the department and in selected school districts, though the report does not specify which districts. (I’ve asked the state comptroller’s office for this information and have not heard back; I’ll update when I find out.)

The audit arrives at a time when the reliability of state exam results has come under increasingly intense scrutiny. Most of the attention has been directed toward the third through eighth grade tests, which critics say have become increasingly easy over time and subject to score inflation. But the Regents exams have not escaped criticism.

The state education department has “played with the scoring in a way that has played with the public trust,” said education historian Diane Ravitch. “I think a shadow has been cast over the Regents because they’re not transparent.”

The audit made 12 recommendations for how the state should improve its oversight of exam scoring, including requiring schools to implement a plan to correct their practices if scoring is found to be unreliable, and creating a formal way to identify schools at high risk for scoring problems.

State education commissioner David Steiner said in a statement today that the department is reviewing its entire accountability system. “We are implementing the recommendations in the [Office of the State Comptroller] audit and have moved to put additional improvements in place for the oversight of the scoring process for Regents examinations,” Steiner said, though he did not specify what those additional measures might be.

In a September letter to auditors, interim education commissioner Carole Huxley agreed with 11 of the proposals and said that the education department would implement them.

The one recommendation that state officials contested would require schools with known scoring irregularities to report changes made to exam scores that reviewers found to be in error. Huxley argued that because students have already graduated by the time scores are reviewed, it is not possible to go back and revise them.

Ravitch said that while it’s impossible to recall the diplomas of individual students, it is important to know which scores may have been compromised for policy and research purposes.

“Basically the state needs to say any research study based on these scores must acknowledge that the scores are compromised,” she said. “You cannot reach any policy decision based on these scores.”

The reliability of state test scores may come under even more scrutiny because of the federal government’s emphasis on building data tracking systems and using them to evaluate school and teacher performance over time. The federal Department of Education is tying state initiatives in these areas to Race to the Top grant money.

Dee Alpert, publisher of the advocacy site Special Education Muckraker, said that problems with state test data should hurt the state’s chances at receiving the federal grant money. “Why on earth is anyone going to give New York State money to set up big sophisticated databases when the numbers in it are complete fiction?” she said.

Steiner and Board of Regents Chancellor Merryl Tisch have said that one of their main priorities is to improve the validity and reliability of state exams.

Here’s the audit report from the state comptroller’s office. The education department’s response to the audit begins on page 21.