The national test of students’ progress has gone digital. A state leader is raising questions about what that means.

The release of “the nation’s report card” on April 10 will be a big deal. The scores put a spotlight on the academic performance of all 50 states and many big school districts, and inevitably lead to jockeying about what the numbers mean for education policy.

That’s why it’s also a big deal when a state leader raises questions about the National Assessment of Educational Progress — which Louisiana Superintendent John White is doing now in the lead-up to the scores’ release.

In a March 23 letter to the National Center for Education Statistics, which administers the NAEP tests, White said he was concerned that this year’s switch from paper-and-pencil exams to computer-based exams might unfairly penalize some states. He called on NCES, a branch of the U.S. Department of Education, to release more information about the issue.

“I would like to be assured,” White wrote, that “the results and trends reported at the state level reflect an evaluation of reading and math skill rather than an evaluation of technology skill.”

Peggy Carr, the acting head of NCES, told Chalkbeat that she does intend to release the information White is requesting, and that the testing group has made extensive efforts to ensure that comparisons are valid during the transition to computer-based testing.

“We did the best of the best in terms of how we executed it,” Carr said of the organization’s study of the print and digital test results. “That is what I will share with the states.”

Still, White’s letter is likely to attract attention because he is among the best-known state schools chiefs — and because his request for more information was also backed by a letter from the Council of Chief State School Officers, which represents all state education leaders.

The digital-test dip

Students tend to do worse on exams taken on a computer or tablet than on ones taken with pencil and paper. States have been finding this out on their own annual tests, including the PARCC.

For the first time in 2017, most students took the NAEP tests digitally. A small number of students continued to take the test on paper, allowing officials to adjust for differences caused by the test-taking mode. The NAEP results are even being released later this time around because that analysis took extra time.

Carr declined to discuss this year’s results, or to say whether NCES found that taking the test on a tablet affected students’ scores. But she noted that NCES did see such an effect in a 2015 pilot phase, when tablets were first used, and that it’s common in educational testing.

“Everyone finds a mode effect when they go from paper and pencil to [digital],” said Carr, using the technical phrase for how different test-taking methods affect student performance.

White’s issue is with how NCES addresses the score dip that comes with the digital tests.

In his letter, White suggests that NCES is making the same adjustment for every student. That might not make sense, he argues. Louisiana and Massachusetts students, for example, have different levels of exposure to technology. Different states and groups of students might need different adjustments.

“No Louisiana student in 4th grade or 8th grade had ever been required to take a state assessment via a computer or tablet as of the 2017 NAEP administration,” White wrote. “This fact, coupled with a variety of social indicators that may correspond with low levels of technology access or skill, may mean that computer usage or skill among Louisiana students, or students in any state, is not equivalent to computer skills in the national population.”

It won’t be clear if White’s concerns have merit until NCES releases its state-by-state analysis.

Andrew Ho, a professor at Harvard and a member of the NAEP governing board, said that such questions could be legitimate. For a state to be truly unfairly penalized, though, its students would have to respond differently from demographically similar students nationwide.

The Louisiana angle

It’s worth noting that there are political incentives for leaders to raise questions about results that don’t make their state look good.

The 2017 NAEP results have been shared with state testing officials, who are instructed not to discuss the results publicly until their wider release. White — who has highlighted gains his state had made on NAEP between 2009 and 2015 — said he couldn’t comment on Louisiana’s performance in the latest round of tests.

Even though researchers warn that it is inappropriate to judge specific policies by raw NAEP results, if White’s letter signals that Louisiana’s scores have fallen, that could deal a blow to his controversial tenure, during which he has pushed for vouchers and charter schools, the Common Core, letter grades for schools, and an overhaul of curriculum.

White said his state’s results are not what’s driving his concerns.

“I doubt that any mode effect would have radically vaulted Louisiana to the top or dropped Louisiana further below,” he said. “The issue is from a national perspective.”

Carr said she plans to respond to White’s letter in writing.