broken promise?

Common Core tests were supposed to usher in a new era of comparing America’s schools. What happened?

PHOTO: Pete Souza / White House
President Barack Obama and former Secretary of Education Arne Duncan in 2010.

How well does an elementary school in Maryland stack up to one in New Jersey? Do California’s eighth graders make faster academic gains than their peers in Connecticut?

In 2010, then-Secretary of Education Arne Duncan made the case for common state tests that would allow parents and educators to find out — and predicted that the comparisons would lead to dramatic policy changes.

“For the first time, it will be possible for parents and school leaders to assess and compare in detail how students in their state are doing compared to students in other states,” Duncan said. “That transparency, and the honest dialogue it will create, will drive school reform to a whole new level.” It was a heady moment: Most states had signed on to at least one of the two cross-state testing groups, PARCC and Smarter Balanced.

Though their numbers have since dwindled substantially, the two groups still count over 20 members between them. But seven years later, it remains difficult to make detailed comparisons across states, as a potent mix of technical challenges, privacy concerns, and political calculations has kept the data relatively siloed. And there’s little evidence that the common tests have pushed states to compare notes or change course.

“This is one unkept promise [of] the common assessments,” said Mike Petrilli, president of the Fordham Institute, a conservative think tank that has backed the Common Core standards.

“I’ve been surprised that there haven’t been more attempts to compare PARCC and Smarter Balanced states,” said Chad Aldeman of Bellwether Education Partners.

What comparisons are available? PARCC publishes a PDF document with scores from different states, based on publicly available information. “We have more states than ever administering tests that will allow for comparability across states,” said Arthur Vanderveen, the CEO of New Meridian, the nonprofit that now manages PARCC. “That data is all public and available. I think the vision really has been realized.”

Smarter Balanced does not publish any data comparing states, though those scores could be collected from each participating state’s website.

The presentation of the data stands in contrast to the National Assessment of Educational Progress, a test taken by a sample of students nationwide. NAEP has an interactive site that allows users to compare state data. No such dashboards exist for Smarter Balanced or PARCC, though both tests could offer more granular comparisons of schools and students.

Tony Alpert, the head of Smarter Balanced, says a centralized website would be difficult to create and potentially confusing, since states report their results in slightly different ways.

“The notion of comparable is really complicated,” he said. Nitty-gritty issues like when a test is administered during the school year, or whether a state allows students who are learning English to use translation glossaries on the math exam, can make what seems like a black and white question — are scores comparable? — more gray, he said.

“Early on our states directed us not to provide a public website of the nature you describe, and [decided that] each state would be responsible for producing their results,” said Alpert.

Neither testing group publishes any growth scores across states — that is, how much students in one state are improving relative to students who took the test elsewhere. Many experts say growth scores are a better gauge of school quality, since they are less closely linked to student demographics. (A number of the states in both consortia do calculate growth, but only within their state.)

“I’m not sure why we would do that,” Alpert of Smarter Balanced said. States “haven’t requested that we create a common growth model across all states — and our work is directed by our members.”

That gets at a larger issue of who controls this data. For privacy reasons, student scores are the property not of the consortia but of the individual states. PARCC and Smarter Balanced are also run by the participating states, which means there may be resistance to comparisons — especially ones that might be unflattering.

“The consortium doesn’t want to be in the business of ranking its members,” said Morgan Polikoff, a professor at the University of Southern California who has studied the PARCC and Smarter Balanced tests. “Except for the ones that are doing well, [states] don’t have any political incentive to want to release the results.”

As for PARCC, a testing expert who works directly with the consortium said PARCC has made it possible to compare growth across states — the results just haven’t been released.

“Those [growth scores] have been calculated, but it’s very surprising to me that they’re not interested in making them public,” said Scott Marion, the executive director of the Center for Assessment. This information would allow for comparisons of not just student proficiency across states, but how much students improved, on average, from one state to the next.

Vanderveen confirmed that states have the information needed to calculate growth across states.

But it’s unclear if any have done so or published the scores.

Chalkbeat asked all PARCC states. Colorado, Illinois and Maryland responded that they do not have such data; other states have not yet responded to public records requests.

Vanderveen said that states are more interested in whether students are meeting an absolute bar for performance than in making comparisons to other states. “A relative measure against how other students are performing in other states — and clearly states have decided — that is of less value,” he said.

The cross-state data could be a gold mine for researchers, who are often limited to single states where officials are most forthcoming with data. But both Polikoff and Andrew Ho, a professor at Harvard and testing expert, say they have seen little research that taps into the testing data across states, perhaps because getting state-by-state permission remains difficult.

Challenges in the ability to make comparisons across states and districts led Ho and Stanford researcher Sean Reardon to create their own solution: an entirely separate database for comparing test scores, including growth, across districts in all 50 states. But it’s still not as detailed as the consortia exams.

“One of the promises of the Common Core data was that you might be able to do student-level [growth] models for schools across different states and our data cannot do that,” he said.

new rules

Now that TNReady scores will count less for students, will they even try?

PHOTO: Nicholas Garcia

In the face of a statewide testing debacle, the Tennessee legislature’s hasty edict this week to discount test results has mollified some teachers and parents. But it has also raised more questions about the role of test scores and further eroded the motivation of students, who must labor for about two more weeks on the much-maligned TNReady test.

Thursday’s sweeping measure to allow districts to ignore test results when grading students and to prohibit the use of test scores when determining teacher compensation has left educators and students shrugging their shoulders.

“I’ve gone from ‘oh well, tests are just a part of life’ to ‘this is an egregious waste of time and resources and does not respect the developmental needs of our children,’” said Shelby County parent Tracy O’Connor. For her four children, the testing chaos has “given them the idea that their school system is not particularly competent and the whole thing is a big joke.”

Her son, Alex O’Connor, was even more succinct. “We spend $30 million on tests that don’t work, but we can’t get new textbooks every year?” said the 10th-grader at Central High School. “What’s up with that? I’m sure half of us here could design a better test. It’s like buying a used car for the price of a Lamborghini.”

The legislature’s decision created a new challenge for Tennessee’s Department of Education, which planned to use 2018 TNReady testing data to rate and identify the lowest-performing schools, as required by the federal government. Now, with the test’s reliability under question, state officials say they are determining “additional guidance” to provide districts on how the state will comply with the U.S. Department of Education.

Student test results still will be used to generate a score for each teacher in the Tennessee Value-Added Assessment System, known as TVAAS. Scores will count for 20 percent of teachers’ evaluations, though districts now cannot use the scores for any decisions related to hiring, firing, or compensating teachers.

For students, local school boards will determine how much TNReady scores will count toward final grades — but only up to 15 percent. Several school districts have already expressed serious reservations about the testing data and likely won’t use them in students’ grades at all. And in previous years, the results didn’t come back in time for districts to incorporate them anyway.

In sum, asked Memphis sophomore Lou Davis, “Why are we doing this anymore when we know it won’t count?”

About 650,000 students are supposed to take TNReady this year, with 300,000 of them testing online, according to the state. Each student takes multiple tests. As of Friday, more than 500,000 online test sessions had been completed.

Even as testing continues, some education leaders worry the exam’s credibility is likely to sink even further, because students might not try, and parents and teachers may not encourage much effort.

“In the immediate term, there’s concern about how seriously people will take the test if they know it’s not going to count,” said Gini Pupo-Walker, head of the Tennessee Educational Equity Coalition and a member of the state’s testing task force. “Will students continue to take the test? Will kids show up? Will parents send their kids to school?” she asked. “Now, there’s the whole question of validity.”

Sara Gast, spokeswoman for the Department of Education, said while the new legislation provides more flexibility for districts in how they use TNReady results, it doesn’t mean that the results don’t matter.

“The results always matter. They provide key feedback on how students are growing and what they are learning, and they provide a big-picture check on how well they are mastering our state academic expectations,” Gast said. “It serves as accountability for the millions of taxpayer dollars that are invested into public education each year.”

Jessica Fogarty, a Tullahoma school board member and parent, says she thinks this year’s testing issues could lead to more parents telling their kids to refuse state tests in the future.

A proponent of opting out of state tests, Fogarty said, “We need to understand that we can choose what our children do or do not suffer through. I hope this debacle showed parents what a waste of time this is — students would gain more through reading a book.”

Because Tennessee has no official opt-out policy, students wanting to opt out must “refuse the test” when their teacher hands it to them.

Jessica Proseus, a parent of a student at Bartlett High School, said her daughter has opted out of state testing in the past, but started taking the exams this year because she believed it could affect her final grades.

“With college looming in a couple years, she couldn’t afford to get zeroes on her report cards,” Proseus said. But with the test debacle, her daughter might change her mind and just skip the remaining two weeks of testing.

“I even took the online practice TNReady a few years ago and it was terribly confusing to navigate,” Proseus said. “The testing in Tennessee is not transparent — it is almost like it is set up to trick and fail children — and that’s very cruel for a young child to deal with.”

Chalkbeat explains

Four reasons Tennessee likely won’t go back to paper testing

As another wave of problems with online testing plagues Tennessee schools, one of the solutions proposed by state legislators — go back to paper exams — is a stretch for a state that has invested millions in electronic exams.

In short, reverting to pencil-and-paper tests would be akin to ordering iPhone users to go back to flip phones. It almost certainly won’t happen.

Two Memphis-area state lawmakers want to ban the online version of TNReady starting next school year until the state comptroller determines its problems are “fully and completely fixed.” And other lawmakers suggest districts should be able to choose between paper and electronic testing.

(Other amendments that would ensure this year’s test results wouldn’t count against teachers, students, or schools passed Thursday.)

The list of problems has grown since the first day of testing Monday, affecting about two dozen districts, including the four largest ones in Tennessee. The meltdown follows the monumental online failure in 2016, when a server crash prompted Education Commissioner Candice McQueen to cancel most state testing that year.

Here are four reasons why Tennessee is unlikely to go back to paper testing despite the current overwhelming frustrations:

Superintendents think they’ve gone too far to turn back now. Maryville Director of Schools Mike Winstead cautioned against rash decisions in the heat of the moment.

“When things like this happen, it’s easy to overreact,” he told Chalkbeat. “But we’ve come too far. We know that online testing is the future. If we turn back, it will take a long time to get back to where we were.”

And school systems and counties have poured millions into infrastructure and devices, said Dale Lynch, the executive director of the Tennessee Organization of School Superintendents.

“We don’t want to back up. We want to get it right, though,” he said.

Paper is more time consuming. With online testing, McQueen said Wednesday, “we can get test materials [and scores] back to folks much quicker.”

Preparing paper tests requires hours of sorting and labeling exams. And if the materials arrive late, like they did for several districts this month because of severe weather at Questar’s printing center in the Northeast, the time crunch is especially stressful.

Granted, a top-notch online system that protects against cheating and hacking could be more expensive than a paper version, said Wayne Camara, the research chair at ACT who has long overseen test security.

“The issue of cost is relative,” he said. Multiple versions of computer tests are necessary to help safeguard against cheating, especially via social media.

“If you’re having to produce 10 or 15 forms of a computer test, most likely it’s not cheaper.”

If Tennessee switches back to paper testing, it will be one of few states nationwide. A recent analysis by the Johns Hopkins School of Education listed 11 states that were still using paper tests in 2016 for elementary students. For middle schools, it was nine states.

Nearly across the board, states with no experience with online testing fared worse on national online tests.


There are security issues with paper too. The alleged cyber attack on Questar’s data center Tuesday sparked a statewide outcry, but switching back to paper won’t eliminate security issues.

“Both digital- and paper-based testing are certainly susceptible to cheating,” said Camara, the testing cybersecurity expert. “I don’t think anybody would say that there’s a significant reduction of security measures or cheating with computers, it’s just different.”

One of the largest state test cheating scandals happened in Atlanta with paper tests when principals and teachers changed student answers. That’s much harder to do online.

Jacinthia Jones and Marta W. Aldrich contributed to this story.