Broken promise?

Common Core tests were supposed to usher in a new era of comparing America’s schools. What happened?

PHOTO: Pete Souza / White House
President Barack Obama and former Secretary of Education Arne Duncan in 2010.

How well does an elementary school in Maryland stack up to one in New Jersey? Do California’s eighth graders make faster academic gains than their peers in Connecticut?

In 2010, then-Secretary of Education Arne Duncan made the case for common state tests that would allow parents and educators to find out — and predicted that the comparisons would lead to dramatic policy changes.

“For the first time, it will be possible for parents and school leaders to assess and compare in detail how students in their state are doing compared to students in other states,” Duncan said. “That transparency, and the honest dialogue it will create, will drive school reform to a whole new level.” It was a heady moment: Most states had signed on to at least one of the two cross-state testing groups, PARCC and Smarter Balanced.

Though their numbers have since dwindled substantially, the two groups still count over 20 members between them. But seven years later, it remains difficult to make detailed comparisons across states, as a potent mix of technical challenges, privacy concerns, and political calculations has kept the data relatively siloed. And there’s little evidence that the common tests have pushed states to compare notes or change course.

“This is one unkept promise [of] the common assessments,” said Mike Petrilli, president of the Fordham Institute, a conservative think tank that has backed the Common Core standards.

“I’ve been surprised that there haven’t been more attempts to compare PARCC and Smarter Balanced states,” said Chad Aldeman of Bellwether Education Partners.

What comparisons are available? PARCC publishes a PDF document with scores from different states, based on publicly available information. “We have more states than ever administering tests that will allow for comparability across states,” said Arthur Vanderveen, the CEO of New Meridian, the nonprofit that now manages PARCC. “That data is all public and available. I think the vision really has been realized.”

Smarter Balanced does not publish any data comparing states, though those scores could be collected from each participating state’s website.

The presentation of the data stands in contrast to the National Assessment of Educational Progress, a test taken by a sample of students nationwide. NAEP has an interactive site that allows users to compare state data. No such dashboards exist for Smarter Balanced or PARCC, though both tests could offer more granular comparisons of schools and students.

Tony Alpert, the head of Smarter Balanced, says a centralized website would be difficult to create and potentially confusing, since states report their results in slightly different ways.

“The notion of comparable is really complicated,” he said. Nitty-gritty issues like when a test is administered during the school year, or whether a state allows students who are learning English to use translation glossaries on the math exam, can make what seems like a black and white question — are scores comparable? — more gray, he said.

“Early on our states directed us not to provide a public website of the nature you describe, and [decided that] each state would be responsible for producing their results,” said Alpert.

Neither testing group publishes any growth scores across states — that is, how much students in one state are improving relative to students who took the test elsewhere. Many experts say growth scores are a better gauge of school quality, since they are less closely linked to student demographics. (A number of the states in both consortia do calculate growth, but only within their state.)

“I’m not sure why we would do that,” Alpert of Smarter Balanced said. States “haven’t requested that we create a common growth model across all states — and our work is directed by our members.”

That gets at a larger issue of who controls this data. For privacy reasons, student scores are not the property of the consortia, but individual states. PARCC and Smarter Balanced are also run by the states participating, which means there may be resistance to comparisons — especially ones that might be unflattering.

“The consortium doesn’t want to be in the business of ranking its members,” said Morgan Polikoff, a professor at the University of Southern California who has studied the PARCC and Smarter Balanced tests. “Except for the ones that are doing well, [states] don’t have any political incentive to want to release the results.”

As for PARCC, a testing expert who works directly with the consortium said PARCC has made it possible to compare growth across states — the results just haven’t been released.

“Those [growth scores] have been calculated, but it’s very surprising to me that they’re not interested in making them public,” said Scott Marion, the executive director of the Center for Assessment. This information would allow for comparisons of not just student proficiency across states, but how much students improved, on average, from one state to the next.

Vanderveen confirmed that states have the information needed to calculate growth across states.

But it’s unclear if any have done so or published the scores.

Chalkbeat asked all PARCC states. Colorado, Illinois and Maryland responded that they do not have such data; other states have not yet responded to public records requests.

Vanderveen said that states are more interested in whether students are meeting an absolute bar for performance than in making comparisons to other states. “A relative measure against how other students are performing in other states — and clearly states have decided — that is of less value,” he said.

The cross-state data could be a gold mine for researchers, who are often limited to single states where officials are most forthcoming with data. But both Polikoff and Andrew Ho, a professor at Harvard and testing expert, say they have seen little research that taps into the testing data across states, perhaps because getting state-by-state permission remains difficult.

Challenges in the ability to make comparisons across states and districts led Ho and Stanford researcher Sean Reardon to create their own solution: an entirely separate database for comparing test scores, including growth, across districts in all 50 states. But it’s still not as detailed as the consortia exams.

“One of the promises of the Common Core data was that you might be able to do student-level [growth] models for schools across different states and our data cannot do that,” Ho said.

To Do

Tennessee’s new ed chief says troubleshooting testing is first priority

PHOTO: Caiaimage/Robert Daly

Penny Schwinn knows that ensuring a smooth testing experience for Tennessee students this spring will be her first order of business as the state’s new education chief.

Even before Gov.-elect Bill Lee announced her hiring on Thursday, she was poring over a recent report by the state’s chief investigator about what went wrong with TNReady testing last spring and figuring out her strategy for a different outcome.

“My first days will be spent talking with educators and superintendents in the field to really understand the scenario here in Tennessee,” said Schwinn, who’s been chief deputy commissioner of academics in Texas since 2016.

“I’ll approach this problem with a healthy mixture of listening and learning,” she added.

Schwinn’s experience with state assessment programs in Texas and in Delaware — where she was assistant secretary of education — is one of the strengths cited by Lee in selecting her for one of his most critical cabinet posts.

The Republican governor-elect has said that getting TNReady right is a must after three straight years of missteps in administration and scoring in Tennessee’s transition to online testing. Last year, technical disruptions interrupted so many testing days that state lawmakers passed emergency legislation ordering that poor scores couldn’t be used to penalize students, teachers, schools, or districts.

Schwinn, 36, recalls dealing with testing headaches during her first days on the job in Texas.

“We had testing disruptions. We had test booklets mailed to the wrong schools. We had answer documents in testing booklets. We had online administration failures,” she told Chalkbeat. “From that, we brought together teachers, superintendents, and experts to figure out solutions, and we had a near-perfect administration of our assessment the next year.”

What she learned in the process: the importance of tight vendor management, including setting clear expectations from the start.

She plans to use the same approach in Tennessee, working closely with people in her new department and Questar Assessment, the state’s current vendor.

“Our job is to think about how to get online testing as close to perfect as possible for our students and educators, and that is going to be a major focus,” she said.

The test itself has gotten good reviews in Tennessee; it’s the online miscues that have many teachers and parents questioning the switch from paper-and-pencil exams. Schwinn sees no choice but to forge ahead online and is quick to list the benefits.

“If you think about how children learn and access information today, many are getting that information from hand-held devices and computers,” she said, “so reflecting that natural experience in our classrooms is incredibly important.”

Schwinn said computerized testing also holds promise for accommodating students with disabilities and provides for a more engaging experience for all students.

“When you look at the multiple-choice tests that we took in school and compare that to an online platform where students can watch videos, perform science experiments, do drag-and-drop and other features, students are just more engaged in the content,” she said.

“It’s a more authentic experience,” she added, “and therefore a better measure of learning.”

Schwinn plans to examine Tennessee’s overall state testing program to look for ways to reduce the number of minutes dedicated to assessment and also to elevate transparency.

She also will oversee the transition when one or more companies take over the state’s testing program beginning next school year. Former Commissioner Candice McQueen ordered a new request for proposals from vendors to provide paper testing for younger students and online testing for older ones. State officials have said they hope to award the contract by spring.

In Texas, a 2018 state audit criticized Schwinn’s handling of two major education contracts, including a no-bid special education contract that lost the state more than $2 million.

In Tennessee, an evaluation committee that includes programmatic, assessment, and technology experts will help to decide the new testing contract, and state lawmakers on the legislature’s Government Operations Committee plan to provide another layer of oversight.

Spring testing in Tennessee is scheduled to begin on April 15. You can learn more about TNReady on the state education department’s website.

Editor’s note: This story has been updated with new information about problems with the handling of two education contracts in Texas. 

Class of 2018

Some Colorado schools see big gains in grad rates. Find yours in our searchable database.

PHOTO: Courtesy of Aurora Public Schools
Aurora West College Preparatory Academy graduates of 2018. The school had a 100 percent graduation rate.

Two metro-area school districts, Westminster and Aurora, recently in the state’s crosshairs for their low performance, posted significant increases in their graduation rates, according to 2018 numbers released Wednesday.

Westminster, a district that got off the state’s watchlist just last year, had 67.9 percent of its students graduate on time, within four years of starting high school. That was a jump of 10 percentage points from its 57.8 percent graduation rate in 2017.

District officials credit their unique model of competency-based education, which does away with grade levels and requires students to prove they have mastered content before moving up a level. In previous years, district officials pointed to rising graduation rates among students who take five, six, or seven years to finish — rates Colorado also tracks — and say that improvement was bound to lift their four-year rates as well.

“We saw an upward tick across the board this past year,” said Westminster Superintendent Pam Swanson, referring to state test results and other data also showing achievement increasing. “I think this is one more indicator.”

Swanson said the high school has also focused recently on increasing attendance, now at almost 90 percent, and increasing students’ responsibility for their own learning.

(Sam Park | Chalkbeat)

In Aurora schools, 76.5 percent of students graduated on time in 2018 — a jump of almost 9 percentage points from the 67.6 percent rate of the class of 2017.

“We’re excited these rates demonstrate momentum in our work,” Aurora Superintendent Rico Munn said.

He attributed the increased graduation rates to “better practice, better pedagogy, and better policy.”

One policy that made a difference for the district is a change in law that now allows districts to count students as graduates the year they complete their high school requirements, even if they are enrolled in one of Colorado’s programs to take college courses while doing a fifth year of high school.

According to a state report two years ago, Aurora had 65 students enrolled in this specific concurrent enrollment program who previously wouldn’t have been counted in four-year graduation rates. Only the Denver district has a larger number of such students. Aurora officials said 147 students are enrolled this year in the program.

Those students are successful, Munn said, and shouldn’t be counted against the district’s on-time graduation rates.

Aurora’s previously rising graduation rates helped it dodge corrective state action. But its improvement this year included a first: One high school, Aurora West College Preparatory Academy, had 100 percent of its seniors graduate in 2018.

The school enrolls students in grades six through 12 in northwest Aurora, the most diverse part of the district. Of the more than 1,000 students, 89 percent qualify for subsidized lunch, a measure of poverty.

“This incredible accomplishment demonstrates the strong student-focused culture we have created at Aurora West,” said Principal Taya Tselolikhina in a written statement. “When you establish high expectations and follow up with high levels of support, every student is able to shape a successful future.”

Statewide, the four-year graduation rate once again inched higher, and gaps between the graduation rate of white students and students of color again decreased. But this time, the gaps narrowed even as all student groups increased their graduation rates.

(Sam Park | Chalkbeat)

The rising trend wasn’t universal. In some metro area school districts, graduation rates fell in 2018. That includes Adams 14, the district that is now facing outside management after years of low performance.

The tiny school district of Sheridan, just southwest of Denver, saw a significant drop in graduation rates. In 2018, 64.7 percent of students graduated within four years, down from 72.7 percent of the class of 2017.

Look up four-year graduation rates for your individual school or district in our databases below.
