Data dive

Colorado students show gains in literacy on 2018 state tests, but disparities remain

Yadira Rodriguez gets her hair done by Mareli Padilla-Mejia on the first day of school at McGlone Academy. (Photo by AAron Ontiveroz/The Denver Post)

More than half of all Colorado students in third through eighth grade continue to fall below state expectations in reading, writing, and math, according to results of state tests students took this spring. That’s been the case since Colorado switched to more rigorous tests four years ago.

Find your school’s test scores
Look up your elementary or middle school’s test scores in Chalkbeat’s database here. Look up your high school’s test results here.

In literacy, 44.5 percent of students in those grades statewide met expectations. In math, 34.1 percent did. It’s difficult to compare this year’s scores, released Thursday, to scores from previous years because of changes in requirements for which students take which tests.

However, the percentage of students meeting expectations in literacy went up at least slightly this year in every grade, three through eight. The math results were mixed.

Results in both subjects show a persistent and troubling reality mirrored across the country: white and Asian students continue to score higher than black and Hispanic students, and students from middle- and high-income families outperform students from low-income families.

The gaps between students from higher- and lower-income families are about 30 percentage points. For example, 45 percent of sixth-graders from middle- and high-income families met expectations on the state math test, but only 14 percent of sixth-graders from low-income families did.

“As a society and a state, this is unacceptable,” Colorado Education Commissioner Katy Anthes said in a statement. “And every effort must continue to be made to reverse this course.”

Credit: Sam Park

About 550,000 students across Colorado were tested in the spring. Students in third through eighth grades took literacy and math tests that are Colorado’s modified version of the PARCC tests. (The state refers to the tests as the Colorado Measures of Academic Success, or CMAS, tests.) High school students took well-known college entrance exams: Ninth- and 10th-graders took the PSAT, and 11th-graders took the SAT.

The percentage of students meeting expectations on the literacy and math tests varied by grade. In third grade, for example, 40 percent of students met expectations on the literacy test and 39 percent met expectations on the math test. Both represent a 2 percentage-point increase from 2015, the first year Colorado gave the PARCC tests.

Joyce Zurkowski, who oversees testing for the state education department, said that while the upward trends are encouraging, “the change is not happening as quickly as we’d hope.”

Credit: Sam Park

At the high school level, this spring marked the second year Colorado 11th-graders took the SAT, and the third year 10th-graders took the PSAT. Ninth-graders also took the PSAT this year.

Scores on those exams were similar to last year, with Colorado students continuing to do better than national averages. For example, Colorado 11th-graders scored an average of 513 on the SAT reading and writing section, and 501 on the math. The average score of students who took the SAT on the same day nationwide was 497 in reading and writing, and 489 in math.

As in previous years, the data shows girls in grades three through eight scored better on state literacy tests than boys did. The gap between the genders widened as students got older: 54 percent of eighth-grade girls met expectations in literacy, while only 34 percent of boys did.

The reverse was true in math, at least in the lower grades. Boys in grades three through seven scored higher than girls, but eighth-grade girls did slightly better than eighth-grade boys.

Girls also scored higher than boys on the PSAT and SAT, though by 11th grade the gap narrowed to a single point: The average score for girls was 1015; for boys, it was 1014.

Some of the biggest gaps are between students with and without disabilities. For example, just 6 percent of eighth-graders with disabilities met expectations in literacy, compared with 48 percent of eighth-graders without disabilities, a whopping 42-point difference.

Measuring academic progress

The state also calculates the progress students make on the tests year to year. This calculation, known as the “median growth percentile,” measures how much students improve in an academic year compared with other students with similar scores in the previous year.

The state – and many school districts – consider this measurement just as important as, if not more important than, raw test scores, which often correlate with students’ level of societal privilege. Growth scores, on the other hand, measure the improvement students make in a year – and provide insight into how effectively their teachers and schools are teaching them.

Because of that, growth scores make up a big portion of the ratings the state gives to schools and districts. Low-rated schools and districts are subject to state sanctions.

A student’s growth is ranked on a scale of 1 to 99. A score of 99 means a student did better on the test than 99 percent of students who had similar scores the year before.

Students who score above 50 are considered to have made more than a year’s worth of academic progress in a year’s time, whereas students who score below 50 are considered to have made less than a year’s worth of progress.
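As an illustration, a simplified version of this calculation can be sketched in code. The data, the 1-to-99 scaling, and the grouping of students by identical prior-year scores below are all hypothetical; Colorado’s actual growth model uses a more sophisticated quantile-regression approach.

```python
# Illustrative sketch of a median growth percentile calculation.
# The student data and the simple "same prior score" peer grouping
# are hypothetical, for demonstration only.

from statistics import median

# (prior_year_score, current_year_score) for a small cohort
students = [
    (700, 720), (700, 710), (700, 740), (700, 705),
    (750, 760), (750, 780), (750, 755), (750, 790),
]

def growth_percentiles(records):
    """Rank each student's current score against peers who had the
    same prior-year score; return a 1-99 percentile per student."""
    percentiles = []
    for prior, current in records:
        peers = [c for p, c in records if p == prior]
        below = sum(1 for c in peers if c < current)
        # percentile rank among peers, clamped to the 1..99 scale
        pct = max(1, min(99, round(100 * below / len(peers))))
        percentiles.append(pct)
    return percentiles

pcts = growth_percentiles(students)
school_mgp = median(pcts)  # the school's median growth percentile
```

A school whose median lands above 50 would, under this scheme, be showing more than typical growth for its students; a median below 50 would show less.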

Credit: Sam Park

Statewide data shows white students, students from higher-income families, and students without disabilities had growth scores above 50. Students of color, students from low-income families, and students with disabilities had scores below 50.

For example, elementary students who do not qualify for subsidized lunches had a growth score of 54 in both literacy and math. Elementary students who do qualify had a growth score of 47. Having a lower growth score means it may be harder for those students to reach grade level.

Credit: Sam Park

The state also compares the scores of students learning English as a second language to the scores of students who are not. When the data is cut in that way, the differences are minimal in elementary and middle school. For example, the overall growth score in math for elementary-aged English learners was 49, while the score for non-English learners was 51.

However, the difference in growth scores between those two groups was bigger in high school – a trend that holds true for several other student groups, as well.

Difficult to discern

The reason educators and state officials focus on how different groups of students do on the tests is to ensure schools are educating all students – not just those with the most privilege.

Of all the groups, it can be most difficult to tell how well schools are serving students learning English as a second language. That’s because of the way the state categorizes students.

English language learners who attain fluency score very well on the state tests, especially in literacy. But whether they score on par with – or perhaps even better than – native English speakers remains an open question because that category includes other students as well.

That’s not the only reason it can be hard to draw conclusions about the academic progress of different student groups. Colorado has strict student privacy rules that, for example, obscure the growth scores of any group with fewer than 20 students, officials said.

Education advocacy groups have called on the state to release more information that would provide a fuller picture of whether schools and districts are serving all students well.

Participation rates up

Colorado was once a hotbed of the testing opt-out movement, with tens of thousands of fed-up parents excusing their children from taking the state assessments. But participation has been rising, and it was up again this past spring for students in grades three through 10.

It’s likely that part of the increase is due to the passage of a bill in 2015 paring back the amount of time Colorado students spend taking standardized tests.

But there was another factor this year, too: Zurkowski attributed a bump in ninth-grade participation, in particular, to a switch in tests. Ninth-graders took the PSAT this past spring instead of the PARCC tests. Whereas just 76 percent of Colorado ninth-graders participated in the PARCC literacy test last year, nearly 94 percent of ninth-graders took the PSAT, a preparatory test for college-entrance exams and a qualifying test for National Merit scholarships.

“I believe students and parents are recognizing the relevance of the PSAT test,” Zurkowski said.

The state is set to make another switch next year. Instead of administering the PARCC tests to students in grades three through eight, Colorado is developing its own literacy and math tests.

But state officials said they don’t anticipate a significant change in participation or the ability to compare student scores from year to year. The Colorado-developed test questions will be based on the same academic standards as the PARCC questions, Zurkowski said.

Mapping a Turnaround

This is what the State Board of Education hopes to order Adams 14 to do

PHOTO: Hyoung Chang/The Denver Post
Javier Abrego, superintendent of Adams 14 School District on April 17, 2018.

In Colorado’s first-ever attempt to hand over management of a school district, state officials on Thursday provided a preview of what the final order requiring Adams 14 to give up district management could include.

The State Board of Education is expected to approve its final directives to the district later this month.

Thursday, after expressing a lack of trust in district officials who pleaded their case, the state board asked the Attorney General’s office for advice and help in drafting a final order detailing how the district is to cede authority, and in what areas.

Colorado has never ordered an external organization to take over full management of an entire district.

Among details discussed Thursday, Adams 14 will be required to hire an external manager for at least four years. The district will have 90 days to finalize a contract with an external manager. If it doesn’t, or if the contract doesn’t meet the state’s guidelines, the state may pull the district’s accreditation, which would trigger dissolution of Adams 14.

State board chair Angelika Schroeder said no one wants to have to resort to that measure.

But districts should know that the state board does have “a few more tools in our toolbox,” she said.

In addition, if they get legal clearance, state board members would like to explicitly require the district:

  • To cede hiring and firing authority to the external manager, at least for at-will employees who are administrators, but not teachers.
    When State Board member Steve Durham questioned the Adams 14 school board President Connie Quintana about this point on Wednesday, she made it clear she was not interested in giving up this authority.
  • To give up instructional, curricular, and teacher training decisions to the external manager.
  • To allow the new external manager to decide if there is value in continuing the existing work with nonprofit Beyond Textbooks.
    District officials have proposed they continue this work and are expanding Beyond Textbooks resources to more schools this year. The state review panel also suggested keeping the Beyond Textbooks partnership, mostly to give teachers continuity instead of switching strategies again.
  • To require Adams 14 to seek an outside manager that uses research-based strategies and has experience working in that role and with similar students.
  • To task the external manager with helping the district improve community engagement.
  • To be more open about its progress.
    The state board wants to be able to keep track of how things are going. State board member Rebecca McClellan said she would like the state board and the department’s progress monitor to be able to do unannounced site visits. Board member Jane Goff asked for brief weekly reports.
  • To allow the external manager to decide if the high school requires additional management or other support.
  • To allow state education officials, and/or the state board, to review the final contract between the district and its selected manager for compliance with the final order.

Facing the potential loss of near-total control over his district, Superintendent Javier Abrego on Thursday afternoon thanked the state board for “honoring our request.”

The district had accepted the recommendation of external management and brought forward its own proposal — but with the district retaining more authority.

Asked about the ways in which the state board went above and beyond the district’s proposal, such as giving the outside manager the authority to hire and fire administrative staff, Abrego did not seem concerned.

“That has not been determined yet,” he said. “That will all be negotiated.”

The state board asked that the final order include clear instructions about next steps if the district failed to comply with the state’s order.

Indiana A-F grades

Why it’s hard to compare Indianapolis schools under the A-F grading system

PHOTO: Dylan Peers McCoy
Because Thomas Gregg Neighborhood School became an innovation school last year, the state uses a different scale to grade it.

A-F grades for schools across Indiana were released Wednesday, but in the state’s largest district, the grades aren’t necessarily an easy way to compare schools.

An increasing share of Indianapolis Public Schools campuses – last year, about 20 percent – is being measured by a different yardstick than the rest, creating a system in which schools with virtually identical results on state tests can receive vastly different letter grades.

The letter grades aim to show how well schools are serving students by measuring both how their students score on state tests and how much their scores improve. But as Chalkbeat reported last year, new schools and schools that join the IPS innovation network can opt to be graded for three years based only on the second measure, known as growth. Schools in the innovation network are part of the district, but they are run by outside charter or nonprofit operators.

Eleven of the 70 Indianapolis Public Schools campuses received A marks from the state; eight of those were graded based on growth alone. They included a school in its first year of operation and seven innovation schools.

At the same time, traditional neighborhood and magnet schools with growth scores as good as or better than the scores at A-rated innovation schools received Bs, Cs, and even Ds.

Of the 13 innovation schools that received grades for last school year, eight received As, two got Bs, two got Cs, and one got a D. Only Herron High School was graded on the same scale as other schools. (For high schools, grades incorporate other measures including graduation rates.)

The result is a system that most parents don’t understand, said Seretha Edwards, a parent of four children at School 43, a school that received a failing grade from the state but would have gotten a B if it were measured by growth alone.

“I just think it’s kind of deceiving,” she added. “I don’t think it paints a fair picture of the schools.”

Indianapolis Public Schools deputy superintendent for academics Aleesia Johnson said the growth scores show schools are on a good trajectory.

“If you see that kids are making progress in terms of growth, that’s a good sign that you’re on the right track,” she said.

Still, she acknowledged that “there’s still a lot of work to do” to get students to pass tests and show proficiency.

Johnson pointed out that often-changing standardized tests and different A-F grades can cause confusion for families, and those measures don’t provide a complete or timely picture for families who want to assess their schools or choose new ones. “I don’t think it gives a lot of valuable information,” she said.

Advocates have said the growth-only model makes sense because schools shouldn’t be held accountable for the low passing rates of students they have just begun educating. But in practice, the policy benefits charter and innovation schools, which enjoy strong support from Republican lawmakers.

“The concept behind the growth-only model was that we measured newer schools based off of what they are able to do for their students, rather than taking them where they received them,” said Maggie Paino, the director of accountability for the education department. “You’re taking strides to get toward proficiency.”

The situation is even more muddled than usual this year. Schools across the state received two letter grades. One was calculated under a state model that relies largely on test scores, and the other was determined under a plan the state uses to comply with federal standards.

Letter grades do more than help parents choose schools: years of repeated low grades from the state can trigger intervention or takeover. But in recent years, the state has deferred to IPS on decisions about intervening in low-rated schools.

Back in 2012, the state took over four chronically low-performing Indianapolis schools. Since Superintendent Lewis Ferebee took over, IPS has taken aggressive steps to overhaul struggling schools by “restarting” them as innovation schools with new managers. Other struggling schools have been closed.

School 63, which received its sixth consecutive F from the state, might have faced state intervention in the past. But the school is unlikely to face repercussions because IPS restarted the school by turning it over to an outside manager. The Haughville elementary school is now managed by Matchbook Learning.

Shaina Cavazos and Stephanie Wang contributed reporting.