Denver Public Schools released its annual school ratings Thursday.
Known as the School Performance Framework, or SPF, the district’s rating system assigns each school a color that’s not unlike a letter grade. But instead of A through F, schools are rated from blue (the highest rating) to red (the lowest rating).
Want to know more about how it works and why it matters?
DPS has a website devoted to the School Performance Framework that answers many common questions. We’ve also written a guide with all you need to know about this year’s ratings:
How was my school rated?
Schools are awarded points based on a number of factors, and those points are combined to come up with a final score. The factors differ slightly from elementary to middle and high school. For instance, elementary schools are judged partly on how many kindergarteners are reading at grade level, while high schools are rated in part on how many graduates need — or, preferably, don’t need — remedial classes in college.
But there are several factors on which all schools are evaluated. They include:
Academic Growth: How much students’ scores on state standardized tests improved compared to the scores of students across the state who started at a similar academic level.
Academic Proficiency: The percentage of students who met or exceeded expectations on state tests — in other words, who scored at grade level. This factor is often referred to as “status.”
Enrollment Rates: How many students re-enroll at a school from year to year.
Parent Satisfaction: How many parents are satisfied with a school, as measured by a survey.
My school was rated (your color here). What does that mean?
Each school is assigned a color based on its final score. There are five colors on DPS’s scale.
Blue: Distinguished.
Green: Meets Expectations.
Yellow: Accredited on Watch.
Orange: Accredited on Priority Watch.
Red: Accredited on Probation.
Are there consequences connected to a school’s rating?
Yes. And not all of them are necessarily bad.
For instance, DPS doles out extra funding, sometimes referred to as “tiered supports,” to low-rated schools in an effort to boost achievement.
Last school year, 35 schools shared just shy of $14 million in DPS dollars, in addition to federal grant money, according to a presentation given by district staff to the school board in March. Those schools got a total of $1,674 more per pupil, according to the presentation.
However, if schools continue to falter even after getting help, they face the possibility of closure.
This fall, the district will use a new policy to determine which schools should be “restarted,” or closed and replaced. The policy, called the School Performance Compact, calls for using three criteria to identify persistently low-performing schools. The first is whether a school ranks in the bottom 5 percent of all DPS schools based on multiple years of color-coded school ratings.
The ratings also have consequences for teacher pay under DPS’s incentive-based system.
The last time DPS rated schools was in 2014. Why were there no ratings last year?
There were no ratings last year because the state switched to a new set of standardized tests in math and English. The tests are known as PARCC, and Colorado students have now taken them twice: first in the spring of 2015 and again in the spring of 2016.
You may also hear them called CMAS, which refers to the entire bundle of tests that Colorado students take, including science and social studies tests first taken by kids in 2014.
Because 2015 was the first year DPS students took PARCC, the district was unable to calculate students’ academic growth, which requires at least two years’ worth of test scores and is a big part of a school’s rating. As Superintendent Tom Boasberg likes to say, “What’s most important is not where you start, but how much you grow.”
In the absence of growth data, the district decided to forgo rating schools last year.
I heard DPS changed the way it calculates its ratings this year. Is that true?
Yes. In fact, DPS changed the calculation in two ways.
The first is that the district added new factors. One example: The SPF will now include multiple measures of how well a school is teaching literacy to young children, including how much progress students designated as “significantly below grade level” are making.
This year’s ratings will also include a factor based on equity. Schools will be more explicitly evaluated on how well they’re serving students of color, for instance. However, because the equity factor is new this year, it won’t count toward a school’s overall rating.
The second way DPS changed the calculation was last-minute. Last week, district officials decided to lower the bar on one key measure after hearing concerns from school leaders.
The details are somewhat complicated. Because the new PARCC tests are more rigorous than the old state tests, fewer students across Colorado — and in DPS — met or exceeded expectations on the tests. But until last week, DPS wasn’t planning to lower the percentage of students who’d have to meet that bar for a school to receive a high rating.
District officials changed their minds, however, when they saw how the ratings shook out.
How much stock should families put in these ratings? How worried should I be if my child’s school dropped a color rating or two?
District officials are telling families to exercise caution when interpreting this year’s ratings. In fact, at a recent school board meeting, some board members suggested printing the word “CAUTION” on top of a school’s color rating.
There are a couple of reasons why, officials said. One of the biggest is that because last year was just the second year Colorado students took PARCC, only one year of growth data is available. In the past, DPS has used two years’ worth of growth data to calculate schools’ ratings in order to smooth out one-time anomalies that can cause scores to swing up or down.
Having just one year of data means schools this year are likely to see bigger swings in their ratings, either for better or worse, Boasberg said at that board meeting.
“As we talk to parents and community members, we say, ‘Yes, the SPF is important,’” he said. “But the most important thing is to go visit a school, talk to parents, talk to students.
“No system (is) ever going to be perfect,” he added, referring to the district’s ratings. “The way we do the SPF is more comprehensive and reliable than anything we’ve seen out there, nationally or statewide. People do care deeply about the SPF. It does tell an important story. But it’s important that we tell that with humility and we tell that with caution.”