Testing burden detailed

Study: Testing costs up to $78 million, covers most of school year

Colorado state government and school districts spend up to $78 million a year on testing, and some kind of standardized testing takes place during every week of the school year, according to a new study.

“Only accounting for direct costs, and not the additional opportunity costs incurred by redirected staff time, in total $70-$90 a student is spent on assessments in Colorado. This is between $61.1 to $78.4 million annually,” said the study by Augenblick, Palaich and Associates, a Denver education research firm.

On the issue of testing time, the study said, “When considered in the context of a typical school year of 175 days … between 7 percent and 15 percent of time in the school year [is spent] preparing for or taking assessments.”
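Taken together, the study's dollar and time figures can be sanity-checked with simple arithmetic. The sketch below is not from the study itself; it simply pairs the low ends and high ends of the reported ranges (an assumption about how the ranges correspond) to show the implied statewide enrollment and how many school days the time percentages represent.

```python
# Back-of-the-envelope checks on the study's reported ranges (a sketch, not from the study).

# Per-student spending and statewide totals, as quoted above.
per_student_low, per_student_high = 70, 90    # dollars per student
total_low, total_high = 61.1e6, 78.4e6        # dollars per year

# Implied statewide enrollment, assuming the low total pairs with the low per-student figure.
print(round(total_low / per_student_low))     # -> 872857, roughly 870,000 students
print(round(total_high / per_student_high))   # -> 871111, roughly 870,000 students

# Days represented by "7 percent to 15 percent" of a 175-day school year.
school_days = 175
print(0.07 * school_days, 0.15 * school_days)  # -> 12.25 26.25 days
```

Both ends of the cost range imply an enrollment of roughly 870,000 students, so the per-student and statewide figures are internally consistent; the time range works out to roughly 12 to 26 school days.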

The study was done for the Standards and Assessments Task Force, the 15-member appointed group that is studying the state testing system and will develop recommendations for the 2015 legislative session. The task force and the study were authorized by a 2014 law that was a legislative compromise in response to growing concerns about assessments. The group was briefed on the study last Monday.

The study’s conclusions were based primarily on information provided by surveys of district-level administrators, building administrators and teachers. Information about testing costs was based on the survey, state data and interviews with administrators in five districts.

Other key findings of the study include:

  • “It is clear that both teachers and students are spending a significant amount of time that could otherwise be devoted to instruction on these assessment-related activities,” despite variations among respondents about specific amounts of time spent on test prep and test taking.
  • “Respondents from all three levels indicated significant impacts and relatively few benefits for most assessments. … A majority of respondents at all levels reported disagreement that the benefits of assessments outweighed the impacts.” (The one exception was general agreement that the benefits of the ACT test given to high school juniors outweigh its impacts.)
  • “Respondents … suggested changes to assessments, focusing on reducing the length and number of grades of students taking assessment or reducing to the federal minimum.”

The educator opinions collected by the study mirror those captured in an earlier Department of Education survey (see this story for details). But the APA study does combine a wide range of testing data in a single document and provides a fresh look at the alphabet soup of tests facing Colorado students every year – DIBELS, STAR, CMAS, ACCESS and many more.

Here’s a quick look at some of the study’s major findings.

Tests and the school year

From a week of school readiness assessments in August to three weeks of early literacy progress monitoring in May, testing goes on across the school year, the study found.

“In this … example, over 40 weeks of assessment windows are open for 10 unique assessments (with specific date ranges overlapping) over a typical 36 week school year. This does not include additional formative assessments, course exams, or AP/IB exams. As is apparent, assessment is a year long process with at least one assessment testing window being open nearly every week of the school year.”

Student time on tests

“While the number of assessments administered varies by grade level, students at every level spend over a week of school time preparing for assessments, with students at key grade levels spending over two weeks of school time preparing for assessments. Time spent taking assessments is similarly high, taking at least a week of school time for students at all levels and more than two weeks of school time for students in some grade levels.”

Teacher time on tests

In the context of a 175-day school year, teachers spend between 5 percent and 26 percent of their time “preparing for or administering assessments.” The variation is explained partly by the different testing loads teachers carry depending on the grades and subjects they teach.

Costs of testing


The study found direct per-student testing costs ranging from $5 to $50 for state tests and from $15 to $58 for district tests.

“These figures would be much higher if opportunity costs due to diverted staff time were included. The costs range dramatically between districts and represent different resource starting points and capacity capabilities. Though there is not a perfect correlation the smaller districts tended to have higher costs than the larger districts.”

Costs & benefits

“Ratings of assessment impacts were remarkably similar across district, school, and teacher respondents. Teacher respondents tended to rate the impact of assessments as slightly higher than district and school respondents, but differences were not large. Impact ratings did, however, vary significantly by assessment, with all respondents indicating high level of impact from the CMAS and TCAP/PARCC assessments across all impact areas. Conversely, respondents indicated lower impacts from the ACT.”

What should be done

“A minority of respondents at all levels suggested keeping assessments as they were, with the exception of the ACT. Across all assessments, respondents at all levels favored reducing the length of assessments. There were not major differences in suggested changes from respondents at the district, school, and teacher level.”

The study found that about 60 percent of district administrators want to reduce language arts and math testing to the federal minimum: annual tests in grades 3-8 plus once in high school. Only about a third of building administrators supported that. There was majority support across all three groups for reducing the length of tests.

What happens next

The study is expected to be a key piece of evidence in the task force’s deliberations as it works to prepare its report – or possibly reports – during its final two scheduled meetings on Dec. 16 and Jan. 12.

The group also has gathered a wide variety of information, including comments at several public meetings around the state. (See this page for links to summaries of those meetings.)


An outside advocacy group, the Denver Alliance for Public Education, also is seeking additional parent comment through an online survey, which it intends to present to the task force.

Some members of the task force have indicated support for trimming the testing system back to federal minimums. But there is a wide variety of views represented on the group, and members representing education reform groups are nervous about tinkering too much with the current system. (See the list of members at the bottom of this page.)

While the task force is still deliberating, members of the legislature already are at work on the issue.

“The legislators are drafting their own bills. We’re going to see bills that are across the spectrum,” said one lobbyist. “The legislature is going to have to pick and choose.”

How study was done

APA gathered information through a document review; an online survey of district administrators, school administrators and teachers; follow-up cost interviews with five districts (Aurora, Center, Eagle, Kit Carson and Poudre); and data from CDE. Here’s the breakdown of responses:

  • District-level administrators – Responses represent 64 districts, or 36 percent
  • School-level administrators – Responses represent 12 percent of schools
  • Teachers – Responses represent 4 percent of statewide workforce

The study concluded that the responses were representative of statewide opinion.

ASD scores

In Tennessee’s turnaround district, 9 in 10 young students fall short on their first TNReady exams


Nine out of 10 elementary and middle school students in Tennessee’s turnaround district aren’t scoring on grade level in English and math, according to test score data released Thursday.

The news is unsurprising: The Achievement School District oversees 32 of the state’s lowest-performing schools. But it offers yet another piece of evidence that the turnaround initiative has fallen far short of its ambitious original goal of vaulting struggling schools to success.

Around 5,300 students in grades 3-8 in ASD schools took the new, harder state exam, TNReady, last spring. Here’s how many scored “below” or “approaching,” meaning they did not meet the state’s standards:

  • 91.8 percent of students in English language arts;
  • 91.5 percent in math;
  • 77.9 percent in science.

View scores for all ASD schools in our spreadsheet

In all cases, ASD schools’ scores fell short of state averages, which were all lower than in the past because of the new exam’s higher standards. About 66 percent of students statewide weren’t on grade level in English language arts, 62 percent weren’t on grade level in math, and 41 percent fell short in science.

ASD schools also performed slightly worse, on average, than the 15 elementary and middle schools in Shelby County Schools’ Innovation Zone, the district’s own initiative for low-performing schools. On average, about 89 percent of iZone students in grades 3-8 weren’t on grade level in English; 84 percent fell short of the state’s standards in math.

The last time that elementary and middle schools across the state received test scores, in 2015, ASD schools posted scores showing faster-than-average improvement. (Last year’s tests for grades 3-8 were canceled because of technical problems.)

The low scores released today suggest that the ASD’s successes with TCAP, the 2015 exam, did not carry over to the higher standards of TNReady.

But Verna Ruffin, the district’s new chief of academics, said the scores set a new bar for future growth and warned against comparing them to previous results.

“TNReady has more challenging questions and is based on a different, more rigorous set of expectations developed by Tennessee educators,” Ruffin said in a statement. “For the Achievement School District, this means that we will use this new baseline data to inform instructional practices and strategically meet the needs of our students and staff as we acknowledge the areas of strength and those areas for improvement.”

Some ASD schools broke the mold and posted strong results. Humes Preparatory Middle School, for example, had nearly half of its students meet or exceed the state’s standards in science, although only 7 percent of its students were on grade level in math and 12 percent in reading.

Thursday’s score release also included school-level results for high schools. View scores for individual schools throughout the state in our spreadsheet.

Are Children Learning

School-by-school TNReady scores for 2017 are out now. See how your school performed

PHOTO: Zondra Williams/Shelby County Schools
Students at Wells Station Elementary School in Memphis hold a pep rally before the launch of state tests, which took place between April 17 and May 5 across Tennessee.

Nearly six months after Tennessee students sat down for their end-of-year exams, all of the scores are now out. State officials released the final installment Thursday, offering up detailed information about scores for each school in the state.

Only about a third of students met the state’s English standards, and performance in math was not much better, according to scores released in August.

The new data illuminates how each school fared in the ongoing shift to higher standards. Statewide, scores for students in grades 3-8, the first since last year’s TNReady exam was canceled amid technical difficulties, were lower than in the past. Scores also remained low in the second year of high school tests.

“These results show us both where we can learn from schools that are excelling and where we have specific schools or student groups that need better support to help them achieve success – so they graduate from high school with the ability to choose their path in life,” Education Commissioner Candice McQueen said in a statement.

Did some schools prepare teachers and students better for the new state standards, which are similar to the Common Core? Was Memphis’s score drop distributed evenly across the city’s schools? We’ll be looking at the data today to try to answer those questions.

Check out all of the scores in our spreadsheet or on the state website and add your questions and insights in the comments.