There is good reason to fear that this spring’s school closures hurt students’ academic progress. But how much learning, exactly, did students lose?
On a national level, we don’t yet know. State tests were canceled last spring, and this year’s tests won’t be given for many months, if they happen at all.
That’s prompted researchers to release their own projections of learning loss — and they paint a grim picture.
The nonprofit testing organization NWEA predicted that students started this school year having lost roughly a third of a year in reading and half a year in math. CREDO, an education research organization, recently projected that the average student lost 136 to 232 days of learning in math, depending on their state. McKinsey, the consulting firm, predicts that by the fall of 2021, students will have lost three months to a year of learning, depending on the quality of their remote instruction.
In the absence of actual data on student learning — which is only just starting to emerge from diagnostic exams — those estimates have been widely cited by state and federal officials. The projections, then, are influencing decisions about how to help students catch up, which schools and students need extra resources, and when to reopen school buildings.
“I think people really latched onto it because it gave some certainty in a very uncertain world of what this could look like,” said Megan Kuhfeld, a research scientist for NWEA.
But those projections aren’t ironclad. Here’s what they can tell us — and why educators and policymakers should use them carefully.
Projections of learning loss are just that.
Researchers projecting learning loss are not doing so with actual information about what happened after schools closed due to COVID, but with historical data and a host of assumptions — “statistically informed guesses,” said education researcher Paul von Hippel. That’s understandable: Perfect data doesn’t exist, so researchers are trying to fill that hole.
But those same researchers say the projections shouldn’t be confused with actual learning loss. “They took on a kind of life of their own, beyond what I would have ever imagined, and are sometimes spoken about with far more confidence than we have in them as researchers,” said Kuhfeld.
The projections rely on the assumption that students learned nothing (or worse) once schools shut their doors.
The backbone of both the NWEA and CREDO projections is the idea that students essentially went on an extended summer break when school buildings closed, and that remote learning was a total wash.
That was certainly true for some students. Teachers, particularly in high-poverty areas, reported widespread absences during remote instruction, and some students became disconnected from school entirely. Many students lacked internet access, necessary devices, and critical in-person support, and the vast majority of teachers say students learned less than they would have in person.
“There is no dispute that the amount and quality of learning that has occurred since school buildings were closed has been deeply inferior,” writes Macke Raymond, the director of CREDO.
But is it fair to assume that the average student learned nothing — and in fact, lost learning as if it were summer? That’s less clear. Surveys of parents and teachers show remote schooling began fairly quickly after buildings closed, and most parents said their child’s school did a good job providing remote instruction despite the challenges.
Further complicating things is that the degree of “summer slide” that students experience in normal years is a matter of ongoing debate among researchers.
Some view CREDO’s projections of learning loss as implausibly large.
CREDO projects that in many states, students lost a full school year — 180 days or more — of learning in math this spring. The organization describes its projections as “chilling.”
But how could students have lost hundreds of days of learning from missing 60 or so actual days of in-person school?
It has to do with how CREDO converts learning loss, measured in standard deviations, into “days of learning.” The approach is controversial among researchers. Some say it should be avoided because different ways of doing the translation can lead to wildly different results.
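To see why the translation is so sensitive, consider a minimal sketch of the arithmetic. The numbers below are illustrative assumptions, not CREDO’s actual parameters: a score drop measured in standard deviations is divided by an assumed amount of typical annual growth, then scaled by a 180-day school year. The smaller the assumed annual growth, the more “days” the same score drop represents.

```python
# A minimal sketch of a "days of learning" conversion, using made-up numbers.
# All figures here are illustrative assumptions, not CREDO's actual parameters.

SCHOOL_YEAR_DAYS = 180  # assumed length of a standard school year

def sd_loss_to_days(sd_loss: float, annual_growth_sd: float) -> float:
    """Convert a test-score drop (in standard deviations) into 'days of learning,'
    assuming students normally gain `annual_growth_sd` SDs over one school year."""
    return (sd_loss / annual_growth_sd) * SCHOOL_YEAR_DAYS

# The same hypothetical 0.25 SD loss looks very different depending on the
# benchmark chosen for a year's worth of typical growth:
for annual_growth in (0.20, 0.30, 0.50):
    days = sd_loss_to_days(0.25, annual_growth)
    print(f"annual growth of {annual_growth} SD -> {days:.0f} days of learning lost")
# annual growth of 0.2 SD -> 225 days
# annual growth of 0.3 SD -> 150 days
# annual growth of 0.5 SD -> 90 days
```

Under these toy numbers, the identical score drop translates to anywhere from 90 to 225 “days” — which is precisely the sensitivity critics of the approach point to, and why a modest test-score decline can be reported as exceeding the number of school days actually missed.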
In this case, some outside researchers say CREDO’s results are questionable.
“It doesn’t pass the smell test,” said Harvard education professor Andrew Ho of CREDO’s days of learning projections.
“It’s hard for me to believe that somebody lost a year in a quarter,” said Constance Lindsay, a professor at the University of North Carolina School of Education.
McKinsey and NWEA also use variations of the days of learning approach, but come to more modest conclusions.
Raymond of CREDO said its projections are plausible because students were missing out on learning new things while buildings were closed and also forgetting concepts they’d already been taught.
She compared it to learning but failing to retain a foreign language. “I don’t think it’s all that different to say a kid would lose all of his math,” she said.
Learning loss predictions for individual students are especially imprecise.
CREDO has gone a step further than others by creating an individual projection of learning loss for each student in 17 states. Raymond said this data has been shared with state departments of education, allowing officials to look up projected 2020 test scores for individual students and pass that data on to schools.
But pinpointing the numerous factors that might contribute to learning loss is even harder for individuals than it is for large groups.
The British government faced backlash — and ultimately backed down — when it tried to use statistical projections in place of real final exams for high school students. CREDO warns that its estimates for individual student learning loss are imprecise and shouldn’t be used for high-stakes decisions.
“We have encouraged our state partners to not look at these as precise point estimates,” said Raymond. “Think of these as approximations.”
The projections don’t take into account other factors, like COVID-related trauma, that may have hurt students’ learning.
Not included in the projections are some of the other ways COVID affected students, like trauma from family members being sick or dying, additional family stress and financial insecurity, and school budget cuts — all of which can hurt academic performance.
Additionally, some projections focus solely on learning loss from building closures last school year. But many buildings — particularly in districts serving more students of color — remained closed as this year began. And even when buildings have opened, the quality of instruction may have declined as teachers, for instance, struggle to instruct students in person and virtually at the same time.
“We view this as a worst case scenario, because for most of our projections, we’re basically assuming that kids are getting no instruction in the spring,” said Kuhfeld. “But there could also be a much worse case scenario out there, which is that a child missed three months of school because they don’t have a device at home, they’re facing parental job loss and family stress and housing instability all at the same time.”
Wrong projections come with risks.
If students’ academic challenges are understated, policymakers may not do enough to respond, missing chances to introduce ambitious programs like the widespread tutoring being adopted in England to make up for learning loss.
“If you’re over optimistic, you might get the idea that you don’t really need to do anything to compensate, which I think would be not a good decision,” said von Hippel, an education professor at the University of Texas at Austin.
On the other hand, overestimating learning loss could give school officials and teachers a skewed picture. Overestimates could be particularly concerning for individual students, if they lead to lower expectations or reduce students’ likelihood of being exposed to high-level academic content.
“You lost 180 days. What does that mean — you have to repeat a whole grade?” said Lindsay. “That seems kind of ridiculous.”
Experts say that teachers should instead use diagnostic tests or exams they create themselves to figure out where their students stand. “Putting students back for remediation when they may not need it based on projections” would be unwise, said Kuhfeld.
“Teachers should be using the data that’s most close to the students,” she said.