Nick Childers’ 10th-graders at Denver’s West High School are studying the causes of World War II. As the teens enter the classroom, he greets each by name, makes eye contact, and shakes their hands.
On this spring day, however, there is an unexpected – or at least partially unexpected – guest. Marianne Kenney is one of Denver Public Schools’ 45 paid “peer observers.” She’s a former Cherry Creek teacher and passionate school reformer. She also helped write the state’s content standards in social studies as Colorado’s former social studies specialist.
It’s her job to unobtrusively watch DPS teachers in action and grade them against a grid of expectations. She is in charge of observing 70 secondary and 25 upper elementary educators. Today, the subject of her scrutiny is Mr. Childers, U.S. history teacher and Teach for America alumnus.
Kenney sits at a desk in a rear corner of the room, and flips open her laptop. Childers begins the lesson.
Welcome to the fish bowl that is teacher effectiveness in Colorado. Right now, one of the biggest fish in the bowl is Denver Public Schools.
DPS stands apart from other Colorado districts for its size and the magnitude of its challenges. Seventy-three percent of its 80,000 students qualify for free and reduced-price lunch based on family income. It also stands out because of the work and money it is pumping into LEAP, Leading Effective Academic Practice, the district’s pilot teacher evaluation program, which focuses as much – if not more – on professional development as it does on rating teachers. Other Colorado districts testing out new teacher evaluation models include Jeffco, Eagle, Harrison, Brighton, and Douglas County.
All Colorado districts will be required to implement some form of “educator effectiveness” measures after the passage of Senate Bill 10-191 two years ago. With the help of a three-year, $10 million grant from the Bill & Melinda Gates Foundation, DPS got a jump start and created its own system.
“What sets us apart is how thoughtful we’ve been,” said Tracy Dorland, deputy chief academic officer for teaching and learning in DPS. “It’s not just a system of evaluation. It’s a system that respects the teaching profession.”
DPS test-drives teacher effectiveness
Key to SB 10-191 are comprehensive teacher evaluations to “provide a basis for making decisions in the areas of hiring, compensation, promotion, assignment, professional development, earning and retaining non-probationary status, dismissal, and nonrenewal of contract.” Most teachers now work under collective bargaining rules that place greater emphasis on years in the classroom than on results. Under SB 10-191, at least half of a teacher’s evaluation, beginning in 2014-2015, will be based on his or her students’ academic growth as evidenced by test scores and other, yet-to-be-determined academic measures.
With LEAP, DPS is also experimenting with peer observations, principal observations and student feedback. In addition, the district is piloting meetings between teachers and school leaders to discuss a teacher’s “professionalism” – the things a teacher does that don’t always get captured during a classroom visit, such as relationships with colleagues and parents. Built into LEAP is support for teacher improvement: Books to read, videos to watch, online or in-person classes to take – all available to the teacher via Schoolnet.
“There is not a teacher out there in any classroom who doesn’t want to be the best they can be,” said former LEAP spokeswoman Amy Skinner, who is now working for the Colorado Department of Education as Race to the Top communications director. “It’s the hardest job in the world. You’re not doing it if you don’t want to get results for kids. (LEAP) is about giving them more of that support they’ve never had.”
LEAP began with a 16-school pilot in spring 2011, then expanded to 127 district schools this year — 94 percent of all district schools — resulting in 3,800 teachers going through the process.
A centerpiece of LEAP was the hiring of 45 peer observers – trained and experienced educators with knowledge and expertise in the same subject areas as the teachers they’re evaluating. The $3.8 million price tag of the peer observers comes out of the DPS general fund. The average peer observer salary is nearly $64,000.
Under the old teacher evaluation system, teachers were rated “satisfactory” or “unsatisfactory.” More nuanced information was provided to teachers, but most ranked “satisfactory” nonetheless. Statistically speaking, the ratings didn’t add up. In 2007-08, DPS principals and assistant principals gave unsatisfactory ratings to 33 out of 2,185 teachers evaluated – or 1.5 percent. And that was actually one of the highest percentages of unsatisfactory ratings in any metro district, according to a report in Education News Colorado.
It remains to be seen whether a similar pattern will emerge with LEAP, which uses numerical ratings against four major areas: Positive classroom culture and climate; effective classroom management; masterful content delivery; and high-impact instructional moves, such as checking for understanding of content and language objectives or differentiating lessons based on ability.
A score of 1 or 2 means the teacher is not meeting expectations; a 3 or 4 means a teacher is approaching expectations; a 5 or 6 signals an effective teacher; and 7 is distinguished.
During the first of three evaluation windows this year, teachers were given numeric scores. In the second window, they weren’t. In the third, numeric scores were used again but the framework had changed. As a result, DPS officials declined to release any of the ratings at this time.
“Until we are able to show more data points, it is unfair to share the observation data,” said Skinner.
In the past, teachers also complained about inconsistency in how principals evaluated them. At one school, a principal might call a teacher “top-notch”; at another, a different principal might give the same teacher negative reviews. Politics could also become a factor. And formal observations by principals happened only once every three years.
“It was more about a relationship with an adult as opposed to what you did with the kids,” said Pam Shamburg, a Denver Classroom Teachers Association (DCTA) representative on LEAP.
A look at peer observation
At first, many DPS teachers weren’t happy about unannounced visits to their classrooms by peer observers. But LEAP staffers say teachers are warming up to the idea now that they’re getting used to the observers. Of the teachers who participated in LEAP observations in spring 2011, 81 percent reported they would be able to improve their practice based on feedback, and 74 percent said they would speak positively about the observation and feedback experience to colleagues.
This year, trained peer observers visited teachers at least twice, evaluating them against the original 21-indicator rubric and later against a condensed, 12-point rubric.
Candis Hitchcock, 57, a veteran special education teacher at South High School, said she likes the idea of peer observations – even though she was skeptical at first.
“You’re going to be evaluated no matter what,” Hitchcock said. “It’s nice to have someone from outside come in. My observer was wonderful. She taught special ed, too. Just because I have all these years of experience doesn’t mean I know everything.”
But she worries about all the things an observer doesn’t see – like the time spent running a sensitive IEP meeting with parents, or carefully completing mounds of legal paperwork.
“I would love to be observed holding an IEP meeting,” Hitchcock said.
And she’s not sure other parts of her job are captured, either.
“It’s much more than academics,” she said. “I’m a counselor, a mother, a father, a feeder. I take time to be patient with kids if they’re upset. You can’t say, ‘You can’t do that – we’re doing math right now. You can’t cry.’ There are many things they don’t really see us do.”
Shamburg, though, said other teachers have not been happy with their peer observers – especially when the observers are young and brash, telling a veteran teacher how things should be done.
Building principals also play a key role in whether teachers embrace the peer observations.
“You can feel it when you go into a building,” Shamburg said. “The (teachers’) attitude is mirrored by the principal. They’re not always comfortable having a second eye.”
Childers’ number comes up
As for Childers, he knew Kenney would observe him one more time this school year. He found out five minutes before her visit. For the next 45 minutes, he would be watched closely.
A timer on a cord dangles from Childers’ neck – his way of making sure he stays on track with his lesson plan, which he carries out with military precision. The 20 students sit in clusters, working silently at their desks. They draw pictures and write a sentence to go along with each of four vocabulary words: totalitarianism, fascism, Nazism, and militarism.
Many of his students are English language learners, so images are a key part of building vocabulary.
Kenney occasionally gets up and wanders around the room with her laptop. She listens in on quiet, one-on-one conversations. Sometimes, she asks students questions about what they’re doing, and why.
Childers watches his timer, then moves on to the next segment of the day’s lesson. He instructs students to write down the day’s “content objective.” Today, the objective is to analyze Hitler’s goals for Germany and the reasons for Japanese militarism. He shares stories about his own family members being persecuted in the Holocaust.
A follow-up visit
Kenney is back the next day over Childers’ lunch hour. This time, her visit is no surprise. This is the most delicate part of the LEAP peer observation process. Kenney has to talk to Childers about his teaching in a way that is non-judgmental. She has to keep her opinions out of it, and avoid “should” statements.
They talk about her earlier visit this school year and what he has worked on over the past several months based on Kenney’s first round of feedback. He says he has worked on creating “thoughtful” class groupings, and differentiating assignments. Both agree his classroom management skills are top-notch.
Now, she has to deftly guide him to the conclusion she wants him to reach. She wants to see more passion about the subject matter, more creative ways to engage students in historical events.
“Not a moment is wasted in your class,” she tells him. “While working on things, you supported each kid, gave them feedback on their notes. I saw a difference from last class to this class.”
Kenney asks him to provide more context about the lesson she observed. She wants to know “the big idea.”
He talks about his students being able to write strong, 11-sentence paragraphs, support their opinions, and explain how facts or quotes support certain statements. His first answer is narrower than she wants it to be.
She tries a different tack: Say these kids are all married and have their own kids in high school. They’re now studying World War II. What would these former students – now parents – say about what they learned in Mr. Childers’ class?
Childers pauses, then says students should remember the goals these countries had leading into World War II and the political motivations that led to war, and connect them to current or future situations, such as the conflicts in Iraq or Afghanistan.
Kenney wants more. “In your heart of hearts, what’s really important; what sticks with them?”
“Half of my family is Jewish,” Childers says. “Half escaped; half didn’t. How can these things happen? How did totalitarian regimes come to be? …How can we make sure they don’t happen again in the future?”
In the end, Kenney encourages Childers to go deeper with his lessons. She offers him tangible ideas. She suggests he put students in the role of historian, have them pretend to be journalists on carrier planes when the atomic bomb was dropped. She suggests he have students think about whether they have ever felt repressed and without choices the way people living under totalitarian regimes may feel.
Then she asks Childers how she can do a better job as an observer.
He describes her feedback as “excellent.” He says he liked how she pushed him to think about the big idea, but he’s also a bit frustrated. Considering the amount of time in class and the fact that many students are well below grade level, is it more important to teach a student how to write a topic sentence or emphasize the big picture?
“I think they can do both,” Kenney says, before sending him a link to a book called Reading Like a Historian, along with some tip sheets.
For now, this observation is merely a way to help Childers improve. It has no bearing on his tenure status or movement up the pay scale. But, in 2014, it will – along with his principal’s observations of him; a review of his professionalism, which includes how well he knows his students and their personal backgrounds; student test scores; and student feedback, which asks questions such as, ‘Are you always busy in this class?’ or ‘If you don’t understand something, does the teacher help explain it in a different way?’
What’s next for LEAP
The LEAP pilot will continue next year.
The district will use the revised rubric. Teachers complained the first one was too long, and sometimes redundant. The new one is more focused. The new framework also better integrates instructional technology and best practices for linguistically diverse students. Most importantly, Dorland said, the revised framework is now tied to the Common Core Standards.
Based on teacher feedback in the early pilot, the length of the observation was also increased from 30 to 45 minutes. Ratings summary sheets are now provided to the teacher in advance of the final wrap-up meeting with the observer to make those meetings as efficient and useful as possible.
The principal observations have also not been as strong as they should be, with very few teachers actually having been observed twice during the year by a principal, Shamburg said.
LEAP staffers are now starting to put more work into the student outcomes side of the equation (i.e. test scores), to be piloted next year. The tricky part is what measures to use in non-tested subject areas, such as music, art or library.
For Shamburg, a former lawyer turned educator, adding test scores into the mix demonstrates how “politics has overcome common sense.” To the public, it seems straightforward to link test scores to teacher evaluations. But in DPS, for instance, a majority – or about 70 percent of teachers – do not teach classes in which standardized tests are administered, which means the district must figure out what other reliable assessments to use.
Unlike many of his peers, Childers said he supports the idea of linking student achievement to teacher evaluations – the most controversial aspect of SB 10-191 – with conditions.
“If you didn’t have that, it would be like having a sales job with none of your performance tied to how many sales you made. If there’s not any learning going on, then there’s not any teaching going on.”
But Childers is adamant that the focus needs to be on where a student starts the school year and the growth that student makes while in a class. It is not fair, Childers said, to apply the same benchmark goals to all students without considering where they started the school year. Some of his students start off at a third-grade reading level.
Another huge piece that needs to be worked out is how each piece of the evaluation will be weighted for each teacher.
“The pieces that will be in the new evaluation system aren’t all there yet,” Shamburg acknowledged.
In 2014-2015, when the requirements of SB 10-191 take effect, things will be different. While no single category will result in a teacher losing non-probationary status or being placed on an improvement plan, an overall score will ultimately be used to make these and other decisions. However, non-probationary teachers in the “approaching” category would maintain their status even though their overall rating is not in the “effective” range.
Then, there’s the continued cost of LEAP. The Gates grant runs through next summer. The main ongoing expense is the peer observers. There are sure to be debates about how to best spend the $3.8 million it took to hire them.
The LEAP office continues to seek out feedback from teachers through its website. “We are being deliberately more responsive and more open,” DPS spokesman Mike Vaughn said. “We want to think about this long and hard, and make sure we take the time to get it right…(People) complain about tenure. But there has not been enough attention paid to how broken the support system for teachers has been.”
Teacher views after first peer observation, fall 2011 (percent of respondents agreeing with each statement)
• 66.8 percent – The observer had the subject knowledge to rate the content of my lesson.
• 70 percent – During the feedback meeting, my observer provided feedback that was appropriate for the content of my lesson/grade-level.
• 70 percent – During the feedback meeting, my observer helped me understand which indicators I need to focus on for growth.
• 71.6 percent – During the feedback meeting, my observer facilitated a collaborative discussion of my teaching.
• 60.7 percent – The Framework is a useful tool for self-reflection about my teaching practice.
• 68.7 percent – The feedback experience was positive.
The DPS survey was sent to 3,523 teachers and drew 1,849 responses.