Where Is the Village?

It takes a village.  Really?

Since I became an educator about 15 years ago, I have had a sinking feeling every time I have heard the famous West African proverb, “It takes a village to raise a child.”  I’ve actually felt sick to my stomach hearing it while immersed in the culture of our school system.  I could never pin down exactly why until I had the opposite feeling on October 18—the night of our first alumni reunion.

I’ve had that sinking feeling because I’ve known, even though I have not been able to articulate that knowledge, that the box we put ourselves in when we think about “school” doesn’t create the environment we mean when we talk about a village.  In terms of school life, our reunion was countercultural.  Three years—middle school, practically a lifetime—had gone by, but there were our babies, all growing up.  Reveling in their individuality, they told us how proud they were of where and who they were, and how prepared they felt after graduating from Harlem Link.

And there we were for the strugglers, playing the role of critical friend now that we don’t have to be the enforcer.  Because we are no longer in a position of authority, we can engage in a safe, different and productive way those few children who reject seemingly all behavioral interventions.

“I don’t have to tell you what to do anymore,” I told one scholar who had been asked to leave two schools since graduating from ours.  “And you don’t have to take my ___ anymore, either.  I’m still going to tell you the same things I used to, but now you can be sure it’s because I care about you and not just because I am doing my job and bossing you around.”  She smiled—sheepishly.

Where is the village?

Not only do our peer schools not keep up with alumni, but legal interpretations of FERPA actively discourage us from being involved in the lives of our students once they leave; under current federal FERPA regulations, former schools are no longer “interested parties” with a right to access student contact information.  More important, and I think one reason for this regulation, staying in touch and staying involved are not values of the school system.

I’ve never been to a West African village (the proverb is generally attributed to the Igbo people of Nigeria), but because of my own experience living among a swarm of relatives as a child, I’m going to guess that those villages are organized sort of like my own Italian immigrant family.

Let me focus on one aspect of my childhood: intergenerational relations.  In my family, old people continue to be part of the fabric.  When their kids have grown up and moved on, when their careers have come to an end, when they don’t need as much living space as they used to, they don’t go off to a nursing home to die.  They move in with their children.

My grandfather lived with my aunt and her husband for the last 15 years of his life.  Discharged from intensive care in a hospital when a medical storm seemed to have passed, he died at home, sleeping on an easy chair, as peaceful as the breeze.  I was 10 years old.

Nonno wasn’t some distant old man whom we made special trips to visit; he was as much a part of the family as the aunt and uncle he lived with, or another aunt and uncle who lived next door to me.

He transmitted faith in a way no religion possibly could.  I knew that Nonno prayed to his wife’s memory and for her peace, while looking up at an old, brown photograph of her hanging in his small room, every night from when she died suddenly in 1956 until his last days more than 30 years later.

He was the genial, appreciative father figure who watched cartoons with me every Saturday morning while my older sister was off doing activities.  He didn’t understand English, but he understood joy and love, and he gave them even more than he received them.

When he died I didn’t understand shock, so I wasn’t sure why I didn’t cry for three days, but then I couldn’t stop.  We were at my aunt and uncle’s house when it hit me that he wasn’t coming back.  My cousin took me to her room and laid me down on the bed.  “I’m not a baby,” I thought, “but, okay, I will lie down and cry.”

My mother had had a zia (an aunt) in Astoria who lived with her daughter and son-in-law until she died not long before Nonno did.  So the concept of death wasn’t new to me when Nonno died.  But there we all were, in mourning in our own ways together, and unsure how we could deal with life without him.

Nonno was important to each of us in a different way.  Whether he was giving guidance to his children, handing out orange Tic Tacs to his 11 grandchildren, making funny attempts at broken English or showing the example of a life faithfully and earnestly lived to everyone, he gave something to everyone in our family.

I do not see this in our school system.  In fact, I don’t see anything even remotely like it.  The impulse of most adults in the school system seems to be to care, just not too much.  I was told about this boundary over and over again when I started teaching.

This is the advice I heard: Don’t get too involved in the lives of your students, because you are bound to be disappointed.  And anyway, who knows where they are going to end up?  What could you, their third grade teacher, do to keep them out of prison?  Their home life is such a mess.  And don’t go visit to learn about it first-hand; take my word for it. Besides, it’s probably dangerous.

I hope it is self-evident how destructive these words are.

But it isn’t just the attitude that’s the problem.  It’s the structure—or, rather, the lack of structures that encourage longitudinal thinking about children and meaningful, ongoing relationships with them.

With test-driven accountability as a sole measure of a child’s progress, everything is placed on this year’s teacher, as if Johnny walked into my classroom as a blank slate.  Calculating “value added” by considering where Johnny scored last year and how far his teacher can move his score this year (rather than his final test score alone) addresses this problem the way Google Translate would have helped Nonno learn English.  It would have been a nice tool (had it existed in 1987), but what was really needed was the human connection.  Value-added data formulations acknowledge that children have actually had prior experiences with other teachers, and that’s an important start, but they do nothing to address the future relationship between today’s teacher and tomorrow’s alumnus.
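The difference between the two lenses can be made concrete with a toy calculation.  This is a minimal sketch in Python; the function names and all of the scores are invented for illustration, not drawn from any real data.

```python
# Toy illustration (invented numbers): how the same two students look under
# a raw-score lens versus a value-added lens.

def raw_score(student):
    """Accountability by this year's test score alone."""
    return student["this_year"]

def value_added(student):
    """Growth from last year's score to this year's."""
    return student["this_year"] - student["last_year"]

students = {
    "Johnny": {"last_year": 30, "this_year": 55},  # low scorer, big growth
    "Jane": {"last_year": 80, "this_year": 82},    # high scorer, little growth
}

# Under raw scores Jane looks far stronger; under value added, Johnny does.
for name, s in students.items():
    print(name, "raw:", raw_score(s), "value added:", value_added(s))
```

The sketch makes the point in the text: value added credits Johnny’s growth, but neither lens says anything about the relationship between this year’s teacher and tomorrow’s alumnus.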

The older residents of the village

I would like to see us reframe our thinking about the outcome of “school.”  It’s not only about this year’s test, or this year’s graduates, or this year’s teaching and learning.  It’s about the impact of our work and our relationships on the lives of the children in our care.  And those lives extend far into the future, rather than coming to a full stop at the end of June.

The older residents of the village, it turns out, aren’t the people you picture when you think about a school community.  I’m not talking about the gray-haired principal, or the wizened special education teacher, or the grandfather who volunteers with the PTA.  I’m talking about the alumni, who are now downright invisible.

In a time when schools and communities are clamoring for more support, alumni are a powerful untapped resource for our school system.  And they are continually ignored, because to many members of our school communities, they seem irrelevant—or, worse, threatening.

Difficult questions

I believe alumni are also ignored because it’s so darn hard to pin them down.  How would you evaluate the impact of a teacher and a school on a child 20 or 30 years into the future?  The task is even more confounding at a time when a third or even half of a school’s teachers might have moved on to other schools or careers by next year.

If it truly “takes a village” then why does our definition of village end in June?  How can we preserve the relationships that teachers form with their students longitudinally at a time of so much change and movement?

I don’t care about the difficulty of answering these questions.  Even a cursory examination of them reveals all sorts of interesting assumptions we are making.

The problem is that no one seems to be asking them.

Survivorship Bias

A hidden bias

Out of 90 charter schools that administered the New York State standardized tests in both 2011 and 2012, Harlem Link had the 8th highest average increase in English Language Arts (ELA) and Math scores.  This score improvement was amazing, fantastic, even inspiring.  And misleading—because of a small, relatively unknown factor called survivorship bias.

Survivorship bias is a statistical term for the distortion that arises when some hidden factor removes certain members of a data set over time—part of a sample that was there at the beginning is no longer there at the end and does not count in the final analysis.  The smaller subset of those who “survive” might look better off than the original whole group simply because of who stayed and who left, not because of any value added over time.

Simply put, every year, at every school, some students leave, and their departure changes the profile of who takes the test from year to year.  Sometimes high-scoring students depart.  At other times low-scoring students depart.

If schools continuously enroll new students (and some don’t), the same factor shapes the student population as those newcomers arrive.  At the end of this post I chart a hypothetical in which a school appears to improve, while adding no value at all, simply by not adding new students year after year.

In large systems, there is so much mobility that these student profiles tend to cancel each other out because of scale.  For example, the student population appears relatively stable from year to year in the third grade in Community School District 3, where 1,342 students in 30 schools took the state English Language Arts exam in 2012.  But in small student populations like the one at Harlem Link, where only 52 third graders took the 2012 exam, a few students entering or leaving the school with certain test scores can make a big difference.

When the state department of education releases test scores each year, however, it does not provide this or any other contextual background information alongside the scores.  I believe that this process penalizes, in the public eye, schools that continue to enroll new students to replace those who depart.

(Partly) illusory gains

At Harlem Link, the fact that we test in only three grades guarantees that at least a third of the students taking the tests each year will be different from those who took them the year before.  Putting aside the variability in the state test from year to year, this rolling of the dice has caused some dramatic swings in achievement: our school’s test scores have looked worse than the actual performance of our teachers in some years, and at other times (like this year) they may have looked better than they really were.

It turns out that the profile of our students who departed before the last school year was a much less successful one than the profile of the group that left the prior year.  In other words, we had to improve less to get apparently lofty gains.

In English Language Arts, we saw an improvement of 18 percentage points from 2011 to 2012, according to the state’s way of reporting the scores.  But since many of the students who graduated in 2011 or left for other reasons after the 2010-11 academic year performed poorly on the 2011 exams, the students who returned had a 2011 passing rate 10 percentage points higher than the original group’s.  In other words, more than half of our test score gains in ELA could be accounted for by attrition.
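The arithmetic behind that claim can be sketched as follows.  The 18-point gain and the 10-point returner advantage come from the text; the 40 percent baseline pass rate is an invented placeholder, so the decomposition, not the specific rates, is the point.

```python
# Decompose an apparent score gain into an attrition effect and real
# improvement.  The 18- and 10-point figures are from the text; the
# baseline pass rate is invented for illustration.

pass_rate_2011_all = 40.0        # invented baseline: 2011 pass rate, all testers
pass_rate_2011_returners = 50.0  # returners' 2011 rate was 10 points higher
pass_rate_2012 = 58.0            # implies the reported 18-point gain

attrition_effect = pass_rate_2011_returners - pass_rate_2011_all  # 10 points
true_improvement = pass_rate_2012 - pass_rate_2011_returners      # 8 points
apparent_gain = pass_rate_2012 - pass_rate_2011_all               # 18 points

# More than half of the apparent gain (10 of 18 points) is attrition.
assert attrition_effect + true_improvement == apparent_gain
```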

Now, I’m not going to say that I’m not proud of our scores or that they are not indicative of a powerful effort by talented and dedicated professionals.  I’m not even going to tell you that we didn’t improve our practice last year.  I think we have improved in that area every year, because we have been an honest, self-examining, learning organization.  But the wide swing in test scores and the state’s failure to describe enrollment patterns when reporting the scores mask the true story of a gradual, continuous march to improvement that is the real hallmark of the growth at Harlem Link.

Best practices often begin as difficult, controversial and seemingly impossible changes to “the way things are.”  Strong schools take the time required to plan, assess and tweak new initiatives until they become standard operating procedures.  The lack of information provided alongside scores obscures this type of growth, creating perverse incentives for schools to “push out” students who are low performers and to “quick fix” by whittling down large original cohorts to smaller groups of survivors, uncompromised by new admittees.

At Harlem Link, we have resisted these perverse incentives.  We have always replaced students who leave, for budgetary reasons (being a small, standalone charter school) and to serve a greater portion of the community starved for high-quality school choices.  Each year, we have encouraged some students who are particularly high achieving to leave a year early by helping them apply to competitive public and independent middle schools that only admit in fifth grade, reasoning that we’d rather lose their strong fifth grade test scores than see them lose an opportunity to get firmly on the college track a year ahead of their peers.  If we followed the short-sighted state incentive, we would not have urged four of our highest-scoring fourth graders on the state exams in 2012 to apply to and enter the Upper West Side’s highly sought-after Center School.  They were admitted and are all attending—a fact that may well push down our fifth grade test scores by as much as 10 percentage points next year—and we are thrilled, because we helped four more students living in a high-poverty environment to gain admission to this exclusive public school.  We also would not have encouraged students to leave after fourth grade in years past to embark on the independent school track by attending the East Harlem School at Exodus House and the George Jackson Academy in lower Manhattan.

In the context of reform

This issue has been raised before in the blogosphere, but not in a thoughtful manner.  Instead, it has been wielded as a weapon by those who are against the current strain of education reform.  It has been used to defeat the straw man argument that charters are silver bullets and to denigrate the success of networks like KIPP, an organization that does not deserve such uninformed criticism.  (Each year, KIPP asks itself several questions in its annual internal reporting, including, “Are we serving the students who need us?”)

Because it is potentially embarrassing and might burst the balloon of so-called charter education miracles, this issue has also (to my knowledge) been ignored publicly by my colleagues in the charter community.  There are many groups of charter schools that go happily on their way winnowing down their large kindergarten classes, educating fewer and fewer students in each cohort each year, not adding new students and narrowing their challenge to serving fewer and fewer “survivor” students well.  And those charters that benefit from network infrastructure and economies of scale can balance their budgets even while shrinking six to eight kindergarten sections down to three or four fifth grade sections.

I’m not passing judgment on those networks.  As a charter school founder who has been running a school for almost 10 years, I still believe that the charter experiment has been a profoundly positive one for the communities where such schools have flourished.  What I want is for the public to have some understanding of the context behind test scores, so alleged miracles can be put in their proper place, and year to year statistical swings that have nothing to do with a school community’s actual performance can be put into their proper perspective.

Hypothetical (with some assumptions): survivorship bias in action

In the example below, compare two schools that start out with similar student profiles.  School A replaces each student who departs.  School B does not.

Each year at both schools, a greater percentage of academically struggling students than successful students leave.  Neither school adds any value, since no individual student’s test scores change.

Because the entering students at School A are similarly academically disadvantaged to those who depart, its scores do not change.  School B’s scores improve more than 20 percentage points—simply by virtue of attrition, the decision not to enroll new students, and the mix of which students are taking the test each year.

School A = Enrolls new students continuously
School B = Does not enroll new students

School A

Grade                        5      6      7      8
Passing students added      40      5      5      5
Failing students added      60     15     15     15
Passing students leaving     0      5      5      5
Failing students leaving     0     15     15     15
Total Passing Students      40     40     40     40
Total Failing Students      60     60     60     60
Pct. Passing             40.0%  40.0%  40.0%  40.0%

School B

Grade                        5      6      7      8
Passing students added      40      0      0      0
Failing students added      60      0      0      0
Passing students leaving     0      5      5      5
Failing students leaving     0     15     15     15
Total Passing Students      40     35     30     25
Total Failing Students      60     45     30     15
Pct. Passing             40.0%  43.8%  50.0%  62.5%
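The two columns of figures above can be reproduced with a short simulation.  The function name and structure are mine, but the enrollment numbers match the hypothetical exactly.

```python
# Reproduce the hypothetical: each year 5 passing and 15 failing students
# leave.  School A backfills with a similar mix of new students; School B
# does not.  No individual score ever changes, yet School B's rate climbs.

def pass_rates(passing, failing, years, backfill):
    rates = [passing / (passing + failing)]
    for _ in range(years):
        passing -= 5   # passing students leaving
        failing -= 15  # failing students leaving
        if backfill:
            passing += 5   # passing students added
            failing += 15  # failing students added
        rates.append(passing / (passing + failing))
    return [round(100 * r, 1) for r in rates]

print("School A:", pass_rates(40, 60, 3, backfill=True))
print("School B:", pass_rates(40, 60, 3, backfill=False))
# School A: [40.0, 40.0, 40.0, 40.0]
# School B: [40.0, 43.8, 50.0, 62.5]
```

The only difference between the two runs is the backfill flag; the 22.5-point “gain” at School B is pure survivorship bias.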

 This post also appeared at GothamSchools.

Make Teacher Peer Evaluation Happen

Forget the fuss.  Let’s put in the hard work required to involve teachers in evaluating each other.

As is often the case when complex topics are debated in the media, creative thinking is a casualty in the current hubbub over whether school districts should publicly release teacher value-added scores.  Reformers on both “sides” are digging their trenches so fast and sternly that they are missing the busy bees on the surface spreading good ideas like pollen.

Deputy Chancellor John White, even while staunchly defending the controversial metric, said much the same himself in a letter to the New York Times published on New Year’s Day: “It would be unfair to claim that any one statistic, such as newly developed ‘value-added data,’ should stand alone as definitive evidence of a teacher’s effectiveness.”

One such idea is including a peer component in teacher evaluations.  This practice has been tried in some districts, most notably in Ohio, in some form since the 1980s.  But the fact that it requires a nuanced and locally specific structure – a strength that is a counterbalance to value-added data – makes it difficult if not impossible to bring to scale.  For this reason and others, I’m doubtful that the idea has been given a fair shake.

In other professions, peer evaluation is the norm, along with feedback from one’s superiors and direct reports.  Teachers deserve “360-degree feedback”; they occupy one of the most complex, demanding professions around.  Supervisors should have at their disposal more data and more diverse sources of information.

I know from experience that as a teacher, surrounded by your four walls and focused so tightly on your own classroom, it’s hard to see the big picture of what a school community needs as a whole.  But teachers have valid opinions about the practices of their colleagues that simply can’t be ignored, opinions that are sometimes more pointed and helpful than those of administrators.

At our school, building the boat as we’ve been sailing for the past five years, we have not yet incorporated peer feedback into teacher evaluation in a meaningful way.  But we have laid the groundwork with teacher leadership through school walkthroughs, teacher-facilitated lesson study observations and teacher coaching. 

We already trust our teachers to give meaningful feedback.  For example, teachers are heavily involved in the hiring process.  Our teachers observe and debrief teacher candidates’ demonstration lessons, and recommend student teachers for a temporary or permanent assignment when a vacancy arises.  Anecdotal evidence tells me that teacher leadership opportunities (outside of the traditional, Peter Principle-plagued ladder climbing to Assistant Principal, Principal, district office, etc.) are scattershot, but these feedback opportunities in the hiring process are more commonplace.  Should teachers’ opinions stop counting once a colleague signs on the dotted line?

Some critics of this idea – especially those stumbling over the sins of a toxic work environment – would question teachers’ willingness to criticize their colleagues when necessary.  After all, why wouldn’t a teacher concerned about protecting his or her own hide take it easy on a colleague in exchange for the same treatment back?  One clear answer is that a strong process with purposeful layers of feedback and oversight will prevent such an indulgence.

A better answer is that as constant learners, committed teachers are their own harshest critics—and when relationships with families and students are involved, that value extends to colleagues.  Those who work hard and take pride in their students’ success, who have strong bonds throughout the school community and are focused on children, will not stand for a teacher in the next grade faltering and ruining their hard work.  By the same token, imagine the motivation to help a teacher in a lower grade get it right when the evaluator will be teaching the affected students in only a few months! 

Finally, if folks aren’t willing to be honest when it comes to student results, then the school community has bigger problems than incompetence.  In other words, a modicum of two-way trust is required for this process to work.  But rather than an obstacle, that’s another reason why fretting over value-added data is beside the point; without a supportive, trusting environment, it’s a fool’s errand to evaluate teachers anyway.

This post also appears at the Bank Street Alumni Blog.
