A wannabe hot-shot technical advisor, I had just enough knowledge to be useful and just enough naïveté to be dangerous.
I was new to the country office and eager to demonstrate my monitoring and evaluation (M&E) prowess.
The task at hand was completing a mid-term review of a “problematic” implementing partner whose work focused on HIV and food security programming. There was no baseline, but go for it.
Oh, and this particular country office needed to put partners “on notice.” (No wonder M&E is associated with scrutiny, policing, fear and confusion by implementing and local organizations.) The country office’s partnerships in general had been weak since the reign of an insane country director. The message from management was that it was time to “bring down the hammer” and “whip these partners into shape.” (Note the courtroom sentencing and slavery origins of these idioms.)
So my two colleagues and I traveled to the partner’s office and set up camp. Three full days of key informant interviews and focus group discussions. If we couldn’t go quantitative, we would make the best of what we could learn, with the participatory element of the evaluation as our focus.
What we found was not surprising. Some seeds and fertilizers distributed here and there, but late. Some HIV training “sessions” but no follow-up for voluntary counseling and testing or orphan support. In essence, “they [the partner staff] come from time to time” but I could not detect any deep relationships between the organization and the people they were serving from my vantage point.
And at the end of every day, as these results came in, I would post them on flip chart paper (an aid worker’s most important tool – another thing I wish I had learned in grad school) in the entrance hallway of the office.
As my preconceived expectations for the evaluation were being met, I wanted to make the evaluation “findings” as transparent as possible. When we observed that the organization’s driver and vehicle were being dispatched each day to pick up and drop off the director’s granddaughter from school, it was the cherry on top.
We went back to our office, flip charts in tow, to compile the report. The evaluation report flowed as I wrote it the next week, my “got em” mentality reigning supreme. It was scathing, but I felt it was honest, and given that we had shared the preliminary results, it should hardly have come as a surprise.
But it did.
The implementing partner requested a meeting. They felt called out and wanted a chance to respond.
I went into the meeting confident that the report, while a bitter pill to swallow, did indeed represent what we had found on the ground.
I honestly don’t remember much about that meeting. I know that because it was my strong written words that were at issue, I had enough sense to be fairly silent.
In the end, there was an agreement to add details and “tone down” the language of the report without altering any of the findings.
Looking back on this experience now, my hubris in writing that report with such a “gotcha” mentality is regrettable. But in our lives, in our relationships, it’s often the breakdowns and mistakes that make us more sure of who we are, that remind us of our connections to each other and of what’s most important.
I left the country office a year later and I don’t, in the end, know if the partner made changes to their programs or to their organization.
But what I do know now is that when you’re looking for what’s wrong, you’re certainly going to find it.
This post originally appeared at: http://www.how-matters.org/2011/04/17/got-em-an-evaluation-story/