Saturday, January 18, 2014

Willingness to pay

Recently I was talking to a donor, telling him about a learning network that my team developed over the last year. The conversation went something like this:
Me: “Now that we have a relationship with the partners, they are much more willing to talk to us. In fact, sometimes they seek us out to get guidance or ideas for how to approach something.”
Donor: “Interesting.  So, do you think that there is any impact that you can see from the project yet?”

Those of you in development probably know that impact is a word that should always be written in italics and handled with care. Keep it on the same shelf as “value for money” and “rigorous evaluation.”

As soon as he asked the question, I remembered a conversation I had years ago, with another donor, ex-McKinsey.  We were talking about how much consulting McKinsey does for private foundations and government agencies.  Most of it goes unreported, because it’s a different category of engagement.  But it’s A LOT.  
Me: “Has anyone ever looked to see what the impact of McKinsey is on the firms that hire them?”
Donor: “No.”
Me: “So then how does McKinsey know that it’s delivering any value?”
Donor: “Willingness to pay.”

So here’s a large, successful company whose entire premise is that it can help other companies work more effectively, and yet it has not one shred of rigorous evidence of its impact. Still, it continues to get plenty of business. In development, though, we apply a very different standard, one with a much higher bar. I’m expected to be able to show the impact of a learning network, not merely the fact that practitioners themselves demonstrate demand for it.

In theory, I agree that we should set the bar high. What concerns me is the sheer amount of money that goes into monitoring and research. If it comes from a finite pot, then less money can go into program implementation, which presumably hurts overall results. Our willingness to pay for the measurement of results is too high, and still climbing.

When you look through publicly available budget and spending information, there’s no line for impact evaluation. Monitoring and external evaluations are usually built into project grants, so I assume they are counted as part of implementation. How much is this commitment to rigorous evaluation costing us? Is our high standard worth the trade-off, the reduction in money actually put into programs?

Some suggest that the answer is to conduct even more research. The authors of the recently published Investments to End Poverty told the Guardian:
"Good data is essential to global efforts to end poverty. It is needed to assess the prevalence and location of poverty. It is needed to inform decision-making, to quantify, allocate and track resources and to measure the effectiveness of investments. And it is needed to empower people in whose name resources are being spent to demand accountability."

Good data can improve programs, and research can guide the way to better interventions. But research is also a tool with costs of its own, so we need to think carefully about the marginal returns of raising the bar still higher, and about the significant trade-offs.

My suggestions:
  1. Pay more attention to, and report on, the costs of monitoring and evaluation. Despite the emphasis on transparency, there’s still a lot we can’t see about where the money is going (see this spending breakdown from the USAID website for an example). How hard would it be to include the total research budget in the methods section of academic papers? I find it ironic that we can get so excited about rigorously evaluating the costs and effectiveness of multiple interventions (e.g. should we invest in computers or in teacher training if we want to improve educational outcomes?), without any thought to the amount of money it took to get those answers.
  2. Get creative about combining multiple purposes into one data stream. There are ways to integrate data collection into programming that bring down costs and increase the data’s use in programming. One example that I’ll explore more fully in a later post is the change monitoring system that Shiree put in place for all its implementing partners. As part of normal field operations, front-line staff spend about 10 minutes per household collecting data, which is uploaded to a central system. Once the project is over, Shiree will be looking for researchers to use this huge data set to better understand the dynamics of ultra poverty and the impact of the program. That is much cheaper than contracting a research group to collect the data itself on top of the program’s own monitoring activities. They’ve created a public portal to the data that’s worth exploring, especially if you are a PhD student looking for a dissertation topic…
  3. Focus on indicators that are easy to measure. Most measures are imperfect, and yet the world goes round. Think about it: the SAT, the test you take to get a driver’s license, a standard physical exam at the doctor’s office… we could likely raise the bar and find more rigorous ways to do these very important evaluations, but they work and are fairly simple. We can apply similar principles to development. These would include:
  • Food security: something along the lines of the “kichuri index” suggested here, which compares the cost of a common dish (rice, lentils, and fixings) with wages (sketched in the code after this list).
  • Health and wealth status: country after country has gone through the epidemiologic transition, in which, as basic living conditions improve, people start dying from chronic conditions rather than infections and, for the first time, worry about too many calories rather than starvation. Availability of Diet Coke is a good proxy.
  • Gender equality: pick a random sample of intersections in a city and check the gender ratio of the people on the sidewalks there (for extra points, separate the vendors from the bystanders). A ratio near 50/50 suggests that women and men have equal mobility, which is a good sign for other types of equality. (This one is also sketched below.)
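
To show just how light these measurements could be, here is a minimal Python sketch of the first and third indicators. It is an illustration only: the ingredient prices, the daily wage, and the intersection tallies are all made-up numbers, and the two helper functions are my own hypothetical naming, not an established methodology.

```python
import random

# --- "Kichuri index": the cost of one family-sized pot of kichuri
#     (rice, lentils, and fixings) as a share of a day's wage.
#     All prices and quantities here are hypothetical illustrations.
PRICES_PER_KG = {"rice": 40.0, "lentils": 90.0, "onions": 30.0, "oil": 120.0}
POT_RECIPE_KG = {"rice": 0.5, "lentils": 0.25, "onions": 0.1, "oil": 0.05}

def kichuri_index(daily_wage):
    """Return the fraction of a day's wage needed for one pot of kichuri."""
    pot_cost = sum(PRICES_PER_KG[item] * qty for item, qty in POT_RECIPE_KG.items())
    return pot_cost / daily_wage

# --- Sidewalk gender ratio: draw a random sample of intersections,
#     tally women and men on the sidewalks, report the share of women.
def sidewalk_share_of_women(tallies, sample_size, seed=0):
    """tallies: list of (women, men) counts, one pair per intersection."""
    rng = random.Random(seed)
    sample = rng.sample(tallies, sample_size)
    women = sum(w for w, _ in sample)
    men = sum(m for _, m in sample)
    return women / (women + men)

if __name__ == "__main__":
    print(f"Kichuri index: {kichuri_index(daily_wage=250.0):.1%} of a day's wage")
    # Hypothetical tallies from six intersections; sample four of them.
    tallies = [(12, 30), (8, 25), (20, 22), (5, 40), (15, 18), (9, 27)]
    print(f"Share of women on sidewalks: {sidewalk_share_of_women(tallies, 4):.0%}")
```

The point is that each index is a few lines of arithmetic on data a field team could collect in passing, not a stand-alone research contract.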

Other ideas?
