Gartner, a leading IT research company, recently published its annual “Hype Cycle for Emerging Technologies,” which plots the hype level of new technological innovations (spoiler alert: “big data” is just hitting the “peak of inflated expectations” and is headed for the “trough of disillusionment”!). In today’s world, new fads burst on the scene every fifteen minutes, and the pace of life seems to accelerate in step with ever greater development challenges, from climate change to poverty eradication.

Will these fads come to the rescue when we deal with complex problems? Yes, we need to innovate, but we also need to base those new solutions on past experience. We at the Independent Evaluation Group are fortunate to be able to draw on several decades of experience and analysis of what works and what doesn’t, sifting through the stories and data to identify relevant solutions grounded in the reality of demonstrated results. This blog is about how evaluation can be a transformational source of evidence to accelerate the impact and effectiveness of development work.

I have been a professional evaluator for more than 25 years in a variety of international development agencies. In those years, I have seen how good evaluation work can generate critical insights that help development professionals tackle development challenges more effectively, as well as cases where weak evaluation has muddied the waters – more on that in a later blog.

One thing that has been essential for delivering difficult messages and new perspectives is independence, which gives us evaluators the invaluable opportunity to

  • reflect on whether and how things could have worked differently and better,

  • add value by looking for patterns and making connections others might miss, and

  • critically examine commonly held positions and seek evidence to stimulate debate when paradigms need questioning.

The Luxury of Stepping Back. Time is ticking, deadlines are constant, and many stakeholders have different demands and ideas; these are just a few of the challenges in the daily life of development practitioners. As evaluators, we enjoy an often undervalued luxury: stepping back to take another look and exercise the wisdom of hindsight. But isn’t learning from the past bound to keep us stuck right there, generating lessons we have long left behind and ignoring the most pressing new challenges? It all depends. Looking at a portfolio over a 10-year period – something we typically do in our larger evaluations – includes more recent work, examined not from the perspective of outcomes but of how the program has developed. And benchmarking older policies, programs, or investments against newer standards helps us understand what is needed to close the gap to the latest standards. We use the advantage of hindsight to look at the past through a different lens and ask: could we have known this at the outset, and what if we had?

Unexpected Connections. Data descends upon us like an avalanche, and more and more programs can present information in smart-looking ways, often automatically processing massive amounts of detail and churning out an image to help us cope with the information overload. Or not: as we know, infographics can be confusing and misleading, and word clouds, great as they are, don’t tell us the deeper meaning of the data. Making sense of it all is the mainstay of evaluation: we collect data and information, analyze it, and pull it together into a coherent picture. The most influential evaluations reveal patterns people are not aware of or draw connections others have not seen, sometimes bridging disciplines and ideas in uncommon ways. It is this type of evaluative thinking and analysis that generates the insights that will help practitioners learn from the past for a radically different future.

Iconoclastic Is Good. Many development practitioners face the challenging tension of working within a bureaucracy that requires certain behaviors while innovating to find new solutions to intractable development challenges. This is where the second luxury of evaluation comes in: asking the difficult questions, and seeking and accepting evidence even when it does not confirm long-held convictions. It is also one of evaluation’s most fundamental challenges, as it requires us evaluators to set aside conventional wisdom and look at evidence impartially – accepting truths that might run contrary to shared understandings – and to have the courage to defend uncommon and sometimes uncomfortable findings. But in this day of uncertainty about so many aspects of life, it is a quality that will help build bridges from the past into a future where long-held beliefs might prove untenable.

I am excited to use these unique advantages to share ideas and stimulate discussion. No fads from us – just some straight-shooting reflections on how the development community can promote evidence-based solutions to help our clients.

Comments

Submitted by Ian Goldman on Wed, 10/30/2013 - 23:45

Thanks Caroline. Interested to see the response. I was training members of parliament (in South Africa) yesterday, and they are very anxious to get the evaluations; they see this as a real value-add for their oversight role. Regards

Submitted by Caroline Heider on Tue, 11/12/2013 - 06:08

Ian, your work with Parliamentarians is groundbreaking and of interest to many countries -- I hope you will have time to write it up and share it -- how about a guest blog entry?

Submitted by Andrew Stone on Tue, 11/12/2013 - 00:35

Your blog brings to mind a fundamental challenge to us as evaluators -- how to make evaluative findings as attractive, digestible, and accessible as the latest development fad. Those of us at WBG for a while have seen a single project or country experience, usually taken entirely out of context, inflated, mythologized, and replicated with highly varying results. Without systematic attention to the true characteristics of the project, its real impact, the context for that impact, and the transferability of the experience, it is hard to expect success in proportion to the likely scale of replication. How can we as evaluators make the findings of our evaluations -- slower to produce, more complex to understand and heed, and more conditional in nature -- as appealing and accessible to development professionals as the propagators of development fads make theirs?

Submitted by Caroline Heider on Wed, 11/13/2013 - 06:53

Andy, you are right -- competing with fads, we have to come up with things that are attractive and exciting enough to capture people's minds and passions. I don't think there is a "one size fits all" or magic solution for achieving this. In my own experience, the things that matter most and get people interested come when we step back and look at things from a different perspective, not taking them out of context but adding value. Here are two specific examples.

Capacity development in Lao PDR. In this evaluation (done for AsDB), I used a concept for capacity development that was eventually codified by the OECD/DAC. It put the various initiatives in a framework that helped make strategic choices and develop synergies between them. Years later, when returning to Vientiane on a WFP mission, we had discussions with the World Bank office, who told us they were using the AsDB evaluation for their capacity development program. It contained enough food for thought to inspire a rethinking of their program.

Country Program Evaluation in Mongolia. Similarly, in this evaluation I took a very different look at the country strategy and program, one that contextualized the investments, ESW, and TA work in Mongolia's transition process. It was an evaluation of what had been achieved, but it also asked whether the choices were strategic given the development challenges of the country, and systematic. For instance, were policy prescriptions applied in similar ways across sectors, or did it depend on who was leading the work? As part of the evaluation, we had interim presentations and discussions with Management, which engaged them and prepared them for the findings. The final meeting focused entirely on Mongolia and its challenges rather than on arguing whether the findings were right or wrong. The discussion at the Development Effectiveness Committee suggested the report should guide work in all of the Central Asian Republics, and the country economist contacted me even after I had left AsDB to thank me once more -- she had just reread the report.

In short: stepping back, making unexpected connections, and being iconoclastic, combined with an engaging process, can be very effective in generating exciting and lasting findings that can compete with fads. A number of our IEG evaluations are equally instrumental in this way.
