The Organisation for Economic Co-operation and Development (OECD) Development Assistance Committee (DAC) Network on Development Evaluation -- or EvalNet as it is known among evaluation cognoscenti -- celebrated its 30th anniversary in June 2013.  We should all lift a glass to toast this important network, which plays a central role in sharing evidence on the results of international development co-operation to support learning and help hold decision makers to account.

As we celebrate 30 years of EvalNet’s achievements, let’s also pause and imagine where we will be 10 or 20 years from now!  I want to highlight five things that will be instrumental over this period.

In partner and client countries, we will see more and more robust evaluation professionals, who will be better networked into a global community.  Making sure we are one community rather than a fragmented collection of different ones, while respecting the diversity of composition and needs of different groupings, will be a challenge.  This Network is well placed to play that role, which we all need to assume: its evaluation capacity development work reaches across the community to the banks and foundations through the Regional Centers for Learning on Evaluation and Results (the CLEAR Initiative), and, through the embrace of EvalPartners, to the United Nations and non-governmental organizations.

Likewise, becoming a profession will require internationally agreed and accepted professional standards and a process of accreditation.  This is nothing new: I can recall the debates of the early 1990s, when the time was not ripe to form and agree on those standards.  Today, a number of groups are engaged in defining competencies.  To truly strengthen the profession, we need to bring these efforts together into a global consensus; otherwise we will suffer greater fragmentation.

In terms of methodology, great advances have been made in many fields, and the arrival of “big data” that is generated “automatically” expands our fount of information for use in evaluation.  I would also challenge us to look more closely at how we evaluate capacity development.  This has always been considered the highest priority in the development discourse.  Yet, across all organizations, the same mistakes are repeated in the design, implementation, and evaluation of capacity development.

The other methodological challenge we will need to tackle is that of complexity.  This science is “complex” and quite often a bit woolly, because it does not follow our linear thought or speech patterns.  The logical framework approach has been instrumental in shaping evaluation approaches.  It explains cause and effect and is meant to help planners as much as implementers and evaluators.  We have all seen the challenges in coming up with good frameworks and the dire consequences when they were missing.  Going forward, we need to develop the mindset and the tools to make evaluative sense of the much more complex, intertwined, and dynamic reality we live in.  (Richard Straub of the Harvard Business Review wrote an interesting piece on complexity.)

And finally, our influence.  We are evaluators not just because we like the research and the digging into data, or because we enjoy the odd fight an evaluation can provoke when it hits a raw nerve.  Primarily, we do this work because we want to influence change and ensure our institutions do better, learn, grow, and serve their clients.  The disciplines of behavioral science and psychology can teach us some important lessons: people learn better when they feel good about themselves; we humans are less rational than economic theory assumed, and therefore act on feelings, sometimes contrary to what the facts would have us do; and carrot-and-stick incentives work when people perform simple tasks but fail when they are engaged in complex pursuits.  Now, imagine a shift that takes our institutions away from a culture of blame to one that strives for excellence and is inspired by continuous learning, with evaluation playing its part to make that happen.

Comments

Submitted by clay wescott on Fri, 11/01/2013 - 05:51

Evaluators have called in recent years for more effectively transferring research findings to the workplace. Evidence-based management is seen as the way forward to close the research-practice gap, drawing on the analogy of clinical trials in the medical field. However, there is an important difference between medical researchers and public management evaluators: the former are practitioners, while the latter can only give advice. To address this, evaluators need to go through a three-step process to have an impact on management practice. First, they need to develop theories that provide analytic insight into management processes, backed up by evaluative evidence. The complex context we work in makes this analytic work increasingly challenging, as the cited piece by Straub points out. A highly relevant reference for IEG in this domain is Ramalingam, Ben. 2013. Aid on the Edge of Chaos. Oxford: Oxford University Press. Second, they need a strategy for giving advice to managers that draws on this insight. Third, managers need to know how to use this advice. IEG does a reasonable job on the first step, but has done little to apply analytical tools to understand and enhance our work in steps 2 and 3.

Submitted by Bruce McGregor on Mon, 11/04/2013 - 02:57

Hi Caroline, I read your article with interest and could not help feeling your strong desire to facilitate change. I am part of an international team that teaches a course about exploring consciousness called Avatar. The intention of this course is to facilitate change by teaching people the tools to live deliberately; it takes them away from the blame game, empowering them to create their dreams. The shift you wanted us to imagine is actually happening right now, as people around the world use the experiential Avatar tools to reconnect with the intuitive knowing they have become disconnected from. If you are interested in learning about Avatar, go to: www.theavatarcourse.com

Submitted by Caroline Heider on Tue, 11/12/2013 - 06:15

Clay: great comments. Ben and I spoke about his book when he was in the early stages of writing it, and he sent me an early copy. Lots of great food for thought. In terms of "giving advice", IEG has an incredible capacity for outreach activities, more than any other evaluation office I have worked for. We organize seminars and workshops, participate in conferences, and share our evaluations through social media and our website. One such event was our recent workshop in Nigeria, organized together with the World Bank's Africa Region and Human Development Network as well as the IFC's Nigeria Office and Corporate Results Team, to share the lessons from our Youth Employment Evaluation and other relevant studies produced within the Bank Group and outside of it. This event was very well received and fostered learning from evaluation.

Submitted by Caroline Heider on Tue, 11/12/2013 - 06:15

Bruce, thanks for the suggestion. When living in Rome, I had the opportunity to participate in an Avatar training -- a great resource, indeed.
