Today there is hardly an evaluator who has not heard that results are needed at a much faster pace. Decision-makers identify an issue and want answers in the shortest possible time. Fair enough: after all, we want evidence-based decision-making, and we want demand for our evaluation work.

In the past I have written a number of times about how important it is to make strategic choices when selecting topics for evaluation. A high-performing evaluation function anticipates what evidence is needed and when, sometimes even before decision-makers know or ask for it.

Here, instead, I want to talk about making evaluation more agile. How can evaluators shorten response times without compromising the quality of analysis, the strength of the evidence base, or the validity of findings?

In some instances, the simple answer is to make the evaluation simpler. Instead of trying to cover the entire ground, keeping the evaluation questions and scope narrowly defined can help. But it really depends on what decision-makers want to know: if the scope is too narrow, their follow-up questions may leave us drawing blanks.

Thirty years ago, data mining was a different thing. Analyses used to take a lot of painstaking, time-consuming reading, extracting and compiling of information. Techniques relied on paper and pen, highlighters and copious notes in margins, and eventually spreadsheets. All of this to ensure information was compared and contrasted in as objective a manner as possible.

Now we have so much more information, and I cannot imagine sifting through all of it manually. Thankfully, technology has made it easier to work through piles of documents and data, and to do so more systematically and faster.

Here are some of the things we have been doing at IEG.

Standardized data (where it makes sense). For evaluations that follow a standard approach, it makes sense to have a common understanding of which data are needed and which sources to draw on. For instance, project evaluations that require certain country information to set the context, or industry data as benchmarks, become nimbler when they tap into common sources and a shared understanding of the required data profile. This gets the evaluation team to the analytical part of the job much faster.
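
A minimal sketch of what such a shared data profile can look like in code, assuming a hypothetical shared CSV and illustrative column names (these are not IEG's actual sources):

```python
import pandas as pd

# Hypothetical shared source; file name and columns are illustrative.
COUNTRY_CONTEXT = "shared/country_context.csv"

# The fields every project evaluation pulls, agreed in advance by the team.
STANDARD_PROFILE = ["country", "year", "population", "gdp_per_capita"]

def build_context(country: str, year: int) -> pd.DataFrame:
    """Assemble the agreed country-context profile for one evaluation."""
    context = pd.read_csv(COUNTRY_CONTEXT)
    match = (context["country"] == country) & (context["year"] == year)
    return context.loc[match, STANDARD_PROFILE]
```

Because the profile is agreed once, every team queries the same fields in the same way, and debates about which numbers to use happen before the evaluation starts, not during it.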

Algorithms. Today, computer applications have become so user-friendly that certain algorithms are relatively easy to design. I say so carefully, as it is really our capable evaluation and data analysts who write the code and speed up their analytical work. It has been very impressive to see what is possible today: from tabulating information automatically as soon as new inputs come in, to drawing information from various sources, to generating visuals that help interpret and present data more easily and convincingly.
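
To make this concrete, here is a minimal sketch of automatic tabulation and charting in Python; the input file and the column names (region, rating) are assumptions for illustration:

```python
import pandas as pd
import matplotlib.pyplot as plt

def refresh_tabulation(path: str) -> pd.DataFrame:
    """Re-tabulate project ratings whenever a new batch of records arrives."""
    records = pd.read_csv(path)  # one row per evaluated project (assumed)
    return (records
            .groupby(["region", "rating"])
            .size()
            .unstack(fill_value=0))

summary = refresh_tabulation("projects.csv")  # hypothetical input file
summary.plot(kind="bar", stacked=True)        # visual refreshed in one step
plt.title("Project ratings by region")
plt.tight_layout()
plt.savefig("ratings_by_region.png")
```

Once a script like this exists, re-running the whole tabulation after new data arrives takes seconds rather than a fresh round of manual compilation.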

Electronic data collection. Remember hardcopy survey forms? Not just the bundles one had to carry around, but also the time it took to transcribe the data from the forms into an electronic medium, and then to clean the data when it had been entered incorrectly. Today, surveys can be designed for electronic media, with data entered directly on tablets and uploaded to a common server. Or, where the people affected by projects are well connected to the internet, online surveys can be administered, accompanied by outreach strategies to ensure participation.
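
Much of the time saving comes from validating answers at the moment of entry. A sketch of that idea, with fields and rules that are purely illustrative rather than taken from any particular survey tool:

```python
# Entry-time validation of the kind a tablet survey form enforces,
# so cleaning happens at collection rather than afterwards.
RULES = {
    "age": lambda v: 0 <= v <= 120,
    "household_size": lambda v: 1 <= v <= 30,
    "consent": lambda v: v in ("yes", "no"),
}

def validate(response: dict) -> list[str]:
    """Return the fields that fail validation; an empty list means clean."""
    errors = []
    for field, check in RULES.items():
        if field not in response or not check(response[field]):
            errors.append(field)
    return errors

# A record is only uploaded to the shared server once validate() returns [].
print(validate({"age": 34, "household_size": 5, "consent": "yes"}))  # []
```

Rejecting an impossible value on the spot, while the enumerator is still with the respondent, is far cheaper than reconciling it weeks later at a desk.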

Tapping into existing data. The work needed to assess impact rigorously is extraordinary; doing it from scratch, even as part of a major evaluation, would be impossible. The cost would be prohibitive, and the time required to collect all the data would hardly make for a nimble, fast-turnaround exercise. Fortunately, today we can tap into databases of impact evaluations, and even systematic reviews that assess the quality of evidence and synthesize findings, to complement other analytical work.
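
As a rough sketch, screening such a repository can be a few lines of code once the evidence base is exported to a table (the file name, sector, region, and quality fields below are assumptions, not a real repository's schema):

```python
import pandas as pd

# Hypothetical CSV export from a repository of impact evaluations.
evidence = pd.read_csv("impact_evaluations.csv")

# Keep only studies relevant to the evaluation at hand, filtered by the
# quality rating the repository assigns (column names are illustrative).
relevant = evidence[
    (evidence["sector"] == "education")
    & (evidence["region"] == "Sub-Saharan Africa")
    & (evidence["quality_rating"].isin(["high", "medium"]))
]

print(f"{len(relevant)} existing studies to synthesize")
```

The point is not the code itself but the shift it represents: synthesizing existing, quality-assessed evidence instead of mounting new data collection for every question.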

Upping the game on evaluation management. Tight timelines are demanding on evaluation teams. In addition to strong analytical work, they have to deliver many moving parts under multiple constraints. The tools that help with analytical work can also be used to track the progress of evaluation work, as one input to good project (meaning evaluation) management. The better evaluation teams are equipped with the skills and tools for effective evaluation management, the more attention and resources they can devote to the difficult task of evaluating itself.
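
One illustrative sketch of the kind of lightweight tracking meant here; the tasks, dates, and fields are hypothetical, not an IEG system:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Task:
    name: str
    due: date
    done: bool = False

def overdue(tasks: list[Task], today: date) -> list[Task]:
    """Flag unfinished tasks past their due date for the team lead."""
    return [t for t in tasks if not t.done and t.due < today]

# Hypothetical evaluation milestones.
plan = [
    Task("Approach paper", date(2024, 3, 1), done=True),
    Task("Data collection", date(2024, 5, 15)),
    Task("Draft report", date(2024, 8, 30)),
]
for task in overdue(plan, date(2024, 6, 1)):
    print(f"Overdue: {task.name} (due {task.due})")
```

Even this much structure makes slippage visible early, which is when it is still cheap to fix.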

In short, there are ways for evaluation to save time and become more agile. But it should never do so at the cost of quality.