CLAimDev’s approach to learning, disseminating, and using evidence from evaluation is a highlight of its CLA work with the Mission. CLAimDev’s work is premised on the belief that when evidence is disseminated through multiple pathways, to broadly defined audiences, and through an iterative process of engagement and discussion, audiences will be more likely to use that evidence to inform their development of policies, programs, and implementation. This approach to learning, dissemination, and use brings together the three cornerstones of CLA: collaborating with broad, diverse audiences; learning from efforts to rigorously gather and analyze evidence; and adapting and presenting the evidence in ways that are relevant and useful to the different audiences.

The pandemic created an opportunity for CLAimDev to test its premises. CLAimDev designed its first evaluation under the assumption that the evaluation team would be able to conduct its fieldwork in person and included travel costs in its budget. However, when it became clear that travel would not be possible, CLAimDev began discussions with its COR on using the unexpended travel funds to experiment with new ways of approaching learning, dissemination, and use. CLAimDev’s COR agreed that the travel restrictions created a fortuitous opportunity to experiment and learn.

CLAimDev decided to invest the unused travel funds in two ways, both oriented toward increasing prospective use of the evaluation results by creating multiple ways for different audiences to encounter and engage with the evaluation. The first was to create a multimedia, interactive, web-based report that included video clips, infographics, short visually engaging summaries of the major findings, conclusions, and recommendations, and links to the traditional written evaluation report, data collection instruments, data, interview transcripts, and quantitative and qualitative analyses. The second was to create participatory, interactive learning events that communicated lessons from the evaluation and created space for ownership of that learning by having stakeholders and beneficiaries reflect on the evaluation results based on their experiences.

One challenge that arose during this first experiment was how to attract an audience to a remote-only learning event. CLAimDev decided on wide outreach, extending beyond direct stakeholders to local governments, private sector organizations, and higher education institutions across the country. CLAimDev also publicized the evaluation results and the multimedia web report through Facebook and LinkedIn posts. The result was more than 200 participants from the Philippines, and a handful from other countries, in a half-day learning event — more than double the number anticipated. Nearly all of the participants remained in the online event for its full duration, and many participated actively in small group discussions on different topics covered in the evaluation.

While the STRIDE task order required only two learning events, opportunities arose to disseminate the evaluation findings more widely with additional presentations to government stakeholders and to USAID staff in the Asia Bureau and in Washington. CLAimDev seized these opportunities and created new presentations directly relevant to these additional audiences.

CLAimDev learned from this first experiment and applied the lessons to later evaluations. Among the most important lessons was that learning, dissemination, and use of evidence must be intentional, systematic, and resourced. After the STRIDE experience, CLAimDev included, and USAID approved, budget line items specifically for innovative knowledge products, such as web reports, explainer videos, and infographics. The evaluation timelines also included up to three months after each evaluation was complete for learning, dissemination, and use activities. CLAimDev started planning for these activities as it was developing the evaluation questions and work plans, with the goal of bringing the evaluation results to distinct audiences using knowledge products and information directly relevant to their interests. STRIDE also showed the effectiveness of delivering evaluation results and knowledge products through social media, such as Facebook and LinkedIn, to increase interest and participation in the learning events.

After each evaluation and its learning events and presentations, CLAimDev conducted an after-action review with its team and COR to capture lessons to inform subsequent evaluations and learning activities. Each learning event and the knowledge products created for the evaluation are described below, along with links to and illustrations of the knowledge products.

You have to think about how to design the data collection and if you will be able to collect the required data. Otherwise, you won’t be able to answer the evaluation questions.

IVY MEJIA

EVALUATION ADVISOR