Appendix 2: 3ie indicators


All evaluation teams are encouraged to use indicators in tracking their engagement and the uptake and use[1] of the evidence from the study. Below are non-exhaustive indicators that 3ie uses when monitoring grants through calls, progress reports and visits. Teams are free to add indicators that are relevant for monitoring their study's engagement and use.

Engagement indicators

- # of formal or informal meetings and other events to discuss any aspect of the evaluation or to build relationships with stakeholders, including community members, civil society, media, academia and donors (please include a short description of why you engaged and whether you discussed anything specific about the study, e.g. study design, study questions, justification for doing the evaluation, baseline, midline, results)
  - # of participants in those events (estimate)
    - Estimated # of participants who are decision-makers (e.g. policymakers and programme managers, including community members as decision-makers, as appropriate)
- # of knowledge and information products produced that are related to the study; please specify the intended main audience
  - Include the name, web links, draft copies and a brief description, e.g. briefs, memos, articles, working papers and blogs, as part of documenting impact evaluation analysis and findings. This may also include products produced as part of the intervention being evaluated, such as cartoon storyboards, illustrations, songs, stories or theatre.
- Estimated # of stakeholders that received the knowledge products (disaggregated by product)

[1] By uptake and use we mean discernible instances where the act of doing the study, including interacting with the stakeholders, as well as the findings and any recommendations, informs, influences or results in action(s).

Uptake indicators

- # of media mentions (i.e. print, television, web or radio) where the study was mentioned (please provide electronic clippings and web links when possible)
- Feedback from policymakers, programme managers or beneficiaries on the study and findings
- # of policymaker and programme management invitations to present interim or final findings
- # of presentations of the interim or final study findings by the implementing agency at events
- # of downloads of policy notes, briefs, blogs, memos, articles, working papers or other study-related materials from your website
- Implementing agency website page on the study, or whether the final report is posted on their website
- # of any other websites where the final study report was posted (please include web links)

Use[2] indicators

- Inform discussions of policies and programmes related to the programme that was evaluated
  - Meeting agendas on which the study appears; please include enough detail to identify the meeting and who was there (see above for reporting on events)
  - # of citations by the implementing agency in policy documents
  - Invitations to participate in working groups on policies and programmes related to the focus of the study
  - Influence on the funding decisions of donors
  - Guidance of the implementing agency's or donor's strategies and programme priorities
- Changes (positive or negative) in the design, implementation or budget of the programme or policy being evaluated (please identify which one(s))
- Changes in local, national or global programmes or policies for which there is some type of verification[3] that the study findings have contributed to the change
- Inform regional or global discussions related to the focus of the study, e.g. study findings are used to inform policy priorities
  - # of citations of study findings by donors, international networks or alliances in global policy documents
- Inform the design of other similar programmes based on the study findings, e.g. use of study findings by multilateral or bilateral donors to design similar programmes in other contexts
- Scale-up of the programme to which the findings have contributed in some verifiable way
- Scale-down or closure of the programme based on the study findings
- Change in institutional culture of evidence use or commitment to evaluation, e.g. the implementing agency changes its monitoring system, changes programme or institutional practice, or plans to commission another impact evaluation. Please summarise details about the causal pathways for the change and the study's contribution
- Changes in public discourse on questions addressed by the study, with details to allow verification and understanding of the change
- Any other changes that you observe (and can verify) that you consider evidence of use, either direct or indirect, i.e. an observable change related to the study that is beyond the universe of known direct stakeholders for the study. For example, a vitamin study that changed the culture of evidence use by programme managers of an unrelated early childhood development parenting programme

[2] 3ie understands that the use (actual action) and impact of a study and/or its findings takes time, most often beyond the end of the study itself. The purpose here is to capture what has happened so that we can gain useful, evidence-informed insights into the context and mechanisms for how a study has influenced or informed a change, and why.

[3] By verification we mean some way of validating that what is being reported is correctly reported, and that 3ie can cross-check the information. It does not mean attribution.

In your final progress report at the end of the study, please provide any information that you may have about the uptake and use of findings, both direct and indirect. 3ie may decide to continue monitoring the study context for evidence of use after the grant has closed.

Version 12 October 2015
