VALUE CHAIN MONITORING AND EVALUATION GUIDE
MODULE 11
Evaluations and Value Chain Projects
April 12, 2023
GETTING STARTED
OBJECTIVE OF MODULE 11
This module will teach the basic principles of conducting evaluations of your value chain project.
All VC projects should plan to conduct baseline, midline, and endline evaluations with an experienced external partner.
GETTING STARTED: OVERVIEW
This is not a comprehensive guide to planning and conducting an evaluation.
It will focus on key issues and provide links to more information where necessary.
The step-by-step guide will teach you how to create a valid counterfactual.
GETTING STARTED: 3 BASIC KINDS OF EVALUATIONS

IMPACT
• Aims to determine whether changes have taken place in the VC or among VC actors, and to what degree those changes can be attributed to CARE's work.
• Designed to answer what would have happened if CARE had not intervened.
• Answered via a statistically valid counterfactual, using control groups and experimental or quasi-experimental designs.

PERFORMANCE
• Also aims to determine whether changes have taken place in the VC or among VC actors, and to what degree CARE's interventions contributed to those changes.
• Uses non-experimental designs.
• Typically gathers and analyzes data only from those directly engaged or impacted.
• Lower cost, but less rigorous.

PROCESS
• Aims to assess whether and to what degree the project has been implemented in line with the initial plan.
• Does not consider results directly, but how the initiative is managed.
• Typically internal; assesses the timeliness and quality of performance.
• Looks to identify areas for improvement in the implementation process.
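The "what would have happened if CARE had not intervened" question is often answered with a difference-in-differences calculation: the beneficiaries' change minus the control group's change. A minimal Python sketch, with made-up income figures:

```python
# Sketch of the difference-in-differences logic behind a counterfactual:
# the project's impact is the beneficiaries' change minus the control
# group's change. All income figures below are made up.

def diff_in_diff(treat_before, treat_after, control_before, control_after):
    """Impact estimate: treatment-group change minus control-group change."""
    return (treat_after - treat_before) - (control_after - control_before)

# Hypothetical average incomes: both groups grew, but treatment grew more,
# so only part of the 40-point treatment change is attributed to the project.
print(diff_in_diff(100, 140, 100, 115))  # 25
```

The control group's 15-point rise stands in for what would have happened anyway, leaving 25 points attributable to the intervention.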
GETTING STARTED: WHICH BEST SERVES OUR PURPOSES?
For most VC projects, the impact evaluation best serves these purposes: it is the only design that attributes observed changes to CARE's work via a valid counterfactual. The step-by-step guide below therefore focuses on impact evaluations.
GETTING STARTEDEXCEPTIONS
• If the project is small in scale, or lacks the resources for an impact evaluation, it is better to carry out a performance evaluation.
• Performance evaluations do not require counterfactuals and use non-experimental methods, such as:
  • Pre-post intervention design
  • Post-intervention design
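A pre-post intervention design simply compares the same beneficiaries before and after the project, with no control group. A minimal Python sketch, with hypothetical yield figures:

```python
# Sketch of a pre-post intervention design: compare the same beneficiaries
# before and after the project, with no control group. Yields are made up.

def pre_post_change(baseline, endline):
    """Average change per beneficiary between baseline and endline."""
    assert len(baseline) == len(endline)
    diffs = [after - before for before, after in zip(baseline, endline)]
    return sum(diffs) / len(diffs)

# Hypothetical maize yields (t/ha) for the same five farmers.
baseline = [1.0, 1.2, 0.9, 1.1, 1.0]
endline = [1.4, 1.5, 1.1, 1.6, 1.3]
print(round(pre_post_change(baseline, endline), 2))  # 0.34
```

Note that this design cannot say how much of the change would have happened anyway, which is why it is reserved for performance rather than impact evaluations.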
STEP-BY-STEP GUIDE
1. Determine the Purpose for the Evaluation
2. Determine the Financial Resources Available for the Evaluation
3. Identify Research Team and Partners
4. Identify Research Questions
5. Choose a Research Methodology
6. Determine the Other Details of the Research Design
7. Implement the Impact Evaluation
STEP-BY-STEP GUIDE: STEP 1
FIRST! Ask yourself: what is the purpose of this evaluation, and who is it for?
Not asking this can result in a methodology that poorly matches donor requirements, and can cost precious time, money, and energy.
STEP-BY-STEP GUIDE: STEP 1
Other Angles to Consider
•Typically, there are 3 M&E clients that might want an impact evaluation
•Motivations for conducting an impact evaluation
•Impact evaluations are not always the right choice.
STEP-BY-STEP GUIDE: STEP 2
Evaluation Costs
• More rigorous evaluations = more $$
• Attributable evidence is expensive
• Cost depends on:
  • # of research rounds
  • Survey length
  • International evaluation experts
  • Sample size
  • Geographic dispersion of respondents
  • Price of local research talent
  • Sampling methodology
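These cost drivers can be combined into a back-of-envelope budget. A Python sketch in which every unit cost is a placeholder assumption, not a CARE or market figure:

```python
# Back-of-envelope evaluation budget. Every unit cost below is a placeholder
# assumption, not a real quote -- replace with figures from actual proposals.

def evaluation_cost(rounds, sample_size, cost_per_interview, fixed_cost_per_round):
    """Total field cost if each round surveys the full sample once."""
    return rounds * (sample_size * cost_per_interview + fixed_cost_per_round)

# Baseline + midline + endline, 400 respondents, $12 per interview,
# $5,000 per round for enumerator training, travel, and supervision.
print(evaluation_cost(rounds=3, sample_size=400,
                      cost_per_interview=12, fixed_cost_per_round=5000))  # 29400
```

Even this crude arithmetic shows why each extra research round and each added respondent multiplies the budget.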
STEP-BY-STEP GUIDE: STEP 2
WARNING!
Best practice evaluation standards strongly recommend outsourcing impact evaluations.
STEP-BY-STEP GUIDE: STEP 3
Activities and Responsibilities for External Research Partners
• Refining evaluation design
• Sharpening research questions
• Translating research instruments
• Pilot testing research instruments
• Developing the research instruments
• Training survey enumerators
• Managing the field data collection
• Entering results into the data shell
• Cleaning the data set
• Transcribing interviews
• Data analysis
• Final reports
STEP-BY-STEP GUIDE: STEP 3
Proposals from Potential Research Partners
• Receiving proposals
• Evaluating proposals
• Evaluation and selection criteria
• World Bank guide
STEP-BY-STEP GUIDE: STEP 4
How to ID Questions
• Benefit from your research partner's knowledge and experience
• Questions should measure critical links and associated key performance indicators
• The goal is to verify results
• Involve the team and M&E clients early
• Check USAID publications
STEP-BY-STEP GUIDE: STEP 5
Creating a Comparison Group
• Must be a group of farmers, entrepreneurs, business owners, etc. as similar as possible to the actual project beneficiaries
• AKA control group vs. treatment group
• Isolates the project's impact from other influences
STEP-BY-STEP GUIDE: STEP 5
2 Sources of Selection Bias

OBSERVABLE CHARACTERISTICS
• Include things that can be seen or tangibly measured: sex, education, age, location, etc.
• If the treatment group is 90% male / 10% female and the control group is 40% male / 60% female, you will come up with invalid conclusions.
• Educated vs. uneducated
• Urban vs. rural

UNOBSERVABLE CHARACTERISTICS
• Aspects of an individual's personality that play a large role in determining success: personal initiative, entrepreneurial spirit, risk orientation, persistence, self-confidence, optimism, etc.
• Those who volunteer for VC projects tend to have more of these qualities than others.
• Comparing a group of new-seed adopters to a group of non-adopters would not tell us to what extent any observed differences in farming outcomes are the result of the project or the result of pre-existing personality differences among the groups.
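A first check on observable selection bias is simply to compare group compositions. A Python sketch using the 90%/40% male shares from the example above; the records themselves are made up:

```python
# Sketch of a covariate balance check on one observable characteristic
# (sex), using the 90%/40% male shares from the slide. Records are made up.

def proportion(group, key, value):
    """Share of records in `group` whose `key` equals `value`."""
    return sum(1 for person in group if person[key] == value) / len(group)

treatment = [{"sex": "m"}] * 9 + [{"sex": "f"}] * 1   # 90% male
control = [{"sex": "m"}] * 4 + [{"sex": "f"}] * 6     # 40% male

gap = abs(proportion(treatment, "sex", "m") - proportion(control, "sex", "m"))
print(f"male-share gap: {gap:.0%}")  # a large gap signals selection bias
```

No amount of checking observables fixes bias on unobservable characteristics, which is why randomization matters.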
STEP-BY-STEP GUIDE: STEP 5
Experimental Methods of Evaluation
• Follows the same basic approach as a placebo experiment
• Of a selected group of maize farmers, some receive project assistance while others do not
• Theoretically eliminates all sources of selection bias
• Also referred to as randomized controlled trials (RCTs)
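Random assignment itself is mechanically simple: every eligible participant gets the same chance of treatment. A Python sketch of a 50/50 split; the farmer names are illustrative, and the fixed seed is a design choice to make the assignment auditable:

```python
import random

# Sketch of simple random assignment for an RCT: every eligible farmer has
# the same chance of receiving project assistance. Names are illustrative.

def randomize(units, treat_fraction=0.5, seed=42):
    """Randomly split units into (treatment, control) groups."""
    rng = random.Random(seed)     # fixed seed makes the assignment reproducible
    shuffled = list(units)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * treat_fraction)
    return shuffled[:cut], shuffled[cut:]

farmers = [f"farmer_{i}" for i in range(100)]
treated, controls = randomize(farmers)
print(len(treated), len(controls))  # 50 50
```

Real randomization protocols add stratification and public, documented procedures, which is part of the operational burden discussed on the next slide.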
STEP-BY-STEP GUIDE: STEP 5
Downsides of the Experimental Method
• Randomization protocols can be complicated, time consuming, and operationally burdensome
• May be perceived as unethical
• Not ideal for projects with small numbers of beneficiaries, impromptu projects, specified locations or groups of people, or projects with no available control group (e.g. broad-based policy reform)
• VC projects are flexible and easily changed, while this methodology requires consistent variables
• It is difficult for evaluation designers to reasonably 'control for' changes in the environment that were not influenced by the project
STEP-BY-STEP GUIDE: STEP 5
Quasi-Experimental Methods
• Does not randomly assign subjects into treatment and control groups
• Instead, compares pre-existing groups via a matching process
• Treatment groups are selected via random sampling
• Control groups are selected by identifying areas and communities with matching observable characteristics, and then randomly sampling the relevant population living in those areas and communities
• But quasi-experimental methods are less rigorous than experimental ones
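The matching idea can be illustrated with a simple nearest-neighbour match on observable characteristics. A Python sketch with made-up data; real designs typically match on many more variables (often via propensity scores) with expert input:

```python
# Sketch of nearest-neighbour matching on observable characteristics, the
# core of many quasi-experimental designs. Ages and schooling are made up.

def match_control(beneficiary, candidates, keys=("age", "years_schooling")):
    """Pick the control candidate closest to the beneficiary on `keys`."""
    def distance(candidate):
        # squared Euclidean distance over the chosen observables
        return sum((beneficiary[k] - candidate[k]) ** 2 for k in keys)
    return min(candidates, key=distance)

beneficiary = {"age": 35, "years_schooling": 8}
candidates = [
    {"age": 60, "years_schooling": 2},
    {"age": 34, "years_schooling": 7},
    {"age": 20, "years_schooling": 12},
]
print(match_control(beneficiary, candidates))  # the 34-year-old is closest
```

Matching only controls for the observables you measure, which is exactly why these designs are less rigorous than randomization.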
STEP-BY-STEP GUIDE: STEP 5
In choosing your method, ask the following:
• Will our M&E system clients be less well served if we opt for a quasi-experimental design over an experimental design?
• Is our project amenable to random assignment?
• Is random assignment operationally feasible?
• Can we manage/overcome the anticipated opposition from our project staff and external stakeholders?
• Is the tradeoff of an increased operational burden worth the improvement we get in statistical credibility?

If 'Yes' to each, then experimental.
If 'No' to any, then quasi-experimental.
STEP-BY-STEP GUIDE: STEP 6
Other Considerations
• Sample size and composition
• Trend study vs. panel study
• Single method vs. mixed methods
• Early vs. delayed baseline data collection
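Sample size is usually set with a power calculation. A rough Python sketch of the standard two-arm formula for detecting a difference in means, using z-values for two-sided alpha = 0.05 (1.96) and power = 0.80 (0.84); the effect size below is a placeholder, and a qualified sampling expert should confirm any real design:

```python
import math

# Rough sample size per arm for detecting a difference in means between
# treatment and control, using the standard two-sample formula
# n = 2 * ((z_alpha + z_beta) * sigma / delta)^2. Figures are placeholders.

def sample_size_per_arm(sigma, min_detectable_diff, z_alpha=1.96, z_beta=0.84):
    n = 2 * ((z_alpha + z_beta) * sigma / min_detectable_diff) ** 2
    return math.ceil(n)

# e.g. yield standard deviation of 0.8 t/ha, minimum detectable
# difference of 0.25 t/ha between group means
print(sample_size_per_arm(sigma=0.8, min_detectable_diff=0.25))  # 161
```

Because the required sample grows with the square of sigma/delta, halving the minimum detectable difference roughly quadruples the sample, which feeds directly into the cost drivers from Step 2.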
STEP-BY-STEP GUIDE: STEP 7
The Final Act
• Implement the evaluation
• Work closely with the local research firm, project staff, and (as relevant) implementing partners and local authorities/community leaders
• Assign one person the specific task of monitoring the research firm's performance at every stage
Resources
Impact Evaluation Resources
Evaluation Firms
• Abdul Latif Jameel Poverty Action Lab
• Innovations for Poverty Action
• International Food Policy Research Institute (IFPRI) Impact Assessment Program

Web Resources
• Evaluation Portal
• Evaluation Virtual Library
• Free Resource for Program Evaluation and Social Research Methods

Associations and Networks
• American Evaluation Association
• Donor Committee for Enterprise Development
• InterAction Monitoring & Evaluation
• International Initiative for Impact Evaluation
• Network of Networks on Impact Evaluation

Donor Organizations
• International Program for Development Evaluation Training
• United Nations Evaluation Group
• USAID Private Sector Development Impact Assessment Initiative
• World Bank Development Impact Evaluation Initiative
• World Bank Independent Evaluation Group
COMMON PITFALLS
• Teams do not conduct appropriate due diligence about their evaluation options
• Teams implement the baseline data collection too soon
• Teams implement a trend study when a panel study would have been both preferable and possible
• Teams load up the impact survey with excess questions
• Teams do not monitor the local research firm's adherence to the TOR
• Teams do not budget or plan for mixed-methods evaluations
• Teams do not seek advice on sampling from qualified technical experts
• Teams inappropriately attribute evaluation findings
• Teams attempt to implement the impact evaluation using project staff
• Donors demand rigorous evaluations but do not allocate sufficient funding
• Projects make compromises to the evaluation methodology
• Evaluation reports do not fully disclose the tradeoffs made
• Projects do not closely monitor the performance of external research firms
QUESTIONS?
COMMENTS?
Want to Learn More?
Multiple ways to continue the discussion and continue learning:
• Initiate a monthly session on the M&E guide and case studies from across CARE. Contact [email protected]
• Join the Market Engagement Community of Practice on LinkedIn.
• Join a task force to review and refine the universal indicators. Contact [email protected]