2010 Annual Performance Review Workshop, M&E Side Event – Day 1 (Session 1), 1-2 November 2010, Nanning, China

APR Workshop 2010-M&E-Maria Donnat


Page 1: APR Workshop 2010-M&E-Maria Donnat

2010 Annual Performance Review Workshop

M&E Side Event

1-2 November 2010
Nanning, China

Page 2: APR Workshop 2010-M&E-Maria Donnat


Session 1 – Overview of main M&E challenges

Page 3: APR Workshop 2010-M&E-Maria Donnat


Page 4: APR Workshop 2010-M&E-Maria Donnat

The first miles… and the next thousand.

Page 5: APR Workshop 2010-M&E-Maria Donnat
Page 6: APR Workshop 2010-M&E-Maria Donnat

??

Page 7: APR Workshop 2010-M&E-Maria Donnat


You will need directions!

Page 8: APR Workshop 2010-M&E-Maria Donnat


Your Monitoring and Evaluation system

Page 9: APR Workshop 2010-M&E-Maria Donnat


A successful journey…

Inputs → Activities → Outputs → Outcomes → Impact

Page 10: APR Workshop 2010-M&E-Maria Donnat


From a focus on Monitoring… (a “bean counting” exercise focusing on activities and outputs)

…to a focus on Evaluation! (informed reflection on performance)

Page 11: APR Workshop 2010-M&E-Maria Donnat


Evaluation questions

Are we doing the right thing?
Are we doing it right?

What should be done differently?

Page 12: APR Workshop 2010-M&E-Maria Donnat


Example: a “bean counting” exercise

Objective: increase access of poor women to microfinance services in 3 provinces

MFI and project records:
• 120 Credit and Savings Groups formed
• 13,500 active borrowers (72% women)
• 22,220 savers (80% women)
• Average repayment rate: 75%
• Portfolio at risk: 20%
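For readers unfamiliar with the two portfolio figures quoted (repayment rate and portfolio at risk), a minimal sketch of how such ratios are typically computed. The numbers and the simplified definitions below are illustrative assumptions, not taken from this project's records; Python is used only for illustration.

```python
# Hypothetical, simplified microfinance portfolio figures (invented numbers,
# NOT the project's actual records).
amount_due = 100_000          # total repayments falling due this period
amount_repaid = 75_000        # amount actually repaid on time
repayment_rate = amount_repaid / amount_due            # 0.75 -> 75%

# Portfolio at risk (PAR), simplified definition: share of the outstanding
# loan portfolio held in loans with at least one payment overdue.
outstanding_total = 500_000
outstanding_in_overdue_loans = 100_000
portfolio_at_risk = outstanding_in_overdue_loans / outstanding_total  # 0.20 -> 20%

print(f"repayment rate: {repayment_rate:.0%}, PAR: {portfolio_at_risk:.0%}")
```

Note that both are ratios of money amounts, not counts of borrowers; that is why they can signal trouble even when the borrower and saver counts above look impressive.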

Page 13: APR Workshop 2010-M&E-Maria Donnat


Example: reflection on real performance

Objective: increase access of poor women to microfinance services in 3 provinces

Case study: “My husband wanted a new car, so I asked the MFI Officer. He told me that if I formed a group, I could access credit. So I asked all my friends and relatives to join me in a group, after which I got a loan from the MFI and my husband bought his car…”
— a woman beneficiary

Page 14: APR Workshop 2010-M&E-Maria Donnat


Evaluation challenges

• Each step in the results chain must be monitored
• Cost-effective outcome measurement
• Need for baseline data
• Need for SMART indicators

Page 15: APR Workshop 2010-M&E-Maria Donnat


Challenge #1: Monitoring each step in the results chain

Inputs → Activities → Outputs → Outcomes → Impact

• Inputs/Activities: Are our activities implemented according to plan? Are inputs efficiently mobilized?
• Outputs: Are project outputs according to plan, relevant to beneficiary needs, and of the required quality?
• Outcomes: Is the project producing the desired outcomes in terms of positive behavioural changes in the community, increased agricultural productivity, etc.?
• Impact: Is the project achieving its objectives in terms of poverty reduction?

Page 16: APR Workshop 2010-M&E-Maria Donnat


Limited reporting on outcomes

• 80% of available RIMS data concern Outputs

• Some 80% of outcome indicators are not reported on

Page 17: APR Workshop 2010-M&E-Maria Donnat


Example

Reported indicator:
• People trained in crop production practices and technologies

Missing indicators:
• Farmers reporting production or yield increases
• Farmers adopting recommended technologies
• Nb of ha of incremental crops grown

Page 18: APR Workshop 2010-M&E-Maria Donnat


Concrete results (RIMS)

RIMS data (each cell: Nb projects / value reported):

Indicator | 2009/2010 | 2008/09 | 2007/08 | 2006/07
Active borrowers | 28 / 814,760 | 21 / 816,896 | 20 / 456,609 | 18 / 263,492
Nb of active savers | 15 / 2,467,399 | 16 / 777,581 | 19 / 939,427 | 16 / 502,580
Persons trained in crop and livestock production and development | 29 / 457,251 | 20 / 483,178 | 20 / 316,630 | 22 / 241,526
Nb of farmers adopting technology recommended by project | 7 / 616,135 | 4 / 68,184 | 11 / 807,278 | 11 / 218,088
Farmers reporting production or yield increase / herd size | 10 / 120,923 | 6 / 24,328 | 10 / 679,897 | 11 / 132,319
Persons receiving project services | 2 / ND | 9 / 153,632 | 22 / 928,593 | 21 / 763,671
Households reporting improved food security | 9 / 85,475 | 7 / 165,894 | 13 / 289,974 | 7 / 82,689

Page 19: APR Workshop 2010-M&E-Maria Donnat


Quality versus quantity

Page 20: APR Workshop 2010-M&E-Maria Donnat


You need baseline data!

As compared to what?

Is your life better?

Page 21: APR Workshop 2010-M&E-Maria Donnat


Few projects with “real” baseline data

• 33% of projects have no baseline survey

• Only 20% of baseline surveys were conducted during the first year of implementation

Page 22: APR Workshop 2010-M&E-Maria Donnat


Monitoring challenges

Even reporting on outputs is problematic:

1) Reliability of data

Double counting of results:
• Nb of persons trained? (same people trained several times)
• Nb of beneficiaries?
• Nb of persons receiving project services?

Page 23: APR Workshop 2010-M&E-Maria Donnat


Example: How do you measure…?

How well is performance measured?

Persons receiving project services: “Population of villages where PRA was done”

Number of persons trained: “The project, since PY1, records persons as new participants every time they take part in a training (regardless of whether the same person is trained several times). In this way the cumulative total far exceeds the number of target villagers, as the same persons may have been counted several times.”
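The double-counting trap in the quote above can be made concrete with a short sketch (the participant IDs and training topics below are invented for illustration): tallying training-session records inflates "persons trained", while tallying distinct participant IDs does not.

```python
# Hypothetical attendance log: (participant_id, training_topic).
attendance = [
    ("P01", "crop production"),
    ("P02", "crop production"),
    ("P01", "livestock"),       # same person, second training
    ("P03", "livestock"),
    ("P01", "irrigation"),      # same person, third training
]

# Counting one "person trained" per record double-counts repeat trainees.
records_count = len(attendance)                             # 5 records

# Counting distinct participant IDs gives the true number of persons trained.
persons_trained = len({pid for pid, _topic in attendance})  # 3 persons

print(records_count, persons_trained)   # prints: 5 3
```

In practice this means the project needs some form of participant register (an ID per trainee) rather than per-event headcounts, if "persons trained" is what the indicator claims to measure.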

Page 24: APR Workshop 2010-M&E-Maria Donnat


Tracking “cumulative” achievements: main challenges

Cumulative achievements are not necessarily the sum of annual achievements!
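A minimal sketch of why the yearly figures cannot simply be added (the borrower IDs below are invented): a borrower active in two project years appears in both annual counts, so the cumulative number of distinct borrowers reached is smaller than the sum of the annual counts.

```python
# Hypothetical sets of active-borrower IDs per project year.
active_by_year = {
    "PY1": {"B01", "B02", "B03"},
    "PY2": {"B02", "B03", "B04"},   # B02 and B03 carry over from PY1
    "PY3": {"B03", "B04", "B05"},
}

# Adding the annual counts double-counts borrowers active in several years.
sum_of_annual_counts = sum(len(ids) for ids in active_by_year.values())  # 9

# The cumulative achievement is the number of DISTINCT borrowers reached.
cumulative_distinct = len(set().union(*active_by_year.values()))         # 5

print(sum_of_annual_counts, cumulative_distinct)   # prints: 9 5
```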

Page 25: APR Workshop 2010-M&E-Maria Donnat


Other monitoring challenges

2) Inefficient data collection exercises
• Overly complex data collection exercises, with little evaluation of the data
• Proper information management systems often lacking
• No staff dedicated to data entry

Page 26: APR Workshop 2010-M&E-Maria Donnat


What to avoid: data collected but not used!

Page 27: APR Workshop 2010-M&E-Maria Donnat


Remember!

“Not everything that can be counted counts, and not everything that counts can be counted.”

- Albert Einstein (1879-1955)

Page 28: APR Workshop 2010-M&E-Maria Donnat


Other monitoring challenges

• Weak national systems
• Lack of standard approaches, definitions and tools
• Lack of continuity in support from IFAD

Page 29: APR Workshop 2010-M&E-Maria Donnat


Other issues

• Unclear logframes
• Missing targets
• Turnover of M&E staff and M&E consultants
• Poor in-country capacities

Page 30: APR Workshop 2010-M&E-Maria Donnat

Strengths, Weaknesses, Opportunities and Threats – Analysis

[Chart: factors plotted by Likelihood (Threat/Opportunity) and Intensity (strength), from Low to High, against their impact on the quality of M&E data]

Strengths and opportunities:
• Some M&E champions
• Good M&E consultants available
• Some CPOs are building M&E support capacities
• Growing number of RB COSOPs
• PI willing to provide more support
• Increasing donors' interest in results

Weaknesses and threats:
• M&E Officers lack operating budget
• Poor M&E Officer capacities
• Good/trained M&E Officers will leave
• IFAD may abandon RIMS
• IFAD not negotiating strongly enough
• IFAD failing to offer good guidelines
• Lack of IFAD commitment to support M&E
• Lack of continuity in external support
• Lack of PI ability to provide direct support
• Lack of Government commitment
• PDs have no interest in M&E
• Growing number of DSF countries
• Project designs don't pay enough attention to M&E
• RIMS surveys do not measure impact well

Page 31: APR Workshop 2010-M&E-Maria Donnat


… AND there are cost implications!

Objectives | Types of Indicators | Data Collection Methods
Activities | Process indicators | Project records
Outputs | Output indicators | Project records
Component/Sub-Component | Outcome indicators | Surveys; observations; interviews with key informants
Project Purpose | Impact indicators | Impact surveys
Overall Goal | Impact indicators | Impact surveys; national/local statistics

Moving up the results chain, data collection becomes increasingly expensive, time consuming, technically complex and skill intensive.

Page 32: APR Workshop 2010-M&E-Maria Donnat


Do you have the necessary budget, and staff, to do proper M&E?

Page 33: APR Workshop 2010-M&E-Maria Donnat


Minimum M&E budget necessary

Staff (at least):
• 1 M&E Officer
• 1 M&E Assistant
• 1 full-time data entry person
• Short-term enumerators

Minimum staff costs:
• 36 m/m per year; senior, junior and clerk levels
• 36 m/m short-term staff over the whole project duration (RIMS)
• 2 m/m per year (other surveys)

Surveys and studies:
• RIMS: 10,000 to 16,000 per survey
• Annual (staff time)
• 3 “RIMS+” surveys
• Annual outcome surveys
• Min. 5 case studies: [Gender] – [Empowerment] – [Environmental impact] – [Sustainability] – [Infrastructure O&M] – [Agric. innovation uptake] – [Microfinance] – [Etc.]

Operating budget:
• Transportation
• Per diem
• Workshops, meetings