
Page 1

Reviewing the quality of evidence in humanitarian evaluations

Juliet Parker, Christian Aid
David Sanderson, CENDEP, Oxford Brookes University

ALNAP, March 2013

Review of four evaluations

Page 2

Four parts

1. Why did Christian Aid want to do this?

2. The evidence assessment tool
3. Quality of evidence - assessing four evaluations
4. So what for Christian Aid?

Page 3

1. Why do this?

We want to improve the quality of our evaluations:
• For our own analysis and decision making
• To get our money’s worth from evaluation consultants(!)
• As part of a challenge to, and a move across, the sector

Page 4

2. The tool used

BOND’s ‘Checklist for assessing the quality of evidence’:
• Developed in 2011-12 through NGO and donor consultation
• Five principles, with four questions for each, scored on a scale of 1-4 …

Page 5

Five principles

• Voice and inclusion – ‘the perspectives of people living in poverty, including the most marginalised, are included in the evidence, and a clear picture is provided of who is affected and how’

• Appropriateness – ‘the evidence is generated through methods that are justifiable given the nature and purpose of the assessment’

• Triangulation – ‘the evidence has been generated using a mix of methods, data sources, and perspectives’

• Contribution – ‘the evidence explores how change happens and the contribution of the intervention and factors outside the intervention in explaining change’

• Transparency – ‘the evidence discloses the details of the data sources and methods used, the results achieved, and any limitations in the data or conclusions’
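The slides describe the tool’s shape (five principles, four questions each, every question scored 1-4) without reproducing the questions themselves. The following Python sketch is only a minimal model of how such a checklist might be scored: it assumes the 1-4 scale maps onto the ‘weak’, ‘minimal’, ‘good’ and ‘gold standard’ ratings quoted later in the findings, and the averaging step and example scores are illustrative, not BOND’s published method.

```python
# Minimal sketch of a BOND-style checklist: five principles, four
# question scores each, every score on a 1-4 scale. The label mapping
# and the averaging rule are assumptions, not BOND's actual method.
from statistics import mean

# Assumed mapping of the 1-4 scale to the ratings quoted in the findings.
LABELS = {1: "weak", 2: "minimal", 3: "good", 4: "gold standard"}

PRINCIPLES = [
    "Voice and inclusion",
    "Appropriateness",
    "Triangulation",
    "Contribution",
    "Transparency",
]

def rate_principle(scores):
    """Summarise the four question scores (each 1-4) for one principle."""
    if len(scores) != 4 or not all(1 <= s <= 4 for s in scores):
        raise ValueError("expected four scores between 1 and 4")
    avg = mean(scores)
    # Round the average back onto the 1-4 scale for a headline label.
    return avg, LABELS[round(avg)]

# Hypothetical scores for a single evaluation (not the real data).
example = {
    "Voice and inclusion": [1, 2, 1, 1],
    "Appropriateness": [3, 3, 2, 3],
    "Triangulation": [2, 1, 2, 2],
    "Contribution": [1, 1, 2, 1],
    "Transparency": [1, 2, 1, 1],
}

for principle in PRINCIPLES:
    avg, label = rate_principle(example[principle])
    print(f"{principle}: {avg:.2f} ({label})")
```

Run as-is, this prints one line per principle, eg ‘Voice and inclusion: 1.25 (weak)’; any real use would need BOND’s actual question texts and its guidance on aggregation.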

Page 6
Page 7

Checklist for criteria (eg voice and appropriateness)

Page 8

Review of four evaluations

1. DRC Final phase evaluation, August 2011 (assistance to conflict and displacement)

2. Tropical storms in the Philippines end-of-project evaluation, October 2011 (response to Typhoon Ketsana)

3. Middle East Crisis Impact Evaluation final report, May 2011 (Gaza crisis)

4. Sudan Appeal end-of-term evaluation, April 2011 (conflict in Darfur)

Page 9
Page 10

Findings

Voice and inclusion
• No mention that the most excluded or marginalised groups were included
• No evaluations provided data disaggregated by gender
• No mention that beneficiaries engaged in the assessment process, eg analysing data

Appropriateness
• ‘Good’ data collection methods, involving qualitative review, focus group discussions and review of reports
• But no information given on sample sizes

Triangulation
• Data collection methods: one rated ‘gold standard’, three ‘minimal’
• Varied practice in presenting findings back to people

Page 11

Findings (continued)

Contribution
• No baselines (not unusual)
• Little or no exploration of how interventions contributed to change
• Unintended and unexpected changes: two rated ‘weak’, one ‘minimal’ and one ‘good’

Transparency
• Three evaluations were ‘weak’ in explaining the composition of the group from which data was collected
• Data collection and analysis was ‘weak’ for two and ‘minimal’ for two
• Explanation and discussion of bias was ‘weak’ for all four evaluations

Page 12

In summary

• ‘The quality of evidence in the evaluations was found to be low in almost every category identified by the BOND tool, ie voice and inclusion, appropriateness, triangulation, contribution and transparency.’

• ‘That does not mean the project was bad - it means it’s hard to tell.’

Page 13

Observations on the BOND tool

• The tool prioritises affected populations – good for accountability

• Assumes a thorough write-up of methodology – not current practice

• Assumes no baseline means a poor evaluation – yet for disasters this is the norm, not the exception

• Ultimately it’s a subjective judgement based on the interpretation of words (much like academic assessment)

• … that’s the nature of the business

Page 14

4. So what for Christian Aid?

• Be clearer on what we’re expecting of our evaluation consultants

• Repeat the process next year
• Improve the quality of our data collection during programme implementation

Page 15

BOND criteria

• Voice and inclusion
• Appropriateness
• Triangulation
• Transparency
• Contribution

ALNAP criteria

• Truth/accuracy
• Representativeness
• Significance
• Generalisability
• Attribution