
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Patricia Rogers



Page 1: IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Patricia Rogers

Towards a Research Agenda for Impact Evaluation of Development

Impact, Innovation and Learning: Towards a Research and Practice Agenda for the Future conference

Institute of Development Studies, Brighton, UK

26-27 March 2013

Professor Patricia Rogers

RMIT University (Royal Melbourne Institute of Technology), Australia

[email protected]

Page 2


Page 3


Research Agenda

Practice Agenda

Page 4

Towards a research agenda for impact evaluation in development

1. What it needs to cover

2. What is needed to develop it

3. Types of research needed

4. Some burning research questions

5. What is needed to develop and enact it

Page 5

Filling in the map of impact evaluation in development

Page 6

1. What the research agenda needs to cover

Types of development impact evaluation

Scale: Individual evaluations, evaluations of multiple projects in a program, evaluation systems

Purpose: Identify ‘best buys’, understand how to scale up and translate effective programs, understand how to improve effectiveness

Questions: Does it work? What works? What does it take for it to work? What works for whom in what circumstances? Is it working?

Users: Donors, implementing agencies, policymakers, regional associations, communities

Done by: External evaluators, internal evaluators, managers and staff, communities

Page 7

[Figure: CDC Evaluation Framework with BetterEvaluation components overlaid — MANAGE, DEFINE, FRAME, DESCRIBE, UNDERSTAND CAUSES, SYNTHESIZE, REPORT & SUPPORT USE]

1. What the research agenda needs to cover

Aspects of development impact evaluation

Page 8

1. What the research agenda needs to cover

Page 9

1. What the research agenda needs to cover

Page 10

Page 11

2. What is needed to develop the research agenda

Consultations with: Different parties involved in conducting, managing, using and being affected by impact evaluation

Consultations about: Gaps in knowledge, issues, priorities, opportunities

Review of: Documentation and guidance for development impact evaluation – and impact evaluation generally; issues and challenges in impact evaluation for development; previous research into impact evaluation; potential methods, tools and approaches from other areas of evaluation and research; promising examples and recent innovations in development impact evaluation

Page 12

3. Types of research needed

Documenting practice: Retrospectively, concurrently; good practice, problematic practice; micro-interactions; decision-making heuristics

Positive deviance: Learning by intended users from success cases

Trials: To address particular issues through various tools/methods/strategies

Trials: Of possible uses for particular tools/methods/strategies

Longitudinal studies: Of the impact of impact evaluation

Supporting interdisciplinary communities of practice

Knowledge translation: To other contexts (sectors, organisations, roles)

Page 13

4. Some burning research questions

OVERALL

1. How do we do impact evaluation that actually supports development?

2. How do we support all agents of development, including communities, to be reflective and empirical about the impact of their work?

3. Why does so much development impact evaluation fail to be informed by what has been learned about effective evaluation?

Page 14

MANAGE an evaluation or evaluation system

4. What are effective ways to support communities to have genuine involvement in decision making about evaluations?

5. How can an evaluation accommodate different ideas among intended users about what constitutes credible evidence?

6. When should different strategies be used for developing an evaluation design (as part of the brief, as part of the proposal, as a separate project)?

7. How can an evaluation design best accommodate emerging issues?

8. How can organisations working in the same region share information and data collection?

8b. What are options for funding public interest evaluations not under the control of the powerful?

Page 15

DEFINE what is to be evaluated

9. How can a theory of change/program theory effectively represent complicated aspects of interventions (multiple layers, components and partners) and complex aspects (adaptability, emergence)?

10. How can an organisation support projects to have locally specific theories of change/program theory that are still broadly coherent?

11. What are effective strategies for identifying possible negative impacts in advance?

12. What investments and activities are the subjects of evaluation? What examination is made of others?

Page 16

FRAME the boundaries of the evaluation

13. What are effective processes for developing good Key Evaluation Questions – ones that are likely to be useful and feasible?

14. How can implicit values about results, processes and distribution of benefits be made explicit?

14b. How can evaluations deal with ‘undiscussables’ – e.g. actual but unstated program objectives, unaddressed poor performance?

Page 17

DESCRIBE activities, outcomes, impacts, context

15. When is purposeful sampling most appropriate, and how can it be used validly and effectively?

16. How can long-term results be followed up?

17. How can unanticipated negative outcomes and impacts be identified and addressed in data collection and reporting?

18. How can reasonable intermediate outcomes be identified for an evaluation that will end before impacts are evident?

19. How can Big Data be used effectively for development impact evaluation?

19b. What standard measures and indicators should be used for common outcomes and impacts of interest?

Page 18

UNDERSTAND CAUSES of outcomes and impacts

20. What are credible methods and strategies for non-experimental causal inference in development impact evaluations?

Page 19

SYNTHESISE data from one or more evaluations

21. How can different values be accommodated in developing an overall evaluative judgement?

22. How can systematic reviews that do not exclude materials on the basis of a hierarchy of evidence deal with the large number of potentially relevant sources?

Page 20

REPORT AND SUPPORT USE

23. How can a development impact evaluation respond to significant changes in the intended users during the course of an evaluation?

24. How can a development impact evaluation provide a coherent message without focusing only on average effects?

25. What are effective strategies for supporting use of development impact evaluation, especially in difficult situations – e.g. fragile states, changing decision makers?

25b. What processes and structures can be created to protect those ‘speaking truth to power’?

Page 21

5. What is needed to develop and enact a research agenda

Legitimacy

Resources, especially complementary to existing resources

Interdisciplinary co-operation

‘Creative abrasion’