Page 1

Evaluating complex development interventions in real-time: the African Institutions initiative

Sonja Marjanovic, on behalf of the evaluation and learning team

March 2012

Measuring Impact of Higher Education for Development Conference, London

Page 2

Outline

– Background and context: what influences evaluation design

– Evaluating the African Institutions initiative: design and methods, challenges and opportunities

– For take-away (food for thought): potentially relevant areas for consideration in developing indicators for complex research capacity-building interventions

Page 3

Evaluation approach is influenced by many factors, including:

– Why are we evaluating?

– How complex is the intervention being evaluated?

Page 5

The interventions we evaluate vary in complexity

• Simple, complicated, complex (e.g. Rogers 2008; Campbell et al., 2007)

• Several criteria influence an intervention's level of complexity and have implications for evaluation:

– Context in which intervention is being developed and deployed

– Evidence base on success factors for intervention

– Component complexity of intervention

– Ability to specify outputs upfront (range, predictability and probability)

Page 6

The African Institutions initiative is a complex intervention

Many uncertainties in intervention context (e.g. socioeconomic, political)
• Can lead institutions manage funding effectively?
• Will political instability interfere with the intervention?

Component complexity: many interdependent parts must function together for the intervention to be sustainable
• Alignment of parts is not straightforward (skills, programmes, management, infrastructure)
• Intervention and context are highly interdependent

Ability to specify the full range of outcomes upfront with high certainty is limited
• High propensity for adaptation and change over time
• Unforeseen consequences

Evidence base on success factors is mixed and fragmented
• Some elements are more tried and tested than others (e.g. individual vs. institutional, network)
• More is known about challenges than solutions

Page 7

Real-time evaluation is of particular benefit for complex interventions and uncertain environments:

• When ongoing learning and informing programme implementation are important

• For high-risk initiatives where adaptability is important

• When evaluation has multiple objectives: summative and formative

Page 8

For the African Institutions initiative, we are using real-time approaches to:

1. Evaluate the performance of each consortium and ultimately the initiative as a whole

2. Legacy and learning: extract lessons learnt from the initiative and disseminate insights

3. Help support the networking efforts of consortia to improve learning, strengthen shared experiences and promote resource sharing

Rooted in tried and tested theory; flexible, bespoke, participative, objective in inferences

Page 9

Our evaluation is based on theory of change and realist evaluation methods

• Theory of change surfaces the perceived causal mechanisms through which an initiative is intended to deliver benefits, and the underlying assumptions (e.g. Weiss, 1995; Ling et al., 2012)
– What is each consortium trying to achieve?
– How are they hoping to achieve their objectives?
– Why do they think their approach will work?

• Realist evaluation emphasises intervention contexts (e.g. Pawson and Tilley, 1997)
– What works, for whom, in what circumstances
– Engages local expertise and insights
– Requires relationship-building, time, training, and listening to African voices
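Not from the original slides: a minimal sketch, in Python, of how a realist context-mechanism-outcome (CMO) hypothesis of the kind described above could be recorded as structured data during fieldwork. All field names and the example values are hypothetical assumptions, not the initiative's actual instruments.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CMOConfiguration:
    """One realist hypothesis: what works, for whom, in what circumstances."""
    context: str                 # circumstances in which the intervention operates
    mechanism: str               # how the intervention is expected to generate change
    outcome: str                 # change expected if the mechanism fires in this context
    evidence: List[str] = field(default_factory=list)  # sources gathered over time

# Hypothetical consortium-level example
example = CMOConfiguration(
    context="Partner university with heavy teaching loads and few funded research posts",
    mechanism="Protected research time combined with mentored grant-writing support",
    outcome="More competitive grant applications led by early-career researchers",
)
example.evidence.append("Baseline interviews with consortium leads")

Recording hypotheses in this form makes it easier to revisit them against evidence collected in later evaluation rounds.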

Page 10

Logic modelling maps the sequences of activities that connect actions to intended consequences

• Helps stakeholders specify and agree on intended outcomes, outputs, activities and inputs, and necessary conditions

• But milestones are NOT set in stone; they are a guide

Page 11

INPUT | PROCESS | OUTPUT | OUTCOME

For each of the four capacity-building themes, indicators and measures are defined at every stage of the logic model:

1. Capacity-building in scientific skills and careers
2. Capacity-building in research management, governance, administration
3. Capacity-building related to physical and ICT infrastructure
4. Learning, linkage, and exchange: communications and networking

• Helps stakeholders specify and agree on intended outcomes, outputs, activities and inputs, and necessary conditions

• Helps set milestones, and develop S.M.A.R.T. indicators for evaluating progress

• Relating inputs and processes to outcomes: examining causal effects and mechanisms, ‘linking constructs’ (McDavid & Hawthorn, 2006)

• Examining evaluation criteria: relevance, efficiency, effectiveness, utility and sustainability of an intervention
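Not part of the original slides: a minimal sketch of how the logic-model matrix above could be held as structured data, with indicators attached to each (theme, stage) cell. The theme names come from the slide; the field names and the sample indicator entry are hypothetical.

from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

STAGES = ["input", "process", "output", "outcome"]
THEMES = [
    "Capacity-building in scientific skills and careers",
    "Capacity-building in research management, governance, administration",
    "Capacity-building related to physical and ICT infrastructure",
    "Learning, linkage, and exchange: communications and networking",
]

@dataclass
class Indicator:
    description: str              # what is measured
    measure: str                  # how it is measured (data source, unit)
    target: Optional[str] = None  # milestone or target; a guide, not set in stone

# One list of indicators per (theme, stage) cell of the matrix
logic_model: Dict[Tuple[str, str], List[Indicator]] = {
    (theme, stage): [] for theme in THEMES for stage in STAGES
}

# Hypothetical example entry
logic_model[(THEMES[0], "output")].append(
    Indicator(
        description="PhD and MSc scholarships taken up and completed",
        measure="Counts per consortium per year, from consortium reports",
        target="Agreed annually with each consortium",
    )
)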

Page 12

How are we implementing the evaluation?

Throughout: establishing and nurturing mutual understanding and cooperative relationships

Guiding threads: establishing where a consortium is coming from; understanding where it is heading; understanding how things are going over time; learning and sharing the learning

Work Package 1, Part A:
• Establishing relationships
• Assessing baseline research capacity at institutions

Work Package 1, Part B:
• Specifying intervention logic
• Risk management & SWOT
• Evaluation framework and indicators
• Milestones and targets

Work Package 2: ongoing co-evaluation and interim reporting (annual KPI framework, quarterly engagement elements)

Work Package 3: supporting networking and exchange

Work Package 4: endline and initiative-wide assessments and lessons learned

Page 13

Complexities of real-time evaluations in development contexts, for evaluators and initiative participants

• Managing consortia requests for advice on strategic direction

• Impacts of political turbulence on timeliness of evaluation evidence and project management

• Cultural challenges and historical sensitivities

• Limited baseline evaluation (not monitoring) capacity in consortia (some exceptions)

• Competing demands on staff time: mobilising engagement, staff turnover

• Importance of designated posts, succession planning, training

Page 14

But also many opportunities from the participative approach, and benefits for interventions

• Building local evaluation capacity

• Knowledge management in networks and building organisational memory

• Providing timely evidence to increase chances of programme success and sharing learning
– Highlighting areas where adaptation is needed
– Sharing how others address similar issues

• Annual deliverables useful for ongoing fundraising!

Page 15

A TAKEAWAY (FOOD FOR THOUGHT) FOR YOUR OWN TIME

The following slides share some examples of the types of issues that indicators of research capacity building for HE interventions in development contexts might explore...

Page 16

Some points to consider

• These are just some examples of areas of capacity building we are exploring in this initiative

• They might apply more widely to other higher education interventions in development efforts, but this will obviously be context-dependent

• Longer-term outcomes are often aspirational and not fully within the control of a single initiative; however, it is still important to examine contributions towards them

• Important considerations: attribution vs contribution; time-lags

• The slides that follow cover examples of process, output and outcome indicators

Page 17

Examples only: Training and empowerment of individuals to conduct and lead research

• Take-up of post-doc, PhD, and MSc scholarships
• Completion rates and changes in drop-out dynamics
• Are there clear criteria, roles and responsibilities for supervisors?
• Is there feedback on quality of supervision and training courses?
• Numbers, types and distribution of researchers (newly trained, or existing with new skills) across the career pathway in a region
• Evidence of new and improved training programmes accredited by institutions
• Is there better access to existing training opportunities in the region (e.g. linked to opportunities for credit transfer and more inter-institutional collaboration)?
• Evidence of knowledge outputs (e.g. publications and citations as evidence of scientific impact)
• Evidence of improved ability to obtain third-party funding for research sustainability in the region's institutions

Page 18

Examples only: Strengthening career development prospects at universities (institutional receptiveness and support)

• Are there advocacy efforts for research support in institutions, with Deans and Vice-Chancellors? With Ministries?
• Evidence of continuing professional development training opportunities in the institution
• Institutionalisation of research positions
• Evidence of research being valued? Is there increased demand for it?
• Systems for merit-based promotion?
• Greater availability of competitive small grant schemes over time in the region?
• Dedicated research time supported by institutions?
• Supervision quality monitored and rewarded?
• Third-party funding for research sustainability

Page 19

Examples only: Improving research governance, management and administration capacity

• Is there improved access to training in research management (e.g. grant-writing, financial management, ethics, project management, supervision, publication writing)?
• Evidence of improved governance structures, management systems, policies, and procedures in institutions?
• Transparent processes for distribution of funding, be it based on merit or equity?
• Better guidelines for monitoring of supervision and training quality embedded in departments and faculties?
• Better knowledge management systems (tracking people, grants, publications)?
• Are research management and administration staff with new and improved skills embedded at institutions?
• Is there evidence of more coordinated use of support structures within institutions?

Page 20

Examples only: Physical infrastructure

• Is the process for distributing infrastructure funding based on clear criteria and the needs of individual, institution and region?
• Is there greater sharing of available resources within institutions and between projects?
• Is there evidence of impact from infrastructure investments on research and training quality?
– e.g. research which would not be possible without new infrastructure

Page 21

Examples only: Equitable and sustainable South–South and South–North networks

• Are planned networking interventions being implemented? Is there take-up?
– e.g. staff and student exchanges; joint supervision; cross-appointments
• Levels and diversity of collaborative dissemination over time?
– in training, research, dissemination, skills and resource sharing
• Network sustainability (longer-term aspirations):
– Are collaborative publications and grants by partners sustained?
– Increased commitment to research by ministries and policy-makers who see the value of outputs?
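Not part of the original slides: a sketch of how the example questions on Pages 17 to 21 could be grouped into a simple indicator catalogue keyed by capacity-building area. The area names come from the slides; the structure and the selection of entries are illustrative assumptions only.

from typing import Dict, List

# Illustrative catalogue: a few example indicator questions per capacity-building area
indicator_catalogue: Dict[str, List[str]] = {
    "Training and empowerment of individuals to conduct and lead research": [
        "Take-up and completion of post-doc, PhD and MSc scholarships",
        "Feedback on quality of supervision and training courses",
    ],
    "Strengthening career development prospects at universities": [
        "Systems for merit-based promotion",
        "Dedicated research time supported by institutions",
    ],
    "Improving research governance, management and administration capacity": [
        "Transparent processes for distribution of funding",
        "Knowledge management systems tracking people, grants, publications",
    ],
    "Physical infrastructure": [
        "Sharing of available resources within institutions and between projects",
    ],
    "Equitable and sustainable South-South and South-North networks": [
        "Levels and diversity of collaborative dissemination over time",
    ],
}

# Example use: summarise coverage of the catalogue
for area, examples in indicator_catalogue.items():
    print(f"{area}: {len(examples)} example indicators")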

Page 23

Criteria for a fit-for-purpose evaluation approach:

• Grounded in tried and tested theory

• Bespoke:
– Evaluation has multiple objectives: accountability, learning, advocacy
– Consortia have a mix of common and unique features
– The A.I.I. is multidimensional capacity building (individuals, institutions, networks)

• Participative (co-evaluation):
– To maximise relevance and learning, and enable timely action on evidence: formative and summative
– Engages and evaluates consortia and the Trust
– Aims to support a self-improving system (e.g. Narayan, 1993; Cousins and Whitmore, 2004)

• Flexible:
– To be able to address the complexity of the initiative and potentially changing priorities, contexts, resources

• Practical:
– Feasible; balances breadth and depth

• Objective and independent