Innovations in Evaluation IPDET Workshop, Ottawa, June 14 2013 Simon Roy & Louise Mailloux


Page 1: Innovations in Evaluation

Innovations in Evaluation

IPDET Workshop, Ottawa, June 14 2013

Simon Roy & Louise Mailloux

Page 2: Innovations in Evaluation

Definitions
Innovations in Canada
Innovations on the International Scene
Discussion: What’s your experience?

Outline

Page 3: Innovations in Evaluation

Innovations can be defined as alternative and new ways of conducting evaluations (methods, analyses, governance, etc.)

Many drivers:
− Methodological challenges affecting data quality or availability
− Opportunities stemming from new technologies
− Influence from other disciplines/professions
− HR or governance challenges

A Definition…

Page 4: Innovations in Evaluation

Innovations are region-specific: What is innovative in one place may not be in another area

Some innovations may work in one country, but not in another


Contextual factors

Page 5: Innovations in Evaluation


Recent Innovations in Canada

Page 6: Innovations in Evaluation

Multi-mode approaches in surveys

Focus on cost analyses

Professionalization of evaluation: certification of evaluators

Three notable innovations in Canada in the last decade

Page 7: Innovations in Evaluation

Surveys were traditionally administered in a single mode: mail, phone or fax

Low response rates are now a major problem

Evaluators have moved to surveys administered in multiple modes: respondents are offered the choice of completing them online, by phone or by mail

Advantages: higher response rates, less sampling bias

Disadvantage: there is a bias associated with the mode of administration

Multi-Mode Surveys
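The response-rate vs. mode-bias trade-off above can be illustrated with a small sketch; all mode names and figures below are hypothetical, not data from the workshop:

```python
# Hypothetical multi-mode survey results (illustrative numbers only):
# each mode records invitations sent, completes, and the mean score on one item.
modes = {
    "online": {"invited": 400, "completed": 120, "mean_score": 3.2},
    "phone":  {"invited": 300, "completed": 150, "mean_score": 3.9},
    "mail":   {"invited": 300, "completed":  60, "mean_score": 3.5},
}

def overall_response_rate(modes):
    """Combined response rate across all modes."""
    invited = sum(m["invited"] for m in modes.values())
    completed = sum(m["completed"] for m in modes.values())
    return completed / invited

def pooled_mean(modes):
    """Completion-weighted mean of the item across modes."""
    total = sum(m["completed"] for m in modes.values())
    return sum(m["mean_score"] * m["completed"] for m in modes.values()) / total

def mode_gaps(modes):
    """Gap between each mode's mean and the pooled mean -- a crude
    screen for the mode effect the slide warns about."""
    pooled = pooled_mean(modes)
    return {name: round(m["mean_score"] - pooled, 2) for name, m in modes.items()}

print(overall_response_rate(modes))  # 0.33
print(mode_gaps(modes))
```

Large gaps between a mode's mean and the pooled mean are the kind of signal that would prompt a closer look at mode effects before pooling responses.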

Page 8: Innovations in Evaluation

Many governments are moving towards “value for money” analyses, including analysis of inputs, outputs and outcomes in view of the costs involved

The innovation lies in the refinement of the approaches used to conduct such analyses

Cost Analyses

Page 9: Innovations in Evaluation

Perspectives on Assessing Resource Utilization and the Results Chain

[Diagram: the results chain runs Inputs → Activities → Outputs → Immediate Outcomes → Intermediate Outcomes → Ultimate Outcomes, with economy, operational efficiency and allocative efficiency assessed at successive points along the chain; the legend distinguishes the primary focus of analysis from what informs the analysis.]

The analysis for economy, operational efficiency and allocative efficiency occurs along the results chain.

Page 10: Innovations in Evaluation

Canadian Evaluators have an association: The Canadian Evaluation Society (CES) (http://www.evaluationcanada.ca/)

The CES implemented an Evaluation Credentialing Program in 2010. Evaluators can become “Credentialed Evaluators”

This is an association-led initiative; Canadian governments have no direct control over this credential.

It is not a requirement to conduct evaluations.


Credentialing

Page 11: Innovations in Evaluation

Canadian Evaluators can receive a credential if they meet criteria (demonstration of competency), including 2 years of evaluation experience and competencies in 5 areas (see appendix)

Expected benefits: Evaluators gain recognition. Credentials help evaluation users select evaluation provider.

About 200 credentialed evaluators to date.


Credentialing

Page 12: Innovations in Evaluation

Evaluation is evolving – becoming more and more complex

Before discounting new ways, look at the advantages, especially how they can compensate for limitations of traditional approaches (traditional methods have gaps too!)

Weigh the advantages against the disadvantages, and manage them to reduce the latter. Have a backup plan.

Our Overall Lessons to Date

Page 13: Innovations in Evaluation


Innovations

International Development Context

Page 14: Innovations in Evaluation

Real Time Evaluations (RTE)

Digital Data

Page 15: Innovations in Evaluation

A definition of RTE

A real-time evaluation (RTE) is an evaluation in which the primary objective is to provide feedback in a participatory way in real time (i.e. during the evaluation fieldwork) to those executing and managing a humanitarian response.

Source: Cosgrave, J., Ramalingam, B., & Beck, T. (2009). Real-time evaluations of humanitarian action: An ALNAP Guide (pilot version).

Page 16: Innovations in Evaluation

Origins of RTEs

In the humanitarian sector, UNHCR’s Evaluation and Policy Analysis Unit (EPAU) was for several years the chief proponent of RTE

WFP, UNICEF, the Humanitarian Accountability Project, CARE, World Vision, Oxfam GB, the IFRC, FAO and others have all to some degree taken up the practice.

Source: Herson, M., & Mitchell, J. (2005). Real-Time Evaluation: where does its value lie? Humanitarian Exchange Magazine, Issue 32, December 2005, ALNAP.

Page 17: Innovations in Evaluation

RTE vs other types of evaluations

RTEs look at today to influence this week’s/month’s programming

Mid-term evaluations look at the first phase to influence programming in the second phase

Ex-post evaluations are retrospective: they look at the past to learn from it

Page 18: Innovations in Evaluation

Key Features/Methods

Semi-structured interviews
Purposeful sampling, complemented by snowball sampling in the field
Interviews with beneficiary groups important
Observation

Page 19: Innovations in Evaluation

Methodological Constraints of RTE

Limited use of statistical sampling (sample frame)
Limited use of surveys
Lack of pre-planned coordination between humanitarian actors
Baseline studies usually nonexistent
Attribution (cause and effect) difficult given the multiplicity of actors

Source: Brusset, E., Cosgrave, J., & MacDonald, W. (2010). Real-time evaluation in humanitarian emergencies. In L. A. Ritchie & W. MacDonald (Eds.), Enhancing disaster and emergency preparedness, response, and recovery through evaluation. New Directions for Evaluation, 126, 9–20.

Page 20: Innovations in Evaluation

Lessons - Advantages

Timeliness: RTEs bring in an external perspective, analytical capacity and knowledge at a key point in a response.

Perspective: RTEs reduce the risk that early operational choices bring about critical problems in the longer term.

Interactivity: RTEs enable programming to be influenced as it happens, allowing agencies to make key changes at an intermediate point in programming.

Page 21: Innovations in Evaluation

Lessons - Challenges

Utilisation: weakness in the follow-up on recommendations
Ownership: workers, managers, beneficiaries?
Focus: What are the key questions?
Meeting each partner’s needs for accountability and learning
Few RTEs in complex emergencies

Source: Polastro, R. Lessons from recent Inter-Agency Real-Time Evaluations (IA RTEs).

Page 22: Innovations in Evaluation

Digital data and tools

Page 23: Innovations in Evaluation

Rationale behind it

Explosion in the quantity and diversity of high-frequency digital data, e.g. mobile-banking transactions, online user-generated content such as blog posts and Tweets, online searches, satellite images, computerized data analysis.

Digital data hold the potential (as yet largely untapped) to allow decision makers to track development progress, improve social protection, and understand where existing policies and programmes require adjustment.

Source: Global Pulse (2012). Big Data for Development: Challenges & Opportunities. May 2012, www.unglobalpulse.org

Page 25: Innovations in Evaluation

Big Data – UN Initiative

1) Early warning: early detection of anomalies in how populations use digital devices and services can enable faster response in times of crisis

2) Real-time awareness: Big Data can paint a fine-grained and current representation of reality which can inform the design and targeting of programs and policies

3) Real-time feedback: makes it possible to understand human well-being and emerging vulnerabilities, in order to better protect populations from shocks
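The early-warning idea in point 1) can, at its simplest, amount to flagging days on which service usage deviates sharply from its recent baseline. The sketch below uses made-up daily call counts and a trailing-window deviation test; it is an illustration of the concept, not the UN initiative's actual method:

```python
# Hypothetical "early warning" on digital-device usage: flag days whose
# call volume deviates sharply from the trailing-window baseline.
def anomalies(series, window=7, threshold=3.0):
    """Return indices where the value deviates from the trailing-window
    mean by more than `threshold` standard deviations."""
    flagged = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = sum(baseline) / window
        var = sum((x - mean) ** 2 for x in baseline) / window
        std = var ** 0.5 or 1.0  # guard against flat (zero-variance) baselines
        if abs(series[i] - mean) / std > threshold:
            flagged.append(i)
    return flagged

# Made-up daily call counts: stable around 100, then a crisis-day collapse.
calls = [101, 99, 102, 98, 100, 103, 97, 100, 101, 40, 99, 102]
print(anomalies(calls))  # [9] -- the collapse on day 9
```

A real deployment would need seasonality handling and much larger windows, but the structure (baseline, deviation, threshold) is the same.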

Page 26: Innovations in Evaluation

Potential Uses and Focus

ILO, UNICEF and WFP are researching changes in social welfare, especially with regard to food and fuel prices, and employment issues

The number of tweets discussing the price of rice in Indonesia in 2011 followed a pattern similar to the official inflation statistics for the food basket.
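The kind of comparison described above (tweet volume tracking official statistics) boils down to correlating two monthly series. The sketch below uses invented numbers, not the actual Indonesian data:

```python
# Sketch of comparing a tweet-count series against official statistics.
# Both monthly series below are made up for illustration.
tweet_counts = [120, 135, 150, 180, 210, 260, 310, 300, 280, 250, 230, 220]
food_inflation = [3.1, 3.3, 3.5, 4.0, 4.6, 5.4, 6.1, 6.0, 5.7, 5.2, 4.9, 4.8]

def pearson(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(tweet_counts, food_inflation)
print(round(r, 3))  # close to 1.0 for these made-up, co-moving series
```

A high correlation like this is suggestive, not causal: both series may simply respond to the same underlying price shock.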

Page 27: Innovations in Evaluation

What is Big Data?

“Big Data” is a popular phrase used to describe a massive volume of both structured and unstructured data that is so large that it is difficult to process with traditional database and software techniques.

Types of digital data sources:
1. Data Exhaust
2. Online Information
3. Physical Sensors
4. Citizen Reporting or Crowd-sourced Data

Page 28: Innovations in Evaluation

Lessons Learned to Date - Privacy

Privacy is an overarching concern that has a wide range of implications vis-à-vis data acquisition, storage, retention, use and presentation
− People routinely consent to the collection and use of web-generated data by simply ticking a box, without fully realising how their data might be used or misused.
− Do bloggers consent to have their content analyzed by publishing on the web?

Page 29: Innovations in Evaluation

Lessons Learned to Date - Access and Sharing

While much of the publicly available online data (data from the “open web”) has potential value for development, a great deal more valuable data is closely held by corporations and is not accessible

“The next movement in charitable giving and corporate citizenship may be for corporations and governments to donate data, which could be used to help track diseases, avert economic crises, relieve traffic congestion, and aid development.”

Source: Pawelke, A., & Tatevossian, A. R. (2013, May 8). Data Philanthropy: Where Are We Now?

Page 30: Innovations in Evaluation

Lessons Learned to Date - Analysis

“Conceptualisation” (i.e. defining categories, clusters); selection bias (is the data representative of the general population?)
“Measurement” (i.e. assigning categories and clusters to unstructured data, or vice versa)
“Verification” (i.e. assessing how well steps 1 and 2 fare in extracting relevant information)
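The three steps above can be made concrete with a toy pipeline: define categories, assign unstructured messages to them, then verify against a small hand-labeled sample. All category names, keywords and messages below are invented for illustration:

```python
# Step 1 (conceptualisation): define categories as keyword sets.
CATEGORIES = {
    "food_prices": {"rice", "price", "expensive", "market"},
    "employment": {"job", "work", "wage", "hiring"},
}

def classify(message):
    """Step 2 (measurement): assign a category to an unstructured message
    by keyword overlap; returns None when no category matches."""
    words = set(message.lower().split())
    best, best_hits = None, 0
    for name, keywords in CATEGORIES.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best, best_hits = name, hits
    return best

def verify(labeled_sample):
    """Step 3 (verification): accuracy of the classifier against a
    hand-labeled sample of (message, expected_category) pairs."""
    correct = sum(classify(msg) == label for msg, label in labeled_sample)
    return correct / len(labeled_sample)

sample = [
    ("rice price at the market is up again", "food_prices"),
    ("looking for work but no one is hiring", "employment"),
    ("the wage for this job is too low", "employment"),
    ("beautiful weather today", None),
]
print(verify(sample))  # 1.0 on this tiny hand-labeled sample
```

The selection-bias caveat from the slide sits outside the code: even a classifier that verifies perfectly says nothing about whether the message stream represents the general population.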

Page 31: Innovations in Evaluation


Discussion:

What’s happening in your organization/ country in terms of innovation in evaluation?

What lessons can you share about what works and what does not work?

Page 32: Innovations in Evaluation


Thank You!

Louise Mailloux – [email protected]

Simon Roy – [email protected]

Page 33: Innovations in Evaluation

1.0 Reflective Practice: competencies focus on the fundamental norms and values underlying evaluation practice and awareness of one’s evaluation expertise and needs for growth.

2.0 Technical Practice: competencies focus on the specialized aspects of evaluation, such as design, data collection, analysis, interpretation and reporting.

3.0 Situational Practice: competencies focus on the application of evaluative thinking in analyzing and attending to the unique interests, issues, and contextual circumstances in which evaluation skills are being applied.

4.0 Management Practice: competencies focus on the process of managing a project/evaluation, such as budgeting, coordinating resources and supervising.

5.0 Interpersonal Practice: competencies focus on people skills, such as communication, negotiation, conflict resolution, collaboration, and diversity.


Appendix: Competency Domains in Evaluation

Page 34: Innovations in Evaluation

Appendix: RTE Distinguishing Features

Need
− Real-time evaluations: in-the-moment feedback at critical decision points.
− Traditional evaluations: in-depth analysis in a detailed report, with the clarity of hindsight.

Types of deliverables
− Real-time evaluations: frequent in-person meetings and data summaries.
− Traditional evaluations: full report at a defined end point and potentially at mid-point.

End goal
− Real-time evaluations: getting the program to work as efficiently as possible, as soon as possible.
− Traditional evaluations: learning what worked and what didn’t, and using that information to inform the next iteration of the program.

Cost
− Real-time evaluations: may be more costly due to multiple rounds of data analysis and meetings. Since evaluation activities may evolve to meet changing information needs, costs are not always as predictable.
− Traditional evaluations: costs are generally more predictable because the activities are known at the evaluation outset.

Trade-offs
− Real-time evaluations: the analysis will not be as rigorous because in-the-moment feedback cannot achieve the same clarity as hindsight.
− Traditional evaluations: the analysis will not be available until midway through or after a program’s end. However, with the additional time available, a higher degree of rigor is possible.

Source: Nolan, C., & Lo, F. (2012, March 29). Getting Real About Real-Time Evaluation. Non-Profit Magazine.

Page 35: Innovations in Evaluation

Appendix: Types of digital data sources

(1) Data Exhaust – passively collected transactional data from people’s use of digital services like mobile phones, purchases, web searches, etc., and/or operational metrics and other real-time data collected by UN agencies, NGOs and other aid organisations to monitor their projects and programmes (e.g. stock levels, school attendance). These digital services create networked sensors of human behaviour.

(2) Online Information – web content such as news media and social media interactions (e.g. blogs, Twitter), news articles, e-commerce, job postings. This approach considers web usage and content as a sensor of human intent, sentiments, perceptions, and wants. Source: www.unglobalpulse.org

Page 36: Innovations in Evaluation

Appendix: Types of digital data sources

(3) Physical Sensors – satellite or infrared imagery of changing landscapes, traffic patterns, light emissions, urban development and topographic changes, etc. This approach focuses on remote sensing of changes in human activity

(4) Citizen Reporting or Crowd-sourced Data – Information actively produced or submitted by citizens through mobile phone-based surveys, hotlines, user-generated maps, etc. While not passively produced, this is a key information source for verification and feedback

Source : www.unglobalpulse.org