Tdh Project Cycle Management Series

Requirements for Monitoring & Evaluation
Operations – Quality & Accountability
Version: January 2020 – Author: Q&A Unit – Monitoring & evaluation advisor: [email protected] – Design: [email protected]

© Tdh/Grace Medina



Table of contents

Introduction: M&E requirements reference framework
  M&E for Quality & Accountability
  Tdh and the Core Humanitarian Standard
  What does this document include and how to use it?
  Who should use this document?

PART I - Monitoring
  1. What is monitoring?
  2. Why monitoring matters
  3. Monitoring requirements in a nutshell
  4. Tdh requirements on quality monitoring
  5. Process: what are the big steps in monitoring planning and implementation?
  6. Recap - Project monitoring: what are my responsibilities?

PART II - Evaluation
  1. What is evaluation?
  2. Why evaluation matters
  3. Evaluation requirements in a nutshell
  4. Minimum requirements for evaluation in Tdh
  5. Process: what are the big steps in evaluation planning and implementation?

Highlights
  What questions does "monitoring" help answer?
  Ethical principles for engaging children in M&E
  Quantitative and qualitative indicators
  Project reviews
  Main types of evaluations, their purpose and key questions they address
  Internal vs external evaluation: advantages and disadvantages
  What entry point? OECD-DAC criteria and CHS
  Ethical considerations in evaluation
  What is the role of the "Evaluation Manager"?
  Budgeting for Monitoring and Evaluation: typical costs


Introduction: M&E requirements reference framework

M&E for Quality & Accountability

How do we know that our interventions are appropriate, relevant and effective? How do we know that our efforts are successful, empower people and do not cause further harm to the communities? How do we ensure that we continuously learn and improve?

Establishing an M&E system implies much more than counting activities implemented or services delivered for reporting purposes. It means that we continuously analyse implementation processes and outcomes, and that we use data and feedback from the people we work with and for to orient decision-making and programming, prevent harm, and adjust strategies. Monitoring and evaluation are essential elements of the Results-Based Management approach of Project Cycle Management (PCM): they enable us to steer our projects and reflect on how our interventions have made a positive difference for children and their communities. They are opportunities to reflect critically on what we do, make sound decisions, learn, assess impact and be accountable.

Accountability: being accountable means that we, as humanitarians, use our power responsibly; that we take into account the views of different stakeholders (people affected by the situation our project addresses, donors, other humanitarian actors, our colleagues); and that we are held accountable by them. Source: ALNAP, 2017

Quality: what makes our intervention relevant, appropriate and timely; it is achieved when the concerned stakeholders are involved and their rights and dignity are strengthened. It means that our resources, time and costs are well managed, and that the project brings satisfactory changes for people, considering the situation and needs. Source: Core Humanitarian Standard

© Tdh/Marie-Lou Dumauthioz


Monitoring & Evaluation activities in the Project Cycle

Monitoring and evaluation activities occur throughout the project cycle, through a series of activities and processes that vary according to the project approach and context. Monitoring and evaluation are closely linked: both help produce knowledge and are key processes for improving quality, institutional learning and accountability.

Tdh and the Core Humanitarian Standard

The Core Humanitarian Standard for Quality & Accountability (CHS) sets out nine commitments that organisations and individuals involved in development and humanitarian interventions can use to improve the quality and effectiveness of the assistance they provide. The CHS places communities and people affected by crisis at the centre of development and humanitarian action. As a core standard, the CHS describes the essential elements of principled, accountable, and high-quality interventions.

The CHS is fully integrated with widely recognised industry standards, notably the Sphere Handbook and its companion standards, the Minimum Standards for Child Protection in Humanitarian Action, and the IASC Principles for Accountability towards Affected Populations. The CHS is also central to the Grand Bargain commitments on humanitarian action.1

1 Notably commitment 6.3: Strengthen local dialogue and harness technologies to support more agile, transparent but appropriately secure feedback


Monitoring and evaluation are essential for us to align with the CHS commitments. This resource highlights how Tdh minimum requirements on monitoring and evaluation are linked with the commitments of the CHS. It refers to the Compass Qualité & Redevabilité, which offers guidance on how to put the CHS into practice throughout the project management cycle.

What does this document include and how to use it?
This resource is NOT methodological guidance explaining how to develop and implement an M&E system.

Rather, it sets out the "must do" / "must be" in terms of M&E and the "who does what", and is meant to be used in complementarity with the other "how to" guidance available in Tdh: general guidance documents on PCM, more advanced guidance and toolboxes (see the Tdh Q&A Knowledge Center). In particular, it complements the Tdh Project Cycle Management in Emergencies and Humanitarian Crisis handbook (2017).

In other words, this resource aims to set a minimum benchmark for monitoring and evaluation quality and consistency across Tdh delegations.

It has a prescriptive character, which means that Tdh staff need to take it as their standard and are expected to apply it in their work. It:

1) Explains Tdh minimum commitments on M&E;

2) Highlights compulsory processes, roles and responsibilities;

3) Offers references and practical notes on how to undertake monitoring and evaluation processes, ensure quality products/outputs, and utilise monitoring and evaluation findings;

4) Explains how to link M&E practices with the Core Humanitarian Standard, how to mainstream cross-cutting issues such as gender and inclusion and ethical considerations, and how to connect with safeguarding policies (Global Code of Conduct, Protection from Sexual Exploitation and Abuse, Child Safeguarding Policy).

Who should use this document?
The primary intended audience of these guidelines is Tdh staff responsible for project and programme management: Project Managers, Programme Coordinators, Quality & Accountability staff and Zones, at HQ, regional and country level. It is also intended for Tdh partners. It may also be shared with external evaluators contracted to carry out monitoring and evaluation work on behalf of Tdh, and with donors.

[Diagram: Tdh M&E guidance hierarchy – Minimum rules & commitments: MUST, who does what?; General guidance: PCMiE manual; Toolbox (Q&A website); Technical notes on specific issues]


© Tdh/S. Calligaro

PART I Monitoring


PART I - Monitoring

1. What is monitoring?

Monitoring is the collection, analysis and use of data concerning events and processes related to a project’s progress. Its aim is to assess a project’s progress and ensure it is on the right track to achieve the expected results, or to observe and understand discrepancies, difficulties or even new opportunities.

See PCMiE Handbook, pages 80-108

Monitoring consists of a series of activities:

✓ Defining indicators for the project and explaining how they will be measured and calculated;

✓ Collecting data during the implementation of activities and analysing it;

✓ Carrying out one-off monitoring "studies", such as baselines, endlines or studies to explore changes in knowledge and behaviours;

✓ Organising moments to reflect on and analyse "where the project is heading" and "what the data tell us" (workshops, reviews).

Characteristics:

✓ It is carried out by the project team with the support of M&E staff;

✓ It involves a process of joint analysis (Tdh, partners, stakeholders);

✓ It is conducted in a timely way to inform decision-making, project steering and the adjustment of activities;

✓ It informs reporting and other information products;

✓ It involves the establishment of child-friendly feedback mechanisms and participatory processes (from consultation to people-led processes).

What isn't monitoring?

Scientific research conducted for academic purposes.

An audit. Audits are intended primarily to check that project resources have been used in compliance with established procedures and requirements.

An evaluation, which involves a "judgement" on the project's performance and quality by a party external to the project.

A lessons-learned exercise such as an after-action review, although learning is built continuously thanks to monitoring work.

© Tdh/Ollivier Girard


2. Why monitoring matters

The quality and accountability of an intervention depend particularly on our capacity to implement robust monitoring work. Monitoring should be flexible and can have very different objectives and scopes (from more quantitative and output-focused towards more participatory and qualitative).

Monitoring helps us answer a series of key questions [highlight] during the project design and implementation phases.

3. Monitoring requirements in a nutshell

Click on the titles to get more information on the requirements

Is our project supported by a robust Monitoring system, looking at inputs, activities, outputs, outcomes, processes, with quality indicators, including programme outcome indicators?

Do we have the right human resources and adequate technical and methodological skills?

Are we using the right Planning tools?

Do we collect, analyse and communicate data in a robust way?



Do we have enough financial, logistical and material resources to ensure that monitoring is appropriate and effective?

Is our monitoring work gender and diversity sensitive?

Is Monitoring correctly used for steering the project?

Is our monitoring ethical and safe for children, their family and their community?



4. Tdh Requirements on quality monitoring

Below are summarised the Tdh requirements for project monitoring, their link with the Tdh institutional Key Performance Indicator (KPI) framework ("how to assess requirements"), the related internal resources, and the main CHS commitments they contribute to. Institutional indicators currently measured annually at global level in Tdh are highlighted in orange and are mandatory. The others can be used at delegation or programme level to guide their development; the latter are recommended but not yet measured across Tdh. Delegations can go through the full PCM self-assessment (XLS version), structured around a complete set of indicators.

Tdh Requirements on Project Monitoring

Boxes legend: How to assess requirements · Tdh resources · Core Humanitarian Standards

1. SYSTEM
All development and emergency response projects and programmes are supported by a monitoring system. This involves implementing different activities on different scales, using articulated collection and analysis tools as part of a dynamic system.

» A monitoring system must include indicators and means of verification for inputs, outputs, activities, outcomes, context and processes (quality); a sketch of one such indicator entry follows this list.

» A monitoring system must be based on SMART indicators, checked against the Tdh checklist "Defining SMART indicators".

» A monitoring system combines qualitative and quantitative methods and tools [see highlight: quantitative & qualitative indicators].

» A monitoring system includes a mechanism to involve project stakeholders (including children and their families) and to gather and analyse their feedback (positive or negative) to orient project implementation.

» All project proposals include a description of the monitoring system to be implemented on the basis of realistic (budgeted) resources.

» All project monitoring systems include relevant programme outcome indicators from the PoIF (Programme Outcome Indicator Framework) in order to support programme steering under the 2016-2020 strategic plan.
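To make the "indicators and means of verification" requirement concrete, below is a minimal sketch of what one indicator-matrix entry might capture. The field names and values are illustrative assumptions, not the official Tdh template.

```python
# Sketch of one indicator-matrix entry (hypothetical fields and values,
# not the official Tdh template). A full matrix holds one record per indicator.

indicator_entry = {
    "level": "outcome",  # input / activity / output / outcome / context / process
    "statement": "% of caregivers reporting improved well-being of their children",
    "baseline": None,            # to be measured within the first project quarter
    "target": 0.60,
    "means_of_verification": "caregiver survey (pre/post)",
    "collection_method": "household questionnaire",
    "frequency": "baseline, midline, endline",
    "responsible": "M&E Officer",
    "disaggregation": ["sex", "age group"],
}

print(indicator_entry["statement"], "- target:", indicator_entry["target"])
```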

How to assess requirements

[SYSTEM 1] % of projects with monitoring systems that include indicators and/or information on inputs, outputs, activities, outcomes, context and processes

[SYSTEM 2] % of projects with qualitative and quantitative indicators

[SYSTEM 3] % of projects that include 1, 2, 2+ programme outcome indicators in their Logframe and M&E Plan

Tdh resources

Essential:
- PCMiE handbook 2017 (see Result Chain, p. 57; checklist "Defining SMART indicators", p. 69; levels of monitoring, p. 81)
- Tdh procedure on the "Complaint, Feedback & Response Mechanism"
- Programme Outcome Indicator Framework (PoIF), Annex: Roles & Responsibilities

Dig deeper:
- Monitoring handbook 2016
- PCM self-assessment (XLS version)

Core Humanitarian Standards

Commitment 1: response is appropriate and relevant.

Commitment 3: response strengthens capacities and avoids negative effects.

Commitment 4: response is based on communication, participation and feedback.

CPMS chapter 4


2. PLANNING
Each project has its own monitoring plan, based on its logical framework and provisional budget, finalised at most one month after the project launch and approved by the Programme Coordinator. The monitoring plan is the description and operationalisation of the monitoring system.

» Monitoring planning begins during the strategic planning phase and is finalised during the operational programming phase. The monitoring plan is drafted in an inclusive way, with project actors, including partners, involved through team meetings and workshops.

» For all projects, the monitoring plan includes at least the following elements:

o An indicator matrix to monitor contractual and in-house indicators for outputs, outcomes, processes and context, including sources, methods and collection tools. The distribution of roles and tasks for data collection, processing, analysis, diffusion and use is documented;

o A beneficiary counting matrix;

o An M&E activities workplan and a description of all major data collection exercises (baseline studies, surveys, mid-term reviews and studies);

o An M&E budget.

» For more complex projects (multi-sector, multi-country, consortium, participatory or innovation projects) entailing a more complex M&E system, it is recommended that the monitoring plan also include:

o A narrative explaining the M&E strategy for the project, including data quality management and ethical considerations;

o An information utilisation plan (reporting, studies, other communication products);

o Information flows set down in a visual.

» For all projects, a Project Follow-Up tool (PFU) is prepared and updated by the Project Manager. It includes at least the following tools:

o A project reminder sheet;

o A simplified indicator matrix;

o An indicator tracking table to monitor quantitative targets (see the sketch after this section);

o A detailed workplan with activities and sub-activities;

o A "major events" sheet.

» For all projects, a baseline measurement must be conducted within the first quarter of the project at the latest, or before an activity starts, aiming at capturing the status of the indicators before the project begins, from which change and progress can be assessed. Target values should then be refined accordingly.

How to assess requirements

[Plan 1] % of projects with monitoring plans containing all required appendices one month after being launched

[Plan 2] % of projects where Inception workshops have been held to review monitoring systems and involve project actors

[Plan 3] % of projects implemented in the course of the year using the Project Follow-Up tool (PFU)

Tdh resources

- M&E plan toolbox
- PFU introduction, guidance & template
- PCM self-assessment (XLS version)

Additional tools and references:
- Procurement plan
- Budget follow-up tool (BFU)
- Baseline basics, IFRC

Core Humanitarian Standards

Commitment 1: response is appropriate and relevant.

Commitment 3: response strengthens capacities and avoids negative effects.

Commitment 4: response is based on communication, participation and feedback.

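The indicator tracking table in the PFU boils down to comparing cumulative achieved values against quantitative targets, indicator by indicator. The sketch below shows that calculation on hypothetical indicators and figures; the 80% "on track" threshold is an illustrative assumption, not a Tdh rule.

```python
# Sketch of an indicator-tracking calculation (hypothetical data and threshold,
# not the actual Tdh PFU format).

indicators = [
    # (indicator statement, target, cumulative achieved)
    ("# of children enrolled in child-friendly spaces", 1200, 830),
    ("# of caregivers trained in positive parenting", 300, 310),
]

for statement, target, achieved in indicators:
    rate = achieved / target * 100          # achievement rate against the target
    status = "on track" if rate >= 80 else "review at next project review"
    print(f"{statement}: {achieved}/{target} ({rate:.0f}%) -> {status}")
```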

3. RESOURCES
Delegations set aside enough financial, logistical and material resources to ensure that monitoring is appropriate and effective.

» In project drafts and budget planning exercises, these resources are indicated in specific budget lines. In total, they should amount to at least 5% of the total budget (direct costs) for monitoring and evaluation, covering the following types of costs: staff, expertise, material, regular and one-off monitoring operations, staff training, equipment and information management costs. [The "5% golden rule": M&E types of costs]
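As a worked illustration of the "5% golden rule": for a project with CHF 800,000 in direct costs, the M&E budget lines should total at least CHF 40,000. A minimal sketch of that check, with hypothetical budget lines:

```python
# Illustrative check of the "5% golden rule" on hypothetical budget lines (CHF).

direct_costs_total = 800_000
me_budget_lines = {
    "M&E staff": 22_000,
    "Baseline and endline studies": 12_000,
    "Data collection equipment and software": 4_000,
    "Staff training on M&E": 3_000,
}

me_total = sum(me_budget_lines.values())          # CHF 41,000
share = me_total / direct_costs_total             # ~5.1% of direct costs
print(f"M&E budget: CHF {me_total:,} ({share:.1%} of direct costs)")
assert share >= 0.05, "M&E budget lines fall below the 5% golden rule"
```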

4. PROJECT STEERING
Monitoring data is used for steering purposes.

» A steering body and steering mechanism are established for the project, clearly defined and described in the project's monitoring plan and proposal. They include Tdh staff and partners.

» Monitoring data is analysed and discussed in senior management meetings involving the logistical, financial and programme departments during project reviews [Highlight: project reviews], on a quarterly basis for projects lasting over 12 months and on a monthly basis for emergency or short-term projects (under 12 months).

How to assess requirements

[RESOURCES 1] Number and % of projects whose monitoring & evaluation resources, as indicated in specific budget lines, represent at least 5% of the total budget

Tdh resources

- Tdh Golden Rules for Budgeting
- M&E budget typical costs: see Guidance on budgeting for M&E
- PCM self-assessment tool – Monitoring

Core Humanitarian Standards

Commitment 8: Staff are supported to do their job effectively

Commitment 9: Resources are managed and used responsibly for their intended purpose

How to assess requirements

[Steering 1] % of projects that have a committee to discuss monitoring information

[Steering 2] % of projects that have mechanisms to follow up on monitoring recommendations

[Steering 3] Number of projects that undergo quarterly strategic reviews

Tdh resources

- Tdh Project Review format
- PCM self-assessment tool – Monitoring

Core Humanitarian Standards

Commitment 1: response is appropriate and relevant.

Commitment 3: response strengthens capacities and avoids negative effects.

Commitment 7: continuous learning and improvement.

Commitment 8: staff are supported to do their job effectively.


5. HUMAN RESOURCES
Delegations have the technical and methodological skills required to implement monitoring systems.

» Each team counts on a qualified staff member to manage the monitoring process, ideally as part of an M&E or Q&A department. To support critical reflection, Monitoring and Evaluation (M&E) staff should ideally not be part of project or programme teams; in other words, they should not implement projects while at the same time monitoring those projects' outcomes.

» If specialist staff are unavailable, the monitoring skills of programme and project teams should be enhanced. Professional training must be provided, covering technical and methodological issues as well as communication, planning, operational, analytical and drafting skills.

» Activities of Monitoring and Evaluation staff should fit the standard job descriptions approved by Tdh Lausanne (for M&E Managers, M&E Officers, M&E Assistants and M&E/IM Operators).

» If necessary, technical assistance must be requested from M&E advisors based in the regions or at headquarters, or from experts in the field.

» When in-house skills or time are lacking, delegations should call on external experts, using terms of reference approved by programme coordinators / thematic specialists. The types of activities likely to be outsourced include:

o Baseline studies;

o Surveying;

o Data collection;

o Qualitative studies.

How to assess requirements

[HR 1] Number of staff whose Q&A skills are strengthened (per year, per delegation and per region)

[HR 2] % of M&E positions with job descriptions that correspond to Tdh's standard job descriptions

[HR 3] Number of M&E managers, officers and assistants per delegation and per region

[HR 4] % of delegations with at least 1 M&E staff member at national level

[HR 5] Number / % of delegations where national-level M&E staffing is about 1 person per four projects or per EUR 1 million of country budget

Tdh resources

- See specifications for the Q&A department and standard M&E, Q&A and IM job descriptions
- Standard ToR for surveys, ToR for evaluations, consultant selection grid
- Tdh Method Pack

Core Humanitarian Standards

Commitment 3: response strengthens capacities and avoids negative effects.

Commitment 7: continuous learning and improvement.

Commitment 8: staff are supported to do their job effectively.

CPMS chapter 2


6. DATA MANAGEMENT
All data collection exercises are based on a clearly identified information need and an appropriate methodology, and are used following a sound analysis process, as per data cycle management.

» For any data collection, a methodological note (or ToR) and an analysis plan are prepared.

» Only needed data should be collected, in order to avoid respondent fatigue and spare resources.

» Biases are caused by preconceptions, intentions and stereotypes that can be unconscious and lead to major errors and bad-quality data. Bias can occur during the design, data acquisition, analysis and communication processes. A conscious and collective effort has to be made to minimise and mitigate biases.

» Collected data has to be verified by crossing different sources and methods and through teamwork (triangulation); see the sketch after this list.

» When possible and appropriate, mobile data collection should be considered to gain efficiency and quality, following consideration of the Tdh-CartOng MDC toolkit.
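One common triangulation step is to cross-check the same figure from two independent sources and flag discrepancies for follow-up. A minimal sketch, with hypothetical sites, counts and tolerance:

```python
# Sketch of a triangulation check: compare beneficiary counts from two
# independent sources and flag sites where they diverge (hypothetical data).

registration = {"Site A": 412, "Site B": 198, "Site C": 275}  # registration lists
attendance = {"Site A": 405, "Site B": 240, "Site C": 271}    # attendance sheets

TOLERANCE = 0.10  # flag gaps above 10% for verification (illustrative threshold)

for site in registration:
    a, b = registration[site], attendance[site]
    gap = abs(a - b) / max(a, b)
    if gap > TOLERANCE:
        print(f"{site}: {a} vs {b} ({gap:.0%} gap) -> verify with a third source")
```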

How to assess requirements

[DATA 1] All delegations carry out a data protection self-assessment annually, as a minimum

[DATA 2] % of delegations developing a methodological note (ToR) and an analysis plan during the design phase of a study or data collection exercise

Tdh resources

- Data protection starter kit, including the Directive on Data Protection, an introduction to data protection, model SOPs and tutorials, and the data protection self-assessment tool (XLS)
- Tdh method pack: ToR, how to build a questionnaire, bias in surveys, sampling, what method to choose, analysis plan… and much more
- Tdh guidance on focus groups
- Principles for data management
- Data viz pack
- MDC toolkit: prerequisites for MDC; risks of misusing MDC

Core Humanitarian Standards

Commitment 1: response is appropriate and relevant.

Commitment 3: response strengthens capacities and avoids negative effects.

CPMS chapter 5


7. GENDER AND DIVERSITY
Monitoring work is gender- and diversity-sensitive and based on the principle of inclusion.

» Project logframes should include gender-sensitive indicators (indicators that measure gender-related changes in society and help us to understand how changes in gender relations happen).

» Stakeholders of different genders and diversities are involved in the design of the monitoring tools (especially women and girls). The tools reflect their realities (questions are not biased; adequate methodologies are chosen).

» The complaint, feedback and response mechanism uses accessible and appropriate tools to capture the satisfaction of women, men, boys, girls, and any potentially vulnerable or marginalised group.

» Teams involved in monitoring work are mixed, and women are encouraged to undertake data collection work on the ground in optimal conditions.

» Data collection for monitoring ensures the meaningful participation of women, men, boys, girls and any potentially vulnerable or marginalised groups.

» All person-related data collected is sex- and age-disaggregated (see the sketch after this list).

» Data analysis must be oriented towards understanding gender roles and inequality, and how these affect the project intervention and results.
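As an illustration of the sex- and age-disaggregation requirement, the sketch below tabulates participation by sex and age group on hypothetical records; the column names and age brackets are assumptions, not a Tdh standard.

```python
# Sketch of sex- and age-disaggregated reporting on hypothetical monitoring
# records. Requires pandas.
import pandas as pd

records = pd.DataFrame({
    "sex": ["F", "M", "F", "F", "M", "M", "F"],
    "age": [9, 34, 15, 41, 12, 8, 17],
    "attended_session": [True, True, False, True, True, True, False],
})

# Illustrative age brackets; adapt to the project's own categories.
records["age_group"] = pd.cut(
    records["age"], bins=[0, 17, 59, 120], labels=["0-17", "18-59", "60+"]
)

# Participation disaggregated by sex and age group.
table = (records.groupby(["sex", "age_group"], observed=True)["attended_session"]
                .agg(participants="count", attended="sum"))
print(table)
```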

How to assess requirements

[G&D 1] Number of projects launched in the course of the year that are at least gender- and diversity-responsive

[G&D 2] % of projects launched in the course of the year that collect (and analyse) sex- and age-disaggregated data

[G&D 3] % of projects that collect and analyse data on disability and potentially other vulnerable groups

Tdh resources

- Tdh Gender and Diversity Marker
- Tdh FGD with Children guide
- Washington Group set of disability-related questions
- Tdh G&D section of the Q&A Knowledge Center
- HI: disability data collection – review of the use of the Washington Group questions

Core Humanitarian Standards

Commitment 1: response is appropriate and relevant.

Commitment 3: response strengthens capacities and avoids negative effects.

Commitment 8: staff are supported to do their job effectively.


8. ETHICS AND SAFEGUARDING
Monitoring work is ethical and safe for people and staff.

» Monitoring must not cause harm or damage under any circumstances. Any data collection, processing, analysis and communication activity is carried out following careful ethical reflection, including on do-no-harm principles, and in line with the Tdh Global Code of Conduct (including the Child Safeguarding Policy and child protection good practices) and the 10 golden rules of data protection. A risk-benefit analysis must be completed for all monitoring data collection work: assess whether it actually needs to be done, whether people need to be involved and in what capacity, and what impact participating in the exercise may have on people in terms of potential harm and possible benefits.

» Information is collected, processed and analysed in compliance with the principles of informed consent of parents, caregivers and children, confidentiality and privacy. Strict measures must be taken to ensure the protection and safety of people.

» Project monitoring must help identify potential or real negative effects of the action and address them as well as possible: prevent them from happening (avoidance) or, if they cannot be avoided, reduce their impact (mitigation).

» Monitoring data is used to update information allowing Tdh to address inequalities (especially those due to gender or age discrimination). Data collection must be oriented towards the use of participatory techniques allowing persons with specific vulnerabilities (low literacy levels, impairments, people generally excluded and not used to raising their voice or being listened to) to express their opinions.

» Monitoring aims to include children whenever possible and appropriate, subject to the presence of trained, experienced professionals. Suitable methods support the participation of children while taking appropriate precautions to protect, safeguard and ensure their well-being [see Highlight: Ethical principles for engaging children in M&E].

How to assess requirements

[ETH 1] Ethical considerations, including a risk-benefit paragraph, are systematically incorporated into all methodological proposals and data collection protocols.

[ETH 2] All projects include mechanisms to identify, document and analyse the negative effects of actions and follow up on decisions.

[ETH 3] Projects use data collection and sharing techniques that are suitable for the gender, age and capacities of children and for local cultures, as explained in the monitoring plan and data use plan.

Tdh resources

- Tdh Global Code of Conduct
- Tdh note on informed consent & assent with children
- Tdh checklist for safer recruitment and standardised job descriptions for M&E and Q&A staff
- Tdh survey methodology pack
- Project Cycle Handbook in Emergencies and Humanitarian Crisis
- Tdh guidance on focus groups
- Guidance on measuring MHPSS outcomes (available soon)
- Tdh (2017) Child Protection Quality Framework
- Note on errors and bias

Core Humanitarian Standards

Commitment 3: response strengthens capacities and avoids negative effects.

Commitment 4: response is based on communication, participation and feedback.


5. Process: what are the big steps in monitoring planning and implementation?

Monitoring is composed of a series of sub-processes. Here we define two main types of processes:

A) "Monitoring planning", happening across the (I) strategic design and (II) operational programming phases of the project cycle.

B) During implementation: the "data management process" (IV) and strategic reviews (III), for any M&E work involving data collection, analysis and use.

A. Monitoring planning process

[Diagram: monitoring planning process steps – not reproduced here]


B. Monitoring implementation process

We have just seen that monitoring must be rigorously planned during the strategic planning and design stage and operational programming, just after the project is launched. During project implementation, data will be collected, analysed and used in different ways:

1. Ongoing or "routine" monitoring data: mainly data collected against input, activity or output indicators on a regular basis, at a pre-agreed frequency.

2. Data collected on a pre-post basis (before and after certain activities).

3. More specific studies (such as KAP studies and baseline / midline / endline studies), often based on more ambitious methodological frameworks, mainly to follow up on outcome indicators (changes in practices, behaviours, attitudes, living conditions) and contextual changes. For this third kind of exercise, "specific studies", which involves substantial data collection and analysis work, the data management process (process IV) is described below.

During implementation, data is also analysed at regular points during the project. These moments are called "project reviews" (process III). These exercises should not be confused with evaluations; they have specific characteristics.

Project reviews:

- Are generally conducted internally
- Focus on steering and learning
- Are conducted on a periodic basis throughout the project
- Seek to analyse constraints, risks, opportunities, progress and changes in context
- Involve joint analysis: cross-checking finance / logistics / HR / activity implementation / indicators
- Cross-check points of view and reflect on the voices of beneficiaries and target groups
- Are key to analysing potential unintended effects of the project and preventing / mitigating them

For more information, see Highlight: project reviews.

[Diagram: Monitoring & Evaluation activities in the Project Cycle]


6. Recap - Project monitoring: what are my responsibilities?

M&E staff do not carry the whole burden of monitoring: it is everyone's business. Each type of function at field, regional and HQ level has specific responsibilities, which have to be contextualised according to the setup of the bureau, delegation, region and programme. The delegate is ultimately responsible for ensuring that all these roles are well distributed across the working landscape.

FIELD

Project Manager
✓ Ultimately responsible for the quality of the project
✓ Appropriateness of the M&E approach and indicators, determination of information needs
✓ Quality and timely update of the Project Follow-Up
✓ Appropriate use of M&E data in project steering, for learning and accountability
✓ Quality involvement of project stakeholders in monitoring
✓ Organises project reviews for her/his project
✓ Checks that projects include a sufficient M&E budget for his/her field

Programme coordinator / Field coordinator
✓ Coherence of the project M&E system with Tdh ToC evidence needs and the PoIF
✓ Ensures coherence between projects and programmatic strategies; ensures appropriate inclusion of programme indicators in project design
✓ Makes sure that projects include a sufficient M&E budget
✓ Quality and coherence assurance
✓ Ensures that information (evidence) needs are analysed coherently and data collection is kept within what is relevant
✓ Supports the organisation of periodic reviews

Thematic Advisor / Specialist
✓ Appropriateness of M&E approaches according to thematic and global guidance and trends in the sector
✓ Participates in joint analysis, feeds into project reviews

Q&A staff (M&E, IM)
✓ Check project indicators during project design
✓ Issue technical advice, provide methodological support and raise alerts, working closely with programme teams
✓ Provide support in designing the M&E system, including the feedback mechanism and M&E budget
✓ Lead development of the M&E plan; plan and coordinate monitoring activities with the PM
✓ Support the calibration of the PFU and its implementation
✓ Ensure quality of data management processes
✓ Set up databases, data entry, cleaning
✓ Provide descriptive analysis
✓ Capacity building in Q&A matters

Delegate (or Deputy) OR Field Coordinator
✓ Supervision of the M&E team
✓ Ensure processes happen in conformity with guidelines
✓ Ensure proper decision-making based on reliable data
✓ Check that M&E systems are robust and realistic
✓ Facilitate involvement of counterparts
✓ Drive project reviews

Admin staff
✓ Ensure the budget fulfils the golden rules, including the 5% M&E budget
✓ Provide regular budget follow-up for cross-checking against the pace of activity implementation, namely during project reviews


REGION

Programme Coordinator / Advisor
✓ Coherence of the project M&E system with Tdh ToC evidence needs and the PoIF
✓ Summarisation and analysis at regional level, quality assurance
✓ Development or contextualisation of standardised data collection methodologies and tools in coherence with his/her field in the region

HQ

Zones: Deputy
✓ Advice and technical support to his/her zone
✓ Seek complementary advice from Q&A, Programmes and Humanitarian Specialists
✓ Ensure compliance with the Golden Rules

Programmes / Humanitarian Specialists
✓ Development or contextualisation of standardised data collection methodologies and tools in coherence with his/her field
✓ Coherence of the project M&E system with Tdh ToC evidence needs and the PoIF
✓ Analysis at global level, quality assurance at global level
✓ Checks that projects include a sufficient M&E budget for his/her field

Transversal Protection / WELD
✓ Support development of standardised data collection tools
✓ Advice and technical support
✓ Involve the Q&A unit as deemed necessary

Quality & Accountability
✓ Support development of standardised data collection tools
✓ Targeted advice and technical support
✓ Capacity development


© Tdh/S. Calligaro

PART II Evaluation


PART II - Evaluation

1. What is evaluation?

An evaluation is a "systematic and objective assessment of an on-going or completed project, programme or policy, its design, implementation, and results (…) An evaluation should provide information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process of both recipients and donors" (OECD/DAC definition).

Quality evaluations enable us to build evidence on the degree of success of our projects and programmes, and are key to Tdh's endeavour to make progress in terms of transparency, accountability and quality. Evaluations are good opportunities to pause and reflect, together with partners and project actors, to further empower people and collectively analyse the changes we managed to achieve for children and their communities.

The origin of the word "evaluate" stems from giving value: to the project's beneficiaries and stakeholders, to the people and institutions supporting the project, and to the work done.

The following elements characterise an evaluation:

✓ It involves a critical judgement.

✓ It has to be objective and neutral: usually, someone who has taken part in the project design and implementation stages cannot be involved in the evaluation, as they would be judge and party at the same time.

✓ It must rely on reliable and transparent methodologies and analysis processes, and pass judgement on the value and performance of a project against a set of criteria and key evaluation questions. Methods should be consistent with Tdh approaches and values and the Core Humanitarian Standard.

✓ It shows rigour and method and is undertaken so that biases are mitigated.

✓ It happens at specific times.

✓ It builds on monitoring data and helps strengthen evidence.

What isn't an evaluation?

Descriptive documentation of a programme/project is not an evaluation.

Scientific research conducted for academic purposes. An evaluation should provide evidence-based practical guidance for future direction.

An audit, which is intended primarily to check that programme resources have been used in compliance with established procedures and requirements.

A periodic internal review.

Advisory / technical support visits conducted by a Tdh specialist or advisor.

A "capitalisation" or lesson-learning exercise.

Monitoring work, through which we collect and analyse data continuously throughout the project for immediate action and project steering, and which is conducted internally.

© Tdh/A. Bühlmann


✓ It provides recommendations for learning and for adjusting current and future programming.

✓ It should be flexible and can have very different objectives and scopes (from more quantitative and output-focused towards more participatory and qualitative).


2. Why Evaluation matters?

Evaluations can have a variety of purposes including accountability, empowering project actors and beneficiaries, informing decision-making, building institutional knowledge, strengthening our credibility, influencing others and building our capacity for organizational learning.

Empowerment and ownership

Evaluating a project helps teams understand and take increased ownership of its aims and strategy. Evaluations also create internal dynamics in a project team that encourage collective and critical thinking.

Through evaluations, we should assess how responsive the projects are to communities’ priorities and how inclusive they are.

Evaluations should give a voice to the children and the communities we work for and with, especially the ones that are most marginalized.

Accountability

Evaluation is a key strategy for ensuring that Tdh and partners are accountable to one another, to the communities with which we work, to donors, to other NGOs and to other stakeholders. It is the process of taking into account the views of, and being held accountable by, different stakeholders (donors, the public, partners, authorities), and primarily the people affected by a crisis or by inequalities.

Learning

Evaluation aims to understand why, and to what extent, intended and unintended results were achieved, which is key to developing collective knowledge and improving programming. We need to learn from our mistakes and from the challenges we faced and overcame, and make sure that we do not repeat errors. We must ensure that specific risks (safeguarding, fraud, security) are better prevented and addressed in our programming. This means we must encourage critical thinking and trust among the teams, and value our staff and their experience, to build on our strengths.

Steering

Evaluations help us better manage the various actions and allocate human and financial resources more rationally. They aim at informing decision-making and programme development to improve quality and impact, helping to design or improve implementation or to assess the impact of a programme/project.

Being accountable means that we, at Tdh, can demonstrate that we use our power responsibly; that we engage meaningfully with, and take into account the views of, different stakeholders (people affected by the situation our project addresses, donors, other humanitarian actors, our colleagues); that we have the capacity to measure the quality and adequacy of our intervention and how we contribute to the changes that we pursue; and that others can hold us to account.


3. Evaluation requirements in a nutshell

Click on the titles to get more information on the requirements

When are evaluations required or compulsory?

Are the evaluation outcomes shared transparently and used for learning?

Are our evaluations based on meaningful participation of stakeholders?

Is our evaluation work gender and diversity sensitive?

Is the evaluation process managed adequately throughout?

Is our evaluation ethical and safe for children, their family and their community?



4. Minimum requirements for evaluation in Tdh

Below are summarised the Tdh requirements for project evaluation, their link with the Tdh institutional Key Performance Indicator (KPI) framework ("how to assess alignment with requirements"), the related internal resources, and the main CHS commitments they contribute to. Institutional indicators currently measured annually at global level in Tdh are highlighted in orange and are mandatory. The others can be used at delegation or programme level to guide their development; the latter are recommended but not yet measured across Tdh.

Tdh Requirements on Project Evaluation

Boxes legend: Tdh Institutional Indicator · Tdh resources · Core Humanitarian Standards

1. WHEN ARE EVALUATIONS REQUIRED OR COMPULSORY?
All projects must undergo an evaluation, even if the evaluation process is based on a light methodology. Tdh teams have the responsibility to look at our projects with a critical perspective, to continuously learn and improve our interventions, as per CHS commitment 7.

» If there are no specific donor requirements for project evaluations, projects lasting over 1 year will carry out at least one evaluation; the nature of the evaluation (mid-term or final) will be determined by its purpose.

» Level 3 emergency projects lasting over 6 months and over 400,000 CHF should be evaluated, even if there are no specific donor requirements for project evaluations.

» All projects with a duration of 3 years or more must plan and conduct:
  ▪ a mid-term evaluation
  ▪ and a final external evaluation at the end of the project.

» All multi-country, pilot / innovation or complex projects must plan and conduct a mid-term evaluation.

RECAP: when should an evaluation be conducted?

Case | Mid-term evaluation | Final evaluation

GENERAL CASES (Y = years; M = months)

< 1 Y and < 400,000 CHF | – | Optional (min. internal)

Emergency (crisis level 3): > 6 M and > 400,000 CHF | Optional (internal) | Required (external)

1–3 Y | Optional (internal or external) | Required (external)

> 3 Y | Required (internal or external) | Required (external)

PARTICULAR CASES

Pilot or "flagship" project | Required (internal or external) | Required (external)

Multi-country | Required (internal or external) | Required (external)

Complex or new type / new area | Real-time evaluation of the humanitarian response recommended (especially for level 3 emergency start-up implementation) | Required (external)
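Read as a decision rule, the RECAP table can be encoded in a few lines. The sketch below is a simplified reading of the table (durations in months, budgets in CHF); the table itself and any donor requirements take precedence over this illustration.

```python
# Simplified encoding of the RECAP table above (illustrative only; donor
# requirements and Tdh rules take precedence).

def required_evaluations(duration_months, budget_chf,
                         level3_emergency=False,
                         pilot_flagship_or_multicountry=False):
    """Return (mid-term requirement, final requirement) for a project."""
    if pilot_flagship_or_multicountry:
        return ("required (internal or external)", "required (external)")
    if level3_emergency and duration_months > 6 and budget_chf > 400_000:
        return ("optional (internal)", "required (external)")
    if duration_months > 36:
        return ("required (internal or external)", "required (external)")
    if duration_months >= 12:
        return ("optional (internal or external)", "required (external)")
    return (None, "optional (min. internal)")

print(required_evaluations(24, 900_000))                        # 1-3 year project
print(required_evaluations(8, 500_000, level3_emergency=True))  # L3 emergency
```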

Tdh Institutional Indicator

[EVAL 1] Number and % of projects ending in the year that have been evaluated (internally / externally) according to Tdh rules

Tdh resources

- PCMiE handbook 2017
- [Highlight: Main types of evaluations]

Core Humanitarian Standards

Commitment 3: response strengthens capacities and avoids negative effects.

Commitment 7: continuous learning and improvement.



2. PARTICIPATION

» Implementing partners are engaged by Tdh in the design, data collection, interpretation of results and feedback phases of the evaluation. If the project is implemented directly, the Tdh team should engage project actors and communities (particularly women, children and youth, and other marginalised groups) in these activities. Where Tdh is part of a consortium or is the implementing partner, Tdh teams should actively participate in all evaluation stages.

» For an evaluation to be truly participatory and to avoid tokenism (instrumentalisation or fake participation), relevant methodologies are used, enabling marginalised groups, including women and girls, to raise their voices, with particular care for ethical and safeguarding considerations [see Highlight: Ethical principles for engaging children in M&E].

» Whenever possible and appropriate, children and families should be involved in the evaluation design process, setting criteria for what should be evaluated and how (method); they should also be involved in the analysis and receive feedback on the results.

3. EVALUATION MANAGEMENT

» A steering committee is established, with a key person appointed as Evaluation Manager. This person ensures that communication between all stakeholders is clear and straightforward and manages the whole evaluation process (planning and design, implementation, elaboration of products, restitution and follow-up on the evaluation). [Highlight: What is the role of the "Evaluation Manager"]

» Developing Terms of Reference is compulsory, based on the Tdh template. ToRs include:

o Background of the evaluation and information about the project;
o Objectives and intended users [Highlight: Main types of evaluations];
o Criteria and evaluation questions;
o Methodology, ethical considerations, and safeguarding risk mitigation;
o Roles and responsibilities, timeframe, deliverables and available budget;
o For external evaluations: expertise required and recruitment modalities.

» ToRs are developed under the lead of the Evaluation Manager, with the support of the M&E staff and the Programme Coordinator at delegation level (if they are not appointed as the Evaluation Manager). All ToRs should be reviewed by the thematic specialist or the relevant programme staff, either in the region or at HQ, as well as by the Quality & Accountability unit. ToRs are signed off by the Delegate.

» If a consultant is to be contracted for the evaluation, the recruitment process must be conducted in a fair and transparent manner:

o On the basis of clear criteria;
o Respecting Tdh procurement procedures (following Tdh tendering guidelines), Tdh safeguarding policies and the safer recruitment checklist;
o By a steering committee involving the Project Manager, a representative of the Zone and/or the Programme and/or the Quality & Accountability unit;
o The outcome of the selection process must be thoroughly documented;
o Non-selected candidates must be provided with feedback on their technical and financial offer.

Tdh Institutional Indicator [EVAL 2]: Number and % of project evaluations based on a participatory design (Level 1: participation is limited to providing data; Level 2: participation includes the definition of evaluation criteria and method; Level 3: participation includes Level 2 plus engagement in collecting data; Level 4: participation includes Levels 2-3 plus engagement in the analysis of data and findings).
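The four levels form a simple ordered scale, so the indicator can be derived from which evaluation stages stakeholders actually took part in. A minimal sketch, with stage names that are illustrative assumptions rather than Tdh terminology:

# Minimal sketch (illustrative stage names): deriving the EVAL 2 level
# from the set of evaluation stages in which stakeholders participated.
def participation_level(stages):
    if {"criteria_and_method", "data_collection", "analysis"} <= stages:
        return 4  # levels 2-3 plus engagement in analysis of data and findings
    if {"criteria_and_method", "data_collection"} <= stages:
        return 3  # level 2 plus engagement in collecting data
    if "criteria_and_method" in stages:
        return 2  # definition of evaluation criteria and method
    return 1      # participation limited to providing data

print(participation_level({"providing_data"}))                          # 1
print(participation_level({"criteria_and_method", "data_collection"}))  # 3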

Tdh resources

- Tdh note on conducting focus groups with Children

- Method pack: Tdh technical notes on “Bias and errors” and “Choosing the right methodological approach”

- Tdh (2018), Child Protection Good Practice Framework. A guide to promoting quality child protection across all programmes.

Core Humanitarian Standards

Commitment 3: Response strengthens capacities and avoids negative effects

Commitment 4: Response is based on communication, participation and feedback


o Non-selected candidates must be provided with feedback on their technical and financial offers.

» An inception report is compulsory prior to conducting data collection in the field. Sufficient time is allocated to the preparatory, pre-inception-report stage. No data collection should be conducted without prior approval of the inception report by the evaluation steering committee or, at a minimum, the evaluation manager.

» Following data collection and prior to submission of the draft report, early findings are fed back to the team, partners and key stakeholders.

» A draft report presenting the findings and recommendations is discussed with the team, and feedback is incorporated before preparation of the final report.

4. DISSEMINATION AND LEARNING

» The results of the evaluation are shared by the country office with the programme teams at regional and HQ level, as well as with the Zones and the Q&A unit, to assess the quality of the evaluation and draft a management response within one month of the submission of the evaluation report. A realistic action plan to implement the recommendations accepted in the management response should be developed in a participatory way with the project team and the partner(s).

» Dissemination of evaluation products: all project/programme evaluation reports, ToRs and management responses should be stored in the Tdh evaluation library and made accessible to the main partners and interested actors.

» Publication: evaluation reports should be published on the Tdh website, upon proposal by Operations at HQ level and in agreement with the Q&A unit.

5. GENDER & DIVERSITY

» Gender & Diversity are reflected in the ToR as part of the evaluation criteria.

Tdh Institutional Indicator
# and % of project evaluations that are based on robust ToRs

# and % of project evaluations with an inception report approved prior to data collection work in the field

Tdh resources

Evaluation toolbox:

- Terms of reference guidance and format

- Tdh contract model
- Inception report template
- Grid for recruiting consultants
- Final report template

- Logistic Knowledge Center: package for independent contractor

Core Humanitarian Standards

Commitment 2: Response strengthens capacities and avoids negative effects.

Commitment 8: Staff are supported to do their job effectively

Commitment 9: Resources are managed and used responsibly for their intended purpose

Tdh Institutional Indicator
# and % of project evaluations for which a management response was drafted within a month of the report submission

# and % of project evaluations published internally and externally

Tdh resources

- Tdh response to the evaluation and Action plan

- Template for evaluating the quality of an evaluation (Oxfam)

Core Humanitarian Standards

Commitment 7: Continuous learning and improvement


» Evaluation questions should explore changes in social norms or structural/systemic factors affecting gender equality. If a gender analysis was conducted for the delegation or the project, the evaluation questions should build on the findings of this analysis.

» Data analysis is oriented towards understanding gender roles and inequality, and how these affect project outcomes.

» All person-related data collected during evaluations are sex- and age-disaggregated (a minimal sketch follows this list).

» All teams involved in evaluation work are mixed; women and people from different social groups and of different capacities should be encouraged to undertake data collection and analysis in the best possible conditions.

» Planning & data collection for the evaluation ensure the meaningful participation of women, men, girls, boys and any potentially vulnerable or marginalized groups.

» All evaluations include an analysis of where the intervention sits on the Gender & Diversity Scale.
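As referenced above, disaggregation is a mechanical step once sex and age are captured for every record. A minimal sketch using pandas, with illustrative column names and age brackets (assumptions, not a Tdh standard):

# Minimal sketch (illustrative columns and brackets): disaggregating a
# headcount of evaluation respondents by sex and age group.
import pandas as pd

records = pd.DataFrame({
    "sex": ["F", "M", "F", "F", "M"],
    "age": [9, 14, 17, 32, 41],
})

records["age_group"] = pd.cut(records["age"],
                              bins=[0, 12, 17, 59, 120],
                              labels=["0-12", "13-17", "18-59", "60+"])

print(records.groupby(["sex", "age_group"], observed=True).size())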

6. ETHICS

Evaluation work is ethics-oriented [Highlight: Ethical considerations in evaluation]

» Evaluations must not cause harm or damage under any circumstances. Any data collection, processing, analysis and communication activity is carried out following careful ethical reflection, including on do-no-harm principles, and in line with the Tdh Global Code of Conduct (including the Child Safeguarding Policy, the Policy on the Protection from Sexual Exploitation and Abuse, and child protection good practices) and the Directive on Data Protection. A risk-benefit analysis must be completed for all monitoring and evaluation exercises: assess whether the exercise actually needs to be done, whether people need to be involved and in what capacity, and what impact participating in the exercise may have on people in terms of potential harm and possible benefits.

» Information is collected, processed and analysed in compliance with the principles of informed consent of parents, caregivers and children, confidentiality and privacy. Strict measures must be taken to ensure the protection and safety of people.

» Evaluations must help identify potential or actual negative effects of the action and address them as well as possible: prevent them from happening (avoidance) or, where they cannot be avoided, reduce their impact (mitigation).

» Data collection and analysis are oriented towards the use of participatory techniques allowing persons with specific vulnerabilities (low literacy levels, impairments, people who are generally excluded and not used to raising their voices or being listened to) to express their opinions.

» Evaluation aims to include children whenever possible and appropriate, subject to the presence of trained, experienced professionals. Suitable methods support the participation of children while taking appropriate precautions to protect, safeguard and ensure their well-being [see Highlight: Ethical principles for engaging children in M&E].

Tdh Institutional Indicator
# of evaluations that integrate gender among the evaluation criteria in their ToR

# and % of gender-focused evaluations

Tdh resources
- Tdh Gender & Diversity Policy
- Tdh Gender and Diversity marker

External resource:
- Evaluation with Gender as a Cross-cutting Dimension (EU)

Core Humanitarian Standards

Commitment 3: Response strengthens capacities and avoids negative effects

Commitment 4: Response is based on communication, participation and feedback


» Confidentiality: the privacy of the persons met during the evaluation must be respected. The names of participants, or any information that may indicate their identity, should be avoided.

Tdh Institutional Indicator
# and % of evaluation ToRs including a section on ethical considerations

# and % of evaluation inception reports including a section on ethical considerations

# and % of data collection tools that include an informed consent note

# and % of evaluation inception reports including a risk-benefit analysis

Tdh resources

- Tdh (2018), Global Code of Conduct, Policy on the Protection from Sexual Exploitation and Abuse and Child Safeguarding policy.

- Tdh (2017), Data protection starter kit. Note on informed consent – assent with children.

- Directive on Data protection

- Logistic Intranet: package for independent contractor

Core Humanitarian Standards

Commitment 3: Response strengthens capacities and avoids negative effects

Commitment 4: Response is based on communication, participation and feedback


5. Process: what are the big steps in evaluation planning and implementation?

5.1. Global process for planning and managing an evaluation (external – internal)


In the case of an internal evaluation, the process differs slightly: there is no recruitment process, and a full inception report approved by the evaluation steering committee is not required. However, structured ToRs, an evaluation methodology, a final report and an evaluation response are still compulsory.


5.2. Additional guidance on the evaluation process: key questions, outputs and resources per evaluation stage

Planning & design

The questions to be explored are the following:
- Why should the evaluation be undertaken? Why not another learning exercise?
- What type of evaluation [Highlight: Types of evaluation, their purpose and key questions] [Highlight: External and internal evaluation]? What approaches would be indicated, and what would be the scope and complexity?
- Who is the audience?
- What is being evaluated?
- How will the evaluation be used?
- Who should conduct the evaluation, and what expertise do we need?
- Who should be involved?
- What resources should be available?

When should it ideally happen?
- At the project design stage
- At the launch of the response, for Real Time Evaluations

Developing Terms of Reference is a key step, compulsory at Tdh. Why? Because it:
1. provides a clear overview of what is expected in an evaluation (the why and for what, how, when, by whom, with whom) and will guide the evaluation process;
2. is the basis for contracting consultants and establishes the key criteria for assessing the quality of an evaluation;
3. establishes guidelines, so everyone involved understands what is expected;
4. helps us ensure that the evaluation is feasible and realistic and will deliver credible and valid results.

ToRs are also compulsory for internal evaluations.

Elements to be covered in Terms of Reference
- Process of selection of the evaluation team
- Why the evaluation is being undertaken: background, rationale, justification
- What is being evaluated: objective of the evaluation, its scope
- Key evaluation questions, emerging from the evaluation objective and linked with key evaluation criteria (OECD-DAC, CHS) [Highlight: What entry point? OECD-DAC criteria and CHS]
- Intended users, and how the evaluation will be used and shared
- How the evaluation will be undertaken: approach and basic methodological requirements, ethical questions, and how cross-cutting issues should be addressed [Highlight: Ethics in evaluation]; [Highlight: Ethical considerations in engaging children in M&E]
- How the evaluation will be managed: roles and responsibilities, management arrangements
- Schedule, budget, logistics and deliverables
- Qualifications and skills of the evaluation team

Products of this step
- Evaluation mandate confirmed
- Stakeholders identified
- Resources needed identified
- Terms of Reference developed and approved

Resources
- Tdh Q&A Knowledge Center; Tdh PCM handbook: monitoring vs evaluation vs institutional learning
- Intrac: https://www.intrac.org/wpcms/wp-content/uploads/2017/01/Types-of-Evaluation.pdf
- ALNAP humanitarian evaluation guide, pages 41-101
- https://www.betterevaluation.org/en/start_here/decide_which_method
- http://betterevaluation.org/plan/manage/identify_engage_users
- Tdh Terms of reference guidance and format
- https://www.betterevaluation.org/en/rainbow_framework/frame
- ALNAP humanitarian action evaluation guide, pages 92-128
- http://betterevaluation.org/plan/manage_evaluation/determine_resources
- On gender: https://asia.ifad.org/documents/627927/627956/65+Gender-Sensitive+M%26E.pdf?version=1.0
- OECD-DAC criteria [Highlight: OECD-DAC and the CHS]


Evaluation Management

Caution: Why is the inception report compulsory and important?

Our ability to influence the quality of the evaluation decreases rapidly after the inception report, as the work progresses. The evaluation manager will have to discuss and agree with the evaluators on the data collection methodology, how the evaluation questions will be addressed, and how triangulation will be done (make sure that the evaluator develops an evaluation matrix). The evaluation steering committee should feel confident about the approach being proposed. It is key to engage at that moment to ensure that the final report will be relevant and address the objective of the evaluation. The inception report must be delivered and approved before collecting primary data (data we collect through interviews, surveys or observation) on the ground.
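An evaluation matrix is, in essence, a table linking each evaluation question to a criterion, the data sources that will answer it, and the collection methods. A minimal sketch with illustrative content (not a Tdh template):

# Minimal sketch (illustrative content, not a Tdh template): an evaluation
# matrix as a list of rows. Listing several data sources per question makes
# the triangulation strategy explicit and checkable.
evaluation_matrix = [
    {
        "question": "To what extent did the project respond to priority needs?",
        "criterion": "Relevance",
        "data_sources": ["needs assessment report", "FGDs with caregivers"],
        "methods": ["desk review", "focus group discussion"],
    },
    {
        "question": "Were the intended outcomes achieved?",
        "criterion": "Effectiveness",
        "data_sources": ["monitoring database", "key informant interviews"],
        "methods": ["secondary data analysis", "semi-structured interviews"],
    },
]

for row in evaluation_matrix:
    assert len(row["data_sources"]) >= 2, "each question should be triangulated"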

Inception report: what does it contain?
- Reviewed evaluation questions
- Reviewed methodology, data collection instruments, and a first analysis of the desk review
- Data collection workplan
- Analysis plan
- Data collection tools (observation guide, focus group discussion guide)

Outputs of this step
✓ Consultant recruitment: documented tender process
✓ Contract, including the approved evaluation proposal
✓ Inception report, including method, data collection instruments, workplan and analysis plan
✓ Signed-off evaluation report and annexes

Resources
✓ On the tendering process: Tdh consultant evaluation grid, inception report template, final report template, logistics intranet package for independent contractor, safer recruitment checklist
✓ Tdh method pack: documents on sampling, analysis plan, bias and error, choosing the right approach, note on focus groups
✓ Tdh and CartONG mdc toolkit: https://www.mdc-toolkit.org/en/
✓ Child Ethics project: https://childethics.com/ethical-guidance/; note on informed consent with children
✓ Better Evaluation resources on data collection and analysis: https://www.betterevaluation.org/en/rainbow_framework/describe

If you realize that the draft evaluation report is of poor quality because the data collection was not well conducted, unfortunately it is too late: the data has already been collected. You have a greater chance of ensuring quality if you make sure that a robust inception report is produced. The inception report is the opportunity to fix any misunderstanding, clarify the evaluation objective, scope and questions with the consultant, and resolve any issue.


Why is sharing our evaluations transparently so important?

Evaluations represent a huge investment in terms of effort, time and money. They contain valuable information about the quality and effectiveness of our projects and programmes. Unfortunately, they are often underused, and the reports shelved and forgotten by the teams.

Publishing information and filing it in a systematic manner is key for Tdh accountability (we are transparent about the evaluation results and the way we will address recommendations) and learning.

The principle of transparent sharing applies to both negative and positive evaluations, and to internal as well as external evaluations. We should not hide any lessons learnt on the pretext that we did not achieve the expected outcome, and results should be shared internally at the very least. When there are concerns about the credibility of some evaluation findings, the reliability of data or a lack of quality, the evaluation response document is the appropriate means to communicate limitations, alongside the evaluation report.

An exception is made whenever publication represents unacceptable risks or could entail repercussions for beneficiaries, staff, partners or any other stakeholder. In case of doubt, it is the final responsibility of the Country Delegate to decide whether the evaluation can be published.

Remember: the response to the evaluation is compulsory. Why? Because it:

1) Ensures that any quality issue with the evaluation is noted and acted on, for example if the evaluation presents a biased analysis by inadequately representing the views of different stakeholders, or if recommendations do not rely on robust evidence.

2) Ensures that findings, conclusions and recommendations are given careful consideration and are acted on. Developing a management response in consultation with the relevant stakeholders helps us document our main learnings from evaluations as well as track our actions in response to the recommendations.

Outputs of this step
✓ Evaluation quality assessment completed
✓ Management response and action plan developed and approved
✓ Management response shared, together with a (summary of the) evaluation report, with the Q&A unit and published on the Q&A Knowledge Center
✓ Management response published together with the full evaluation report, the evaluation quality assessment and the ToR on OneDrive

Resources

✓ Template Response to the evaluation and Action Plan document

✓ Evaluation registry (under construction)


Highlight

What questions does “monitoring” help answer?

During the design stage

Information need: What do we need to know, what do we need to track? What is our evidence need?
Methods: How are we going to collect, analyse and use this information?
Information: How can we ensure that we inform everyone in a relevant way about the project's progress?
Participation: How can we ensure that the voice of everyone concerned by the project is expressed and listened to?

During implementation

Context: What external factors can influence the intervention? Do contextual indicators show any change we should pay attention to?
Inputs: How are our resources managed? Are budget spending and inputs leading to activities within the planned time frame?
Activities and outputs: Are we doing what we said we would be doing? Are the activities leading to the announced outputs?
Approach: Should we be doing something else, or doing things differently?
Outcomes: Are the outputs leading towards the planned outcomes? Are longer-term changes materializing? Is the project actually leading towards the objectives?
Do No Harm: Are there any unexpected negative effects from the intervention? For whom?
Target groups: Who has been benefitting the most from the intervention, and how? Are all the people targeted benefitting? Who else? (Consider gender, age and other factors or groups, such as persons with disabilities.)
Participation: How are women / men / girls / boys participating in the project? What is their opinion about the intervention?


Ethical principles for engaging children in M&E

1) Participation is safe: risks are considered both before and during M&E processes to ensure that children are kept safe.

✓ This includes ensuring that processes do no harm and do not further traumatize children (for example, not asking children questions about their abuse in open forums); preventing children from being exposed to risks as a result of their participation (for example, being subjected to stigma or discrimination because of their involvement); and ensuring their emotional, psychological and physical safety during participation processes.

✓ This also includes careful consideration of confidentiality and anonymity, and of the circumstances in which it is necessary to break confidentiality, such as where abuse is disclosed.

2) Participation is voluntary: consent / assent is always sought both from children and from their caregivers (where available). Children know that they can withdraw from the M&E exercise at any time, without any negative consequences.

3) Children are informed about the purpose of their participation in monitoring. The purpose is understood and felt as meaningful and relevant for them. They are made aware of how their views and opinions will be used and of any feedback that will be given to them, in order to make an informed decision regarding whether to participate.

4) Involving children is meaningful and necessary. Careful consideration is always given as to whether it is relevant and appropriate to collect data from children, whether the information is already known and can be found from other sources, or whether other data collection methods are more appropriate. Where such exercises are conducted with children, the data obtained is used to inform policy and programmes.

5) Participation is inclusive and non-discriminatory. All children, including those who are marginalized, are able to participate equally, and where necessary special measures are in place to ensure that marginalized children can participate fully in M&E processes.

6) Participation is developmentally appropriate, gender-sensitive and culturally relevant. Participation should be an enjoyable and stimulating experience.

7) Participation is ensured by professionals with the required competencies. They must have the necessary experience and sensitivity to apply developmentally, age, gender and culturally sensitive processes for children.

8) In all situations, the best interest of the child is the paramount consideration.

Adapted from Tdh (2017), Child Protection Quality Framework


Quantitative and Qualitative indicators

Indicators are signs that provide evidence that something has happened (e.g. that an output was delivered, an immediate effect occurred, or a longer-term change took place). They can be of different natures: qualitative, quantitative, proxy, scoring and ranking, framing, composite…

Quantitative
- Expressed in numbers
- Provide information on trends, incidence, and the width and scope of work
- Analysed through statistical methods
Examples: % of births assisted by a skilled health worker; # of children reintegrated in school

Qualitative
- Expressed in words
- Provide in-depth information on changes
- Analysed through summarizing, reduction and scoring
Examples: extent to which youth feel recognized as change agents in their community; level of integration of international standards in a local policy
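In practice, the quantitative column is just arithmetic over monitoring counts, and qualitative judgements can later be scored (as the next paragraph notes). A minimal sketch with hypothetical figures:

# Minimal sketch (hypothetical figures): a quantitative indicator computed
# from monitoring counts...
births_total = 240
births_assisted = 198  # births assisted by a skilled health worker
print(f"{100 * births_assisted / births_total:.1f}% of births assisted")  # 82.5%

# ...and a qualitative judgement quantified through a simple scoring scale.
scores = {"not integrated": 0, "partially integrated": 1, "fully integrated": 2}
policy_assessment = "partially integrated"  # judgement from document review
print(scores[policy_assessment])            # 1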

Beyond the quantitative/qualitative dichotomy: qualitative data can, at a later stage, be quantified (scoring and ranking indicators). In addition, the OHCHR spectrum interestingly introduces the distinction between fact-based (or objective) and judgement-based (or subjective) types of indicators.

FACT-BASED or OBJECTIVE vs JUDGEMENT-BASED or SUBJECTIVE

Quantitative
A. Quantitative form of expression; information on objects, facts or events that are directly observable and verifiable. Example: prevalence of underweight children under 5.
B. Quantitative form; based on information that is a perception, opinion or judgement. Example: percentage of children who report an improvement in their subjective wellbeing.

Qualitative
C. Narrative expression; information on objects, facts or events that are directly observable and verifiable. Example: status of ratification of treaties in a given country.
D. Narrative expression; based on information that is a perception, opinion or judgement. Example: an assessment expressed in narrative form of how well informal and formal protection actors are coordinated.

OHCHR, Human Rights Indicators: A Guide to Measurement and Implementation, New York and Geneva, 2012, page 18

Dig deeper – toolkit
- Intrac resource on indicators (basic)
- Acaps resource on composite measures (advanced)


Project Reviews

Monitoring

What

- A project review is a moment/space where management examines and analyses project monitoring data, issues, risks and opportunities, discusses difficulties in project implementation, looks for solutions and makes the right decisions for steering the project.

Who

- Project Manager: presents updates and questions
- Delegate, Field Coordinator or Deputy: moderates
- HR, Log, Programme Coordinator: participate
- M&E / Q&A: technical support and participation

Why

- Ensure that the project is on track; check indicators at all levels (inputs, activities, outputs, outcomes) and the workplan
- Prevent and mitigate any unintended effects
- Review the risk analysis and hypotheses
- Address any management or partnership issues

How

1) Preparation by PM, Log and HR: update of the PFU and procurement plan, with support from the ProgCo and M&E.
2) From a 1.5-hour discussion for small projects up to half a day for larger projects.
3) Compilation of decisions and an action plan.
4) Follow-up on decisions.

When

- On a regular basis: every 3 or 4 months for projects that last over 12 months
- Every month for short emergency projects

Do's (tips from Tdh Burkina Faso)
Prepare well!
- The PM should receive data from HR, Log and M&E colleagues in advance, and update the PFU and procurement plan.
- Dare to ask the Programme Coordinator for support with preparation and with checking the accuracy of data, particularly when there are issues with the project.
- Develop and share a list of questions / attention points in advance, to help orient the discussion.
Moderate to keep everyone on board and build trust
- Set ground rules and recall the objective of the exercise, to encourage trust-building, transparency and critical thinking.
- Designate a timekeeper and a note taker, possibly a "mood taker".
- Postpone discussions that focus too narrowly on specific issues needing more bilateral discussion, to keep everyone on board and maintain attention.
- Pause if debates become too tense or people start settling scores.
- Ensure the presence and quality participation of the required managers, so that the necessary decisions can be made.
Ensure that all of this is useful
- Think of integrating the major action points into the PFU, in a separate sheet (decision tracker; a minimal sketch follows the Don'ts below). Be learning- and change-oriented.

Don'ts
Project managers: Do not come unprepared, with unclear data, no questions and no proposals; everyone would be wasting their time. Do not try to hide things or be ashamed of addressing issues, difficulties and blockages.
Moderator: Do not let the meeting drift towards a trial or examination of somebody; reviews are not evaluations of individuals, they are made to analyse problems and find solutions. Do not let the discussions drift towards specific topics or disputes that would be better managed bilaterally, in another space or at a different moment.
Note taker: Do not draft meeting minutes so detailed and long that action points and recommendations get diluted and readers lose focus.
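As mentioned in the Do's above, a decision tracker can be as simple as one row per decision with an owner, a deadline and a status. A minimal sketch (illustrative fields, not a Tdh template) writing such rows to a CSV sheet:

# Minimal sketch (illustrative fields, not a Tdh template): a decision
# tracker kept as a separate sheet, written out as CSV for follow-up.
import csv

decisions = [
    {"date": "2019-11-12",
     "decision": "Reallocate transport budget to outreach activities",
     "owner": "Project Manager",
     "deadline": "2019-12-15",
     "status": "open"},
]

with open("decision_tracker.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=decisions[0].keys())
    writer.writeheader()
    writer.writerows(decisions)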

Good practice of project reviews in Tdh Burkina Faso
- In Burkina Faso, the team has started undertaking project reviews in a systematic way. The delegation intends to conduct the reviews of all projects during the same period (over a month), for logistical reasons (people travelling to the capital) and to give more weight to the exercise.
- Project reviews are purely internal and complement other steering mechanisms (project steering committee) that involve other stakeholders. This ensures that a specific moment is dedicated to addressing issues related to project implementation within the family and finding solutions.
- Special care is given to setting a working atmosphere that empowers the project managers, generates trust and transparency, and enhances teamwork.


Main types of evaluations, their purpose and key questions they address

There are many different types of evaluation. Each type has its own set of processes and /or principles. Many factors influence decisions over what type of evaluation to use. Evaluations can be categorized according to their purpose, who conducts them, when they are carried out, the broad approach used, and cross-cutting themes.

Stage | Possible types of evaluation | Purpose of the evaluation | Examples of questions they seek to answer

Design

Evaluability assessment

Assess the extent to which an intervention can be evaluated in a reliable and credible fashion. Assess the feasibility and appropriateness of the program design. They may contribute to developing the Theory of Change underpinning the program/project and analyze the strengths and weaknesses of the proposed strategies.

Is the project adequately defined and are its results verifiable? Is an evaluation the best way to answer the questions posed? How could the project design (ToC, M&E framework) be improved? Is an evaluation possible, of what type and under what conditions? Is information available? Does the context allow for an evaluation?

Early implementation

Real Time evaluation

Provide immediate (real time) feedback to those planning or implementing a project, so that they can make improvements.

How comprehensive were the needs assessments? To what extent is our project answering immediate needs? How can we improve the delivery of activities? What could be the long-term consequences of… How well coordinated is the intervention?

During implementation (e.g. mid-term)

Formative evaluation (corresponds to what is sometimes called process evaluation)

Measures the project's ability to meet targets and suggests improvements for the second stage of the project. Examines current strengths and weaknesses, the project design, and what does and does not work. Provides opportunities for feedback and reflection amongst stakeholders in a way that can immediately inform ongoing implementation and lead to revising the design of a programme/project. May also assess whether the use of resources is proving effective and efficient, and whether the organizational systems and capacities of Tdh and partners are appropriate for achieving the programme/project.

Is our project on track? Have the activities been carried out? What is the quality of the intervention? How have internal management practices affected our work? What could be done differently? What progress have we made towards our objectives?

Outcome Evaluation

They assess how and in what ways the program or project is contributing to immediate changes in policies, practices, ideas and attitudes, and if there have been any negative or unexpected effects.

What has been achieved (what has changed)? To what extent did our intervention contribute to those changes?


Closure – ex-post

Summative evaluation

Impact evaluation

Assess the significant and sustained changes in the lives of women, men, girls and boys (impact) that the programme has contributed to; they also consider questions of relevance, effectiveness, efficiency, legacy or sustainability, and learning.

Provide insights into the project's success and contribution to impact, and highlight potential improvements for subsequent projects. Depending on the approach chosen, they can emphasize learning and accountability.

What are the long-term effects of the intervention? What changes, intended or not, positive or not, direct or not, were brought about or favoured by the project? To what extent can those changes be attributed to the project? Or: to what extent may the project have contributed to those changes?

Ex-post | Meta-evaluations | Evaluation of a series of evaluations. Assess the evaluation process itself; often used to assess compliance with evaluation policies and to see how well evaluations are conducted and used across an organisation, a region or a programme. | Were the evaluations of a given programme of good quality? What could be improved? What common trends or discrepancies do we observe? What are the lessons learned for future evaluations?

Evaluations can also be categorized according to their approach and to who conducts them.

Their approach:
- Gender-focused: a gender-focused evaluation analyses changes in inequalities and their root causes (i.e. social norms and structural/systemic factors), and focuses on the extent to which and how the project took gender and diversity equality into account.
- Real time: aims at giving immediate feedback and learning in the early phase of a project; RTEs are generally associated with emergency responses.
- Theory-based: tests the robustness of the ToC and seeks to collect evidence at different stages along the ToC to establish what has changed and why.
- Process evaluation: focuses on internal project issues, for example the quality of work, management practices, coordination…
- Impact: while most evaluations seek to assess impact to some extent, a "real" impact evaluation has an explicit and robust methodology to establish change and causality. It cannot be done quickly or cheaply.

Who conducts the evaluation:
- External: conducted by a third party external to Tdh.
- Internal (self-evaluation): conducted by a member of Tdh external to the project.
- Joint evaluation (or mixed): conducted by a third party external to Tdh together with a member of Tdh.
- Peer: conducted by another Tdh delegation or another NGO working in the sector.
- Participatory: conducted by and with the project stakeholders.

Dig deeper – toolkit
- Intrac, paper on types of evaluations, and the M&E Universe, to learn more about each type of evaluation
- www.betterevaluation.org
- https://trainingcentre.unwomen.org/
- https://elearning.evalpartners.org/elearning/
- United Nations Evaluation Group guidance documents


Internal vs external evaluation, advantages and disadvantages

Evaluations can be conducted internally, externally, or both at the same time. The decision to use an internal, external or mixed evaluation team will depend on the purpose of the evaluation.

Self-evaluation: led by the Tdh team in charge of the project.

Internal evaluation: carried out by a member of Tdh who is not directly part of the project implementation team (e.g. Humanitarian Specialist, Q&A unit staff, another Project Manager (peer), Zone staff) and who has the skills to conduct this type of work.

External evaluation: carried out by a team or an individual that is not part of Tdh (consultant). It can be commissioned by Tdh, the donor or a partner.

Mixed or joint evaluation: conducted by a joint team of external consultants and Tdh staff.

Advantages and disadvantages of internal and external evaluations³

External evaluation
+ Brings expertise (technical and thematic), in-depth knowledge of a given topic, language fluency
+ Time secured for the evaluation
+ Often more objective and less likely to be subject to organizational bias; brings a fresh, outsider perspective
+ Independent and less subject to the power dynamics of the organization
- More expensive
- Does not know the organization well: its policies, culture and practices, requirements and guidance
- May be perceived as judgmental
- Time-consuming contract-related negotiations; higher need for close follow-up
- May be unfamiliar with the environment, the culture, the language
- Could be overly positive due to an interest in securing future contracts

Internal evaluation
+ The evaluator is known to staff, so the process can feel less threatening, with potentially good ownership by the team
+ Less expensive, more efficient, and better capacity to mobilize partners
+ Knowledgeable about the organization, its culture, requirements, programmatic approaches and methods, and saves time getting acquainted with the projects, so findings and recommendations may be more relevant for the organization
+ The learning process is more likely to remain in the organization; builds internal evaluation capacity
- Not always structured and prepared, hence less transparent and formalized, less accountable, and with lower impact in terms of change
- May be felt as less legitimate or relevant by the parties
- Objectivity could be questioned; more risk of bias
- Potentially less capacity in terms of data collection and analysis methods and evaluative judgement
- May lack technical expertise in carrying out evaluations
- May not have enough time for a robust data collection and analysis process

³ Adapted from ALNAP, Evaluation of Humanitarian Action, page 158


What entry point? OECD-DAC criteria and CHS

The most commonly used framework for evaluations in the humanitarian and development sectors is the OECD-DAC framework. Initially developed for the development sector, it is based on five criteria, to which further criteria were added specifically for humanitarian contexts (ALNAP) and interpreted in the light of the very specific field of protection (ALNAP).

Criteria | Examples of general questions explored by the criteria

Development sector

Relevance (and appropriateness): To what extent does the project respond to the priority and evolving needs of the people? Did the intervention correspond to local and national priorities and policies? Is the intervention logic (or the ToC) consistent?

Effectiveness: To what extent is the project attaining its objective and achieving the intended outcomes, in the short, medium and long term?

Efficiency: How were the resources used? What outputs were delivered in relation to the resources used/inputs? What has been the ratio of costs to benefits? What is the most cost-effective option? Has the intervention been cost-effective (compared to alternatives)? Is the project the best use of resources?

Impact: What are the positive and negative, direct or indirect, intended or unintended effects of the intervention? To what extent can changes be attributed to the programme? What particular features of the programme and context made a difference? What was the influence of other factors?

Sustainability: Have the intervention outcomes continued after the end of the project? To what extent were local stakeholders empowered to ensure that the benefits of the activities are likely to continue after the project ends?

Humanitarian contexts

Connectedness: Has our short-term emergency response taken longer-term and interconnected problems into account?

Coverage: Have we reached the population groups facing life-threatening risks to provide them with assistance and protection? Were there inclusion or exclusion biases?

Coherence: Does our intervention take account of other key actors and efforts? Is there enough contextual analysis and stakeholder analysis to determine what connection is appropriate, feasible and desirable?

Coordination: To what extent were the interventions of different actors synergic, avoiding gaps, duplication and conflicts around resources?

There is currently a debate around the use made of those criteria. Why?

Issues repeatedly noted: reducing the evaluation to a box-ticking exercise; using the criteria mechanically, with totally unrealistic Terms of Reference that include 20+ evaluation questions categorized by DAC criterion. The problem is that it is often not plausible, feasible or even appropriate to capture all those dimensions in one evaluation. Other dimensions, such as Gender & Diversity or do no harm, may not be fully considered: should they be added as specific dimensions? Another issue is using questions that are too general and broad, without thinking exactly about what we want to assess.

Tips: Do not use the OECD-DAC criteria and standard questions as a shopping or checklist, or in a mechanical way.

✓ First, start by reflecting on your learning needs or evidence gap: what do you need to know and why? Phrase your evaluation questions and then connect them to the corresponding criteria.

✓ Do not automatically try to come up with questions for all the criteria.

Gender: You can look at the OECD-DAC criteria through gender lenses. See these examples:

1) EU: Evaluation with gender as a cross-cutting dimension
2) European Institute for Gender Equality
3) Guidance by IOM


How do OECD-DAC criteria relate with the Core Humanitarian Standard?

Should we take another entry point?

URD has developed a guide, the Compass Handbook, to help understand how evaluation, as well as the other Project Cycle Management stages, contributes to the 9 CHS commitments. There are clear linkages between the OECD-DAC criteria and the CHS; they are not exclusive but interconnected. Make sure that you consider cross-cutting themes if using the OECD-DAC criteria, and emphasize in your ToR which elements of the CHS you would like to explore in more depth.

You can also decide to use the CHS as your entry point…


Ethical considerations in Evaluation

Various evaluation bodies have produced guidance on evaluation ethics that is important to consider when launching an evaluation and examining the technical proposals submitted by consultants. For example, the 2008 Ethical Guidelines produced by the United Nations Evaluation Group (UNEG) expand on the commitments cited below.

Commitments from the UNEG code of conduct for evaluation:
- Independence
- Impartiality
- Conflict of interest disclosure
- Honesty and integrity
- Competence
- Accountability
- Respect of differences and rights
- Confidentiality
- Avoidance of harm
- Accuracy, completeness and reliability
- Transparency
- Obligation to report evidence of wrongdoing or unethical conduct

In our evaluation ToRs, we should insist particularly on Duty of Care and Do no Harm elements:

Confidentiality: the privacy of the persons met during the evaluation must be respected. The names of participants, or any information that may indicate their identity, should be avoided.

Informed consent: the evaluation team has the responsibility to inform participants of the objective of the evaluation and of how the data will be used, and to ask for their consent to participate. The researcher should invite participants to ask questions and respond honestly and transparently. Only after these stages can participants be asked whether or not they wish to take part in the research.

Do no harm: evaluation activities should be undertaken only if necessary and where the evaluation is designed to lead to valid and valuable information being gained, respecting data protection principles such as minimizing the data collected to what is really needed to fulfil the objective. The consultant must consider the level of vulnerability and the protection status of the participants and adapt their questions and attitude accordingly. If during the process the consultant becomes aware of a child in need of protection and/or assistance, the best interest of the child takes precedence over the desired outcomes of the evaluation. This could lead to an evaluation activity being suspended if it is considered to compromise the well-being of the child, her family or her community.

Measures should be taken to minimize the distress that participants may experience during the evaluation. Tdh has a responsibility to ensure that arrangements are in place to provide support to a participant, particularly a child, should they require it during or after an evaluation. The evaluation should not put any child in danger; if a request for assistance is made or the consultant recognizes a risk, the appropriate resources will be activated to assist the participant where possible. All precautions should be taken to avoid generating expectations, respondent “fatigue”, tokenism or re-victimization.

Child safety: the consultant and anyone else affiliated with the evaluation (assistants, translators, etc.) must sign the Terre des hommes Child Safeguarding Policy and Code of Conduct and be willing to adhere to their principles and expected practices. If a breach of the policy or code of conduct takes place, the consultancy will be terminated immediately without any financial burden on Tdh.


What is the role of the “Evaluation Manager”?

A Key role throughout the process: the Evaluation Manager

WHY?

Running an evaluation can be complicated: many actors may be involved, and someone must be responsible for checking that quality standards are met, risks are managed, relevant stakeholders are engaged and logistics are well handled. This is why it is important that someone takes on overall responsibility for the process.

WHO CAN PLAY THE ROLE OF THE EVALUATION MANAGER?

The Project Manager, the Programme Coordinator or the Q&A Coordinator at delegation level. For a regional project evaluation, it could be the Regional Project Manager or the Regional Programme Coordinator.

WHAT IS SHE/HE RESPONSIBLE FOR?

The responsibility for the overall process and outcome of the evaluation lies with the Evaluation Manager, who is supported by the evaluation steering committee or an advisory group.

WHO ELSE SHOULD BE INVOLVED IN THE EVALUATION MANAGEMENT?

Tdh partners should be involved as much as possible in the evaluation steering committee. When working in a consortium, all its members should be represented in the committee. Donors can also be involved, as well as beneficiaries, as much as possible through relevant methodologies. Do not forget: the process of the evaluation is as important as the final products.

Role of the Evaluation Manager during the evaluation process on the ground: sub-stages

Throughout: advise, alert, and readjust if needed.
1. Establish the steering committee / advisory group; clarify roles and responsibilities.
2. Manage the recruitment process: post the ToRs; receive and analyse offers; select the best qualified consultants on the basis of the methodological and financial proposals; give feedback; draw up the contract.
3. Organise the briefing and documentation for the desk review.
4. Coordinate the review and approval of the inception report; check the feasibility of the work plan.
5. Mobilize stakeholders; arrange meetings and workshops; ensure the participatory character of the evaluation.
6. Facilitate and follow up on logistics arrangements, including security.
7. Organize the debriefing and the sharing of preliminary results.
8. Coordinate the revision and approval of the intermediate and final reports and annexes.


Process and tasks of the Evaluation Manager in disseminating the evaluation products

1. Analyse the potential risks of publishing the evaluation products.
2. Confirm the communication strategy (to whom, in what format, when, how, by whom).
3. Translate the main elements of the report into the local language and adapt them to an adequate format.
4. Share and communicate in-country in a manner appropriate to the context and objectives.
5. Share the evaluation internally: with the delegation, the Zone, the programme and the Q&A unit at HQ.
6. Share the documents with donors, partners and evaluators.
7. Fill in the evaluation registry; check that all documents are stored on OneDrive and the Q&A website.


Budgeting for Monitoring and Evaluation: Typical costs

The 5 % golden rule

Monitoring and evaluation activities need to be planned and properly budgeted in the early stages of project design. In reality, despite good intentions, the M&E budget is often cut at the last stage of proposal development and falls through the cracks.

Why do we need money for M&E?
- People, expertise, capacity development
- Monitoring studies, data management, evaluations
- M&E events: project reviews, analysis workshops, networking

Type of M&E costs to be included in a project budget:

Human resources / Salaries
- Expatriate and national staff (e.g. M&E Manager, Q&A Coordinator, data entry, IM officer)
- % of salary of HQ or regional staff, depending on their contribution to the project

Training
- Training fees for national staff, travel, accommodation
- Cost of internal training (consultant, HQ)

Expertise and technical support
- Consultants to conduct evaluations
- Consultants for studies
- Consultants for developing methodologies & tools, M&E approach or framework
- MDC hotline
- HQ / regional support on studies / diagnostics / M&E products, internal evaluations

Documentation
- Memberships to journals / access to libraries
- Books
- Access to platforms if not open source

Logistics for workshops and publications
- Transportation, communication, accommodation, catering
- Renting a hall, stationery, design and printing, translation

Equipment
- Tablets / phones for collecting data, recorders, computers, office equipment, stationery
- Data collection software: SurveyCTO, CommCare
- Data analysis software: NVivo, Atlas.ti, SPSS
- Hosting, maintenance fees
- GIS software

DO NOT FORGET Tdh's golden rule for budgeting: 5% of the total budget should be dedicated specifically to M&E. This rule is of equal importance as the other ones!
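The 5% rule is easy to verify mechanically at proposal stage. A minimal sketch with illustrative budget lines and amounts (assumptions, not Tdh data):

# Minimal sketch (illustrative figures): checking the 5% golden rule.
total_budget = 800_000  # total project budget
me_budget_lines = {
    "M&E staff salaries": 22_000,
    "External evaluation consultant": 12_000,
    "Data collection equipment and software": 4_000,
    "Project reviews and analysis workshops": 3_000,
}

me_share = 100 * sum(me_budget_lines.values()) / total_budget
print(f"M&E share: {me_share:.1f}% of total budget")  # M&E share: 5.1%
if me_share < 5:
    print("Below the 5% golden rule: revisit the M&E budget before submission.")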


Siège | Hauptsitz | Sede | Headquarters Av. de Montchoisi 15, CH-1006 Lausanne T + 41 58 611 06 66, F +41 58 611 06 77 E-mail : [email protected], CCP : 10-11504-8

www.tdh.ch www.facebook.com/tdh.ch www.twitter.com/tdh_ch www.instagram.com/tdh_ch

Every child in the world has the right to a childhood.

It’s that simple.

© Tdh/Abayomi Akande