
UNICEF EVALUATION OFFICE
07 March 2014

EXPRESSION OF INTEREST: Lead Evaluator for the Developmental Evaluation of the PBEA

Subject: Peacebuilding, Education and Advocacy Programme (PBEA)

Date of the EOI: 07 March 2014

Closing Date of the EOI: 18 March 2014, and 14 April 2014 (as per Section IX of the enclosed ToR)

Address EOI by e-mail to: [email protected]

PURPOSE OF EXPRESSION OF INTEREST (EOI)

UNICEF Evaluation Office (New York) plans to commission a developmental evaluation of the Peacebuilding, Education and Advocacy Programme (PBEA).

This is an invitation for an Expression of Interest (EOI) from eligible individuals to provide services to conduct the evaluation.

The following are presented in the EoI call:
1. Detailed terms of reference (ToR) for the Lead Evaluator (developmental evaluator)
2. Appendix A – EoI Form
3. Appendix B – Concept note for the PBEA developmental evaluation

At a later date, a call for two Support Evaluators will also be published. The Support Evaluators will work with the Lead Evaluator as a three-person team.


Terms of reference for Lead Evaluator for the PBEA Developmental Evaluation

Purpose: To recruit suitable consultants/evaluators for the roles of technical lead and field-based evaluators for the DE exercise, to be conducted in two PBEA participating countries
Evaluation Timeline: March 2014 through November 2015
Reporting to: Evaluation Specialist, Evaluation Office

I. Background

1. The Peacebuilding, Education and Advocacy Programme (PBEA) is a four-year (2012-2015) programme funded by the Government of the Netherlands (GoN), currently being implemented in 14 countries. The strategic vision of the programme is to “strengthen resilience, social cohesion and human security in conflict-affected contexts,” with the strategic result of “strengthening policies and practices in education for peacebuilding.” The strategic result will be achieved through five outcomes:

a. Increase inclusion of education into peacebuilding and conflict-reduction policies, analyses, and implementation;
b. Increase institutional capacities to supply conflict-sensitive education;
c. Increase capacity of children, parents, teachers and other duty-bearers to prevent, reduce and cope with conflict and promote peace;
d. Increase access for children to quality, relevant, conflict-sensitive education that contributes to peace; and,
e. Contribute to the generation and use of evidence and knowledge on policies and programming on linkages between education, conflict and peacebuilding.

2. UNICEF commissioned an evaluability assessment of the PBEA in 2013. This assessment was a systematic process used to determine the programme’s readiness to be evaluated by examining the coherence and logic of the design, and the capacity of the management systems and governance structures to implement, monitor and measure results. The assessment also examined whether there was a shared understanding of the desired results and/or outcomes, and proffered advice, through findings and recommendations, on how the programme might be strengthened, and on the design for the end-of-programme evaluation.

3. The evaluability assessment categorized country programmes into three levels of ‘evaluation readiness’, with the most advanced programmes deemed ‘evaluation ready’ (having most of the elements required to execute a credible end-of-programme evaluation). The least ‘evaluation ready’ programmes require substantial inputs and support to bring them to a level where an evaluation effort would yield information that would enable credible findings/conclusions. The differential progress that country programmes have made towards implementation, the diverse country contexts, the complexity of the PBEA, and indeed the advice of the evaluability assessment, all point to the need for a well-thought-out and non-conventional end-of-programme evaluation effort.

4. The evaluability assessment also noted that using the conventional ex-post facto evaluation design - aggregating the contribution of education to peacebuilding across country programmes and selecting a sample of programmes for in-depth field work - would not represent PBEA results (or the lack thereof) adequately, nor would it capture the lessons that PBEA programming has to offer. Instead, a “bottom-up” evaluation design was deemed more suitable to capture the diversity of interventions and themes, broad variations in country programme profiles, and variations in navigating programming complexity.


5. In addition to the advice on the evaluation approach, the evaluability assessment also made strong observations on the need to balance accountability to funding authorities with seizing the opportunity to capture emergent learning. Hence two major evaluative activities are proposed for the PBEA: an exercise to assist in the systematic documentation of the lessons of PBEA development and implementation over a period of about 18 months, and a summative evaluation during the final quarter of the programme. The first evaluative activity will be based on the still-evolving thinking on ‘developmental evaluation’ (DE); recruiting suitable personnel for the first evaluative activity is the subject of these terms of reference.

II. Purpose of the DE and its rationale

6. DE is an approach that injects evaluative thinking and supports adaptive learning in complex initiatives. This design combines the rigor of evaluation methodologies with the flexibility and creativity that is required in seeking solutions to development problems, typically involving innovation, high levels of uncertainty, and tackling social complexity (Patton, 2008; Gamble, 2008; Dozois, Langlois and Blanchet-Cohen, 2010). DE seems to be an appropriate design for capturing the learning that ensues in a programme as complex as the PBEA.

7. To commence during the first quarter of 2014 in two of the country programmes that are still in the early stages of implementation, this component will be executed by a three-person evaluation team. One consultant will be a senior (experienced) lead evaluator with experience conducting or supervising a DE, who will conceptualize and guide the technical execution of the evaluation. The other members of the team will be two support evaluators, each working from within a country programme as an integral part of the country programme team.

IV. Scope of the evaluation and methodology

8. Scope: As previously stated, the evaluation’s purpose is to systematically capture the learning that can be infused into the programme to heighten its chances for success. Hence, the evaluation will provide comprehensive and evidence-based answers to two overarching questions, namely,

a. Are the programme impact pathways stipulated for the PBEA feasible/credible and likely to produce the intended results and/or outcomes? and,

b. What new learning and/or improvements were effected to improve attainment of results and/or outcomes?

The evaluation will cover two to three key outcomes that will be selected by the country programme team for intensive monitoring and study through the developmental evaluation. The evaluation will be based on a learning framework, which will set out sub-questions customized to the programming context, as well as criteria for weighing the evidence on each question.

9. Methodology: Given that DE is an adaptive, context-specific approach, its methodology is usually largely informed by the theme/subject matter under investigation and the context. Its practice offers a great opportunity for innovation and experimenting with new ideas, even in terms of the approach and methodology. However, DE primers (Dozois, 2010) have identified entry points, practices and organizing tools that are emerging as part of the methodology for a DE investigation. Below are some of the steps in building the methodology for the proposed DE, adapted from Dozois (2010) and tailored to the context of the PBEA.

Orienting the evaluation team: Evaluators undertake investigative work early in the initiative in order to build a deeper understanding of the identified problem or opportunity, resources, stakeholders and broader context. This could be a good starting point for the developmental evaluation of the PBEA.

Building relationships: The quality of relationships determines the degree to which the team can access information and influence change. For this reason, the methodology should consider a mapping of relationships that are critical to execute the DE, and a strategy to keep people engaged in the evaluation.

Orienting the implementation team (the country programme team): Related to the point above, a key part of the evaluators’ role is to help stakeholders test their assumptions, articulate and refine their models, extend their understanding, and cultivate a culture that supports learning. These activities will likely help the PBEA country teams to develop and maintain an adaptive orientation in complex and unknown territory.

Developing a learning framework: A learning framework is a good tool for DE practice. Developed in collaboration with key stakeholders, a learning framework (slightly different from an evaluation framework) will guide the evaluation by mapping out potential areas for learning (identifying both opportunities and challenges), identifying the data and/or evidence required to make decisions, and articulating feedback mechanisms.

Observing: Evaluators carefully observe the unfolding situation in order to help the group identify leverage points, assess their efforts, and stay true to the core intent and principles of their initiative. Evaluators should observe (i) key developmental moments; (ii) group structure; (iii) group dynamics; (iv) action or inaction; and (v) opportunities and threats. PBEA should be able to benefit from a sustained evaluation effort that can bring a better calibration of the impacts of micro-level solutions on a macro-level conflict.

Sense-making: Sense-making is largely about making sense of the data that has been collected. The evaluator’s role is to help the group identify patterns, integrate new information, consider the implications of their observations, and propose solutions.

Intervening: As a member of the programme team, evaluators actively help to shape the work by: (i) asking questions; (ii) facilitating discussion; (iii) sourcing or providing information; (iv) modeling solutions; and (v) making new connections.

10. A major task for the Lead Evaluator will be to develop a methodology for the DE, based on these rudimentary steps, and to enrich it with his/her knowledge and experience (see Section VII of this ToR document). Additional resources for building an evaluation methodology centered on reflective practice include guidance on evaluating peacebuilding interventions (OECD/DAC, 2012; Reimann, Chigas, and Woodrow, 2012; and Rogers, 2012).

V. Tasks, responsibilities, and management accountabilities

11. The lead evaluator will have overall responsibility for the technical guidance of the DE and its quality. He/she will be responsible for the following:

a. Conceptualize and develop the DE design and approach (learning framework, methodology, work planning, reporting, etc.);
b. Orient and/or coach the evaluation team;
c. Develop a workplan for the evaluation team, including an agreed set of deliverables;
d. Provide inputs in the recruitment of the support evaluators, as well as guide and supervise them;
e. Provide quality assurance of all outputs, including the final report of the evaluation;
f. In conjunction with the support evaluators, provide milestone reports to the Chief of Education (or PBEA focal point) in each participating country; and,
g. Periodically report on the progress of the evaluation to the Evaluation Specialist in the Evaluation Office.

12. The two support evaluators will be responsible for the following:


a. Adapt the methodology of the DE to the country context, and clear it with the Lead Evaluator;
b. Develop and implement a work plan for evaluation activities;
c. Direct the execution of the DE and assure its quality at the country level;
d. Monitor and track the data required to inform decisions on the evaluation;
e. Make agreed inputs into milestone reports (the Lead Evaluator has overall responsibility); and,
f. Compile progress reports, clear them with the Lead Evaluator, and present them to the Chief of Education (or PBEA focal point), according to an agreed schedule.

13. The country office leadership (presumably the Education Chief) will be responsible for:
a. Co-facilitating the recruitment of support evaluators;
b. Providing supervision support in the day-to-day execution of evaluation activities; and,
c. Reviewing the progress reports, and all deliverables.

14. The Evaluation Specialist and evaluation manager (in UNICEF’s Evaluation Office in New York) will be responsible for:
a. Coordinating, directing and supervising all activities of the Lead Evaluator;
b. Consulting with the M&E Specialist (PBEA) on a regular basis;
c. Providing updates to the PBEA Programme Manager at agreed intervals; and,
d. Assuring the quality of all deliverables, as well as giving final approval.

VI. Key skills, academic/technical background, and experience required

15. This invitation is extended to evaluation practitioners and/or consultants with extensive experience in the monitoring and evaluation of international development programmes, to submit a bid to lead a three-person evaluation team in developing and executing a developmental evaluation study. The successful consultant should offer the following range of skills and experience:

a. Leadership and strategic thinking skills;
b. Extensive technical knowledge, skills and expertise in evaluation concepts and approaches, and in evaluating complexity in particular;
c. Programming and/or evaluation experience in the broad area of education in emergencies and/or working with vulnerable populations in conflict-affected countries (evaluation experience in peacebuilding programmes will be assessed as an added advantage);
d. Strong analytical skills to support both qualitative and quantitative research;
e. Facilitation skills, particularly the design and execution of stakeholder consultations and team-building exercises;
f. Active listening and time management skills;
g. Excellent oral and written communication and report writing skills in English (proficiency and the ability to communicate in one other UN language will be assessed as an added advantage); and,
h. Computer literacy in Word, Excel and PowerPoint.

The consultancy is estimated at 117 person days, stretching from March 2014 to October 2015. Consultancy fees will be payable at the P-5 to D-1 level, depending on the qualifications and experience of the consultant. The consultant will be paid on an agreed schedule, upon submission of satisfactory products. Within the contract period, the consultant will be expected to undertake at least four trips to UNICEF New York for the purpose of consultations, at least two trips to each of the participating country programmes, as well as a trip to the PBEA annual global meeting. (The consultant will be responsible for submitting copies of travel health coverage prior to any travel for this contract.) Additional expenses up to and not exceeding US $2,000 will be reimbursed to the consultant upon presentation of a breakdown and receipts for miscellaneous expenses (telephone calls, fax, photocopying, postal charges, etc.).

VII. Timeline, time allocation and deliverables

Task | Output/deliverables | Person days | Deadline
Conceptualize and develop the DE design and approach (learning framework, methodology, work planning, reporting, etc.) | Draft Inception report | 10 days | April 30, 2014
Orienting and training of support evaluators (and team building) | Final Inception report and work plans | 5 days | April 30, 2014
Missions to UNICEF HQ, New York, for consultations with EO and PBEA PMT (3-4 visits) | Trip reports | 6 days |
Missions to DE programme countries (4 missions; 2 to each country) | Trip reports | 20 days |
Mission to PBEA Global Review Meeting | Trip report | 6 days |
Technical backstopping of evaluation and quality assurance (2 days per month, times 2 countries; May-Nov 2014: 28 days; Feb-Oct 2015: 36 days) | Quality review protocols completed | 64 days |
Reporting | Draft and final reports | 6 days | Oct 30, 2015
TOTAL | | 117 days |
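The person-day totals can be cross-checked from the figures above (a sketch of the implied arithmetic, assuming inclusive month counts for the backstopping periods):

\[
\text{May--Nov 2014: } 7 \times 2 \times 2 = 28 \text{ days}, \qquad \text{Feb--Oct 2015: } 9 \times 2 \times 2 = 36 \text{ days}
\]
\[
10 + 5 + 6 + 20 + 6 + (28 + 36) + 6 = 117 \text{ person days}
\]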

VIII. Expression of interest, and concept paper

All interested consultants should send an application packet including the following:
a. A completed Expression of Interest form (see Appendix A) and responses to the questions, including the professional fee/rate per person day;
b. An updated CV/resume, and a completed Personal History Profile (P11)1, if the UNICEF Evaluation Office does not have it on file already;
c. A sample evaluation report, where the consultant was team leader and/or lead author; and,
d. A revision of the concept paper in Appendix B of this ‘expression of interest’ publication that contributes to and improves Section IV (scope and methodology, including a draft learning framework and evaluation criteria), not exceeding 8 pages, excluding references.

The application packet should be transmitted via email to Kathleen Letshabo at the following address:
Email: [email protected]
Subject: Expression of Interest - PBEA Developmental Evaluation (Lead Evaluator)

IX. Closing dates

There are two closing dates to observe:
a. Filing of the ‘expression of interest’ form as well as the applicant’s CV/resume (Items VIIIa and VIIIb above). These are due on 18 March 2014, 12:01 am (midnight), New York City time.
b. Filing of the concept paper and the sample evaluation report that has the applicant as team leader and/or lead author (Items VIIIc and VIIId above). These are due on 14 April 2014, 12:01 am (midnight), New York City time.

1 http://www.unicef.org/about/employ/files/P11.doc


NB: Submissions have to observe both deadlines, and file documents accordingly, to be considered. Submissions that file all four documents on 14 April 2014 will not be considered.


Appendix A: Expression of Interest Form

Lead Evaluator for PBEA Developmental Evaluation

Please fill in page 1 of the form in its entirety and submit it to us electronically or via fax.

First Name:

Last Name:

Salutation: Mr. / Ms. / Mrs. / Dr.

Job Title:

Mobile: (please include country & city code)
Fax: (please include country & city code)
Email address:

Address:

City: State: Postal Code: Country:

Alternate contact


Please respond to the questions below in a narrative not exceeding 3-4 pages.

1. Provide information which will enable us to determine whether you have relevant experience to conduct the proposed developmental evaluation. Information should include:
a. The number of years of experience in evaluation research;
b. The number of evaluations you have carried out as lead investigator, and as a member of an evaluation team, and the number of evaluations commissioned by UN agencies or comparable organizations; and,
c. A description of your technical competencies, skills and expertise in evaluation concepts and approaches, and analytical skills to support both qualitative and quantitative research.

2. Provide information which will enable us to determine whether you have relevant specialized knowledge in the areas that are critical to this work. Information should include:
a. A description of programming experience in the education sector, and peacebuilding programmes, or related sub-disciplines or themes;
b. A description of evaluation or programming work in complex settings, such as education in emergencies programmes, and/or with vulnerable populations in conflict-affected countries (evaluation experience in peacebuilding programmes will be assessed as an added advantage);
c. A description of actions/decisions that demonstrate your leadership and strategic thinking skills; and,
d. A description of ‘other’ skills or experiences that may be required to execute this evaluation, including facilitation skills, design and execution of stakeholder consultations, conducting team-building exercises, time management skills, etc.

3. Provide any additional experience that may be critical to the success of the proposed work, including but not limited to:
a. Affiliation to universities, university programmes, professional bodies, communities of practice, etc.; and,
b. Any other information that you deem relevant to this work that would give you an advantage over others competing for the same consultancy.

4. Please provide your professional fee rate, per person day.

5. Confirm the following:
a. You have NO on-going litigation with the UN; and,
b. You are not currently removed/invalidated or suspended by the United Nations or UN system organizations.

6. Closing dates
There are two closing dates to observe:
a. Filing of the ‘expression of interest’ form as well as the applicant’s CV/resume (Items VIIIa and VIIIb). These are due on 18 March 2014, 12:01 am (midnight), New York City time.
b. Filing of the concept paper and the sample evaluation report that has the applicant as team leader and/or lead author (Items VIIIc and VIIId). These are due on 14 April 2014, 12:01 am (midnight), New York City time.

NB: Submissions have to observe both deadlines, and file documents accordingly, to be considered. Submissions that file all four documents on 14 April 2014 will not be considered.


Appendix B

PBEA DEVELOPMENTAL EVALUATION: A CONCEPT NOTE

I. Background

1. The Peacebuilding, Education and Advocacy Programme (PBEA) is a four-year (2012-2015) programme funded by the Government of the Netherlands (GoN), currently being implemented in 14 countries. The strategic vision of the programme is to “strengthen resilience, social cohesion and human security in conflict-affected contexts,” with the strategic result of “strengthening policies and practices in education for peacebuilding.” The strategic result will be achieved through five outcomes:

• Increase inclusion of education into peacebuilding and conflict-reduction policies, analyses, and implementation;
• Increase institutional capacities to supply conflict-sensitive education;
• Increase capacity of children, parents, teachers and other duty-bearers to prevent, reduce and cope with conflict and promote peace;
• Increase access for children to quality, relevant, conflict-sensitive education that contributes to peace; and,
• Contribute to the generation and use of evidence and knowledge on policies and programming on linkages between education, conflict and peacebuilding.

2. A unified Global Results Framework (GRF) was developed to guide an assessment of global corporate accountabilities. Based on the general guidance of the five strategic objectives, country programmes and other implementation units were expected to develop context specific programmes and adapt the five outcome results framework to their contexts. The country programmes integrated their results into the GRF via operational matrices outlining key objectives, indicators, and activities.

3. The PBEA is managed by the Programme Management Team (PMT) housed in the Education Section (Programme Division), working closely with the Humanitarian Action and Transition Unit (HATIS) and other divisions, sections, and units. The PMT provides overall leadership for the programme while implementation is carried out mainly through individual country-level programmes with technical support from regional offices and the PMT.

4. UNICEF commissioned an evaluability assessment of the PBEA in 2013. This assessment was a systematic process to determine the programme’s readiness to be evaluated by examining the coherence and logic of the design, and the capacity of the management systems and governance structures to implement, monitor and measure results. The assessment also examined whether there was a shared understanding of the desired results and/or outcomes, and proffered advice, through findings and recommendations, on how the programme might be strengthened, and on the most suitable design for the end-of-programme evaluation.

5. The assessment categorized country programmes into three levels of ‘evaluation readiness’, with the most advanced programmes deemed ‘evaluation ready’ (having most of the elements required to execute a credible end-of-programme evaluation). The least ‘evaluation ready’ programmes were flagged as requiring ‘substantial inputs and support’ to get them to a point where the end-of-programme evaluation would yield useful information.

6. The evaluability assessment also noted that using the conventional ex-post facto evaluation design - aggregating the contribution of education to peacebuilding across country programmes and selecting a sample of programmes for in-depth field work - would not adequately capture the complexity represented in PBEA implementation, nor would it capture the lessons that PBEA programming has to offer. Instead, a “bottom-up” evaluation strategy was recommended as more suitable to capture the diversity of interventions and themes, broad variations in country programme profiles, and variations in navigating programming complexity. In addition, the evaluability assessment made strong observations on the need to balance accountability to funding authorities with capturing emergent learning.

II. The problem and proposed solution

7. The concept of peacebuilding has evolved from the broad definition of its earlier articulation in the UN system in the 1990s to a more refined set of priorities. The provision of basic services, including primary education, is explicitly stated as one of these priorities. For instance, access to quality education is often a key component and measurement of basic safety and human security, while restoring schools and reconstructing education systems can increase government legitimacy. Civic education programmes have often been used to build conflict-management capacities and to develop the dialogue skills that are necessary to foster reconciliation and support political processes. Education can also contribute to economic revitalization, provide constructive opportunities for the empowerment of women and girls and disenfranchised youth, and has a strong potential to transform accepted norms around gender and power, to prevent violence, and to serve as a peace dividend.

8. Emanating from the increased recognition and acknowledgement of the linkages between education and peacebuilding, PBEA is a programme that attempts to harness the power of education to contribute to peace. However, the diversity of country contexts, the level of complexity displayed in programming options, and the differential progress that country programmes have made towards PBEA implementation (and their evaluation readiness) all point to an end-of-programme evaluation strategy that will codify emergent learning in a manner that can benefit UNICEF as we move from this first generation of education and peacebuilding programmes towards ‘programming for resilience’.

9. Hence, we propose a ‘developmental evaluation’ (DE) - an approach that injects evaluative thinking and supports adaptive learning in complex initiatives. This design combines the rigor of monitoring and evaluation practice with the flexibility and creativity required in seeking solutions to development problems, typically involving innovation, high levels of uncertainty, and tackling social complexity (Patton, 2008; Gamble, 2008; Dozois, Langlois and Blanchet-Cohen, 2010). While we acknowledge that DE will require a lot more input from the Evaluation Office (some of which may seem programmatic), we submit that DE is an appropriate design for capturing the learning that ensues in a programme as complex as the PBEA.

10. Our proposal is for the DE component to commence during the first quarter of 2014 in two of the country programmes that are still in the early stages of implementation. The evaluation will be executed by a three-person team: an experienced lead evaluator who will conceptualize and guide the technical execution of the evaluation from his/her base, as well as two support evaluators, one of whom will be embedded with each of the two country programme teams.2 Hence, the purpose of this concept note is to provide a rationale for the developmental evaluation, broad strokes of the methodological approach, as well as roles and responsibilities of different actors.

2 A PBEA end-of-programme summative evaluation - an ex-post facto assessment of programme performance that will enable aggregation of programme results as per UNICEF accountabilities to the donor - is planned to commence in 2015. The evaluation will feature a desk-based document analysis and synthesis of results from all 14 programmes, in-depth field work in three of the country programmes that were deemed “evaluation ready” by the evaluability assessment, as well as integrate lessons from the developmental evaluation which is the subject of this concept paper. The Evaluation Office will commission an independent evaluation firm to conduct the summative evaluation.

III. Rationale for developmental evaluation and implications for practice

11. According to Patton (2008) and Gamble (2008), DE is suitable in:
• Innovative interventions, requiring real-time learning, processing and feeding back new learning to create the desired solutions;
• Highly emergent implementation environments, where there is a consistent need to restate, reaffirm, or even modify the goals of an intervention as new learning comes in;
• Highly complex interventions, characterized by non-linearity in terms of design and execution, as well as the relationship between input/output and outcome variables; and
• Socially complex themes or areas of inquiry, requiring collaboration among stakeholders in different parts of the system, as well as external stakeholders.
Most PBEA interventions operate under several of these conditions or situations; hence the programme is deemed to be a good ‘fit’ for a DE.

12. However, PBEA is not an “innovation” in the classical sense of the term. And while there is an expectation that it will generate innovative education solutions towards building sustainable peace in schools, homes, communities, and education institutions, the goals of the programme have already been defined (see para. 1 above). In practice, DE leaves a lot of room for the evaluator and the programme team to define and modify goals in response to new learning, and thereby determine what ‘success’ of the innovation and/or intervention is.

13. This means that one of the challenging and yet critical tasks of the developmental evaluation will be to align the goals of the DE with accountabilities for results in the two country programmes, while at the same time ensuring that programme implementers are empowered to act on new learning by effecting the necessary modifications to the programme. To that end, the PBEA evaluability assessment concluded that the PBEA had articulated good monitoring indicators at the activity and output levels, but had yet to articulate clear evaluation indicators at the outcome or goal levels (individual behavior change, policy implementation and social change)3. This DE offers an opportunity and space to propose what success will look like, in terms of outcomes that can be realized as well as ‘evaluation indicators’.

14. Considering also that DE is a complex and time-intensive undertaking, a certain level of organizational/system “readiness to support learning” is required for this approach to be effective. It also requires sensitization at the country and regional offices to provide a comprehensive understanding of the methodology. The following questions (adapted from Dozois et al., 2010) are useful in assessing the readiness of PBEA country programmes for DE:
• Is there buy-in for developmental evaluation, or are there champions in respective country programmes who can help cultivate buy-in?
• Does the culture of programme teams support learning - i.e., how do programme teams handle failure? How do they handle feedback? Are they willing to take risks?
• Is there clear recognition of the resources and effort that are required, and are there appropriate resources?
• Does leadership within programme teams understand the need for participatory processes in the way in which decisions are made, or the need to alter power dynamics?
• Are programme teams willing to adapt their structures (e.g., rules, routines, procedures) to accommodate new ways of operation?
• Are there any major issues that could interfere with the process (e.g., in-fighting among staff, an unclear mandate, unstable financial support)?

3 PBEA Evaluability Assessment (para. 57). (http://www.unicef.org/evaldatabase/files/PBEA_Evaluability_Assessment_Final_Report_November_2013.pdf)

15. In terms of the readiness of PBEA implementing units to execute DE and get the full benefit of what it has to offer (or indeed the readiness of UNICEF as a whole), a reconceptualization of evaluation is required to ensure that there is full understanding of what the DE effort entails. The Evaluation Office and the Education Section will have to work with all levels to promote understanding of this approach and ensure buy-in and ownership. More importantly, while traditional evaluation practice is governed by distinctive notions of independence or objectivity, we need to negotiate a new set of working relationships and management arrangements characterized by objectivity and transparency in order to create an environment that enables meaningful learning.

16. With regard to the competencies required to execute the DE approach, the DE literature suggests that while the skill set associated with traditional evaluation practice is definitely an asset, traditional evaluators may be less equipped to operate within the kind of ambiguity that is associated with innovation and emergent initiatives (Patton, 2008; Gamble, 2008; Dozois et al., 2010). Consequently, the skill set required of the evaluation team will be that deemed necessary in navigating complexity, namely some experience and/or leadership in the education and peacebuilding domain, strategic thinking, pattern recognition, and relationship building. Strategic thinking is necessary to identify high-level principles and purposes and cultivate actionable focus, while pattern recognition is essential in identifying categories for managing complexity.

17. On the issue of the positioning of evaluators, lessons from DE practitioners indicate that each type of positioning presents advantages and disadvantages. Table 1 below, adapted from Dozois et al., 2010, highlights issues associated with the different options we have for positioning our evaluators, and implications for the PBEA. Consequently, the lead evaluator will be positioned externally. He/she will be required to make technical inputs on a regular basis, and to conduct two one-week missions to each of the two programme countries. The two support evaluators will be embedded with the country programme teams. However, thorough consideration should be given to this decision in terms of determining the suitable level/experience of embedded evaluators, their function and tasks, their latitude to exercise their duties, as well as the management arrangements.4

Table 1: Positioning of developmental evaluators

Positioning: Developmental Evaluator as internal staff member

Advantages: Internal staff members…
• have tacit, insider knowledge that an outsider may never acquire;
• are situated where the action is, giving them opportunities to observe the situation as it unfolds; and
• have established certain relationships; this can be a significant advantage.

Disadvantages: Internal staff members…
• often struggle with credibility as evaluators, particularly if their previous roles have not been focused specifically on evaluation;
• may be seen as less objective;
• may find their capacity to “speak truth to power” compromised by their need to get along, or by job security;
• are usually expected to fulfill other roles, hence their position as evaluators is sometimes diluted; and
• may have adverse relationships or be caught up in office politics; this can be a significant disadvantage.

Implications for PBEA - initial proposals…
• The two support evaluators should be positioned as internal evaluators, even though they may be recruited from outside UNICEF.
• As internal staff members, support evaluators are expected to gain access to knowledge, people and systems, to enhance their understanding of the PBEA.

Implications for PBEA - to be mitigated…
• There could be issues with the credibility of support evaluators if they are considered too junior in UNICEF terms (P3 or lower).
• Clear ToRs should be issued to ensure that support evaluators are assigned responsibilities and tasks that are directly linked to the evaluation.

Positioning: Developmental Evaluator as external consultant

Advantages: External consultants…
• are generally considered to be more objective or neutral; they are not party to internal politics;
• have more points of comparison to contribute, and a broader network of information and influence to draw upon; and
• can focus on the task more easily because they do not have other obligations within the organization.

Disadvantages: External consultants…
• have less insider knowledge, which is sometimes a critical requirement;
• have to work harder to build relationships and get access to important pieces of information; and
• are generally more expensive, and the initiative may have to limit contact time as a result.

Implications for PBEA - initial proposals…
• The lead evaluator should be recruited as an external independent consultant, positioned externally to maintain objectivity and some measure of independence on the technical aspects of the DE.
• The lead evaluator is expected to gain sufficient insider knowledge through interaction with the two support evaluators.

4 Discussed in Section V of the Terms of Reference for the Lead Evaluator.

IV. Scoping and executing developmental evaluation (methodology and other considerations)

18. Scope: As previously stated, the evaluation has the dual purpose of reporting on results, as well as systematically capturing the learning that can be infused into the programme to heighten its chances for success. Hence, the evaluation will provide comprehensive and evidence-based answers to two overarching questions, namely:
a. Are the programme impact pathways stipulated for the PBEA feasible/credible and likely to produce the intended results and/or outcomes? and,
b. What new learning and/or improvements were effected to improve attainment of results and/or outcomes?
Based on the questions above, the evaluation will cover 1-2 key conflict drivers and corresponding programme outcomes. Additional monitoring will be provided for the selected outcomes. Evaluators will develop a learning framework (and/or theory of change) and criteria for weighing the evidence, to be validated with country teams. Country teams will also be afforded the opportunity to develop sub-questions that are customized to the programming context.

19. DE primers have identified entry points, practices and tools that have been found useful in organizing a DE investigation (Dozois, 2010). Table 2 (adapted from Dozois, 2010) summarizes some conventional steps to consider in the execution of a DE, as well as how these might be interpreted in the context of the PBEA.

Table 2: Executing the DE: possible actions

ENTRY POINTS/INITIAL ACTIONS

Orienting the evaluation team
Description: Evaluators undertake investigative work early in the initiative in order to build a deeper understanding of the identified problem/opportunity, resources, stakeholders, and broader context.
Implications for PBEA: Time should be set aside for evaluators to review the PBEA proposal document, the global results framework, programme documents for the two selected countries, and the evaluability assessment report. They should also set up meetings with a range of stakeholders (Evaluation Office, PMT, TWG, regional focal points, and country teams).

Building relationships
Description: The quality of relationships determines the degree to which the team can access information and influence change. For this reason, relationship building is critical to building a strong foundation for DE work.
Implications for PBEA: The most important relationship will be that between the evaluators and the respective regional focal points and country teams. This should be managed carefully by the PBEA Programme Manager and Evaluation Manager (Evaluation Office).

Developing a learning framework
Description: It is important to develop a learning framework early in the process. Co-created with key stakeholders, a learning framework helps to guide development by mapping key challenges and opportunities, highlighting potential areas for learning, and identifying feedback mechanisms.
Implications for PBEA: A learning framework is highly desirable in the case of the PBEA. It will be the guiding document of the evaluation (and a contract of sorts between the evaluation team and the programme team).

Developing the rest of the methodology
Description: Including the learning framework above, the methodology should articulate the criteria to be used to assess evidence, delineate the scope of the evaluation, and identify tools for collecting evidence.
Implications for PBEA: OECD/DAC evaluation criteria should be reinterpreted for the PBEA. Criteria could also be expanded to reflect human rights principles (such as participation, equality, empowerment, etc.). See Table 2.3 in the UNEG Guidance Note (2011).

SOME CRITICAL PRACTICES FOR CONSIDERATION

Orienting the implementation team
Description: A key part of a DE’s role is to help stakeholders surface and test their assumptions, articulate and refine their models, extend their understanding, and cultivate a culture that supports learning. These activities help implementing teams to develop and maintain an adaptive orientation in complex and unknown territory.
Implications for PBEA: Beginning with validating the learning framework mentioned above, orienting the implementing team should include ensuring that the team possesses a deep understanding of the conflict analysis. The country team (including the evaluator) should be on the same page on the most critical results for the programme, activate a strategy for incorporating new learning, and agree on roles and responsibilities.

Watching
Description: Evaluators carefully observe the unfolding situation in order to help the group identify leverage points, assess their efforts, and stay true to the core intent and principles of their initiative. Evaluators intentionally watch (1) key developmental moments; (2) group dynamics; (3) structure; (4) action/inaction; and (5) threats and opportunities.
Implications for PBEA: The core competencies outlined in para. 16 are especially important to execute the DE in a coherent way, and to organize emergent learning such that it is accessible to the whole country team. We expect that the lead evaluator will undertake initial coaching and preparation of the support evaluators to ensure that they are ready to execute the tasks.

Sense-making
Description: Sense-making is largely participatory in developmental evaluation: the evaluator’s role is to help the group identify patterns, integrate new information, and consider the implications of what they’re seeing and propose solutions.
Implications for PBEA: Evaluative skills are important here; so is reflective practice. It is important to come into the evaluation with a well-considered set of analytical tools, including innovating some tools if necessary.

Intervening
Description: As a member of the team, evaluators actively help to shape the work by (1) asking questions; (2) facilitating discussion; (3) sourcing or providing information; (4) modeling; (5) pausing the action; (6) reminding; and (7) connecting.
Implications for PBEA: This is the most important component of the DE. Strategic thinking and communication skills are required for this aspect. This is also likely the aspect that has to be managed most carefully, in that the technical skills have to be complemented by the evaluator’s credibility.

20. The Lead Evaluator will be expected to build on some of these lessons, adapt and/or interpret them for the PBEA, or propose a completely different set of steps and/or methods to execute the developmental evaluation. Additional resources for building an evaluation methodology centered on reflective practice include guidance on evaluating peacebuilding interventions (OECD/DAC, 2012), UNEG guidance (UNEG, 2011), and CDA working papers (Reimann, Chigas, and Woodrow, 2012; Rogers, 2012).

V. Tasks, responsibilities, and management arrangements

21. The Lead Evaluator will have overall responsibility for the technical guidance of the DE and its quality. He/she will conceptualize and develop the DE approach, guide and supervise the technical inputs of the support evaluators, and be responsible for quality assurance of all outputs, including the final report of the evaluation. The Lead Evaluator will also be responsible for compiling progress reports (with inputs from the support evaluators) and the timely submission of deliverables to the evaluation manager in the Evaluation Office in NY.

22. The two support evaluators will be responsible for the execution of the DE and its quality at the country level. They will also be responsible for developing a work plan for executing the evaluation in-country, assuming responsibility for the data for monitoring progress of the PBEA outcomes that are the subject of the DE, and producing an agreed set of deliverables. In consultation with the PBEA Focal Point, the Support Evaluators will also be responsible for reporting progress and results to the Lead Evaluator and the Chief of Education in-country.

23. The Chief of Education in each participating country will co-facilitate the recruitment of the support evaluator and provide supervisory support in the day-to-day execution of the evaluation activities. The Chief of Education and the PBEA Focal Point will sign off on all reports that are generated in-country before they are forwarded to the Lead Evaluator. The Deputy Representative and Senior Evaluation Specialist will be charged with the responsibility of adjudicating all disagreements.

24. The Evaluation Specialist (the evaluation manager in the Evaluation Office, NY) will, in consultation with the PBEA Manager (and PBEA M&E Specialist, NY), recruit the Lead Evaluator, and co-facilitate the recruitment of the Support Evaluators. She will also be primarily responsible for coordinating, directing, and supervising all activities of the Lead Evaluator. She will also consult with the PBEA M&E Specialist on a regular basis and provide updates to the PBEA Programme Manager at agreed intervals, as well as approve all deliverables (see Appendix A, Section VII).

VI. Risk management and/or mitigation

25. The most critical risk for the evaluation pertains to balancing the need to produce results with readiness of different parts of UNICEF to execute a DE and get the full benefit of what it has to offer. To that end, careful consideration should be given to the following:

i) Clear delineation of the role of the support evaluator: There is always the risk that the support evaluator assumes the role of M&E officer for the entire education portfolio or, worse still, another pair of hands for the entire country office. The Evaluation Office and the two COs should come to an agreement that the role of the support evaluator is primarily ‘evaluator’, assuming the monitoring role only for PBEA activities.

ii) Management arrangements: Related to the point on the role of the support evaluator is the issue of who he/she reports to. In a typical independent evaluation, the evaluator would be supervised by the evaluation manager (in the Evaluation Office). We anticipate a protracted discussion on management arrangements, and suggest that the arrangements proposed in Section V above be negotiated between the Evaluation Office and the Country Office leadership (involving the Chief of Education and the Deputy Representative, or higher).

VII. References

Dozois, E., Langlois, M., & Blanchet-Cohen, N. (2010). DE 201: A practitioner’s guide to developmental evaluation. Montreal, Canada: J.W. McConnell Foundation and the International Institute for Child Rights and Development.

Gamble, J. (2008). A developmental evaluation primer. Montreal, Canada: J.W. McConnell Foundation.

OECD/DAC. (2012). Evaluating peacebuilding activities in settings of conflict and fragility: Improving learning for results. Paris: OECD/DAC.

Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage Publications.

Reimann, C. (2012). Evaluability assessments in peacebuilding programming. Working Papers on Program Review and Evaluation, No. 3. Cambridge, MA: CDA Collaborative Learning Projects.

Reimann, C., Chigas, D., & Woodrow, P. (2012). An alternative to formal evaluation of peacebuilding: Programme quality assessments. Working Papers on Program Review and Evaluation, No. 4. Cambridge, MA: CDA Collaborative Learning Projects.

Rogers, M. (2012). Evaluating relevance in peacebuilding programs. Working Papers on Program Review and Evaluation, No. 1. Cambridge, MA: CDA Collaborative Learning Projects.

UNEG. (2011). Integrating human rights and gender equality in evaluation: Towards UNEG guidance. New York, NY: United Nations Evaluation Group.
