
    EASIER SAID THAN DONE

A REVIEW OF RESULTS-BASED MANAGEMENT IN MULTILATERAL DEVELOPMENT INSTITUTIONS

    Michael Flint

    March 2003


This document is an output from a project funded by the UK Department for International Development (DFID). The author is grateful to all those in DFID, UNDP, UNICEF, UNIFEM, IDB, and the World Bank who assisted with this study. The views expressed in this report are those of the author alone, and are not necessarily those of DFID.

Results-based approaches have continued to develop in the years since work started on this report. The information in this report does not therefore necessarily reflect the current situation in the institutions covered.

Michael Flint & Partners
Wernddu
Pontrilas
Herefordshire HR2 0ED
United Kingdom

    email: [email protected]


    CONTENTS

    Summary

    1. Introduction

    2. What is results-based management?

3. The history of results-based approaches

    4. Strategic planning

    5. Monitoring and reporting

    6. Managing

    7. Issues in results-based management

    8. Conclusions

     Annexes

A. Strategic planning – country level
B. Strategic planning – corporate
C. References


     ABBREVIATIONS AND ACRONYMS

ARDE Annual review of development effectiveness
AROE Annual report on operations evaluation
ARPE Annual report on projects in execution
ARPP Annual review of portfolio performance
CAS Country assistance strategy
CCF Country co-operation framework
CN Country note
CIDA Canadian International Development Agency
CPIA Country policy and institutional assessment
CPO Country programme outline
CSP Country strategy paper
DAC Development Assistance Committee
DER Development effectiveness report
DFID Department for International Development
IDA International Development Association
IDB Inter-American Development Bank
IMEP Integrated monitoring and evaluation plan
MDG Millennium Development Goal
MDI Multilateral development institution
M&E Monitoring and evaluation
MTSF Medium term strategic framework
MTP Medium term plan
MTSP Medium term strategic plan
MYFF Multi-year funding framework
OECD Organisation for Economic Co-operation and Development
OED Operations Evaluation Department (World Bank)
OVE Office of Evaluation and Oversight (IDB)
PCR Project completion report
PRSP Poverty reduction strategy paper
QUAG Quality Assurance Group (World Bank)
RBM Results-based management
ROAR Results orientated annual report
SBP Strategy and business plan
SRF Strategic results framework
TAPOMA Task force on portfolio management
UNDP United Nations Development Programme
UNICEF United Nations Children’s Fund
UNIFEM United Nations Development Fund for Women


    SUMMARY

    1. The purpose of this report is to present a comparative study of the practice of

    results-based management in a sample of five multilateral development

    institutions: the United Nations Development Programme (UNDP); United

    Nations Children’s Fund (UNICEF); United Nations Development Fund for

    Women (UNIFEM); Inter-American Development Bank (IDB); and the World

    Bank. The report is based on a review of documents and a limited number of

    interviews with head office staff in mid-2002. As such, it does not claim to be

    definitive, nor necessarily fully up to date with developments since then.

    2. The terms ‘results’ and ‘results-based management’ (RBM) are used in

    different ways in different institutions. Section 2 of the report provides some

    introductory definitions. Results are taken to include outputs, outcomes and

    impacts, but with an emphasis on outcomes and impacts. RBM is similar to,

    but not synonymous with, performance management. 

    3. All five institutions are, to a greater or lesser extent, engaging with results-

    based management. All have made a commitment to increase their results-

    focus. All have taken steps to, or are working on, improving the planning and

    reporting of results. As befits their different histories, mandates and cultures,

there is enormous variety in their approaches to, and progress on, the four main

    components of RBM: strategic planning, monitoring, reporting and managing

    (using).

    4. The introduction and implementation of RBM within large institutions is never

quick and easy, as is shown by experience in the public sector in OECD countries. The introduction of RBM to international development agencies is

    even more challenging. Four particular issues can be identified:


    •  developing country capacity

    •  attribution

    •  aggregation

    •  incentives

    5. Results-based management is the latest in a very long line of efforts to

    improve the measurement, monitoring and reporting of effectiveness. This is

    not to diminish its potential significance. Thinking about development in terms

    of outcomes and impacts, rather than inputs, activities and outputs, is a

    powerful idea that has major implications for how multilateral development

    institutions operate.

    6. Five conclusions emerge from this study:

i. Results-based management is easier said than done, particularly for

    development institutions, and particularly given the new emphasis on

    country and global results. Institutions should not underestimate the

    challenge.

ii. Multilateral development institutions work through and with developing country governments to realise and measure results. This presents

    development agencies with a double challenge: introduce RBM internally

    and within partner country governments. One without the other is unlikely

    to succeed. Greater support for the introduction of RBM in developing

    countries, and associated public sector reform, is essential.

iii. External accountability is driving much of the recent push for RBM. This

    needs to be accompanied by a greater emphasis on using results

    information for internal management.

    iv. RBM in development co-operation has to face up to the challenge of

    attribution. For all practical purposes, development agencies have little


    option but to manage for outcomes in the medium- to long-term, but to

    manage by outputs, indicators and other measures of performance (eg.

    partnership, strategy and process) in the short-term.

v. Multilateral development institutions need to work to amend their internal

    incentive structures in favour of results. This implies working to correct the

    continued bias in favour of inputs and activities, as well as giving

substance to results-based budgeting. Resources and recognition need

    to flow to those individuals, units, sectors and countries with the best

    record of managing for, and delivering, results.

    7. Finally, this study has implications for those supporting and monitoring the

    progress of results-based management within multilateral development

    institutions. Assessing the quality and extent of management change is not a

    straightforward task. Increasing support for the introduction of RBM will need

    to be accompanied by a more sophisticated approach to its monitoring.


    1. Introduction

    1.1 Recent interest in results-based management in multilateral development

    institutions is the product of two related developments. The first was the

    definition of, and agreement on, global development goals. This process

    started in the mid-1990s, and culminated in the endorsement of the

    Millennium Development Goals in September 2000 by all 189 United Nations

member states. The significance of this event is that, for the first time ever, all

    development agencies have a common set of results to which they are

    working, and against which their collective performance can be judged. This

focus on results was confirmed at the United Nations Conference on Financing for Development in Monterrey in March 2002, and is matched by a

    broad consensus on development partnership and aid effectiveness. One key

    feature of this consensus is the emergence of the country as the primary unit

    of account.

    1.2 The second development has been the drive to improve public sector

    performance in OECD member states. One response in many countries has

    been the adoption of results-based management (RBM) by public sector

    agencies, including those responsible for development co-operation. OECD

    countries are the major donors to the multilateral development institutions

    (MDIs). It was therefore only a matter of time before the MDIs themselves

    were influenced to embark upon a similar process of reform. This began to

    happen in the late 1990s. References to results and results-based

    approaches have become increasingly common among MDIs as a

    consequence. However, these references often mean different things in

    different institutions.


    Study objectives

    1.3 The purpose of this report is to present a comparative study of the practice of

    results-based management in a sample of UN development agencies and

    multilateral development banks. This was originally intended as background

    to a DFID-sponsored workshop on RBM. Outline conclusions on the value of

RBM as currently practised, and the reforms needed to realise its full

    potential, were expected. In the event, DFID decided not to hold a workshop,

    in part because of the similar World Bank sponsored workshop in June 2002.

    1.4 Eight institutions were originally selected for study. With the agreement of

DFID this was reduced to five:

    -  The United Nations Development Programme (UNDP)

    -  The United Nations Children’s Fund (UNICEF) 

    -  The United Nations Development Fund for Women (UNIFEM) 

    -  The World Bank

    -  The Inter-American Development Bank (IDB) 

1.5 The consultant was asked to document and comment on the following aspects

    of RBM for each institution: the length of experience; changes made over

    time; organisation, effectiveness and timeliness; quality of information;

    commitment of operational staff; use made by management; and the quality

    of reports. This proved to be a hugely ambitious undertaking. RBM is a

    management approach, not a simple technical instrument. There is a huge

    difference between how it is meant to work on paper, how it is said to work,

    and how it actually works. Understanding RBM basically means

    understanding how these institutions are managed, both in head office and in

    the countries where they operate. This was clearly impossible in the time

    available (25 days in total). Each of these institutions would require this much


    time to do them justice. Useful meetings were held with all the institutions

    involved, but these could not really do more than scratch the surface. The

    result is a report that is inevitably more superficial than was originally

    intended, and which concentrates more on generic issues than on

    institutional specifics.

    1.6 The report begins with a discussion of the key terms: results and results-

    based management (section 2). Section 3 contains a brief history of RBM in

    each institution. Sections 4-6 cover the main elements of RBM: planning,

    monitoring, and managing. The report ends with a discussion of the main

    issues in implementing and monitoring RBM.


    2. What is Results-Based Management?

    2.1 Results-based management can mean many different things. The area is

bedevilled by different definitions. What one institution calls an ‘outcome’ is another’s ‘output’, ‘intermediate outcome’, or ‘impact’. Without agreement

    about what exactly RBM is, it is very difficult to assess or monitor its

    implementation. Some discussion of what these words mean is therefore

    required at the outset.

    Results

    2.2 The recent OECD DAC glossary of key terms defines a result as ‘the output,

    outcome or impact of a development intervention’ (Box 1). While this is the

    definition used in this report, it should be noted that this is a broader

    definition than used by some of the leading exponents of results-based

    management. According to the Treasury Board of Canada, a result is ‘the

    end or purpose for which a programme or activity is performed ... and refers

    exclusively to outcomes’.1 ‘Outcome’ in this usage covers both effects and

    impacts - but not outputs - and may be immediate, intermediate or final.

    2.3 This is a potentially important distinction. A key feature of RBM is the

    requirement that managers look beyond inputs, activities and outputs, and

    instead focus on outcomes. Some would argue that to see outputs as results

    is therefore to weaken this fundamental shift in orientation towards

    outcomes. Others argue that outputs are results, and that the important

    feature of RBM is the link between these and changes at outcome level.

    1 Results-Based Management Lexicon. Treasury Board of Canada Secretariat (2002).


    Box 1 – Results 

OECD DAC definitions2:

Result: the output, outcome or impact of a development intervention.

- Output: the products, capital goods and services which result from a development intervention.

- Outcome: the likely or achieved short-term and medium-term effects of an intervention’s outputs.

- Impacts: positive and negative, primary and secondary long-term effects produced by a development intervention.

- Effect: intended or unintended change due directly or indirectly to an intervention.

    2.4 The other key feature of a result is that it should represent attributable

    change resulting from a cause-and-effect relationship. In other words, there

    has to be a reasonable connection, or at least a credible linkage, between

    the specific outcome and the activities and outputs of the agency. If no

    attribution is possible, it is not a result.

    2.5 The accepted way of linking inputs to outcomes, and of demonstrating

    attribution, is via a logical framework or results chain. An example of such a

    results chain is shown below. Inputs are immediately measurable and under

    the control of the MDI. Activities hopefully follow soon after the provision of

    inputs, but are dependent on the commitment and actions of the government

    and other development partners. The outputs, and even more so the

    outcomes, that result are generated after a lag of several years, are subject

    to many exogenous factors, and are only partly attributable to the inputs

    provided by the MDI. Impacts are even more time-lagged, subject to


    multidimensional causation, and are extremely difficult to attribute to one

    MDI.3 

    IMPACTS Lower infant mortality

    OUTCOMES Reduced infection

    OUTPUTS Immunisation coverage

     ACTIVITIES Immunisation programmes

    INPUTS Finance and skills
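Read from the bottom upwards, such a chain can also be held as a simple structured record, with each level carrying its own statement and, where one exists, an indicator. The short sketch below is purely illustrative rather than drawn from any of the institutions reviewed; the output indicator shown is an assumed example.

    # Illustrative sketch only: a minimal way to hold the results chain above
    # as data, with each level carrying a statement and an optional indicator.
    # Attribution to the MDI weakens as one moves up the chain (para 2.5).
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ResultLevel:
        level: str                       # e.g. "INPUTS", "OUTCOMES"
        statement: str                   # what is planned or observed
        indicator: Optional[str] = None  # how progress would be measured

    @dataclass
    class ResultsChain:
        levels: List[ResultLevel] = field(default_factory=list)

        def summary(self) -> str:
            return " -> ".join(f"{l.level}: {l.statement}" for l in self.levels)

    # The immunisation example from the text, lowest level first.
    # The indicator shown is an assumed example, not taken from the report.
    chain = ResultsChain([
        ResultLevel("INPUTS", "Finance and skills"),
        ResultLevel("ACTIVITIES", "Immunisation programmes"),
        ResultLevel("OUTPUTS", "Immunisation coverage", "% of children immunised"),
        ResultLevel("OUTCOMES", "Reduced infection"),
        ResultLevel("IMPACTS", "Lower infant mortality"),
    ])
    print(chain.summary())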

    2.6 It follows that the requirement for describable or measurable attribution

    presents a real challenge for development agencies as they move from

    projects to programmes, and as their focus shifts to shared country and

    global outcomes, as exemplified by the Millennium Development Goals

    (MDGs). The issue of attribution, and its implications for RBM, will be

    returned to later (para.7.5).

    Results-based management

    2.7 The OECD DAC defines results-based management as ‘a management

    strategy focusing on performance and the achievement of outputs, outcomes

and impacts’. This is a wide definition. In addition to the reference to outputs, the definition also mentions performance. In doing so, OECD DAC is not

    implying that RBM is the same as performance management. Performance

    2 Glossary of key terms in evaluation and results based management. OECD DAC (2002)

3 Measuring Outputs and Outcomes in IDA Countries. International Development Association. February 2002.


    should include measures of process and efficiency, not just results. RBM is

     just one, albeit significant, approach to performance management.

    2.8 Another definition of RBM is provided by the Treasury Board of Canada.

    This rightly defines RBM as a comprehensive management approach which

    emphasises outcomes throughout the programming cycle. As will be

    discussed later, RBM implies and requires fundamental changes in

    organisational culture and incentives.

    Box 2: Results-based management 

A comprehensive, life-cycle approach to management that integrates business strategy, people, processes and measurement to improve decision-making and drive change. The approach focuses on getting the right design early in the process, implementing performance measurement, learning and changing, and reporting performance.

2.9 The application of RBM varies from country to country, and from agency to agency. However, there are four core elements to most RBM approaches4:

1. Strategic planning: defining clear and measurable results and indicators, based on a logic model or framework.

2. Monitoring: measuring and describing progress towards results, and the resources consumed, using appropriate indicators.

3. Reporting, internally and externally, on progress towards results.

4. Managing: using results information (and evaluation) for lesson-learning and management decision-making.

    4 A much fuller discussion can be found in OECD DAC (2001) op cit


    2.10 The experience and thinking of the five multilateral institutions with respect

    to these four elements is considered below, having first briefly outlined the

    history of results-based approaches in each.


    3. The history of results-based approaches

    3.1 This section documents the history of results-based approaches in the five

    institutions reviewed: UNDP, the World Bank Group, Inter-American

    Development Bank (IDB), UNICEF and UNIFEM. The practice of RBM

    needs to be considered at three main levels5 :

    •  Project

    •  Country 

    •  Corporate 

    3.2 In the context of development co-operation, RBM at the project level has the

    longest history and is most well documented6. Work on introducing RBM at

    country and corporate level is much more recent. It is at these levels where

    the real challenge for RBM lies. This report will accordingly concentrate on

    RBM at country and corporate level.

    3.3 This does not mean that RBM at project level should be ignored, for two

    reasons. First, despite the shift to a non-project development paradigm,

    projects still dominate the aid landscape7. Second, RBM is most applicable,

    and least problematic, at the project level. Despite this, the application of

    RBM and logical frameworks to projects has not been particularly

    successful. The limited success of RBM in the much simpler environment of

    projects should, at the very least, give pause for thought. This issue is

    discussed further below (section 7).

5 RBM is also applicable at a fourth, cross-cutting, level: the sector.
6 OECD DAC (2001) op cit

7 ‘Development Cooperation and performance evaluation: the Monterrey challenge’. OED, World Bank Working Paper. June 2002. See also DFID DER.


    3.4 It is important to emphasise that the degree to which RBM has been applied,

    or is claimed to be, is not necessarily correlated with effectiveness. The fact

    that most of the institutions have not yet adopted and implemented RBM in a

    formal sense does not mean that they are not implementing parts of the

    approach at some levels. It certainly does not mean that they are not

    producing development results.

    United Nations Development Programme (UNDP)

    3.5 UNDP has made the strongest commitment to RBM. It is the only institution

    of the five to have begun to implement RBM as an organising principle at all

    levels, and is the most advanced of all the UN agencies. Further advances

    have been made since the information on which this section is based was

    collected. 8 

    3.6 UNDP’s advanced status has two origins. The first was the pressure of

    declining core funds in the 1990s. UNDP knew that it had to change if it was

    to recover the confidence of the donor community. In 1997 UNDP initiated a

    set of change management processes, known as UNDP 2001. The UNDP

    change process emphasised, among other things, the need for the

    organisation to become more results-orientated9.

    3.7 In parallel, UNDP’s Evaluation Office (EO) had been working on developing

    results-based monitoring and evaluation policies, methodologies and tools.

    In 1997 EO commissioned a joint study with SIDA on results management10,

    and produced a handbook on results-orientated monitoring and evaluation

    for programme managers11. In 1998 EO was given lead responsibility for

    developing a framework for the measurement and assessment of

8 In a response to a draft version of this report, UNDP stated that this report does not take account of the many, more recent advancements UNDP has made in internalising RBM.
9 Annual Report of the Administrator for 1997. UNDP (1998)
10 Measuring and Managing Results. Poate, D. (1997)


    programme results. This step initiated the introduction of RBM in UNDP12 

    and led to the Multi-Year Funding Framework (MYFF) in 1999. The MYFF

    was a four-year funding framework (2000-03) encompassing a Strategic

    Results Framework and a resource framework that integrated all financial

    allocations.

    3.8 Since then, UNDP has been working to ensure that “assessing and reporting

    on results is not a minority preoccupation but a way of doing business for the

    organisation as a whole”.13 Having been piloted in ten countries, RBM was

    introduced worldwide in only one year, with the first Results-Orientated

     Annual Report (ROAR) produced in 1999. Strategic choices were made to

    learn from others; to learn by doing; to tailor RBM to UNDP; to keep the

    system as simple as possible; not to over-invest in indicators; and to

    manage for (not by) results. The result is an approach that is still being

    adapted, but which has been mainstreamed throughout the organisation and

    its instruments. The next generation of RBM software is currently being

    introduced.

Box 3: UNDP’s Results-Based Management System

    Planning Instruments:

    •  Strategic Results Framework

    •  Integrated Results Framework

    •  Multi-Year Funding Framework

    •  Country Office Management Plan

    Reporting Instruments:

    •  Results-Orientated Annual Report

    •  Multi-Year Funding Framework Report

    •  Country Office Management Plan Report

    11 Results-orientated Monitoring and Evaluation: a Handbook for Programme Managers. UNDP (1997)

12 Results Based Management – Overview and General Principles. UNDP.
13 The Multi-Year Funding Framework. UNDP (1998)


    United Nations Development Fund for Women (UNIFEM)

    3.9 The history of results-based approaches within UNIFEM was not easy to

    discern on the basis of published documents and a single interview. In

    common with all the institutions in this study, the notion of results is not new

    to UNIFEM. In the Consultative Committee (CC) Report for 1997 UNIFEM

    reported on the introduction of RBM concepts into its programme. This work

    was initiated with support from the Canadian Government. UNIFEM’s

    Strategy and Business Plan (SBP) for 1997-99 also clearly listed the results

    that were to be achieved, and the SBP for 2000-03 includes a results

    framework which lists expected outcomes and indicators.

    3.10 Since 1998 the CC report has used a results orientated format for reporting

    against the SBP. By virtue of its close association with UNDP, UNIFEM was

    influenced by the UNDP 2001 change process and by the introduction of

    RBM in that organisation. UNIFEM uses the Results and Competency

     Assessment developed by UNDP, and has an interface with the UNDP

    ROAR. However, UNIFEM has also been exploring, and been influenced by,

    the RBM approaches of other multilateral and bilateral agencies.

    3.11 According to the recent Report of the Executive Director, each of UNIFEM’s

    three programming objectives “is measured and driven by a results-based

framework designed to create a learning and knowledge based institution”.14 However, it is acknowledged in the same report that “new monitoring and

    evaluation mechanisms are needed, with greater focus on assessing

    progress towards results than completion of activities”. Thus, while UNIFEM

    has certainly become more results-orientated since 1997, the introduction of

    results-based management tools and internal support has some way to go.

    14 UNIFEM – Report of the Executive Director. September 2002.


    United Nations Children’s Fund (UNICEF)

    3.12 UNICEF is proof that a results-orientation is not necessarily new. While the

    use of the term RBM may be new, UNICEF has been practising large parts

    of the approach for at least twenty years. One of the best examples was the

    child survival campaign launched in 1982. By insisting on strategic action,

    measurable results, and clear accountability, the then Director of UNICEF

    (J.P.Grant) spearheaded extraordinary improvements in child survival and

    development over the following decade.

    3.13 Over the last few years UNICEF has recognised the need to define more

clearly the results it seeks to achieve. In 1996, a new Mission Statement was approved. This was followed by a Medium-Term Plan (MTP) for 1998-2001.

     Although containing a statement of priorities, these were numerous and

    wide-ranging, and were not mainstreamed within UNICEF. The MTP also

    lacked clearly defined targets against which to measure achievement.

    Significant progress was nevertheless made over the MTP period in

    achieving a stronger results-focus in programming and reporting, and in

    moving towards a more strategic approach.

    3.14 In 2000, UNICEF produced a Multi-Year Funding Framework. This was seen

    as an opportunity to strengthen results-based management within the

    organisation. Analytical reporting on results linked to objectives and budget

    was identified as a core element of the framework. The Executive Director’s

     Annual Report in the same year was the first to use a results-based format.

    3.15 Most recently, UNICEF has produced a Medium-Term Strategic Plan

    (MTSP) for 2002-2005, with results-based management as one of its guiding

    principles. This represents a clear shift towards results-based programming

    that goes beyond identifying broad goals and requires that specific results

    for children be identified, measured regularly, and systematically reported.


    UNICEF also recognises that its evaluation function needed strengthening15.

     As the MTSP put it:

    “UNICEF must establish its organisational priorities, define objectives, define

the criteria of success for its work, strive to achieve its objectives, systematically monitor progress (or lack of it) and evaluate its work so it may learn how to maintain relevance, effectiveness and efficiency: this is results-based management”.16

     

    The World Bank

    3.16 The World Bank has been working to increase its results orientation for the

    past ten years. In 1992, the World Bank was criticized by the Wapenhans

    Report for giving more attention to the quantity of its lending than to its

    quality: a product of the so-called “approval culture”. The World Bank

    responded with a concerted effort to improve its focus on quality and results.

    In 1993 the World Bank issued a new plan entitled “Getting Results: the

    World Bank‘s Agenda for Development Effectiveness” and initiated the “Next

    Steps” reform programme. In 1994 “Learning from the Past, Embracing the

    Future” was published, with a ’results orientation’ as one of its six guiding

    principles.

    3.17 This was followed by the “Renewal Process” in 1996. A raft of general and

    sector-specific performance monitoring indicators, and the logical

    framework, were introduced. In the same year, the Quality Assurance Group

    (QUAG) was established to improve, and allow management to keep track

    of, project design (quality-at-entry) and supervision. This added a significant

    quality element to the traditional measures of lending approvals (number and

    amount), project performance (projects at risk), and ratings of closed

    projects (outcome, sustainability, and institutional development impact).

15 Report on the Evaluation Function in the Context of the Medium-Term Strategic Plan. UNICEF (2002)
16 Medium-Term Strategic Plan for the Period 2002-2005. UNICEF (2001)


    3.18 1997 saw the launch of the “Strategic Compact”. The Compact aimed to

    make the World Bank “more effective and efficient in achieving its main

    mission - reducing poverty” and included a commitment to “building a

    performance assessment system and to making management more

    performance based”. This led to further improvements in performance

    measurement and management, and to some increase in results-orientation.

    For example, the 1998 Annual Report on Operations Evaluation (AROE)

    concluded that while RBM had not been formally adopted – as had been

    recommended by the AROE in 1997 - “operations are moving in that

    direction”.

    3.19 A similar judgement was made in 2001. The Strategy Update Paper

    summarised the situation in the following way :

“We also are much more explicitly focusing on results, particularly on how we can better measure, monitor and manage to achieve them. We have come a long way in developing measures of operational inputs and their quality, and these have helped us to make a steady improvement in Bank performance over the last several years. We now need to ratchet up our results focus, doing more to measure and explain how our work makes a difference in terms of country outcomes.”17

    3.20 Recent IDA-13 and Monterrey discussions have given renewed impetus to

    the search for better ways of monitoring country outputs and the contribution

    to country outcomes. Most of the improvements in the 1990s were aimed at

    improving the quality of the design, implementation and monitoring of

    projects. The World Bank accepts that more needs be done to increase its

    results orientation, particularly in areas other than projects. Improvements

    are planned in the planning and monitoring of country programmes, as well

    as for sector and thematic strategies. Further work on implementing the

17 Strategy Update Paper for FY03-05: Implementing the World Bank’s Strategic Framework. Executive Summary p.i. March 2002.


    results agenda with respect to corporate reporting, staff incentives and

    training, and risk management is also underway.18 

    Inter-American Development Bank (IDB)

    3.21 The IDB has not experienced the same level of external pressure for reform

    and results, and the associated permanent management revolution, which

    has characterised the World Bank over the last decade. However, concern

    about the results-focus of the IDB has followed a broadly similar history19.

    3.22 As with the World Bank, recent efforts to increase the results-focus of the

    IDB originated from a critical review of the Bank’s portfolio. In 1993 the Task

    Force on Portfolio Management (TAPOMA) found that the focus on the initial

    approval of projects and the subsequent control of execution took the focus

    away from managing for development results. It concluded that a concern for

    results needed to be paramount.

    3.23 The IDB Board and management endorsed this shift of focus and responded

    in the mid-1990s with a series of improvements to the way projects were

    designed and monitored. The overall aim was to promote “a results-

    orientated dialogue among Bank staff, executing agencies, and national

    counterparts” and an increased results-focus in project design, monitoring

    and reporting. Improvements included the requirement for logical

    frameworks, impact indicators, and project completion reports based on data

    on the outcomes or impacts. The new US Administration’s emphasis on

results throughout 2001, and internal changes in the Office of Evaluation and

    Oversight, gave fresh momentum to RBM within IDB.

18 Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. September 2002.
19 This section draws extensively on the Development Effectiveness Report. Office of Evaluation and Oversight. IDB. February 2002.


    3.24 Despite the increasing commitment of management to results, IDB accepts

    that it has some way to go. A recent report by the Office of Evaluation and

    Oversight concluded that IDB projects “are still not being designed and

    monitored so as to transparently demonstrate development results”. Further

    improvements to project and country results frameworks are under

    consideration, as are changes to the incentive framework to help sharpen

    the IDB’s focus on results and development effectiveness.20 

    20 Development Effectiveness at the IDB. Paper for the Board of Executive Directors. January 2002.


    4. Strategic planning

    4.1 This section considers the extent to which strategic planning at the

country and corporate level has a results-focus. For institutions that are implementing RBM, strategic planning should be about planning to

    achieve outcomes: management for results. Plans should contain clear,

    realistic and attributable results; defined indicators specifying exactly what

    will be achieved by when; a results chain or logic model linking inputs,

    activities, outputs and outcomes; and a strategy or strategies explaining

    how and why inputs will lead to outcomes, including a discussion of risk.

    Country-level planning

    4.2 All the institutions are, to a greater or lesser extent, struggling with three

    challenges. First, to align their programmes more explicitly to the

    country’s own plans, such as the Poverty Reduction Strategy Paper

    (PRSP). Second, to raise the sights of their programmes from the project

    level to country level. And third, to define better country-level results

    frameworks.

    4.3 Annex A contains a summary assessment of country planning documents.

    While most of the institutions now specify country-level results of some

    sort, none of the institutions have developed logical frameworks for

    country programmes as a whole. UNICEF comes closest with its

    programme-level Integrated Monitoring and Evaluation Plan (para.4.12).

    Unlike the others, UNIFEM plans regionally and sub-regionally rather than

    at country level. The regional and sub-regional programmes are

    developed within the framework of the Regional SBP. Logical frameworks

    are a requirement at the programme level.


    4.4 The 2002 Development Effectiveness Report (DER) provides a frank

    assessment of country-level planning in the IDB. It found that, “with few

    exceptions, Bank programming does not establish ex-ante any specific

    results that it is seeking to obtain in working with an individual country”.

    IDB country programmes described project-level outputs rather than

    country level outcomes. The one area of activity where the IDB had

    anticipated outcomes was structural reform. The experience in this area

    shows very clearly the importance of an outcome-focus. IDB projects

    have been “very successful in producing the output of reform, but these

    reforms did not produce the outcome of growth in productivity”.

    4.5 The three Country Papers reviewed21 support this conclusion. The

    Strategy Matrix is not a logical framework. Overall objectives are stated,

    but these are very general and are not accompanied by any indicators or

    targets. The Bank’s strategy then consists of priority areas, activities or

    focuses under each objective, often referencing specific IDB programmes.

    The performance benchmarks for the strategy are a mix of selected

    program outputs and country outcomes. Examples from the Country

    Paper for Chile are contained in Box 4. In all the Country Papers

    reviewed, the link between strategy outputs and country outcomes is not

    specified.

    Box 4 : IDB Strategy Matrix – examples

Objective: Poverty reduction, human capital formation and social inclusion
Strategy: Improvements in execution of preschool education programs (Early Childcare Program)
Performance benchmark: Recovery of net enrollment ratios in rural primary school to at least 87% by the end of 2003

    4.6 Recent IDB guidance recognises the importance of including in Country

    Papers (and distinguishing between) indicators that can be used to

    monitor progress specific to the Bank’s programme (ie. outputs), as well

    21 Brazil (2000), Ecuador (2001), and Chile (2001).


    as the Bank’s contribution to country outcomes22. This is very much work

    in progress, but one idea is to juxtapose IDB programme outputs with

    associated country outcome targets.

    4.7 Better anchoring of the Country Assistance Strategy (CAS) in the

    country’s specific priorities and objectives is central to the World Bank’s 

    increased focus on results at country level. According to the Annual

    Review of Development Effectiveness (ARDE) for 2001, results would be

    improved if CASs included a logical framework (and results chains) linking

    Bank instruments with country objectives. In 2002 the World Bank

    reported that ‘results-based CASs’ are to be piloted in several countries.

    These will identify country outcomes (from the PRSP or similar) to which

    the World Bank will contribute, along with intermediate indicators linked to

    particular products and services that the Bank will provide23.

    4.8 Current World Bank CASs24 are similar to the IDB Country Papers. Most

    include a Country Program Matrix detailing the main objectives or

    priorities, and the Country Strategy/Key Actions. Progress benchmarks or

targets for each main strategy/action are given, but these refer to country

    outcomes rather than Bank outcomes or outputs. As with the IDB, country

    strategies do not yet contain clear results frameworks or chains, nor

    “clear, monitorable indicators for evaluating the development

    effectiveness of the Bank program”25. This is not to imply that this is easy

    to do. As recognised in an IDA-13 paper, part of the answer may lie in the

    identification of early indicators of output performance which have good

eventual linkages to country outcome objectives.26

22 Country Paper Guidelines. IDB. February 2002
23 World Bank (2002) op cit, p. 10
24 Pakistan (2002), Chile (2002), and Belarus (2002).
25 Ten Features of a Good CAS. http://www.worldbank.org/html/pic/cas/tenfeat.htm
26 Measuring Outputs and Outcomes in IDA Countries. IDA. February 2002.


    4.9 Recent UNDP Country Programme Outlines (CPOs) include a results and

    resources framework.27 This lists intended outcomes and outputs (with

    indicators) within ‘strategic areas of support’. Examples from the Malaysia

    CPO are contained in Box 5. Note that the UNDP outcomes are lower

    level outcomes (less ambitious and more attributable) than those

    specified by UNICEF, IDB or the World Bank.

    Box 5 : UNDP results framework – examples

Strategic area of support: Sustainable human development
Intended outcome: National policies more effectively address the social impact of economic liberalisation
Indicator of outcome: Explicit analyses of the impact of global liberalisation on human resources development integrated in key national plans and policies
Output: Increased capacity to assess and predict human development needs and to monitor in relation to competitiveness

    4.10 This is an advance on earlier UNDP Country Cooperation

    Frameworks (CCFs). These had merely listed the areas of support and

    made no mention of results.28 More recent CCFs had listed ‘key results’

    under each strategic area of support, but had not distinguished between

    outcomes and outputs, nor included indicators29.

    4.11 UNICEF Country Notes (CNs) describe overall objectives for the 5-

    year programme, plus specific objectives for each programme (eg. to

    reduce infant and child mortality by 25%). No overall results framework is

presented for the country programme. However, a very detailed Integrated Monitoring and Evaluation Plan (IMEP) is then developed for

    each of the constituent programmes (eg. health, early education, etc.).

27 India CPO (2002); Malaysia CPO (2002). UNDP consider that the Malaysia CPO is not a good example from the RBM perspective.
28 Mongolia CCF (1997)
29 Malawi CCF (2001)


    Examples from the Health Programme IMEP for Malawi (2002-06) are

    contained in Box 6.

    Box 6 : UNICEF Integrated Monitoring and Evaluation Plan – examples

Overall objective: To create a conducive environment to realise rights to survival, development, protection and participation of children and women.
Program objective: To eliminate or decrease the major killers of children in UNICEF impact areas
Specific objective: To improve access to, and the quality of, healthcare at health facilities
Output: Health workers at health facilities trained in IMCI case management and obstetrical care
Baseline: 10%
Target: 80%
Critical assumption: Adequate number of qualified staff available

    Corporate planning

    4.12 There is a tension between corporate and country level objectives.

     Allowing priorities to be set at country level reduces the extent to which

    institutions can develop corporate-level results frameworks. In most cases

    this tension is resolved by restricting corporate planning to the definition of

    broad goals, priorities and principles. Few institutions have attempted to

    develop ex ante results frameworks at corporate level. A summary of the

    corporate plans for the five institutions is contained at Annex B.

    4.13 UNDP is well aware of the tension between top-down and bottom-

    up planning, but has gone further than any of the other institutions in

    determining a corporate results framework. The Strategic Results

    Framework (SRF) for 2000-03 lists 7 goals, 24 sub-goals, 142 outcomes

    (with indicators), and 84 ‘strategic areas of support’. Box 7 contains

    examples from the SRF.


    Box 7 : UNDP Strategic Results Framework – examples

Goal: To create an enabling environment for sustainable human development.
Sub-goal: Strengthen capacity of key governance institutions for people-centred development and foster social cohesion
Strategic area of support: Reform and strengthen the system of justice, including legal structures and procedures
Intended outcome: Independent and efficient system of justice, accessible to all strata of the population, in particular the poor.
Indicator: Number of countries in which there has been a decrease in time required for disposal of civil and criminal court cases
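The nesting that the SRF implies (goals containing sub-goals, which in turn contain outcomes with their indicators) can be pictured as a simple hierarchy. The sketch below is a hypothetical illustration using the Box 7 example; it is not UNDP’s own system, and the roll-up shown is included only to indicate how totals such as the SRF’s 142 outcomes can be derived.

    # Illustrative sketch only: a hypothetical representation of a corporate
    # Strategic Results Framework, with outcomes (and their indicators)
    # nested under sub-goals and goals.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Outcome:
        statement: str
        indicators: List[str] = field(default_factory=list)

    @dataclass
    class SubGoal:
        statement: str
        outcomes: List[Outcome] = field(default_factory=list)

    @dataclass
    class Goal:
        statement: str
        sub_goals: List[SubGoal] = field(default_factory=list)

        def count_outcomes(self) -> int:
            # Roll-up across sub-goals, e.g. to reproduce a framework-wide total.
            return sum(len(sg.outcomes) for sg in self.sub_goals)

    # The Box 7 example expressed in this structure.
    governance = Goal(
        "Create an enabling environment for sustainable human development",
        [SubGoal(
            "Strengthen capacity of key governance institutions for "
            "people-centred development and foster social cohesion",
            [Outcome(
                "Independent and efficient system of justice, accessible to all",
                ["Number of countries with a decrease in time required for "
                 "disposal of civil and criminal court cases"],
            )],
        )],
    )
    print(governance.count_outcomes())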

    4.14 In practice, only the goals, sub-goals and strategic areas of support

are used to guide country-level programming. Outcomes and outputs are

    determined at country level. The utility of the corporate level outcomes

    within the SRF is therefore unclear.

    4.15 UNIFEM’s Strategy and Business Plan for 2000-03 followed a

    similar structure to that of the UNDP SRF, but without the sub-goals. 120

    outcomes and indicators were listed. As with UNDP, no means of

    verification were given for the indicators. An example is given below.

    Box 8 : UNIFEM Strategy and Business Plan – examples

Objective: Increase options and opportunities for women.
Thematic area: Economic empowerment and rights
Strategic area of support: Strengthening women’s economic capacity, rights and sustainable livelihoods as entrepreneurs, producers and home-based workers
Intended outcome: Reduction in the number of women in poverty through participation in viable economic activities
Indicator: Number of small and medium scale enterprises owned by women

     


4.16 UNIFEM has subsequently revised and strengthened its results

    framework. In 2001 UNIFEM introduced a results indicators framework. In

    2002, it consolidated the numerous outcomes of the SBP into a more

    logical and focused Outcome Framework with indicators and suggested

    means of verification. This reduced the number of outcomes to 48.30 

    4.17 The UNICEF Medium Term Strategic Framework lays out 5

    ‘organisational priorities’ (plus targets and indicators) and 89 ‘core

    intervention areas’. Each of the organisational priorities is related to

    relevant long-term international goals, such as the MDGs. Examples from

    the MTSF are given below.

Box 9 : UNICEF Medium Term Strategic Plan – examples

Organisational priority: Fighting HIV/AIDS
Long-term international goals: UN Special Session on HIV/AIDS Declaration of Commitment
MTSP target: By 2005 ensure that national policies, strategies and action plans are under implementation to prevent parent-to-child transmission of HIV in all countries affected by HIV/AIDS
Indicator: Number of countries with national strategies and action plans under implementation

    4.18 The IDB and World Bank have not yet attempted to develop

    corporate plans to this level of detail. The IDB Institutional Strategy 

    merely sets out four priority areas31. Two overarching objectives – poverty

    reduction and social equity, and environmentally sustainable growth –

were added after a long debate. The Bank’s contribution to these objectives is to be measured through its contribution to country level

30 How are we doing? Tracking UNIFEM progress in achieving results for management and learning. Briefing Note. UNIFEM (2002)
31 Renewing the Commitment to Development: Report of the Working Group on Institutional Strategy. IDB (1999)


    outputs and outcomes. Sector strategies are in the process of being

    finalised.

    4.19 The World Bank takes a similarly minimalist approach to corporate

planning. The Strategic Framework Paper acknowledges that the MDGs

    frame the World Bank’s strategy and provide a results-based framework

    for the international community.32 However, no attempt is made to specify

    global outcomes or outputs for the Bank. Rather, the aim is to maximise

    the impact on poverty reduction through greater selectivity within

    countries, across countries and in global programmes. The main focus of

    planning and activity will remain at country level, but with strong corporate

    guidance on principles and practice. As the Strategic Framework

    observed, “given the tension between ‘bottom-up’ country driven needs

    and more ‘top-down’ imperatives, this is inevitably a difficult and iterative

    process".33

     

    4.20 This demonstrates the main difference between the UN agencies

    and the multilateral development banks (MDBs). All the UN agencies

    have, to a greater or lesser extent, defined global goals and outcomes.

    The challenge for all of them will be to show that these are monitorable

    and attributable. The MDBs have (so far) avoided global results

    frameworks, and have instead concentrated on strategy in broad support

    of the MDGs. This is now changing as all institutions feel the pressure to

    deliver and demonstrate results at the country and global level.

32 Strategic Framework. WBG, January 2001.
33 Strategic Framework. World Bank Group. January 2001, p. 7


    5. Monitoring and reporting

    5.1 This section should be as much about monitoring as about reporting. Not

everything that is monitored is reported, or needs to be. However, the limited duration of this study meant that little information could be collected on

    monitoring per se. Time constraints also meant that no country-level reports

    were examined.

    5.2 The distinctions between monitoring and reporting, and between internal and

    external reporting, are important. Many institutions are under pressure to

    report externally on results. While this is important for accountability, internal

    reporting to management, and monitoring more generally, are arguably at

    least as important. RBM is intended to improve both management

    effectiveness and accountability.

    5.3 It is also important to stress that accountability for results implies more than

     just reporting results. Many results (eg. outcomes) will not be attributable to

    a single institution. Because of this, reporting needs to demonstrate several

    things :

    i. that the agency is managing for outcomes, not just activities and outputs. 

    ii. that improved outcomes are being achieved; 

    iii. that the agency has contributed to these outcomes; 

iv. that the design and implementation of the results strategy is sound and effective;

v. that the results over which the MDI has a significant degree of control, and is aiming for, are being achieved.


     Annual reporting should concentrate on short-term results that show meaningful

    change over the reporting period; are attributable to the interventions being

    supported; and bear a significant relationship to longer-term objectives. 34

     

    5.4 None of the reports reviewed yet approach this standard. Most concentrate

    on the second and last task – reporting on outputs and outcomes – but

without analysing either the strength of the link between the two or the

    effectiveness of the management strategy.

    5.5 The UNDP Results Orientated Annual Report (ROAR) represents the most

    ambitious and comprehensive corporate results report. The third ROAR

    (2001) presents key findings for each of the six SRF goals, together with in-

    depth analysis of three selected sub-goals. Aggregated global figures for the

    percentage of annual outputs fully or partially achieved, and the percentage

    of outcomes where there was positive progress, are presented in the text,

    together with the number or percentage of country offices active in each

    area. Comparative figures are sometimes given for achievements in the

    previous year. One of the general observations made is that “there is still a

    sizeable gap ... between impressive results at the output levels achieved

    within each goal and their contribution to realising outcomes”.35 

    5.6 The ROAR process includes an independent assessment of the extent to

    which the self-reported results from the country offices are accurate and

    complete. 71% of progress statements were fully verified, and a further 9%

    were partially verified. The verification did not extend to the degree to which

    UNDP outputs contributed to progress at the outcome level. The ROAR

    does not make any claim that it is solely responsible for such progress, but

    simply reports changes in outcomes that are “clearly linked” to UNDP

    support.
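The global figures quoted in the ROAR are, in essence, a roll-up of country-office self-ratings. The sketch below is purely illustrative: the country names, ratings and categories are invented and it is not UNDP’s reporting software; it simply shows the arithmetic involved in aggregating such ratings into the percentages reported.

    # Illustrative sketch only: aggregating invented country-office
    # self-ratings into global percentages of the kind quoted in the ROAR.
    from collections import Counter

    office_reports = {
        "Country A": {"outputs": ["achieved", "partial", "not achieved"],
                      "outcomes": ["positive", "no change"]},
        "Country B": {"outputs": ["achieved", "achieved"],
                      "outcomes": ["positive"]},
        "Country C": {"outputs": ["partial"],
                      "outcomes": ["no change"]},
    }

    def percentage(counts: Counter, keys: tuple) -> float:
        # Share of ratings falling into the given categories.
        total = sum(counts.values())
        return 100.0 * sum(counts[k] for k in keys) / total if total else 0.0

    output_ratings = Counter(r for rep in office_reports.values() for r in rep["outputs"])
    outcome_ratings = Counter(r for rep in office_reports.values() for r in rep["outcomes"])

    print(f"Outputs fully or partially achieved: "
          f"{percentage(output_ratings, ('achieved', 'partial')):.0f}%")
    print(f"Outcomes with positive progress: "
          f"{percentage(outcome_ratings, ('positive',)):.0f}%")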

34 Results-Based Management and Accountability for Enhanced Aid Effectiveness. A Reference Paper. CIDA Policy Branch. July 2002.
35 Results-Orientated Annual Report, 2001. UNDP (2002), p.2.


    5.7 While the ROAR is clearly a great advance on previous reporting, it lacks

    transparency in two respects. First, there is no single table showing

    coverage and achievements by goal and sub-goal for 1999, 2000, and 2001.

    It would be possible to largely create such a table by extracting the figures

    for 2000 and 2001 from the 67 pages of text, but the fact that the data is not

    presented in an accessible format is strange. The ROAR badly needs a

    straightforward summary. Second, although activities by goal and country

    are tabulated in an annex, there is no presentation of the achievement by

    outputs and outcome for each country office. This is a deliberate decision36,

    and may reflect a judgement that country-specific results would be

    misleading given the wide variation in results, projects and countries.

    5.8 For the last two years the Evaluation Office of UNDP has prepared a

    Development Effectiveness Report (DER). This is largely based on

    independent evaluation studies, and complements the ROAR by providing

    summary findings on the impact and sustainability of UNDP interventions at

    project and country level. The relative paucity of empirical data on the

    development impact of UNDP’s assistance was noted in both of the last

    DERs.

    5.9 UNICEF has used a results matrix to report on its Medium-Term Plan (MTP)

    since 1999. The results cited are a mix of global outcomes to which UNICEF

made some contribution and descriptions of what UNICEF has supported (ie.

    activities). There is no assessment of what UNICEF has directly achieved in

    terms of outputs, either against what was planned for the year in question or

    over the four years of the plan. A comparison of the results matrix for 1999

    and 2001 does not allow any conclusion to be drawn as to whether UNICEF

    is more or less effective than it was, or whether its contribution is growing or

    36 Results Based Management – Concepts and Methodology. UNDP Technical Note (2000) p.17.


    shrinking. The better definition of intended results in the MTSP for 2002-05

    is likely to lead to improved reporting.

5.10 UNIFEM’s Strategy and Business Plan (SBP) for 1997-99 included a

    detailed list of activities under each objective. The new SBP for 2000-03

    included a report on the previous plan that lists the specific and general

    results achieved. However, it is not possible to match the results with the

    activities originally listed in the SBP for 1997-99.

    5.11 The annual Report of the Executive Director mentions that “implementation

    of each of [the SBP] objectives is measured and driven by a results-based

    framework”, but does not report against the intended outcomes listed in the

    SBP for 2000-03.37 This is done in the Consultative Committee Report,

    which is an annual report of results against the objectives of the SBP, based

    on data from the 6-month and annual reports submitted from each Sub-

    Regional Office.

    5.12 The IDB prepares an Annual Report on Projects in Execution (ARPE) for the

    Board of Executive Directors. This provides detailed information on the

    status and performance of the Bank’s portfolio, including an assessment of

    the extent to which ongoing projects in each country are likely to achieve

    their development objectives. In addition, the ARPE provides an assessment

of trends and challenges and of the issues affecting portfolio performance, and

    notes the Bank’s response to these challenges. In the last two years the

    report has contained an analysis of the quality and compliance rate of

    Project Completion Reports (PCRs), has provided information on good

    practices noted, and highlighted lessons learned from both the Bank and

    Borrowers.

    37 UNIFEM Report of the Executive Director. Executive Board of UNDP. September 2002.


    5.13 The Bank is in the process of revamping the PCR and the Project

    Performance Monitoring Report (PPMR). The PPMR has been modified to

    include historical project ratings, as well as greater attention to financial and

    sustainability issues and lessons learned, and will be linked to other relevant

    reports and monitoring systems. The last PPMR will also serve as a key

    input for the preparation of the PCR, which will focus more on results and

    comply with OECD/DAC guidelines for MDBs. It will include an evaluation of

    both the Bank and Borrower performance, an assessment of the project’s

    contribution to institutional development, and an outlook on expectations

    regarding the project’s ability to deliver benefits in the medium and long-

    term.

    5.14 Like UNDP, IDB’s Office of Evaluation and Oversight (OVE) has also

    produced a Development Effectiveness Report. However, unlike in UNDP,

    OVE is independent of management. One of the findings reported in the IDB

    DER was that, for completed projects rated as highly likely to achieve their

    development objectives, the majority of PCRs only discuss project outputs.

     Although it is quite likely that all the projects made some contribution to

    outcomes and impacts, this was very rarely documented in the PCR38.

5.15 According to the Operations Evaluation Department (OED), monitoring and

    evaluation (M&E) in World Bank operations has been “chronically deficient”.

    The Annual Report on Operations Evaluation (AROE) for 2000-01 went on to

    say that, “despite indications of increasing operational quality and project

    performance, the Bank does not have a solid foundation to convincingly

    demonstrate results on the ground”.39

     According to one source, the Bank is

    still ‘years away from the systematic measurement of results’.

38 Development Effectiveness Report. IDB (2002) pp.29-31.
    39 Annual Report on Operations Evaluation 2000-01. OED, World Bank (2002) p.18.


    5.16 Part of the problem lies in the lack of monitorable outcomes in Country

     Assistance Strategies (CAS), Sector Strategy Papers, and Project Appraisal

    Documents. These and other problems have been the subject of a

    comprehensive M&E action plan since 1999. Further improvements in M&E

     – such as a CAS completion report – are underway. The methodological

    challenges associated with measuring and attributing results are also very

    real.

    5.17 The Annual Review of Portfolio Performance (ARPP) produced by the

    Quality Assurance Group (QAG) is the Bank’s primary operational

    monitoring tool. At present this focuses on design and supervision quality,

    rather than results. However, there are plans to broaden the ARPP into an

     Annual Report on Portfolio Performance and Results. Subject to a

    satisfactory solution to the problem of aggregation, there are also plans for

    units to report annually on “outputs and outcomes related to real-time

    actions”, but not “program and country outcomes that will be realised only

    after long and variable lags”40.

    5.18 The OED Annual Review of Development Effectiveness (ARDE) reports on

    the ‘outcomes’, sustainability and institutional development impact of

    completed projects, as well as providing a summary of country and sector

    evaluations. It should be noted that the term ‘outcome’ in this context refers

    to the extent to which the project’s relevant development objectives have

    been (or are expected to be) achieved. These will be a mix of outcomes

    (intermediate objectives such as skills and organisational capacity) and

    impacts (long-term goals such as human and social development).

    5.19 The ARDE provides a reliable measure of the extent to which completed

    Bank projects are producing relevant results. Because OED has used a

40 Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.11.


    consistent methodology over the past few years, it also allows trends in

    project performance to be monitored. What it does not attempt to do is to

    quantify the specific results achieved or assess the contribution towards

    higher goals, such as the MDGs. In this sense it is more of an aggregation of

results ratings than of results themselves. This is a practical solution to the problem

    of aggregating across diverse results.

    5.20 The World Bank acknowledges that there is scope to improve its reporting

    on its results. The assessment of the Strategic Compact found that the

    corporate scorecard was still incomplete, in part because of a lack of an

    agreed methodology. Limited progress has been made on agreeing ways of

    measuring and monitoring the impact of World Bank actions at country and

    sector level. This missing ‘second tier’ of the corporate scorecard is intended

    to link internal bank measures (such as product quantity and quality) with the

    International Development Goals (IDGs). The Strategic Framework

    produced in 2001 also highlighted the need to link country and sector work

    with the IDGs, but was not able to say how this would be done.


    6. Managing

“...there is sufficient evidence that the key elements are well known to donors and carried out to some extent. But in so many instances they have failed owing to weaknesses in how the systems are used rather than what its components are. They reflect the missing link between the measurement procedures and the way in which the information is used – the management process”41

     

    6.1 There are two primary uses, and motivations, for results information. The first

    is for accountability: to demonstrate effectiveness to others. This aspect was

    covered in the previous section. The second is to provide continuous

    feedback and learning for management. To what extent are these institutions

    really managing for outcomes? To what extent are they using information on

    outputs and outcomes, and from evaluation, in decision making?

    6.2 These are difficult questions to answer, particularly in this type of study. The

    potential uses of results information extend throughout the institution, from

    planning and budgeting to staff appraisal. The observations below are drawn

    from a small number of interviews, and from the few reports that address this

    issue.

    Planning

    6.3 Section 4 looked at the extent to which these institutions were planning to

    achieve outcomes: managing for results. In an ideal world strategic planning

    should also be about managing by results. Institutions should be amending

    their plans on the basis of results and experience. In practice this is

    something that few institutions are able or prepared to do. No development

    41 Measuring and Managing Results. Poate, D. (1997) p.vi


    institution has been implementing RBM long enough for the results of one

strategic planning cycle to inform the next.42 

    6.4 More fundamentally, given the time lags between programmes and

    outcomes, let alone between programmes and data on outcomes, it is

    doubtful whether management by outcomes or impacts will ever be a practical

    proposition for development agencies. The best that can be hoped for is for

    periodic reviews to examine the alignment of the programme in respect of

    outcome trends. UNICEF have done this to some extent in their Medium

    Term Strategic Plan by concentrating, for example, on countries with

    particularly high child mortality rates.

    6.5 Management by intermediate outcomes and outputs is more feasible on an

    annual basis. This is the approach being adopted by UNDP, and being

    investigated by the World Bank. The two drawbacks with this approach are,

    first, that even outputs are a poor measure of the agency’s recent efforts

    because of the time lags involved. As observed in an IDA-13 paper, short-

    term measures of outputs are likely to reflect the result of resources provided

    many years earlier 43. Second, early indicators of output or intermediate

    outcome performance need to have good linkages with ultimate outcome

    objectives. Unless they do, progress towards outputs and intermediate

    outcomes will not necessarily be the same as, nor any guarantee of, progress

    towards improved development outcomes. Institutions need to keep track of

    outcome and impact trends, and ensure through evaluation that performance

    in terms of outputs and intermediate outcomes is linked to these. This is the

    challenge for UNDP.

42 This is not to say that the experience of one planning cycle has not informed the next. For example, the UNIFEM SBP for 1997-99 certainly informed the formulation of SBP 2000-03.

    43 Measuring Outputs and Outcomes in IDA Countries. IDA (February 2002) p.5.


    Resource allocation

    6.6 As with planning, resource allocation can mean allocating for results or by

    results (or conceivably both). The World Bank is probably the strongest

    exponent of budgeting for results. There is good evidence that aid has a

    larger impact on growth and poverty reduction in the context of good policies

    and institutions. In line with this thinking, the World Bank has for some time

    used assessments of country policy and institutional (CPI) performance as a

    basis for allocating IDA funding44.45 Since the Strategic Framework, the Bank

    has sought still greater selectivity and focus in its work.

    6.7 Budgeting by results is altogether more controversial, and difficult to apply.

    UNDP is particularly reluctant to contemplate results-based budgeting (RBB).

    There is concern over using results information to reward countries that do

well, and penalise countries that do badly. As with the decision not to publish

    country-level data, this may reflect an internal political judgement. Getting

    staff to commit to results-orientated planning and reporting has been hard

    enough. Adding a budget implication would have made the process still more

    difficult. This implies, paradoxically, that RBM is more acceptable if it doesn’t

    actually change anything. This is clearly contrary to the spirit of RBM. If RBM

    is to mean anything, it has to mean using office/unit performance as one

criterion for allocating resources. As far as could be ascertained, none of these

    institutions yet do this. This may, in part, be due to the lack of a reliable

    results-based indicator of office/unit performance.

    6.8 This is not to deny that there is real question about how best to balance

    ‘need’ and ‘results’ in resource allocation, particularly for country allocations.

44 Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.7.

    45 It can be argued that CPI scores are themselves results of previous actions by governments and donors. The World Bank is in effect budgeting on the basis of past results in order to increase the likelihood of future results.


    But this is not an either/or choice. Aid should be directed at countries in need

    with good policy and institutional environments, and therefore the best

    prospects for achieving results. It would appear that the World Bank does this

    rather better than do the UN agencies.46 

    6.9 Finally, results-based budgeting should have implications for how resources

    are allocated. The 2001 ARDE included an analysis of which objectives World

    Bank projects have been the most effective at achieving. This showed, for

example, much greater success with physical infrastructure than with public

    sector institutional change.47 

    6.10 Other things being equal, results performance should inform sectoral

    allocations, both within countries and globally. As with country allocations,

    there is a question about whether poorly performing sectors should be

    penalised. For example, the DER showed that UNDP is performing relatively

poorly in relation to gender and institution building. This means that UNDP

    should attempt to understand and address the causes of this under-

    performance, not immediately reduce its allocation to gender and institutional

    activities. In the longer run, however, continued poor performance should

    imply some reallocation of resources towards outcomes where the

    institution’s contribution will be greatest.

    6.11 As with office/unit performance, results-based sectoral allocations are

    dependent on reliable and acceptable indicators. This is a real challenge for

    all institutions, requiring as it does comparability in the definition and

    measurement of outputs and outcomes across sectors.

46 UNDP’s allocations to ‘good policy’/’bad policy’ countries (as measured by CPIA scores) became less favourable over the 1990s.

    47 Annual Review of Development Effectiveness 2001. OED, World Bank (2002) p.20.


    Staff appraisal

    6.12 It was not possible to ascertain the extent to which the assessment of staff

    performance now includes a results component. According to UNDP, ROAR

    results are now used in the assessment of Country Resident Representatives.

    The World Bank is also making some progress at evaluating managers on the

    basis of tangible results.

    6.13 One obstacle to more results-based appraisal – and to the application of

    RBM more generally - is the time-lag between inputs and outcomes. Annual

    appraisals can only hold staff accountable for very short-term results. Any

    higher outputs or outcomes will be the product of resources and actions

    provided years before. Equally, given the predominance of short postings,

    most staff will be long gone by the time the outputs and outcomes of their

    work become apparent. Making staff more accountable for planning for

    results, and for reporting on results, would be a step in the right direction.

    Managerial response

    6.14 UNDP is aware that the real challenge for RBM is, and remains, to realise

management value beyond external reporting. As the ROAR itself points

    out, the “unique benefits of the ROAR lie in the extent to which it can

    generate managerial responses at all levels”. The key question is the extent

    to which a management response has been forthcoming. This is probably the

    most critical, but difficult, question to answer. To what extent is RBM really

    making a difference to the way the institution is managed? How much is

    rhetoric, and how much is reality?

    6.15 According to UNDP staff, RBM is beginning to transform the way UNDP

does business. Examples of this include:


    •  Restructuring in some country offices in line with outcomes.

    •  More outcome-focussed discussions with partners, and at Board level.

    •  Improvements in country-level planning as a result of the SRF.

    6.16 On the other hand, there has been some criticism of the limited response

    of management to some of the key ROAR findings, such as the relatively poor

    performance of UNDP in respect of gender. There is also reported to be more

commitment to RBM at headquarters than in the country offices, and

    more among middle than senior management.

    6.17 UNDP is well aware of the challenges involved in implementing RBM.

     According to UNDP these include defining results consistently and in a

    measurable way; building partnerships and assessing results together with

    partners; convincing donors and local partners of the virtues of RBM; and

    changing hearts, minds and capacities within UNDP.

    6.18 None of the other institutions have attempted as rapid a transition to RBM

    as UNDP. Any managerial changes are therefore both more incremental and

more difficult for an outsider to detect. The World Bank experience is a case in point. Ten years of management reform, intended in part to increase the

    results-focus of the organisation, have made some difference. The design,

    outcomes, sustainability and institutional development impact of World Bank

    projects have improved. However, as the Bank itself acknowledges, there is

    much more that needs to be done to increase its results-orientation,

    particularly at country, sector and corporate level. The Bank’s own

    assessment of performance measurement under the Strategic Compact

concluded as follows:

“... while the measurement of performance as well as several of its uses have improved during the Compact period, performance measurement has not yet been used systematically and consistently to make strategic


decisions on selectivity, mobilize resources, align staff motivation, and hold managers accountable for the performance of their units.”48 

    6.19 The lack of senior management support for a stronger focus on results

measurement and management – as evidenced by the failure to implement the OED recommendation on RBM in 1997, and the weak support for the

    corporate scorecard to date – partly explains the slow progress. However, two

other factors have contributed:

•  the long period of time needed to implement fundamental change within an institution

    •  the difficulties associated with applying results-based management to a

    development institution.

    6.20 The other three institutions – UNICEF, UNIFEM and IDB – will face the

    same challenges. It is noteworthy that IDB is in the process of recruiting a

    Chief Development Effectiveness Officer to spearhead the process of culture

    change within that institution.

    48 Assessment of the Strategic Compact. Annex 9 – Performance Measurement. World Bank (2001) p. 15


    7. Issues in results-based management

7.1 The introduction and implementation of RBM in large institutions is never

    quick and easy, as is shown by experience in the public sector in OECD

    countries. The introduction of RBM to international development agencies

    is even more challenging.49 The aim of this section is to discuss these

    challenges, drawing on the findings of this survey and other literature.

Four particular issues can be identified:

•  developing country capacity
    •  attribution

    •  aggregation

    •  incentives

    Developing country capacity

    7.2 The implementation of RBM in OECD countries was born out of the need

    to improve the performance of national bureaucracies and deliver better

    public services. RBM has been most effective when it has been designed

    as part of wider public sector management reform, and in an affirmative

    and stable policy and fiscal environment.50 

    7.3 Aid agencies have come to recognise that development results depend on

    developing countries. As the World Bank said recently: “that is where

    development outcomes are realised and measured and where the other

    goals will be met, or not”51. This presents development agencies with a

49 Assessing Development Effectiveness. Flint, M. and Jones, S. DER Working Paper 1. DFID Evaluation Department (2001).

    50 Measuring and Managing Results. Poate, D. (1997).

    51 Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.4.


    double challenge: introduce RBM internally and within partner country

    governments. One without the other is unlikely to succeed52. As the focus

    shifts from projects to sector programmes to countries, so development

    agencies become critically dependent on partner governments to better

    measure and manage results.

    7.4 Experience in OECD countries suggests that RBM will not succeed

    without a supportive policy, fiscal and institutional environment. As Poate

    stated in 1997 “even where aid agencies can tackle their own internal

    measurement procedures and use of performance information,

    advocating performance measurement to clients in isolation is unlikely to

    lead to improved results because too many components are missing”.

    RBM requires and means full-scale public sector management reform in

    developing countries. This is an altogether more challenging prospect

    than attempting to implement RBM with development agencies.

Strengthening country capacity in parallel is an all-important priority, but a

    long-term task. Many countries do not yet even have the systems to track

    inputs and outputs, let alone outcomes53.

Attribution

    7.5 Attribution is the extent to which a result is caused by the activity,

    programme or agency: if there had been no agency activity, how would

    the outcome have been different? As development agencies lift their

    sights - from projects to programmes to countries to global – so the

    problem of attribution increases. RBM is based on the principle that

    outcomes can be improved by increasing management focus on them.

    This can work well at the project level, and for so-called ‘hard’ outcomes.

    It is much more difficult to apply at country and global level, and for ‘soft’

52 This would be akin to a car company introducing RBM in the major offices but not in its car plants.
    53 Annual Report on Operations Evaluation 2000-01. World Bank (2002) p.34.


    outcomes. Table 1 draws on a discussion paper for CIDA to summarise

    the differences.

Table 1: Characteristics of projects versus country and global programmes

    Projects: Largely self-contained, involving a relatively limited and identifiable range of actors, each of whom has relatively clearly defined, complementary roles and interests.
    Country and global: Are not self-contained, involving a wide network of actors who may have overlapping or conflicting roles and interests.

    Projects: Produce tangible outputs.
    Country and global: Produce intangible outputs (eg. influence) for which objective and relevant forms of measurement are not available.

    Projects: Deal with discrete and well-defined development problems that have a defined physical location.
    Country and global: Deal with systematic country-, region-wide or global development issues.

    Projects: Progress from inputs to outputs to impacts in a way that is relatively easy to observe and quantify.
    Country and global: Do not always progress in linear fashion from outputs to outcomes to impacts. Progress may be iterative.

    Projects: Progress from inputs to outcomes or impacts over a relatively confined period of time.
    Country and global: Progress from inputs to outcomes or impacts over a relatively long period of time.

    Projects: Have immediate cause and effect relationships that are relatively easy to observe and validate. There is a direct link with development outcomes and impacts.
    Country and global: Involve cause and effect relationships that are difficult to observe and validate. Links may be indirect and multi-causal.

    Projects: Have a design and direction over which the agency has a high degree of control or influence.
    Country and global: Have a design and direction over which the agency, on its own, has a low degree of control or influence.

    7.6 These are caricatures. For example, projects increasingly produce

    intangible outputs. The table nonetheless helps explain why RBM is

    becoming more difficult. Project-type interventions traditionally “produce

    clear, measurable impacts (changes) that result from linear processes;

    the causes and impacts are easily attributable to a narrow range of inputs

    and actors”. The more you move towards country-type interventions “the


    more you push notions of measurable change and identifiable causality to

    the limits of their validity”. 54 

    7.7 Increasingly, donors (multilateral and bilateral) are seeking to work

    together in support of government designed and implemented policies

    and programmes. Multilateral development institutions (MDIs) pursue

    development outcomes indirectly, working with other partners and through

    country governments. But the assumption built into RBM is that the

    agency sets its objectives, pursues them and then reports on how well it

    has done more or less independently. Again, this makes most sense in

    the context of single donor projects and programmes.

    7.8 Attributing development outcomes and impacts to individual MDIs is very

    difficult in these circumstances. In most cases, the multiplicity of

influences, and MDIs’ relatively minor role, mean that attempts to

    establish causality will be of dubious validity. This has major implications

    for RBM. There is a clear tension between the benefits of an outcome-

    focus, and the need for results to be results in the proper sense (ie.

    attributable change) if management is to benefit from focussing on them.

The long time horizons for achieving outcomes and impacts (5-10 years

    plus), and the limited and lagged data availability, exacerbate the

    problem.

    7.9 One response is to say that what matters is improved development

    outcomes, not whether they can be attributed to a particular MDI. This

    misses the point. Attribution is fundamental to RBM. Development

    outcomes like child mortality are useful if you want to measure and

    manage the collective efforts of governments and donors. They are much

54 Results-Based Management and Multilateral Programming at CIDA – a discussion paper. Mark Schacter. Institute of Governance, Ottawa, Canada. (1999) pp.4-5.


    less useful for measuring and managing the performance of a single

    donor.

    7.10 But equally, just because attribution is difficult, it would be wrong to

    conclude either that identifying ‘plausible association’ or ‘credible linkages’

    between outputs and outcomes is impossible, or that RBM is a non-starter

    for development agencies. Aiming for specific development outcomes and

    impacts remains a powerful and useful principle. Development agencies

    should manage for outcomes, and support developing countries in

    measuring these. They should not, however, either claim these as their

    results,