SPO Internship @ UVER 5-11 November 2007 Tiziana Tamborrini

Page 1

SPO Internship @ UVER

5-11 November 2007

Tiziana Tamborrini

Page 2

UVER Effectiveness Evaluation

An Original Blend of Outcome Evaluation and Implementation Analysis

Page 3

Legal Framework

The Presidential Decree D.P.R. 20.02.1998, n. 38:

• Provides UVER with the legal status for Outcome Evaluation.

• Specifies the object of the evaluation activity with reference to investment programs and projects carried out by Public Agencies (Regioni, Comuni, …) and funded with public financial resources.

• Mentions the need to check the socio-economic effects produced by these public interventions, to be evaluated against the planned targets and objectives and in line with the Cost Estimates.

• Assigns UVER a guidance role: promoting and suggesting specific initiatives to be adopted by the Administrations as a consequence of the evaluation activity.


Page 4

Legal Framework

The CIPE (Interministerial Committee for Economic Planning) Resolution n. 12/2006 states that the Effectiveness Evaluation carried out by UVER is an important step towards a more rational and efficient use of public resources, providing guidance to decision-making for development policies.

Page 5

Evaluation-Object

• Output: the direct products of the funded activity, Goods and Services, corresponding to what are also called the Operational Objectives;

• Results: what the funded activity has set out to achieve, according to the Specific Objectives of the Project;

• Impacts or Effects: the long-term and indirect effects on the socio-economic context, related to the Global Objectives of the Program.

Page 6

Evaluation-Level

• The UVER Effectiveness Unit is in charge of Project- and Program-level Outcome Evaluation;

• Outcome Evaluation has so far been carried out on Projects, and specifically on Public Investments in Infrastructure;

• The Evaluation Activity is carried out on Interventions financed by the Additional National Public Resources within the Fund for Underutilized Areas (FAS).

Page 7

Key Answers for a Key Evaluation

Why are we evaluating?

• To assess whether, and to what extent, the community of stakeholders and beneficiaries – the final users of the intervention’s services – is satisfied with the achieved results;
• To reward best practices and stigmatize bad practices, in order to improve planning design and give feedback and guidance to the decision-making process.

What do we need to know?

• The achieved outputs and outcomes of Projects, in order to compare results to the planned targets;
• The implementation process, in order to check the governance of service delivery;
• Not only the achieved outputs and outcomes, but also an appraisal of the extent to which the project contributes to the program’s general goals.

Page 8

UVER Effectiveness Unit Vision of Evaluation

The adopted theoretical background (see the presentation on the Background Theory of Evaluation) operationalizes a broader and more complex view of Evaluation which:

• Points to an ‘ad hoc, participatory, context-based’ application of Evaluation Theory, based not on one ‘best’ method but on a unique mix of Evaluation Methods;

• Focuses not only on the last part of the results chain (immediate outputs, outcomes and final impacts) but on the chain itself: the design, the implementation process, the context where the process takes place and, finally, the ‘social’ utility of projects/programs;

• Looks for a statistical-quantitative appraisal of how results are causally linked, but also for a qualitative-participatory process of evaluation where all possible sources of information are key and essential to a multidimensional and systemic evaluation.

Page 9

Different Approaches at Work

• The practice of Evaluation also shows that there is not one ‘best’ approach to Evaluation but a variety of Evaluation Approaches (Martini, 2006).

• Depending on the answers to the two key questions, any one of them, or a mix of them, can be the suitable Evaluation Approach.

Page 10

A Mixed Approach to Evaluation: Outcome Evaluation for Project Accountability

• Evaluation is intended here as evaluating project delivery and target achievement.

• In this case the users come from outside program operations – the key stakeholders – and the unit of evaluation is not a specific single organization but a more complex public program of investments.

• Evaluation here aims to account for the main results to the key stakeholders; therefore it aims not only to compare the results with specific targets, but also to provide the community with a general and thorough description of what has been done.

• Evaluation evokes a transparent, responsible, accountable process of analyzing the success of public interventions and delivering evaluation results.

Page 11

A Mixed Approach to Evaluation: Evaluation as Implementation Research

• Understanding how the Project has been implemented.
• Evaluation here is meant as the process of going inside the implementation process to shed light on critical issues and incongruences.

• Knowing why a project achieves its goals is more important than just knowing that it does. It evokes the Governance of Implementation Policy.

• What are the critical components/activities of this project (both explicit and implicit)?

• How do these components connect to the goals and intended outcomes for this project?

• What aspects of the implementation process are facilitating success or acting as stumbling blocks for the project?

Page 12

A Mix of Tools for Different Evaluation Purposes

• Performance indicators
• The logical framework (LogFrame) approach
• Theory-based evaluation
• Formal surveys
• Rapid appraisal methods
• Participatory methods
• Impact evaluation techniques

Page 13

Performance Indicators

• Performance indicators are measures of inputs, processes, outputs, outcomes, and impacts for development projects, programs, or strategies. When supported with sound data collection—perhaps involving formal surveys—analysis and reporting, indicators enable managers to track progress, demonstrate results, and take corrective action to improve service delivery.

• Participation of key stakeholders in defining indicators is important because they are then more likely to understand and use indicators for management decision-making.

References:
World Bank (2000). Key Performance Indicator Handbook. Washington, D.C.
Hatry, H. (1999). Performance Measurement: Getting Results. The Urban Institute, Washington, D.C.
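To make the comparison of actual values against targets concrete, here is a minimal Python sketch of an indicator achievement computation; the function name, fields and figures are hypothetical, not part of the UVER tooling.

```python
# Minimal sketch of comparing an indicator's actual value to its baseline
# and target. Function, fields and figures are hypothetical.
def achievement_rate(baseline: float, actual: float, target: float) -> float:
    """Share of the planned baseline-to-target change actually realized."""
    planned_change = target - baseline
    if planned_change == 0:
        raise ValueError("target equals baseline: nothing to measure")
    return (actual - baseline) / planned_change

# E.g. capacity planned to rise from 1,000 to 3,000 units and reaching
# 2,400 corresponds to 70% of the planned change.
print(f"{achievement_rate(baseline=1000, actual=2400, target=3000):.0%}")
```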

Page 14

The Logical Framework Approach

• The logical framework (LogFrame) helps to clarify the objectives of any project, program, or policy.
• It helps identify the expected causal links—the “program logic”—in the following results chain: inputs, processes, outputs (including coverage or “reach” across beneficiary groups), outcomes, and impact.
• It leads to the identification of performance indicators at each stage in this chain, as well as of risks which might impede the attainment of the objectives.
• The LogFrame is also a vehicle for engaging partners in clarifying objectives and designing activities. During implementation, the LogFrame serves as a useful tool to review progress and take corrective action.

References:
World Bank (2000). The Logframe Handbook. World Bank: http://wbln1023/OCS/Quality.nsf/Main/MELFHandBook/$File/LFhandbook.pdf
GTZ (1997). ZOPP: Objectives-Oriented Project Planning: http://www.unhabitat.org/cdrom/governance/html/books/zopp_e.pdf
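One way to make the results chain tangible is to encode it as data. The sketch below is an illustrative Python structure under assumed names (Stage, logframe); the stage names follow the chain described above, but the content is invented, not a prescribed LogFrame format.

```python
# One possible (assumed) encoding of a LogFrame results chain as data.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str                 # e.g. "inputs", "outputs", "impact"
    objective: str
    indicators: list = field(default_factory=list)  # per-stage indicators
    risks: list = field(default_factory=list)       # risks to attainment

logframe = [
    Stage("inputs", "Funds committed and works contracted",
          indicators=["committed budget (EUR)"]),
    Stage("outputs", "Infrastructure built",
          indicators=["km of road completed"],
          risks=["extra-order changes delay works"]),
    Stage("outcomes", "Service delivered to beneficiaries",
          indicators=["daily users"],
          risks=["facility completed but not activated"]),
    Stage("impact", "Socio-economic development of the area",
          indicators=["local employment rate"]),
]

for stage in logframe:
    print(stage.name, "->", stage.indicators, stage.risks)
```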

Page 15

Formal Surveys

• Formal surveys can be used to collect standardized information from a carefully selected sample of people or households. Surveys often collect comparable information for a relatively large number of people in particular target groups (NOT CARRIED OUT AT THE MOMENT).

• Providing baseline data against which the performance of the strategy, program, or project can be compared;

• Comparing different groups at a given point in time, and changes over time in the same group;

• Comparing actual conditions with the targets established in a program or project design;

• Describing conditions in a particular community or group.

References:
LSMS: http://www.worldbank.org/lsms/
Client Satisfaction Surveys: http://www4.worldbank.org/afr/stats/wbi.cfm#sds
Citizen Report Cards: http://lnweb18.worldbank.org/ESSD/sdvext.nsf/60ByDocName/CitizenReportCardSurveysANoteontheConceptandMethodology/$FILE/CRC+SD+note.pdf
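As a toy illustration of the “changes over time in the same group” comparison listed above, the Python fragment below computes an average change on made-up satisfaction scores; as the slide notes, no such survey is carried out at the moment.

```python
# Toy illustration of the "changes over time in the same group" use above,
# on made-up satisfaction scores (no such survey is currently carried out).
from statistics import mean

baseline_scores = [3.1, 2.8, 3.4, 3.0, 2.9]   # same respondents, wave 1
followup_scores = [3.6, 3.2, 3.9, 3.5, 3.3]   # same respondents, wave 2

change = mean(followup_scores) - mean(baseline_scores)
print(f"average change in satisfaction: {change:+.2f}")
```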

Page 16

Theory-Based Evaluation

• Theory-based evaluation has similarities to the LogFrame approach but allows a much more in-depth understanding of the workings of a program or activity—the “program theory” or “program logic.” In particular, it need not assume simple linear cause-and-effect relationships.

• By mapping out the determining or causal factors judged important for success, and how they might interact, it can then be decided which steps should be monitored as the program develops, to see how well they are in fact borne out.

• This allows the critical success factors to be identified. And where the data show these factors have not been achieved, a reasonable conclusion is that the program is less likely to be successful in achieving its objectives.

• For example, the success of a government program to improve literacy levels by increasing the number of teachers might depend on a large number of factors. These include, among others, availability of classrooms and textbooks, the likely reactions of parents, school principals and schoolchildren, the skills and morale of teachers, the districts in which the extra teachers are to be located, the reliability of government funding, and so on.

References:
Weiss, Carol H. (1998). Evaluation. Prentice Hall, New Jersey, Second Edition.
Weiss, Carol H. (2000). “Theory-based evaluation: theories of change for poverty reduction programs.” In O. Feinstein and R. Picciotto (eds.), Evaluation and Poverty Reduction. Operations Evaluation Department, The World Bank, Washington, D.C.
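A minimal way to operationalize this monitoring of causal factors, using the literacy example above, is to record whether each assumed factor holds and flag the unmet ones. The factor names and values below are invented for illustration.

```python
# Minimal sketch of monitoring the causal assumptions of a program theory,
# using the literacy example above; factor names and values are invented.
program_theory = {
    "classrooms available": True,
    "textbooks available": False,
    "teachers skilled and motivated": True,
    "government funding reliable": True,
}

unmet = [factor for factor, holds in program_theory.items() if not holds]
if unmet:
    print("program less likely to succeed; unmet factors:", ", ".join(unmet))
else:
    print("all monitored critical factors currently hold")
```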

Page 17

Rapid Appraisal Methods

• Rapid appraisal methods are quick, low-cost ways to gather the views and feedback of beneficiaries and other stakeholders, in order to respond to decision-makers’ needs for information.

• Providing rapid information for management decision-making, especially at the project or program level.

• e.g. focus groups, key informant interviews, community group interviews, direct observations, mini surveys (TO BE IMPLEMENTED FOR SPECIFIC CASE STUDIES).

References:
USAID. Performance Monitoring and Evaluation Tips, #s 2, 4, 5, 10: http://www.usaid.gov/pubs/usaid_eval/#02
K. Kumar (1993). Rapid Appraisal Methods. The World Bank, Washington, D.C.

Page 18

Impact Evaluation Tools

• Rapid assessment or review, conducted ex post. This method can encompass a range of approaches to endeavor to assess impact, such as participatory methods, interviews, focus groups, case studies, an analysis of beneficiaries affected by the project, and available secondary data;

• Ex-post comparison of project beneficiaries with a control group. With this method, multivariate analysis may be used to control statistically for differences in attributes between the two groups ― this is one way of estimating the counterfactual situation;

• Quasi-experimental design, involving the use of matched control and project (beneficiary) groups. This method involves the use of a “non-equivalent” control group, matched as closely as possible to the characteristics of the project population – either through propensity score matching or using a multivariate regression approach. It often involves the use of large-scale sample surveys and sophisticated statistical analysis (a toy sketch follows this list);

• Randomized design. This involves the random assignment of individuals or households either as project beneficiaries, or as a control group which does not receive the service or good being provided by the project. This is also known as the experimental method, and is used in health research, for example, in areas such as evaluating the effectiveness of new drugs and medical procedures.
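For the quasi-experimental design with propensity score matching, the following toy Python sketch (assuming NumPy and scikit-learn are available) estimates an effect on synthetic data. It is a sketch only; a real evaluation would require balance diagnostics, calipers, and sensitivity checks.

```python
# Toy propensity-score-matching sketch for the quasi-experimental design
# above. Data are synthetic; the true effect (2.0) is built in by hand.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                              # covariates
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))    # selection on X
y = 2.0 * treated + X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=n)

# 1. Estimate propensity scores P(treated | X).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each treated unit to the control unit with the nearest score.
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
nearest = np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)
matches = c_idx[nearest]

# 3. The mean treated-minus-matched-control gap estimates the effect on
#    the treated (here it should land near the built-in value of 2.0).
att = (y[t_idx] - y[matches]).mean()
print(f"estimated effect on the treated: {att:.2f}")
```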

Page 19

UVER Effectiveness Evaluation Framework

FASE ISTRUTTORIA (preliminary phase):
• Context-based analysis;
• History and LogFrame of the Project: objectives, specific targets, impacts;
• Identify stakeholders and beneficiaries.

IMPLEMENTATION EVALUATION: Governance of the Implementation Process, over the whole Project Cycle: Design, Implementation, Management.

OUTCOME EVALUATION: to what extent, and with what quality, physical outputs and services have been produced compared to targets, using quantitative and qualitative statistical analysis.

IMPACT APPRAISAL: gauge general impacts on beneficiaries and on the context, through congruous context-based analysis, Theory-based and LogFrame evaluation, stakeholder participation, and ad hoc Rapid Appraisal Evaluation.

Page 20

Evaluation Steps: 1. Project Assignment

A Team of Experts (ToE) starts up the process.

The ToE is made up of engineers, legal experts, lawyers, architects, and economists.

The ToE initializes the Evaluation Form in the Data Base and checks the first available data.

The ToE contacts the Evaluation Team for methodological support.

Page 21

Evaluation Steps: 2. Desk Work

The ToE starts the ‘Desk Work’ and builds up a participatory desk review from the very beginning, setting up contacts with key stakeholders.

Context-based model – Implementation Evaluation
Theory-based and LogFrame model – Implementation Evaluation

Track and assess deadlines; assess estimated and actual costs; assess previous evaluation results (Ex Ante Evaluation, Monitoring Activity); track specific and general objectives and targets; check the continuing relevance and consistency of the Project design and objectives/targets, given current needs and context.

Page 22

Evaluation Steps: 2. Desk Work

Make a history of the project; highlight the LogFrame and the underlying ‘theory’:

• General program priorities and goals;
• The social-political-economic context where it was planned, and the local needs;
• The general background motivations for the project.

Page 23

Evaluation Steps: 2. Desk Work

Assess critical factors affecting implementation and effectiveness (any possible shortcoming in the design and management of the implementation process; time consistency; specific technical changes in context which have required an Extra-Order Change);

Brainstorm on shortcomings with stakeholders (Implementation Model).
Context-based and Participatory models – Implementation Evaluation

Brainstorm on objectives and beneficiaries.
Context-based and Participatory models – Outcome Evaluation

Page 24

Evaluation Steps: 3. Indicators Selection

Examine the quantification of objectives and the available data on outputs/results/impacts;

Highlight badly defined or irrelevant indicators and speculative targets;

Select suitable performance indicators from the Table of Indicators in the Data Base, according to the LogFrame built in the first step (see the Table of Indicators in Annex 1, and the hypothetical sketch below).
Deductive-Quantitative Approach and Outcome Evaluation

Discuss and improve the Performance Indicator set with the key stakeholders and look for available Quantitative Targets.

Participatory model and Outcome Evaluation
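As a purely hypothetical sketch of this selection step, the fragment below picks Recommended and Optional indicators by Category of Intervention; the real Table of Indicators lives in the UVER Data Base (Annex 1) and is certainly richer than this.

```python
# Purely hypothetical sketch of indicator selection by Category of
# Intervention; categories and indicator names are invented.
TABLE_OF_INDICATORS = {
    "road": {"recommended": ["km completed", "average daily traffic"],
             "optional": ["travel time saved"]},
    "water": {"recommended": ["households connected"],
              "optional": ["network losses (%)"]},
}

def select_indicators(category: str, include_optional: bool = False) -> list:
    entry = TABLE_OF_INDICATORS[category]
    return entry["recommended"] + (entry["optional"] if include_optional else [])

print(select_indicators("road", include_optional=True))
```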

Page 25

Evaluation Steps: 4. On Site Visit Preparation

Set up face-to-face meetings with the stakeholders;
Participatory model

Review and Contextualize the Governance Questionnaire (see link) which will provide guidance for the interviews on site;

Participatory model and Implementation/Outcome Evaluation

Possibly organize Rapid Appraisal Methods (focus groups, key informant interviews, community group interviews, direct observations, mini surveys in the case of selected case studies).
Deductive-Quantitative Approach and Participatory model

Page 26

Evaluation Steps: 5. On Site Visit

Have the planned meetings with the key stakeholders and beneficiaries;

Collect data on Indicators, specifically: targets; actual values; context values, which describe the context situation against which the changes will be measured (baselines);

Collect all possible information and visual evidence useful for the evaluation (photographs; stakeholders’ views; direct observations);

Update the project’s picture from the first step, given the on-site evidence, stakeholders’ views and subjective impressions;

Compare and Adjust results’ expectations based on theory-based and logframe analysis.

Context-based Participatory Approach and Outcome Evaluation

Page 27

Evaluation Steps: 6. Post Visit Work and Analysis

Fill in the Numeric Fields of the Evaluation Forms with the available data on:
• Expected and actual deadlines of the different stages of the implementation process;
• Expected and actual costs;
• Targets, context values and actual values of the Performance Indicators;
• Qualitative Indicators;
• Beneficiaries: inhabitants, final and potential users.

Page 28

Evaluation Steps: 6. Post Visit Work and Analysis

Fill in the Descriptive Fields of the Evaluation Forms: description of the Intervention; description of the expected goals; description of the implementation process; description of the beneficiaries; description of critical factors.

Page 29

Evaluation Steps: 6. Post Visit Work and Analysis

Discuss critical cases in collegial meetings with all colleagues and the Evaluation Team, in order to adopt a homogeneous approach to problematic interventions (highly recommended for cases at high risk of a Negative Final Assessment);

Fill in the Governance Questionnaires: the experts of the ToE act as ‘key informants’, with the support of the Evaluation Team.

Page 30

Evaluation Steps: 7. Empirical Analysis and Evaluation

Carried out by the Evaluation Team:

Measure Physical Output Realization and Planned Services Achievement on Recommended and Optional Performance Indicators;

Gauge the extent to which major Strategic Objectives are achieved by single projects and by geographic region, assessing on the basis of:
• Collected data on Efficiency Indicators (time and costs: rate of increase/decrease of the actual value with respect to the target; see the sketch after this list);
• Collected data on Quantitative Performance Indicators (rate of increase/decrease of the actual value with respect to the target);
• Qualitative Indicators;

Elaborate the Governance Questionnaires.
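The efficiency-indicator computation named in the list above reduces to a simple relative deviation of the actual value from the target; the sketch below, with made-up figures, shows the arithmetic.

```python
# The rate of increase/decrease of the actual value with respect to the
# target, as used for the Efficiency Indicators above. Figures are made up.
def deviation_rate(actual: float, target: float) -> float:
    return (actual - target) / target

print(f"cost overrun: {deviation_rate(actual=1.3e6, target=1.0e6):+.0%}")  # +30%
print(f"time overrun: {deviation_rate(actual=30, target=24):+.0%}")        # months, +25%
```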

Page 31

Evaluation Steps: 7. Empirical Analysis and Evaluation

Combine the Quantitative Indicators and the Qualitative Assessment in the Evaluation Form into a Multicriteria Final Qualitative Assessment;

Discuss the Final Assessment again in open meetings, for a final check on critical cases (Negative and Not Working Interventions);

Elaborate the Governance Final Scores for a Final Integrated Qualitative Assessment, in order to provide a Rating of Interventions.

Page 32

Evaluation Steps: 8. Finalize the Evaluation Process

• Communication of the final results to the Public Agencies in charge of the Project and to the Government (CIPE);
• Dissemination of the results to stakeholders, citizens of local communities, and the mass media.

Page 33

Picture of the Final Assessment Process

The Final Assessment combines:
• Time Analysis
• Cost Analysis
• Critical Factors
• Impressions
• Performance Indicators

Page 34

Positive Assessment

• Targets on physical outputs and services have been accomplished by more than 50 percent;

• Cost estimates have been met;

• Deadlines have been reasonably met;

• The design has proven ‘robust’ and ‘resilient’ (not incurring too many Critical Issues: time extensions, extra-order changes).

Page 35

Negative Assessment

Projects have produced no relevant effects (less than 50 percent of the target has been achieved, on average, across the recommended indicators) and have proved ineffective in the Implementation Analysis and the Qualitative Appraisal.

The Department suggests that CIPE sanction the involved Administrations by withholding from the next assignment the same amount of money that was used ineffectively.
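Read together, the Positive and Negative Assessment slides imply a simple decision rule. The sketch below encodes it with hypothetical function and argument names; the thresholds follow the slides, and the middle case stands in for the collegial review of borderline cases mentioned in Step 6.

```python
# Sketch of the decision rule implied by the two slides above. Function
# and argument names are hypothetical; thresholds follow the slides.
def assess(avg_target_achievement: float, costs_within_estimates: bool,
           deadlines_reasonably_met: bool, design_robust: bool) -> str:
    if (avg_target_achievement > 0.5 and costs_within_estimates
            and deadlines_reasonably_met and design_robust):
        return "Positive"
    if avg_target_achievement < 0.5:
        return "Negative"  # candidate for the CIPE sanction proposal
    return "Collegial review"  # borderline cases, per Step 6

print(assess(0.8, True, True, True))    # Positive
print(assess(0.3, True, False, False))  # Negative
```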

Page 36

Not Activated Interventions

• Justified Not Activated Interventions: infrastructures that have not yet been activated for acknowledged and well-motivated reasons. They will be checked for Effectiveness in the next Evaluation Round;

• Not Justified and Not Activated: these receive a warning of a possible Negative Assessment if not activated within 90 days of the CIPE resolution. They follow the same procedure as Negative projects if not activated by the extended deadlines.

Page 37

Towards a Rating of the Administrations

• The Questionnaire on Governance yields a broader and more complex picture of the project, from design to results delivery.

• The Questionnaire is submitted to the experts of the ToE, who reasonably play the role of key informants;

• It is made up of 15 questions, each with 5 possible items;

• Each question is related to a Governance Dimension, in accordance with the European Community and international approaches to Governance: Projection, Implementation Management, Transparency, Participation, Effectiveness.

• The Final Score allows for a more graduated rating of projects.
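A hypothetical scoring sketch for the Questionnaire follows: 15 questions, each answered on one of 5 items (coded 1..5), averaged by dimension and overall. The split of three questions per dimension is an assumption made for illustration; the actual questionnaire layout may differ.

```python
# Hypothetical scoring of the Governance Questionnaire described above.
# The three-questions-per-dimension mapping is an illustrative assumption.
from statistics import mean

DIMENSIONS = ["Projection", "Implementation Management",
              "Transparency", "Participation", "Effectiveness"]

answers = [4, 5, 3, 4, 4, 2, 5, 3, 4, 3, 4, 4, 5, 3, 4]  # one per question

scores = {dim: mean(answers[3 * i:3 * i + 3])
          for i, dim in enumerate(DIMENSIONS)}
final_score = mean(answers)

print(scores)
print(f"final score: {final_score:.2f}")
```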

Page 38

Project Rating

We consider six rating categories, ranging from Highly Satisfactory to Highly Unsatisfactory:

• Highly Satisfactory: The Project achieved at least acceptable progress toward all major relevant objectives and Governance Dimensions, and had best-practice development impact on one or more of them. No major shortcomings were identified.

• Satisfactory: The Project achieved acceptable progress toward all major relevant Governance Dimensions, given that the constraint of being effective has been satisfied. No best-practice achievements or major shortcomings were identified.

• Moderately Satisfactory: The Project achieved acceptable progress toward most of its major relevant objectives and Governance Dimensions. No major shortcomings were identified.

• Moderately Unsatisfactory: The Project did not make acceptable progress toward most of its major relevant objectives and Governance Dimensions.

• Unsatisfactory: The Project did not make acceptable progress toward any of its major relevant Governance Dimensions, given the constraint of Effectiveness.

• Highly Unsatisfactory: The Project did not make acceptable progress toward any of its major relevant objectives and Governance Dimensions.

Page 39

Project Rating

Rating                      Governance   Outcome
Highly Satisfactory         Gov ++       Outcome ++
Satisfactory                Gov +        Outcome ++
Moderately Satisfactory     Gov +        Outcome +
Moderately Unsatisfactory   Gov –        Outcome +
Unsatisfactory              Gov –        Outcome –
Highly Unsatisfactory       Gov – –      Outcome – –
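The table can be read as a lookup from the (Governance, Outcome) pair to a rating category, as in this small sketch (the signs are written in ASCII for convenience):

```python
# The rating table above as a lookup from (Governance, Outcome) to category.
RATING = {
    ("++", "++"): "Highly Satisfactory",
    ("+",  "++"): "Satisfactory",
    ("+",  "+"):  "Moderately Satisfactory",
    ("-",  "+"):  "Moderately Unsatisfactory",
    ("-",  "-"):  "Unsatisfactory",
    ("--", "--"): "Highly Unsatisfactory",
}

print(RATING[("+", "+")])  # Moderately Satisfactory
```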

Page 40

Tools for Evaluation

o Recommended and Optional Indicators: few and homogeneous by Category of Intervention;
  - to make projects comparable within the same Category;
  - to address good and not-so-good practices;
  - to build prospective baselines (standards for comparison);

o Free Indicators, in case of Projects with specific objectives;

o Governance Indicators.

Page 41

Completamenti: Main Features and the Set of Projects Selected for Evaluation

• Projects are selected from a group of 310 interventions called “Completamenti” (EUR 1.8 billion).

• It is a group of non-completed projects which, in 1999, received an ex ante evaluation by UVAL.

Page 42

Main Results of 2006 – 2007 Evaluation Cycle

• Approximately 93 percent of activated projects are positive (152 out of 163);

• Few best practices but some good practices;

• 11 Negative cases, which have been referred to CIPE for sanction;

• 27 Not Activated Projects (16 Justified and 11 Not Justified).

Page 43

Main Shortcomings

• Half of the visited projects did not have any data on targets for the selected indicators.

• Context values were almost never available.

• In most cases it was quite difficult to track back the key stakeholders for the different stages of the project cycle.

• Deadlines mostly went unmet.

Page 44

PERSPECTIVES

• Improve the Project Rating System;
• Extend the Rating System to the Regions;
• Stimulate project planning and the implementation project cycle:
  – Preliminary evaluations need to be strengthened;
  – Quantitative context/baselines and targets for Results Indicators need to be provided from the very beginning of the project cycle;
  – Monitoring of indicators has to be improved.
