
Project Monitoring and Evaluation

2005

Funded by Danida

Table of Contents

List of Abbreviations

1. INTRODUCTION
2. ABOUT THIS GUIDELINE
   2.1 Aim and Objectives
   2.2 Who is this Guideline for?
   2.3 How to Use this Guideline
3. ASPECTS OF MONITORING AND EVALUATION
   3.1 What is Monitoring and Evaluation?
   3.2 Why Monitor and Evaluate?
   3.3 Different Kinds of M&E
   3.4 Internal and External M&E
   3.5 The Role of DWAF in M&E
   3.6 Monitoring Levels
   3.7 M&E and Stakeholder Participation
4. THE MONITORING AND EVALUATION PROCEDURE
   4.1 Step 1: Establish the Purpose and Scope of M&E
   4.2 Step 2: Identify Performance Questions and Indicators
   4.3 Step 3: Establish M&E Functions and Assign Responsibilities and Financial Resources
   4.4 Step 4: Gather and Organise Data
   4.5 Step 5: Analyse Data and Prepare an Evaluation Report
   4.6 Step 6: Disseminate Findings and Recommendations
   4.7 Step 7: Learn from the M&E

ANNEX 1: TERMS OF REFERENCE FOR EXTERNAL MONITORING
ANNEX 2: EXAMPLE OF A MONITORING FORM
ANNEX 3: FORMAT FOR AN EVALUATION REPORT
ANNEX 4: EXAMPLE OF IMPLEMENTING AN M&E PROCEDURE
ANNEX 5: GLOSSARY
ANNEX 6: LIST OF REFERENCES
ANNEX 7: THE PFM GUIDELINES


FIGURES AND TABLES

Figure 1: Differences between Internal and External M&E
Figure 2: The M&E Procedure
Table 1: Examples of Performance Questions and Indicators
Table 2: Examples of Indicators and Means of Verification
Table 3: Examples of Quantitative and Qualitative Indicators


List of Abbreviations

CBO – Community Based Organisation
C,I&S – Criteria, Indicators and Standards
Danida – Danish International Development Assistance
DWAF – Department of Water Affairs and Forestry
IDP – Integrated Development Plan
LFA – Logical Framework Approach
M&E – Monitoring and Evaluation
NGO – Non-Governmental Organisation
PFM – Participatory Forest Management
TOR – Terms of Reference


1. Introduction

The Department of Water Affairs and Forestry (DWAF) has adopted Participatory Forest Management (PFM) as a general approach to all its activities. PFM seeks to ensure that there is a shared responsibility of forest management between key stakeholders and the state, and that there is a sustainable flow of benefits to key stakeholders. DWAF thus strives to consider local people’s forest-based needs, their role in sustainable forest management and their involvement in decision-making processes.

DWAF will increasingly focus on monitoring, policy making, regulating and providing technical support with regard to the management of state forests. In all of these tasks, DWAF will be required to monitor and evaluate forest management activities and projects, as well as other initiatives being implemented by various organisations. DWAF will thus need to ensure that projects are in accordance with relevant policies, principles and practices. Of key importance are the DWAF Criteria, Indicators and Standards (C,I&S) for Sustainable Forest Management as well as Participatory Forest Management (PFM) principles.

This Guideline deals with procedures and examples of external project monitoring and evaluation (M&E), which will support forestry staff in their role as regulators and monitors. This Guideline is part of the PFM Guidelines developed during the DWAF/Danida PFM Project (2001-2005). The PFM Guidelines aim to empower DWAF staff, the new custodians of the State forests and partners at local level to implement the new DWAF Forestry Vision. The PFM Guidelines are meant to support community upliftment in accordance with the DWAF C,I&S for Sustainable Forest Management.


2. About this Guideline

2.1 Aim and Objectives

This Guideline aims to provide an understanding of monitoring and evaluation of a forestry project. It explains the steps required to develop and implement the M&E procedure. An example of implementing an M&E procedure is given in Annex 4.

An important role of DWAF will be to monitor forest projects and activities implemented by local stakeholders. This Guideline provides information on external M&E and focuses on the outputs of a project rather than its inputs and activities.

The objectives of this Guideline are to:

• Provide an overview of some of the key concepts and techniques of M&E;
• Explain how DWAF can use M&E to improve the outputs and impacts of forest projects/activities;
• Introduce a generic M&E procedure which can be tailor-made to a particular context;
• Provide extensive references to other documents and resources on M&E.

2.2 Who is this Guideline for?

This Guideline is primarily for DWAF National Office staff as well as Area Managers, Foresters and Forestry Scientists in the regions. It will further be relevant for the new management agencies assigned to manage state forests. The Guideline will also be of benefit to key stakeholders such as environmental management agencies, NGOs, CBOs and PFM Forums/Committees.


2.3 How to Use this Guideline

Chapter 3 explains the purpose of M&E, looks at DWAF’s role in the process and addresses various other aspects of M&E. Chapter 4 provides a procedure for external M&E, and details each step in the procedure.

Annex 1 describes a format of a Terms of Reference (TOR) for hiring a consultant to conduct the M&E. Annex 2 provides an example of a Monitoring Form. Annex 3 provides a format for an Evaluation Report. Annex 4 illustrates the implementation of the M&E procedure by providing an example of monitoring and evaluating a PFM Committee. Annex 5 provides a glossary, which explains words and terms used in the Guideline. Annex 6 presents an extensive list of references used in the text as well as other useful documents, reports and websites. Annex 7 gives an overview of the eight PFM Guidelines produced by DWAF.

This Guideline has been produced as a practical resource document as well as for training purposes. Sections of the Guideline can be copied for discussions, presentations and other training and development purposes.


3. Aspects of Monitoring and Evaluation

3.1 What is Monitoring and Evaluation?

M&E is something we do in our daily life. We look at what has happened or is happening around us – for example, how much money we have in our bank account at the end of the month. We assess the current situation and compare it to our expectations or goals. If there is a difference, we decide whether the difference is significant – did we save the amount we planned towards our annual holiday and, if not, will that significantly affect our holiday plans? We then consider ways in which we could address these differences or shortcomings.

M&E can be seen as a practical management tool for reviewing performance. M&E enables learning from experience, which can be used to improve the design and functioning of projects. Accountability and quality assurance are integral components of M&E, which help to ensure that project objectives are met, and key outputs and impacts are achieved.

Monitoring and Evaluation

M&E is the systematic collection and analysis of information to enable managers and key stakeholders to make informed decisions, uphold existing practices, policies and principles and improve the performance of their projects.

Monitoring is the regular gathering, analysing and reporting of information that is needed for evaluation and/or effective project management.

Evaluation is a selective and periodic exercise that attempts to objectively assess the overall progress and worth of a project. It uses the information gathered through monitoring and other research activities and is carried out at particular points during the lifetime of a project.


3.2 Why Monitor and Evaluate?

M&E can help an organisation to extract, from past and ongoing activities, relevant information that can be used as the basis for future planning. Without M&E, how would it be possible to judge whether a project was going in the right direction, whether progress and success were being achieved, and how future efforts might be improved? A structured M&E approach makes information available to support the implementation of forestry projects and activities and will enhance their sustainability. Used effectively, M&E can help to strengthen project implementation and encourage useful partnerships with key stakeholders. The main objectives of M&E are thus to:

• Ensure informed decision-making;
• Enhance organisational and development learning;
• Assist in policy development and improvement;
• Provide mechanisms for accountability;
• Promote partnerships with, and knowledge transfer to, key stakeholders;
• Build capacity in M&E tools and techniques.

M&E is about feedback from implementation.

The ultimate purpose of M&E is change for the better.

3.3 Different Kinds of M&E

M&E can deal with many issues. It can be M&E of policy implementation, the performance of a unit in an organisation, staff performance or, for example, deliveries from a subcontractor. This Guideline deals with M&E related to a project. The concepts, tools and procedures for project M&E, as presented in this Guideline, also help in understanding other kinds of M&E. This Guideline does not deal with financial management systems.


3.4 Internal and External Project M&E

Internal Project M&E is built into the design of a project and is undertaken by the team that is responsible for management and implementation of the project. This is done to ensure that the project meets deadlines, stays within the budget and achieves its objectives, activities, outputs and impacts1. A project that does not monitor its implementation is not a well-managed project. Findings, recommendations, etc. of internal monitoring are usually captured in progress reports submitted by project management.

External Project M&E, described in this Guideline, is carried out by an outside team, which is not directly responsible for the management or implementation of the project. External M&E should assess the effectiveness of the internal M&E put in place by the project management team. External monitoring can take place once the project has been completed, and/or during implementation of the project. External M&E is often required by donor agencies or government organisations if, for example, they need to know how their funds are being spent or if their policies are being adhered to. All projects can benefit from external M&E. Findings and recommendations of external monitoring are often documented in a review or evaluation report.

External M&E also monitors and evaluates internal M&E

1 Internal monitoring is detailed in DWAF/Danida PFM Guideline: Logical Framework Approach Project Planning (2005)


Figure 1: Differences Between Internal and External M&E

Internal M&E:
1. Integrated part of project design and management.
2. Undertaken by the project management team.
3. Findings, lessons learnt and recommendations documented in regular project progress/activity reports.

External M&E:
1. Prescribed outside of project design and management.
2. Conducted by a team external to the project – often carried out by consultants.
3. Findings, lessons learnt and recommendations documented in an evaluation/review report.

Both internal and external M&E are aimed at assessing and improving performance.

3.5 The Role of DWAF in M&E

M&E tools, procedures and a budget are needed to conduct external M&E. Forests transferred to other agencies should comply with transfer agreements and DWAF policies and principles. The DWAF C,I&S for Sustainable Forest Management can contribute to determining whether a project is achieving the broader objectives of the Department. DWAF’s role will, among others, be to apply the information and findings gained through external M&E to improve policies, programmes and other activities to ensure sustainable forest management in each region.


3.6 Monitoring Levels

Traditionally, M&E focused on assessing the inputs and activities of a project. Today the focus is increasingly on measuring the outputs and impacts of a project to achieve a broader development objective or goal. Project inputs, activities and assumptions/risks are also important, however, as they all affect outputs. For example, if the budget (an input) is cut by 50%, this will obviously affect the outputs of the project and will need to be taken into account when conducting the M&E. The various monitoring levels in a project are:2

Input Monitoring

Input monitoring is the monitoring of the resources that are put into the project – these include budget, staff, skills, etc. Information on this type of monitoring comes mainly from management reports, progress reports and accounting.

For example, ways of measuring this can be the number of days consultants are employed, or the amount of funds spent on training and equipment.

Activity Monitoring

Activity monitoring monitors what happens during the implementation of the project and whether the activities that were planned were carried out. This information is often taken from the progress report.

Output Monitoring

Output monitoring is the focus of this Guideline and is a level between activity and impact monitoring. This type of monitoring assesses the result or output from project inputs and activities.

The measurements used for output monitoring will be those which show the immediate physical outputs and services from the project.

2 Elements of a project are detailed in DWAF/Danida PFM Guideline: Logical Framework Approach Project Planning (2005)


Impact Monitoring

Impact monitoring relates to the objectives of the project. The aim of impact monitoring is to analyse whether the broader development objectives of the project have been met.

Such monitoring should demonstrate changes that are fundamental and sustainable without continued project support.

Assumption/Risk Monitoring

Assumption/risk monitoring entails monitoring of external factors (those factors outside the control of the project), defined by project assumptions3, and the risks related to these assumptions not holding.

During assumption monitoring it can, for example, be found that local consultants with the necessary expertise are not available; or that there are changes in policies and legislation, which result in outputs not being achieved.

M&E should make it possible to assess:

• Relevance – Does the project/activity deal with broader development objectives of DWAF?

• Effectiveness – Have the impacts, objectives, outputs and activities of the project been achieved?

• Efficiency – Did the process that was followed make the optimum use of the resources and time available in order to achieve the desired outputs?

• Impact – To what extent has the project contributed towards longer term goals such as job creation, poverty alleviation, or a reduction of dependency on forest resources?

• Sustainability – What is the likelihood that efforts will be continued by other agencies after the end of the project?

3 Detailed in DWAF/Danida PFM Guideline: Logical Framework Approach Project Planning (2005)


3.7 M&E and Stakeholder Participation

The participatory approach4 to forest management seeks to enable local communities living adjacent to forests and other local stakeholders to take part in decision-making and share the benefits of forest activities. This participatory approach should also be applied to M&E. Participatory M&E can play an important role in ensuring that the participatory principles are put into practice by:

• Improving the effectiveness of forest project management and decision-making, as the parties who have been involved in M&E will be informed and aware of the results of the M&E procedure;
• Ensuring that accurate and reliable information from the M&E process is communicated to communities and stakeholders;
• Ensuring that stakeholders understand the reasons for failure in achieving project outputs and objectives and how and what to improve in the future;
• Providing mechanisms for transparency and accountability to stakeholders;
• Providing DWAF and project management with objective information on the impact of projects on local communities and the forest resources;
• Building community capacity in M&E tools and techniques.

Recommendations from M&E are more likely to be accepted and taken forward by stakeholders, if they have had an active role in shaping them.

4 Refer to DWAF/Danida PFM Guideline: Stakeholder Participation (2005)


4. The Monitoring and Evaluation Procedure

The M&E procedure below sets out the steps in planning and implementing external M&E. The M&E procedure must be tailored to the specific needs of each project, taking into account the project objectives, inputs, outputs, activities, stakeholders and beneficiaries. The M&E steps will vary from situation to situation. Seven key steps are listed in Figure 2 and further detailed in the rest of this chapter.

Figure 2: The M&E Procedure

Step 1: Establish the Purpose and Scope of M&E
Step 2: Identify Performance Questions and Indicators
Step 3: Establish M&E Functions and Assign Responsibilities and Financial Resources
Step 4: Gather and Organise Data
Step 5: Analyse Data and Prepare an Evaluation Report
Step 6: Disseminate Findings and Recommendations
Step 7: Learn from the M&E


4.1 Step 1: Establish the Purpose and Scope of M&E

Specifying the purpose and scope of the M&E helps to clarify what can be expected of the M&E procedure, how comprehensive it should be and what resources and time will be needed to implement it. When formulating the purpose of M&E, relevant stakeholders, including the project management team, should be consulted or at least made aware of, and helped to understand, the purpose of the M&E.

Example of an External M&E Purpose

To verify that the development objective and outputs of the forestry project have been achieved within the allocated budget.

With a shared understanding of the overall purpose amongst stakeholders, the next aspect is to clarify the sophistication or scope of your M&E. M&E can be highly complex, requiring advanced technology and considerable technical expertise. Or it can be simple, based mainly on structured discussions with stakeholders without requiring large amounts of technical data. The exact M&E procedure will depend on what is being assessed. For example, monitoring and evaluating a project whose aim is to uplift community members will require a more sophisticated M&E procedure than a project with the aim of establishing a medicinal garden. The purpose will also determine how sophisticated the M&E needs to be. The scope of the M&E may be determined by asking some of the following questions:

• What is the purpose of M&E?
• How much money is available for your M&E?
• What type of information is required by project management, DWAF, donor agencies or other stakeholders?
• What is the level of M&E expertise available?
• To what extent should local communities and other stakeholders participate in the M&E procedure?


4.2 Step 2: Identify Performance Questions and Indicators

4.2.1 Performance Questions

A performance question is used to focus on whether a project is performing as planned and, if not, why not. Performance questions will be guided by the broader development objective, the project objectives, the project outputs, as well as the M&E purpose. Once performance questions have been identified, it will be easier to decide what information is needed to evaluate the project. Table 1 gives examples of performance questions for the M&E of a particular project.

4.2.2 Indicators

Indicators should be guided by performance questions and linked to the purpose of the M&E. Indicators are basically measurements that can be used to assess the performance of the project.

While performance questions help to decide what should be monitored and evaluated, indicators provide the actual measurements for M&E and determine what data needs to be gathered.

The project itself may have indicators by which it monitors its own progress – these may be used for external M&E, if relevant. The DWAF C,I&S can also provide broader indicators that may be relevant to the external M&E of the project. Table 1 provides examples of performance questions and corresponding indicators with regard to a particular project with certain outputs.


Table 1: Examples of Performance Questions and Indicators

OUTPUT 1: Improvement of the condition of forests through establishment of a sustainable harvesting system for pole harvesting.

Performance Questions:
• Has a realistic harvesting system been put in place for the harvesting of poles?
• Is there understanding and support of the harvesting system by forest users?
• Has there been an improvement in the condition of the forest?

Indicators:
• Harvesting systems are in place for pole harvesting.
• User groups were involved in the establishing of the harvesting system.
• User groups are involved in the regulation of harvesting levels.
• There is a visible improvement in the regeneration and coppicing of species used for poles.

OUTPUT 2: Entrepreneurial skills among participating households developed.

Performance Questions:
• What types of skills have been improved among how many households?
• Do these skills fulfil a need in the project area?

Indicators:
• 30% - 50% of women were trained in establishing a medicinal plant nursery and 30% - 50% of men trained in tour-guiding.
• At least half of women generated income from plant sales within the first year of establishing their nursery.
• At least half of all the men trained found employment within the first year after training.

Indicators must be able to be verified (or proven). The means of verification generally indicates the source of, and methods used to collect, the information/data (this is detailed in section 4.4.2). Examples of means of verification are indicated in Table 2.


Table 2: Examples of Indicators and Means of Verification

Examples of Indicators:
• 30% - 50% of women were trained in establishing a medicinal plant nursery and 30% - 50% of men trained in tour-guiding.
• At least half of women generated income from plant sales within the first year of establishing their nursery.
• At least half of all the men trained found employment within the first year after training.

Means of Verification/Data Collection Methods:
• Training course proceedings reports with evaluations.
• Quarterly and annual project reports.
• Socio-economic surveys and interviews with household heads.
• Local government economic surveys.
• Chamber of commerce document reviews.

Indicators, and therefore the data needed to verify them, can be qualitative or quantitative. Quantitative data is factual while qualitative information is based on opinions and perceptions and thus may be subject to further interpretation. During M&E, one should aim to have both qualitative and quantitative indicators. Table 3 provides examples of quantitative and qualitative indicators.


Table 3: Examples of Quantitative and Qualitative Indicators

Quantitative Indicators:
• Fifty bundles of poles are harvested each month.
• Five training courses were run during the project.

Qualitative Indicators:
• The pole harvesters regard the harvesting system as being sustainable.
• Those who attended the training courses perceived the courses to be meeting the demands for skills in the area.

Each indicator should be:

• Relevant – The indicators should be directly linked to the project objectives/outputs.
• Technically feasible – The indicators should be capable of being verified or measured and analysed.
• Reliable – The indicators should be objective: i.e. conclusions based on them should be the same if they are assessed by different people at different times.
• Usable – People carrying out the M&E should be able to understand and use the information provided by the indicators to evaluate the project.
• Participatory – Relevant stakeholders should be involved in the collection of information generated by the indicators, the analysis of the information and possible use of the information in the future.


4.3 Step 3: Establish M&E Functions and Assign Responsibilities and Financial Resources

Establishing M&E functions and responsibilities at the beginning of the procedure can help to avoid major communication issues, conflicts of interest, duplication of tasks and wasted efforts. Organising responsibilities means deciding which stakeholders will be involved and clarifying and assigning roles to these stakeholders as well as to DWAF officials, project management and any partner organisations. Stakeholders may need to be trained in the different aspects of the M&E procedure. M&E will require financial resources in accordance with the type of project(s) that is being evaluated as well as the M&E purpose, performance questions and indicators. Among the items that should be included in M&E costs are:

• Staff salaries;
• Fees and expenses for consultants;
• M&E training;
• Organising M&E meetings and other participatory exercises.

Consultants can play an important role in enabling DWAF to fulfil its M&E responsibilities by providing specialist knowledge and expertise that may not be readily available in the organisation. In order to gain the maximum benefit from the use of consultants, steps should be taken to ensure that:

• Consultants are used strategically for M&E in ways that capacitate people and build on existing M&E systems and procedures;

• Efforts are made to enable as much continuity as possible so that consistent advice is available and valuable time is not lost providing briefings to new consultants;

• Consultants are provided with clear TOR5.

5 Annex 1 provides a format for a TOR for undertaking project M&E


4.4 Step 4: Gather and Organise Data

4.4.1 Gathering Data

Data is the oxygen that gives life to M&E. However, selecting methods of data collection can be confusing, unless it is approached in a systematic fashion. Rarely is any one method entirely suitable for a given situation. Instead, using multiple methods helps to validate M&E findings and provides a more balanced and holistic view of project progress and achievements. The performance questions and indicators will provide guidance in deciding what data/information to gather and the methods to be used. Data can either be primary or secondary.

Primary and Secondary Data

Primary data is new data that comes from an original source and has not been edited or amended in any way. Secondary data is data that exists in the form of reports, documents, maps, diagrams, etc.

In deciding whether to use primary or secondary data (or both) the key question is: Based on the specific data requirements (M&E purpose, performance questions, indicators), does some or all of the necessary data already exist as secondary data?

If the answer to this question is ‘no’, or if the available secondary data does not completely provide all the data required to monitor and evaluate the project, then primary data will need to be gathered. Both primary and secondary data can be gathered from a variety of sources including individuals, groups, libraries and government agencies. Internal project M&E also requires the collection of data, documentation and reports, and a significant amount of the data that is used for external M&E should come from the internal monitoring process.


4.4.2 Data Sources and Data Collection Methods

Potential data sources and data collection methods are listed and explained below:6

Document Review

Documents and reports provide a rich source of information for M&E. Key documents that can be used to assess project performance include:

• Progress reports compiled during implementation of the project;
• Data collected for the chosen project indicators and any other relevant internal M&E information7;
• Technical reports, correspondence, records and budgets;
• Media articles;
• Statistics, reports, case studies and other documents published by government, business, research and other institutions.

Interviews

Interviews are face-to-face meetings with individuals and/or groups, which, in this context, are usually fairly informal and semi-structured. While interviews can often be time-consuming, they can provide a rich source of data, particularly in regard to qualitative and sensitive information that may not be readily available in official documents.

6 For more examples refer to DWAF/Danida PFM Guideline: Stakeholder Participation (2005)
7 Detailed in DWAF/Danida PFM Guideline: Logical Framework Approach Project Planning (2005)


Recommendations for Interviews

• Clearly explain what the purpose of the interview is.
• Prepare a short checklist of themes in advance and ask simple questions using plain language.
• Introduce yourself and let the other parties introduce themselves.
• Be objective, open-minded and listen carefully to both what is being said and what is not being said.
• Use open questions so that the answer cannot just be a plain ‘yes’ or ‘no’.
• Avoid leading questions and never help respondents with answers.
• Encourage participation of ‘quiet’ respondents but never intimidate by asking questions such as ‘What do you think?’
• Do not raise expectations.

Surveys and Questionnaires

Surveys and questionnaires provide a way of obtaining information from a large number of people. Questions should be relevant and simple to answer. If the interviews, surveys or questionnaires are being used to collect a lot of information, or information that is complex, it may be appropriate to get a consultant or research institution to provide professional assistance in designing the questions and processes.

Field Visits and Transect Walks

Visits to the site of a project can provide valuable information about the environment in which the project is taking place, its impact on beneficiaries and the working methods that are being used. Transect walks are an effective participatory method to gather this information.

Expert Opinion

Obtaining the views of experts who are knowledgeable about particular aspects of the project’s activities can in some instances provide valuable insights that may not be revealed by other methods of data collection.


4.4.3 Organising and Storing Data

Data needs to be captured, organised and stored so that it can be readily used for the M&E purposes. The task of organising data often gets lost in the gap between data collection and analysis. It requires some attention as it can greatly assist analysis if undertaken well, and can cause problems if done poorly. Proper capturing, organising and storage is particularly important when information has been collected from different sources with different methods.

M&E Data Storing

For effective data storage, decide:

1. What data needs to be stored?

2. Who needs access to the data and when?

3. How should the data be stored? Maps, photographs, questionnaires or data sheets may need to be filed as hard copies; data can be typed into a computer using programmes such as Excel, Access or Word; or, if the data is not too complicated or too voluminous, it can be copied onto an appropriate form or table (an example is given in Annex 2, and a simple illustrative sketch follows this list).

4. What data needs to be kept up to date and what data can be put into an archive for use at a later stage?

5. Can existing data capturing systems such as the National Forest Inventory Application be used?
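The sketch below illustrates one simple way of capturing records into a computer file, as mentioned in point 3 above. It is written in Python and is only an illustration under assumed conditions: the field names and the file name monitoring_data.csv are hypothetical, loosely mirror the form in Annex 2, and are not a prescribed DWAF format; adapt them to the data your M&E procedure actually collects.

import csv
import os

# Hypothetical field names, loosely based on the monitoring form in Annex 2.
FIELDS = ["date", "dwaf_region", "title", "description",
          "implementing_body", "contact_person", "status", "latest_updated"]

def append_record(path, record):
    """Append one monitoring record to a CSV file, writing a header row first if the file is new."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(record)

# Example entry (illustrative values only).
append_record("monitoring_data.csv", {
    "date": "2005-03-01",
    "dwaf_region": "Example Region",
    "title": "Medicinal plant nursery proposal",
    "description": "Community proposal to establish a nursery",
    "implementing_body": "Local CBO",
    "contact_person": "N. Example",
    "status": "pending",
    "latest_updated": "2005-03-01",
})

A file captured in this way can later be opened in Excel or Access, or read back by other scripts when the data is analysed.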

4.5 Step 5: Analyse Data and Prepare an Evaluation Report

The captured and organised data needs to be analysed, and findings and recommendations summarised and compiled into a report.

4.5.1 Analysing Data

To ensure that the information gathered will be effectively used, the assessment or analysis of the data/information should be properly organised and carried out for each performance question using the associated indicator(s).


In this regard, the performance questions and indicators can provide important assessment tools for the analysis. A final comparison with the outputs and impacts of the project should then be made. In this way the performance, progress and achievements of the project can be assessed. In some cases several participatory meetings may need to be arranged to get more feedback, or experts in the field consulted, to make sure the information has been accurately evaluated. It may also be valuable to compare relevant aspects of other similar projects to the project that is being monitored – in this way the experiences and lessons learnt during other projects can be shared.

4.5.2 Reporting

Feedback and reporting are key to both internal and external M&E as, in this way, information can be meaningfully combined, explained, compared and presented. All reporting should thus be as accurate and relevant as possible. As mentioned earlier, external M&E will frequently use the internal project progress reports and other relevant information as part of the information gathered to externally monitor and evaluate the project. For external M&E the report is usually called an evaluation or review report.

An evaluation report for external M&E should be related to the outputs and impacts (or development objectives) of the project. The evaluation report(s) should indicate what information was collected and the methods used to collect it; the outcomes of the evaluation of this information based on the performance questions and indicators; whether the outputs and impacts of the project were achieved and, if not, why not; as well as recommendations for improvement of the project or for application to other projects, practices and policies. The presentation of evaluation findings and recommendations should be clear, logical and based on facts and sound judgement. The report(s) should be in a format that is accessible and understandable to the target audience(s)8.

8 An example of a format for an evaluation report is provided in Annex 3


4.6 Step 6: Disseminate Findings and Recommendations

The evaluation reports, or summaries of these reports, should be widely distributed and presented to decision-makers and key stakeholders – including those who were consulted in the M&E process. Key communication issues are:

Ensure clarity of message for specific audiences
The interests and concerns of different audiences will vary and, as a result, evaluation reports will need to be adapted to the specific needs of each audience. Reports should communicate different levels of detail according to the audience being addressed. For example, donor agencies and DWAF national office may require reviews of the project at a more general and strategic level, whereas regional staff and project management may require more detail at an operational level in order to improve the day-to-day implementation of this and/or future projects.

Agree on the frequency of communicating information
It should be decided at the initial stages of the M&E procedure whether only one final evaluation report should be compiled, distributed and presented, or whether there should be a report at different stages during the M&E procedure. This will be determined by the nature of the project and the purpose of the M&E.

Consider location and time
To ensure maximum participation by all the relevant stakeholders, careful consideration should be given to where and when meetings will be held to give feedback on evaluation report findings. It would be valuable to present reports at a time when the recommendations can be fed into decision-making meetings. Besides meetings, there are many other ways to communicate the findings, lessons learnt and recommendations to your broader stakeholder group.


Some information dissemination channels include the following:9

- Bulletin boards/community notice boards/signs;
- Existing newsletters and free publications;
- Broadcast announcements/advertisements;
- Press releases or newspaper inserts;
- Brief presentations/announcements at other municipal meetings;
- Project newsletter.

4.7 Step 7: Learn from the M&E

Knowledge gained through M&E lies at the core of DWAF’s organisational learning process. M&E provides information and facts that, when analysed, understood and accepted, become knowledge that can be used to improve forest management. Besides learning about the progress and achievements of the project outputs, it is essential to learn from what works regarding partnership strategies, project design and implementation, and to feed this knowledge back into ongoing and future projects and policies. This information also provides a means to regulate the sustainable management of state forests by other agencies. Project evaluations can help to bring development partners together, and when this occurs the learning from M&E goes beyond DWAF to stakeholders involved in other development and natural resource management activities.

9 Detailed in DWAF/Danida PFM Guideline: Stakeholder Participation (2005)


Annex 1: Terms of Reference For External M&E

The Purpose of M&E Terms of Reference

M&E is often carried out by an external team and thus requires a TOR. A TOR is used to explain why the M&E is being carried out, what is going to be evaluated and how it is to be conducted. A TOR provides the reason and motivation for a task to be carried out and identifies broad considerations, roles, tasks and responsibilities for the process. The TOR should establish a clear understanding of the tasks and clarify the essential elements involved. The length and level of detail contained in the TOR will depend on the scope and complexity of the M&E. More specifically, the TOR will:

• Describe the project being evaluated and the performance context;
• Identify reasons for the M&E;
• Establish the scope and purpose;
• Guide implementation;
• Indicate schedules and time frames;
• Provide guidance on the bidding procedures if the M&E is being put out to tender.

Initial Considerations

Before a TOR is prepared, one should have a basic understanding of:

• The purpose of the M&E;
• The issues to be addressed;
• The resources available for conducting the evaluation;
• The anticipated costs;
• The expertise required to complete the evaluation;
• The time frame for completion.


The budget for the M&E must be consistent with expectations of the M&E task, and include the anticipated costs for travel, communications, report production, etc. An estimate should be made of the cost for professional fees based on previous consultancy exercises of a similar nature. The budgeted cost should be broken down between the amount allocated to professional fees and reimbursable (out-of-pocket) expenses. Where appropriate, reimbursable expenses should be identified by cost item such as transport, accommodation, stationery, etc. Unless a realistic perspective is taken, insufficient funding can result in having to amend the contract during the M&E procedure. A budget should include contingencies to cover planning deficiencies and unanticipated elements.

Details of the estimated budget should remain confidential to DWAF officials, as in most cases bidding teams are expected to compete on price so that DWAF can obtain the best value for money.

Format for TOR

The format for TOR below is general and should be adapted to suit the particular needs of each evaluation.

List of Contents for M&E TOR

1. Introduction

2. Purpose of the M&E

3. Scope of the M&E

4. Evaluation Outputs

5. Methodology

6. Organisation of the M&E

7. M&E Team

8. M&E Budget

1. Introduction

A brief description of the project that is to be evaluated, including the project’s objectives, outputs and impacts. Also identify the key stakeholders, partners and beneficiaries.


2. Purpose of the M&E

Describe how the need for the M&E was identified, explain why the M&E is being undertaken and why it is being undertaken at this point in time.

3. Scope of the M&E

Include issues the M&E is expected to address and focus on – performance questions, if developed, should be included here. The scope could include: geographic coverage of the project; the time-frame of the project to be covered by the evaluation; and issues pertaining to the achievement or non-achievement of project objectives, outputs and impacts, as well as data gathering and storing methods.

4. Evaluation Outputs

Describe what outputs are required from the M&E – this is usually in the form of an evaluation report(s) and presentation(s). Be clear regarding when and how many evaluation reports are required during the M&E procedure as well as the contents of the report(s)10.

5. Methodology

Explain key elements of the methodology that should be used by the M&E team. This may include guidance regarding: document review (desk study); interviews; field visits; questionnaires; participatory techniques and other approaches for the gathering, storing and analysis of data; participation of stakeholders and/or partners.

6. Organisation of the M&E

Describe how the M&E is to be organised, outlining the roles and responsibilities of the M&E team, project management, DWAF officials and any other stakeholders.

7. M&E Team

Describe what is expected of the evaluation team in terms of qualifications, experience and expertise.

8. M&E Budget

Include the overall budget ceiling for the M&E and specify whether the price should include VAT or not, and whether different components of the M&E should be costed separately. Do not provide details of the budgeted cost of the M&E. This will be put in the contract once the consultants have been hired.

10 Refer to Annex 3

Project Monitoring and Evaluation 27

Annex 2: Example of a Monitoring Form

Project Monitoring and Evaluation 28

Annex 2: Example of a Monitoring Form

Below is an example of a simple form to capture key data through entries into a table. In this case, data from community project proposals is being gathered. Details of each proposal are captured in the table, and the tables are then grouped into: 1) pending proposals; 2) approved proposals (projects being implemented); 3) rejected proposals. As additional data is obtained, the tables are updated in the last column. This provides an effective system to store and continuously update details of which community project proposals come in and when, and whether they are being implemented or not. A simple illustrative sketch of grouping captured proposals in this way follows the form below.

Date:

Project Overview:
  DWAF Region:
  Title:
  Brief Project Description:
  Implementing Body:
  Contact Person:
  Staff Responsible:
  Start Date:
  Exp. End Date:
  First Received:
  Latest Updated:

Involved Stakeholders/Beneficiaries:

Budget:

Preparation Status/Comments:
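As an illustration of the grouping described above, the short Python sketch below reads captured proposal records from a file and sorts them into pending, approved and rejected groups. It is only a sketch under assumed conditions: the file name monitoring_data.csv and the status field are hypothetical (they match the capture sketch in section 4.4.3) and do not represent any prescribed DWAF system.

import csv
from collections import defaultdict

def group_by_status(path):
    """Group proposal records by their status column (pending, approved or rejected)."""
    groups = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            groups[row.get("status", "pending")].append(row)
    return groups

groups = group_by_status("monitoring_data.csv")
for status in ("pending", "approved", "rejected"):
    print(status, "-", len(groups[status]), "proposals")

Keeping the grouped records in one file and updating the status and latest-updated fields as new information comes in gives the same continuously updated overview that the form above is designed to provide.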


Annex 3: Format for an Evaluation Report

Framework format for an Evaluation Report:

List of Contents for an Evaluation Report

1. Executive Summary

2. Introduction

3. Purpose, Performance Questions and Indicators

4. M&E Context

5. Project Context

6. Methodology

7. Review of Project Outputs and Impacts

8. Findings and Recommendations

1. Executive Summary

An overview of the report comprising about 1 page.

2. Introduction

Why and when the M&E procedure was carried out and a brief description of the project being evaluated.

3. Purpose, Performance Questions and Indicators

List the purpose, performance questions and indicators of the M&E.

4. M&E Context

Mention any issues, constraints or factors that affected the M&E procedure.


5. Project Context

Mention the outputs and development objectives of the project that is being evaluated as well as any issues, constraints or factors that affected the implementation of the project. Include a review of the external assumptions for the project, if appropriate.

6. Methodology

Describe the methods used to gather, store and analyse data and the participatory and consultative process that were employed during the M&E procedure.

7. Review of Project Outputs and Impacts

Based on the data analysis, briefly assess the outputs and impacts of the project achieved so far, with regard to the performance questions and indicators. Describe whether the outputs and impacts have been achieved and if not, why not. If relevant, mention the link between the implementation of existing policies, principles and practices and how these can be improved or new ones developed.

8. Findings and Recommendations

Discuss the findings of the M&E procedure and detail recommendations for this project and/or for future projects based on 7. above. Recommendations must be relevant to the purpose of the M&E.

The Project Planning Matrix11 and the Project Implementation Plan may be attached as an Annex to the Evaluation Report.

11 Detailed in DWAF/Danida PFM Guideline: Logical Framework Approach Project Planning (2005)

Project Monitoring and Evaluation 30

Annex 4: Example of Implementing an M&E Procedure

Project Monitoring and Evaluation 31

Annex 4: Example of Implementing An M&E Procedure

Monitoring and Evaluating a PFM Committee

Initial Considerations

Clarify why the M&E is being carried out and its purpose. The objectives of the Committee and when it was established will help to clarify what should be evaluated – one cannot evaluate something the Committee was not set up to do. Ensure that sufficient time, financial resources and expertise are available to conduct the M&E. Clarify the role and responsibilities of stakeholders who will be involved in the M&E. If it is felt that technical expertise needs to be brought in, this should be identified at the beginning, as it will need to be budgeted for.

Some initial information gathering is necessary regarding the context within which the PFM Committee is operating. This information should be available from minutes of Committee meetings, the Committee’s constitution, or other documents. Finally, decide how and when the M&E findings should be presented, i.e. the nature of the written report and the type of dissemination events and meetings that should be organised to communicate the findings.

The Key Steps to Follow

The key steps to follow for M&E of a PFM Committee are:

1. Establish the purpose and scope of the M&E.

2. Identify the performance questions and indicators.

3. Establish M&E functions and assign responsibilities and financial resources.

4. Gather and organise the data.


5. Analyse the data and prepare an evaluation report.

6. Disseminate the findings and recommendations.

7. Learn from the M&E.

Step 1: Establish the Purpose and Scope of the M&E

Keeping in mind why the PFM Committee was formed and its objectives, plus what information is required by the M&E procedure, decide on the M&E purpose. An example of an M&E purpose could be: To ensure that the PFM Committee is achieving its objectives and that there are noticeable economic and environmental improvements in the area.

Explain the purpose to the members of the Committee and involve them in deciding on the scope and methodologies to use for the M&E, plus their roles and contributions in the procedure. Be aware of the amount of money available and the availability of expertise that you may require and factor this in during all steps of the M&E procedure.

Step 2: Identify the Performance Questions and Indicators

Decide on the performance questions that need to be answered in order to achieve your M&E purpose. Examples of performance questions:

• Are all members of the Committee fully aware of their roles and responsibilities as well as the objectives of the Committee?

• Is the Committee achieving its objectives?

• Has the Committee succeeded in improving conditions of the forest as well as improving the economic situation of the target group?

Once performance questions have been decided on, identify useful indicators and any other measurements that may be needed to answer these performance questions. Examples are provided below:


Performance Question: Are all members of the Committee fully aware of their roles and responsibilities as well as the objectives of the Committee?
Indicator:
• All committee members can fully explain their roles and responsibilities within the Committee as well as the objectives of the Committee.

Performance Question: Is the Committee achieving its objectives?
Indicator:
• 60% of the objectives set out in the List of Objectives/Action Plan have been achieved so far and progress is being made in achieving the rest.

Performance Question: Has the Committee succeeded in improving conditions of the forest as well as improving the economic situation of the target group?
Indicators:
• An improvement in the condition of the forest can be seen.
• Harvesting systems are in place for sustainable use of forest products.
• There has been an improvement of economic conditions in 50% of households in the area.

Step 3: Establish M&E Functions and Assign Responsibilities and Financial Resources

Decide what activities will need to be carried out to achieve the M&E purpose and what functions the M&E team will have. Assign responsibilities to the M&E team. Decide which stakeholders should be included in the procedure. In this example, the Committee members should be involved in collecting, organising and analysing the data. Forest user groups and members of the target groups will also benefit from being involved in the M&E process – they can assist in data gathering and should be involved in the analysis of the data as they are the ones directly affected by the proper functioning of the PFM Committee. All involved stakeholders will have to undergo the necessary training to ensure proper understanding and implementation of the procedure.


Step 4: Gather and Organise the Data

Once performance questions and indicators have been chosen, the sources of data and data gathering methods will need to be identified in order to verify these indicators. A decision will have to be made regarding who will collect the information – these people must have the necessary skills. As mentioned in Step 3, where appropriate, use Committee members, users of the forest and other appropriate stakeholders in the data-capturing process. Examples of data gathering methods that can be used for the performance questions and indicators chosen here are provided below:

Performance Question: Are all members of the committee fully aware of their roles and responsibilities as well as the objectives of the Committee?
Indicator:
• All committee members can fully explain the roles and functions of the Committee and their roles and responsibilities within the Committee.
Data Gathering Methods:
• Interviews with each member of the PFM Committee.

Performance Question: Is the Committee achieving its objectives?
Indicator:
• 60% of the objectives set out in the constitution have been achieved so far and progress is being made in achieving the rest.
Data Gathering Methods:
• Interviews with each member of the PFM Committee.
• Interview with the target group and forest users.
• Review of minutes of meetings, progress/activity reports.


Performance Question: Has the Committee succeeded in improving conditions of the forest as well as improving the economic situation of the target group?
Indicators:
• An improvement in the condition of the forest can be seen.
• Harvesting systems are in place for sustainable use of forest products.
• There has been an improvement of economic conditions in 50% of households in the area.
Data Gathering Methods:
• Review of baseline data to establish the condition of the forest before the Committee was established.
• Transect walks and field visits to the forest to determine whether conditions have improved.
• Interviews with foresters, forest users and forest guards regarding harvesting systems.
• Economic surveys of households in the area through interviews and questionnaires with household heads of the target group.

Systems to organise, file and store the data will need to be in place so that information is not lost, confused or presented in the wrong format. Use computer software such as MS Excel, MS Word or Access, or systematically copy all relevant data/information down and file it under appropriate headings.

Step 5: Analyse Data and Prepare an Evaluation Report

Once all data/information has been gathered, an analysis should be systematically carried out. In some cases experts may need to be consulted to make sure the data has been analysed accurately. Some examples of analysis of the data related to the performance questions are given below:


Performance Question: Are all members of the Committee fully aware of their roles and responsibilities as well as the objectives of the Committee?
Key Issues of the Analysis:
• It is essential that all Committee members are fully aware of their roles and the functions of the Committee. If this is found not to be the case, it is unlikely that the Committee is functioning effectively and the reasons for this need to be investigated and solutions found.

Performance Question: Is the Committee achieving its objectives?
Key Issues of the Analysis:
• If less than 60% of objectives have been achieved but clear progress is being made with the others, reasons for the slow progress will have to be investigated.
• If very few objectives have been achieved and little progress is underway, it is likely that the Committee is not functioning effectively and the reasons for this need to be investigated – one reason may be found in the first performance question: that the Committee members are not clear about their roles and responsibilities.

Performance Question: Has the Committee succeeded in improving conditions of the forest as well as improving the economic situation of the target group?
Key Issues of the Analysis:
• If it is found that all new and existing projects and activities that the Committee has initiated or linked up with are still functioning to some extent, yet there is no improvement in forest conditions nor an improvement in the economic situation of the target community, it may be that the PFM Committee is not being effective in ensuring that (i) controls on harvesting levels are enforced; or (ii) the benefits go to the targeted community.
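Where indicators are quantitative, such as the 60% of objectives and 50% of households used above, part of the analysis is a simple threshold check against the tallied data. The Python sketch below illustrates this; the tallies and threshold values are purely hypothetical examples and would in practice come from the interviews, document reviews and surveys listed under Step 4.

def indicator_met(achieved, total, threshold):
    """Return True if the achieved share meets or exceeds the threshold (e.g. 0.6 for 60%)."""
    return total > 0 and achieved / total >= threshold

# Illustrative tallies only.
objectives_achieved, objectives_total = 8, 12         # objectives set out in the constitution
households_improved, households_surveyed = 180, 400   # household survey results

print("Objectives indicator met:",
      indicator_met(objectives_achieved, objectives_total, 0.60))
print("Household indicator met:",
      indicator_met(households_improved, households_surveyed, 0.50))

The qualitative side of the analysis – why an indicator was or was not met – still relies on the judgement and discussion described above.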

Once analysis of the data is complete, the evaluation report should be compiled. This report should include the performance questions, indicators, data gathering methods, analysis, findings and recommendations on measures that could be taken to improve the performance of the PFM Committee. A general format for an evaluation report is provided in Annex 3.


Examples of the type of recommendations that could be developed in relation to the performance questions in this example are provided below:

Performance Question: Are all members of the Committee fully aware of their roles and responsibilities as well as the objectives of the Committee?

Some Typical Recommendations:
• Committee members should discuss and agree on their roles and responsibilities and the functions of the Committee as a matter of urgency. This should be incorporated into a formal document, such as a constitution, policy, list of goals and objectives or an action plan, if such a document does not already exist.

Performance Question: Is the Committee achieving its objectives?

Some Typical Recommendations:
• If some clear progress is being made with achieving objectives, even if slowly, it may simply require a meeting with all members and stakeholders to identify and clear any bottlenecks and obstacles.
• If no progress is being made, a facilitator may need to be employed to identify the obstacles and provide a way forward. Capacity building for members and stakeholders may be required to ensure that everyone is comfortable with, and capable of, carrying out their roles and responsibilities.

Performance Question: Has the Committee succeeded in improving the conditions of the forest as well as improving the economic situation of the target group?

Some Typical Recommendations:
• The Committee should urgently establish why the conditions of the forest have not improved and why the target community has not received any benefits. In an attempt to rectify these shortcomings, the Committee should make an effort to link up with conservation and development programmes in the area, as well as be involved with the local government in the establishment of the IDP.


Step 6: Disseminate Findings and Recommendations

A draft copy of the evaluation report should be sent to those who were involved in the M&E process. Once comments have been received, they should be incorporated into the final report. This report should then be distributed to the organisation that requested the M&E as well as to those who were involved in the establishment of the PFM Committee and the M&E procedure. If time and finances permit, a summary of the evaluation report should be prepared and distributed to a wider audience. Workshops, bulletins and other communication methods should be used to disseminate and present the key findings and recommendations so that other committees and structures can learn from the shortcomings and strengths of this Committee.

Step 7: Learn from the M&E

The findings and recommendations contained in the evaluation report should be applied to the PFM Committee in order to improve its functioning and enable it to achieve its objectives successfully. All future PFM Committees and other structures should also benefit from the M&E that has been carried out. The findings and any lessons learnt during the M&E can also be used to improve forest management policies and guidelines.


Annex 5: Glossary


Assess: To evaluate or judge a project or activity.

Baseline data: The initial situation before any inputs or influences have taken place. It is the set of basic conditions against which comparisons are made later.

Beneficiary: A person or group who should experience improved conditions (benefits) as a result of the project/activity.

Data: Facts or figures which are collected during monitoring and evaluation and used as a basis for making calculations, making recommendations or drawing conclusions.

Donor agency: The agency which supplies finances, and often expertise, to carry out a project.

Evaluation: A selective and periodic exercise that attempts to objectively assess the overall progress and worth of a project as well as the achievement of project outputs and impacts. It uses the information gathered through monitoring and other research activities and is carried out at particular points in the project cycle.

External monitoring and evaluation: Monitoring and evaluation carried out by an outside team which is not directly responsible for the management or implementation of the project.

Forest resources: Any products, benefits and services provided by the forest to a person or a group of people.


Indicator: A measurement that can be used to assess the performance of a project.

Inputs: The financial, human and material resources necessary to produce the intended outputs and achieve the objectives of a project.

Internal monitoring and evaluation: Monitoring and evaluation that is built into the design of a project or activity and is usually undertaken by the team that is responsible for the management and implementation of the project.

Logical Framework Approach (LFA): A method of project planning that attempts to identify logical connections between the inputs, activities, outputs, objectives and impacts of a project.

Means of verification: Indicates where the evidence or data can be found to prove that objectives or outputs have been met, or to verify an indicator.

Monitoring: The ongoing assessment of the performance of a project, which seeks to provide management and other key stakeholders with early indications of progress, or lack thereof, in the achievement of key milestones and outputs. It involves the systematic collection of information and regular reporting on significant trends and variations.

Objectives: Specific statements detailing the desired achievements or impacts of a project at different levels.

Outputs: The tangible results that a project aims to achieve after undertaking certain activities. They include written tasks, products, services or any other deliverable required to achieve project objectives.


Participatory Forest Management (PFM): An approach adopted by the Department of Water Affairs and Forestry which seeks to ensure that there is a sustainable flow of benefits to stakeholders and that there is shared responsibility between participants and the state.

Performance questions: The questions that are developed at the initial stages of the M&E procedure to assess whether the project is performing as planned and, if not, why not. They help to decide what information is needed to evaluate the project.

Primary data: New data that comes from an original source and has not been edited or amended.

Project: A systematically planned task or intervention designed to achieve specified objectives within a given budget and timeframe.

Project proposal: A summary and description of a project, including budgets, generally used when approaching management for approval or donors for funding.

Qualitative indicators: Indicators/data relating to the nature or character of something rather than its size or quantity; they usually describe people's knowledge, attitudes or behaviour.

Quantitative indicators: Indicators/data relating to the size or quantity of something, capable of being expressed in numerical terms.

Reimbursable expenses: The running expenses that the consultant may incur (e.g. stationery, accommodation, transport) over and above his/her fees.

Secondary data: Data that is not original but exists in the form of reports, documents, maps, diagrams, etc.


Stakeholder: An individual, group, institution, organisation (government or non-government) or business, amongst others, that could affect, or be affected by, the outcome of a particular activity, process or project. Target groups are always stakeholders, whereas other stakeholders are not necessarily target groups.

Sustainable forest management: The process of managing forests to achieve specific objectives with regard to the production of goods and services without the degradation of the forest's condition or a reduction of its value and future productivity.

Target group: A group of people who will benefit directly, in a measurable way, from interventions and assistance.

Terms of Reference (TOR): A document or plan of action which provides the reason and motivation for a task to be carried out and identifies broad considerations, roles and responsibilities, objectives, results, activities and organisation for the process.


Annex 6: List of References


The following list provides an extensive range of useful documents and websites for the various aspects of M&E.

DEPARTMENT OF WATER AFFAIRS AND FORESTRY (2002): Draft Principles, Criteria, Indicators and Standards for Sustainable Forest Management of Natural Forests and Plantations in South Africa. Pretoria

This document provides details and definitions of Principles, Criteria, Indicators and Standards and the participatory methodology used to develop the draft document. It includes Measures which describe aspects of the Indicators and recommends methods for implementing these Criteria, Indicators, Measures and Standards.

DEPARTMENT OF WATER AFFAIRS AND FORESTRY/DANIDA (2004): PFM Guideline: Logical Framework Approach Project Planning. Pretoria

The manual guides the process of preparing and documenting a project. It provides valuable information on the Logical Framework Approach (LFA) and guides the reader to prepare, plan, budget, implement, monitor and document a project. Also included are a Project Planning Matrix and tools for internal M&E.

DEPARTMENT OF WATER AFFAIRS AND FORESTRY/Danida (2005): PFM Guideline: Stakeholder Participation. Pretoria

The manual guides the process of stakeholder participation in the context of PFM. It provides valuable information on the procedure of participation as well as guidance on the dissemination, gathering and sharing of information.

IFAD (2002): Managing for Impact in Rural Development: A Guide for Project M&E.

The Guide provides comprehensive advice on how to set up and implement an M&E system, plus background ideas that underpin the suggestions. It has been written to help project managers and M&E staff improve the quality of M&E in IFAD-supported projects.


INTER-AMERICAN DEVELOPMENT BANK (IADB): A Management Tool for Improving Project Performance, www.iadb.org/cont/evo/EngBook/engbook.htm

This Evaluation Handbook for headquarters and field office staff of the IADB presents various tools for evaluation at the levels of project design, implementation or monitoring, and project completion/impact.

OECD/DAC: Evaluation Criteria, www.oecd.org//dac/Evaluation/htm/evalcrit.htm

The site presents general criteria for evaluation and monitoring that are endorsed by the OECD-DAC members. It also lists key questions under each criterion (i.e., relevance, effectiveness, efficiency, impact and sustainability).

OECD/DAC (1998): Effective Practices in Conducting a Joint Multi-Donor Evaluation, www.oecd.org/dac/

This report outlines key steps in how to plan and conduct a joint evaluation of development programmes when more than one donor agency is involved. The guide serves as a useful tool for those who seek to promote joint evaluation and collaboration among donor agencies.

OECD/DAC (1991): Principles for Evaluation of Development Assistance, www.oecd.org/dac/Evaluation/pdf/evalprin.pdf

The DAC has drawn up a series of policy principles addressing key areas of aid programming and management, including project appraisal, programme assistance and technical cooperation. The set of principles described in the paper states the views of DAC members on the most important requirements of the evaluation process, based on current policies and practices as well as donor agency experiences with evaluation and feedback of results.


OECD/DAC (1998): Review of the DAC Principles for Evaluation of Development Assistance, www.oecd.org/dac/Evaluation/pdf/eval.pdf

This review examines the implementation and use of the Principles in order to assess their impact, usefulness and relevance. The Principles include: purpose of evaluation, impartiality and independence, credibility, usefulness, participation of donors and recipients, donor cooperation, evaluation programming, design and implementation of evaluations, reporting, dissemination and feedback, and decentralized evaluation systems.

OECD/PUBLIC MANAGEMENT SERVICE (1999): Improving Evaluation Practices: Best Practice Guidelines for Evaluation and Background Paper, www.oecd.org/puma

The guidelines identify key issues and practices to improve the use of evaluation. The guidelines focus on the management of evaluation activities in government and management of individual evaluations rather than on methodological questions.

OECD/WORKING PARTY ON AID EVALUATION (2001): Glossary of Terms in Evaluation and Results-Based Management, www.oecd.org/dac/htm/glossary.htm

Reference guide that provides an overview of the terms included in OECD members’ glossaries and database of terms and definitions in 15 agencies.

OLIVE (2002): Planning for Monitoring and Evaluation. The Monitoring and Evaluation Handbook.

The third in a series dealing with project planning in a development context, this handbook is targeted at NGOs and seeks to provide an overview of some of the central concepts, tools and techniques of project monitoring and evaluation.


RESEARCH METHODS KNOWLEDGE BASE, trochim.human.cornell.edu/kb/index.htm

This is a comprehensive web-based textbook that addresses all of the topics in a typical introductory course in social research methods. It covers the entire research process, including formulating research questions, sampling, measurement (surveys, qualitative), research design, data analysis and writing up the study. It also addresses the major theoretical and philosophical aspects of research.

SWISS AGENCY FOR DEVELOPMENT AND COOPERATION (2000): External Evaluation: Are we doing the right things? Are we doing things right?

The guidelines are divided into two sections. Part I explains the terminology and principles of evaluation. Part II discusses each of the five stages of an external evaluation. These guidelines are primarily addressed to organizations that sponsor or participate in evaluations and are responsible for implementing their results.

THE GATEWAY TO DEVELOPMENT INFORMATION (ELDIS): Methods, Tools and Manuals, nt1.ids.ac.uk/eldis/hot/pm3.htm

This site contains a range of guidelines and manuals to help development practitioners in carrying out participatory monitoring and evaluation.

THE M AND E NEWS, www.mande.co.uk/

The M and E News is a news service focusing on developments in monitoring and evaluation methods relevant to development projects with social development objectives.

UNDP (2002): Handbook on Monitoring and Evaluating for Results, www.undp.org/eo/

This Handbook is intended to support the UNDP and its development partners in aligning their monitoring and evaluation systems with results-based management methodology. It aims to provide simple, flexible and forward-looking tools.


UNDP, OESP (1997): Who Are the Question-makers? A Participatory Evaluation handbook, intra.undp.org/eo/methodology/methodology.html

The handbook complements the UNDP Handbook on Monitoring and Evaluating for Results. It is intended for those wanting more guidance on participatory evaluation methods and includes a discussion of some of the practical issues involved in doing such an evaluation.

UNFPA (2000): Monitoring and Evaluation Methodologies: The Programme Manager’s M&E Toolkit, bbs.unfpa.org/ooe/me_methodologies.htm

The Toolkit provides guidance and options for UNFPA country offices to improve monitoring and evaluation activities in the context of results-based management. Of specific interest are tools discussing stakeholder participation in evaluation, planning evaluations, the data collection process, managing the evaluation process, and communicating and using evaluation results.

UNICEF (1991): A UNICEF Guide for Monitoring and Evaluation: Making a Difference?

This manual covers UNICEF monitoring and evaluation policies and procedures. Section I discusses the importance of monitoring and evaluation. Section II addresses the organization of monitoring and evaluation by delineating roles and responsibilities in UNICEF (HQ and country office) and the role of national governments. Section III presents the scope of monitoring and how it can be used at the level of projects/programmes and higher development outcomes (e.g., the situation of women and children). Section IV presents the scope of evaluations; guidelines for how to plan, manage and conduct evaluations; and the use of evaluation findings.

UNITED NATIONS CHILDREN’S FUND (UNICEF): A Guide for Monitoring and Evaluation, www.unicef.org/reseval/mande4r.htm

UNITED STATES AGENCY FOR INTERNATIONAL DEVELOPMENT (USAID), CENTER FOR DEVELOPMENT INFORMATION AND EVALUATION (CDIE): Performance Monitoring and Evaluation Tips, www.dec.org/usaid_eval/004


USAID: Evaluation Publications (1997 – 2000), www.dec.org/usaid_eval/

This site presents a number of evaluation publications, including the following:

------------. Conducting a Participatory Evaluation, TIPS No. 1, 1996. This note defines participatory evaluation, its characteristics and purposes. It discusses the differences between participatory evaluation and traditional evaluation. Finally, it outlines the key steps in conducting a participatory evaluation, including when participatory evaluation is appropriate, determining the degree of participation and building consensus on results.

------------. Conducting Key Informant Interviews, TIPS, No. 2, 1996. This note presents key informant interviews as a low-cost rapid appraisal technique. It discusses the method's advantages and limitations and how to maximize its usefulness, and provides step-by-step instructions on how to apply the method.

-----------. Preparing an Evaluation Scope of Work, TIPS, No. 3, 1996. This note offers suggestions for preparing a good evaluation scope of work. It outlines the components of the scope of work and highlights the kind of information needed under each.

------------. Using Direct Observation Techniques, TIPS No. 4, 1996. This note introduces direct observation as one example of a rapid, low-cost method for collecting information on the performance of development interventions. It discusses the method's advantages and limitations and how to maximize its usefulness, and provides step-by-step instructions on how to apply the method.

------------. Using Rapid Appraisal Methods, TIPS, No. 5, 1996. This note introduces a range of low-cost methods known as rapid appraisal methods, which are used to collect information on the performance of development interventions. It discusses their strengths and weaknesses and when they are appropriate.

------------. Preparing a Performance Monitoring Plan, TIPS, No. 7, 1996. This note introduces the elements of a performance-monitoring plan and provides advice on preparing one for the systematic and timely collection of performance data.

------------. Establishing Performance Targets, TIPS, No. 8, 1996. This note defines what performance targets are, why they are important, and what information sources and approaches may be used for setting targets.


------------. Conducting Focus Group Interviews, TIPS, No. 10, 1996. This note defines focus group interviews, discusses the method’s advantages and limitations, when it is best utilized and for what, and provides a step-by-step guide on how to organize and conduct focus group interviews for high quality results.

------------. The Role of Evaluation in USAID, TIPS, No. 11, 1997. This note addresses questions about the new role of evaluation in USAID. It discusses the changed emphases of evaluation in a results-based context and why this is important. It also outlines the key steps in planning and conducting an evaluation.

------------. Guidelines for Indicator and Data Quality, TIPS, No. 12, 1998. This note describes USAID criteria and procedures for ensuring the quality of indicators and data in performance monitoring systems for managing for results.

------------. Monitoring the Policy Reform Process, TIPS, No. 14, 2000. This note discusses the issues and challenges of designing and implementing systems to monitor the policy reform process. It outlines the characteristics of a good monitoring system, provides examples of milestone events during policy formation/adoption and policy implementation, and elaborates three methodological approaches for monitoring the policy reform process.

------------. Measuring Institutional Capacity, TIPS, No. 15, 2000. This paper provides information on measuring institutional capacity, including some tools that measure the capacity of an entire organization as well as others that look at individual components or functions of an organization. The discussion focuses on the internal capacities of individual organizations.

USAID: A Sourcebook on Results-Oriented Grants and Cooperative Agreements, www.usaid.gov/pubs/sourcebook/usgov/

This sourcebook is primarily intended for USAID staff and development partners but contains useful guidance and suggestions in the following areas: defining results-oriented assistance interventions; managing for results through partnerships with government, non-governmental organizations and other civil society actors and the private sector; gathering, analyzing and reporting on overall performance against intended outcomes; and using outcome monitoring and evaluation information to inform decision-making, making flexible adjustments when necessary and highlighting achievement of results.


USAID, Center for Development Information and Evaluation, www.dec.org/usaid_eval/

USAID's Center for Development Information and Evaluation (CDIE) publishes a wide range of impact evaluations, program and operational assessments, managing-for-results reports, performance monitoring tips and the USAID Evaluation News. This site includes access to more than 60 publications produced by CDIE since 1996. Reports are added as they become available.

W.K. KELLOGG FOUNDATION (1998): Evaluation Handbook, www.WKKF.org

This Handbook is designed for evaluations at project level. It provides a framework for thinking about evaluation as a relevant and useful program tool and outlines a blueprint for designing and conducting evaluations. It also provides basic information to allow project staff to conduct an evaluation without the assistance of an external evaluator.

WOODHILL J. AND ROBINS L. (1998): Participatory Evaluation for Landcare and Catchment Groups: A Guide for Facilitators. Australia

This document provides a model for evaluation and explains how to include community members in the process. It details the setting of project goals, objectives and evaluation questions and indicators as well as providing tips on facilitation and participatory techniques.

WORLD BANK INSTITUTE: Training Evaluation Toolkit (Version 1.3), www.worldbank.org/wbi/

This toolkit, developed by the World Bank Institute Evaluation Unit, is a set of templates and guidelines that enables anyone, with or without prior evaluation knowledge, to conduct a Level 1 training evaluation. The Toolkit can be used to develop questionnaires that elicit participant feedback on training activities, to collect the data and to tabulate the ratings.


WORLD BANK (2002): Monitoring and Evaluation Chapter (draft), Monitoring and Evaluation for Poverty Reduction Strategies, www.worldbank.org/html/oed/evaluation/

The Poverty Reduction Strategy Sourcebook is an evolving document aimed at assisting countries in the development and strengthening of poverty reduction strategies. The purpose of the Monitoring and Evaluation chapter is to provide guidance on developing outcome monitoring systems and impact evaluation strategies. In addition, it addresses how to use monitoring and evaluation results to create a feedback process, how to promote participation in monitoring and evaluation activities, and how to build institutional capacity for outcome monitoring and impact evaluation.

WORLD BANK: Evaluation, Monitoring and Quality Enhancement Community Website, worldbank.org/html/oed

This website contains World Bank evaluation studies—including a range of documented evaluations conducted at country, sector, thematic and impact levels—and reports on best practices and lessons learned. Web links are provided to World Bank monitoring and evaluation handbooks and toolkits on indicators and poverty measurement, and to other evaluation groups in the World Bank.

WORLD BANK (2001): Impact Evaluation, www.worldbank.org/poverty/impact/index.htm

This site is designed to disseminate information and provide resources for people and organizations working to assess and improve the effectiveness of projects and programmes aimed at reducing poverty.

WORTHEN B.R., SANDERS J.R. AND FITZPATRICK J. (1997): Program Evaluation: Alternative Approaches and Practical Guidelines, 2nd Edition. White Plains, NY

This book serves as a reference guide for practicing evaluators and those professionals who want a comprehensive overview of program evaluation and references to additional information. It covers the following topics: evaluation approaches and models; collection, analysis and use of qualitative and quantitative data; practical guidelines for planning, conducting and using evaluations, including checklists and procedural guides; glossary of evaluation terminology.


Annex 7: The PFM Guidelines


The eight PFM Guidelines were prepared as part of the DWAF/Danida PFM Project (2001-2005). The PFM Guidelines aim to empower DWAF staff, the new custodians of the state forests and partners at local level to implement the new DWAF Forestry Vision. The PFM Guidelines are meant to support community upliftment in accordance with the DWAF Criteria, Indicators and Standards for Sustainable Forest Management. Some Guidelines target local groupings, where limited capacity prevails. The Guidelines are available from the Directorate: Participative Forestry in DWAF, Pretoria.

Guideline Descriptions and Main Target Groups

Guideline: Stakeholder Participation
Description: How to mobilise stakeholders at local level and form partnerships and agreements with local user groups/communities.
Main Target Groups: DWAF and the new custodians of state forests, as well as other departments/organisations pursuing participation in natural resource management.

Guideline: Legal Options for Community Partnerships with DWAF Forestry
Description: Legal mechanisms/entities available for local groups to co-operate and form CFAs with DWAF and thus obtain licences to use forests and their products.
Main Target Groups: DWAF and the new custodians of state forests, as well as local groupings (PFM Committees, CBOs, NGOs, clubs, small enterprises, etc).

Guideline: Logical Framework Approach Project Planning
Description: Planning and documenting a project and explaining what a project is, including the major projects funded by donors.
Main Target Groups: DWAF and the new custodians of state forests and local groupings (NGOs, CBOs, Forest User Groups, etc).

Guideline: Sustainable Resource Use
Description: Multiple stakeholder use of indigenous forests through the development of sustainable resource use systems.
Main Target Groups: DWAF and the new custodians of state forests and local groupings (NGOs, CBOs, PFM Committees, Forest User Groups, etc).

Guideline: Project Monitoring and Evaluation
Description: A tool for monitoring and evaluating projects in line with DWAF's new monitoring and regulatory role.
Main Target Groups: DWAF and the new custodians of state forests.

Guideline: Fund Raising for Projects
Description: How to compile a funding proposal and where community structures and other local groupings can apply for funding for forest-related and natural resource management projects. Complements the PFM Guideline: LFA Project Planning.
Main Target Groups: Local groupings (NGOs, CBOs, Forest User Groups, etc).

Guideline: Formation of PFM Forums and Committees
Description: Aspects and procedures of developing local PFM structures and compiling a constitution so that DWAF can liaise and form partnerships with communities through local structures. Supplements the PFM Guideline: Stakeholder Participation.
Main Target Groups: DWAF and the new custodians of state forests and local groupings (NGOs, CBOs, Forest User Groups, etc).

Guideline: Financial Management of Projects
Description: Simple aspects and processes of sound financial management of projects. Many local groupings have limited capacity in this regard and thus cannot apply for project funding.
Main Target Groups: Local groupings (NGOs, CBOs, Forest User Groups, etc).
