FPD M&E Workshop, 5 – 6 November 2014
Introduction to M&E Components of MERP, 5 – 6 November 2014
Sunet Jordaan
Workshop Expectations?
• What do you expect to learn in the next two days?
• Specific needs?
• Key M&E skills you want to acquire?
Overall Learning Goal
• To build participants’ skills in monitoring and evaluation
• Other goals?
Learning Outcomes
• Describe the basic components of M&E
• Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan)
• Apply key elements of MERP
  – Theory of Change Framework and Results Logic Framework
  – Indicators and Indicator Development
  – Data management: data quality and data flow
  – Stakeholder analysis
  – Data use and dissemination plan
Overview of Presentation
• Overarching definition of M&E (what, why, how)
• Theory of Change Framework and Logical Framework
• Indicators
• Data Management (quality and flow)
• Stakeholder Analysis
• Data use and dissemination plan
Name M&E Activities
• Name one M&E activity linked to this project that is already being implemented
  – ART
  – Tier.Net
  – Registers
  – Patients on treatment
  – Viral load, CD4 count
  – Waiting times
  – Follow-up with defaulting patients
  – Health education monitoring
  – Side effects monitoring
  – Compliance
Overview of Learning Outcomes
• Describe the basic components of M&E
• Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan)
• Apply key elements of MERP
  – Theory of Change Framework and Logic Framework
  – Indicators and Indicator Development
  – Data management: data quality and data flow
  – Stakeholder analysis
  – Data use and dissemination plan
What is Monitoring and Evaluation?
• Monitoring:
  – Routine, on-going assessment of activities to provide managers, decision makers and other stakeholders with regular feedback on progress in implementation, results achieved and early indicators of problems that need to be corrected
• Evaluation:
  – Time-bound, periodic assessment that seeks to answer specific questions to guide decisions
Monitoring versus Evaluation
• Monitoring: What are we doing?
  – Tracking inputs and outputs to assess whether the programme is implemented according to plan
• Evaluation: What have we achieved?
  – Assessment of the impact of the programme on behaviour or outcomes
Purpose of M&E
• To provide the data needed to guide the planning, coordination and implementation of the response;
• To assess the effectiveness of the programme; and
• To identify areas for programme improvement.
Purpose of M&E
• Have to measure results to tell success from failure
• Learn from mistakes
• Demonstrate results to the donor
Purpose of M&E: Monitoring versus Evaluation
• What is it?
  – Monitoring: Ongoing collection and analysis of data on progress towards results, changes in the context, strategies and implementation
  – Evaluation: Reviewing what has happened and why, and determining relevance, efficiency, effectiveness and impact
• Why do it?
  – Monitoring: Inform day-to-day decision making, adjust project design and inform planning; accountability and reporting
  – Evaluation: Strengthen future planning; provide evidence of success; deepen understanding of what works
• Who does it?
  – Monitoring: Programme staff, partners and participants
  – Evaluation: External consultants, staff and participants
• When to plan?
  – Monitoring: At design stage
  – Evaluation: Core decisions at design stage, refined along the way
• When to implement?
  – Monitoring: Continuously
  – Evaluation: Mid-term (formative), at completion (summative), after completion (impact)
M&E versus Other Activities (activity: main aim)
• Monitoring & Evaluation: project management, planning and justification
• Research: test hypotheses; develop new knowledge
• Surveillance: disease control and prevention
• Audit: control and proof
M&E in Programme Management
[Diagram with elements: Programme Improvement, Data Sharing, Reporting/Accountability]

M&E Schema
[Diagram with elements: Decision Making Based on Information, Using M&E in Programmes, Methodology, Basic Concepts and Principles of M&E, Data Management]
Data Uses
• Document whether a service is happening, or not
  – Just because it's not recorded doesn't mean it's not happening… but you have to record to know
• Can provide an indication of the quantity and quality of a service
  – Can highlight breaks in continuity of service
  – Can identify 'hot spots' of poor performance
• Can suggest where a problem might be
  – Investigation is needed to confirm
• Can motivate for one action over another
• Can highlight what we're doing right and whether there is improvement
Discussion questions
• Are you involved with data management?
  – At your organisation?
• What do you want to do differently at your organisation after this course?
• Important to do it right!
Overview of Learning Outcomes
• Describe the basic components of M&E
• Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan)
• Apply key elements of MERP
  – Theory of Change Framework and Logic Framework
  – Indicators and Indicator Development
  – Data management: data quality and data flow
  – Stakeholder analysis
  – Data use and dissemination plan
MERP?
• M: Monitoring
• E: Evaluation
• R: Reporting
• P: Plan
Purpose of MERP
• To provide a comprehensive Monitoring and Evaluation Plan for tracking performance and evaluating interventions
• To lay the basis for the design of a monitoring and evaluation system that provides relevant, accurate and timely information for informed decision making
• To describe a system which links strategic information from various systems to decisions that improve a programme
Purpose of MERP
• The system that links strategic information to decisions that will improve programmes
• Ensures accountability and a measure for success
Function of MERP
• State how the programme is going to measure what it has achieved
• Encourage transparency and responsibility
• Guide implementation
• Preserve institutional memory
• Living document: adjust when needed
Steps
• Step 1: Understand your project
• Step 2: Theory of Change Framework
• Step 3: Results Logic Framework
• Step 4: Data Management
Step 1: Understand your project
• Understand obligations
• Research the context
• Consult with programme team
• Understand your budget, operations, infrastructure and human resource capacity
Step 1: Understand your project
• Fit to your programme and need
• Fit capacity, stakeholder requirements and data needs
• Understand obligations (contract, programme, etc.)
• Be flexible
• Keep it simple!
Step 1: Understand your project
• Research the context:
  – Consult experts
  – What was done before?
  – Benchmarks and indicators?
Step 1: Understand your project
• Consult with programme team
  – Define the problem and the desired change
  – Create a project management plan
Step 1: Understand your project
• Understand your budget, operations, infrastructure and human resource capacity
Elements for MERP
• Brief project description
• Purpose of M&E plan
• Brief history of M&E plan development
• Evaluation framework
• Indicator system
• Information system (data sources)
• Impact evaluation design
• Dissemination and utilisation plan
• Possible adjustments to M&E plan
• Example of a MERP
Overview of Learning Outcomes
• Describe the basic components of M&E
• Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan)
• Apply key elements of MERP
  – Theory of Change Framework and Logic Framework
  – Indicators and Indicator Development
  – Data management: data quality and data flow
  – Stakeholder analysis
  – Data use and dissemination plan
Step 2 and Step 3
• Theory of Change Framework
• Logical Framework
Theory of Change Framework
Logical Framework
Theory of Change and Results Logical Framework
• Show how an organisation functions:
  – Theory and assumptions of a programme
• Creates a road map:
  – Links outcomes (short and long-term) with programme activities and processes and the theoretical assumptions/principles of the programme
Theory of Change and Logical Framework
• Shows where the programme fits into the wider context
• Shows relationships
• Guides identification of indicators
• Guides impact analysis
Theory of Change
Theory of Change examples
• CBCT Programme
• Reduce loss-to-follow-up patients
• UJ BCURE: Using research evidence for policy making
Step 2: Theory of Change
• It locates a programme or project within a wider analysis of how change comes about.
• It draws on external learning about development.
• It articulates our understanding of change, but also challenges us to explore it further.
• It acknowledges the complexity of change: the wider systems and actors that influence it.
• It is often presented in diagrammatic form with an accompanying narrative summary.
• http://policyimpacttoolkit.squarespace.com/theory-of-change/
Logical Frameworks
• Foundation for M&E frameworks
• Outlines the hierarchy and relationship between project inputs, outputs, outcomes and impact
Logical Frameworks
Activities → Output → Outcome → Impact
Logical Frameworks: E.g.
• Activities: Provide quality CHW training
• Output: CHWs provide better health services to the population
• Outcome: More people accessing VCT services
• Impact: Improved health outcomes at national level (this chain is expressed as data in the sketch below)
Theory of Change Framework
Logical Framework
Logical frameworks
• Examples
  – CBCT Logical Framework
  – SIDA Results Logical Framework
  – SIDA Log Frame for Planning
Group Work
• Develop a Logical Framework for the Diabetes Project
  – Diabetes Awareness Project in Daveyton
  – Activities:
    • Diabetes awareness talk
    • Testing for diabetes
    • Novo Nordisk Diabetes Bus
Results
• Broad term used to refer to the effects of a programme
• The most ambitious outcomes planned: this is what you will be held accountable for
Overview of Learning Outcomes
• Describe the basic components of M&E
• Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan)
• Apply key elements of MERP
  – Theory of Change Framework and Logic Framework
  – Indicators and Indicator Development
  – Data management: data quality and data flow
  – Stakeholder analysis
  – Data use and dissemination plan
Indicators
• What is an indicator?
Indicators
• If you can't measure it, you can't manage it
• Definition:
  – Measurable and verifiable proof of an action or result
  – A quantitative or qualitative variable (something that changes) that provides a simple and reliable measurement of one aspect of performance, achievement or change in a programme or project
• An indicator should be directly related to the programme or project objective to be measured, with no overlap with other indicators
Indicators
• Measure change: directly or indirectly
• Measure trends over time
• Measure progress towards defined targets and/or desired outcomes
• Provide information about a broad range of conditions through a single measure
Indicators
• Reduce a large amount of data to its simplest form
• Help direct resources to the areas where needs are greatest and where health care system strengthening is optimal
• Provide evidence of achievement (or lack thereof) of results and activities (comparisons)
Indicators
• Indicators follow the hierarchy of results in the logical framework:
• Process indicators:
  – Measure the completion of activities
• Impact indicators:
  – Measure achievement of change
  – Outcomes
Difference between ‘results areas’ and ‘indicators’
Results areas
• Broad statement/overarching idea about what you would like to change
• Uses words like: increased, decreased, improved, etc.
• Layman's terms for what you want to change; justifies your reason for intervention

Indicators
• Individual statements of that idea that are measurable and verifiable
• Uses words like: number, percentage, rate…
• M&E terms for what you intend to measure in order to document that the desired change has been achieved
Indicator Structure
Tools for defining indicators
• Review existing indicators and resources from your own country and international bodies (WHO, UNAIDS, MDG, PEPFAR)
• Align (as much as possible) with existing indicators (also align with existing data management structures)
• Indicator reference protocols:
  – Grouped by programme area and/or goal
  – Unique identifier
  – Definition (inclusion/exclusion criteria)
  – Disaggregation components
  – Data quality concerns
  – Reporting frequency
SMART and RAVESS Indicators
SMART
• S: Specific
• M: Measurable
• A: Achievable
• R: Realistic
• T: Time-bound

RAVESS
• R: Reliable
• A: Appropriate
• V: Valid
• E: Easy
• S: Sensitive
• S: Specific
Developing SMART Indicators
• How to measure your progress
Indicator Selection Rules
• They should be the optimum set that meets management needs at a reasonable cost
• Limit the number of indicators used to track each objective or result to a few (2-3)
• Select only those that represent the most basic and important dimensions of your objectives
• Select indicators that you can measure!
Indicator Components
• Name of indicator
• Description/definition
• Unit of measurement
• Data source (primary/secondary)
• Baseline/target values by year
• Frequency of data collection
• Responsibility
• Reporting plan and frequency
(These components are captured in one record in the sketch below.)
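As a rough illustration of how these components can be kept together in one place, the sketch below defines a single indicator record in Python. The field names and example values are assumptions for illustration only, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One indicator record; fields mirror the components listed above."""
    name: str
    definition: str            # description, inclusion/exclusion criteria
    unit: str                  # e.g. number, percentage, rate
    data_source: str           # primary or secondary
    baseline: float
    target: float              # target value for the reporting year
    collection_frequency: str  # e.g. monthly
    responsible: str           # who collects and reports
    reporting_frequency: str   # e.g. quarterly

# Hypothetical example, loosely based on the diabetes awareness group work
tested_for_diabetes = Indicator(
    name="Number of people tested for diabetes",
    definition="Unique clients tested at project sites; repeat tests excluded",
    unit="number",
    data_source="Site testing register (primary)",
    baseline=0,
    target=500,
    collection_frequency="monthly",
    responsible="Site M&E officer",
    reporting_frequency="quarterly",
)
```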
Priority Indicators
3 Questions when designing indicators:
1. Can I, and will I, use the information collected by this indicator?
2. Does the value outweigh the effort of data collection?
3. Will knowing this information (and using it) improve the quality of my programme?
Overview of Learning Outcomes
• Describe the basic components of M&E
• Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan)
• Apply key elements of MERP
  – Theory of Change Framework and Results Logic Framework
  – Indicators and Indicator Development
  – Data management: data quality and data flow
  – Stakeholder analysis
  – Data use and dissemination plan
Reality to Action
Real World → (collection and coding) → Data → (processing, interpretation, presentation) → Information → (politics, commitment) → Action
Data Management
• Refers to the process of moving data from collection to collation to analysis to reporting
• Data management can be paper-based or electronic
• Comprises three focus areas:
  – Data quality
  – Data sources
  – Data flow
• Good data management processes mean that data are translated into information in the time and format required for use and the generation of knowledge
What is Data Quality (DQ)?
• Refers to the worth/accuracy of the information collected
• How well do the data reflect 'true performance'?
• Is a direct result of data management (DM): poor DM → poor DQ
• Measured by five components:
  – Validity
  – Reliability
  – Timeliness
  – Precision/accuracy
  – Integrity
Common Problems with DQ
• Incomplete or missing data
• Getting data collected, collated or analysed quickly enough (e.g. census statistics)
• Getting data in a timely manner
• Getting honest information
Problems with DQ
Technical Factors
• Standard indicators
• Data collection forms
• Appropriate IT
• Data presentation
• Trained people
Problems with DQ
System and Environmental Factors
• Resources
• Structures of the health system
• Roles and responsibilities
• Organisational culture
Problems with DQ
Behavioural Factors
• Motivation
• Attitudes and values
• Confidence
• Sense of responsibility
Factors Affecting Data Quality
5 Components of Criterion-Based Evaluation of Data Quality
• Validity
• Reliability
• Timeliness
• Precision/accuracy
• Integrity
Measure of Validity
• Good validity:
  – Measure what you intended to measure
• Risks to validity (if we fail to):
  – Understand the definition of the indicator in the context of the project (e.g. appropriate training and support defining exactly what we want to measure)
  – Recognise the data that must be included in/excluded from a data set (e.g. inclusion and exclusion criteria)
  – Recognise where we measure the data (e.g. which tools collect the data, who is responsible, etc.)
  – Clicks?
Measure of Reliability
• Good reliability:
  – Consistently collect data of the same quality over time
  – Trust in the data
• Reliability = pre-requisite for validity
• Risks: if we fail to identify that:
  – The system doesn't work: gaps in data collection, unclear roles, not reporting what we need to
  – Collection instruments allow for variation over time and place: unclear understanding of how to use registers/forms
  – New people (inadequate mentoring and training)
Measure of Timeliness
• Good timeliness: we collect, collate and report data which still have the desired relevance at time of reporting
• Data arrive in time to be used for evidence-based planning and decision-making
• Frequency: frequent enough to inform programme management decisions (e.g. monthly statistics)
• Currency: data are reported as soon as possible after collection (this month's statistics reflect last month's activities); a simple currency check is sketched below
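Currency can be checked mechanically once an allowed reporting lag is agreed. The sketch below is a minimal, hypothetical Python check; the one-month lag threshold is an assumption, not a rule from the MERP.

```python
from datetime import date

def is_current(report_month: date, activity_month: date, max_lag_months: int = 1) -> bool:
    """True if the activity month falls within the allowed reporting lag.
    The default one-month lag is an illustrative assumption."""
    lag = ((report_month.year - activity_month.year) * 12
           + (report_month.month - activity_month.month))
    return 0 <= lag <= max_lag_months

# October activities reported in November are current; reported in February they are not
print(is_current(date(2014, 11, 1), date(2014, 10, 1)))  # True
print(is_current(date(2015, 2, 1), date(2014, 10, 1)))   # False
```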
Measure of Precision/Accuracy
• Good precision/accuracy:
  – We work to make our data as free from bias (accuracy) and error (precision) as possible, and give an indication of the magnitude of the risk (how big is it) and the direction of the error (under- or over-reporting)
• Precision/accuracy = pre-requisite for validity
• Sources of error/bias:
  – Instruments used for collection, collation, manipulation and storage produce error or bias (e.g. a corrupted spreadsheet, a formula that is no longer accurate)
  – Data are biased by time, place, person or under-/over-reporting errors (e.g. the person responsible for the register is not there)
  – Transcription errors (e.g. data capturer entered 29 instead of 92)
  – Are you using precise information?
Measure of Integrity
• Data integrity:
  – Truthfulness of the data (good or bad)
• Poor integrity arises from:
  – Human error: oops! (e.g. did not count one page of the register)
  – Actual human interference: "Don't report that, it looks bad" (e.g. too little data on TB symptom screening, so an approximation is used instead)
  – Manipulation of data: if we over-report, it looks as if we are doing our work
  – Technology failure: viruses, computer crashes, etc.
• Integrity can be compromised during data collection, cleaning, handling and storage due to a lack of proper controls
So, what is data quality?
• Good data quality has:
  – Validity: measures what you want to measure
  – Reliability: consistent and uniform data management methods over time
  – Timeliness: consistent with deadlines
  – Precision/accuracy: minimal error/bias
  – Integrity: free of human error or manipulation
Managing Data Quality
• Drive data use at all levels
• Less is more
• Critically assess your indicators for risks to data quality
• Pilot/test your M&E systems: never just assume that it will work
• Collaborate with the users of data
• Build checks and balances into your data management process to identify potential DQ issues quickly (see the sketch below)
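One way to build such checks into the data management process is to validate each record as it is collated. The sketch below is a hypothetical Python example: the field names, required fields and ranges are assumptions chosen for illustration, not part of any specific register.

```python
def check_record(record, required_fields, valid_ranges):
    """Flag missing fields and out-of-range values in one data record."""
    issues = []
    for name in required_fields:                    # completeness check
        if record.get(name) in (None, ""):
            issues.append(f"missing: {name}")
    for name, (low, high) in valid_ranges.items():  # simple validity check
        value = record.get(name)
        if value is not None and not (low <= value <= high):
            issues.append(f"out of range: {name}={value}")
    return issues

# Hypothetical monthly register entry: more positives than tests should be flagged
entry = {"month": "2014-10", "clients_tested": 120, "clients_positive": 150}
print(check_record(
    entry,
    required_fields=["month", "clients_tested", "clients_positive"],
    valid_ranges={"clients_positive": (0, entry["clients_tested"])},
))
# -> ['out of range: clients_positive=150']
```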
Data Management Tools
• Look at existing tools (DoH, WHO, UNAIDS, partners)
  – Align with existing data management systems (e.g. DHIS, Tier.Net, DoH registers)
  – Only implement new tools if existing tools cannot meet the new data requirements
Information Systems
• Must have information systems for collecting data based on the indicators
• Collect, process, analyse and report based on that data
• Various tools
Data Sources
• Files, registers, tick sheets, notification sheets
• Surveys (telephonic, self-administered, face-to-face, satisfaction)
• Questionnaires
• Interviews
• Focus groups
• Lab results
• Databases
• Stats SA, HSRC surveys
• DHIS
Evaluation Design
• Experimental vs. observational
• Quasi-experimental
• Based on implementation plan
Data Flow
‘Of all things that flow, data is not one of them’
Data Flow
• Data maps show the flow of indicators through data gathering forms and report formats, and how they are connected
• Data maps ensure there is a process for collecting data for the indicators listed in the project proposal
• Depending on the scale and complexity of the project, there may be several data flow maps (a minimal example follows below)
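A data flow map does not need special software: even a simple lookup from indicator to collection tool, collation step and report makes gaps visible. The sketch below is a hypothetical Python example; the indicator and tool names are invented for illustration.

```python
# Hypothetical data flow map: indicator -> where it is captured, collated and reported
data_flow = {
    "Number of people tested for diabetes": {
        "collection_tool": "Site testing register",
        "collation": "Monthly site summary sheet",
        "report": "Quarterly donor report",
    },
    "Number of diabetes awareness talks held": {
        "collection_tool": "Event attendance form",
        "collation": "Monthly site summary sheet",
        "report": "Quarterly donor report",
    },
}

# Check that every indicator in the proposal has a mapped flow
proposal_indicators = [
    "Number of people tested for diabetes",
    "Number of diabetes awareness talks held",
]
missing = [i for i in proposal_indicators if i not in data_flow]
print("Indicators without a data flow:", missing)  # -> []
```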
Data Flow by Person
Data Flow: Data Quality Steps
Data Flow between partners
• Important: how should data be communicated between partners?
• What is the situation in your organisations?
• Discuss
Data Quality Management Plan
• Criterion-based assessment of indicators
+
• Data flow: who does what, risks to DQ per data flow step, ensuring data are being used
=
• Tool development (definitions, quality checks, feedback format)
• Training interventions
• General monitoring feedback
• Data feedback format (general performance against targets and data quality graphs; see the sketch below)
• Data use forums and demand for quality data
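For the data feedback format, performance against targets can be calculated the same way every reporting period. The sketch below is a minimal, hypothetical Python example; the indicator names, actuals and targets are invented.

```python
def percent_of_target(actual, target):
    """Percentage achievement against target, for a simple feedback graph."""
    return 0.0 if target == 0 else round(100 * actual / target, 1)

# Hypothetical quarterly results: (actual, target) per indicator
results = {
    "People tested for diabetes": (380, 500),
    "Awareness talks held": (12, 10),
}
for indicator, (actual, target) in results.items():
    print(f"{indicator}: {percent_of_target(actual, target)}% of target")
```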
Framework for Enhancing Data Quality
Building sustainable M&E Systems
• Alignment of tools based on users'/beneficiaries' data needs
• Systems to detect data quality issues
• Timely and relevant feedback structures on performance (based on data) and DQ concerns
• Training of tool users on data use, data quality and data management
→ A sustainable M&E system
Overview of Learning Outcomes
• Describe the basic components of M&E
• Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan)
• Apply key elements of MERP
  – Theory of Change Framework and Logic Framework
  – Indicators and Indicator Development
  – Data management: data quality and data flow
  – Stakeholder analysis
  – Data use and dissemination plan
Stakeholder Analysis & Data Use and Dissemination Plan
Stakeholder Management
A stakeholder:
• Is anyone who has reason to have an interest in your project;
• Can have an interest in and/or participate in the project, either in a direct or indirect manner; and
• May act independently, or represent groups.
Stakeholder Management
Stakeholders include, for example:
• The Project Manager: you;
• The Customer: the person/organisation paying for the project;
• The Sponsor: generally the person who assigned the Project Manager the responsibility for the project, often an individual at the senior or executive management level of the organisation;
• The Manager in your organisation who is required to provide approvals for specific actions/tasks;
• The Project Team Members; and
• All external role players, e.g. regulatory authorities, government, trade unions, traditional leaders, NGOs, FBOs, etc.
Tip: Manage your stakeholders: Inform, inform and inform
Stakeholder Management
From: www.codeproject.com/KB/architecture/projectmgmt_Pt2.aspx. Accessed 25 September 2010.
Stakeholder Analysis
• Who has a vested interest in your project?
• What are their data requirements?
• Why do they want that data?
• What should the data format look like?
Stakeholder Analysis
Data Use and Dissemination Plan
• Reports (examples)
• Evaluation assessments
• Logical Frameworks
• Add to database
• Presentations
• Research?
  – Publish
  – Credit to co-partners
• Discuss the above aspects in the group
Tips: Stakeholder Analysis & Data Use and Dissemination Plan
• Defines the focus of data collection: data that can and will be used (in other words, data needs)
• Streamline and standardise as much as possible: minimise parallel systems and redundancies as much as possible
• Consult stakeholders and review the draft analysis and data use and dissemination plan
Case Study: Stakeholder Analysis and Data Use and Dissemination Plan
• As a group, identify a stakeholder
• Complete the stakeholder analysis for that stakeholder (look at the big picture)
• 15 minutes for discussion, 10 minutes for presentation