Outcome Evaluation: Logic Models, Evaluation Frameworks, & Measuring Indicators
Session #2/3
What to Expect from this Series
1. Learn about steps required to implement outcome evaluation
2. Design a program logic model
3. Identify appropriate evaluation/measurement tools
4. Develop an implementation plan
A note about the nature of evaluation…
Purpose of Today’s Session
• To complete an individualized, workable program logic model for a program of your choice
• To begin designing individualized evaluation plans based on your logic model
• To begin identifying indicators and potential evaluation measurement tools
Today’s agenda
• Answering questions and refining logic models
• Break
• Analyzing logic models and identifying priority evaluation questions
• Lunch
• Explaining indicators and measurement
• Beginning to draft indicators for your programs
• Trying to wrap up by 3 so that we can save time for other questions or one-on-one discussions at the end of the day
Logic Models: Responding to the Online Exercise
• Great start!
• Varying stages
• Tips
What did YOU think of the online exercise?
Logic Models: Reactions to your efforts thus far
Challenges and Issues
• Balancing the logic of program development and promotion with the logic of the program itself
• Finding the right balance for short-term and long-term outcomes
• Differentiating program process objectives from outcome objectives
• Finding outcomes for individualized programs or programs for people in crisis
Balancing the logic of program development and promotion with the logic of the program itself
Sample Activities:
• Promote the program to partners
• Provide support and information about the Canadian system to clients
• Secure safe and accessible community-based locations
• Determine appropriate community volunteer leader to oversee each program location
Suggested Edit:
• Divide the logic of program design and implementation from the logic of the program itself
Finding the right balance for short-term and long-term outcomes
• Try to follow the “logical chain” from top to bottom and look for big logical leaps. Try to identify the unstated assumptions that underlie these leaps.
• Look for outcomes that are largely within your control.
Finding the right balance for short-term and long-term outcomes
[Diagram: example logic model chain]
Activity: Home visits
Short-term outcomes: Families’ experiences are affirmed and normalized; families have increased knowledge of local recreation opportunities
Intermediate outcomes: increased recreation and leisure opportunities; increased opportunity for participation in community
Long-term outcomes: decreased feelings of depression; increased acceptance & understanding of accessibility needs of [vulnerable population]
(“? ?” in the diagram marks unstated logical leaps in the chain)
Dealing with programs that have individualized activities and outcomes
Sample Activities:
• Resident care
• Provide short-term counselling
• Provide settlement and adaptation support to participants
Suggested Edits:
• Break down into more concrete elements
• Describe in a more behavioural way
How do we identify outcomes when services are highly individualized?
• Look for more immediate short term outcomes that may be common to more clients
• Try to categorize the most typical activities and outcomes into broad groups …even the most individualized services tend to have a cluster of typical activities
• Choose to emphasize those outcomes that will appeal to key audiences
[Diagram: logic model with activities, short-term and long-term outcome objectives]
Activities:
• Develop and support clients’ individual rehab plans
• Provide clinical support and interventions
• Assess and develop life skills
Short-term outcome objectives:
• Clients maintain or improve their level of function
• Clients become aware of and/or are connected to community resources
• Clients gain insight into healthier lifestyles/choices
• Clients are able to find and/or maintain accommodation of their choice
• Clients have improved informal supports
Long-term outcome objectives:
• Clients become more productive, independent members within their community
• Reduced # of hospitalizations or shortened lengths of stay
• Clients have longer community tenure
Example of a well-done logic model for a program with individualized outcomes
This example works well because…
• The outcome objectives are diverse, so they explain the full impact of the program.
• The short-term outcome objectives are clear, concrete, achievable and measurable.
• The links are differentiated well (i.e., every activity doesn’t link to every outcome).
• Certain short-term outcomes emerge as pivotal to the overall logic.
How do we identify outcomes when services are crisis response-oriented?
• Think “Harm Reduction” or “Secondary Prevention”: Identify the ways in which your intervention reduces negative consequences of crises, or reduces risk of further crisis
• Think of the “next steps” your clients can take once stabilized as outcomes
• Think of very, very short-term outcomes like, “client feels a sense of dignity”
[Diagram: logic model with activities, short-term and long-term outcome objectives]
Activities:
• Weekly self-help group for women
Short-term outcome objectives:
• Women develop friendships in the group
• Women feel less fragile and are not dealing with daily crises
• Women feel supported in the decisions they make
• Women support each other outside of the group
• Women articulate expectations of their partners
• Women access resources
Long-term outcome objectives:
• Women are able to evaluate the relationship with their partner and make decisions about the relationship
• Family members reconstruct a support system and maintain family unity
• Partner is involved as part of family decision making
Example of a well-done logic model for a crisis oriented program
This example works well because…
• The short-term outcomes don’t imply dramatic change in the lives of participants, which is appropriate for a program dealing with women in near crisis situations.
• Even the long-term outcomes are fairly modest and achievable (which is good for the same reason).
• The model makes a strong case for having an impact on one aspect of a complex social issue.
Other Questions?
Time to work a bit more on polishing logic models
MORNING BREAK
Feeding Back
• What insights are starting to emerge from your model?
Review of Logic Models: Prioritizing Objectives for Measurement
Discussion Exercise
The importance of the arrows in a logic model
• The arrows in a model are like your hypotheses. They express your ideas about the cause-and-effect linkages within your program.
• Most arrows are based on certain assumptions. Some are conscious; others are probably not so conscious.
Analyzing your Logic Model
• If you follow each logical pathway through from activities through short-term outcomes to long-term outcomes, does the logic “hang together” or does it feel like there are unstated assumptions?
• Where might data gathering help you to check these assumptions? (make a note)
Analyzing your Logic Model
• Are there any activities that don’t seem to link to any identified outcomes, or vice versa?
• Which logical links do you feel sure are actually occurring in your program? Which ones are you less sure about?
• Make a note about those links that you feel you could document better.
Analyzing your Logic Model
• Which outcomes seem “pivotal” in your model? Which ones are absolutely key to overall success?
• Which ones do outsiders least understand or appreciate?
• What are the issues or questions that your key stakeholders are going to see as most important?
• Make a note
Analyzing your Logic Model
• Where does your logic model “link up” with the overall priorities of your board, your clients, your funders?
Translating key issues or unknowns or assumptions into research questions
Clarifying the levels of questions
Evaluation Purpose: To determine how effective our program is at helping people find meaningful jobs
Evaluation Questions:
• How many of our clients find jobs?
• How satisfied are clients with their jobs?
• What parts of our program seem to help the most?
Actual Interview or Survey Items:
• For clients: “How much do you enjoy your job?”
• For staff: “How satisfied does your client seem?”
Definitions: Outcome Measurement Questions
• the questions we need to ask to assess the attainment of each outcome objective
• the most central question is most often a rewording of the outcome objective
• other evaluation questions may ask about how the program/activity was implemented, problems encountered, lessons learned, changes made in program implementation, or costs associated with achieving the outcome objective
Types of Evaluation Questions
Needs Assessment Questions (What are local needs? local strengths? What are some good ideas for trying to help?)
Process Questions (Was program implemented as planned? What’s working? What has been learned?)
Satisfaction Questions (Are stakeholders pleased?)
Outcome Measurement Questions (Have we made a measurable difference?)
Economic Analysis Questions (Are the outcomes worth it?)
Exercise: Comparing Questions
How effective is our program in fostering independence, increasing consumer control, and improving nutrition?
What are our strengths and weaknesses?
Is our program community based?
Does our program improve quality of life?
What would our stakeholders like us to change, and what should we keep the same?
How successful have we been in involving participants at all levels of the project?
A Framework for Outcome Measurement
Evaluation Design Planning Chart
• Objective: the outcome objective from your logic model
• Research Questions: may refer to success in meeting objectives (from your logic model) or other questions stakeholders feel the evaluation should address
• Indicators: things you can observe that will help you to answer the question
• Where will we get the information?: who will you speak to? How would you access existing information?
• What data collection tools will we use?: given what is written in the columns to the left, what method or methods are most efficient and effective?
• Data Analysis: how are you going to make sense of the data you collect?
Example row:
• Objective: Increased feelings of social support among participating parents
• Q1: Have parents’ perceptions of social support changed as a result of participation in programming?
• Q2: How isolated are the parents we serve? Do they need more social support?
Drafting Evaluation Questions: Exercise
• Considering the outcome objectives you chose from your logic model, and;
• Considering the stakeholders involved, and;
• Considering how you want to use the evaluation results …
• What are the core questions that will guide your evaluation?
Group Discussion
• What are some of the key questions or issues you have decided to focus on?
• How did you decide which objectives to focus on? What were some of your criteria?
Some critical comments on logic models
If we have time ….
A logic model…
• can come in many different formats. There is no “correct” way to format models although some ways are better than others. As a rule, models are more useful if they are clear about logical linkages and proceed temporally from what you do to what you wish to accomplish.
• is not an outcome but a process. You can expect to go through many iterations of your model.
• is only useful insofar as it is used.
Different Approaches to Logic Models
Activities
Short-term Outcomes
Long-Term Outcomes
Goals or Vision
Principles
Target Groups
Process, implementation, service objectives
Additional Info on process
Implementation Steps
Targets
Indicators
Inputs
Outputs/Benchmarks
Additional Info on Measurement
Intermediate Outcomes
Limitations of logic models
• Logic models do not do a good job of capturing program context and program process
• They can sometimes take on a life of their own. Because they purport to describe the program, people may continue to assume that it is the best rendering of what actually occurs and what the objectives are.
• They may contain outcome objectives that are logical but too difficult to evaluate for a variety of reasons.
• This raises the question: should an organization include outcome objectives in its model if it fully expects to be unable to evaluate them?
Should PLMs include outcome objectives that can’t or won’t be measured?
[Diagram: Activity → Behaviour Change? → Attitude Change? → Societal Change?]
As we move down the logical chain of the program it gets more and more difficult to evaluate our objectives. This is due to many factors including:
• time
• resources
• funding and timing of funding
• multiple causal factors outside control of program
Do we still include our longer-term objectives?
Group Discussion
• What was the hardest part about drafting evaluation questions?
• Did thinking about your evaluation questions lead to any insights about your objectives?
LUNCH
Indicators
Objectives and evaluation questions lead to a variety of indicators
Definitions: Indicators
• empirical (observable) bits of information that are used to determine “how we will know” whether the outcome objectives are being achieved
• measurable approximations of the outcomes you are trying to achieve
Example:
• Objective: Increased social support among participants
• Indicator: client perceptions of social support
Review: A Good Indicator …
• is “empirical” in nature
• is pragmatic
• is as “close” as one can get to the phenomena of interest
• helps to further “operationalize” an objective
Review: A Good Indicator …
• leads to good measurement and methodology (e.g., measures degrees of change and not just presence or absence)
• is not necessarily quantitative
Example: client perceptions of social support
Review: Sources of Indicators
• Research literature
• Data already gathered for other purposes
• Observations
• Perceptions of key informants
• Standardized measurement tools
Choosing Indicators: An Exercise
The Main Message About Indicators:
• no hard and fast rules about how to do them well
• strategic combination of 2 or 3 imperfect indicators often works best
Good measurement is often an act of creative compromise and balance
Writing out indicators for your own objectives: Exercise
• Use the indicator planning sheet
• critically reflect on indicators if you already have them: what are you relying on now?
• refer to handout of good example indicators
Indicator development exercise
Evaluation Question: Has the switch to a de-centralized model for Ontario Works intake made the system more accessible?
Relevant indicators from information you already gather:
• % of applicants screened out
• # of complaints received
Possible new quantitative indicators (numbers, percentages, rates of change, etc.):
• % of callers who hang up while on hold (old system vs. new)
• # of legal challenges to screening decisions
Possible new qualitative indicators (stories, photos, examples, case studies):
• Caseworkers’ opinions about the quality of client information sent to them by intake
• Client feedback on access
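One of the quantitative indicators above can be sketched as a small calculation. This is a hypothetical illustration: the hang-up and call counts below are invented, not data from the exercise.

```python
# Hypothetical sketch: % of callers who hang up while on hold,
# old (centralized) system vs. new (de-centralized) system.
# All call counts are invented for illustration.
def abandon_rate(hangups: int, total_calls: int) -> float:
    """Percentage of callers who hung up while on hold."""
    return 100.0 * hangups / total_calls

old_rate = abandon_rate(hangups=180, total_calls=1200)  # old centralized intake
new_rate = abandon_rate(hangups=95, total_calls=1250)   # new de-centralized intake

print(f"old: {old_rate:.1f}%  new: {new_rate:.1f}%  change: {new_rate - old_rate:+.1f} pts")
```

The value of stating the indicator this concretely is that it forces the question of what counts as the numerator and denominator before any data are collected.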
A Framework for Outcome Measurement
(We will go through the elements of this table one by one throughout the day)
Evaluation Design Planning Chart (column definitions as on the earlier framework slide)
Example row:
• Objective: Increased feelings of social support among participating parents
• Research Question (Q1): Have parents’ perceptions of social support changed as a result of participation in programming?
• Indicator: % of parents who feel they have social support (e.g., scores on the ASSQ)
• Source of information: parents
• Data collection tool: survey (at intake, 3 months and 6 months)
• Data Analysis: pre-post statistical tests
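The chart names “pre-post statistical tests” as the analysis step without specifying one. A common choice for intake-vs-follow-up scores from the same parents is a paired t-test; the sketch below is a minimal, stdlib-only illustration in which all scores are invented, and the critical value (df = 9, alpha = 0.05, two-tailed) comes from a standard t-table.

```python
# Hypothetical sketch of a pre-post analysis: a paired t-test comparing
# parents' perceived social-support scores at intake vs. 6-month follow-up.
# All scores are invented for illustration.
from math import sqrt
from statistics import mean, stdev

intake = [12, 15, 9, 14, 10, 11, 13, 8, 12, 10]      # scores at intake
followup = [15, 16, 12, 14, 13, 12, 16, 10, 14, 13]  # same parents at 6 months

diffs = [f - i for f, i in zip(followup, intake)]     # paired differences
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Two-tailed critical value for df = 9 at alpha = 0.05 (from a t-table)
T_CRIT = 2.262
print(f"t = {t_stat:.2f}; significant at 0.05: {abs(t_stat) > T_CRIT}")
```

In practice a statistical package would also report an exact p-value and effect size, but the core of the test, pairing each parent’s two scores, is what the sketch shows.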
Group Discussion
• What are some examples of indicators (or sets of indicators) you think are good? What makes them good?
• What are some examples of indicators (or sets of indicators) you’re not so sure about?
• What are you relying on now?
[Diagram: ONE MORE TRY: the outcome measurement decision-making tree]
You want to do more outcome measurement.
• Is outcome measurement the next step? NO → needs assessment, program design, planning. YES ↓
• Is there buy-in? NO → awareness raising & team building; recruitment of additional people. YES ↓
1. Develop logic models & evaluability assessment
2. Develop purpose and questions
3. Identify indicators
4. Bring together stakeholders
5. Agree on purpose and questions, set roles, talk about use of findings
• Is there a fit with resources? NO → revise plans, prioritize, discuss with funders. YES ↓
6. Develop workplan
Design: Balancing Rigour and Credibility with Practical Limitations
Design
• Once you have clear questions and some good indicators for each, the next step is to develop a plan for gathering the information
• Design involves balancing two competing demands: practicality and rigour
Easy ways to “beef up” methodological rigour
Strategy → Rationale
• Incorporate more descriptive, behavioural questions → less vulnerable to social desirability bias; easier to understand
• Incorporate a greater variety of data types → allows you to “back up” and strengthen findings
• Narrow your focus with research questions → measure a few things well
• Measure at regular intervals → create your own “comparison groups”
• Add follow-up measures to time-of-intervention measures → allows for tracking of a greater range of outcomes; opens up the possibility of failure
• Find a comparison group → isolate the contribution of your program
Easy ways to make your design more practical & efficient
• Narrow your focus with research questions
• Compare the “return” on different measurement choices
• Use indicator lists to minimize the length and intrusiveness of surveys
• Stop gathering data you don’t use
• Invest in buy-in, and save time on data collection. Get input from stakeholders at each stage. Pilot test!!!!
Running Example Revisited
Evaluation Design Planning Chart
(column definitions as on the earlier framework slide)
Row 1:
• Objective: Increased feelings of social support among participating parents
• Research Question (Q1): Have parents’ perceptions of social support changed as a result of participation in programming?
• Indicator: % of parents who feel they have social support (e.g., scores on the ASSQ)
• Source of information: parents
• Data collection tool: parent survey (at intake, 3 months and 6 months)
• Data Analysis: pre-post statistical tests
Row 2:
• Research Question (Q2): How isolated are the parents we serve? Do they need more social support?
• Indicators: parent self-reports of level of isolation; family or professional key informant assessment of level of isolation of parents
• Sources of information: parents; family member/professional service provider
• Data collection tools: parent survey (at intake); parent focus group; key informant interviews
Design Tips
Be flexible and responsive:
• design an evaluation that “fits” the needs of the target populations and other stakeholders
• gather data relevant to specific questions and project needs
• be prepared to revise evaluation questions and plans as project conditions change
• be sensitive to cultural issues in the community
• know what resources are available for evaluation and request additional resources if necessary
• be realistic about the existing capacity of the project
• allow time to deal with unforeseen problems
Design Tips
• Collect and analyze information from multiple perspectives.
• Always return to your evaluation questions: the more closely you link your evaluation design to your highest-priority questions, the more likely you are to address them effectively.
Basic Data Collection Methods
• Interview
• Focus Group
• Satisfaction Survey
• Analysis of client records/administrative data
Pros and cons of popular measurement techniques
(listed roughly from less to more rigour)
• Retrospective reflections or stories
• Self-report (interview or survey)
• Peer/family/worker report
• Direct observation
• File review
• Clinical assessments