Session 3. Implementing Outcome Evaluation (Part Two): Practical Implementation of Your Evaluation Framework

Page 1:

Session 3

Page 2:

Implementing Outcome Evaluation (Part Two): Practical Implementation of Your Evaluation Framework

• At this workshop, we will help you design useable measurement tools and address the practical issue of developing an organizational structure that allows you to implement and sustain your evaluation.

Page 3:

What to Expect from this Series

1. Learn about steps required to implement outcome evaluation

2. Design a program logic model

3. Identify questions and indicators

4. Identify appropriate evaluation/measurement tools

5. Develop an implementation plan

Page 4:

From outcomes, to questions, to indicators, to methods

Did our program have an impact?

• What were we trying to change? (Long-Term Outcome Objectives)

• What particular contribution were we going to make to that change? (Short-Term Outcome Objectives)

• How were we planning to make that contribution? (Activities)

• What sorts of things would we see if the expected change was happening? (Indicators)

• How can we document those observations in a systematic way? (Methods)

Page 5:

Today:

• FIRST, figure out what the implications of your indicator exercise are for you

• SECOND, prioritize the possible actions, and

• THIRD, come up with a viable action plan

Page 6:

A Framework for Outcome Measurement

(We will go through the elements of this table one by one throughout the day)

Evaluation Design Planning Chart

The chart has six columns:

• Objective: the outcome objective from your logic model.

• Research Questions: may refer to success in meeting objectives (from your logic model) or other questions stakeholders feel the evaluation should address.

• Indicators: things you can observe that will help you to answer this question.

• Where will we get the information?: Who will you speak to? How would you access existing information?

• What data collection tools will we use?: Given what is written in the columns to the left, what method or methods are most efficient and effective?

• Data Analysis: How are you going to make sense of the data you collect?

Example row:

• Objective: Increased feelings of social support among participating parents

• Research Question: Q1: Have parents’ perceptions of social support changed as a result of participation in programming?

• Indicator: % of parents who feel they have social support (e.g., scores on the ASSQ)

• Where will we get the information?: From parents

• Data collection tool: Survey (at intake, 3 months and 6 months)

• Data Analysis: Pre-post statistical tests (see the sketch below)
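If you end up running the pre-post analysis in-house (for example in Python rather than a statistics package), here is a minimal sketch of what the last column of the example row could look like. The file name, the column names, and the choice of a paired t-test with a Wilcoxon signed-rank check are all illustrative assumptions, not part of the planning chart itself.

```python
# Minimal sketch: pre-post analysis of social support scores (hypothetical data).
# Assumes a CSV with one row per parent and columns "assq_intake" and "assq_6mo".
import pandas as pd
from scipy import stats

df = pd.read_csv("parent_surveys.csv")  # hypothetical file name

# Keep only parents with both an intake and a 6-month score (complete pairs).
paired = df.dropna(subset=["assq_intake", "assq_6mo"])

# Paired t-test on the change in scores; Wilcoxon signed-rank is a non-parametric
# alternative if the score differences look non-normal or the sample is small.
t_stat, p_value = stats.ttest_rel(paired["assq_6mo"], paired["assq_intake"])
w_stat, w_p = stats.wilcoxon(paired["assq_6mo"], paired["assq_intake"])

print(f"n = {len(paired)}")
print(f"Mean change: {(paired['assq_6mo'] - paired['assq_intake']).mean():.2f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.3f}")
```

The key design point is that each parent is compared with themselves over time, which is what "pre-post" means in the chart; the same logic applies at the 3-month point.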

Page 7:

Analyzing your Indicator exercise data

Figuring Out What Your First Steps Ought to Be

Page 8:

Typical Situations

• We’re having a hard time coming up with evaluation questions
• We don’t have a lot of existing indicators: we don’t gather much data now
• We gather lots of data, but we don’t really analyze it

Page 9:

Typical Situations

• We gather data and we use it, but it isn’t telling us much that’s new

• We don’t make good use of our evaluation findings when promoting our programs

• We’ve got so many programs, ideas and needs, we don’t know where to start. Different programs are at very different stages.

Page 10:

Pulling out the specific implications (example)

The measurement possibilities that emerged out of the exercise:

• Extracting some more information out of existing data, particularly around trends over time in attendance and access, and around participant satisfaction. 

• Making contact with partner agencies that run similar programs and/or programs that refer to or from yours, in order to get their impressions of how your clients are doing and also to find out what their experiences have been with attendance rates.

• Creating some type of new tool to gather feedback from clients.  Some combination of a pre/post knowledge test, a more detailed feedback form, and/or more exploratory focus group discussions.

• More ambitious and sophisticated ideas like videotaping parent/child interactions and tracking your clients after they leave your program to see what happens with future pregnancies or housing choices.  

Page 11:

Pulling out the specific implications (example)

The next step for you seems to be an intensive and focused analysis of your rich existing data.

You may also want to think about new ways of communicating these findings to outside audiences.

In addition, you are thinking of adding a very in-depth and unstructured qualitative component to the mix through some case studies.

Page 12:

Pulling out the specific implications (example)

Possible Next Steps:

• Doing some intensive analysis of the data you have already gathered.

• Revamping your existing client feedback survey to ask more behavioural questions and perhaps switching to a pre-test/post-test format.

• Developing a tool that would allow you to seek feedback from caregivers or family members.

Page 13:

Discussion

What are the emerging first steps for you?

Page 14:

Keep in Mind

• We will talk this afternoon about estimating resource implications of these ideas.

Page 15:

The Basic Options for Action

The Situation: We’re having a hard time coming up with evaluation questions.

The Strategy: We need to clarify our theory before measurement can help us (maybe).

Page 16:

The Basic Options for Action

The Situation: We don’t have a lot of existing indicators: we don’t gather much data now.

The Strategy: We need to come up with some basic tools to get us started.

Page 17:

A quick overview of basic evaluation tools

Page 18:

Pros and cons of popular measurement techniques

(Ordered roughly from less rigour to more rigour)

• Retrospective reflections or stories

• Self-report (interview or survey)

• Peer/family/worker report

• Direct observation

• File review

• Clinical assessments

Page 19:

The “Big Three”

• Surveys

• Focus groups

• Individual Interviews

Page 20:

When to Use Focus Groups: The Advantages

• Quick, cheap, easy to assemble
• Direct interaction between researcher and respondents
• Flexible - can be used for many purposes in many settings
• Good for obtaining data from children or those who are not literate
• Provides an opportunity to involve people in data analysis through open recording
• Produces results which are easy to understand
• Often useful and enjoyable for participants

Page 21:

Limitations of Focus Groups

• Convenience sampling and small sample size limit the ability to generalize
• Responses of participants are not independent
• A few members can dominate the results
• Focus groups require a fair bit of skill to lead well
• The data which result, though rich, can sometimes be difficult to analyze because they are so unstructured

Page 22:

Tips for Writing Good Survey Questions

• Stay Focused: Know why a survey is the right tool for the job, know the questions you want the survey to answer, and know how you hope to act on the findings.

• Provide Feedback: If you let people know about the findings generated from a survey, they are more likely to participate next time.

• Pilot Test!

Page 23:

Tips for Writing Good Survey Questions

• Include a good mix of question types. Open-ended questions which provide some direction for people (such as “What are some of the things you like best about the program?”) tend to elicit more responses than completely open questions like “Please include any additional comments here.”

Page 24:

Tips for Writing Good Survey Questions

• Keep the text of the survey short, but lay it out in an easy-to-read way.

• Avoid Leading Questions:
– How much do you think we should increase our evaluation budget?

• Avoid Double Negatives:
– Would it be better not to avoid increasing our evaluation budget, yes or no?

• Avoid “War and Peace” Questions:
– Please describe your idea for a new management and accountability structure for our organization.

Page 25:

Organizing a good satisfaction survey

• Provide an introduction that explains the purpose

• Start by asking for basic descriptive information about people’s experience with the program (e.g., how long have you been coming? How did you find out about us?)

• Move to questions about changes in knowledge (what did you learn? How much do you feel you know about x?)

Page 26:

Organizing a good satisfaction survey

• Proceed to questions about behaviour & about whether they have used what they have learned

• Satisfaction and critical reflection questions work well next: (What was most useful? Would you do it again?)

• Personal information at the end

Page 27:

The Basic Options for Action

The Situation: We gather lots of data, but we don’t really analyze it.

The Strategy: We need to learn how to analyze and interpret.

Page 28:

Suggested Steps in Data Analysis

1. Organize Data

2. Review Original Questions

3. Summarize and Code

(Source: Centre for Research & Education in Human Services)

Page 29:

Steps in Data Analysis

4. Generate Themes

5. Begin Writing

6. Provide and Receive Feedback
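To make steps 3 and 4 (summarize and code, generate themes) a little more concrete, here is a minimal sketch of tallying coded themes in open-ended feedback. The responses, the codebook, and the keyword-matching shortcut are purely illustrative assumptions; most teams do this coding by hand or in qualitative analysis software, and keywords are only a rough starting point.

```python
# Minimal sketch: tallying codes/themes in open-ended feedback (hypothetical data).
from collections import Counter

responses = [
    "I felt less alone after meeting other parents",
    "The facilitator really listened to me",
    "Hard to attend because of bus schedules",
    "I made two friends I still call",
]

# Hypothetical codebook: theme -> keywords that suggest it.
codebook = {
    "social support": ["alone", "friends", "other parents"],
    "staff relationship": ["facilitator", "listened", "staff"],
    "access barriers": ["bus", "schedule", "childcare", "attend"],
}

theme_counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in codebook.items():
        if any(keyword in lowered for keyword in keywords):
            theme_counts[theme] += 1

for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} of {len(responses)} responses")
```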


Page 30:

Creative ideas for extracting the most out of data:

• Natural time series or comparison group designs

• Data from similar organizations

• Don’t forget the qualitative!

• Be selective and speak to the questions you can speak to

• The power of triangulation
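As one small illustration of the “natural time series” idea, the sketch below compares attendance rates before and after a program change using only records an agency would already have. The file name, column names, and cut-off date are assumptions for illustration.

```python
# Minimal sketch: a "natural" before/after comparison from existing attendance records.
# Assumes a CSV with columns "visit_date" (YYYY-MM-DD) and "attended" (0/1).
import pandas as pd

df = pd.read_csv("attendance_log.csv", parse_dates=["visit_date"])  # hypothetical file

change_date = pd.Timestamp("2009-01-01")  # hypothetical date the program changed
df["period"] = df["visit_date"].apply(lambda d: "after" if d >= change_date else "before")
df["quarter"] = df["visit_date"].dt.to_period("Q")

# Attendance rate by quarter, and overall before vs. after the change.
print(df.groupby(["quarter", "period"])["attended"].mean())
print(df.groupby("period")["attended"].mean())
```

Because the "before" period acts as its own baseline, even this simple comparison can say more than a single snapshot, provided nothing else obvious changed at the same time.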

Page 31:

Rapport Youth Services

A Case Study in Analysis of Existing Data

Page 32:

The Context

• Large youth counselling agency

• Long history (11 years) of using a client tracking database to keep track of who was served, what kind and how much service they received, etc.

• Also included data from a client satisfaction form

Page 33:

Rapport Youth & Family Services: Details of Program Activities

(Diagram summary) Program activities fall under three headings: Community Relations, Program Supports, and Programming.

Community Relations
• Community Connections: network with the community; market and promote services; promote partnerships

Program Supports
• Referral: refer clients to appropriate services; provide information about other services
• Intake/Assessment: assess clients’ needs; determine appropriateness of cases to Rapport’s mandate (goodness of fit)

Programming
• Counseling Therapy: develop therapeutic relationships; help youth to build their strengths; promote protective factors; address risk factors; teach skills (e.g., stress reduction, anger management, communication); support parents; develop healthy relations within social networks (rapport); assist youth with school difficulties; enable youth to make good choices
• Groups: provide a safe setting for youth to meet together; promote protective factors; address risk factors; teach skills (e.g., stress reduction, anger management, communication); provide information; support parents; develop healthy relations within social networks (rapport); enable youth to make good choices
• ECLYPSE: provide multi-service under one roof (drop-in); reach out to hard-to-reach students; introduce youth to services; identify crises; provide technical support; promote mental health, supportive relations, harm reduction (drug/alcohol use) and safe sexuality; provide employment support

Administration and Management
• Agency Management/Administration: recruiting staff; training staff; supervising staff; gauging community needs; developing programs; regulating case flow; securing resources; developing policies
• Clinical Administration: case preparation; research; session planning; case documentation

Page 34:

Rapport Youth & Family Services: Linking Activities to Short-term and Long-term Objectives

(Diagram summary) Program activities (Community Connections, Referral, Intake/Assessment, Counseling Therapy, Groups and ECLYPSE) are linked to short-term objectives/outcomes, including: build credibility; raise awareness of services; reduce service duplication; outreach to the community; connect clients to appropriate services; connect/reconnect youth to community services; understand clients’/family situations; address risk factors; promote protective factors; increased communication, stress reduction and anger management skills; increased healthy relations within families; increased pro-social relations; increased individual self-worth; increased use of healthy coping mechanisms; and increased school success among youth. These in turn feed long-term outcomes: behavioural outcomes (youth making better choices, increased healthy behaviors, decreased CAS/YJS involvement), social outcomes (increased social/emotional well-being, engaged and happy youth, youth who are successful in life), empowered youth and families, and healthier communities.

Page 35:

Our approach

• Developed a logic model

• Spent time cleaning and organizing data and moving it from various formats into one spreadsheet

• Conducted basic descriptive analyses (a minimal sketch of this kind of work follows below)

• Presented these to staff, and went back to do more in-depth analyses
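For teams doing similar work, here is a minimal sketch of the kind of cleaning, merging, and basic descriptive analysis described above. The file names, field names, and categories are illustrative assumptions, not Rapport’s actual database structure.

```python
# Minimal sketch: combining exported files and running basic descriptives (hypothetical fields).
import pandas as pd

# Hypothetical exports from a client-tracking database and a satisfaction form.
cases = pd.read_csv("client_tracking_export.csv")     # e.g., client_id, gender, presenting_issue
satisfaction = pd.read_csv("satisfaction_forms.csv")  # e.g., client_id, satisfied (yes/no)

# Light cleaning: standardize text fields before counting categories.
for col in ["gender", "presenting_issue"]:
    cases[col] = cases[col].str.strip().str.lower()

# Basic descriptive analyses of the kind presented back to staff.
print(cases["presenting_issue"].value_counts(normalize=True).round(2))
print(pd.crosstab(cases["gender"], cases["presenting_issue"], normalize="index").round(2))

# Join satisfaction data onto cases to see how many cases have any "outcome" data at all.
merged = cases.merge(satisfaction, on="client_id", how="left")
print(f"Satisfaction form completed: {merged['satisfied'].notna().mean():.0%} of cases")
```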

Page 36:

Some of the Challenges

• Who counts as a “case”? What if someone comes back to the organization for more support 4 years later? Or participates in 4 different programs at once? (One way to make a case definition explicit is sketched after this list.)

• How do you deal with changes in how the staff made use of the database over time (e.g., geography fields)?

• What to do when the satisfaction form (the only source of “outcome” data) isn’t filled out by very many people?
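The “who counts as a case” question is ultimately a definition you have to write down and then apply consistently. The sketch below applies one hypothetical rule (a return after more than 12 months starts a new case); the rule, the file name, and the column names are assumptions for illustration only.

```python
# Minimal sketch: applying one possible "case" definition to service records (hypothetical fields).
# Assumed rule: a return after more than 12 months counts as a new case.
import pandas as pd

records = pd.read_csv("service_contacts.csv", parse_dates=["contact_date"])  # client_id, contact_date
records = records.sort_values(["client_id", "contact_date"])

# Gap since the client's previous contact; a large gap (or a first contact) starts a new case.
gap = records.groupby("client_id")["contact_date"].diff()
new_case = gap.isna() | (gap > pd.Timedelta(days=365))
records["case_id"] = new_case.astype(int).groupby(records["client_id"]).cumsum()

print("Clients:", records["client_id"].nunique())
print("Cases under this rule:", records.groupby("client_id")["case_id"].max().sum())
```

Whatever rule you choose, writing it down this explicitly makes counts comparable from year to year.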

Page 37:

Findings About Presenting Issues

• About 41% of all cases presented were conduct issues

• 26% were family, peer or relational issues

• 16% were anxieties, depression or emotional issues

• Male clients were more likely to present conduct issues

• Female clients were more likely to present family, peer and relational issues, and anxieties, depression and emotional issues

• An analysis of presenting issues by geographical area revealed a similar pattern for Rapport’s three main geographical areas

Page 38:

Findings About Outcomes

• Clients who completed the follow-up survey experienced improvements in family dynamics, fighting in the home, and interactions at school.

• 99.5% of clients who completed the client satisfaction questionnaire said they were satisfied with Rapport’s services, 93% said they received the services they needed, and 95% said the services they received helped them to better deal with their problems.

• 90% of clients rated Rapport’s services as good or excellent, 93% of them indicated that they would return to Rapport if they needed help, and 95% said they would recommend Rapport if a friend needed help.

Page 39:

The Basic Options for Action

The Situation: We gather data and we use it, but it isn’t telling us much that’s new.

The Strategy:
• We need to revamp our existing tools and gather some new, complementary data.
• We need to get a bit more ambitious and probing with our questions.
• We need to dig into the research literature for findings to build on.

Page 40:

Easy ways to “beef up” methodological rigour

Strategy and Rationale:

• Strategy: Incorporate more descriptive, behavioural questions. Rationale: Less vulnerable to social desirability bias, and easier to understand.

• Strategy: Incorporate a greater variety of data types. Rationale: Allows you to “back up” and strengthen findings.

• Strategy: Narrow your focus with research questions. Rationale: Measure a few things well.

• Strategy: Measure at regular intervals. Rationale: Create your own “comparison groups”.

• Strategy: Add follow-up measures to time-of-intervention measures. Rationale: Allows for tracking of a greater range of outcomes; opens up the possibility of failure.

• Strategy: Find a comparison group. Rationale: Isolate the contribution of your program. (A small sketch of this comparison follows below.)
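As a small illustration of the last two strategies (regular measurement plus a comparison group), the sketch below contrasts the average pre-post change for program participants with that of a comparison group such as a waiting list. The data layout and column names are assumptions, and a simple difference of averages is only a rough way to isolate the program’s contribution.

```python
# Minimal sketch: comparing pre-post change for program vs. comparison group (hypothetical data).
import pandas as pd

# Assumed columns: group ("program"/"comparison"), pre_score, post_score.
df = pd.read_csv("outcomes.csv")
df["change"] = df["post_score"] - df["pre_score"]

summary = df.groupby("group")["change"].agg(["count", "mean"])
print(summary)

# A crude estimate of the program's contribution: the difference in average change.
diff = summary.loc["program", "mean"] - summary.loc["comparison", "mean"]
print(f"Program group improved {diff:.2f} points more than the comparison group on average.")
```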

Page 41:

Easy ways to make your design more practical & efficient

• Narrow your focus with research questions

• Compare the “return” on different measurement choices

• Use indicator lists to minimize the length and intrusiveness of surveys

• Stop gathering data you don’t use

• Invest in buy-in, and save time on data collection. Get input from stakeholders at each stage. Pilot test!!!!

Page 42:

Criteria for Judging the Trustworthiness of an Evaluation

• Accurate recording in the field
• Prolonged engagement - knowing the context takes time
• Peer checks - members of a research team debrief and challenge one another
• Saturation - qualitative information is rich and detailed enough to ensure that key themes have not been missed

Page 43:

Criteria for Judging the Trustworthiness of an Evaluation (continued)

• well established "audit trail" to track where you drew information from to reach conclusions

• triangulation - findings are supported by multiple methods and multiple stakeholders

• endorsement of participants - findings are fed back and validated by stakeholders

• clear program description and statement of objectives

• links between exploration and confirmation

Page 44:

The Basic Options for Action

The Situation: We don’t make good use of our evaluation findings when promoting our programs.

The Strategy: Link back to the logic model to come up with strong core messages, then “pin” the data to those messages.

Page 45:

Communicating and Using Results

Page 46:

Communication: Questions to Consider

• How much interpretation do you want to do (how much do others do)?

• Who should act after learning about the findings?

• What do you want them to do with the findings?

• How do you package information in order to facilitate this?

Page 47:

The Basic Options for Action

The Situation: We’ve got so many programs, ideas and needs, we don’t know where to start. Different programs are at very different stages.

The Strategy:
• We need to think about our evaluation priorities at an organizational level.
• We may need to pick a program to pilot evaluation strategies.

Page 48:

Discussion Continued

What are the emerging first steps for you?

Page 49:

The Evaluation Action Plan

Page 50:

Where are we going to find the time for all this?

Page 51:

Key questions for discussion

• Exactly what kinds of resources are missing? What’s going to take a lot of time or expertise?

• What kinds of supports might make it easier for you?