
Evaluation:

Asking the Right Questions &

Using the Answers

Presented by Annemarie Charlesworth, MA

UCSF National Center of Excellence in Women’s Health
November 3, 2006

Part 1 - Evaluation Overview

Part 2 - Steps to Program Planning and Evaluation

Part 3 - The Logic Model: A Tool for Planning and Evaluation

Part 1 - Evaluation Overview

What is Evaluation?

• The process of collecting information about your program in order to make decisions about it.

• Complements program management by improving and accounting for program effectiveness.

How is Evaluation Helpful?

• Gain insight

• Change practice

• Assess effects

• Affect participants

Gain Insight

• Assess needs, desires, and assets of community members.

• Identify barriers and facilitators to service use.

• Learn how to describe and measure program activities and effects.

Change Practice

• Refine plans for introducing a new service.

• Characterize the extent to which plans were implemented.

• Improve the content of educational materials.

• Enhance the program's cultural competence.

Change Practice (cont.)

• Verify that participants' rights are protected.

• Set priorities for staff training.

• Make midcourse adjustments for improvement.

• Improve the clarity of health communication messages.

• Mobilize community support for the program.

Assess Effects

• Assess skills development by program participants.

• Compare changes in provider behavior over time.

• Compare costs with benefits.

• Find out which participants do well in the program.

• Decide where to allocate new resources.

Assess Effects (cont.)

• Document the level of success in accomplishing objectives.

• Demonstrate that accountability requirements are fulfilled.

• Aggregate information from several evaluations to estimate outcome effects for similar kinds of programs.

• Gather success stories.

Affect Participants

• Reinforce program/intervention messages.

• Stimulate dialogue/raise awareness regarding health issues.

• Broaden consensus among coalition members regarding program goals.

• Teach evaluation skills to staff and other stakeholders.

• Support organizational change and development.

Types of Program Evaluation

• Goals-based evaluation (identifying whether you’re meeting your overall objectives)

• Process-based evaluation (identifying your program’s strengths and weaknesses)

• Outcomes-based evaluation (identifying benefits to participants/clients)

Type of evaluation depends on what you want to learn…

Start with:

1) What you need to decide (why are you doing this evaluation?);

2) What you need to know to make the decision;

3) How best to gather and understand that information!

Key questions to consider when designing program evaluation:

1. For what purposes is the evaluation being done, i.e., what do you want to be able to decide as a result of the evaluation?

2. Who are the audiences for the information from the evaluation (e.g., funders, board, management, staff, clients, etc.)?

3. What kinds of information are needed to make the decision you need to make and/or enlighten your intended audiences?

Key questions (cont.)

4. From what sources should the information be collected (e.g., employees, customers, clients, etc.)?

5. How can that information be collected in a reasonable fashion (e.g., questionnaires, interviews, examining documentation, etc.)?

6. When is the information needed (so, by when must it be collected)?

7. What resources are available to collect the information?

Evaluation should be considered during program planning and

implementation…

Not just at the end!

It is not enough to have a goal…

Goals exist because some action is needed.

However, you can’t argue for an action without a deep understanding of the problem.

Problem → Need → Action → Goal

Part 2 - Steps to Program Planning and Evaluation

10 Steps to Planning a Program (and its evaluation!)

1. Needs and assets
– Extent, magnitude, and scope of the problem
– Summary of what’s already being done
– Gaps between needs and existing services
– Community support

2. Goals and objectives
– Long-term, specific to target population
– Link short-term objectives to goals

3. Defining the intervention/treatment
– Program components to accomplish objectives and goals
– One or two activities should support each objective

10 Steps to Planning a Program (and its evaluation!)

4. Developing the program/logic model

5. Choose the type(s) of data collection (i.e., surveys, interviews, etc.)

6. Select your evaluation design (i.e., one group pre/posttest vs. comparison pre/posttest)

10 Steps to Planning a Program (and its evaluation!)

7. Pilot test tools

8. Collect data

9. Analyze data (see the brief analysis sketch after this list)

10. Report, share, and act on the findings
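
As a purely illustrative companion to steps 6, 8, and 9, the sketch below shows how the two designs named in step 6 might be analyzed in Python with SciPy. The scores are invented and the analysis is deliberately minimal; it is a sketch of one reasonable approach, not a method prescribed by the presentation.

# Minimal, illustrative analysis for the two designs named in step 6.
# All scores below are invented for illustration only.
from scipy import stats

# One-group pre/posttest: the same participants measured before and after the program.
pre  = [52, 60, 45, 70, 58, 63]
post = [61, 66, 50, 75, 60, 70]
paired = stats.ttest_rel(post, pre)          # paired t-test on before/after scores
change = [b - a for a, b in zip(pre, post)]  # per-participant change scores
print(f"One-group design: mean change = {sum(change) / len(change):.1f}, p = {paired.pvalue:.3f}")

# Comparison-group pre/posttest: compare change scores against a group that did not
# receive the program (comparison-group change scores are also invented).
comparison_change = [2, -1, 4, 0, 3, 1]
unpaired = stats.ttest_ind(change, comparison_change)
print(f"Comparison design: difference in change scores, p = {unpaired.pvalue:.3f}")

In practice the right test depends on your measures and sample size; the point here is only that the design chosen in step 6 determines what gets collected in step 8 and compared in step 9.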

Part 3 - The Logic Model: A Tool for Planning and Evaluation

• Picture of how your organization does its work

• Communicates its “rationale”

• Explains hypotheses and assumptions about why the program will work

• Links outcomes with activities

Logic models help you chart the course ahead …

Allow you to better understand

• Challenges

• Resources available

• Timetable

• Big picture as well as smaller parts

Basic Logic Model

1. Resources/Inputs → 2. Activities → 3. Outputs → 4. Outcomes → 5. Impact

Planned Work: Resources/Inputs and Activities
Intended Results: Outputs, Outcomes, and Impact

*From W.K. Kellogg Foundation Logic Model Development Guide

Basic Logic Model

Resources: In order to accomplish our set of activities we will need the following:

Activities: In order to address our problem or asset we will conduct the following activities:

Outputs: We expect that once completed or under way these activities will produce the following evidence:

Short and Long-term Outcomes: We expect that if completed or ongoing these activities will lead to the following changes in 1-3 then 4-6 years:

Impact: We expect that if completed these activities will lead to the following changes in 7-10 years:

Example Logic Model for a free clinic to meet the needs of the growing numbers of uninsured residents (Mytown, USA)

Resources:
• IRS 501(c)(3) status
• Diverse, dedicated board of directors representing potential partners
• Endorsement from Memorial Hospital, Mytown Medical Society, and United Way
• Donated clinic facility
• Job descriptions for board and staff
• First year’s funding ($150,000)
• Clinic equipment
• Board & staff orientation process
• Clinic budget

Activities:
• Launch/complete search for executive director
• Board & staff conduct Anywhere Free Clinic site visit
• Board & staff conduct planning retreat
• Design and implement funding strategy
• Design and implement volunteer recruitment and training
• Secure facility for clinic
• Create an evaluation plan
• Design and implement PR campaign

Outputs:
• # of patients referred from ER to the clinic/year
• # of qualified patients enrolled in the clinic/year
• # of patient visits/year
• # of medical volunteers serving/year
• # of patient fliers distributed
• # of calls/month seeking info about clinic
• Memorandum of Agreement for free clinic space

Short and Long-term Outcomes:
• Change in patient attitude about need for medical home
• Change in # of scheduled annual physicals/follow-ups
• Increased # of ER/physician referrals
• Decreased volume of unreimbursed emergencies treated in Memorial ER
• Patient co-payments supply 20% of clinic operating costs

Impact:
• 25% reduction in # of uninsured ER visits/year
• 300 medical volunteers serving regularly each year
• Clinic is a United Way Agency
• Clinic endowment established
• 90% patient satisfaction for 5 years
• 900 patients served/year

Produced by The W. K. Kellogg Foundation
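
If it helps to keep the logic model next to the data it generates, its five columns can be stored as a small structured record. The Python sketch below is not part of the Kellogg example or the presentation: the LogicModel class and the completeness check are hypothetical conveniences, populated with a few values copied from the Mytown example above.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Five Kellogg-style columns for one program (illustrative only)."""
    resources: list = field(default_factory=list)   # what the program needs
    activities: list = field(default_factory=list)  # what the program will do
    outputs: list = field(default_factory=list)     # direct evidence the activities happened
    outcomes: list = field(default_factory=list)    # expected changes in 1-6 years
    impact: list = field(default_factory=list)      # expected changes in 7-10 years

mytown_clinic = LogicModel(
    resources=["IRS 501(c)(3) status", "Donated clinic facility", "First year's funding ($150,000)"],
    activities=["Launch/complete search for executive director", "Create an evaluation plan"],
    outputs=["# of patient visits/year", "# of medical volunteers serving/year"],
    outcomes=["Increased # of ER/physician referrals"],
    impact=["25% reduction in # of uninsured ER visits/year", "900 patients served/year"],
)

# Simple completeness check before a planning retreat: every column should have entries.
for column_name, entries in vars(mytown_clinic).items():
    if not entries:
        print(f"Logic model column '{column_name}' is still empty")

Keeping the model in a structured form like this also makes it easy to version the “working draft” as the program changes over its life.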

S.M.A.R.T.

• Outcomes and Impacts should be:

–Specific

–Measurable

–Action-oriented

–Realistic

–Timed

One size does not fit all!

• Many different types of logic models

• Experiment with models that suit your program and help you think through your objectives

Useful for all parties involved (Funder, Board, Administration, Staff, Participating organizations, Evaluators, etc.)

• Convey the purpose of the program
• Show why it’s important
• Show what will result
• Illustrate the actions that will lead to the desired results

– Basis for determining whether actions will lead to results!

• Serve as a common language

Enhance the case for investment in your program!

Strengthen Community Involvement

• Created in partnership, logic models give all parties a clear roadmap

• Helps to build community capacity and strengthen community voice

• Helps all parties stay on course or intentionally decide to go off-course

• Visual nature communicates well with diverse audiences

Logic Models
Used throughout the life of your program

• Planning
• Program Implementation
• Program Evaluation

May change throughout the life of the program!

– Fluid; a “working draft”
– Responsive to lessons learned along the way
– Reflect ongoing evaluation of the program

The Role of the Logic Model in Program Design/Planning

• Helps develop strategy and create structure/organization

• Helps explain and illustrate concepts for key stakeholders

• Facilitates self-evaluation based on shared understanding

• Requires examination of best-practices research

The Role of the Logic Model in Program Implementation

• Backbone of management plan

• Helps identify and monitor necessary data

• Helps improve the program

• Forces you to achieve and document results

• Helps to prioritize critical aspects of program for tracking

The Role of the Logic Model in Program Evaluation

• Provides information about progress toward goals

• Teaches about the program

• Facilitates advocacy for program approach

• Helps with strategic marketing efforts

References

• W.K. Kellogg Foundation, Logic Model Development Guide. http://www.wkkf.org/pubs/tools/evaluation/pub3669.pdf

• Schmitz, C. & Parsons, B.A. (1999). “Everything you wanted to know about Logic Models but were afraid to ask.” http://www.insites.org/documents/logmod.pdf

• University of Wisconsin Cooperative Extension. http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html

• CDC Evaluation Working Group. http://www.cdc.gov/eval/logic%20model%20bibliography.PDF

• CDC/MMWR, Framework for Program Evaluation in Public Health. http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm

• McNamara, C. (last revision: Feb 16, 1998). “Basic Guide to Program Evaluation.” http://www.managementhelp.org/evaluatn/fnl_eval.htm