

Outcome Evaluation Basics

An Introduction

Prepared for OLA Super Conference 2015

By Cindy Poon, Manager of Public Services

Cindy Kimber, Coordinator of Branch Services

Ajax Public Library

Agenda

• Definition of Outcome Evaluation
• Outcomes vs Outputs
• Difference between Assessment and Evaluation
• Benefits of Outcome Evaluation
• Examples of APL’s Outcome Evaluation
• Lessons we’ve learned

Definition of Outcome Evaluation

Outcomes
• Benefits or changes
• Influenced by a program’s outputs

United Way of America defines outcomes as the “benefits or changes for individuals or populations during or after participating in program activities. They are influenced by a program’s outputs. Outcomes may relate to behaviour, skills, knowledge, attitudes, values, conditions, or other attributes. They are what participants know, think, or can do; or how they behave; or what their condition is, that is different following the program.”

Evaluation
• Determine whether a program has achieved the desired result:
• Was it successful?
• What impact did it have?

Outcomes

• Define the expected results or outcomes in advance

• Include outcomes when developing a program
• Measurable & predictable
• Change (or improvement)
• Skills, knowledge, attitudes, behaviours, status, life condition
• Outcome: effectiveness of results = impact
• Positive/negative findings or unintended consequences

Outputs

• “The direct products of program activities and usually are measured in terms of the volume of work accomplish… have little inherent value… important because they are intended to lead to a desired benefit for participants.” United Way of America

• Outputs = numbers
• Books circulated
• Programs presented
• Reference questions answered
• Participants who attended the storytime programs
• Flyers distributed

Evaluation not Assessment

• Assessment
• Judgment / decision
• Learning outcome
• “An act of judging or deciding the amount, value, quality or importance of something” (Cambridge Dictionary online)

• Evaluation
• Broader concept
• Program outcomes
• Program inputs (resources and activities)

Benefits of Outcome Evaluation

• “From the user in the life of the library to the library in the life of the user,” a phrase cited by Rhea Joyce Rubin of the California State Library.

• Why do we do evaluation?
• Decision making: effectiveness
• Length of a session, format, date/time, etc.
• Endorsement
• Use the data (impact) for funding proposals
• Tell a story: stakeholders
• Advocacy tool: evidence that a library’s program makes a significant difference; enhances public image
• Share the impact

Develop an Outcome Evaluation Plan

• Choose or create a program
• Identify goals with predictable outcomes
• Limit the objectives
• Prepare a clear statement

• Create outcomes
• What you hope to achieve
• Impact on the participants
• Set indicators

• Design questionnaires
• Measure the inputs, activities, outcomes and outputs
• Was the program a success?
• What impact did it have?
• Test the idea with ‘if-then’.

Selected Program: Story Stretchers

Talking Singing Reading Writing Playing

Goals and Objectives

[Logic model diagram: Staff Data → Inputs → Activities → Outputs → Outcomes]

We use surveys to gather information

Inputs: resources required for success

• Human resources• Who is doing the program?

• Fiscal Resources• Do we have sufficient resources?

• Facilities & equipment• Program room, projector, room set-up

• Knowledge base for the program• Training, knowledge, skill

• Involvement of collaborators• Volunteers, community partner

Activities: different actions to ensure success

• Planning
• Clear understanding of goals, sufficient planning time

• Promotion
• Marketing plan, communication strategy

• Spin-off activities

• Promoting other programs

Outputs: Stats

• Stats gathering
• Number of participants
• Circulation of display material
• Customer satisfaction: a rating of 4 or more out of 5 is our benchmark for success
• Compare to benchmarks
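The benchmark check above can be automated with a short script. This is a minimal sketch, not APL's actual process: the `meets_benchmark` helper and the sample ratings are invented for illustration, but the threshold (an average of 4 or more out of 5) is the one named on this slide.

```python
# Hypothetical sketch: checking customer-satisfaction ratings against a
# benchmark of an average rating of 4+ out of 5. Helper name and sample
# data are invented for illustration.

def meets_benchmark(ratings, threshold=4.0):
    """Return (average, passed) for a list of 1-5 satisfaction ratings."""
    average = sum(ratings) / len(ratings)
    return round(average, 2), average >= threshold

# Made-up survey ratings for one program session
session_ratings = [5, 4, 4, 3, 5, 4, 5]
avg, passed = meets_benchmark(session_ratings)
print(f"Average rating: {avg} - {'meets' if passed else 'below'} benchmark")
# → Average rating: 4.29 - meets benchmark
```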

Outcomes: changes in participants or behaviours over length of program

• Changes in participants
• Increased attention span, increased participation, knowledge of rhymes

• Changes in library use
• Coming more often, checking out more books, selecting different types of materials

• Changes in parent/child interaction
• Asking the child to predict what comes next, defining new words, relating the story to the child’s real-life experiences

Questionnaires

Dear Parents:

The library is conducting an evaluation of our Story Stretchers program. We would appreciate your completing the following questionnaire. We will have a second questionnaire at the end of the session.

1. Have you attended the Ajax Library’s Story Stretchers storytime program in the past?

   yes   no

2. What are your expectations when coming to the Story Stretchers program? (tick all that apply)

   Develop my child’s love of reading
   Enjoy quality time with my child
   Help my child develop literacy skills
   Help my child get ready for school
   Provide an opportunity for socialization for my child
   Meet other parents and develop friendships
   Other _______________________________________________________________

3. Please rate our Story Stretchers storytime program based on your first impressions, where 1 is needs improvement and 5 is excellent.

                                        Needs Improvement        Excellent
   Storytime Room                            1    2    3    4    5
   (clean, suitable, welcoming)
   Storytime Leader                          1    2    3    4    5
   (trained, enthusiastic, welcoming)
   Books/Materials Displayed                 1    2    3    4    5
   (inviting, age appropriate)

Comments:

Family email (if you wish to receive email notifications from the library): __________________________

Questionnaires

• Surveyed parents may not be the same at the beginning and end of the session (Drop-in Program)

• Number of parents completing the survey dropped from 24 to 14
• Staff survey had too much detail: staff did not track the number of books on display/taken out
• Staff retirement meant lost information; new staff were not in a position to comment on changes in children
• Multicultural participants, but the survey was written and in English only

Data Collection

1. Has your child’s love of reading increased?

□ yes □ no □ stayed the same

2. Please rate our storytime, where 1 is needs improvement and 5 is excellent.

Needs improvement Excellent

Stories 1 2 3 4 5

(variety, age appropriate)

Questions 1 and 2 are easy to collate and translate into a report: “92% of participants reported…”

3. What do you like best about storytime?

Harder to collate, but can provide vital information (e.g., stories)
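Collating the closed questions into report-ready percentages can be sketched in a few lines. This is a hypothetical illustration, not the library's actual tooling: the `collate` helper and the sample responses are invented, chosen so the output matches the "92% of participants reported…" style of figure quoted above.

```python
# Hypothetical sketch of collating question 1 ("Has your child's love of
# reading increased?") into report percentages. Sample data is invented.
from collections import Counter

def collate(responses):
    """Return each answer's share as a whole-number percentage."""
    counts = Counter(responses)
    total = len(responses)
    return {answer: round(100 * n / total) for answer, n in counts.items()}

# 13 surveys returned: 12 "yes", 1 "stayed the same" (made-up data)
responses = ["yes"] * 12 + ["stayed the same"]
shares = collate(responses)
print(f"{shares['yes']}% of participants reported an increased love of reading")
# → 92% of participants reported an increased love of reading
```

Open-ended answers (question 3) resist this kind of tally, which is why the slide notes they are harder to collate even though they often carry the richest information.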

What We Learned

• Validated that the program was working well
• 70% of parents indicated they wanted to prepare their children for Kindergarten
• Led to development of the Ready, Set, Kindergarten program

• Learned to focus our future outcome evaluations
• Clearly define the information we are looking for
• Narrow down the data collected
• Think in terms of outcomes
• Need to share knowledge

Planning to Achieve Outcomes

TD Summer Reading Club

Set goals and objectives more thoughtfully – think about desired outcomes

We want reading skills to be maintained or improved over the summer: minutes read at the child’s reading level are a better measure than books read below reading level

Outcomes, Outcomes Everywhere

• Battle of the Books
• Comments from students, provided by a Whitby TL:
• “I felt included.”
• “I made new friends.”
• “I wasn’t the only freak who loves to read.”
• “It allowed me to move on from my old school.”

• These were unexpected outcomes

And the Gold

“I wasn’t a reader before Battle of the Books”

Share comments with the funder

Use outcomes when speaking to community members about the program

Strengthen relationship with our partner

Additional Resource

Summary

• Choose a program to evaluate
• Determine the outcomes
• Use the logic model
• Inputs
• Activities
• Outputs
• Outcomes
• Analyze the data
• Communicate the results with everyone: stakeholders, funders, etc.
• Improve, change, expand, or scrap

Lessons we’ve learned

• Need a beginning and an end
• Not the right program to evaluate: did not produce a great impact
• No clear goal defined
• Data is not measurable
• No clear statement

Words of caution

• John Carver: “a crude measure of the right thing beats an elegant measure of the wrong thing.“

• There may be a lack of experience in identifying and measuring outcomes
• Staff cost to analyze the data
• Lack of clear goals and objectives
• Test with a small program

• Need to be honest with ourselves, no matter what the outcomes are
• Accept the data with open arms and make changes according to the results

Questions / Comments

Thank You!

Cindy Poon: [email protected]

Cindy Kimber: [email protected]