Karla A. Henderson, Ph.D. Professor, North Carolina State
University [email protected]
Slide 3
Framework
Evaluation Inventory (handout)
Needs Assessment Regarding Program Evaluation (Types of Needs Explained)
Small Group Needs Identification (6 people per group)
Importance (Very, Moderately, Slightly, Not)
5-10 Topics to Discuss Together
Slide 4
What is Evaluation? Evaluation = the systematic collection and analysis of data to address criteria and make judgments about the worth or improvement of something; making decisions based on identified criteria and supporting evidence. Assessment = the examination of some type of need that provides a foundation for further planning. Evaluation is sometimes like solving a mystery.
Slide 5
Types of Program Evaluation: Macro (System Evaluation); Micro (Activity or Event Evaluation)
Slide 6
Formal (Systematic) Evaluation
MACRO and MICRO approaches:
Provides rigor
Systematic gathering (procedures and methods) of evidence
Leads to decisions and action
Criteria, Evidence, Judgment
Slide 7
Assessments - Why assess? Eventually use the input to design programs and objectives; generates new program ideas; gives constituents a say; helps you be responsive. What is a need? A want? An intention? Assessments determine all three of these so YOU can figure out how to promote what you are doing.
Slide 8
Types of needs (Caution!): Normative needs - what should be available. Felt needs - what individuals believe they would like. Expressed needs - needs fulfilled through participation.
Slide 9
1. Why Do Systematic Evaluation
Slide 10
Potential Purposes: Efficiency - How are we doing? (Management/Process Focused) Effectiveness - What difference do our efforts make? (Impact/Outcome Focused)
Slide 11
What gets measured gets done. If you don't measure results, you can't tell success from failure. If you can't see success, you can't reward it. If you can't reward success, you're probably rewarding failure. (Reinventing Government, Osborne and Gaebler, 1992; University of Wisconsin-Extension, Program Development and Evaluation) Accountability Era. DO YOU AGREE?
Slide 12
If you can't see success, you can't learn from it. If you can't recognize failure, you can't correct it. If you can demonstrate results, you can win public support. (Reinventing Government, Osborne and Gaebler, 1992; University of Wisconsin-Extension, Program Development and Evaluation) Accountability Era
Slide 13
Evaluation Process: "Begin with the end in mind." Covey (1990), 7 Habits of Highly Effective People
Slide 14
2. How to Think Like an Evaluator
Slide 15
A thinking process used by evaluators (from Richard Krueger):
Reflecting - Develop a theory of action: a logical sequence that results in change. Begin with what is supposed to happen - the results.
Listening - Share the theory of action with others.
Measuring - Determine your measurement strategies - how you're going to look at the program.
Adding value to the program - What can evaluation do to contribute to the program? How can evaluation make the program better, more enjoyable, focused on results, accountable, satisfying to participants and educators, etc.?
Slide 16
Ways of Thinking: Goal Based Thinkers - "We look for goals"
Audit Thinkers -"We investigate and find out what's wrong"
Utilization Focused Thinkers - "We make evaluation useful"
Empowerment Focused Thinkers - "We empower local people"
Positivistic Thinkers - "We are scientists" Number Thinkers - "We count--and we do it well" Qualitative Thinkers - "We tell stories"
(Richard Krueger)
Slide 17
Practical tips for successful evaluation (Richard Krueger):
Involve others - Utilization, impact, and believability emerge from involving colleagues and clientele. If you want the information used, then involve others!
Ask yourself: Do I have a program? And is it worthy of evaluation?
Consider your purpose for evaluating (see earlier slide).
Consider who wants the evaluation - who requested it?
Use a variety of evaluation methods when possible.
Keep costs low by sampling strategically.
Keep interest high by adding payoff for the participant.
Start with goals, but don't be unduly limited by goals.
Consider "early evaluation."
Design the evaluation carefully. The evaluation should: enhance the program, yield information beneficial to stakeholders, and conserve resources.
Slide 18
3. Differences Between Assessment, Evaluation, and (Action)
Research
Slide 19
Evaluation = the systematic collection and analysis of data to address criteria and make judgments about the worth or improvement of something; making decisions based on identified criteria and supporting evidence. Assessment = the examination of some type of need that provides a foundation for further planning. Action Research = evaluation leading to decisions/changes.
Slide 20
4. Steps Involved in Evaluation Process
Slide 21
Steps: Problem/Idea Identified -> Problem Statement/Purpose Determined -> Instrument/Method Chosen -> Data Sources -> Data Collection -> Data Analysis -> Conclusions/Recommendations
Slide 22
Evaluation Process: "Begin with the end in mind." Covey (1990), 7 Habits of Highly Effective People
Slide 23
5. What Should be Evaluated
Slide 24
Areas to Evaluate Personnel Places Policies Programs
Participant Outcomes
Slide 25
Potential Purposes: Efficiency - How are we doing? (Management/Process Focused) Effectiveness - What difference do our efforts make? (Impact/Outcome Focused)
Slide 26
Levels of Evaluation:
END RESULTS (Impact)
PRACTICE CHANGE (Outcomes)
KASA CHANGE (Knowledge, Attitudes, Skills, and Aspirations) (Outcomes)
REACTIONS (SATISFACTION) (Outputs)
PEOPLE INVOLVEMENT (Outputs)
ACTIVITIES (Outputs)
INPUTS - RESOURCES
Slide 27
What is sampling? A population is the theoretically specified
aggregation of study elements. A sample represents or is
representative of a population
Slide 28
Types of Sampling Probability Non-Probability Theoretical
Slide 29
Probability sampling: Samples are selected in accord with probability theory, typically involving some random selection mechanism. Types: Random, Stratified Random, Systematic, Cluster. Representativeness: the quality of a sample having the same distribution of characteristics as the population from which it was selected. (See the sketch below.)
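A minimal sketch of how three of these probability designs differ in practice, using only Python's standard library; the participant roster, strata labels, and sample sizes are all hypothetical.

```python
import random

population = [f"participant_{i}" for i in range(1, 101)]  # hypothetical roster

# Simple random sample: every element has an equal chance of selection.
simple = random.sample(population, k=10)

# Systematic sample: every k-th element after a random start.
k = len(population) // 10
systematic = population[random.randrange(k)::k]

# Stratified random sample: draw within each stratum so the sample mirrors
# the population's distribution (its representativeness).
strata = {"youth": population[:40], "adult": population[40:]}
stratified = [p for group in strata.values()
              for p in random.sample(group, k=len(group) // 10)]

print(simple, systematic, stratified, sep="\n")
```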
Slide 30
Nonprobability: A technique in which samples are selected in a way that is not suggested by probability theory. Types: Purposive, Convenience, Quota, Expert, Snowball.
Slide 31
6. Who Should Conduct Evaluations
Slide 32
WHO DOES THE EVALUATIONS? Internal: You! Staff. Agency evaluation personnel. External: Consultants. University students! Regardless, YOU have to know your purpose, goals, and appropriate methods!
Slide 33
7. When Should Evaluations Be Done
Slide 34
Timing: Assessments (planning) - find out where to begin based on what you know. Formative (process) - concerned with efficiency and effectiveness. Summative (product) - overall performance.
Slide 35
Approaches to Needs Assessments Literature/Professional
Development Advisory Groups Structured Interviews (individual and
focus groups) Surveys
Slide 36
Formative Evaluation Evaluation in Process Allows Changes to be
Made Immediately Most often Focused on Inputs and Outputs
Slide 37
Summative Evaluation: At the END of something - what was accomplished? Recommendations for the Future.
Slide 38
8. What Tools are Available for Evaluations
Slide 39
Data Collection Methods MAJOR ONES: Questionnaires/Surveys
Interviews (Individual and Focus Groups) (Pros and Cons)
Slide 40
Other Methods: Systematic Observation Checklists Field
Observations Unobtrusive Measures Physical Evidence Archives Covert
Observations Visual Analyses Experimental Designs Case Studies
Slide 41
9. What Cutting Edge Strategies Exist for Evaluation
Slide 42
To consider (see below for further info): Logic Models (outcome focused), Trends Analysis, Benchmarking, PRORAGIS (NRPA)
Slide 43
10. CAPRA Standards related to Evaluation
Slide 44
CAPRA Standards 10.1 Systematic Evaluation Program There shall
be a systematic evaluation plan to assess outcomes and the
operational efficiency and effectiveness of the agency. 10.2
Demonstration Projects and Action Research There shall be at least
one experimental or demonstration project or involvement in some
aspect of research, as related to any part of parks and recreation
operations, each year.
Slide 45
CAPRA Standards 10.3 Evaluation Personnel There shall be
personnel either on staff or a consultant with expertise to direct
the technical evaluation/research process. 10.4 Employee Education
There shall be an in-service education program for professional
employees to enable them to carry out quality evaluations.
Slide 46
11. Evaluating Inputs, Outputs, Outcomes, and Impacts
12. Using Goals and Objectives as Basis for Evaluation
Slide 49
What are the goals of the program? What do we expect to happen?
What do we want participants to do, gain, learn? BEGIN WITH THE END
IN MIND!!
Slide 50
Goals and Objectives Goals Broad, long-range statements that
define the programs/services that are going to be provided
Objectives Specific statements (about the attainable parts of the
goal) that are measurable and have some dimension of time.
Slide 51
Objectives (SMART):
S - Specific: Must be clear and concrete.
M - Measurable: Must be some way to determine whether or not the desired results have been achieved.
A - Achievable: Must be attainable and reality-based!!!
R - Relevant: Must be useful; must have worth to your organization.
T - Time-limited/Time-connected: Must specify a time frame for accomplishment.
Adapted from Edginton, Hanson, & Edginton, 1980
Slide 52
13. Using Logic Models as Basis for Evaluation
Slide 53
Logic Model (Preview) - Moving Toward Success: Framework for After-School Programs, 2005. Goal; Strategies/Activities; Short-term Outcomes; Long-term Outcomes; Data Sources & Performance Measures.
Slide 54
A logic model is A HOT TOPIC in (program) evaluation: a depiction of a program showing what the program will do and what it is to accomplish; a series of if-then relationships that, if implemented as intended, lead to the desired outcomes; the core of program planning and evaluation. (University of Wisconsin-Extension, Program Development and Evaluation)
Slide 55
Simplest Form of a Logic Model: INPUTS -> OUTPUTS -> OUTCOMES
Slide 56
Everyday example - Situation: HEADACHE. INPUTS: get pills. OUTPUTS: take pills. OUTCOMES: feel better.
Slide 57
Where are you going? How will you get there? What will show that you've arrived? "If you don't know where you are going, how are you gonna know when you get there?" - Yogi Berra
Slide 58
How will activities lead to desired outcomes? A series of if-then relationships. (University of Wisconsin-Extension, Program Development and Evaluation)
Afterschool Program Example: IF we invest time and money, THEN we can provide afterschool programs to 50 children; IF we provide the programs, THEN students will have opportunities for active recreation; IF they have those opportunities, THEN they will learn and improve their recreation skills; THEN they will be more active; THEN they will become fit and develop lifelong skills. (See the sketch below.)
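To make the chain concrete, here is a minimal sketch that renders the same sequence as explicit if-then statements; the wording of each step comes from the slide above.

```python
# Each step in the chain only holds IF the previous step actually happens.
chain = [
    "we invest time and money",
    "we can provide afterschool programs to 50 children",
    "students will have opportunities for active recreation",
    "they will learn and improve their recreation skills",
    "they will be more active",
    "they will become fit and develop lifelong skills",
]
for condition, result in zip(chain, chain[1:]):
    print(f"IF {condition}, THEN {result}")
```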
Slide 59
Logical chain of connections showing what a program is to accomplish:
INPUTS - what we invest (program investments)
OUTPUTS - what we do (activities) and who we reach (participation)
OUTCOMES - what results (short-, medium-, and long-term)
Slide 60
OUTPUTS: What we do and who we reach. (University of Wisconsin-Extension, Program Development and Evaluation)
ACTIVITIES: Train, teach; deliver services; develop products and resources; network with others; build partnerships; assess; facilitate; work with the media.
PARTICIPATION: Participants, clients, customers, agencies, decision makers, policy makers. Satisfaction.
Slide 61
OUTCOMES: What results for individuals, families, communities...
SHORT (Learning): Changes in awareness, knowledge, attitudes, skills, opinion, aspirations, motivation, behavioral intent.
MEDIUM (Action): Changes in behavior, decision-making, policies, social action.
LONG-TERM (Conditions): Changes in conditions - social (well-being), health, economic, civic, environmental.
A CHAIN OF OUTCOMES
Slide 62
Limitations: A logic model doesn't address: Are we doing the right thing? (University of Wisconsin-Extension, Program Development and Evaluation)
Slide 63
14. Using Benchmarking as a Basis for Evaluation
Slide 64
Benchmarking: Agencies or services are compared to other agencies or services to determine a relative value. Set goals against a standard of the best. Benchmarks are the statistics; benchmarking is the process. Information examples: budgets, salaries, fees charged, number of participants, number of staff. Decide whom you want to compare against, design a questionnaire, analyze, and write a report. (See the sketch below.)
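A minimal sketch of that process in Python: gather the same statistics (the benchmarks) from peer agencies, then compare your agency's figures to the peer median. Agency names and all figures are hypothetical.

```python
from statistics import median

peers = {  # hypothetical benchmark data gathered from comparable agencies
    "Agency A": {"budget": 2_400_000, "staff": 30, "participants": 12_000},
    "Agency B": {"budget": 1_800_000, "staff": 22, "participants": 9_500},
    "Agency C": {"budget": 3_100_000, "staff": 41, "participants": 15_200},
}
ours = {"budget": 2_000_000, "staff": 25, "participants": 8_000}

# Compare each of our statistics against the peer median.
for stat in ours:
    benchmark = median(p[stat] for p in peers.values())
    print(f"{stat}: ours = {ours[stat]:,} | peer median = {benchmark:,.0f}")
```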
Slide 65
15. Using NRPA's PRORAGIS Tool for Evaluation
Slide 66
16. Doing Cost/Benefit and Economic Impacts
Slide 67
Economic Impact Analyses: Determining the dollar value a program or service has on the economy - the net change of money in a community from the spending of visitors. Only visitors are part of the analysis. (See the sketch below.)
Slide 68
Cost-Benefit Analyses Focus on Monetary Forms Sometimes
Difficult to Do Social Benefits may outweigh Monetary Benefits
Slide 69
17. Using Trends Analysis for Evaluation
Slide 70
Trends analysis: Examining the future by looking at the past. Measuring data over time. The secret is good record keeping. (See the sketch below.)
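A minimal sketch of a simple trends analysis, assuming the records kept are annual participation counts (hypothetical numbers); a least-squares slope summarizes the direction and size of the trend.

```python
# Hypothetical records: annual program participation counts.
years = [2008, 2009, 2010, 2011, 2012]
participants = [410, 455, 480, 530, 560]

# Least-squares slope: average change in participants per year.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(participants) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, participants))
         / sum((x - mean_x) ** 2 for x in years))
print(f"Trend: about {slope:.1f} more participants per year")
```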
Slide 71
18. Designing Valid, Reliable, and Useable Questionnaires
Examples - Open-ended: What did you like best about day camp?
Close-ended, fixed alternatives: What did you like best about day camp this summer? Counselors / Arts / Snacks / Swimming / Sports
Slide 74
Another Example - Close-ended, Likert: What did you think about each of these parts of day camp? (1 = Poor, 2 = Fair, 3 = Good, 4 = Great)
Counselors: 1 2 3 4
Arts & Crafts: 1 2 3 4
Snacks: 1 2 3 4
Swimming: 1 2 3 4
(See the tabulation sketch below.)
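A minimal sketch of tabulating responses to the 4-point item above (1 = Poor ... 4 = Great); the camper ratings are hypothetical.

```python
from statistics import mean

responses = {  # hypothetical camper ratings on the 1-4 scale
    "Counselors":    [4, 3, 4, 4, 2],
    "Arts & Crafts": [3, 3, 2, 4, 3],
    "Snacks":        [2, 2, 3, 1, 2],
    "Swimming":      [4, 4, 4, 3, 4],
}

# Mean score per item, with the number of respondents.
for item, scores in responses.items():
    print(f"{item}: mean = {mean(scores):.2f} (n = {len(scores)})")
```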
Slide 75
Wording Advice:
One idea per question
Clear, brief, and simple
Avoid leading questions
Avoid estimates if possible
Use words familiar to respondents
Avoid fancy words and jargon
Be clear about meanings of words
Avoid negative questions
Do a pilot study
State alternatives precisely - mutually exclusive responses
Use stages if needed, but give GOOD directions
Slide 76
Format and Layout - Design for Good Response:
Give clear directions
Start with something easy & familiar (NOT demographics, though)
Have white space & an easy-to-read font
Colored paper if easy to read; font size appropriate for the audience
Have easy-to-follow directions for staged questions
Have a professional look
Front page should include title, date or time or year, perhaps a graphic
Keep the length manageable for the info desired
Anchor all numbered responses - e.g., 5 = strongly agree, 4 = agree, etc.
NO TYPOGRAPHICAL ERRORS
Slide 77
PILOT TESTING Is the survey valid? How much time does it take
to complete? Do the respondents feel comfortable answering the
questions? Is the wording of the survey clear? Are the answer
choices compatible with the respondents' experience in the
matter?
Slide 78
19. Using Individual Interviews
Slide 79
Types of Interviewing Personal (in-depth) Telephone Part of
Field Observation Focus Groups (Group Interviews)
Asking Open-Ended Questions: NEVER allow a person to just say yes or no - the purpose is to get people to talk. Start with noncontroversial questions. Demographics at the end; don't ask what you already know (e.g., gender). Avoid WHY. Use probes and follow-ups (e.g., "Tell me more about..."). Use listening skills. Maintain control of the interview. Share yourself with the interviewee.
Slide 82
Structuring Interviews Purpose of interviews Individual or
group If group- number of persons involved Structured or
unstructured? Audio-taping/Note-taking Facilitator/Recorder Roles
Rapport-building Timing
Slide 83
Things to remember Pay attention to the content of your
questions (keep your focus) Give thought to setting up the
interview Pay attention to how you plan to record the data Take
notes-on-notes Conduct interviewer training Go over possible
problem areas
Slide 84
Training for Interviewers Discussion of general guidelines and
procedures. Specify how to handle difficult or confusing
situations. Conduct demonstration interviews. Conduct real
interviews.
Slide 85
Guidelines for Structured Survey Interviewing Dress in a
similar manner to the people who will be interviewed. Study and
become familiar with the questions. Follow question wording exactly
(if it is quantitative). Record responses exactly. Probe for
responses when necessary.
Slide 86
Telephone Surveys Advantages: Money and time. Control over data
collection. Disadvantages: Surveys that are really ad campaigns.
Answering machines.
Slide 87
20. Conducting Focus Groups
Slide 88
Focus Group A group of people are brought together in a room to
engage in guided discussion of a topic.
Slide 89
Pros and cons. Pros: Socially oriented; Flexible; High face validity; Speedy; Low in cost? Cons: Researcher has less control; Data may be difficult to analyze; Need a good moderator; Must do more than one group - there can be great differences between groups; Groups can be difficult to assemble; Must have a conducive environment.
Slide 90
What about group interviews? Focus groups encourage ideas through interaction with other individuals. Pay attention to the same issues as in a personal interview. Have a clear purpose for the group meeting (criteria). Have clear, structured, open-ended questions.
Slide 91
Points for Conducting a Focus Group Be a good moderator Set
ground rules The beginning of the focus group process is crucial to
its success Keep the process focused and flowing Probe where needed
Be sure to thank participants
Slide 92
Other Focus Group Tips: Usually 5-10 individuals. Key is to get people to talk to each other. Usually two moderators (one to direct discussion and another to take care of technical aspects). Use incentives to get people to participate (maybe). Pilot test, just like ALL surveys. USE OPEN-ENDED questions. Be familiar with group process - getting people to talk, not letting some take over. Ground rules: speaking one at a time, confidentiality, breaks taken, positive and negative.
Slide 93
Summary: Focus groups encourage ideas through interaction with other individuals. Pay attention to the same issues as in a personal interview. Have a clear purpose for the group meeting (criteria). Have clear, structured, open-ended questions.
Slide 94
21. Using Observational Tools
Slide 95
Types of Data Qualitative Quantitative
Slide 96
Qualitative Observations Takes a great deal of field time Need
in-depth notes on your observations Use of key informants is often
helpful Criteria often change and become re-defined
Slide 97
Tips on (Field) Observations: Only unscientific if inappropriate techniques are used. Highly reliable and valid IF systematically done. Time-consuming. Anecdotal records or critical incidents describe the FACTS of situations. Field notes are collected - MANY, MANY, MANY. The researcher is the data collection instrument. SEPARATE facts from interpretations.
Slide 98
Notetaking: Record as soon as possible. Take notes in detail - running descriptions, conversations, incidents. Make copies of notes ASAP. Indicate whose language it is - yours, quotes, paraphrases. Rule of thumb: several (5-6) single-spaced typed pages for an hour of observation. Can use a tape recorder and transcription. Record ALSO what you do not understand. MONITOR yourself as a data collector. Length of time spent depends on the research question and theoretical saturation. IF it is not written down, it never happened!
Slide 99
Dimensions of Approaches:
Full participant observer -> Outside observer
Overt observer -> Covert observer
Single observation -> Long-term multiple observations
Narrow focus -> Broad focus
Frequency Sampling Checklist
Client: _________ Activity: _____________ Time: _____ to ________
Behavior - Frequency of Behavior:
Initiates a conversation: //
Shifts attention to new project: /
Asked question by peer: ////
Responds appropriately when asked question by peer: ///
(See the tally sketch below.)
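A minimal sketch of scoring such a checklist, assuming each tally mark is logged as one observed event; the event log is hypothetical.

```python
from collections import Counter

events = [  # hypothetical observation log for one client, one activity
    "Initiates a conversation", "Asked question by peer",
    "Initiates a conversation", "Responds appropriately when asked",
    "Shifts attention to new project", "Asked question by peer",
]

# Tally each behavior, most frequent first.
for behavior, count in Counter(events).most_common():
    print(f"{behavior}: {count}")
```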
Slide 102
Rating Scales
Behavior: Excellent / Average / Poor*
How well does the client respond to questions?
How appropriately does the client interact with peers?
*Excellent = top third; Average = middle third; Poor = lowest third
Slide 103
22. Using On-line Tools (SurveyMonkey, etc.)
Slide 104
Question Structures - SurveyMonkey
Question 1: Have you used a swimming pool in the last year? Yes / No
Question 2: How many times in the last year have you gone to a public swimming pool? Never / Once / Twice / More than 3 times
Question 3: How many times have you gone to a public swimming pool this past year? (Please write in an estimated number) __________
Slide 105
Slide 106
On-Line Considerations - Pros: Same strengths as a paper version. Better at addressing sensitive issues. Cost efficient. Faster delivery. Design options. Dynamic. Ability to track. Quick response time. Easier to use for skip logic (see the sketch below).
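A minimal sketch of the skip logic mentioned in the pros list, reusing the swimming-pool screening question from the earlier example; this flow is an illustration, not SurveyMonkey's actual implementation.

```python
# A respondent who answers "no" to the screener skips the follow-up.
def run_survey() -> None:
    used_pool = input("Have you used a swimming pool in the last year? (yes/no) ")
    if used_pool.strip().lower() != "yes":
        return  # skip logic: branch past the follow-up question
    input("How many times in the last year have you gone to a public pool? ")

run_survey()
```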
Slide 107
On-Line Considerations - Cons: Spam/privacy concerns. Technical issues. Multiple submissions. No interviewer present to clarify questions or issues.
Slide 108
New Technologies and Survey Research CAPI - computer assisted
personal interviewing. CASI - computer assisted self interviewing.
CSAQ - computerized self-administered questionnaires. TDE -
touchtone data entry. VR - voice recognition.
Slide 109
23. Fun Strategies for Doing Evaluations
Slide 110
Importance-Performance Evaluation: How important is something? How well did the organization perform it? Matrix of: Keep Up the Good Work / Possible Overkill / Needs Work / Low Priority. (See the sketch below.)
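A minimal sketch of placing program elements into the four quadrants above, assuming importance and performance are each rated on a 1-5 scale with 3 as the midpoint (both assumptions); the ratings are hypothetical.

```python
# Classify an element into one of the four importance-performance quadrants.
def quadrant(importance: float, performance: float, midpoint: float = 3.0) -> str:
    if importance >= midpoint:
        return "Keep up the good work" if performance >= midpoint else "Needs Work"
    return "Possible Overkill" if performance >= midpoint else "Low Priority"

ratings = {  # hypothetical (importance, performance) means on a 1-5 scale
    "Counselors": (4.6, 4.2),
    "Snacks": (2.1, 4.0),
    "Parking": (4.4, 2.3),
}
for element, (imp, perf) in ratings.items():
    print(f"{element}: {quadrant(imp, perf)}")
```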
Slide 111
Other Ideas: Use a grading scale (A, B, C, D, F). Use a pie chart and distribute money across preferences. Use happy faces. Computer-assisted approaches. Others.
Slide 112
Measuring Physical Activity SOPLAY or SOPARC Pedometers or
Accelerometers
Slide 113
24. Selecting Evaluation Instruments
Slide 114
Selecting an Evaluation/Measurement Tool Is it reliable and
valid? Does it measure what you want? Appropriate for the
participants? Reasonable to administer and in your price range?
Directions clear, concise, unambiguous? Easy to analyze? Is it the
best way to measure the objectives? Is the activity reaction form
put together by Cary, NC, the one you should use?
Slide 115
25. Contracting with Outside Consultants
Slide 116
Outside: You have to know what you want done You have to be
able to evaluate results and make decisions You probably need some
amount of financial resources Could save you a lot of time if you
do it right
Slide 117
26. Developing Your Own Evaluation Tool
Slide 118
Steps: Define problem/level of evaluation Determine contents
(including criteria) and broad questions Identify and categorize
respondents Develop items, structure format Write directions Ensure
response
Slide 119
Purpose of Data Collection What do you want to know? Who has
the information? What is the best approach (based on purpose, time,
money, resources, your expertise) to use? How will you use the
results? Are you interested in outputs and/or outcomes?
Slide 120
Kinds of Information Sought Behavior info Knowledge info
Attitudes/beliefs/values info Demographic info **pay attention to
relationship of respondent to the question (their past, present,
future)
Slide 121
27. Ethical Considerations related to Evaluation
Slide 122
Issues Ethical Political Legal Moral
Slide 123
Evaluator must be Competent Knowledge about area to be
researched Knows about evaluation design Knows wide range of
methods Knows how to analyze and interpret data and relate them
back to conclusions/recommendations
Slide 124
Developing Competencies (cont.): Knows how to use the results. Understands how to handle political, legal, and ethical concerns encountered. Must have certain personal qualities: trustworthy, strives for improvement, responsive to sensitive issues.
Slide 125
Doing the Right Thing Political Issues (science is touched by
politics but goes on anyway; social change is always political;
values matter) Supports/refutes views and values Personal contacts
Value-laden definitions Controversial findings Pressures to produce
certain findings. Know the organization's position, don't go beyond the data in conclusions, and have a clear purpose for the research.
Slide 126
Doing the Right Thing - Legal Issues: Not many legal concerns except around illegal behaviors. Moral Issues: Unintentional errors caused by bias or oversight. Cultural & procedural biases. Letting bias, prejudice, or friendships influence outcomes. Dealing with negative findings. Taking too long to get the results out. Be prepared to recognize the possibility of statistical errors and know how to explain them.
Slide 127
Doing the Right Thing Voluntary Participation No Harm Anonymity
and Confidentiality Issues of Deception Analysis and Reporting
Slide 128
Ethics Principles: Be open with people. Don't promise more than you can do. Protect the rights and privacy of your subjects. Guard against coercion. Get written consent & Board approval. Guard against harm. Let participants know the results.
Slide 129
Anonymity: The researcher cannot identify a given response with a given respondent. Confidentiality: The researcher can identify a given person's responses but promises not to do so publicly. (See the sketch below.)
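A minimal sketch contrasting the two terms: under confidentiality a private key linking respondents to responses exists but is never published, while under anonymity no such link is recorded at all. Names and answers are hypothetical.

```python
# Hypothetical raw responses: (name, answer) pairs.
responses = [("Jamie", "agree"), ("Pat", "disagree")]

# Confidential: a private key links IDs to names; it is never published.
key = {f"R{i}": name for i, (name, _) in enumerate(responses, 1)}
confidential = [(f"R{i}", answer) for i, (_, answer) in enumerate(responses, 1)]

# Anonymous: identities are never recorded, so no one can re-identify anyone.
anonymous = [answer for _, answer in responses]

print("confidential:", confidential)
print("anonymous:", anonymous)
```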
Purposeful Planning for Evaluation Schedule time on your yearly
calendar Involve planning Board, committees, colleagues, staff,
volunteers, etc. Identify what you hope to achieve (desired
outcomes) Identify your goals and objectives
Slide 133
Purposeful Planning for Evaluation Identify the methods you
will use to measure whether or not your program met objectives
Identify how often evaluation will be undertaken Identify WHO will
do the evaluation and how they will get the appropriate
resources
Slide 134
Formal (Systematic) Evaluation - Review
MACRO and MICRO approaches:
Provides rigor
Systematic gathering (procedures and methods) of evidence
Leads to decisions and action
Criteria, Evidence, Judgment
Slide 135
29. How to Use Information from Evaluation Projects
Slide 136
Using Data - KEY ADMONISHMENTS:
If you collect the data, you need to analyze the data!
Record the results of your data collection.
Compare your findings to your original goals and objectives.
Share with appropriate people.
Incorporate into your planning cycle.