
1

Measuring Program Outcomes

A Training Toolkit for Programs of Faith
Outcome-Based Evaluation

Training and Evaluation Services
© Performance Results, Inc., Gaithersburg, Maryland
www.performance-results.net

Organizations of faith and ministries are being challenged to objectively measure the results of their programs. It has never been more important for these organizations to clearly identify how their programs make a difference in the lives of people they serve. In order to do that, there has to be a clear understanding of the needs of people to be served, what outcomes the programs are designed to produce, and a clear identification of the activities that will produce program outcomes. Outcome-based evaluation is one way programs can clearly identify who they serve, what results are desired, and how these results become true benefits for people served.

2

Measuring Program Outcomes
INTRODUCTION – Definition

Outcome-Based Evaluation

Definition: Outcome-based evaluation is a systematic way to assess the extent to which a program has achieved its intended results.

Outcome-based evaluation is systematic. It is a great place to begin when planning a project, and a practical way to identify what the results of the program will be. This presentation will take the viewer through all the elements of creating a program that is “outcomes-ready.” If you follow through and replicate the examples within this presentation, you will have an evaluation plan that accurately describes your program and how you will measure its results. It will be “proposal-ready.”

3

Measuring Program Outcomes
INTRODUCTION – OBE Focus

Outcome-Based Evaluation

OBE focuses on the key questions:
"How has my program made a difference?"
"How are the lives of the program participants better as a result of my program?"

Whether developing a new program or creating a plan for evaluating an existing program, the first consideration should be how the program will help the people it serves. Having a clear understanding of how a program will make a real difference in the lives of program participants will make the evaluation process easier with a focus on measuring the program’s true outcomes for people served.

4

Measuring Program Outcomes
INTRODUCTION – Training Overview

Define outcome-based evaluation and discuss its elements
Discuss how programs get started
Identify outcomes and ways to measure them
Identify ways to report results

This training program will help the user understand the key concepts and terms used in outcomes evaluation, and clarify the purpose of the program: why it is offered and what it is attempting to achieve.

5

Measuring Program Outcomes
INTRODUCTION – OBE Achieves

What can OBE achieve?
Increase program effectiveness
Communicate program value
Provide a logical framework for program development
Generate information for decision-making

Why is outcome-based evaluation important, and why should you plan to evaluate your program in this way? Outcome-based evaluation is one of many evaluation methods that can identify whether a program has been successful. It focuses on the results of services that were intended from the outset of the program. This is different from other forms of evaluation that look at a program only after it is over and attempt to assess "what happened."

Programs that establish clear outcomes from the beginning are more likely to construct a way to have those results achieved. Programs that don’t establish outcomes from the beginning are left saying “what happened?”

6

Measuring Program Outcomes
INTRODUCTION – OBE Limitations

What are its limitations?
OBE is not formal research
OBE suggests cause and effect; it doesn't prove it
OBE shows contribution, not attribution

Outcome-based evaluation is not formal research. However, the user of this evaluation method can make the evaluation as formal as desired through the use of reliable and valid data instruments to collect outcomes information. Outcome-based evaluation should be considered a management tool comparable to setting budgets and holding the organization to a formal accounting system to see whether budgetary goals are being met. The strength of OBE as an evaluation process is that it has been established by the organization for the organization and as such, there is a great deal of ownership over the outcomes being measured and how the results will be used.

7

Measuring Program Outcomes
INTRODUCTION – Program Defined

What is a program?
Activities and services leading towards intended outcomes
Generally has a definite beginning and end
Designed to change attitudes, behaviors, or knowledge, or increase skills and abilities, based on assumed need

This document defines a program as a series of activities and services leading toward a defined and predictable end. If the program keeps track of what services were prescribed to specific participants and what outcomes the participants achieved, it is possible to estimate what services contributed most to the program’s success. In addition, examining this same information can identify the services that had little or no effect on the outcomes achieved by program participants. This is very good information for the organization’s management and its staff to know as organizations continuously strive to streamline services to get a more direct impact for participants.
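To make the idea above concrete, here is a minimal sketch (in Python, not part of the original toolkit) of how a program might tally outcome achievement by service received; the participant records are invented for illustration.

from collections import defaultdict

# Hypothetical participant records: which services each person received and
# whether they achieved the program's outcome.
records = [
    {"services": ["mentoring", "counseling"], "achieved_outcome": True},
    {"services": ["mentoring"], "achieved_outcome": True},
    {"services": ["counseling"], "achieved_outcome": False},
]

tally = defaultdict(lambda: {"achieved": 0, "total": 0})
for person in records:
    for service in person["services"]:
        tally[service]["total"] += 1
        if person["achieved_outcome"]:
            tally[service]["achieved"] += 1

for service, counts in tally.items():
    print(f'{service}: {counts["achieved"]} of {counts["total"]} participants achieved the outcome')

A tally like this only suggests which services contributed most; as noted earlier, OBE shows contribution, not attribution.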

8

Measuring Program Outcomes
STEP 1 – BUILD – Assumptions

Programs are developed as a result of assumptions about people's needs.

Assumptions can be drawn from:
Your experiences
A program partner's experiences
Formal or informal research

Thinking back to the point when the program was first being developed, what was the good idea that staff had about the program? What was the identified need? Programs generally get started with an understanding that a group of individuals had a certain need that the organization is capable of addressing.

The first step in the outcomes evaluation model is to sit back and assess what you know about the needs of people in your program. Was a formal needs assessment conducted? What evidence documented the need, and was it credible?

9

Measuring Program Outcomes
STEP 1 – BUILD – Assumptions

Assumption: The need that exists for a group of individuals based on their common characteristics.

Solution: A program that will change or improve behaviors, knowledge, skills, attitudes, life condition, or status.

Desired results: The change or improvement you expect to achieve.

Programs are developed as a result of careful research showing that certain people have needs that are not being met in the community. What are those needs, and what are the characteristics of the people who have them?

Then consider the solution to those needs. What are the services you offer in your program that would help to meet them?

Then consider success: if you are truly successful, what will the results be? Do those results take care of the need?

10

Measuring Program Outcomes
STEP 1 – BUILD – Assumptions

Following a needs assessment…
Assumption: Teens in Springfield lack appropriate role models in their lives to assist them with setting constructive life goals and keeping them on a path to achieve them.

Let’s consider a hypothetical example, to help us understand how the OBE process works. Let’s assume that our organization has identified a need through a formal study we conducted in our town: namely, that teens in Springfield lack strong role models to help them succeed in life. To address that need, our organization plans to launch a mentoring program.

Notice that the assumption was made following a formal investigation into the identification of those needs, in this case a formal needs assessment.

11

Measuring Program Outcomes
STEP 1 – BUILD – Assumptions

Solution: With the help of the Springfield public schools, we will identify at-risk students, match them with mentors, assist them in setting life goals, and support their efforts to achieve these goals.

The potential solution to this problem is our organization's proposed program. Our staff has developed a set of activities and services that we believe will address the teens' needs. We will identify at-risk students and match them with mentors who can help them set positive life goals and support them in their efforts to achieve those goals.

12

Measuring Program Outcomes
STEP 1 – BUILD – Assumptions

Desired results: Students will complete school, not engage in risky behavior, and increase their commitment to further education, vocational training, or employment.

Now we need to specify the desired results of our program. Results are specified as benefits for the teens themselves. These results are shaped into statements, such as "students will not engage in risky behavior," that will later be measured as outcomes.

13

Managing Program Outcomes
Activity 1 – Assumptions

Assumption (the need that exists)
Solution (your program)
Results (your program's outcomes)

Now let's move away for a moment from our hypothetical example to a real-life example you may face in your own work. You can use this page to record your responses to the assumptions model. Write down a need you have identified, the program that your organization is conducting to meet that need, and the results you hope to achieve. Then, ask yourself some common-sense questions:

•Is there a logical link between the needs you have identified among those to be served and the services (program activities) you will provide?

•Is it logical that, if the services were provided well, the desired results would occur?

14

Measuring Program Outcomes
STEP 1 – BUILD – Influencers

Influencers are individuals, agencies, funding sources, competition, community groups, and professional affiliations. They influence:
the type and nature of services
who is served
desired outcomes
how results are communicated

The next step in the evaluation planning process is to identify the program’s influencers. Influencers are stakeholders, those individuals or entities that care about the results of your program. This subject is addressed at this time so that any data which an influencer might need is built into the evaluation process from the beginning.

15

Measuring Program Outcomes
STEP 1 – BUILD – Influencers

Examples:
Target audience
Organization mission
Administration and board
Community
Media
Funders
Program partners

Who are your program's influencers? If a program is accredited or certified by a professional group, that group would also be an influencer. In almost all cases funders are influencers, because they fund a program to do certain things and to achieve certain results.

16

Measuring Program Outcomes
STEP 1 – BUILD – Influencers

INFLUENCER / WHAT THEY WANT TO KNOW / HOW THE RESULTS WILL BE USED

Your organization / Is the program meeting the target audience's needs? / To improve the program; to end the program; to start another program

Funders / Who does the program serve? Is the program effective? / Fund the program; increase funding; promote replication

Project partners / Is there equal responsibility? Which services produce outcomes? / Change the process; add partners; change responsibilities

When you have made a list of these entities, think about what they want to know about your program. Then try to think of how they would use that information. See the chart above for examples.

17

Managing Program Outcomes
Activity 2 – Influencers

Influencer / What do they want to know? / How will they use this information?

Now you can complete your assessment of your influencers. (You could complete this form based on the hypothetical Springfield teen mentoring program example mentioned earlier, or based on your own real-life situation.) Complete this form and identify the questions your influencers are likely to ask of your program. Be sure to make note of these questions so that all the data elements are captured in your evaluation system.

18

Measuring Program Outcomes
STEP 1 – BUILD – Program Purpose

Program purpose is driven by assumptions about need. It relates to the organization's mission statement. It defines audience, services, and outcomes. In summary:

We do what, for whom, for what outcome or benefit?

Now it is time to develop our "program purpose statement." Unlike organizational mission or vision statements, which are intentionally written to be broad and inclusive of the range of activities the organization conducts, a program purpose statement is a clear statement of process, people, and results. In short, the program purpose statement indicates what we do, for whom, and for what outcome or benefit.

Clearly written statements of the purpose for a program help people understand why the program is offered and its relationship to the organization’s mission. Accomplishment of the program’s purpose enables organizations to say they have been successful in fulfilling their mission.

19

Measuring Program Outcomes
STEP 1 – BUILD – Program Purpose

The "True Teens" program is a thirty-week program jointly offered by Faith in Our Communities, a faith-based, church-sponsored organization, and the public school system. The program offers classes in goal setting, mentoring by business professionals and church members, vocational training, and counseling for at-risk youth in Springfield.

Let’s return to our hypothetical example of the teen program in Springfield, which we will call the “True Teens” initiative. Although this is not a real program, there are some programs that do similar things. This slide offers background information describing the program. As you can see, the True Teens program is a collaborative effort of a faith-based organization and the public school system in Springfield. The purpose statement for the True Teens program follows on the next slide.

20

Measuring Program Outcomes
STEP 1 – BUILD – Program Purpose

To provide outreach, mentoring, vocational training, and support services to inner-city neighborhood youth in Springfield in order to:
Increase self-discipline
Prevent school drop out
Improve academic performance

Notice how concise this program purpose statement is. It starts with what the program will provide that will address the needs identified in the assumptions model (“outreach, mentoring, vocational training, and support services”). Then it defines the program’s process (delivering those services) and the intended population (inner-city youth in Springfield). Finally, it specifies the results or outcomes the program will achieve (increased self-discipline, prevention of school drop out, improved academic performance).

Since programs are designed to change, improve, or increase knowledge, skills, attitudes, or behaviors of a specific audience or population, it is not surprising that the central focus of the program’s purpose statement should address these issues.

21

Managing Program Outcomes
Activity 3 – Program Purpose

Instructions: Consider the purpose of your program. Respond by writing answers to the following questions.

We do what? The services and activities of your program.
For whom? The targeted audience(s) for your program.
For what outcome(s) or benefit(s)? The results of your program in terms of changed, improved, or demonstrated skills, behaviors, knowledge, and attitudes.

Now you can try this exercise for your own organization. Think about a program that you are operating currently, or one that you desire to start. Complete a program purpose statement for the program using the above format.

22

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model

An outcomes logic model (evaluation plan) is a clear, graphic representation of the links between program activities, the results these activities produce, and how the results will be measured.

Elements: Inputs | Activities & Services | Outputs | Outcomes | Indicators | Data Sources | Applied to | Data Intervals | Goals

Okay. Now that you have carefully defined the program, it is time to create the evaluation plan. The plan will have all your program elements described, including your program’s outcomes and how you will measure them. One helpful and commonsense evaluation plan is the logic model.

23

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model

Program Purpose: Inputs | Activities and Services | Outputs

Outcome: Intended impact
Indicators: Observable and measurable behaviors or conditions
Data Source: Sources of information about the conditions being measured
Applied to: The population to be measured
Data Intervals: When data is collected
Goals: The amount of impact desired

To construct the evaluation plan (logic model), you will need to think through your program's structure. The elements of a program's structure include program inputs, activities and services, and outputs. Each of these elements makes some contribution to the benefits or outcomes the program delivers to participants. It is for this reason that we look at each element separately and discuss what it might include and how it might affect the program's outcomes.

You can use the form provided below to identify the inputs, activities and services, and outputs of your own program. (Each of these concepts is further defined in the pages ahead.)
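For readers who keep program records electronically, here is an optional sketch (not part of the original toolkit) of the same program-model categories captured as a simple data structure; the example entries are drawn from the True Teens illustration used throughout this toolkit.

# Program model categories from Activity 4, filled in with True Teens examples.
program_model = {
    "inputs": ["mentors (22)", "books and materials (60 sets)", "program facilities", "project staff", "speakers"],
    "activities": ["outreach", "choosing participants", "mentor matching", "partner recruitment and management"],
    "services": ["vocational training", "mentoring", "counseling", "paid internships"],
    "outputs": ["workshops given", "participants served", "hours of mentoring"],
    "outcomes": ["increase self-discipline", "prevent school drop out", "improve academic performance"],
}

for category, items in program_model.items():
    print(f"{category}: {', '.join(items)}")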

24

Activity 4 – Your Program Model

Instructions: Consider your program. What are the elements of your program? Record your answers under the appropriate categories below.

Inputs | Activities | Services | Outputs | Outcomes

25

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Inputs

Inputs: Resources dedicated to or consumed by the program
Web site
Facilities
Consultants
Computers
Money
Curricula
Materials
Staff

Program inputs are the resources devoted to a program. They include materials, tools, facilities, money, staff, and other resources that make the program operate.

Think about beginning a program from day one. What are all the materials you will need to bring to the program to make it work? It is important to know what it takes to operate a program because that tells us how much it costs to deliver the program to each participant, and how much it costs to achieve the desired program outcomes.
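As a hedged illustration of the cost point above (the dollar figure is hypothetical; the participant counts come from the True Teens example later in this toolkit):

total_input_cost = 90_000            # hypothetical annual value of all program inputs
participants_served = 45             # output: all students served (True Teens example)
participants_achieving_outcome = 34  # goal for students improving their GPA (True Teens example)

print(f"Cost per participant served: ${total_input_cost / participants_served:,.0f}")
print(f"Cost per participant achieving the outcome: ${total_input_cost / participants_achieving_outcome:,.0f}")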

26

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Inputs

Inputs (True Teens example):
Mentors (22)
Books and materials (60 sets)
Program facilities
Project staff
Speakers

Here are examples of inputs for the True Teens program. (Note that while "project staff" is listed on this slide, when you complete your own form it may be more useful to list the exact number of full-time staff devoted to your project.)

27

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Activities and Services

Activities are management-related, while services directly involve end-users.

Program Activities:
Recruit participants
Coordinate materials and staff
Promote program

Program Services:
Workshops
Projects
Mentoring

Activities and services are the tasks that are done to make a program work. They are the tasks that must happen both behind the scenes and those that occur directly with program participants that help them achieve the program’s outcomes.

There is a subtle difference between these tasks. Activities are tasks that are administrative or managerial in nature. These are very important tasks and they consume money, effort, and resources. It is for this reason activities must be accounted for when evaluating outcomes. Services are those tasks that occur directly with program participants, such as workshops or mentoring.

28

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Activities and Services

Program Services:
Vocational training
Mentoring
Counseling
Paid internships
Job try-outs
Student events

Program Activities:
Public service announcements
Outreach
Choosing participants
Mentor matching
Partner recruitment and management

Activities and services make up the program and account for all the functions and tasks associated with getting the work done and serving the participants.

29

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Outputs

Outputs: A direct program product, typically measured in numbers
Web-site hits
Materials used
Consultants' hours
Materials developed
Supplies consumed
Participants completed
Workshops given
Participants served

The next part of the logic model is called outputs. These are various counts of such things as resources consumed, activities conducted, and services delivered. Organizations are very accustomed to reporting outputs. In fact, organizations have routinely been asked by funders and other stakeholders to report in these terms. For the True Teens program, outputs include the number of workshops given, the number of participants served, and the supplies consumed in the course of the program.

30

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Outputs

54 students completed 5 classes in goal setting
212 hours of counseling
312 hours of mentoring activity

Here in the True Teens example, one output was the 312 hours of mentoring provided to the students. Outputs often demonstrate volume, telling the reader the quantity of what was delivered or consumed. But outputs say very little about the quality (impact) of what was delivered.

31

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Outcomes

Outcomes: A target audience's changed or improved skills, attitudes, knowledge, behaviors, status, or life condition brought about by experiencing a program.

The next part of the logic model is outcomes. Keep in mind that outcomes are different from outputs. Outcomes, as shown in the slide above, are the program participants’ changed or improved skills, attitudes, behaviors, status, or life condition brought about by experiencing the program. This is a very clear definition of outcomes. Programs are designed to make these changes and improvements.

Outcomes can also be framed sequentially; that is, in terms of what might happen first, and what might happen later or last. Some outcomes might be accomplished during the program, others shortly after the program finishes. These outcomes are called immediate outcomes. When the program is over, but changes are not expected to occur until a period of time after the program has ended, the outcomes are called intermediate outcomes. Long-term outcomes are often accomplished by a program, but these are not measured until a significant period of time after the program has ended.

The next page gives you a form to use to specify what the outcomes of your program are.

32

Activity 5 – Evaluation Logic Model

Instructions: Consider your program. What are the elements of your program? Record your answers under the appropriate categories below.

Outcome | Indicator | Data Source | Applied to | Data Interval | Goal (rows #1, #2, #3)

33

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Outcomes

Outcomes:
Outcome #1: Increase self-discipline
Outcome #2: Prevent school drop out
Outcome #3: Improve academic performance

Let’s look again at the True Teens example. In it, the outcomes for the program came directly from the program purpose statement. All of the outcomes here are improvements in behaviors and skills/knowledge. They are not measurable yet, but they make a statement concerning what the expected performance of the program is.

34

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Indicators

Indicators:
State the measurable conditions or behaviors that show an outcome was achieved
They are what you hope to see or know
Are observable evidence of accomplishments, changes, or gains

In order to make outcome statements measurable, indicators must be identified. Indicators are written to define, in measurable terms, how the outcome is to be assessed.

In other words, indicators are the conditions or behaviors to be measured and they state what you anticipate seeing when a program participant demonstrates observable evidence of accomplishments, changes, gains, improvements or behaviors (outcomes).

35

Measuring Program Outcomes

Outcome #1: Increase self-discipline
Indicators:
The # and % of students who have an 80% or greater school attendance rate by year end, and
The # and % who complete 80% of school assignments, and
The # and % who do not drink alcohol, use illegal drugs, or engage in sexual behavior

The slide above gives examples of the indicators that could be used to measure desired outcome #1, increasing self-discipline. Note that an outcome can sometimes have more than one indicator that helps to define what the outcome means; for the True Teens program, there are several indicator statements for measuring the program's first outcome. When more than one statement (indicator) describes what the behavior, skill, or gain will be, the indicator statements need to be joined by "and" or "or." It is important to know whether all of the indicators have to be met by an individual for the outcome to be achieved, or whether just one of the conditions mentioned in the indicators needs to be met.

36

Measuring Program Outcomes

Outcome #2: Prevent school drop out
Indicator:
The # and % of students who do not drop out of school during the school year

Here is the second indicator for the True Teens program. This one is used to measure the second desired outcome of the program, namely, the prevention of school drop out among the participating teens.

Note that the indicator statement begins with the “number and percent of the target population who” demonstrate a skill, behavior, or show a gain. Writing indicator statements in this way assures that the statements are measurable. Do not attempt to fill in what the actual number or percent would be at this point in the development phase; use these symbols as “placeholders” for when you actually know what these numbers will be. Setting targets for change is discussed later in this toolkit.

37

Measuring Program Outcomes

Outcome #3: Improve academic performance
Indicator:
The # and % of students who improve their grade point average by 1 point or better

Indicators not only identify the observable behavior, but also show volume or quantity. If the point of an outcome is to demonstrate a knowledge or behavior or to show that a specific skill has been learned, it is also necessary to identify within the indicator statement the extent to which the participants must demonstrate the behavior. Consider the example above of the indicator for outcome #3 of the True Teens program. Here, students will be judged to have improved their academic performance if they increase their grade point average by 1 point or more.

In summary, indicator statements need to include two measurable pieces of information. First, they need to show who will be counted, and second, they need to identify how much (or to what extent) the behavior, skill, or knowledge needs to be demonstrated before the participant can be counted as having achieved the outcome.
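As an optional sketch of how such an indicator could be computed (the GPA changes below are invented for illustration; real values would come from teacher records):

# "# and % of students who improve their GPA by 1 point or better"
gpa_change = [1.2, 0.4, 1.0, 2.1, 0.0, 1.5, 0.8]   # illustrative per-student GPA changes

number_met = sum(1 for change in gpa_change if change >= 1.0)
percent_met = 100 * number_met / len(gpa_change)
print(f"{number_met} of {len(gpa_change)} students ({percent_met:.0f}%) improved their GPA by 1 point or more")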

38

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Data Sources

Data sources are tools, documents, and locations for information that will show what happened to your target audience.
Pre- and post-test scores
Program records
Assessment reports
Records from other organizations
Observations

The next part of the evaluation plan involves data sources. Data sources are tools, documents, and sources of information that will be used to show what happened to your participants. As the slide above indicates, there are several different kinds of data sources. Test scores, for example, are sources of information commonly used to show an increase in knowledge or skills. Surveys are often used to assess changes in attitudes and behaviors. Sometimes it is important for a program to collect a variety of information, and the range of information will be used to demonstrate a skill increase or knowledge gain. Qualitative information is important, too, in order to gain a broader understanding of changes or improvements. Examples of these include portfolio assessments, open-ended survey questionnaires, focus groups, and other vehicles where participants can be given an opportunity to describe their feelings and experiences.

39

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Data Sources

Quantifiable self-reports:
Survey questions with a quantifiable attitude rating scale
Self-report of frequency of behavior
Self-report of perceived level of competence

Anecdotal self-reports:
Interviews
Open-ended surveys

Program directors need to consider whether it is important to measure a change in behavior, skill, knowledge, attitude, or life condition as a result of the program. If it is important to know that an increase has happened or a change has occurred as a result of the program, then pre and post program information needs to be collected.

Keep in mind, though, that it is often sufficient for a program simply to show that participants demonstrate a certain skill, knowledge, or behavior as they complete the program, as an indication that the program is being successful. This is different from measuring an increase in a skill, that is, a change from a "baseline" time period (pre-test) to a later time period (post-test). Simply setting a benchmark for expected results for all participants is often an acceptable way to evaluate a program.
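Here is a small sketch contrasting the two approaches just described, measuring an increase from pre-test to post-test versus checking a benchmark at program end; the scores and the benchmark of 65 are invented for illustration.

pre_scores  = [52, 61, 48, 70]   # illustrative baseline (pre-test) scores
post_scores = [68, 72, 60, 74]   # illustrative end-of-program (post-test) scores
benchmark = 65                   # illustrative expected result for all participants

increased = sum(1 for pre, post in zip(pre_scores, post_scores) if post > pre)
met_benchmark = sum(1 for post in post_scores if post >= benchmark)

print(f"{increased} of {len(pre_scores)} participants increased their score (pre/post comparison)")
print(f"{met_benchmark} of {len(post_scores)} participants met the benchmark of {benchmark}")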

40

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Data Sources

Government, school, business, or organization records:
Academic, attendance, and other school records
Employment records
Surveys and focus group results

Professional assessments:
Teacher checklist of student behavior
Portfolio assessment

Data sources used to confirm that a change in or establishment of a skill, knowledge, or behavior has occurred can range from objective sources (e.g., a test) to subjective sources (a self-reported answer in a survey), and from a more quantitative nature (i.e., the percentage of students achieving a grade point average increase) to more qualitative (observations from a teacher about a student’s behavior in school). There are benefits and drawbacks to each of these data sources. So, let’s think about how to choose the data sources right for your purposes.

41

Measuring Program Outcomes

Outcome #1: Increase self-discipline

Indicator: The # and % of students who have an 80% or greater attendance rate
Data source: School attendance records

Indicator: The # and % who complete 80% or more of school assignments
Data source: Staff reports

Indicator: The # and % who do not drink alcohol, use illegal drugs, or engage in sexual behavior
Data source: Police and teacher reports; student self-reports

When considering what data sources to use, be mindful of when data is available and how easy it is to obtain it. In the slide above, data sources for the True Teens program evaluation include a mix of school records, staff reports, and student self-reports.

Sometimes programs need to develop specific or customized data instruments (usually surveys) to measure attitude changes or achievement of knowledge or skills that are harder to see and test. In these cases, surveys should be piloted on a group similar to those for whom the instrument will be used. This will help staff to understand how it might work when it is being used and whether the instrument is clear and the results are measurable.

42

Measuring Program Outcomes

Outcome #2: Prevent school drop out
Indicator: The # and % of students who do not drop out of school during the school year
Data source: School records

When a program sets out to measure a knowledge or skills change, it is best to rely on instruments that have been developed for that purpose. Scholastic, academic, and other tests that measure knowledge have already been tested and validated for most groups. Where these tests exist, and could be used, program directors are encouraged to do so.

43

Measuring Program Outcomes

Outcome #3: Improve academic performance
Indicator: The # and % of students who improve their GPA by one point or better
Data source: Teacher records

In thinking through data sources, be aware of issues of accessibility and confidentiality. For example, for the True Teens program, we have decided to obtain teacher records to learn the number and percentage of students who have improved their grades, so that we can measure how well the program is achieving its third outcome, improved academic performance. The data source for this indicator relies on teachers to provide the information. Think about how busy their schedules are and find out when you can get the information. Concerning confidentiality: when necessary, you can get individual program or school records from another institution without breaching confidentiality if the data comes to you coded. To put it another way, you do not get individual names associated with these records, but rather individual records with a code to represent each individual. Schools may also provide you with the information you need in the aggregate. If you are looking for an average grade increase, then getting the aggregate for the group rather than individual records will do.

44

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Applied to

Applied to: The target audience to whom the indicator applies.
Decide if you will measure all participants, completers of the program, or another subgroup.
Special characteristics of the target audience can further clarify the group to be measured.

For each indicator, it is necessary to specify whether the information will be collected on all participants, a specific subsection of people served in the program, or only those who complete the program. In other words, you need to know to whom the indicator is being applied. This determination may affect the results of the program. Individuals who complete a program are more likely to experience the impact of that program than are those who do not complete the program. In addition, achievements of specific sub-groups of an audience may be important to measure. For example, perhaps you may be interested in the performance of younger children versus that of older youth in the program.

45

Measuring Program Outcomes

Outcome #1: Increase self-discipline

Indicator: The # and % of students who have an 80% or greater attendance rate
Data source: School attendance records
Applied to: All students (N = 45)

Indicator: The # and % who complete 80% or more of school assignments
Data source: Staff reports
Applied to: All students (N = 45)

Indicator: The # and % who do not drink alcohol, use illegal drugs, or engage in sexual behavior
Data source: Police and teacher reports; student self-reports
Applied to: Students with past records (N = 12)

In some circumstances, the "applied to" of the indicator is common sense. For example, if you have a program whose primary outcomes are to help participants get and keep a job, the first thing program staff would want to know is how many participants, out of all who were served, got a job. The indicator in this situation "applies to" all participants. The next logical outcome for this program is to help participants keep a job. This indicator would only apply to those who got a job in the first place, and would not apply to everyone in the program. Let's say that the final outcome of the program was to be sure that those who stayed on the job for more than 90 days got employment benefits from their employers. The "applied to" in this case is those participants who got jobs AND stayed on the job for more than 90 days.
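A sketch of the narrowing "applied to" groups from the employment example above; the participant records are invented for illustration.

participants = [
    {"name": "A", "got_job": True,  "days_on_job": 120, "has_benefits": True},
    {"name": "B", "got_job": True,  "days_on_job": 45,  "has_benefits": False},
    {"name": "C", "got_job": False, "days_on_job": 0,   "has_benefits": False},
]

applied_to_1 = participants                                        # outcome 1: all participants served
applied_to_2 = [p for p in participants if p["got_job"]]           # outcome 2: only those who got a job
applied_to_3 = [p for p in applied_to_2 if p["days_on_job"] > 90]  # outcome 3: on the job more than 90 days

print(len(applied_to_1), len(applied_to_2), len(applied_to_3))     # 3 2 1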

46

Measuring Program OutcomesMeasuring Program Outcomes

All students identified as at-risk to school drop outN = 30

School records

The # and % of students who do not drop out of school during the school year

Prevent school drop out

Applied toData SourceIndicator:Outcome # 2

www.performance-results.net

© Performance Results, Inc.

Let’s refer back to our True Teens example. As the slide above shows, in this case the “applied to” ONLY applies to those students who were identified to be at-risk for dropping out of school. The “N” represents the number of those who were identified as being at-risk.

47

Measuring Program Outcomes

Outcome #3: Improve academic performance
Indicator: The # and % of students who improve their GPA by one point or better
Data source: Teacher records
Applied to: All students (N = 45)

This indicator, by contrast, applies to all students in the program. It is the goal of the program to help all the involved students improve their GPA.

48

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Data Intervals

Data intervals: The points in time when data are collected.
Outcome information can be collected at specific intervals, for example, every 6 months.
Data can also be collected at the end of an activity or phase and at follow-up.
Data are usually collected at program start and end for comparison when increases in skill, behavior, or knowledge are expected.

Once you have thought through your data sources, you need to consider when data will be collected. Outcome information for each indicator may be collected at specific intervals, such as every 6 months, or at fixed points, such as the beginning and end of a program or at a point in the future called "follow-up." The timing of the data collection needs to be specified, and it may depend on when the information or data is available. If the program's service is delivered for only a short period of time, then data collection may only occur at the end of a particular activity. In programs where increases in skill, behavior, or knowledge are expected, the data is usually collected at program start and/or completion.

49

Measuring Program Outcomes

Outcome #1: Increase self-discipline

Indicator: The # and % of students who have an 80% or greater attendance rate
Data source: School attendance records
Applied to: All students (N = 45)
Data interval: At the end of each school term

Indicator: The # and % who complete 80% or more of school assignments
Data source: Staff reports
Applied to: All students (N = 45)
Data interval: At the end of each school term

Indicator: The # and % who do not drink alcohol, use illegal drugs, or engage in sexual behavior
Data source: Police and teacher reports; student self-reports
Applied to: Students with past records (N = 12)
Data interval: Monthly

While some data is always available, the “data interval” means the time program staff will capture the information and look at it. Like taking a snapshot with a camera, the reading of the data will show you whether the program is being successful.

50

Measuring Program Outcomes

Outcome #2: Prevent school drop out
Indicator: The # and % of students who do not drop out of school during the school year
Data source: School records
Applied to: All students identified as at-risk of school drop out (N = 30)
Data interval: At the end of each school semester

Keep in mind that waiting to capture the data after the program is over is typically not a good practice. While more data might be available, it might be too late to start program improvements if the program is off target in achieving its outcomes.

51

Measuring Program Outcomes

Outcome #3: Improve academic performance
Indicator: The # and % of students who improve their GPA by one point or better
Data source: Teacher records
Applied to: All students (N = 45)
Data interval: At the end of each school semester

At times, however, there are intervals when the information is available and we have to take advantage of that opportunity.

52

Measuring Program Outcomes
STEP 2 – EVALUATE – Logic Model: Goals

Goals (targets) are the stated expectations for the performance of outcomes.
Stated in terms of a number and/or percent
Meet influencers' expectations
May be estimated from the program's past performance

Goals, sometimes also known as targets, are the stated expectations for performance of outcomes. Generally, they are stated as a percentage and/or a number. Goals are set for each indicator of each outcome.

53

Measuring Program Outcomes

Outcome #1: Increase self-discipline

Indicator: The # and % of students who have an 80% or greater attendance rate
Data source: School attendance records
Applied to: All students (N = 45)
Data interval: At the end of each school term
Goal: 18 students (40%)

Indicator: The # and % who complete 80% or more of school assignments
Data source: Staff reports
Applied to: All students (N = 45)
Data interval: At the end of each school term
Goal: 34 students (75%)

Indicator: The # and % who do not drink alcohol, use illegal drugs, or engage in sexual behavior
Data source: Police and teacher reports; student self-reports
Applied to: Students with past records (N = 12)
Data interval: Monthly
Goal: 12 students (100%)

The goals for the True Teens program are listed above. For example, program staff hope that 75 percent of participating teens will complete nearly all of their homework assignments.

Knowing how to set a goal or target is often difficult. It may be best to rely on information you may have concerning how this program has worked in the past or you may decide to use target information based on what your funders and other “stakeholders” expect of your program. In all cases, the capabilities, needs, and abilities of your target audience will be factors in setting your targets for outcomes.
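For reference, here is the arithmetic behind the True Teens goals shown above, expressed as a short sketch (the rounding convention is an assumption):

targets = [
    ("80% or greater attendance",    45, 0.40),   # 40% of all 45 students
    ("80% of assignments completed", 45, 0.75),   # 75% of all 45 students
    ("no risky behavior",            12, 1.00),   # 100% of the 12 students with past records
]

for indicator, group_size, target_pct in targets:
    expected = round(group_size * target_pct)    # 18, 34, and 12 students respectively
    print(f"{indicator}: {expected} of {group_size} students ({target_pct:.0%})")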

54

Measuring Program Outcomes

Outcome #2: Prevent school drop out
Indicator: The # and % of students who do not drop out of school during the school year
Data source: School records
Applied to: All students identified as at-risk of school drop out (N = 30)
Data interval: At the end of each school semester
Goal: 24 students (80%)

Depending on what the outcome is and the characteristics of those being served, a very small percentage for achievement of the outcome may be appropriate. In fact, many staff from private grant-making foundations say that they would rather that programs set realistic goals and achieve them than try to look impressive by setting very high goals that, given the challenges faced, are unrealistic.

It is unacceptable, however, to set a goal, measure it, realize you did not meet it, and then arbitrarily lower the goal. It is important for you to find out why it was not achieved and try to improve the program’s performance the next time.

55

Measuring Program Outcomes

Outcome #3: Improve academic performance
Indicator: The # and % of students who improve their GPA by one point or better
Data source: Teacher records
Applied to: All students (N = 45)
Data interval: At the end of each school semester
Goal: 34 students (75%)

The process of setting goals for the first time may end up simply being your best guess, but there are factors to consider. Do not forget what you may have promised in a funded grant application. You may also want to consider the targets that your competitors (other organizations competing for the same funding sources) typically set and achieve, and then promise to do better.

56

Measuring Program Outcomes
STEP 3 – MANAGEMENT – Reports

Reports summarize the results of outcome data and include:
Participant characteristics
Inputs, activities & services, outputs, and outcomes
Elements requested by influencers
Comparisons with previous periods
Interpretation of the data

After the evaluation results have been collected, the information gathered by the evaluation process should be compiled into a formal report.

The report should summarize all the elements of the logic model and provide a summary of outcomes achieved, along with comparative data acquired during the program, comparing one period of data collection to another. A report does not need to be a complicated document. Essentially, it needs to state:

•We wanted to do what?

•We did what?

• So what?

The report is a strong basis for communicating the value of the program to internal management decision-makers and to external groups.

57

Measuring Program Outcomes
STEP 3 – MANAGEMENT – Reports

Inputs: What did we use? How much did we spend? How much did we consume?
Activities & Services: What we did
Outputs: How many units did we deliver? To whom? (Audience characteristics)
Outcomes: What did we achieve for the target audience?

As seen in the slide above, the report covers all the elements that were part of the evaluation plan from the beginning—inputs, activities and services, outputs, and outcomes. Imagine the strength of the report that includes all of these elements.

58

Measuring Program Outcomes
STEP 3 – MANAGEMENT – Reports

Basically:
We wanted to do what?
We did what?
So what?

An outcomes report can be as simple as answering these few questions:

•We wanted to do what? “We wanted to achieve 80%”

• We did what? “We only achieved 40%”

• So what? “This is what we will do to improve next time.”
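A minimal sketch of turning the "We wanted / We did / So what" comparison above into a one-line summary per indicator; the 80% goal and 40% actual figures come from the example in the notes, and the helper name is an assumption.

def outcome_summary(indicator, goal_pct, actual_pct):
    gap = actual_pct - goal_pct
    verdict = "goal met" if gap >= 0 else f"missed the goal by {abs(gap)} percentage points"
    return f"{indicator}: goal {goal_pct}%, actual {actual_pct}% ({verdict})"

print(outcome_summary("students completing 80% of assignments", 80, 40))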

59

Measuring Program Outcomes
STEP 3 – MANAGEMENT – Evaluate System

After one year of operation, evaluate the effectiveness of the outcomes system:
Have audiences been sufficiently identified?
Are outcomes clearly written?
Are outcomes sufficient to describe what you hope will happen?
Are data collection methods cost-efficient?

Many programs are time limited. Some funding may only be available for a short time, yet full evaluations are expected. This makes it difficult to test the evaluation system before using it for the first time. Even though you may have been very careful in developing the evaluation plan for your program, there may be some "bugs" in it that will have to be worked out. Use the questions in this slide to find out whether the evaluation process is running as it should.

60

Measuring Program Outcomes
CONCLUSION – Definition

OBE definition: Outcome-based evaluation is a systematic way to assess the extent to which a program has achieved its intended results.

Seeing this definition for the second time, you can now see that outcome-based evaluation is systematic and that it is about measuring YOUR program's outcomes. You can do it! OBE is a good management practice for all programs.

61

Measuring Program Outcomes
CONCLUSION – Your Notes

As a result of this training toolkit, the following represents my action list:
What do I need more information about?
What do I need to do first?
Who should be involved?

Now that you have completed this tutorial and the accompanying handouts, answer the questions listed above. Create a plan for yourself for getting this done.

Remember, outcome-based evaluation can help you do what you do better. And it is a system that affords you the opportunity to communicate in powerful ways to your constituents and potential funders just what your organization's programs are accomplishing.