
Integrating Evaluation into the Design of Your

Innovative Program

Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency

Innovation Symposium, Chapel Hill, NC
Thursday, January 10, 2008

Workshop Outline

1. Introductions

2. Activity – Evaluation in Our Lives

3. Evaluation and its Evolution at EPA

4. Case Study – Product Stewardship in MN

5. Exercise – Integrating Evaluation in MN

6. Opportunities to Integrate Evaluation


Introductions

This will be an interactive workshop… so let’s interact!

• Get to know someone at your table
• Tell us:
• Who they are,
• Who they work with, and
• Their New Year’s resolution


Purpose of the Workshop

Through discussion and a practical, real-world example, provide participants with the structure and conceptual understanding necessary to integrate evaluation and performance management into the design of environmental programs.


Evaluation In Our Lives

Activity

• Name something in your life that you or someone else decided was worth measuring and evaluating.

• What was the context?

• Was there a target or goal…what was it?

• Who was the audience?

• How did you measure progress or success?

• How did you use what you learned?


Evaluation In Our Programs

What can we take from evaluation in our lives and apply to addressing environmental challenges?

• Measure what matters

• Evaluate for others and for ourselves

Integrating evaluation into program design

• Equal parts art and skill

• Performance management and quality evaluation are inseparable


Evaluation In The EPA

Evaluation Support Division

ESD’s Mission

• Evaluate innovations

• Build EPA’s capacity to evaluate

Performance Management

• An approach to accomplishing EPA goals and ESD’s mission


Performance Management

Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement, and program evaluation.

Logic Model

Tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.

Performance Measurement

Helps you understand what level of performance is achieved by the program/project.

Program Evaluation

Helps you understand and explain why you’re seeing the program/project results.


Steps to Completing an Evaluation

I. Select a Program for Evaluation

II. Identify Team/Develop Evaluation Plan

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify/Develop Measures

VI. Design the Evaluation

VII. Collect Information

VIII. Analyze and Interpret Information

IX. Develop the Report

Logic Model

[Logic model diagram: Resources/Inputs → Activities → Outputs → Customers → Short-term outcome → Intermediate outcome → Longer-term outcome (STRATEGIC AIM). HOW reads toward resources; WHY reads toward outcomes. The left side is labeled PROGRAM; the right side, RESULTS FROM PROGRAM. External conditions influence performance (+/-). Example line of logic: Me → Snodgrass Juggling Regimen → Training → Commitment → Victory.]
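To make the diagram concrete, here is a rough Python sketch (my own illustration, not part of the workshop materials) of a line of logic as a data structure; the placement of the juggling components into specific fields is my guess.

```python
from dataclasses import dataclass

@dataclass
class LogicLine:
    """One line of logic: resources through the long-term outcome."""
    resources: list[str]             # inputs the program consumes
    activities: list[str]            # work performed
    outputs: list[str]               # products/services delivered
    customers: list[str]             # target population reached
    short_term_outcomes: list[str]
    intermediate_outcomes: list[str]
    long_term_outcome: str           # the strategic aim

# The juggling example from the slide (field placement is an assumption):
juggling = LogicLine(
    resources=["Me"],
    activities=["Snodgrass Juggling Regimen"],
    outputs=["Training"],
    customers=["Me"],
    short_term_outcomes=["Commitment"],
    intermediate_outcomes=[],
    long_term_outcome="Victory",
)
```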

Performance Measurement

Definition

• The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures

Measures are designed to check the assumptions illustrated in the logic model


Measures Across the Logic Model Spectrum

Resources/Inputs
• Definition: Measure of resources consumed by the organization.
• Example measures: Amount of funds, # of FTE, materials, equipment, supplies, etc.

Activities
• Definition: Measure of work performed that directly produces the core products and services.
• Example measures: # of training classes offered as designed; hours of technical assistance training for staff.

Outputs
• Definition: Measure of products and services provided as a direct result of program activities.
• Example measures: # of technical assistance requests responded to; # of compliance workbooks developed/delivered.

Customers Reached
• Definition: Measure of target population receiving outputs.
• Example measures: % of target population trained; # of target population receiving technical assistance.

Customer Satisfaction
• Definition: Measure of satisfaction with outputs.
• Example measures: % of customers dissatisfied with training; % of customers “very satisfied” with assistance received.

Outcomes
• Definition: Accomplishment of program goals and objectives (short-term and intermediate outcomes, long-term outcomes/impacts).
• Example measures: % increase in industry’s understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.
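As a toy illustration (not from the workshop materials), a measure can be expressed as a named quantity tied to a logic model element; the names and numbers below are invented.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    element: str   # logic model element the measure checks
    name: str
    value: float
    unit: str

# Invented figures, purely for illustration:
trained, target_population = 340, 500
measures = [
    Measure("Customers Reached", "% of target population trained",
            100 * trained / target_population, "%"),
    Measure("Outputs", "# of compliance workbooks delivered", 12, "count"),
]
for m in measures:
    print(f"{m.element}: {m.name} = {m.value:g} {m.unit}")
```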

Program Evaluation

Definition

• A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.

Orientation/Approaches to Evaluation

• Accountability (external audience)

• Learning & Program Improvement (internal/external audiences)

Types of Evaluation

[Diagram: types of evaluation (design, process, outcome, impact) spanning the logic model components: Resources/Inputs, Activities, Outputs, Customers, Short-term outcome, Intermediate outcome, Longer-term outcome (STRATEGIC AIM), with the HOW/WHY axis as in the logic model slide.]
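One way to read the diagram is sketched below; the exact mapping of evaluation types to logic model segments is an assumption based on common usage, not stated explicitly on the slide.

```python
# Assumed mapping of evaluation types to logic model segments:
evaluation_types = {
    "design evaluation": ["program design/logic, before implementation"],
    "process evaluation": ["resources/inputs", "activities", "outputs", "customers"],
    "outcome evaluation": ["short-term outcome", "intermediate outcome"],
    "impact evaluation": ["longer-term outcome (strategic aim)"],
}
```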

Questions, Comments and Clarifications

Are there any questions or comments about what we have covered so far?


Environmental Evaluation: Evolving Theory and Practice

ESD is witnessing the shift from awareness to action

We are adapting to the increasing sophistication of our clients and demands from stakeholders

• Capacity Building

• Evaluations

Managing performance requires integrating evaluation into program design


Our Case Study

Our case study is representative of a trend toward more sophisticated evaluations of environmental programs

ESD is applying learning and adding to it as we take on more sophisticated projects

From here on, you will receive the information necessary to complete the exercises

• You are responsible for integrating evaluation into the program

• Ask questions and take notes!


Case Study: Paint Product Stewardship Initiative

Background on…

• Current Status and Goals of PPSI

• Minnesota Demonstration Program

Evaluating the Demonstration Program

What Will We Evaluate?

• Paint
• Management systems
• Education
• Markets
• Cooperation?
• Financing system?

Regional Draft Infrastructure

Why Are We Evaluating?

• Leadership

• Legislation

• Learning

• Transfer


Evaluating the Demonstration Program

What will we evaluate?

• Paint, Management Systems, Education, Markets

Why are we evaluating the program?

• Leadership, Legislation, Learning, Transfer

Can we integrate evaluation into this project?

• We need a framework to follow…and we are building it as we go

• Initially, integrating evaluation into your program is a design and planning activity

Integrating Evaluation into Program Design

[Framework diagram with four components:
Program: 1. Team  2. Mission  3. Goals & Objectives  4. Logic Model
Questions: 1. Context  2. Audience  3. Communication  4. Use
Measures: 1. Data Sources  2. Collection Methods & Strategy  3. Analysis Tools  4. Data Collection  5. Data Management
Documentation: 1. Performance Management Policy  2. Evaluation Methodology]
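As a rough sketch (my own illustration, not EPA material), the four components can be captured as a plan structure that a team fills in step by step; every field name here is an assumption.

```python
# Hypothetical evaluation-design plan mirroring the four-component framework.
evaluation_plan = {
    "program": {
        "team": [], "mission": "", "goals_objectives": [], "logic_model": None,
    },
    "questions": [
        # one entry per evaluation question
        {"text": "", "context": "", "audience": [], "communication": "", "use": ""},
    ],
    "measures": [
        {"question": "", "data_sources": [], "collection_methods_strategy": "",
         "analysis_tools": [], "data_collection": "", "data_management": ""},
    ],
    "documentation": {
        "performance_management_policy": "", "evaluation_methodology": "",
    },
}
```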

Questions, Comments and Clarifications

Take a few minutes to familiarize yourself with the mission, goals and objectives of the MN demonstration program


Exercise: Integrating Evaluation

Minnesota Demonstration Project and Performance Management

• We will introduce a process for integrating evaluation into the MN program

• We will use the process to integrate evaluation, step by step, into the design of the MN program

Logistics

• Your table is your group for the rest of the workshop

• After brief instruction, each team will complete each step of the process and report the results

Integrating Evaluation into Program Design

[Framework diagram repeated; the Program component (Team, Mission, Goals & Objectives, Logic Model) is highlighted: the MN demonstration project is our program.]

Your table is the team that will build evaluation into the MN program.

Select and Describe the Program

Describing the MN program:
• Mission
• Goals and objectives
• Logic model: we are going to make one!

[Framework diagram repeated; Program component highlighted.]

Describe the Program: Logic Model

[Example logic model repeated: Me → Snodgrass Juggling Regimen → Training → Commitment → Victory.]

Instructions: Each table will craft a line of logic based on one goal (long-term outcome) of the MN project. For each component of the model (e.g. activity, output, outcome), brainstorm with your group to decide on 2-3 items to complete your line of logic.

Template columns: Resources | Activities | Outputs | Customers | Short-Term Outcomes | Intermediate Outcomes | Long-Term Outcomes

[Framework diagram repeated; Program component highlighted.]

Evaluation Questions

What are the critical questions to understanding the success of the MN program?

Use an outcome from your logic model to create your evaluation question.

[Framework diagram repeated; Questions component highlighted.]

Evaluation Questions

What contextual factors may influence the answers to each question?

Who are the audiences for each question?
• What’s the best way to communicate with each audience?
• How might each audience use the answer to each question?

[Framework diagram repeated; Questions component (Context, Audience, Communication, Use) highlighted.]

Evaluation Questions

What are the critical questions to understanding the success of the MN program?

Use an outcome from your logic model to create your evaluation question.

What contextual factors may influence the answers to each question?

Who are the audiences for each question?
• What’s the best way to communicate with each audience?
• How might each audience use the answer to each question?

[Framework diagram repeated; Questions component highlighted.]

Performance Measures

What can we measure to answer each question?

Where can we find the information for each measure?

How can we collect the information?

Given our questions and information to be collected, what will be an effective collection strategy?

Performance Measures


What analytical tools will give us the most useful information?

How will we implement the collection strategy?

How will we manage the data?

[Framework diagram repeated; Measures component (Data Sources, Collection Methods & Strategy, Analysis Tools, Data Collection, Data Management) highlighted.]

Performance Measures

What can we measure to answer each question?

What methods are best suited for each measure?

What analytical tools will give us the most useful information?

Given our questions and information to be collected, what will be our collection strategy?

• How will we implement the collection strategy?

• How will we manage the data?

[Framework diagram repeated; Measures component highlighted.]
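To show how the Measures questions above might be answered in practice, here is one hypothetical filled-in entry for the MN paint program; every specific below is invented for illustration.

```python
# Hypothetical "Measures" entry for one evaluation question (all details invented):
measures_entry = {
    "question": "Did leftover-paint collection increase in participating MN counties?",
    "measures": ["gallons of paint collected per quarter",
                 "% of drop-off sites reporting on time"],
    "data_sources": ["site collection logs", "county survey"],
    "collection_methods_strategy": "quarterly site reports plus an annual survey",
    "analysis_tools": ["spreadsheet trend charts", "before/after comparison"],
    "data_management": "central spreadsheet maintained by the evaluation team",
}
```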


Documentation: Methodology & Policy

Evaluation Methodology

The process of integrating evaluation generates a framework for a methodology and an evaluability assessment

Performance Management Policy

Across office programs and projects

Guides strategy and planning

[Framework diagram repeated; Documentation component (Performance Management Policy, Evaluation Methodology) highlighted.]

Check the Logic

Revisit the process and the decisions made

Look for the flow in the process and identify potential breaks

Identify potential obstacles to our approach to managing the performance of the MN demonstration program

The first cycle integrates evaluation; the next cycle begins implementation

What is happening today with the PPSI?

MOU

Workgroups/committees

Minnesota demonstration project planning

Integrating evaluation into project design


Recap and Next Steps

Practice : Theory

• An inconsistent ratio

Movement in the environmental community toward:

• Evidence

• Effectiveness

• Evaluation

Opportunities to merge theory and practice

• Policy

• Leadership

• New programs

• Capacity building efforts like this one


Thank You!

Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency

Matt Keene

(202) 566-2240

Keene.Matt@epa.gov

www.epa.gov/evaluate


Adaptive Management Cycle


Evaluation…In the Life of a Program

When to do it?

What are the obstacles?

Are there solutions?

Are there opportunities to improve evaluations in your shop?


Evaluation Questions

What are the critical questions to understanding the success of the MN program?

Link your questions to a component in your line of the logic model

What contextual factors may influence the answers to each question?

Who are the audiences for each question?

• What’s the best way to communicate with each audience?

• How might each audience use the answer to each question?


Document Evaluation Policy and Methodology

Evaluation Policy

Evaluation Methodology


Performance Measures

What can we measure to answer each question?

What methods are best suited for each measure?

What analytical techniques could we use to maximize the rigor of our analysis?

Given the level of rigor desired, what will be our collection strategy?

• How will we implement the collection strategy?
• How will we manage the data?

Materials

Presentation

Flip charts

Markers

Projector

Laptop

Tape for flipchart paper

Post-its

Supporting documents from PPSI, etc.

MN MOU

MN Goals and Objectives and Tasks

Workplan

Logic Model


Performance Management Cycle

[Diagram: Performance Management Cycle with components: Program Mission; Planning; Logic Model (conceptual framework); Performance Measurement (helps you understand what); Program Evaluation (helps you understand and explain why); Aggregate/Analysis; Adapt/Learn/Transfer. Presenter note: the cycle needs adaptive management components like "implement".]

Steps to Integrating Evaluation into Program Design

[Diagram: the steps arranged as a cycle: Identify a Team; Select a Program; Describe Program (Needs, Mission, Goals & Objectives, Logic Model); Develop Questions (Context, Audiences, Use, Communication); Identify Measures (Methods, Analysis, Collection Strategy, Collection, Data Management); Document (Policy, Methodology).]

Integrating Evaluation into Program Design

[Diagram: five components: Team; Program (Needs & Mission, Goals & Objectives, Logic Model); Questions (Context, Audience, Communication, Use); Measures (Methods, Analysis, Strategy, Collection, Data Management); Documentation (Performance Management Policy, Evaluation Methodology).]

Program Management Cycle


Needs, Mission and Goals and Objectives

Needs: what drives the need for performance management?

Mission

Goals and Objectives

Logic Model

Each table gets a logic model template

Goals from the MN project represent long-term outcomes

Each table fills in the other components of the Logic Model

We’ll put the lines of logic together to form a complete-ish model

Integrating Evaluation into Program Design

[Framework diagram repeated: Program, Questions, Measures, Documentation.]

[Framework diagram repeated: Team; Program (Needs & Mission, Goals & Objectives, Logic Model); Questions (Audience, Context, Communication, Use); Measures (Data Sources, Methods & Strategy, Analysis Techniques, Collection, Data Management); Documentation (Performance Management Policy, Evaluation Methodology).]

