Evaluation Planning & Outcome Measurement: Beyond the #!@#*$! Logic Model
2010 AAEA In-Service, Maricopa Agricultural Center
Al Fournier, Dan McDonald & Tom DeGomez



Page 1

Evaluation Planning & Outcome Measurement

Beyond the #!@#*$! Logic Model

2010 AAEA In-Service, Maricopa Agricultural Center

Al Fournier, Dan McDonald & Tom DeGomez

Page 2

Why Evaluate?

Page 3

Why Evaluate? (1)

• “Customer satisfaction”
• Program improvement (formative)
• Did people learn? (knowledge, skills)
• Did people change? (adoption)
• Did the situation change? (outcome, impact)

Page 4

Why Evaluate? (2)

• To document outcomes
• Reporting
• To prove our worth (APR, continuing)
• To keep $ flowing (grant success)
• Because we have to!

Page 5

What do you evaluate?

Page 6

How often do you evaluate “customer satisfaction”?

1. Always
2. Most of the time (> 66%)
3. Sometimes (34-65%)
4. Occasionally (< 33%)
5. Never

[Audience response chart; percentages as labeled: 8%, 58%, 8%, 0%, 25%]

Page 7

How often do you evaluate clientele learning?

1. Always
2. Most of the time (> 66%)
3. Sometimes (34-65%)
4. Occasionally (< 33%)
5. Never

[Audience response chart; percentages as labeled: 23%, 46%, 0%, 0%, 31%]

Page 8

How often do you evaluate application of learning?

1. Always
2. Most of the time (> 66%)
3. Sometimes (34-65%)
4. Occasionally (< 33%)
5. Never

[Audience response chart; percentages as labeled: 14%, 21%, 0%, 7%, 57%]

Page 9

How often do you evaluate outcomes?

1. Always
2. Most of the time (> 66%)
3. Sometimes (34-65%)
4. Occasionally (< 33%)
5. Never

[Audience response chart; percentages as labeled: 13%, 60%, 0%, 7%, 20%]

Page 10

How often do you evaluate impacts?

1. Always
2. Most of the time (> 66%)
3. Sometimes (34-65%)
4. Occasionally (< 33%)
5. Never

[Audience response chart; percentages as labeled: 0%, 47%, 0%, 33%, 20%]

Page 11

Outcomes versus Impacts

• Outcomes (short term): increase awareness, increase knowledge, change attitudes, apply knowledge, adopt a practice
• Impacts (long term): the results of short-term outcomes “in the world”, e.g., increase profits, improve health, reduce pollution

Page 12

Taught growers how to reduce irrigation but maintain yields

1. Outcome
2. Impact

[Audience response chart; percentages as labeled: 0%, 100%]

Page 13

Increased parent awareness of cyber-bullying

1. Outcome
2. Impact

[Audience response chart; percentages as labeled: 6%, 94%]

Page 14

Reduced absentee rates in schools after adopting IPM

1. Outcome
2. Impact

[Audience response chart; percentages as labeled: 100%, 0%]

Page 15

Barriers to Evaluation

Page 16

Barriers to Evaluation

• Lack of time

• Lack of $

• Lack of knowledge

• Lack of experience

• Low priority / not important

• Not interested

• Other?

Page 17

Which program should I evaluate? (PSU tipsheet)

http://extension.psu.edu/evaluation/titles.html

Page 18

Why Evaluate? (2)

• To prove our worth (APR, continuing)
• To keep $ flowing (grant success)
• Because we have to!

Page 19

Example: Regional IPM

Applications must provide detailed plans for evaluation of the project…. The evaluation plan should include specific evaluation objectives and measurement indicators (e.g., adoption rate, number of acres impacted, pesticide use, risk reduction, profitability) that will be used to measure impacts and outcomes resulting from the project. Evaluation plans that include surveys should indicate survey expertise of investigators and/or describe the survey methodology that will be used.

Page 20

Evaluation Planning

• Evaluation Objectives: What change do you want to document?

• Measurement Indicators: What data will you use to document change?

• Methodology: How will you collect the data?

Page 21

Evaluation Plan (1)

• Objective 1: To measure knowledge of natural enemies (I.D.) & their role in whitefly management.

• Data: Can they identify natural enemies? Do they know the role of specific fauna in whitefly management? (knowledge)

• How: audience response survey

Page 22

Evaluation Plan (2)

• Objective 2: To collect baseline data on current whitefly management practices.

• Data: self-reported sampling practices and thresholds for treatment, do they consider natural enemies?

• How: survey implemented face-to-face & online.

Page 23

Evaluation Plan (3)

• Objective 3: To measure the intention of clientele to adopt revised thresholds, sampling and management guidelines.

• Data: self-reported attitudes about usefulness of NE in WF control; willingness to adopt.

• How: the same survey implemented face-to-face & online.

Page 24

Example: Regional IPM

Applications must provide detailed plans for evaluation of the project…. The evaluation plan should include specific evaluation objectives and measurement indicators (e.g., adoption rate, number of acres impacted, pesticide use, risk reduction, profitability) that will be used to measure impacts and outcomes resulting from the project. Evaluation plans that include surveys should indicate survey expertise of investigators and/or describe the survey methodology that will be used.

Page 25

Example: Regional IPM

Applications must provide detailed plans for evaluation of the project….

…such as logic models or other established methods.

Page 26

Remember 3 Things:

1. What change do you want to document? (Evaluation Objectives, based on program goals, linked to program activities)

2. What data will you use to document change? (Measurement Indicators)

3. How will you collect the data? (Methodology)
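
The three questions above can also be kept as a simple structured record for each program objective. Here is a minimal sketch (an illustration added to this write-up, not part of the original slides); the class, field names, and example values are assumptions, loosely paraphrasing the whitefly evaluation plan shown earlier.

```python
# Illustrative sketch only: one record per evaluation objective, mirroring the
# three planning questions. Names and example values are assumptions.
from dataclasses import dataclass

@dataclass
class EvaluationPlanItem:
    objective: str   # 1. What change do you want to document?
    indicator: str   # 2. What data will you use to document the change?
    method: str      # 3. How will you collect the data?

whitefly_plan = [
    EvaluationPlanItem(
        objective="Measure knowledge of natural enemies and their role in whitefly management",
        indicator="Clientele can identify natural enemies and describe their role",
        method="Audience response survey",
    ),
    EvaluationPlanItem(
        objective="Collect baseline data on current whitefly management practices",
        indicator="Self-reported sampling practices and treatment thresholds",
        method="Survey implemented face-to-face and online",
    ),
]
```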

Page 27

What about your programs?

• Evaluation Objective

• Measurement Indicator

• Methodology

Page 28

Determining Evaluation Objectives

• Look at desired outcomes & impacts.

Page 29
Page 30

Determining Evaluation Objectives

• Look at desired outcomes & impacts.
• Be careful about commitments.
• Look at your outputs for who to target, what to focus on. Outputs should relate to desired outcomes.

Page 31
Page 32

Determining Evaluation Objectives

• Look at desired outcomes & impacts.
• Be careful about commitments.
• Look at your outputs for who to target, what to focus on. Outputs should relate to desired outcomes.
• Look at your inputs (resources).

Page 33
Page 34

Determining Evaluation Objectives

• Look at desired outcomes.
• Be careful about commitments.
• Look at your outputs for who to target, what to focus on. Outputs should relate to desired outcomes.
• Look at your inputs (resources).
• Keep it real & prioritize. (Needs, budget, abilities.)

Page 35

What makes a good Measurement Indicator?

• Is it measurable?

• Are data obtainable?

• Can it be quantified? (or qualified)

• Does it relate directly to your program goals?

Page 36

What makes a good Measurement Indicator?

• Is it measurable?

• Are data obtainable?

• Can it be quantified? (or qualified)

• Does it relate directly to your program goals?

Page 37

Data Sources (methods)

• Existing or common data (public sources, census data, Dept. of Education, etc.)

• Surveys (written, online, telephone)
• Interviews
• Observations
• Focus groups
• Other…?
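
For the survey option above, a small amount of code is often enough to turn raw responses into the kind of percentage breakdowns shown on the audience-response slides. Below is a minimal Python sketch (added here as an illustration, not part of the original slides); the CSV file, its layout, and the column name are assumptions.

```python
# Minimal sketch: tally one survey question into percentages.
# The CSV path and the "response" column name are hypothetical.
import csv
from collections import Counter

def tally_responses(path: str, column: str = "response") -> dict:
    """Return the percentage of respondents who gave each answer in `column`."""
    with open(path, newline="") as f:
        answers = [row[column].strip() for row in csv.DictReader(f) if row.get(column)]
    counts = Counter(answers)
    total = sum(counts.values())
    return {answer: round(100 * n / total, 1) for answer, n in counts.items()} if total else {}

# Hypothetical usage:
# tally_responses("whitefly_survey.csv")
# might return {"Always": 12.5, "Sometimes": 62.5, "Never": 25.0}
```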

Page 38

Indicate your level of experience with written surveys

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 0%, 50%, 0%, 0%, 50%]

Page 39

Indicate your level of experience with online surveys

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 0%, 19%, 31%, 6%, 44%]

Page 40

Indicate your level of experience with telephone surveys

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 0%, 7%, 27%, 53%, 13%]

Page 41

Indicate your level of experience with focus groups

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 6%, 13%, 13%, 19%, 50%]

Page 42

Indicate your level of experience with advisory groups

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 13%, 60%, 0%, 13%, 13%]

Page 43

Indicate your level of experience with interviewing

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 6%, 38%, 13%, 31%, 13%]

Page 44

Indicate your level of experience using existing data sources for evaluation

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 0%, 18%, 0%, 29%, 53%]

Page 45

Indicate your level of experience with Human Subjects (IRB)

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 0%, 44%, 19%, 25%, 13%]

Page 46

Indicate your level of experience with evaluation planning

1. I am an expert
2. Lots of experience
3. Some experience
4. Tried it once or twice
5. No experience

[Audience response chart; percentages as labeled: 0%, 44%, 0%, 0%, 56%]

Page 47

Indicate your level of familiarity with the UA CE evaluation website

1. Used the site more than once
2. Used the site to find resources
3. Viewed the site
4. Heard of it, never visited
5. Never heard of it

[Audience response chart; percentages as labeled: 24%, 12%, 12%, 35%, 18%]

Page 48

Evaluation Planning (PSU tipsheets)

• Evaluation strategy (8 steps)
• http://extension.psu.edu/evaluation/pdf/TS82.pdf

• Linking outcomes to program activities and writing Impact statements

• http://extension.psu.edu/evaluation/pdf/TS81.pdf

Page 49

Documenting Impacts

Page 50

IPM Program Goals (desired impacts)

• Reduce economic risk (profit)

• Reduce risk to human health

• Reduce risk to the environment

Page 51

Cotton IPM Saves Millions $

[Chart, 1990 to 2008; series: Whitefly, Pink bollworm, Lygus bugs, Other]

$201,000,000 saved (costs & yield loss)
IGRs, Bt cotton & AZ IPM plan
Zero grower sprays for PBW

Ellsworth et al. 2008

Page 52

Lowest Costs in 30 years (inflation-adjusted to 2008 dollars)

[Chart callouts: Lygus: -35%; PBW: -89%; Whitefly: -71%]

Fewer sprays in the last 7 years

Ellsworth 2008

Page 53

Health & Environment

[Chart, 1990 to 2008; series: Whitefly, Pink bollworm, Lygus bugs, Other]

1.7M lbs reduction in insecticide use
Lowest usage in 30 yrs!

Ellsworth et al. 2008

Page 54

What is the likelihood you will use something you learned about today?

1. Very likely
2. Somewhat likely
3. Slightly likely
4. No way
5. Not sure

[Audience response chart; percentages as labeled: 88%, 13%, 0%, 0%, 0%]

Page 55

Resources for Evaluation

• Extension Program Evaluation website

• Other internet resources

• eXtension community of practice

Page 56

New! Evaluation Lending Library