Page 1

EVALUATION FOR SCHOOL-WIDE PBIS

Bob Algozzine, University of North Carolina at Charlotte
Rob Horner, University of Oregon
www.pbis.org
December 9, 2011

Acknowledgements: George Sugai, Susan Barrett, Celeste Rossetto Dickey, Lucille Eber, Donald Kincaid, Timothy Lewis, Tary Tobin

Page 2

Goals / Assumptions for the Webinar

• Goals: Participants will be able to…
   – Define the core elements of an evaluation plan
   – Define a schedule for evaluation data collection
   – Define the core elements of an evaluation report

• Assumptions: Participants already…
   – Can define the core features of SWPBIS
   – Are implementing SWPBIS
   – Have reviewed the SWPBIS Evaluation Blueprint

Page 3

Assumptions

• Every evaluation is unique. Any specific evaluation report will be affected by:
   – Funding and political focus
   – Number of years implementing PBIS
   – Focus on Tier II and Tier III
   – Additional emphasis areas (e.g., Wraparound, Bully Prevention, Discipline Disproportionality)

Page 4

Supporting Materials

• Tobin et al., 2011
• Fidelity Matrix
• Childs, George & Kincaid, 2011 (evaluation brief)
• Outline of Evaluation Report
• Evaluation Blueprint
• Implementation Blueprint
• Exemplar state evaluation reports (Illinois, Colorado, Missouri, Maryland, Florida, North Carolina)

Page 5

Foundations

• Evaluation is the process of gathering data to
   – Answer questions
   – Make decisions

• Define the core questions/decisions before selecting measures.

• Provide an adequate context for interpreting the data.

• Effective evaluation is cyclical (repeated)
   – Use the evaluation process for improvement

Page 6

Evaluation Questions / Cycle

[Figure: Plan → Perform → Measure → Compare cycle applied to the evaluation questions Context, Input, Fidelity, Impact, and Replication/Sustainability]

Page 7

Evaluation Cycle

Core Questions → Evaluation Plan → Data Collection → Evaluation Report

Page 8

Core Evaluation Questions

Context
• What are/were the goals and objectives for SWPBIS implementation? (state/district capacity; school adoption; student outcomes)
• Who delivered the training and technical assistance for SWPBIS implementation?

Input
• What professional development events were part of SWPBIS implementation support?
   – Was the projected level of TA capacity provided (training/coaching)?
• Who received the professional development (schools/cohorts)?

Page 9

Core Evaluation Questions

Fidelity
• To what extent was Tier I SWPBIS implemented with fidelity?
• To what extent were Tier II and Tier III SWPBIS implemented with fidelity?
• To what extent is the Leadership Team establishing core functions?

Page 10

Leadership Team

[Figure: Leadership Team with Active Coordination, supported by Funding, Visibility, Political Support, and Policy, providing Training, Coaching, Evaluation, and Behavioral Expertise to Local School/District Teams/Demonstrations. Sugai et al., www.pbis.org]

Page 11

Core Evaluation Questions

Impact
• To what extent is SWPBIS associated with changes in student behavioral outcomes?
• To what extent is SWPBIS associated with changes in academic performance and dropout rates?
• To what extent is district/state capacity established? (local training, coaching, evaluation, behavioral expertise)
• To what extent are leadership and policy structures established?

Page 12

Core Evaluation Questions

Replication, Sustainability, and Improvement
• To what extent do fidelity and impact outcomes sustain across time?
• To what extent does initial SWPBIS implementation affect implementation with later cohorts?
• To what extent did SWPBIS implementation change educational/behavioral capacity and policy?
• To what extent did SWPBIS implementation affect systemic educational practice?

Page 13

Evaluation Plan

• Context
   – Evaluation questions, stakeholders, purpose(s)
   – Schedule of reports for stakeholder decision-making
• Input
   – Who will provide what TA/training (and how much) to whom
• Fidelity
   – What data, collected when and by whom, to assess:
      • Leadership Team
      • School teams: Tier I, Tier II, Tier III
• Impact
   – What data, collected when and by whom, to assess:
      • District/state capacity (training, coaching, expertise, evaluation)
      • Student behavior
      • Student academics
• Replication, Sustainability, Policy

Page 14

Evaluation Plan

• Context
   – Define the roles in the evaluation: stakeholders, implementers, adopters, evaluators
   – Define the purpose of the evaluation
      • What decisions are to be affected by the evaluation?
      • What questions need to be answered to make informed decisions?
   – Define the basic goals, timeline, and constraints associated with the implementation effort

Page 15

Evaluation Plan

• Input
   – Who will provide what training and technical assistance, on what schedule, to build the leadership team's capacity to sustain and scale up SWPBIS?
   – Who will provide what training and technical assistance to school teams to result in Tier I implementation of SWPBIS?
   – Who will provide what training and technical assistance to school teams to result in Tier II and Tier III implementation of SWPBIS?

Page 16

Evaluation Plan

• Fidelity

   Content: Tier I SWPBIS fidelity
   Measure(s): TIC (progress monitor); BoQ (once TIC is at criterion); SET (for research)
   Schedule: TIC every 3-4 meetings; BoQ or SET annually in spring

   Content: Tier II and Tier III SWPBIS fidelity
   Measure(s): MATT (progress monitor); BAT (annual); ISSET (for formal research)
   Schedule: MATT every 3-4 team meetings; BAT or ISSET annually in spring

Page 17

Evaluation Plan

• Impact

   Content: Leadership Team capacity
   Measure: Implementation Self-Assessment
   Schedule: Annually (spring), for use in planning for fall

   Content: Behavior
   Measures: ODRs (SWIS); Attendance; Graduation rate
   Schedule: Continuous; Continuous; Annual

   Content: Academic
   Measures: Oral Reading Fluency (CBM); Standardized assessments; Graduation
   Schedule: Start of school, winter, early spring; Annually (spring); Spring
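Where a leadership team wants to track this plan electronically, the two Evaluation Plan tables above can be stored as plain data and queried when scheduling collection windows. The sketch below is illustrative only: the dictionary layout and the helper name `measures_for` are ours, not part of the Evaluation Blueprint, and the entries simply restate the slides.

```python
# Illustrative sketch: the Evaluation Plan tables above expressed as data.
# The structure and helper name are hypothetical; entries restate the slides.
EVALUATION_PLAN = {
    "fidelity": {
        "Tier I SWPBIS": {
            "measures": ["TIC (progress monitor)",
                         "BoQ (once TIC is at criterion)",
                         "SET (research)"],
            "schedule": ["TIC every 3-4 meetings",
                         "BoQ or SET annually in spring"],
        },
        "Tier II and Tier III SWPBIS": {
            "measures": ["MATT (progress monitor)", "BAT (annual)",
                         "ISSET (formal research)"],
            "schedule": ["MATT every 3-4 team meetings",
                         "BAT or ISSET annually in spring"],
        },
    },
    "impact": {
        "Leadership Team Capacity": {
            "measures": ["Implementation Self-Assessment"],
            "schedule": ["Annually (spring), used for fall planning"],
        },
        "Behavior": {
            "measures": ["ODRs (SWIS)", "Attendance", "Graduation rate"],
            "schedule": ["Continuous", "Continuous", "Annual"],
        },
        "Academic": {
            "measures": ["Oral Reading Fluency (CBM)", "Standardized assessments"],
            "schedule": ["Start of school, winter, early spring", "Annually (spring)"],
        },
    },
}

def measures_for(area: str) -> list:
    """Flatten every measure named under one area ('fidelity' or 'impact')."""
    return [m for content in EVALUATION_PLAN[area].values()
            for m in content["measures"]]

print(measures_for("fidelity"))
```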

Page 18

Data Collection

• 8 core PBIS measures (Tobin et al., 2011; Childs et al., 2011)

• Basic logic
   – Research-quality measures
   – Annual self-assessment measures
   – Progress monitoring measures (to be used every 3-4 meetings/cycles)

• Fidelity measure matrix (Tobin & Vincent)

Page 19

Data Collection

   Level of support: Universal
   Research measure: School-wide Evaluation Tool (SET)
   Annual self-assessment measures: Self-Assessment Survey (SAS); Benchmarks of Quality (BoQ)
   Progress monitoring measure: Team Implementation Checklist (TIC)

   Level of support: Secondary and Tertiary
   Research measure: Individual Student School-wide Evaluation Tool (I-SSET)
   Annual self-assessment measure: Benchmarks of Advanced Tiers (BAT)
   Progress monitoring measure: Measure of Advanced Tiers Tool (MATT)

   Level of support: Leadership Team
   Measure: SWPBIS Implementation Self-Assessment
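The same taxonomy can be kept as a small lookup so reports and data requests use consistent instrument names. A minimal sketch; the dictionary layout and the `instruments` helper are illustrative assumptions, not part of any PBIS tool.

```python
# Minimal sketch: the measure taxonomy above as a lookup table.
# Layout and helper name are illustrative assumptions.
MEASURES = {
    "Universal": {
        "research": ["School-wide Evaluation Tool (SET)"],
        "annual_self_assessment": ["Self-Assessment Survey (SAS)",
                                   "Benchmarks of Quality (BoQ)"],
        "progress_monitoring": ["Team Implementation Checklist (TIC)"],
    },
    "Secondary and Tertiary": {
        "research": ["Individual Student School-wide Evaluation Tool (I-SSET)"],
        "annual_self_assessment": ["Benchmarks of Advanced Tiers (BAT)"],
        "progress_monitoring": ["Measure of Advanced Tiers Tool (MATT)"],
    },
    "Leadership Team": {
        "annual_self_assessment": ["SWPBIS Implementation Self-Assessment"],
    },
}

def instruments(level: str, category: str) -> list:
    """Instrument names for a support level and measure category (empty if none)."""
    return MEASURES.get(level, {}).get(category, [])

print(instruments("Universal", "progress_monitoring"))
```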

Page 20

Data Collection Schedule

[Table: administration points, by season, for each measure across the Pre period and Years 1-3]
   Universal: Progress Monitoring (TIC); Annual Self-Assessment (BoQ); Research Measure (SET); Self-Assessment Survey (SAS)
   Secondary/Tertiary: Progress Monitoring (MATT); Annual Self-Assessment (BAT); Research Measure (I-SSET)
   Leadership Team: Implementation Self-Assessment

Page 21

Data Collection Schedule (continued)

[Table: administration points for the same measures across Years 4-6; only the annual self-assessments (BoQ, BAT) and the Leadership Team Implementation Self-Assessment remain marked in these years]
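Because the exact administration points in these grids are set locally, one practical option is to log each administration as it happens and regenerate the grid on demand. A minimal sketch under that assumption; the logged records below are invented examples, not the schedule from the tables.

```python
# Minimal sketch: rebuild a data-collection grid from logged administrations.
# The example records are hypothetical.
from collections import defaultdict

administrations = [
    ("TIC", 1, "Fall"), ("TIC", 1, "Winter"), ("TIC", 1, "Spring"),
    ("BoQ", 1, "Spring"), ("SET", 1, "Spring"),
    ("MATT", 2, "Fall"), ("BAT", 2, "Spring"),
]

grid = defaultdict(list)
for measure, year, season in administrations:
    grid[measure].append(f"Year {year} {season}")

for measure, points in sorted(grid.items()):
    print(f"{measure:4s} -> {', '.join(points)}")
```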

Page 22

Evaluation Report

• Basic outline
   – Context, Input, Fidelity, Student Outcomes, Future Directions

• Local adaptation
   – Cohorts
   – Locally important questions (disproportionality, bully prevention, cost)
   – Change in evaluation focus over time (increased emphasis on Tier II and Tier III)

• Examples at www.pbis.org/evaluation

Page 23

Evaluation Report

• Context
• Define SWPBIS
   – SWPBIS is a framework for establishing a school-wide social culture, with the individualized supports needed for all students to achieve academic and social success.
• Define the goals of the specific project
   – Number of schools per year implementing SWPBIS
      • How schools were selected
      • Expectation of 2-3 years for Tier I implementation to reach criterion
      • Expectation of 2 years of criterion implementation to affect academic outcomes
   – Development of district/state capacity
      • Capacity needed for sustained and scaled implementation
   – Behavioral and academic outcomes for students
      • Student outcomes linked to fidelity of implementation
• Define stakeholders / evaluation questions
   – The evaluation report is written at the request of:
   – The evaluation report is focused on the following key questions:

Page 24

Evaluation Report

• Input
• Who received what support, and from whom?
   – Leadership team
   – Local capacity building
      • Training, Coaching, Behavioral Expertise, Evaluation
   – School teams

Page 25

• Leadership Team
   – Planning dates
   – Development of implementation plan
   – Dates of SWPBIS Implementation Self-Assessment

• Capacity Building
   – Training, Coaching, Behavioral Expertise, Evaluation
      • Data collection systems

Page 26

Page 27

Schools; Coaches; Trainers

Page 28

Number of Schools, Use of Fidelity Data, and Access to ODR Data

[Figure: number of schools, schools with TIC/BoQ fidelity data, and schools with ODR data, by year (Year 1-Year 3)]

Page 29

Evaluation Report: Impact on SWPBIS Fidelity

• Leadership Team
   – SWPBIS Implementation Self-Assessment
• Capacity development
   – Number of trainers/coaches available to support teams/districts
   – Behavioral expertise available to support Tier II and Tier III implementation
   – Evaluation capacity (data collection, data use, information distribution)
• School teams
   – Tier I implementation (TIC, BoQ, SET, SAS), collectively and/or by training cohort
   – Tier II / Tier III implementation (MATT, BAT, ISSET), collectively and/or by training cohort
   – Additional measures of fidelity (Phases of Implementation; CICO checklist)

Page 30

Evaluation Report

[Figure: Leadership Team Self-Assessment total score (0-100%) across Year 1, Year 2, and Year 3]

Page 31

Evaluation Report

• Fidelity of Leadership Team development
   – SWPBIS Implementation Self-Assessment sub-scales

[Figure: sub-scale scores (0-100) by year (Year 1-Year 3) for Leadership Team, Funding, Visibility, Political Support, Policy, Training, Coaching, Evaluation, Expertise, and Demonstration Schools]

Page 32

Evaluation Report

• Fidelity (TIC, BoQ, SET, SAS) total score
   – Schools

Page 33

Evaluation Report

• Fidelity (cohort): total score distribution, N = 17

[Figure: histograms of total fidelity scores (binned 0-100) for the cohort in Fall of Year 1 and Spring of Year 1]

Page 34

Evaluation Report

• Fidelity (subscales) TIC

Page 35

Evaluation Report

BoQ Subscale Report

Page 36

Evaluation Report

• Tier II and Tier III fidelity measures (ISSET/BAT/MATT), cohort N = 17

[Figure: histograms of ISSET/BAT/MATT total scores (binned 0-100) for Tier II and Tier III at Time 1 and Time 2]

Page 37

Evaluation Report

• Fidelity (Tier II & III)

[Figure: number of schools using CICO and schools at CICO criterion, Year 2 and Year 3]

Page 38

Tier I, Tier II, Tier III Criterion: Percentage of Schools

[Figure: proportion of schools at Tier I, Tier II, and Tier III criterion in Year 1 (N = 34), Year 2 (N = 68), and Year 3 (N = 97)]

Page 39

Evaluation Report

• Impact: Student outcomes

Page 40

Steve [email protected]

www.cenmi.org/miblsi

Page 41

Participating Schools

• 2000 Model Demonstration Schools (5)
• 2004 Schools (21)
• 2005 Schools (31)
• 2006 Schools (50)
• 2007 Schools (165)
• 2008 Schools (95)
• 2009 Schools (150*)

Total of 512 schools in collaboration with 45 of 57 ISDs (79%)

The strategies and organization for initial implementation need to change to meet the needs of larger-scale implementation.

Page 42

Average Major Discipline Referrals per 100 Students by Cohort

[Figure: average major discipline referrals per 100 students for Cohort 1 (n=15), Cohort 2 (n=19), Cohort 3 (n=34), and Cohort 4, across 2004-2005 through 2007-2008]
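The chart above uses the standard rate metric of major referrals per 100 students, which lets schools and cohorts of different sizes be compared directly. A minimal worked sketch; the enrollment and referral counts are invented.

```python
# Minimal sketch: major office discipline referrals (ODRs) per 100 students.
# Example counts are hypothetical.
def odrs_per_100(major_referrals: int, enrollment: int) -> float:
    """Rate of major ODRs per 100 enrolled students."""
    return 100.0 * major_referrals / enrollment

print(odrs_per_100(major_referrals=412, enrollment=530))  # ~77.7 per 100 students
```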

Page 43

Focus on Implementing with Fidelity, using Benchmarks of Quality (BoQ) and ODRs, '06-'07 and '07-'08

Average Change in Major Discipline Referrals: One District Example (13 elementary schools)

[Figure: schools that did not meet the BoQ criterion (< 70; n=5, range 41-65) showed an 8% increase in major discipline referrals, while schools that met the criterion (> 70; n=8, range 72-94) showed a 14.6% decrease]
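The comparison above splits schools by whether their BoQ total met the criterion of 70 and then averages each group's year-over-year change in major referrals. A minimal sketch of that calculation; the school records below are invented, not the district's data.

```python
# Minimal sketch: average % change in major ODRs for schools above vs. below
# the BoQ criterion (70). School records are hypothetical.
schools = [
    {"boq": 62, "odr_0607": 350, "odr_0708": 378},   # below criterion
    {"boq": 81, "odr_0607": 290, "odr_0708": 248},   # met criterion
    {"boq": 74, "odr_0607": 410, "odr_0708": 355},   # met criterion
]

def pct_change(before: int, after: int) -> float:
    return 100.0 * (after - before) / before

met = [pct_change(s["odr_0607"], s["odr_0708"]) for s in schools if s["boq"] >= 70]
not_met = [pct_change(s["odr_0607"], s["odr_0708"]) for s in schools if s["boq"] < 70]

print(f"Met BoQ criterion:      {sum(met) / len(met):+.1f}%")
print(f"Did not meet criterion: {sum(not_met) / len(not_met):+.1f}%")
```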

Page 44

Percent of Students Meeting DIBELS Spring Benchmark for Cohorts 1-4 (Combined Grades)

[Figure: percent of students meeting the DIBELS spring benchmark for Cohorts 1-4, 2003-04 through 2008-09; students assessed per cohort: 5,943; 8,330; 16,078; and 32,257; Spring '09: 62,608 students assessed across cohorts 1-4]

Page 45

• Percent of Students at DIBELS Intensive Level across years by Cohort

[Figure: percent of students at the DIBELS intensive intervention level for Cohorts 1-4, 2003-04 through 2008-09]

Page 46

North Carolina Positive Behavior Interventions & Support Initiative

Heather R. Reynolds, NC Department of Public Instruction
Bob Algozzine, Behavior and Reading Improvement Center

http://www.dpi.state.nc.us/positivebehavior/

Page 47

Suspensions per 100 students

Page 48

Cedar Creek Middle School, Franklin County, North Carolina

[Figure: four panels comparing Pre-PBIS (05-06) and Post-PBIS (08-09): Enrollment, ODRs per 100 students, % Meeting Reading AND Math EOG, and Staff Turnover]

Page 49

North Carolina Positive Behavior Support Initiative

Schools with Low ODRs and High Academic Outcomes (Dr. Bob Algozzine)

[Figure: scatterplot of the proportion of students meeting the state academic standard (EOG Reading) against office discipline referrals per 100 students, with a linear fit; r_xy = -.44 (n = 36)]
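The scatterplot reports a Pearson correlation of r = -.44 (n = 36) between ODRs per 100 students and the proportion of students meeting the state reading standard. The sketch below shows how such a coefficient is computed; the paired values are invented, not the North Carolina sample.

```python
# Minimal sketch: Pearson correlation between ODR rate and reading proficiency.
# The paired values are hypothetical.
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

odr_per_100 = [0.10, 0.25, 0.40, 0.55, 0.70, 0.90]   # x-axis: ODRs per 100 students
pct_meeting_eog = [88, 84, 79, 81, 72, 66]           # y-axis: % meeting reading standard
print(f"r = {pearson_r(odr_per_100, pct_meeting_eog):.2f}")
```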

Page 50

Evaluation Report

• Implications
   – Policy
   – Practice
   – Technical Assistance

• Future Directions

Page 51

Summary

• Building an evaluation plan for SWPBIS implementation

• Collecting and organizing data

• Reporting data for active decision-making