
Welcome to Directors’ Program Measure Webinar - Presenter: Jennifer Coffey, Ph.D., OSEP Project Officer, SPDG Program Lead

Click the Person icon to:
• Raise Your Hand
• Agree/Disagree
• Other…

Click Full Screen to maximize the presentation screen.

Click this icon to:
• Send private messages
• Change text size or chat color

To post chat messages:
• Type your message in the chat box
• Then hit ‘enter’ on your keyboard to send

Webinar Ground Rules

Mute your phones:
• To mute or un-mute, press *6
• Please do not put your phones on ‘Hold’

For webinar technical difficulties:
• Send email to adesjarl@uoregon.edu

Q & A Process (audio/chat) – ask questions in two ways:
1. Audio/Voice
2. Type your question in the Chat Pod

Archive recording, PPT, & materials:
• To be posted to http://signetwork.org/events/108

SAVE THE DATES!

SPDG National Meeting

March 6 & 7, 2012, Washington, DC

OSEP Project Directors’ Conference

July 23-25, 2012, Washington, DC

SPDG National Meeting – March 5: MEET UP Night

Upcoming Calls


December 1, 2011, 3:00-4:30pm ET
Program Measure Directors' Webinar Series #3: Ongoing Technical Assistance and Teacher Retention
Jennifer Coffey, PhD, OSEP

January 11, 2012, 3:00-4:30pm ET
Adult Learning Principles
Carol Trivette, Ph.D., Orelena Hawks Puckett Institute


Building a Leadership Support Network from Identification of Need to the Launch – the Oklahoma Story
November 7, 2011, 12pm EST

The Personnel Improvement Center @ NASDSE and the Oklahoma State Department of Education will share how they are building a support network for new special education directors, which includes leadership institutes, webinars with follow-up blog discussions, and mentoring by veteran directors. Network goals include the development and retention of special education directors to ensure that they are able to attract, support, and retain highly qualified, highly effective special education teachers.

http://click.icptrack.com/icp/relay.php?r=17436393&msgid=408221&act=W7IT&c=572590&destination=https%3A%2F%2Ftadnet.ilinc.com%2Fregister%2Ffmcbsby


Professional Learning Communities: Updates

• Disband Secondary Education & Transition PLC (includes Adolescent Literacy)

• Potential merge: NCRTI RTI CoP Calls & SPDG RtI/Multi-Tiered Models of Intervention PLC


State Grantees Profile – Desktop Share

The Revised SPDG Program Measures: An Overview

Jennifer Coffey, Ph.D., SPDG Program Lead
August 30, 2011


Capturing Performance

Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.


Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)

Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.


Continuation Reporting

2007 grantees will not be using the new program measures.

Everyone else will have 1 year for practice:
› Grantees will use the revised measures this year for their APR
› This continuation report will be a pilot
› OSEP will learn from this round of reports and make changes as appropriate


Performance Measurement 2:

Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

Implementation fidelity: fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).


Citations

› Dusenbury, Brannigan, Falco, & Hansen (2003)
› Dane & Schneider (1998)
› O’Donnell (2005)
› Blase, “Innovation Fluency” presentation: http://signetwork.org/content_pages/154
› Mowbray, Holter, Teague, & Bybee (2003)


From O’Donnell (2005):

“All five studies consistently showed statistically significantly higher outcomes when the program was implemented with greater fidelity.

The studies reviewed here suggest that fidelity of implementation is more probable when an intervention manual is in place that clearly defines the critical components of the intervention and articulates a theory.

Distinctions should be made between measuring fidelity to the structural components of a curriculum intervention and fidelity to the processes that guide its design.”


EVALUATION DRIVES ERIA’S EVIDENCE-BASED PRACTICES

The Program Guide, a 16-page booklet, explicitly addresses both implementation and intervention practices to guide the design of a site-based program.

The Implementation Rubric is a 10-item instrument that provides a framework for trainers, coaches, site team members, and teachers to evaluate and discuss implementation, fidelity, and next steps.

Some additional tools include:
• end-of-event training surveys and three-month follow-ups
• feedback and support from cohort coaches and site team
• fidelity observations
• student data

ERIA’S EVIDENCE-BASED PRACTICES

The Program Guide articulates a comprehensive set of practices for all stakeholders.

Implementation Practices:
• Initial Training
• Team-based Site-level Practice and Implementation
• Implementation Rubric facilitates self-evaluation
• Ongoing Coaching
• Booster Trainings
• Implementation Rubric reflection on next steps

Intervention Practices – The 5 Steps of ERIA:
• Data-informed Decision-making
• Screening and Assessment
• Progress Monitoring
• Tiered Interventions and Learning Supports
• Enhanced Literacy Instruction

Measure 2 Methodology

Projects will report on the same initiatives they report on for Program Measure 1.

Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation, program, or system the initiative focuses on.
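As a concrete illustration (not OSEP guidance; the feature names and values below are invented), such a presence/absence measure boils down to a simple tally. A minimal Python sketch:

```python
# Minimal sketch of a presence/absence fidelity tally for one initiative.
# Feature names and values are invented; each project defines the core
# features of its own innovation, program, or system.

core_features = {
    "universal screening in place": True,
    "progress monitoring schedule followed": True,
    "tiered interventions delivered": False,
    "data team meets regularly": True,
}

def fidelity_percent(features):
    """Return the percent of core features observed to be in place."""
    return 100 * sum(features.values()) / len(features)

print(f"{fidelity_percent(core_features):.0f}% of core features in place")  # 75%
```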


When possible…

Use implementation measures that have already been created:
› For example, the new RTI implementation measure presented to the RTI PLC
› Literacy implementation – Planning and Evaluation Tool – Revised (PET-R)
› Schoolwide Evaluation Tool (SET)
› Others?


Developing a Fidelity Measure (O’Donnell, 2005)

To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005).

A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).
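A hedged sketch of such a component checklist in Python (the components and judgments below are invented; Hall & Loucks describe a paper instrument, not code):

```python
# Sketch of a component checklist in the spirit of Hall & Loucks (1977):
# one record per critical component, noting the variation observed and
# whether it falls within the acceptable range. All entries are invented.

from dataclasses import dataclass

@dataclass
class ComponentRecord:
    component: str       # critical component from the curriculum profile
    variation_seen: str  # what the observer actually saw
    acceptable: bool     # within the developer's acceptable range of variation?

checklist = [
    ComponentRecord("manual-defined lesson sequence", "followed, minor pacing changes", True),
    ComponentRecord("weekly progress checks", "checks done only monthly", False),
]

in_range = sum(r.acceptable for r in checklist)
print(f"{in_range}/{len(checklist)} components within acceptable variation")
```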


What is “it”? Operationalize

Part of speech: verb
Definition: to define a concept or variable so that it can be measured or expressed quantitatively
(Webster’s New Millennium™ Dictionary of English, Preview Edition (v 0.9.7), © 2003-2008 Lexico Publishing Group, LLC)

The “it” must be operationalized whether it is:
» An Evidence-Based Practice or Program
» A Best Practice Initiative or New Framework
» A Systems Change Initiative

Practice Profiles help operationalize practice, program, and systems features.


Searching for “It”

Research findings, materials, manuals, and journal articles do not necessarily provide clarity around core intervention elements

Current and new evidence-based practices, frameworks, programs will have a range of operational specificity

Developing clarity around the “it” is critical


Practice Profile: Defining “It” Through the Development and Use of Practice Profiles

› Guiding Principles identified
› Critical Components articulated
› For each critical component:
• Identified gold standard
• Identified acceptable variations in practice
• Identified ineffective practices and undesirable practices

Hall and Hord (2010), Implementing Change: Patterns, Principles, and Potholes (3rd Edition); adapted from the work of the Iowa Area Education Agency


Poll: Have you ever developed or helped to develop a Practice Profile or Innovation Configuration?

Vote now:
» Yes
» No


Practice Profiles: Pay Now or Pay Later

Identifies Critical Components:
• Guiding Principles
• Critical Components match the Guiding Principles
• Core activities to achieve the Critical Components

For each Critical Component:
• Identified “gold standard” activities
• Identified acceptable variations in practice
• Identified ineffective practices and undesirable practices

Your Implementation Support (Capacity Building):
» Identify and Support Implementation Team
» Provide Conceptual Overview and Rationales
» Provide Resources, Worksheets, Templates
» Facilitate Consensus Building


Resources for Building Practice Profiles


• National Centers
• Experts in Your State
• National Purveyors
• Manuals and Materials
• Implementing Districts and Schools
• Other States
• Consensus Building in Your State

Example

Problem-Solving Practice Profiles in an RtI Framework


RESOURCE – Professional Practices in Problem Solving: Benchmarks and Innovation Configurations
Iowa Area Education Agency Directors of Special Education, 1994


Practice Profiles

Each Critical Component is a heading. Each level of implementation specifies the activities necessary to operationalize that Critical Component.

The profile is laid out as a table, one row per Critical Component (e.g., “Critical Component 1: Description”), with one column per level:
• Ideal Implementation
• Acceptable Variation
• Unacceptable Variation
• Unacceptable Variation (Drastic Mutation)

Each cell holds a description of implementer behavior at that level.

Hall and Hord (2010), Implementing Change: Patterns, Principles, and Potholes (3rd Edition); adapted from the work of the Iowa Area Education Agency
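For illustration only, one row of such a profile could be modeled as a small data structure. A Python sketch with placeholder descriptions:

```python
# Illustrative only: one practice profile row as a data structure. The level
# names mirror the table above; the component and descriptions are placeholders.

from dataclasses import dataclass, field

LEVELS = ("ideal", "acceptable", "unacceptable", "drastic mutation")

@dataclass
class ProfileRow:
    critical_component: str
    behavior: dict = field(default_factory=dict)  # level -> implementer behavior

row = ProfileRow(
    critical_component="Critical Component 1",
    behavior={level: f"description of implementer behavior ({level})" for level in LEVELS},
)

rating = "acceptable"  # a rater picks the level that best matches what was observed
print(row.critical_component, "rated:", rating,
      "| counts toward fidelity:", rating in ("ideal", "acceptable"))
```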


Professional Problem Solving: 9 Critical Components

1. Parent Involvement
2. Problem Statement
3. Systematic Data Collection
4. Problem Analysis
5. Goal Development
6. Intervention Plan Development
7. Intervention Plan Implementation
8. Progress Monitoring
9. Decision Making

Each component is profiled in the same table format: Critical Component; Ideal Implementation; Acceptable Variation; Unacceptable Variation; Unacceptable Variation (Drastic Mutation).

Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994


Professional Problem Solving: Parent Involvement as a Critical Component

Critical Component 1: Description of Parent Involvement and rationales for its importance
• Ideal Implementation: Parents are informed at all decision-making points and invited to participate by phone, letter, or email. Parents “choose” to participate.
• Acceptable Variation: Parents are informed at all decision-making points and invited to participate by phone, letter, or email. Parents “choose” not to participate.
• Unacceptable Variation: Parents are informed of decisions at all decision-making points, but parents are not invited to participate.
• Unacceptable Variation (Drastic Mutation): Parents are not informed or invited to participate at decision-making points.

Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994


Professional Problem Solving: Parent Involvement – Critical Components

Critical Component: Parent Involvement
• Ideal and/or Acceptable: Parents are informed at all decision-making points and invited to participate by phone, letter, or email. Parents “choose” to participate.
• Knowledge, Skills, and Abilities: We know what is necessary to put “it” in place.
• Driver Analysis: This is how we ensure that “it” is in place!
• Areas of Impact (outcomes): We can prove that we’ve “got it”!


Things to Think About

Think about your SPDG effort and your involvement and guidance at the State, District, and School levels.

Currently, our SPDG work is well operationalized…
…at the classroom level: __ Strongly Agree __ Agree __ Disagree __ Strongly Disagree
…at the district level
…at the regional level
…at the State level


Michigan’s Practice Profile: Building Leadership Team Example


CALIFORNIA’S EVALUATION TOOL: IMPLEMENTATION RUBRIC

The 10 items focus mostly on intervention practices, with site-team and fidelity items.

The overall tool, and the process by which the rubric is used, drive the implementation practices:
• Self-evaluate and reflect on learning and implementation
• Shared with coaches and trainers to guide activities
• Evaluates the fidelity of implementation of both the PD model and the interventions

The former 26-item, 3-point ERIA Checklist lacked the specificity to be meaningful and useful.


IMPLEMENTATION RUBRIC, ADAPTED FROM “GOAL ATTAINMENT SCALES”

Amy Gaumer Erickson and Monica Ballay presented “goal attainment scales” on a June 17 SIG Network webinar: http://www.signetwork.org/content_pages/78

The rubric explicitly describes 5 implementation levels for each of 10 items:
• Levels 1, 2, and 3 reflect the “Not started,” “In progress,” and “Achieved” implementation levels of the former checklist.
• Levels 4 and 5 detail concrete steps toward optimal implementation, beyond the basics.

Each implementation level for each item is explicitly described, building more meaning into the tool than the previous checklist format allowed.


IMPLEMENTATION RUBRIC EXCEL FILE: MULTI-YEAR TRACKING AND AUTOMATED REPORTS

The same file is used in all three years of ERIA, reporting both the trend and the most recent entries.
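The deck does not show the workbook’s formulas; as a rough analogue (in Python, with invented items and scores), the tracking logic amounts to:

```python
# Rough Python analogue of the multi-year tracking idea (the actual ERIA tool
# is an Excel file; this sketch, its item names, and its scores are invented).
# Each item carries a rubric level (1-5) per year.

scores = {  # item -> {year: rubric level}
    "site team meets and reviews data": {2009: 2, 2010: 3, 2011: 4},
    "enhanced literacy instruction":    {2009: 1, 2010: 3, 2011: 3},
}

for item, by_year in scores.items():
    years = sorted(by_year)
    trend = " -> ".join(str(by_year[y]) for y in years)
    print(f"{item}: trend {trend}; most recent ({years[-1]}): level {by_year[years[-1]]}")
```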


ERIA on the Web: http://calstat.org/effectivereading.html

Li Walter: li@sonic.net

Alan Wood: alan.wood@calstat.org (707) 287-0054

Other Research on Implementation Fidelity

Observations may be crucial because teachers are known to be biased in their reports (Hansen and McNeal, 1999).

Given the frequency with which adaptations are observed in research and practice, program developers need to anticipate how and when teachers will modify programs, and should develop guidelines and recommendations to ensure program goals are met (Dusenbury, Brannigan, Hansen, Walsh, & Falco, 2005).


• The project will set its own benchmarks for professional development participants 1 year into training/assistance, 2 years in, 3 years in, and 4 years in.
• For example: 1-year benchmark = 40% of core features in place; 4-year benchmark = 80% of features in place.
• The project will then determine what percentage of participants it expects to reach this benchmark (e.g., 80% of participants; see the sketch after this list).
• Participants could be individual teachers (if working with just a few teachers or another type of professional per school or district) or could be a school (if working on a school-wide basis, such as RTI or PBIS).
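A minimal sketch of that arithmetic, assuming invented per-school fidelity percentages and the slide’s example 40% benchmark:

```python
# Sketch of the benchmark arithmetic. The 40% year-1 benchmark is the slide's
# example value; the per-school fidelity percentages are invented.

YEAR_BENCHMARK = {1: 40, 4: 80}  # percent of core features expected in place

# percent of core features each participating school has in place at year 1
fidelity_at_year_1 = {"School A": 55, "School B": 35, "School C": 60, "School D": 45}

met = [s for s, pct in fidelity_at_year_1.items() if pct >= YEAR_BENCHMARK[1]]
pct_met = 100 * len(met) / len(fidelity_at_year_1)
print(f"Year 1: {pct_met:.0f}% of participants met the {YEAR_BENCHMARK[1]}% benchmark")
# A project expecting 80% of participants to meet the benchmark falls short here (75%).
```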


Self-assessment is acceptable, but projects will need to sample from the group to validate the self-assessment.

For example, if 15 schools were being measured, someone from the project would observe at least 3 (1/5th) of the schools and compare their assessment with the self-assessment, as in the sketch below.
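A sketch of that sampling check in Python (all data invented):

```python
# Sketch of validating self-assessments by observing a 1/5 sample of schools.
# School names, scores, and the agreement threshold are all invented; a real
# project would substitute its actual self-assessed and observed scores.

import random

self_assessed = {f"School {i:02d}": random.randint(40, 90) for i in range(1, 16)}

sample = random.sample(sorted(self_assessed), k=len(self_assessed) // 5)  # 3 of 15
for school in sample:
    observed = max(0, self_assessed[school] + random.randint(-15, 15))  # stand-in score
    gap = abs(observed - self_assessed[school])
    note = "review discrepancy" if gap > 10 else "agreement acceptable"
    print(f"{school}: self={self_assessed[school]} observed={observed} gap={gap} ({note})")
```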

A baseline wouldn’t be necessary


Questions?

What kind of guidance would be helpful to you?

What challenges do you foresee in capturing these data?

