Making Program Evaluation “Visible” — October 1, 2014, MAS/FPS Fall Directors’ Institute, Oakland Schools & Wayne RESA Collaborative


Page 1: Making Program Evaluation “Visible”

Making Program Evaluation “Visible”

October 1, 2014
MAS/FPS Fall Directors’ Institute
Oakland Schools & Wayne RESA Collaborative

Page 2: Making Program Evaluation “Visible”

Table Introductions…30 seconds each
◦ Name
◦ District/Role
◦ What is something about program evaluation you hope to learn more about?

Let’s Connect

Page 3: Making Program Evaluation “Visible”

Purpose: Deepen understanding of the program evaluation process through the use of visual tools.

Objectives: Participants will…
1. Understand how to use visual tools to plan for program evaluation.
2. Understand how to select “what” & “when” to monitor to inform the evaluation process.
3. Describe how the program evaluation process is linked to continuous school improvement.

Products:
◦ “Tri-Fold” of an instructional program
◦ Strategic Map of DIP/SIP goal area

“POP” for Today

Page 4: Making Program Evaluation “Visible”

Write any burning questions NOT addressed in today’s “POP” on an index card.

Hold up when ready.

Burning Questions


Page 5: Making Program Evaluation “Visible”

Setting the Stage: Purpose Matters

Page 6: Making Program Evaluation “Visible”

Is your “program” meeting its intended purpose?

Ultimate Evaluation Question

Page 7: Making Program Evaluation “Visible”

ESEA, Sect. 1001: Purpose of Title I

“The purpose of this title is to ensure that all children have a fair, equal, and significant opportunity to obtain a high-quality education and reach, at a minimum, proficiency on challenging State academic achievement standards and State academic assessments.”

Page 8: Making Program Evaluation “Visible”

State and Federal Requirements Related to Program Evaluation

MICHIGAN:
❑ Annual evaluation of the implementation and impact of the School Improvement Plan
❑ Modification of the plan based on evaluation results

FEDERAL:
❑ Annual evaluation of all federal programs—effectiveness & impact on student achievement, including subgroups
❑ Modification of the plan based on evaluation results

ISDs/RESAs are required by PA 25 to provide technical assistance to schools and districts in developing annual evaluations. ESEA requires annual evaluations of programs funded by federal programs such as Title I, Parts A, C, and D; Title II; and Title III.

Page 9: Making Program Evaluation “Visible”

Title I Program-Specific Requirements: Supplemental Services

Targeted Assistance:
 Needs assessment of eligible students
 Supplemental Services provided only to eligible students
 Research-based strategies
 Ongoing review of student progress
 Provide additional support, if needed
 Revise Title I TA program
 Provide training for educators

Schoolwide:
 Measures to ensure students’ difficulties are identified
 Timely & additional assistance to students having difficulty
 Research-based strategies
 Annual evaluation

Page 10: Making Program Evaluation “Visible”

MDE Program Evaluation Roll Out

Feb-Mar 2014: Conduct Train the Trainer workshop on Program Evaluation to include representatives from each of ISD/SIFN, OFS, OEII, AdvancED, MICSI, LEAs.

Mar-Aug 2014: ISD/MDE trainers conduct regional workshops for LEAs.

Spring 2014 (DIP/SIP 2014-15): Include program evaluation activities in the SIP/DIP to support Program Evaluation as part of the Continuous Improvement Process; implement Program Evaluation activities throughout the 2014-2015 school year.

June 30, 2015: Report on the evaluation of ONE program using the MDE Program Evaluation Diagnostic (submit in ASSIST). (Required for approval of the 2015-2016 Consolidated Application.)

Summer 2015+: Sustain professional learning by reconvening trainers to discuss successes and challenges and to develop the required follow-up training materials and support systems.

Page 11: Making Program Evaluation “Visible”


http://www.youtube.com/watch?v=IGQmdoK_ZfY

Best Evidence Is Collected…When the Right Questions Are Asked.

Page 12: Making Program Evaluation “Visible”

MDE’s Questions for Evaluation

1. Readiness?
2. Knowledge and skills?
3. Opportunity?
4. Implemented as intended?
5. Impact on students?

Page 13: Making Program Evaluation “Visible”

ASSIST: Program Evaluation Diagnostic

1. What was the READINESS for implementing the strategy/ program/initiative?

2. Did participants have the KNOWLEDGE AND SKILLS to implement the program?

3. Was there OPPORTUNITY for implementation?

4. Was the program IMPLEMENTED AS INTENDED?

IMPACT: What was the IMPACT of the STRATEGY/ PROGRAM/ INITIATIVE ON STUDENT ACHIEVEMENT?

DESCRIPTION of the Program/Strategy/Initiative

CONCLUSIONS: continue, adjust, discontinue
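The diagnostic’s flow — five questions feeding a continue/adjust/discontinue conclusion — can be sketched as a simple decision rule. The rule below is illustrative only (MDE does not prescribe this logic; the function name and arguments are assumptions): implementation gaps point to “adjust,” while a faithfully implemented program with no impact points to “discontinue.”

```python
# Hypothetical sketch: how answers to the five diagnostic questions might feed
# a continue / adjust / discontinue conclusion. The decision rule is
# illustrative, not MDE's.
def draw_conclusion(readiness, knowledge, opportunity, fidelity, impact):
    """Each argument is True if that evaluation question was answered favorably."""
    if impact and fidelity:
        return "continue"
    if not impact and fidelity and readiness and knowledge and opportunity:
        # Implemented as intended, yet no impact: the program itself is suspect.
        return "discontinue"
    # Otherwise there are implementation gaps; fix those before judging impact.
    return "adjust"
```

A team would still weigh evidence qualitatively, but making the rule explicit forces the conversation about what each conclusion actually requires.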

Page 14: Making Program Evaluation “Visible”

Linking Program Evaluation to Program Planning

Page 15: Making Program Evaluation “Visible”

Michigan Continuous School Improvement “Plan”: Get Ready, Implement, Monitor, and Evaluate — A Strategy/Initiative/Program

Gather & Study Process Data:
Q1: Readiness to implement?
Q2: Knowledge & Skills of Implementers?
Q3: Opportunity to implement?

Gather & Study Achievement & Process Data:
Q4: Implemented with fidelity?
Q5: Impact on Students?

Page 16: Making Program Evaluation “Visible”

“Evaluation” begins with Planning

Planning: What will we do to ensure…?
Getting Ready: 1. Readiness, 2. Knowledge/Skills, 3. Opportunity
Implementation/Monitoring: 4. Implementing with Fidelity, 5. Impact on Students
“GR-IM-E” Evaluation Conclusions: Program Effectiveness

Page 17: Making Program Evaluation “Visible”

Baseline → PM 1, Formative → PM 2, Formative → PM 3, Formative → PM 4 = Program Evaluation; Summative

“Body of Evidence” Program Evaluation

Draw Conclusions. Monitor & Adjust DURING implementation.
Data on Adult Implementation AND Impact on Students

Page 18: Making Program Evaluation “Visible”

Sources of Evidence

Q5: Impact on Students
 Student Work
 Assessment Results (Universal Screening, Benchmark Assessments, Diagnostic Assessments)
 Observations of Students
 Surveys
 Others

Q4: Program Implementation (What adults are doing)
 Lesson Plans
 Student work, artifacts
 Activity Logs
 Observation Data, Checklists
 Self-assessments of practice
 Surveys
 Staffing; Job Descriptions
 Agendas, Minutes
 Schedules
 Policies & Procedures
 Others

Page 19: Making Program Evaluation “Visible”

What is making sense?

What questions might you have?

Debrief: “Take 3”


Page 20: Making Program Evaluation “Visible”

Guiding Question: What did we say we would do?

Using a “Tri-Fold” to ‘See’ Program Components

Page 21: Making Program Evaluation “Visible”

Builds shared understanding of the program, strategy, or initiative that will be implemented

Articulates critical program components:
◦ Purpose
◦ Evidence of need
◦ Expected student outcomes
◦ Expected adult actions
◦ Resources needed

Helps us “see” what to monitor

“Tri-Fold”

Page 22: Making Program Evaluation “Visible”

Select a supplemental program/service from your DIP or SIP.

It should meet as many of the following criteria as possible:
o It is an instructional program for students (Tier 2 or 3 program/service).
o It has been implemented for at least a semester; a year is preferable.
o The process for selecting students to participate in the program is clear to you.
o You understand the program/service and can describe it to others.

“Tri-Fold” Case Study: Select a Program*

*Program = program, strategy, or initiative

Page 23: Making Program Evaluation “Visible”

Be able to answer two questions:

1) WHAT…is the name of the program or service?

2) WHY…does it exist?


Team Turn and Talk

Page 24: Making Program Evaluation “Visible”

Activity: Create “Tri-Fold”

 Use blank paper (landscape)
 Draw a line near the top to create a “header”
 Draw a line near the bottom to create a “footer”
 See example on next slide.

Page 25: Making Program Evaluation “Visible”


Page 26: Making Program Evaluation “Visible”

“Name of Program”


Purpose Statement

Page 27: Making Program Evaluation “Visible”

To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards.

Title I Before School Math Program

Page 28: Making Program Evaluation “Visible”


To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards.

Title I Before School Math Program

Fold into thirds.

Page 29: Making Program Evaluation “Visible”

“Program”

Eligibility Criteria Key Components Exit Criteria


Purpose Statement

Resources

Page 30: Making Program Evaluation “Visible”

3) WHAT…are the Key Components or “Critical Features” of the program or service? (Refer to or find in your DIP/SIP)

 What “defines” this program or service?
 What would observers expect to see if the program/service is implemented as intended (based on research)?
 What do students “get”?

Team Turn and Talk

Page 31: Making Program Evaluation “Visible”

“Program”

Eligibility Criteria Key Components Exit Criteria


Purpose Statement

Resources

List the key components or “critical features”. What does the program/service look like when implemented well?

What do students “get”?

Page 32: Making Program Evaluation “Visible”

“Title I Before School Math Program”
To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards.

Key Components:
• 30 min., 2-3x week, a.m.
• Breakfast snack
• 10:1 student:teacher ratio
• Students grouped by need
• Pre-teaching of skills to support access to grade-level curriculum
• Aligned to daily classroom instruction
• Use research-based strategies (e.g., Concrete-Pictorial-Abstract)
• Teacher & interventionist plan weekly

Eligibility Criteria | Exit Criteria | Resources/PD

Page 33: Making Program Evaluation “Visible”

4) WHAT…are the eligibility and exit criteria for the program?

5) WHAT…are the resources (including PD) needed to implement the program?


Team Turn & Talk

Page 34: Making Program Evaluation “Visible”

Eligibility Criteria: What criteria are used to identify eligible children? Consider the purpose of the program and the needs it was designed to address.

Exit Criteria: What criteria are used to determine when students exit the program? Consider the eligibility criteria and how you know when a student’s needs have been met. Include the data source as well as the “cut score” and timeframe.

Eligibility Criteria Key Components Exit Criteria


“Program”

Purpose Statement

Resources/PD

Page 35: Making Program Evaluation “Visible”

Eligibility Criteria:
• MEAP: Level 3 or 4 for two consecutive years
• District Benchmark Assessment: below 65% on 2 of last 3 assessments
• NWEA: 0-25th percentile on most recent assessment

Exit Criteria:
• MEAP: Level 1 or 2 on most recent assessment
• District Benchmark: at least 70% on 2 of last 3 assessments
• NWEA: 26th percentile or higher on last assessment
• C or better semester grade in math

Resources/PD

“Program”

Purpose Statement
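Criteria this concrete can be expressed as code, which is one way to check they are unambiguous. A minimal sketch, assuming a student is eligible when ANY eligibility condition holds and exits only when ALL exit conditions hold — the slide does not say how the criteria combine, so that combination rule, the function names, and the data shapes are all assumptions for illustration:

```python
# Hypothetical encoding of the slide's eligibility/exit criteria.
# Assumption: eligibility conditions are OR'd; exit conditions are AND'd.

def is_eligible(meap_levels, benchmark_scores, nwea_percentile):
    """meap_levels: recent MEAP levels in order; benchmark_scores: recent percents."""
    meap_flag = len(meap_levels) >= 2 and all(l in (3, 4) for l in meap_levels[-2:])
    low_benchmarks = sum(1 for s in benchmark_scores[-3:] if s < 65)
    return meap_flag or low_benchmarks >= 2 or nwea_percentile <= 25

def meets_exit(meap_level, benchmark_scores, nwea_percentile, semester_grade):
    """True only when every exit criterion is satisfied."""
    ok_benchmarks = sum(1 for s in benchmark_scores[-3:] if s >= 70)
    return (meap_level in (1, 2)
            and ok_benchmarks >= 2
            and nwea_percentile >= 26
            and semester_grade in ("A", "B", "C"))
```

Writing the rules out this way tends to surface the questions a team must answer anyway: which assessments count as the “last 3,” and whether one strong indicator is enough to exit.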

Page 36: Making Program Evaluation “Visible”

“Program”

Eligibility Criteria Key Components Exit Criteria


Purpose Statement

What resources are needed? (materials, PD, food….?)

Page 37: Making Program Evaluation “Visible”

“Title I Before School Math Program”
Purpose: To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards.

Eligibility Criteria:
• MEAP: Level 3 or 4 for two consecutive years
• District Benchmark Assessment: below 65% on 2 of last 3 assessments
• NWEA: 0-25th percentile on most recent assessment

Key Components:
• 30 min., 2-3x week, a.m.
• Breakfast snack
• 10:1 student:teacher ratio
• Students grouped by need
• Pre-teaching of skills to support access to grade-level curriculum
• Aligned to daily classroom instruction
• Use research-based strategies (e.g., Concrete-Pictorial-Abstract)
• Teacher & interventionist plan weekly

Exit Criteria:
• MEAP: Level 1 or 2 on most recent assessment
• District Benchmark: at least 70% on 2 of last 3 assessments
• NWEA: 26th percentile or higher on last assessment
• C or better semester grade in math

Resources/PD: Manipulatives, Snacks, Intervention Teacher (360 hrs), PD for Intervention Teacher (registration, sub costs), others

Page 38: Making Program Evaluation “Visible”

What is making sense?

What questions might you have?

Debrief: “Take 3”


Page 39: Making Program Evaluation “Visible”

“Program”

Eligibility Criteria Key Components Exit Criteria

Purpose Statement

Resources/PD

1. Readiness?
2. Knowledge and skills?
3. Opportunity?
4. Implemented as intended?
5. Impact on students?

Getting Ready

Page 40: Making Program Evaluation “Visible”

Randel Josserand, Chicago Public Schools

Strategic Management:Introduction


Page 41: Making Program Evaluation “Visible”

Strategic Management: Ensuring School & Student Success

Selected Slides from webinar, Feb. 20, 2013

Randel Josserand, Chief of Schools

Page 42: Making Program Evaluation “Visible”

Typical School Improvement


Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8 percentage points)

Lunch Bunch Program (GR 3-8)

Reading Recovery Program (GR K-2)

After School Tutoring (3-8)

Balanced Literacy Program ( GR 3-5)

Extended Small Group (K-2)

SS & Reading / Lit Circles ( GR 6-8)

Guided Reading Program ( GR K-2)

Lunch Bunch Program (GR 3-8)

Writer’s Workshop Program ( GR 6-8)

Page 43: Making Program Evaluation “Visible”

Typical School Improvement


Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8 percentage points)

Lunch Bunch Program ( GR 3-8)

Reading Recovery Program ( GR K-2)

After School Tutoring (3-8)

Balanced Literacy Program ( GR 3-5)

Extended Small Group( GR K-2)

SS & Reading / Lit Circles ( GR 6-8)

Guided Reading Program ( GR K-2)

Lunch Bunch Program ( GR 3-8)

Writer’s Workshop Program ( GR 6-8)

Computer Program ( GR K-2)

Math Literacy Program ( GR 6-8)

Silent Reading ( GR 3-5)

SSR( GR K-2)

Reading Workshop ( GR K-2)

After School Tutoring (3-8)

Breakfast Club (3-8)

Before School Tutoring (3-8)

Small Group Reading (K-2)

New Grade Card( GR 6-8)

Page 44: Making Program Evaluation “Visible”

Typical School Improvement


Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8 percentage points)

Running Record(K-2)

Running Record(3-5)

Interim Assessment(3-5)

Interim Assessment(6-8)

DIBELS Assessment(K-1)

Reading Recovery Program ( GR K-2)

After School Tutoring (3-8)

Balanced Literacy Program ( GR 3-5)

Extended Small Group(GR K-2)

SS & Reading / Lit Circles ( GR 6-8)

Guided Reading Program ( GR K-2)

Lunch Bunch Program ( GR 3-8)

Writer’s Workshop Program ( GR 6-8)

Lunch Bunch Program (GR 3-8)

Lunch Bunch Program ( GR 3-8)

Reading Recovery Program ( GR K-2)

After School Tutoring (3-8)

Balanced Literacy Program ( GR 3-5)

Extended Small Group( GR K-2)

SS & Reading / Lit Circles ( GR 6-8)

Guided Reading Program ( GR K-2)

Lunch Bunch Program ( GR 3-8)

Writer’s Workshop Program ( GR 6-8)

Computer Program ( GR K-2)

Math Literacy Program ( GR 6-8)

Silent Reading ( GR 3-5)

SSR( GR K-2)

Reading Workshop ( GR K-2)

After School Tutoring (3-8)

Breakfast Club (3-8)

Before School Tutoring (3-8)

Small Group Reading (K-2)

New Grade Card( GR 6-8)

Page 45: Making Program Evaluation “Visible”

Strategically Managing School Improvement


Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8 percentage points)

Reading Recovery Program ( GR K-2)

After School Tutoring (3-8)

Balanced Literacy Program ( GR 3-5)

Extended Small Group(GR K-2)

SS & Reading / Lit Circles ( GR 6-8)

Guided Reading Program ( GR K-2)

Lunch Bunch Program (GR 3-8)

Writer’s Workshop Program ( GR 6-8)

Running Record(K-2)

Running Record(3-5)

Interim Assessment(3-5)

Interim Assessment(6-8)

DIBELS Assessment(K-1)

Develop Project Plans: Each New Intervention MUST Have a Fully Developed Project Plan.

GR-IM-E

Page 46: Making Program Evaluation “Visible”

Strategically Managing School Improvement


Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8 percentage points)

Reading Recovery Program ( GR K-2)

After School Tutoring (3-8)

Balanced Literacy Program ( GR 3-5)

Extended Small Group(GR K-2)

SS & Reading / Lit Circles ( GR 6-8)

Guided Reading Program ( GR K-2)

Lunch Bunch Program (GR 3-8)

Writer’s Workshop Program ( GR 6-8)

Running Record(K-2)

Running Record(3-5)

Interim Assessment(3-5)

Interim Assessment(6-8)

DIBELS Assessment(K-1)

Identify Fidelity Metrics for Each Intervention: Each New Intervention MUST Have One or More Fidelity Metrics.

Page 47: Making Program Evaluation “Visible”

Are the RIGHT PEOPLE…

Doing the RIGHT THINGS…

In the RIGHT WAY…

At the RIGHT TIME…

…for the benefit of Students?

Fidelity METRICS TELL YOU…

Page 48: Making Program Evaluation “Visible”

Strategically Managing School Improvement


Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8 percentage points)

Running Record(K-2)

Running Record(3-5)

Interim Assessment(3-5)

Interim Assessment(6-8)

DIBELS Assessment(K-1)

Reading Recovery Program ( GR K-2)

After School Tutoring (3-8)

Balanced Literacy Program ( GR 3-5)

Extended Small Group(GR K-2)

SS & Reading / Lit Circles ( GR 6-8)

Guided Reading Program ( GR K-2)

Lunch Bunch Program (GR 3-8)

Writer’s Workshop Program ( GR 6-8)

Fidelity Metrics (one or more attached to each intervention)

Page 49: Making Program Evaluation “Visible”

Progress Monitoring 1 & 2


Quarter 1

Month   “CPA” Level of Use (Rating 0-4)   Percent Proficient (Benchmark)
        [Fidelity Metric — Process Data]  [Outcome Metric — Impact Data]
Sept.   3.5                               70%
Oct.    3.0                               53%
Nov.    3.0                               51%
Dec.    2.5                               48%
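One way a team might quantify the fidelity/achievement relationship these monthly pairs suggest is a simple correlation. A minimal sketch using the Quarter 1 values above (the function and variable names are illustrative, not part of any MDE tool, and four data points only hint at a trend):

```python
# Correlate the "Level of Use" fidelity ratings with benchmark proficiency,
# using the Quarter 1 values from the monitoring table.
use = [3.5, 3.0, 3.0, 2.5]   # CPA "Level of Use" ratings, Sept.-Dec.
prof = [70, 53, 51, 48]      # Percent proficient on benchmark, same months

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(use, prof)  # strongly positive for this data
```

A strong positive r is consistent with the slide’s point: when implementation fidelity slips, the outcome metric slips with it.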

Page 50: Making Program Evaluation “Visible”

Leadership and Learning Center 2010

CPA Strategy Implementation Data: “Level of Use”, Rating Scale 0-4

Fidelity = 3.0

Page 51: Making Program Evaluation “Visible”

Leadership and Learning Center 2010

CPA Strategy Implementation Data: “Level of Use”, Rating Scale 0-4

Fidelity = 3.0
Outcome Metric = 75%

What is the relationship between Fidelity & Achievement?

Page 52: Making Program Evaluation “Visible”


Same Data….Different View

[Scatter plot: “Relationship Between CPA ‘Level of Use’ and Percent Proficient” — Strategy “Level of Use” Rating (x-axis, 2.4-3.6) vs. Percent Proficient (y-axis, 0-80%), with data points ranging from (2.5, 48%) to (3.5, 70%)]

Page 53: Making Program Evaluation “Visible”

SIP — MATH Obj.: 60% of students proficient by June 2016

TIER I (CORE): Math Workshop; Concrete-Pictorial-Abstract (Fidelity Indicator: “Level of Use”)
TIER II (SUPPLEMENTAL): Title I Before School; Lunch Bunch; Title I Math Lab (Fidelity Indicator: # Minutes Direct Instruction)
TIER III (SUPPLEMENTAL): Online Courses
ASSESSMENT: Curriculum-Based Assessments; NWEA; District Benchmark; DIBELS-Math

(Each program/strategy carries one or more Fidelity Indicators.)

Page 54: Making Program Evaluation “Visible”

DIP — MATH Obj.: 70% of students proficient by June 2016

Programs/strategies across TIER I (CORE), TIER II (SUPPLEMENTAL), and TIER III (SUPPLEMENTAL): Everyday Math; Math Workshop; Research-Based Math Strategies (CPA, …); Title I Extended Day Programs (K-6); Title I Tech-Based; Pull-Out Programs; Title I Summer School; Reading Recovery

ASSESSMENT: District Benchmark; NWEA; Curriculum-Based Assessments; DIBELS-Math

(Each program/strategy carries one or more Fidelity Indicators.)

Page 55: Making Program Evaluation “Visible”

What would your map look like?
◦ Core Instruction
◦ Supplemental instruction
◦ Assessments

What is making sense? What questions might you have?

Check In: Turn and Talk

Page 56: Making Program Evaluation “Visible”

Creating a “Strategic Map”

Step-by-Step Process (SIP/DIP)

Page 57: Making Program Evaluation “Visible”


Step 1: Identify the SIP/DIP Goal Area and its Measurable Objective.

Step 2: List all local assessments used to measure and/or monitor student achievement in this content area. Link to the Measurable Objective.

Page 58: Making Program Evaluation “Visible”


Step 3: Identify core and supplemental programs, strategies, or initiatives that support student learning in the content area (DIP/SIP): TIER I (CORE), TIER II (SUPPLEMENTAL), TIER III (SUPPLEMENTAL).

Page 59: Making Program Evaluation “Visible”

Step 4: Sort and label programs/initiatives.

Mathematics — Measurable Goal/Objective
TIER I (CORE): “CPA” Strategy; Math Workshop
TIER II (SUPPLEMENTAL): Title I Before School; Lunch Bunch; Title I Math Lab
TIER III (SUPPLEMENTAL): Online Courses
ASSESSMENTS

Page 60: Making Program Evaluation “Visible”

Step 5: Link assessments to programs.

Mathematics — Measurable Goal/Objective
TIER I (CORE): “CPA” Strategy; Math Workshop
TIER II (SUPPLEMENTAL): Title I Before School; Lunch Bunch; Title I Math Lab
TIER III (SUPPLEMENTAL): Online Courses
ASSESSMENTS (linked to the programs they measure)

Page 61: Making Program Evaluation “Visible”

Step 6: Identify Fidelity Indicators for each program/strategy.

Math Obj.: 60% of students proficient by June 2016
TIER I (CORE): “CPA” (Fidelity Indicator: “Level of Use”); Math Workshop
TIER II (SUPPLEMENTAL): Title I Before School; Title I Math Lab (Fidelity Indicator: # Minutes Direct Instruction)
TIER III (SUPPLEMENTAL): Online Courses
ASSESSMENT: District Benchmark; NWEA; Curriculum-Based Assessment; DIBELS-Math
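The six steps produce a map that is essentially structured data. A hypothetical sketch of the example map as a plain Python structure, with a Step 6-style check that flags programs still missing a fidelity indicator (program names come from the example slides; the representation itself is an assumption for illustration, not an MDE format):

```python
# Hypothetical data model for the finished Strategic Map (Steps 1-6).
strategic_map = {
    "goal_area": "Math",
    "objective": "60% of students proficient by June 2016",
    "tiers": {
        "Tier I (Core)": ["CPA Strategy", "Math Workshop"],
        "Tier II (Supplemental)": ["Title I Before School", "Lunch Bunch",
                                   "Title I Math Lab"],
        "Tier III (Supplemental)": ["Online Courses"],
    },
    "assessments": ["Curriculum-Based Assessments", "NWEA",
                    "District Benchmark", "DIBELS-Math"],
    "fidelity_indicators": {
        "CPA Strategy": ["Level of Use (0-4)"],
        "Title I Math Lab": ["# Minutes Direct Instruction"],
    },
}

# Step 6 check: every program should carry at least one fidelity indicator.
missing = [program
           for programs in strategic_map["tiers"].values()
           for program in programs
           if program not in strategic_map["fidelity_indicators"]]
```

Running the check against this partial map would list the programs still needing indicators, which mirrors how a team would work through Step 6 on paper.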

Page 62: Making Program Evaluation “Visible”

Selecting “Fidelity Indicators”: Monitoring Adult Practice

Page 63: Making Program Evaluation “Visible”

“Title I Before School Math Program”
Purpose: To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards.

(Same Tri-Fold as above: Eligibility Criteria, Key Components, Exit Criteria, Resources/PD.)

MONITOR Fidelity Indicator: # Minutes Instruction (“Getting Ready” → “Implementing w/fidelity”)

Page 64: Making Program Evaluation “Visible”

“Title I Before School Math Program”
Purpose: To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards.

(Same Tri-Fold as above, with the Key Components highlighted as the focus of “Implementing w/fidelity.”)

Page 65: Making Program Evaluation “Visible”

“Concrete - Pictorial - Abstract (CPA)”
http://www.loganschools.org/mathframework/CPA.pdf
To improve students’ skills in the area of numeration and operations in order to achieve grade-level proficiency.

“Critical Features” (Research Base & Local Expectations for Implementers):
• Organize content into concepts
• Use 3-step approach:
  1. Introduce concepts using concrete materials (e.g., manipulatives, measurement tools)
  2. Use pictures to show visual representations of concrete materials; explain the relationship of the pictures to the concrete model
  3. Formal work with symbols (abstract) to represent numerical operations
• Common grade-level planning 2x/month
• Use Lesson Plan/Data templates

(Also on the Tri-Fold: Baseline Data, Learning Goals, Resources/PD)

Page 66: Making Program Evaluation “Visible”

“Concrete - Pictorial - Abstract (CPA)”
http://www.loganschools.org/mathframework/CPA.pdf

(Same CPA Tri-Fold as above, with its fidelity indicator identified.)

Fidelity Indicator: Level of Use, 0-4

Page 67: Making Program Evaluation “Visible”

Possible Data Sources for Monitoring Fidelity of Implementation (Assessing the Fidelity)

Protocols

Classroom Observations

Staff Self-Assessments/Surveys

Walk Through Data

Focus Group Interviews

Page 68: Making Program Evaluation “Visible”

“Program”

Eligibility Criteria Exit Criteria


Purpose Statement

Resources/PD

1. Readiness?
2. Knowledge and skills?
3. Opportunity?
4. Implemented as intended?
5. Impact on students?

Key Components

Page 69: Making Program Evaluation “Visible”

MDE Program Planning Tool
http://www.michigan.gov/documents/mde/Planning_Tool.Sep_14_401912_7.doc

DESCRIPTION of the Program/Strategy/Initiative

Q1. How will we ensure readiness for implementing the strategy/program/initiative?

Q2. How will we ensure that staff and administrators have the knowledge and skills to implement the strategy/program/initiative?

Q3. How will we ensure that there is opportunity to implement the strategy/program/initiative with fidelity?

Q4. How will we ensure that the strategy/program/ initiative will be implemented as intended?

Q5. How will we ensure a positive impact on student achievement?

Getting Ready

Page 70: Making Program Evaluation “Visible”

MDE Program Planning Tool
http://www.michigan.gov/documents/mde/Planning_Tool.Sep_14_401912_7.doc

(The same five planning questions, Q1-Q5, now viewed as Implementing & Monitoring, to inform Evaluation.)

Page 71: Making Program Evaluation “Visible”

Created by Beth Brophy, State School Support Team Coordinator

Page 72: Making Program Evaluation “Visible”

Michigan Continuous School Improvement District & School: Plan, Monitor, and Evaluate

Page 73: Making Program Evaluation “Visible”

Closing: “Take 5”
 Big Ideas
 Next Steps

Feedback: +/∆/?

Page 74: Making Program Evaluation “Visible”

Oakland Schools:◦ Jan Callis, Title I: [email protected]

WAYNE RESA:◦ Jolia Hill, Title I: [email protected]


Contact Info

Page 75: Making Program Evaluation “Visible”

Other Tri-Fold Examples


Page 76: Making Program Evaluation “Visible”

Generic Model of Improvement Planning

(“Tri-Fold”)

Current State → Preferred Future
Baseline Data → Implement “Program” → Summative Data

Page 77: Making Program Evaluation “Visible”

“Program/Strategy/Initiative”
Purpose Statement

Baseline Data (Evidence of Need) | “Critical Features” (“Gold Standard”) | Learning Goals (Evidence of Success)

Supports, Resources, PD

Page 78: Making Program Evaluation “Visible”

“Program/Strategy/Initiative”
Purpose Statement

Baseline Data (Evidence of Need) = Current State | “Critical Features” (“Gold Standard”) | Learning Goals (Evidence of Success) = Preferred Future

Supports, Resources, PD

Page 79: Making Program Evaluation “Visible”

“Program/Strategy/Initiative”
Purpose Statement

Baseline Data (Evidence of Need) | “Critical Features” (“Gold Standard”) | Learning Goals (Evidence of Success)

Supports, Resources, PD

Are we READY to implement?

Page 80: Making Program Evaluation “Visible”

“Program/Strategy/Initiative”
Purpose Statement

Baseline Data (Evidence of Need) | “Critical Features” (“Gold Standard”) | Learning Goals (Evidence of Success)

Supports, Resources, PD

Are we implementing with fidelity?

Page 81: Making Program Evaluation “Visible”

“Concrete - Pictorial - Abstract (CPA)”
To improve students’ skills in the area of numeration and operations in order to achieve grade-level proficiency.

Baseline Data:
• MEAP: 40% of students successful (80% correct) on numeration/operation items
• Benchmark Assessments: Error analysis suggests misconceptions with numeration concepts impacting performance on operations
• Classroom Assessments: 60% of students performing at grade level on operations

“Critical Features”:
• Organize content into concepts
• Use 3-step approach:
  1. Introduce concepts using concrete materials (e.g., manipulatives, measurement tools)
  2. Use pictures to show visual representations of concrete materials; explain the relationship of the pictures to the concrete model
  3. Formal work with symbols (abstract) to represent numerical operations
• Common grade-level planning 2x/month
• Use Lesson Plan/Data templates

Learning Goals:
• 55% of students proficient on numeration/operation items (80% correct) on state assessment, May 2015
• Benchmark Assessments: Increase proficiency on numeration items by 10% or more for each grade level
• Classroom Assessments: 80% of students at grade level on operations by June 2014

Resources/PD: Manipulatives, PD for All Math Teachers (registration, sub costs), others

Page 82: Making Program Evaluation “Visible”

“School-Parent Compact”
Purpose Statement

Parents will: | Students will: | Teachers will:

Resources, Workshops, PD

Page 83: Making Program Evaluation “Visible”


“Tri-Fold” Brochure: Outside Cover

School Improvement Goals

Achievements

Important Dates

Who to Call:

School Name

Logo

Address

PrincipalContact Information

Page 84: Making Program Evaluation “Visible”

Oakland Schools:
◦ Jan Callis, Title I: [email protected]

WAYNE RESA:
◦ Jolia Hill, Title I: [email protected]
◦ Ann LaPointe, School Improvement: [email protected]

Contact Info