Howell Public Schools Continuous Improvement Model

Howell Public Schools utilizes the following Continuous Improvement Model,

which incorporates AdvancEd’s 5 Components of Continuous Improvement.

HPS Continuous Improvement Model: Documentation of the AdvancEd 5 Components of Continuous Improvement

Component 1: Analyzing Data (Gather-Report-Study)

1. District Level: Annual Report
2. School Level: Student Data Profile
3. District and School Levels: Documented data using the District data warehouse tool (Data Director) and analysis meetings, such as District PD days, staff meetings and PLCs

Component 2: Set Goals (Measurable-Attainable)

1. District Level: District Strategic Plan
2. District Level: District School Improvement Plan
3. School Level: School Improvement Plans; some include a Title 1 Plan

Component 3: Plan (Strategies-Resources-Actions)

1. District Level: District School Improvement Plan that includes a PD Plan
2. School Level: School Improvement Plans; some include a Title 1 Plan
3. School Level: Administrator Leadership Logs, PLC Meeting Notes

Component 4: Implement (Benchmarks-Deliverables)

1. District and School Levels: State Assessments
2. District and School Levels: District Common Assessments
3. Classroom and Individual Student Levels: Progress Monitoring Assessments (DRA, SRI, SMI)

Component 5: Evaluate (Monitor Success-Adjust)

1. District Level: HPS Evaluation of the District Improvement Plan (D.I.P.)
2. School Level: HPS Evaluation of each building’s School Improvement Plan (S.I.P.)

The Continuous Improvement Process

The HPS Continuous Improvement Process incorporates the 5 components of the AdvancEd Continuous Improvement Model. Action steps are taken at both the district and school levels for each of the components.

Component 1: Analyzing Data (Gather-Report-Study)

Gather

The district collects student achievement data and stores it in a program called Data Director. The program is accessible to all teachers and administrators and encompasses state and district assessment results. The district also collects demographic data, including attendance and discipline, in our PowerSchool program. This program also identifies students who receive services through programming such as English Language Learner (ELL), Title 1, At Risk, special education, and homeless services. Again, all teachers and administrators have access to this information for the students they are responsible for serving. Finally, our school staff utilizes state websites, such as the OEAA secure site, the MI School Data site (www.mischooldata.com) and others, to gather information for individual or group analysis.

Report

Student achievement and other pertinent information, as designated by the Michigan Department of Education (MDE), are published annually. A District Annual Education Report (AER) and individual school AERs are published by September 1st of each year.

In addition, the district and schools make periodic reports to parents via their monthly newsletters, reporting MEAP and MME results as they are released to the public. Schools also give parents continuous feedback on their child’s achievement through 24/7 access to PowerSchool, our online grading program, as well as through quarterly or semester report cards.

Study

At the district level, data is analyzed by the District Improvement Team. This team is composed of school representatives (the building School Improvement Chairs and Principals), parents, a School Board member and district-level administrators. At the building level, data is studied by

the staff as a whole, as well as by their School Improvement Steering Committee, which has

parent and support staff representation.

At the school level, each of our ten schools has a Data Team Leader, a person assigned to help

with the technical aspects of Data Director and to assist with data collection for the School

Improvement Team, the administration and the staff. The Data Team Leader also assists the

building School Improvement Chair in collecting data for their annual Comprehensive Needs

Assessment, more recently called the Student Data Profile. The School Improvement Chair and

Data Team Leader then lead the staff in data analysis to complete the reflective questions within

that document. In addition, various groups, as well as individual staff, look at data throughout

the year. Specific data reports are generated for Professional Learning Communities (PLCs), grade-level teams, study groups, individual staff, departments and whole-staff purposes.

Component 2: Set Goals (Measurable-Attainable)

The district has created a Strategic Plan with objectives that pertain to both academic and non-academic goals. Each of these objectives has identified measurable benchmark data that is collected quarterly and projected on a web-based dashboard display. The Strategic Plan is reviewed annually.

The district also sets specific student academic goals within its District Improvement Plan.

Howell Public Schools is AdvancEd District Accredited and works within that organization’s

continuous improvement framework. The district improvement team is currently using the AYP Target Goal percentages in math and ELA, by grade level, as its measurable goals.

At the school level, each building sets academic goals in its School Improvement Plan. Again, the schools work within the AdvancEd District Accreditation framework. The school improvement teams are also currently using the AYP Target Goal percentages in math and ELA, by grade level, as their measurable goals. Five of the HPS elementary schools are Title 1 schools and include a Title 1 plan within their School Improvement Plans.

Component 3: Plan (Strategies-Resources-Action)

Documented plans at the district level include the HPS Strategic Plan and District Improvement

Plan. The documented plan for the individual schools is the School Improvement Plan, some of

which contain a Title 1 School Improvement Plan. Each of these is evaluated and revised annually. Each contains measurable and attainable goals, strategies, assigned resources (in dollar amounts) and action steps.

Currently the district is focusing on three strategies:

Improve Teaching and Learning

Improve Documenting and Using Results

Improve our Model of Continuous Improvement

These strategies come from the AdvancEd 5 Quality Standards. A number of Action Steps are

then created under each Strategy, and resources are determined for each Action Step.

The Actions or Activity Steps in our District and School Improvement Plans are based on

research, as well as on the needs shown to us through our data studies. The School Improvement

Teams are diligent in providing professional development for Activity Steps that require it, per our PD Needs Assessment.

The Strategies, Action Steps and Resources are determined at the district level by the district

improvement team and at the school level by the school improvement team. School

improvement chairs and principals serve on the district improvement team, allowing communication to flow up and down between the levels.

Component 4: Implement (Benchmarks-Deliverables)

The implementation of the School/District Improvement Plans is done through the building

school improvement teams and overseen by the district improvement team. A template school

improvement meeting agenda was created to ensure monthly school improvement meetings are

focused on this task. That template includes a statement of purpose at the top, followed by the agenda items: 1) Report from the building’s Title 1 and/or 31a representative, 2) Implementing the Plan, 3) Review of data, and 4) Research.

Agenda / Minutes

Meeting: School Improvement
School:
Date:
Facilitator:
Time:

The charge of the School Improvement Team (SIT) is to meet monthly to implement their School

Improvement Plan (SIP), including their 31a (At Risk)/Title 1 Plan and the professional

development contained within their SIP. The SIT should include the Principal, School

Improvement Chair, Data Team Leader, Teacher Consultant and At Risk/Title 1 teacher, as well

as representatives from the stakeholder groups of classroom teacher(s), support staff, parents

(Title 1 parents if a Title 1 school), community members (optional) and students (if applicable).

Minutes should include a list of attendees and their roles in this space.

I. Welcome, Introductions, Review of Agenda

II. Report from 31a (At Risk) or Title 1 Representative

III. Implementing our School Improvement Plan, including professional development.

a. Issue, Activity Step or PD #1:

b. Issue, Activity Step or PD #2:

c. (Add as needed)

IV. Review of Data (optional) (Be sure to include this data dialogue in your Student Data Profile)

V. Discussion of ideas/research for next year’s SIP (optional)

VI. Next Steps and Adjourn

HPS has invested time and resources in providing strong foundational curriculum documents to guide our teaching and learning and to achieve the desired student proficiency, as measured through designated benchmark assessments. That curriculum includes:

HPS Scope & Sequence of Essential Skills, with Power-Standards

Designated Marzano Instructional Strategies

Resources provided to teach the Standards in the HPS Scope & Sequence

District designated benchmark assessments, as well as utilization of the state assessments

Benchmark assessments and data are utilized at every level to better inform our stakeholders regarding district, school and individual student academic growth.

At the district level, benchmarks are identified as the state assessments (MEAP, MEAP ACCESS, MI-ACCESS, ACT and MME), as well as our district-created common assessments.

At the school level, benchmarks are identified as the EXPLORE (grades 8 and 9), the PLAN (grade 10), district-created common assessments, MLPP, DRA, DIBELS Next, and the Scholastic Reading Inventory (SRI).

Progress monitoring refers to benchmark assessments used for individual students or sub-groups of students. These benchmark assessments include the DRA, DIBELS Next, SRI and the Scholastic Math Inventory (SMI), as well as other teacher-created formative assessments.

Component 5: Evaluate (Monitor Success-Adjust)

Evaluate

Each spring the district improvement team sets aside two release days to evaluate the current

District Improvement Plan, make adjustments for the upcoming year, and outline a new plan.

Each Action Step is reviewed carefully by looking at available data and hearing reports from

leaders and implementers of that Step.

Using the MDE Sample Evaluation Plan as an inspiration, the district improvement team created

two evaluation templates. One is to be utilized to evaluate the District Improvement Plan (D.I.P.) and the other, the individual School Improvement Plans (S.I.P.s).

The D.I.P. evaluation template begins by comparing the target measurable data objective to the actual measurable data objective for each of the Goals. This is documented by grade level. The district improvement team will review their state achievement data, document their gains and record their reflections in the comments column. The second part of the template requires the district improvement team to evaluate each of the Action Steps within the plan. Components of that evaluation include ranking the degree of implementation, documenting the method of monitoring and evaluation, documenting the people responsible for the Action Step, deciding whether or not the Action Step should be continued in the upcoming year’s D.I.P., recording any comments regarding the Action Step and, finally, assigning resources to the Action Step for the upcoming year. The team will determine if data can be assigned to the Action Step and, if so, will carefully review that data and utilize it in rendering their evaluation. Some Action Steps are more easily tied to data than others.

The S.I.P. evaluation template mirrors the D.I.P. evaluation template. Because the S.I.P.s

contain the foundational Action Steps of the D.I.P., schools are asked to only evaluate the Action

Steps of the S.I.P. that differ from the D.I.P. They must also indicate if the Action Step pertains

to Title 1, as part of their inclusive Title 1 Improvement Plan. In this way, the School

Improvement Teams will be evaluating both their overall S.I.P. and their Title 1 Improvement

Plan.

HPS Continuous Improvement Model - Evaluation of District Improvement Plan (D.I.P.) TEMPLATE

Evaluation of District Improvement Plan for School Year xxxx-xxxx
Date Evaluation Completed by District Improvement Team: __________

Evaluation of Goals with Data-Based Objectives
Chosen Objective: MDE AYP Targets

For each SUBJECT GOAL, the template records, by grade level: the grade levels where the data objective was met (showing % proficient and the target %), the grade levels where the data objective was not met (showing % proficient and the target %), the grade levels with student growth (the growth measurement), and comments.

SUBJECT GOAL: ELA Reading (EXAMPLE entries)

Grade levels where the data objective was met, % proficient (target %):
3 MEAP 87% (86%); 4 MEAP 85% (85%); 5 MEAP 85% (84%); 6 MEAP 91% (83%); 7 MEAP 84% (82%); 8 MEAP 83% (82%); 11 MME 2011 74% (T=71%)

Grade levels where the data objective was not met, % proficient (target %):
3 MEAP 72% (86%); 4 MEAP 83% (85%); 5 MEAP 82% (84%); 6 MEAP 81% (83%); 7 MEAP 74% (82%); 8 MEAP 75% (82%); 11 MME 2011 70% (T=71%)

Grade levels with student growth (prior-year % to current-year %, with the change):
3: 75% to 72%, -3%; 4: 76% to 83%, +7%; 5: 77% to 77%, maintained; 6: 76% to 81%, +5%; 7: 68% to 74%, +6%; 8: 60% to 75%, +15%; 11 MME 2010 to 2011: maintained, 0% growth

SUBJECT GOAL: ELA Writing (to be completed for 4 MEAP, 7 MEAP and 11 MME)

Evaluation of Goals with Data-Based Objectives (continued)
Chosen Objective: MDE AYP Targets

The same columns apply: grade levels where the objective was met (target %), grade levels where the objective was not met (target %), grade levels with student growth, and comments.

SUBJECT GOAL: Math (to be completed for 3, 4, 5, 6, 7 and 8 MEAP and 11 MME)

SUBJECT GOAL: Science (to be completed for 5 and 8 MEAP and 11 MME)

SUBJECT GOAL: Social Studies (to be completed for 6 and 9 MEAP and 11 MME)

SUBJECT GOAL: Non-Core Subjects

Evaluation of Strategies & Activity Steps

For each STRATEGY and its Activity Steps / Interventions, the template records:
Implementation (Not Evident, Emerging, Operational or Highly Functional)
Monitoring and Evaluation (MEAP, EXPLORE, PLAN, MME/ACT, Common Assessments, DRA, SRI, Annual D.I.T. Evaluation)
People Responsible
Continue or Discontinue
Comments on Implementation & Suggestions for the Upcoming School Year
Resources Necessary for the Upcoming School Year ($ amount)

STRATEGY: Improve Teaching & Learning
Activity Steps / Interventions (examples below are from the 2011-12 DIP):
ALL: Learn and Implement Reading Apprenticeship
ELEM: Learn Instructional Consultation Strategies (IC Teams)
HS: Provide Credit Recovery Opportunities
MS & HS: Provide extended opportunities for learning, such as summer school

STRATEGY: Improve Documenting & Using Results
Activity Steps / Interventions:
ALL: Develop an assessment process to systematically collect, analyze, and communicate multiple measures of data
ALL: Teach students to know, understand and be able to share their ELA data
ELEM: Engage Staff in Data Analysis

STRATEGY: Improve our Model of Continuous Improvement
Activity Steps / Interventions:
ALL: Create a Continuous Improvement Process
ALL: Provide PD in the AdvancEd School Improvement (S.I.) Process, as well as all state and federal mandates


Monitor Success

Monitoring of the school and district improvement plans is done at the monthly school and district meetings. At those times, the teams review the data in accordance with the HPS Data Analysis Calendar. Teachers and staff also sit on these teams and are involved in the monitoring of data and programming through their professional learning communities. The data for this work comes from local and state assessments. Required data review from the HPS Data Analysis Calendar includes the DRA 2, DIBELS Next, SRI, Writing Prompts, District Common Assessments, EXPLORE, PLAN and state assessments.

Adjust

The school improvement chairs and principals represent and are the voice of their buildings as they serve on the district improvement team. This facilitates a constant flow of communication between the school and district levels. Adjustments may therefore be made during the school year, but are made more formally in the spring.

During the spring district improvement release days, time is dedicated to evaluating the current District Improvement Plan, making adjustments for the upcoming year, and outlining a new plan. Each Action Step is reviewed carefully by looking at available data and hearing reports from the leaders and implementers of that Step. Adjustments are then made, based on this evaluation.

SUMMARY

In summary, the HPS Continuous Improvement Model, created in the 2011-12 school year, will be reviewed annually by the district improvement team and will be shared with an AdvancEd consultant for additional feedback. This is intended to be a living document, and modifications will be made as recommended.

The following pages contain an optional tool from the MDE. This MDE EVALUATION TOOL may be utilized by District Improvement Team presenters reporting on their programs, such as Reading Apprenticeship, IC, Summer School, etc.


Michigan Department of Education EVALUATION TOOL

Prepared by [Insert team members]

Description
Title:
Brief description:
Need being addressed:
Reason for selection, including intended results:
Research citation and brief summary:

Impact: What was the program/strategy/initiative’s impact on students?

IN AN IDEAL PROGRAM/STRATEGY/INITIATIVE, the school’s achievement results on state or district wide assessments meet proficiency standards. Achievement gaps between each of the relevant subgroups and their counterparts have been narrowed as proposed in the School Improvement Plan’s measurable objectives. Interim assessment results indicate progress toward proficiency for all students to the satisfaction of all stakeholders.

a) What is the evidence and what does it show regarding achievement of the measurable objective for all students when compared to baseline state and local data?

b) What is the evidence and what does it show regarding achievement of the measurable objective for subgroups and their counterparts when compared to baseline state and local data?

c) What is the evidence and what does it show regarding stakeholder (staff, parents, students)

satisfaction with the results?

Conclusion: If objectives were met, should the strategy/program/initiative be continued or institutionalized?

a) What is the evidence and what does it say regarding whether this was the right

program/strategy/initiative to meet your needs?

b) What is the evidence and what does it say regarding whether the benefits of the

program/strategy/initiative are sufficient to justify the resources it requires?

c) What adjustments if any might increase its impact while maintaining its integrity?

d) What is needed to maintain momentum and sustain achievement gains?

e) How might these results inform the School Improvement Plan?

If objectives were not met, consider the following analysis:


1. Readiness: What was the readiness for implementing the program/strategy/initiative?

IN AN IDEAL PROGRAM/STRATEGY/INITIATIVE, stakeholders are well-prepared to implement the program. They have read and can articulate the research foundation, and regularly use the terms in conversation with each other, students, and with parents. Staff, students and parents express a high level of interest in, support for and commitment to the program. Specific concerns have been identified and solutions have been planned/implemented. Staff is able to seamlessly integrate the program within the context of other building/district initiatives.

a) What is the evidence and what does it show regarding stakeholder understanding of

the need as well as stakeholder ability to articulate the research regarding the choice of

the program/strategy/initiative?

b) What is the evidence and what does it show regarding stakeholders having a shared

vision and purpose for the work and a strong commitment to the

program/strategy/initiative?

c) What is the evidence and what does it show regarding how stakeholder concerns were

identified and addressed?

d) What is the evidence and what does it show regarding the ability of staff and

administrators to integrate the program/strategy/initiative with existing work?

Suggested Evidence for Question 1:

Meeting agendas/minutes

Books/papers about the program

Staff surveys

SI Plan elements

Professional development materials

Conference/workshop attendance

Data collection plan; data analysis work

Stakeholder survey results

Suggestion box ideas collected

SI team agendas

Focus group interviews

Given the evidence you’ve assembled, choose one overall self-assessment for Question 1:

What was the readiness for implementing the program/strategy/initiative?

Interest and/or commitment were low.

Some promising elements exist, but were mixed with major gaps in knowledge or confidence.

Support and commitment were generally high, but some concern or work remains.

Stakeholders were fully prepared to implement.

NEXT STEPS: What action steps are needed to increase readiness?


2. Knowledge and Skills: Did staff and administrators have the knowledge and skills to implement the

program/strategy/initiative?

IN AN IDEAL PROGRAM/STRATEGY/INITIATIVE, personnel are able to clearly articulate what successful implementation looks and sounds like and how specific practices will change as a result of its implementation. Staff and administrators can articulate specific outcomes and specific criteria for evaluation. Personnel can demonstrate their ability to apply the knowledge and skills required to successfully implement with fidelity, and professional learning opportunities are provided to address gaps in knowledge and skills.

a) What is the evidence and what does it show regarding staff and administrators’ vision

for how practice would change as a result of the program/strategy/initiative?

b) What is the evidence and what does it show regarding administrator knowledge and

ability to monitor and assess the effectiveness of the program/strategy/initiative?

c) What is the evidence and what does it show regarding the sufficiency of opportunities

for staff to learn knowledge and skills identified as essential (the non-negotiable or

acceptable variations of the elements) to the program/strategy/initiative?

d) What is the evidence and what does it show regarding staff ability to apply the

acquired knowledge and skills?

Suggested Evidence for Question 2:

Minutes of professional conversations

Self-assessment checklists

Staff surveys

Superintendent or administrator observations/walkthroughs

Professional learning agendas, sign-in sheets

Program simulations, administrator observations

Given the evidence you’ve assembled, choose one overall self-assessment for Question 2:

Did participants have the knowledge and skills to implement the program/strategy/initiative?

Participants were beginning to acquire the necessary knowledge and skills.

A solid start was documented, but many skill levels and much knowledge need to be acquired.

Much knowledge and skill were evident, but a few skills (or some knowledge bases) still need work.

Participants had sufficient knowledge and skills to succeed.

NEXT STEPS: What action steps are needed to improve participants’ knowledge and skills?


3. Opportunity: Was there opportunity for high quality implementation of the

program/strategy/initiative?

IN AN IDEAL PROGRAM/STRATEGY/INITIATIVE, building and district administrators provide significant support for project implementation. Sufficient funds have been allocated and continue to be managed by the building principal and/or program director. Adequate resources are available for full implementation, including time for staff collaboration in various forms. Clearly defined structures/protocols are in place to collect and review formative implementation data.

a) What is the evidence and what does it show regarding the sufficiency of administrative support to achieve the intended results?

b) What is the evidence and what does it show regarding the sufficiency of professional learning during implementation, e.g. modeling/coaching?

c) What is the evidence and what does it show regarding the sufficiency of resources, including finances and time, to achieve the intended results?

d) What is the evidence and what does it show regarding staff collaboration in support of the

program/strategy/initiative?

e) What is the evidence and what does it show regarding structures being in place to collect

and review implementation data?

Suggested Evidence for Question 3:

Agendas/minutes

Action plans

Email correspondence

Focus group and/or anonymous surveys

Budget sheets

Logs, school schedules

Inventories

Curriculum pacing guides

Collaboration models (such as Professional Learning Communities, Collaborative Action Research, Lesson Study Teams)

Staff meeting results

Protocols for reviewing formative assessment

Given the evidence you’ve assembled, choose one overall self-assessment for Question 3:

Was there opportunity for high quality implementation?

Opportunity and resources were just beginning to align in support of the program.

Basic resources and opportunities were available, but significant gaps need to be filled.

Many necessary resources were aligned with program goals, but more are needed.

Necessary support and resources (time, funding, and attention) were solidly in place.

NEXT STEPS: What action steps are needed to ensure opportunity for high quality implementation?


4. Implementation with Fidelity: Was the strategy/program/initiative being implemented as

intended?

IN AN IDEAL PROGRAM/STRATEGY/INITIATIVE, all personnel involved in the program implement the strategies with fidelity according to the research, carrying out responsibilities by their proposed timelines. They use clearly defined protocols to collect and review formative implementation data to identify unintended consequences. Program leaders consider adjustments guided by implementation data while maintaining the integrity of results.

a) What is the evidence and what does it show regarding the fidelity of implementation

of the non-negotiable or acceptable variations of the elements of the

program/strategy/initiative, including timelines and responsibilities?

b) What is the evidence and what does it show regarding unintended consequences that

may have occurred?

c) What do student achievement results suggest for implementing/modifying the

program/strategy/initiative? How might these affect the integrity of the results?

Suggested Evidence for Question 4:

Principal’s walkthroughs

Number of staff implementing with fidelity

Model lessons

Surveys

Coaching schedule

Agendas and minutes of common planning time/meetings

Focus group interviews

Debriefing following model lessons

Collegial observations/visits

Training agendas & material

Program Time Line

Lists of acquired resources


Given the evidence you’ve assembled, choose one overall self-assessment for Question 4:

Was the program implemented as intended?

Parts of the program were working, but others have yet to be implemented.

The overall design was in place, but variations in practice were evident and may be adversely affecting results.

Critical elements have been implemented, but work on consistency and depth remains.

All research-based elements have been implemented with fidelity following the proposed timelines.

NEXT STEPS: What action steps are needed to ensure faithful implementation of program plans?

If you have questions regarding this Tool, contact Shereen Tabrizi, Ph.D., Office of Field Services-MDE, at [email protected]