
Dazed and Confused: General Supervision

Administrators’ Management Meeting September 2009

Karen Denbroeder, Administrator, Special Programs, Information, and Evaluation

Kim C. Komisar, Ph.D., Administrator, Program Administration and Quality Assurance

Bureau of Exceptional Education and Student Services, Florida Department of Education

Topics

General Supervision System
• Overview – conceptual framework

• General supervision tools

• Changes to SPP Indicators

• Correction of noncompliance

• LEA determinations

• Leveled monitoring system

Topics

Other hot topics…
• Coordinated early intervening services (CEIS)

• Services to parentally-placed private school students

• Revocation of consent

• ???

GS – Conceptual Framework

SPP/APR guides the process of general supervision

Continuous improvement focuses on the SPP indicators

Why “Dazed and Confused”?

“Me, Myself, and Irene” was a close second…

GS – Conceptual Framework

General supervision activities and processes are:
1. Tied to school year by data requirements
2. Confounded by APR reporting cycle (e.g., reporting on 2007-08 and 2008-09 in February 2010…)
3. Despite #1 and #2 above, action planning is conducted “…from this point forward…”

GS – Conceptual Framework

Improvement planning related to performance indicators conducted by:
• SPP indicator teams

• Targeted districts

Correction of noncompliance identified by:
• Monitoring

• Dispute resolution

• Data collection

GS – Tools

• Guide to Calculations
• SPP indicator teams
• Planning calendar

GS – Tools

Self-assessment system
• Assists with data collection/tracking

• SPP 13 – Secondary transition

• SPP 15 – Timely correction of noncompliance

• Informs district’s problem-solving process by identifying or ruling out procedural issues impacting performance

GS – Tools

General supervision Web site
• Program improvement plans (PIPs) for targeted districts

• Self-assessment system for monitoring and professional development

• Corrective action plans (CAPs) for systemic noncompliance

• Student-level correction of noncompliance

• Corrective actions required through state complaint investigations (new)

GS – Tools: General Supervision Web Site

The Lion King was another option…

“It's the Circle of Life, and it moves us all, through despair and hope…”

[Annual monitoring calendar graphic: twelve-month cycle, July through June]

[Annual calendar graphic]
• LEA determinations
• CEIS determinations
• Data submission – SPP 11
• Data submission – SPP 12

Fall Cycle – Level 1
• All districts begin self-assessment: SPP 13, Matrix, DJJ
• All districts submit self-assessment results
• Fall cycle preliminary monitoring report disseminated
• Districts submit: student-level correction of noncompliance; corrective action plan (CAP), if required
• Correct student-specific noncompliance
• Fall cycle final monitoring report disseminated

Fall Cycle – Level 2
• Districts targeted on LRE/Student Performance (SPP 3, 5) and Suspension/Expulsion (SPP 4) begin Level 2 Fall self-assessment
• Districts submit: self-assessment results; program improvement plan (PIP); CAP, if required
• Included in fall cycle preliminary monitoring report
• Districts submit: student-level correction of noncompliance; address in CAP, if required
• Correct student-specific noncompliance
• Included in fall cycle final monitoring report

Spring Cycle – Level 2
• Districts targeted on Exiting (SPP 1, 2, 13, 14), Disproportionality (SPP 9, 10), 60-Day Timeline (SPP 11), and C-to-B Transition (SPP 12); targeted districts begin Level 2 Spring self-assessment on Exiting and Disproportionality
• Districts submit: PIP; self-assessment results; CAP, if required
• Spring cycle preliminary monitoring report disseminated
• Districts submit: student-level correction of noncompliance; CAP, if required
• Correct student-specific noncompliance

Level 3 – On-site monitoring begins
Level 3 – On-site monitoring ends

Key (calendar symbols)
• Submit self-assessment
• DOE report
• Submit PIP
• Submit CAP, if required
• Monitoring
• Correction of student-specific noncompliance
• Implement CAP

Combined annual calendar – all monitoring activities
• Level 1 – All districts begin self-assessment: SPP 13, Matrix, DJJ
• Level 2 Fall – Districts targeted and begin self-assessment: LRE/Student Performance (SPP 3, 4, 5)
• Districts submit self-assessment results: Level 1 (all districts); Level 2 Fall (targeted districts)
• Level 1 – Preliminary monitoring report disseminated
• Level 1 – Districts submit, if needed: student-level correction of noncompliance, CAP, and/or PIP (Level 2 Fall)
• Level 1 and Level 2 Fall – Districts submit: CAP (Level 1 and/or Level 2 Fall, if needed); PIP (Level 2 Fall)
• Level 1 – Final monitoring report disseminated
• Level 2 Spring – Districts targeted and begin self-assessment: Exiting (SPP 1, 2, 13, 14), Disproportionality (SPP 9, 10), Timely Evaluation (SPP 11), C-to-B Transition (SPP 12)
• Level 2 Spring – Targeted districts submit: self-assessment results; CAP, if required; PIP (Level 2 Spring)
• Level 3 – On-site monitoring begins and ends
• LEA determinations and CEIS determinations
• Data submissions for SPP 11 and SPP 12

Changes to State Performance Plan Indicators

February 1, 2010 Submission

Changes to Annual Performance Report
• No February 2010 APR reporting for SPP indicators 6, 7, 13, and 14 (baseline and targets will be reported for indicator 7 in the SPP)
• Data to lag one year for indicators 1, 2, and 4
• 2007-08 data will be reported in APR for these indicators
• For all other indicators, 2008-09 data will be reported

Calculation Guide
• Organized by indicator
• Data sources
• Timeframe for data retrieval
• Calculation method
• Key data elements

Changes to APR

Calculations have changed for:
• Indicator 1: Graduation

• Indicator 3: Participation and performance on statewide assessment

• Indicator 11: 60-day timeline

• Indicator 13: Secondary transition in the IEP (no changes for Florida)

• Indicator 14: Post-school outcomes

Indicator 1: Graduation Rate

Old formula
• Standard diploma SWD graduates in a given year divided by total SWD completing their education or dropping out in the same year

New formula
• NCLB calculation (four-year cohort model)
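As a rough sketch of the difference, the old rate is a simple same-year ratio, while the new rate follows the NCLB four-year cohort idea. The function names and counts below are hypothetical (chosen to reproduce the 2007-08 percentages shown on the next slide), and the cohort logic is simplified.

```python
# Hypothetical illustration of the two Indicator 1 calculations.
# Counts are invented, not Florida data.

def old_graduation_rate(std_diploma_grads, total_completers_and_dropouts):
    """Old formula: standard-diploma SWD graduates in a given year
    divided by total SWD completing their education or dropping out
    in that same year."""
    return std_diploma_grads / total_completers_and_dropouts

def cohort_graduation_rate(cohort_size, on_time_std_diploma_grads):
    """New formula (simplified): NCLB-style four-year cohort model --
    on-time standard-diploma graduates divided by the adjusted cohort
    that entered 9th grade four years earlier."""
    return on_time_std_diploma_grads / cohort_size

print(f"Old formula: {old_graduation_rate(452, 1000):.1%}")   # 45.2%
print(f"New formula: {cohort_graduation_rate(1000, 430):.1%}")  # 43.0%
```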

2007-08 Graduation Rate: 45.2% under the old formula vs. 43.0% under the new formula

Indicator 1: Graduation Rate

Because of the change to the indicator, we will be establishing new baseline and targets

Indicator 3: Participation and Performance

Performance will now be calculated based on students enrolled for full academic year (reported in October and February) rather than all students taking the test

Participation still calculated for all students enrolled
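A minimal sketch of the revised denominators, assuming simple per-student records; the field names and values are invented for illustration.

```python
# Hypothetical Indicator 3 calculation under the revised rules.
# Each record notes full-academic-year (FAY) status, testing, and proficiency.
students = [
    {"fay": True,  "tested": True,  "proficient": True},
    {"fay": True,  "tested": True,  "proficient": False},
    {"fay": False, "tested": True,  "proficient": True},   # not FAY: participation only
    {"fay": True,  "tested": False, "proficient": False},
]

# Participation: still based on all enrolled students
participation = sum(s["tested"] for s in students) / len(students)

# Performance: now based only on students enrolled for the full academic year
fay_students = [s for s in students if s["fay"]]
performance = sum(s["proficient"] for s in fay_students) / len(fay_students)

print(f"Participation: {participation:.0%}   Performance (FAY only): {performance:.0%}")
```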

Indicator 11: 60-Day Timeline

Old Measurement
• Reported separately the number of students (1) determined not eligible and (2) determined eligible whose initial evaluations were completed within 60 days

New Measurement
• (1) and (2) have been collapsed

Web-based reporting for 2008-09 data
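A small sketch of the collapsed measure: eligible and not-eligible determinations now count together toward one timeline percentage; the counts below are invented.

```python
# Hypothetical Indicator 11 summary under the collapsed measurement.
# Previously (1) not-eligible and (2) eligible students were reported
# separately; both groups now count toward a single 60-day measure.
evaluated_within_60_days = 380 + 95   # eligible + not eligible (invented counts)
total_initial_evaluations = 500

compliance = evaluated_within_60_days / total_initial_evaluations
print(f"Initial evaluations completed within 60 days: {compliance:.1%}")
```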

Indicator 13: Secondary Transition in the IEP

• No reporting this year
• Indicator language has changed to mirror IDEA secondary transition requirements
• Florida is already using the correct language

Indicator 14: Post-school Outcomes

• Reporting not required this year
• Indicator requires mutually exclusive reporting of students:
• Enrolled in higher education
• Competitively employed
• Enrolled in other postsecondary education or training
• Employed in some other employment

Indicator 14: Post-school Outcomes

Consulting with FETPIP Office to address changes

Indicator 4B: Disproportionality in Suspension/Expulsion

This indicator was in the original State Performance Plan and was removed by OSEP

It will be included in the February 2011 APR submission based on data from 2008-09

It will be treated in the same way as indicators 9 and 10 (including a review of policies, practices and procedures)

Correction of Noncompliance

Compliance Indicators

• Disproportionality due to inappropriate identification (SPP 9 and 10)
• Completion of initial evaluations within the 60-day timeline (SPP 11)
• Transition from Part C to Part B (SPP 12)
• Secondary transition in the IEP (SPP 13)
• Correction of noncompliance (SPP 15)
• Timely and accurate reported data (SPP 20)

Compliance Indicators

Findings of noncompliance are corrected as soon as possible but no later than one year from identification/notification

Correction of noncompliance occurs at the individual student level and at a systemic level

Systemic noncompliance is defined as identified noncompliance in 25% or more of individual cases
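A minimal illustration of that 25% threshold, using invented review counts.

```python
# Hypothetical check against the systemic-noncompliance definition:
# noncompliance identified in 25% or more of individual cases reviewed.
cases_reviewed = 40
cases_with_noncompliance = 11   # invented count

rate = cases_with_noncompliance / cases_reviewed
is_systemic = rate >= 0.25
print(f"Noncompliance rate: {rate:.0%} -> systemic: {is_systemic}")
```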

Compliance Indicators

Reporting correction of noncompliance in the APR occurs in the year following identification

Example

In 2007-08, District A had findings of noncompliance in Indicator 10.

In 2008-09, verification of correction of this noncompliance is reported in both Indicator 10 and in Indicator 15.

The 2007-08 findings would be part of a district’s LEA determination in Spring 2009.

The 2008-09 correction of these findings would be part of the district’s LEA determination in Spring 2010.

Timeline Compliance Indicators

For indicator 11, evidence of correction for individual students demonstrates that the student was evaluated.

For indicator 12, evidence of correction for individual students demonstrates that an IEP was developed and implemented.

The current data reporting structure for both of these indicators includes this evidence.

LEA Determinations

2008-09 Data

LEA Determinations

Determinations are made using a rubric that allocates points. The total number of points determines which determination a district receives.

For the purposes of determinations, all calculations will be rounded to the nearest whole number.

LEA Determination Elements

A district receives one point for each of the following indicators on which it meets substantial compliance (95%):
• Indicator 9

• Indicator 10

• Indicator 11

• Indicator 12

• Indicator 20

LEA Determination Elements

A district receives one point if there is 100% correction of noncompliance identified in 2007-08 data.

A district receives one point if there are no critical state audit findings reported by the Auditor General.

Total possible points = 7
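A sketch of the point rubric described on these slides; district values are invented, the whole-number rounding rule noted earlier is applied, and the mapping from total points to a determination category is not shown because it is not spelled out here.

```python
# Hypothetical LEA determination scoring based on the rubric above.

def determination_points(compliance_pcts, full_correction, no_critical_audit_findings):
    """compliance_pcts: percent compliance for indicators 9, 10, 11, 12, 20."""
    points = 0
    for pct in compliance_pcts.values():
        if round(pct) >= 95:          # substantial compliance threshold
            points += 1
    if full_correction:               # 100% correction of prior-year noncompliance
        points += 1
    if no_critical_audit_findings:    # no critical state audit findings
        points += 1
    return points                     # maximum possible = 7

example = {"SPP 9": 100, "SPP 10": 97.6, "SPP 11": 93.4, "SPP 12": 95.2, "SPP 20": 100}
print(determination_points(example, full_correction=True,
                           no_critical_audit_findings=False))  # -> 5
```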

Leveled Monitoring System

2009-10

2009-10: Level 1 Monitoring

Level 1 Self-Assessment – All districts
• SPP 13 – Secondary transition addressed in the IEP
• Matrix of services
• Services to students in DJJ facilities
  • Basic procedural compliance at the facility and district levels
  • IEP implementation
• Random sampling provides a “snapshot” of the district

2009-10: Level 2 Monitoring

Level 2 Self-Assessment – Targeted districts
• Fall Cycle
  • SPP 3, 5 cluster (LRE/assessment)
  • SPP 4 (Suspension/Expulsion)
  • Concurrent with Level 1
• Spring Cycle
  • SPP 1, 2, 13, 14 cluster (Exiting)
  • SPP 9, 10 cluster (Disproportionality)

2009-10: Level 2 Monitoring

Level 2 – Targeted districts
• Focused protocols for newly targeted districts
• Why newly targeted only? If the district was targeted for a given indicator in 2008-09, procedural self-assessment was conducted and:
  • Either procedural noncompliance was not a systemic issue, or
  • Procedural noncompliance was a systemic issue, and the district already has addressed or is currently addressing it through a CAP

2009-10: Level 2 Monitoring

Level 2 – Targeted districts
• Purposeful sampling of those students most likely to be impacted by noncompliance in the indicator-specific related requirements provides more meaningful and useful data to district problem-solving teams

2009-10: Level 3 Monitoring

Level 3 – On-site monitoring
• Level 1 self-assessment protocols, and
• Level 2 (Spring, Fall, or both) self-assessment protocol(s), if applicable, and
• On-site monitoring of one or more of the following:
  • Matrix of services 254/255
  • Timely correction of noncompliance
  • Pattern of poor performance on multiple indicators
  • Focus on IEP implementation

2009-10: Level 3 Selection Criteria

Matrix of services 254/255
• Adjusted for out-of-district students for purposes of district selection
  • > 150% of state rate for 254
  • > 150% of state rate for 255
  • > 150% of state rate for 254 and 255 combined
• Monitoring activities will apply to both in-district and out-of-district students
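A sketch of the 150%-of-state-rate check, with invented rates; the out-of-district adjustment is not modeled.

```python
# Hypothetical Level 3 selection check for matrix of services 254/255:
# a district is flagged if its rate exceeds 150% of the state rate for
# 254, for 255, or for 254 and 255 combined. Rates below are invented.
state = {"254": 2.0, "255": 1.0, "combined": 3.0}        # percent of students
district = {"254": 3.4, "255": 1.2, "combined": 4.6}

flags = {k: district[k] > 1.5 * state[k] for k in state}
print(flags)   # {'254': True, '255': False, 'combined': True}
```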

2009-10: Level 3 Selection Criteria

Timely correction of noncompliance
• Self-assessment results

• State complaint investigations

• Due process hearings

• SPP compliance indicators (11, 12, 13, (15))

2009-10: Level 3 Selection Criteria

Timely correction of noncompliance
• OSEP timeline of “as soon as possible, but in no case longer than one year from identification” applies to LEA determinations
• BEESS internal timelines apply to district selection for on-site monitoring
  • 60 days for student-specific noncompliance identified through self-assessment
  • 10-12 months for systemic noncompliance
  • Established timelines for noncompliance identified through state complaints (30-60 days) or due process hearings

2009-10: Level 3 Selection Criteria

Timely correction of noncompliance – moving forward to 2010-11
• OSEP timeline requires that within one year:

• Districts must correct all noncompliance

• Bureau must verify correction occurred

2009-10: Level 3 Selection Criteria

Timely correction of noncompliance – moving forward to 2010-11
• OSEP allows states not to report in the APR noncompliance that is corrected before it is formally “identified,” although states must verify the correction (e.g., discovered during on-site monitoring, but corrected prior to the report being disseminated)

2009-10: Level 3 Selection Criteria

Pattern of poor performance on multiple indicators or clusters over time

Example:
• Targeting by the exiting cluster doesn’t automatically trigger Level 3 –
• But targeting by the exiting cluster for three years in a row likely will!

Monitoring Timeline

2009-10

Still not sufficiently “Dazed and Confused”?? Just wait!!

Timeline – Closing Out 2008-09

Final 2008-09 on-site visits being conducted now

January 27, 2010 – Districts with CAPs submit final status report demonstrating:
• Correction of all student-specific noncompliance

• > 75% compliance on designated standards

• Yes, there is an elephant in the room… There will be overlap between years.

Timeline – 2009-10

Level 1 and Level 2 Fall Cycle (focused)
• October 15, 2009 – Level 2 Fall Cycle districts notified of status as targeted districts
• LRE/student performance (SPP 3, 5)

• Suspension/expulsion (SPP 4)

• October 15, 2009 – Draft manual and conference call information in BEESS Weekly Memo

Timeline – 2009-10

Level 1 and Level 2 Fall Cycle (focused)
• October 20-21, 2009 – Informational conference calls
• Level 1 monitoring

• Level 2 monitoring

• October 26, 2009 – Districts begin self-assessment

Timeline – 2009-10

Level 1 and Level 2 Fall Cycle (focused)
• January 8, 2010 – Districts submit via Web site
• Self-assessment results

• PIP for Level 2 Fall

• January 29, 2010 – Level 1 and Level 2 Fall Cycle preliminary report disseminated

Timeline – 2009-10

Level 1 and Level 2 Fall Cycle (focused)
• March 8, 2010 – Districts submit:

• Correction of student-specific noncompliance

• CAP for systemic noncompliance, if required

• March 29, 2010 – Level 1 and Level 2 Fall Cycle final report disseminated

Timeline – 2009-10

Level 2 Spring Cycle (focused)
• February 3, 2010 – Level 2 Spring Cycle districts notified of status as targeted districts
• Exiting (SPP 1, 2, 13, 14)

• Disproportionality (SPP 9, 10)

• Timely evaluation (SPP 11)

• C-to-B transition (SPP 12)

• February 8, 2010 – Level 2 Spring Cycle districts begin self-assessment
• Exiting (SPP 1, 2, 13, 14)

• Disproportionality (SPP 9, 10)

Timeline – 2009-10

Level 2 Spring Cycle (focused)
• April 5, 2010* – Districts submit via Web site

• Level 2 Spring Cycle self-assessment results

• PIP for Level 2 Spring Cycle
• Exiting (SPP 1, 2, 13, 14)

• Disproportionality (SPP 9, 10)

• Timely evaluation (SPP 11)*

• C-to-B transition (SPP 12)*

• April 26, 2010 – Level 1 and Level 2 Spring Cycle preliminary report disseminated

* Date may differ for SPP 11, 12

Timeline – 2009-10

Level 2 Spring Cycle (focused)
• June 7, 2010 – Districts submit:

• Correction of student-specific noncompliance for Level 2 Spring Cycle

• Exiting (SPP 1, 2, 13, 14)

• Disproportionality (SPP 9, 10)

• CAP for systemic noncompliance, if required

• June 28, 2010 – Level 2 Spring Cycle final report disseminated

Timeline – 2009-10

Level 3 – On-site monitoring
• Notification (goal) – November 1, 2009
• On-site visits (goal)
• January – May 2010

• May need to extend to August – October 2010

And now you know! Clearly - “Dazed and Confused”!

Coordinated Early Intervening Services

CEIS

Use of CEIS

Districts may choose to use up to 15% of IDEA funds for early intervening services

Districts may be required to use the full 15%

Required CEIS

Districts are required to set aside 15% of IDEA funds for early intervening services if any of the following criteria are met:
• Students of any race are at least 3.5 times more likely to be identified as disabled compared to all other races (SWD, IND, EBD, SLD, ASD, OHI, SILI)

Required CEIS

• Students with disabilities ages 6-21 of any race are at least 3.5 times more likely to be placed in a separate class or other separate environment when compared to all other races (SWD, IND, EBD, SLD, ASD, OHI, SILI)

• Students with disabilities of any race are at least 3.5 times more likely to be suspended/expelled when compared to all other races combined (SWD only)
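A minimal risk-ratio sketch of the “3.5 times more likely” test in the criteria above, using invented counts; the state’s published calculation governs the actual determination.

```python
# Hypothetical risk-ratio check for required CEIS:
# students of one race at least 3.5 times more likely to be identified
# (or placed, or suspended/expelled) than all other races combined.
group_identified, group_total = 90, 600          # one racial/ethnic group (invented)
others_identified, others_total = 210, 5400      # all other races combined (invented)

risk_ratio = (group_identified / group_total) / (others_identified / others_total)
requires_ceis = risk_ratio >= 3.5
print(f"Risk ratio: {risk_ratio:.2f} -> required CEIS: {requires_ceis}")
```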

Remember…

You must report in the automated student database each student who received services funded through CEIS dollars.

States are required to track by student those nondisabled children who received these services and whether or not they ultimately were found eligible for special education and related services.

CEIS Reporting Requirements

For districts voluntarily using up to 15% of IDEA dollars for CEIS and those required to use funds for CEIS, a code has been added to the element “Fund Source” in the automated student database to indicate students receiving CEIS under requirements in the IDEA.

Fund Source

Code I indicates a student receiving Early Intervening Services funded by IDEA, Part B dollars.

Reported on Federal/State Indicator Status format in Survey 5.

Percentage of districts required to implement CEIS and districts choosing whether to implement CEIS on a voluntary basis (yes or no):
• CEIS required: 19%
• CEIS voluntary – yes: 37%
• CEIS voluntary – no: 44%

Uses of CEIS Set Aside Funds

Personnel
• RtI coordinators, teachers, behavior specialists, substitutes, and paraprofessionals

Professional Development
• Consultants

• Stipends for teachers and other staff

• Travel costs for participants

Uses of CEIS Set Aside Funds

Technology
• Instructional
• Data collection and reporting tools

Materials and Supplies
• Consumables for teachers and students

Challenges/Issues

CEIS
• Specify applicable population of students

• Nondisabled K-12 students

• Tier II and Tier III

• Identify appropriate set-asides in budget(s)

• Budget for the 15% limit, if required

Parentally Placed Private School Students

Requirements and Challenges

Proportionate Share – Part B

Year      Students   Funds
2007-08   13,871     $17,976,857
2008-09   8,901      $11,941,119
2009-10   9,478      $28,634,542

Year      Students   Funds
2007-08   770        $417,258
2008-09   361        $176,676
2009-10   252        $293,862

Proportionate Share – Part B

FY 2008-09 Proportionate Share Expenditures

          Expenditures   Roll
Part B    $8,846,354     $5,354,325
PreK      $133,886       $49,729

Private School Consultation

How are eligible students enrolled in non-profit private schools identified?

How are private school representatives and parents of children with disabilities informed of the process?

How is the amount of proportionate share determined and how will funds be used?

Private School Consultation

How are decisions made with regard to services offered in the consultative agreement? (i.e., types of services, including direct service, and any alternate service delivery)

How is this information (consultative agreement) shared with private schools and parents?

Private School Consultation

How are affirmations obtained from representatives of private schools? (Affirmation signed by private school reps should document that “timely and meaningful consultation occurred”)

Grant Challenges/Issues

Consultation/Proportionate Share
• Describe district’s unique consultation process, including affirmation

• Identify set-asides in budget(s)

• Describe appropriate expenditures for proportionate share funds

You must have documentation on file that a “timely and meaningful consultation has occurred” and that it is signed off by private school officials or by a representative of private schools.

Remember…

Expending Proportionate Share

“YES!”
• Speech therapy
• Language therapy
• Occupational/physical therapy
• Instructional support per student’s services plan (SP)

“NO!”
• Psychological testing
• Guidance counseling
• Any activities, including observations, leading to identifying eligibility (both initial and for reevaluations)

Expending Proportionate Share

“YES!”
• Consumables and all instructional materials, age-appropriate and for specific use by students with disabilities and their teachers

“NO!”
• Evaluation and testing materials used by professionals conducting assessments to identify children initially and for reevaluations

Expending Proportionate Share

“YES!”
• Computer hardware and software specific for use by SWD
• Transportation costs of serving SWD in public school (between private school/public school)

“NO!”
• Upgrade computers at the school
• Purchase site license for new reading curriculum at school
• Reimburse parents for transporting SWD to public school

Revocation of Consent

What does it really mean?

Revocation of Consent

• Parent must make the request in writing
• The district may not delay the cessation of services
• The district may not challenge the parent’s request
• Revocation of consent reflects dismissal from ESE, not discontinuation of some services

Revocation of Consent

District must provide prior written notice of change of FAPE/placement
• Will reflect the parent’s request

• Can include the district’s rationale for advising that consent not be revoked

• Should include a description of the rights and benefits no longer conveyed

Revocation of Consent

Historical record stands – prior participation in ESE cannot be deleted from the record

• Disciplinary protections no longer apply
• FCAT waiver no longer applicable
• OSEP says accommodations “may” be continued if the teacher provides them to other nondisabled students

Revocation of Consent

Application to districts’ virtual instruction programs?
• The district has an obligation to provide FAPE to an eligible student with a disability

• FAPE is not “one size fits all”

• There is no algorithm to plug in the student data and have “FAPE in the LRE” fall out

• Think outside the box; be flexible; be honest

Revocation of Consent

Application to districts’ virtual instruction programs?
• What does the district do when the parent of a student with multiple significant disabilities in need of highly specialized “low availability” services rejects placement at the school site(s) where the services currently are provided?

• If the parent refuses to allow the district to provide FAPE, the district may need to discuss revocation of consent.

Contact Us: 850-245-0475

Data Collection and Reporting; SPP/APR; CEIS
• Karen.Denbroeder@fldoe.org

General Compliance
• Kim.Komisar@fldoe.org

Monitoring
• Patricia.Howell@fldoe.org

Dispute Resolution
• Demetria.Harvell@fldoe.org
