Using Data in the Delaware Performance Appraisal System
Wednesday, September 27, 2006

Page 1: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

Using Data in the Delaware Performance Appraisal System
Wednesday, September 27, 2006

Page 2: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

2

Working Assumptions

• Everyone is thoroughly familiar with DPAS 1 and knows that DPAS 2 is being field tested.

• All administrators are familiar with the DSTP pages of the DOE website.

• All teachers and specialists have access to the DSTP pages of the DOE website and know how to use those pages to find information about their students.

• NCLB is a data-based accountability system, and finally…

Page 3: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

3

Working Assumptions

• Collecting and analyzing data is the best way to identify areas of need and to focus instruction on them; therefore,

Page 4: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

4

Working Assumptions

• Collecting and analyzing data is the best way to identify areas of need and to focus instruction on them; therefore, collecting and analyzing data need to become integral parts of the school culture.

“Data Culture”

Page 5: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

5

DPAS 1 vs. DPAS 2

Page 6: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

6

DPAS Components: List

DPAS 1
1. Instructional Planning
2. Organization and Management of Classroom
3. Instructional Strategies
4. Teacher/Student Interaction
5. Evaluation of Student Performance
6. Related Responsibilities

DPAS 2
1. Planning and Preparation
2. Classroom Environment
3. Instruction
4. Professional Responsibilities
5. Student Improvement

Page 7: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

7

DPAS Components: Comparison

DPAS 1
1. Instructional Planning
2. Organization and Management of Classroom
3. Instructional Strategies
4. Teacher/Student Interaction
5. Evaluation of Student Performance
6. Related Responsibilities

DPAS 2
1. Planning and Preparation
2. Classroom Environment
3. Instruction
5. Student Improvement
4. Professional Responsibilities

Page 8: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

8

DPAS 2 Components: Data Potential

DPAS 1
1. Instructional Planning
2. Organization and Management of Classroom
3. Instructional Strategies
4. Teacher/Student Interaction
5. Evaluation of Student Performance
6. Related Responsibilities

DPAS 2
1. Planning and Preparation
2. Classroom Environment
3. Instruction
5. Student Improvement
4. Professional Responsibilities (by encouraging certain kinds of staff development over others)

Page 9: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

9

DPAS 1 Components: Data Potential

DPAS 1
1. Instructional Planning
2. Organization and Management of Classroom
3. Instructional Strategies
4. Teacher/Student Interaction
5. Evaluation of Student Performance
6. Related Responsibilities

DPAS 2
1. Planning and Preparation
2. Classroom Environment
3. Instruction
5. Student Improvement
4. Professional Responsibilities (by encouraging certain kinds of staff development over others)

Page 10: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

10

So…

• How do I create a Data Culture in my school?

• How do I use DPAS 1 to encourage staff members to use data in making instructional decisions?

Page 11: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

11

Creating a Data Culture

• Open access to all data, for all staff
  – DOE website
  – The Honeycomb
  – Display data all around the school
    • Hallways
    • Copy room!
    • School publications

• Model using data
  – If you don’t use it, why should your staff?
  – Setting school goals
  – Ask questions in terms of data
    • in meetings
    • in both formal and informal conversations with staff

Page 12: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

12

Set the stage

• Make your expectations known up front
  – In writing
  – To all certified staff
  – Early in the observation cycle
    • i.e., before you begin any observations
  – School-Wide Expectations
    • summer letter
    • “First day” packet
  – Reinforce frequently
    • every meeting: staff, departments/teams, SIP or other leadership teams

Page 13: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

13

EXAMPLE: School-Wide Expectations

Areas of emphasis for next year: 1) classroom assessments and 2) focusing on individual low performers

There can be no doubt that we must continue our ongoing work on integrating reading, writing and math instruction into all content areas, and that we must continue our work on identifying and meeting the needs of students in our most problematic NCLB cells. As Chester and I observed classes, analyzed data and took stock of our status and progress, however, two things have become apparent.

First, we must focus on classroom assessments as well as on state assessments. To be successful in reaching ever higher performance targets, we must be ever more clear as to what our instructional goals are, and we must plan all instruction to contribute directly to reaching those goals. In theory, the process is simple: we become thoroughly familiar with the state standards, we develop classroom assessments that will show that our students have met the state standards, and then we create lessons to teach students to do well on those assessments. This is exactly what many of you do, and in your cases Chester and I will be documenting your work and, I hope, working together with you to take your assessments to that proverbial “next level.” Where we need to work with others of you on the fundamentals, we will do that as well.

Second, to meet ever higher performance targets, we must aim our instruction more and more directly toward meeting the needs of individual students. We have done well at teaching to the majority of learners, and we have done significant work at reaching out to bring in other less traditional learners. Literature circles, All Kinds of Minds analyses, and the CMP practice of expecting that in any class students will derive multiple methods of arriving at the correct answer to a problem are good examples of how we are doing this kind of work, for they are based on a belief that instructional goals should be the same for all students while instructional materials and methods should vary depending on individual needs. One destination, many paths. To meet ever higher performance targets we must continue providing what we know works best for the majority of mainstream learners, while focusing our professional growth efforts on finding ways to reach the exceptional cases. Our future success depends on our reaching the outliers.

As you do your summer planning, then, do so with the expectation that Chester and I will be asking for copies of all assessments that relate in any way to the lessons we observe (from the vocab quiz coming up in a day or two to the final major assessment for the current unit), and that we will be asking what specific steps you are taking to reach your students with different needs and styles.

Page 14: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

14


EXAMPLE: School-Wide Expectations

Page 15: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

Let’s look at some sample data.

Page 16: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

16

Trend Data

Performance Over Time

Page 17: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

17

Note:

All data on the following Christina School District graphs was taken from the public pages of the DOE website. It is freely available to anyone with access to the internet, and the graphs can be created by anyone who can use Excel.
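The same percent-at-level figures plotted on the following slides can be produced with a few lines of code as easily as with Excel. A minimal sketch; the counts below are hypothetical placeholders, not actual DSTP results:

```python
def percent_at_level(counts):
    """Convert raw student counts per performance level (PL5..PL1)
    into the percent-at-level figures plotted on the trend charts."""
    total = sum(counts.values())
    return {level: round(100 * n / total, 1) for level, n in counts.items()}

# Hypothetical counts for one grade in one test year (not real DSTP data)
counts_2006 = {"PL5": 30, "PL4": 90, "PL3": 180, "PL2": 60, "PL1": 40}
print(percent_at_level(counts_2006))
# → {'PL5': 7.5, 'PL4': 22.5, 'PL3': 45.0, 'PL2': 15.0, 'PL1': 10.0}
```

Running this once per test year yields one series per year, which is exactly what the trend graphs on the next slides overlay.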

Page 18: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

18

Examining Trend Data

[Chart: CSD: Math Percentile Trends - 5th Grade; x-axis: Performance Level (PL5–PL1); y-axis: Percent at Level (0–100); one line per test year, 1998–2006]


Page 20: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

20

Examining Trend Data

[Chart: CSD: Math Percentile Trends - 3rd Grade; x-axis: Performance Level (PL5–PL1); y-axis: Percent at Level (0–100); one line per test year, 1998–2006]
[Chart: CSD: Math Percentile Trends - 5th Grade; same axes and years]

Page 21: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

21

Examining Trend Data

[Chart: CSD: Math Percentile Trends - 3rd Grade; x-axis: Performance Level (PL5–PL1); y-axis: Percent at Level (0–100); one line per test year, 1998–2006]
[Chart: CSD: Math Percentile Trends - 5th Grade; same axes and years]
[Chart: CSD: Math Percentile Trends - 8th Grade; same axes and years]

Page 22: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

22

Examining Trend Data

[Chart: CSD: Math Percentile Trends - 3rd Grade; x-axis: Performance Level (PL5–PL1); y-axis: Percent at Level (0–100); one line per test year, 1998–2006]
[Chart: CSD: Math Percentile Trends - 5th Grade; same axes and years]
[Chart: CSD: Math Percentile Trends - 8th Grade; same axes and years]
[Chart: CSD: Math Percentile Trends - 10th Grade; same axes and years]


Page 24: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

24

Examining Performance Against NCLB Targets

[Chart: CSD - Performance Against Targets: 3rd Gr. MATH; x-axis: Testing Years (1998–2014); y-axis: Percent Meeting Standard]
Target: 33.00, 33.00, 41.375, 41.375, 49.750, 49.750, 58.125, 66.500, 74.875, 83.250, 91.625, 100.00
Actual (1998–2006): 53.52, 60.41, 73.60, 73.23, 72.88, 75.42, 77.18, 82.32, 79.79

Page 25: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

25

Examining Performance Against NCLB Targets

Target (all grades): 33.00, 33.00, 41.375, 41.375, 49.750, 49.750, 58.125, 66.500, 74.875, 83.250, 91.625, 100.00

[Chart: CSD - Performance Against Targets: 3rd Gr. MATH; x-axis: Testing Years (1998–2014); y-axis: Percent Meeting Standard]
Actual (1998–2006): 53.52, 60.41, 73.60, 73.23, 72.88, 75.42, 77.18, 82.32, 79.79

[Chart: CSD - Performance Against Targets: 5th Gr. MATH]
Actual (1998–2006): 41.32, 46.76, 54.79, 56.80, 57.89, 67.73, 69.99, 74.22, 71.91

[Chart: CSD - Performance Against Targets: 8th Gr. MATH]
Actual (1998–2006): 33.27, 31.62, 28.70, 29.90, 40.48, 30.78, 25.30, 38.14, 50.93

[Chart: CSD - Performance Against Targets: 10th Gr. MATH]
Actual (1998–2006): 32.17, 25.68, 27.03, 23.88, 28.78, 40.71, 41.89, 42.16, 43.17
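The Target values above climb from a 33.00 baseline to 100.00 in equal steps of 8.375 points, one eighth of the remaining 67-point gap, with each of the first three levels held for two test years. A short sketch that reproduces the schedule (the two-year hold pattern is read off the data shown above):

```python
# Reproduce the Target schedule from the charts above: baseline 33.00,
# rising to 100.00 in eight equal steps of (100 - 33) / 8 = 8.375 points.
# The first three levels are each held for two years, then increases are annual.
base, final = 33.0, 100.0
step = (final - base) / 8
hold_pattern = [0, 0, 1, 1, 2, 2, 3, 4, 5, 6, 7, 8]  # step multiples per year
targets = [base + k * step for k in hold_pattern]
print(targets)
# → [33.0, 33.0, 41.375, 41.375, 49.75, 49.75, 58.125, 66.5, 74.875, 83.25, 91.625, 100.0]
```

Comparing each year's Actual figure against this schedule is exactly what the graphs do visually.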

Page 26: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

26

Disaggregations

Getting to theschool,

teacher, andclassroom

levels.

Page 27: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

27

Disaggregating by NCLB Cells

[Chart: CSD - Disaggregated DSTP Trend Data - 10th Grade MATH; x-axis: Test Year (1998–2008); y-axis: Percent Meeting Standard; series: Targets, All]

Page 28: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

28

Disaggregating by NCLB Cells

[Chart: CSD - Disaggregated DSTP Trend Data - 10th Grade MATH; x-axis: Test Year (1998–2008); y-axis: Percent Meeting Standard; series: Targets, All, White, Asian]

Page 29: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

29

Disaggregating by NCLB Cells

[Chart: CSD - Disaggregated DSTP Trend Data - 10th Grade MATH; x-axis: Test Year (1998–2008); y-axis: Percent Meeting Standard; series: Targets, All, Af-Am, Hisp, White, Asian, Low Inc, LEP]

Page 30: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

30

Disaggregating by NCLB Cells

[Chart: CSD - Disaggregated DSTP Trend Data - 10th Grade MATH; x-axis: Test Year (1998–2008); y-axis: Percent Meeting Standard; series: Targets, All, SpEd, Af-Am, Hisp, White, Asian, Low Inc, LEP]

Page 31: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

31

Disaggregating by NCLB Cells

[Chart: CSD - Disaggregated DSTP Trend Data - 10th Grade MATH; x-axis: Test Year (1998–2014); y-axis: Percent Meeting Standard; series: Targets, All, SpEd, Af-Am, Hisp, White, Asian, Low Inc, LEP]

Page 32: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

32

Disaggregating at the Teacher Level: Charting Cohort Progress

[Chart: Cohort Progress: FMS 2006 -- 8th Grade Special Ed -- MATH; x-axis: Grade Level (3rd–8th); y-axis: Scale Score (300–575); one line per student in the 32-student cohort; reference lines: PL 2, PL 3, Old PL 3, and Median (middle)]

Page 33: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

33

Incorporating Data into DPAS

Page 34: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

34

The Link

• The purpose of DPAS is to document how well (or whether or not) a staff member is doing his/her job.

• If you have framed that job in terms of using data
  – to set instructional goals and
  – to make decisions as to how to achieve those goals,
then data and DPAS are a natural fit.

Page 35: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

35

DPAS Components

DPAS 1
1. Instructional Planning
2. Organization and Management of Classroom
3. Instructional Strategies
4. Teacher/Student Interaction
5. Evaluation of Student Performance
6. Related Responsibilities

DPAS 2
1. Planning and Preparation
2. Classroom Environment
3. Instruction
5. Student Improvement
4. Professional Responsibilities (by encouraging certain kinds of staff development over others)

Page 36: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

36

Data in DPAS 1 Components

INSTRUCTIONAL PLANNING:
• provides appropriate instructional objectives
• provides methods and materials which maximize learning
• includes provisions for evaluating objectives
• provides scope and sequence for lesson

ORGANIZATION AND MANAGEMENT OF CLASSROOM:
• arranges classroom for instructional effectiveness
• uses instructional time efficiently
• establishes, communicates and maintains standards for students
• maintains high engagement rate
• maintains a positive classroom environment
• monitors the learning activities of students

Page 37: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

37

Data in DPAS 1 Components

INSTRUCTIONAL STRATEGIES:
• uses and organizes appropriate methods and activities in their proper sequence and time frame, i.e., reviews, modeling, guided and independent practice, and closure
• demonstrates sufficient knowledge of subject matter being taught
• uses available instructional media and materials effectively
• establishes a mind set for learning
• focuses lesson on teaching objective
• uses level of instruction that is appropriate
• maintains pace of learning
• provides opportunities for student differences
• checks for student understanding
• conveys appropriately high expectations for students

Page 38: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

38

Data in DPAS 1 Components

TEACHER/STUDENT INTERACTION:
• promotes high rate of student interest
• provides prompt and specific feedback in a constructive manner
• provides opportunities for active participation
• uses questioning techniques effectively
• demonstrates fairness and consistency in dealing with students
• speaks and writes clearly, correctly and at an appropriate level for student understanding

EVALUATION OF STUDENT PERFORMANCE:
• uses appropriate formative and summative tools and techniques
• makes effective use of norm- and/or criterion-referenced test data
• provides prompt feedback and constructive comments on tests, homework and other assignments
• maintains accurate records documenting student performance

Page 39: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

39

Data in DPAS 1 Components

RELATED RESPONSIBILITIES:
• complies with policies, regulations and procedures of school district/building
• engages in professional development
• communicates effectively with parents
• works cooperatively with staff
• performs non-instructional responsibilities as assigned

Page 40: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

40

Data in DPAS 1 Components

1. Instructional Planning
• “What data did you use in deciding to teach this lesson?”
• “What data-identified needs are you addressing with this lesson?”
  – How does this lesson address trends shown in the graphs I presented in my opening day presentation (or Sept. staff meeting, etc.)?
• “How do you plan to meet the needs of students in our target NCLB cells?”

Page 41: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

41

Data in DPAS 1 Components

3. Instructional Strategies
• “What data is there to show that the instructional strategies you have chosen are effective for meeting your instructional goals?”
  – general data from professional development
  – specific data generated by the teacher for this group of students
• Timed scans of the classroom
  – time on task, incidences of specific behaviors
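A timed scan reduces to simple arithmetic: at fixed intervals, count how many students are on task and divide by the number observed. A minimal sketch; the scan values below are made up for illustration:

```python
# Each scan records (students on task, students observed) at one time point.
# Hypothetical values for a 45-minute lesson scanned every 5 minutes.
scans = [(22, 25), (24, 25), (18, 25), (20, 25), (23, 25), (19, 25)]

on_task = sum(n for n, _ in scans)          # total on-task observations
observed = sum(total for _, total in scans)  # total students observed
rate = 100 * on_task / observed
print(f"Time on task: {rate:.1f}%")  # → Time on task: 84.0%
```

The same tally works for incidences of specific behaviors: count occurrences per scan and divide by scans taken.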

Page 42: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

42

Data in DPAS 1 Components

5. Evaluation of Student Performance
• “How have you determined that your chosen assessment strategies will accurately reflect student learning?”
• “How closely does the data you have collected from your classroom assessments mirror data from the DSTP?”
  – If there are discrepancies, why?
  – Are students learning?
  – Are the assessments not good enough?
  – Are changes in instruction indicated?

Page 43: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

43

Data in DPAS 1 Components

6. Related Responsibilities
• “What data analysis work have you undertaken on your own?”
  – Review of your particular students’ DSTP scores? Instructional needs comments?
  – Data generated from classroom assessments?
• “How have you applied what you have gained from professional development opportunities (course work, district workshops, individual reading) to improve your understanding of your students’ performance and instructional needs?”

Page 44: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

44

Objections & Responses

“I taught it, but they didn’t learn it.”

NCLB & the DSTP don’t care what you did. They only care about your results.

“My job is to teach. It’s the kids’ responsibility to learn.”

No, your job is to bring about student learning and to improve student performance.

There may once have been a time when your job was simply to “put it out there,” but that time is long gone.

Page 45: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

45

Objections & Responses

Like it or not, districts, schools and teachers are no longer being judged on what they do. What matters now is what their students do.

Results matter above all, and process (teaching) is valued only to the degree that it produces the desired outcome (student performance at or above targets).

Page 46: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

46

Objections & Responses

“Teaching the Curriculum” vs. teaching to needs identified by data
• Should be a false dichotomy.
• Teachers are expected to teach the curriculum, however…
• that should never be an excuse for not meeting students’ needs.
• If there are problems, it is important to identify where they lie.

Page 47: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

47

Curriculum Alignment

[Diagram: alignment of Instruction (implemented by the teacher), Curriculum (set by the district), and Standards (set by the state)]

Page 48: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

48

Curriculum Alignment

[Diagram: alignment of the WRITTEN curriculum, the TAUGHT curriculum, and the ASSESSED curriculum]

Page 49: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

49

Tools & Strategies

• Differentiated Instruction
• Authentic & embedded assessments
• Understanding by Design (& similar systems)
• Rubrics
• Staff cooperative work sessions:
  – analysis of school/grade data
  – planning of common lessons
  – scoring of common assessments
• Gates-MacGinitie, DIBELS, unit tests that come with texts

Page 50: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

50

Other Data

• Attendance

• Discipline

• TAG / Special Ed referral test results

• Schools Attuned / A Mind at a Time info

• Learning Styles inventories

• Teachers’ grade distributions

• Comparisons of grade distributions with DSTP scores of the same students
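That last comparison can be made concrete by lining up each student's course grade with the same student's DSTP performance level and flagging disagreements. A minimal sketch with hypothetical records (the names, grades, and levels are invented, and the B-or-better/PL2-or-below threshold is one possible choice):

```python
# Flag students whose course grades and DSTP performance levels disagree:
# a grade of B or better paired with a below-standard DSTP level (PL1/PL2)
# suggests classroom assessments may be easier than the state standard.
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def flag_mismatches(records):
    """records: list of (student, course_grade, dstp_level) tuples."""
    return [student for student, grade, level in records
            if GRADE_POINTS[grade] >= 3 and level <= 2]

# Hypothetical roster (not real students or scores)
roster = [("Student 1", "A", 4), ("Student 2", "B", 2),
          ("Student 3", "C", 3), ("Student 4", "A", 1)]
print(flag_mismatches(roster))  # → ['Student 2', 'Student 4']
```

The mirror-image check (low grades, high DSTP levels) is equally informative and needs only the inequalities reversed.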

Page 51: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

51

Still not enough?

Ask for more!

Sample form:

Performance Appraisal Request for Additional Information

(see handout)

Page 52: Using Data in the Delaware Performance Appraisal System Wednesday, September 27, 2006

52

Keep in mind: Analyzing school data can point out both strengths and weaknesses… but the best thing that data does is to raise questions.

Your school’s ability to find those questions and then to answer them will determine your future success.