LISD Data Camp
June and August, 2011
LISD TECH Center
Welcome
• Necessary forms
– SB-CEUs
– SHU Credit
Session 1 Essential Questions
• The Many Meanings of Multiple Measures
– Why use multiple measures of data?
– In which ways will we use multiple measures in our school improvement process?
– What data sources will we use to make decisions about student achievement?
Session 1 Outcomes
• Identify multiple measures of data
• Align data with our school improvement goals
• Develop an action plan for engaging staff in analyzing multiple measures
School Improvement Process
Talking Points for Implementing a Data Warehouse in Lenawee Schools
• We must focus on data to increase student achievement
• We must utilize an inquiry approach to data analysis
• We must use multiple sources of data
• We need a data warehouse for our 21st century schools
Norms for Our Work
• Participate actively
• Actively listen
• Seek application
• Press for clarification
• Honor time agreements and confidentiality
• Keep ‘side bars’ to a minimum and on topic
• Take care of adult learning needs
FERPA/HIPAA Pre-Test
To be considered an “education record,” information must be maintained in the student’s cumulative or permanent folder.
• False, because any record that contains a student’s name is an education record.
FERPA grants parents the right to have a copy of any education record.
• True
You are in charge of a staff meeting to study student achievement on school improvement goals. As part of the meeting, you show the entire staff a report of student scores on a common local assessment. The report shows student names, and you have also given the staff a paper copy of the report.
It is a violation of FERPA to display the results of the assessment to the entire staff.
The exception would be a group of teachers working on strategies for specific students, as they are a specific population that has a “legitimate educational interest” in the information.
Data Roles
• What roles will each member of your team play in today’s work?
– Identify roles
– Describe responsibilities
– Hold each other accountable
The Many Meanings of “Multiple Measures”
Susan Brookhart
Educational Leadership, Vol. 67, No. 3
ASCD, November 2009, pp. 6-12
Would you choose a house using one measure alone?
Guiding Principle for Multiple Measures
• Know your purpose!
–What do you need to know?
–Why do you need to know it?
Purposes of Assessments
• Assessment for learning
– Formative (monitors student progress during instruction)
– Placement (given before instruction to gather information on where to start)
– Diagnostic (helps find the underlying causes of learning problems)
– Interim (monitors student proficiency on learning targets)
• Assessment of learning
– Summative (the final task at the end of a unit, a course, or a semester)
Sources: Stiggins, R. J., Arter, J. A., Chappuis, J., & Chappuis, S. (2004). Classroom Assessment for Student Learning. Portland, OR: Assessment Training Institute. Bravmann, S. L. (2004, May 2). “P-I Focus: One test doesn’t fit all”. Seattle Post-Intelligencer. Marshall, K. (2006). “Interim Assessments: Keys to Successful Implementation”. New York: New Leaders for New Schools.
Why use multiple measures for decisions in education?
• Construct validity
– the degree to which a score can convey meaningful information about the attribute it measures
• Decision validity
– the degree to which several relevant types of information can inform decision-making
Multiple Measures
• Measures of different constructs
• Different measures of the same construct
• Multiple opportunities to pass the same test
Using Multiple Measures for Educational Decisions
• Conjunctive approach (all measures count)
• Compensatory approach (high performance on one measure can compensate for lower performance on another measure)
• Complementary approach (high performance on any measure counts)
Examples
• NCLB accountability is conjunctive (i.e., the aggregate and all subgroups must reach the threshold to make AYP)
• Most classroom grading policies are compensatory (i.e., averages, percentages)
• Getting a driver’s license is complementary (i.e., passing on any one attempt counts)
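The three decision rules above are simple enough to sketch in code. The following is a minimal illustration; the measure names, scores, and the passing threshold of 70 are invented for the example, not taken from any real policy.

```python
def conjunctive(scores, threshold):
    """All measures count: every score must meet the threshold."""
    return all(s >= threshold for s in scores.values())

def compensatory(scores, threshold):
    """High scores offset low ones: the average must meet the threshold."""
    return sum(scores.values()) / len(scores) >= threshold

def complementary(scores, threshold):
    """Any single passing score is enough."""
    return any(s >= threshold for s in scores.values())

# Hypothetical student with one weak measure
scores = {"reading": 82, "math": 58, "science": 74}

print(conjunctive(scores, 70))    # False: math is below 70
print(compensatory(scores, 70))   # True: the average is about 71.3
print(complementary(scores, 70))  # True: reading passes on its own
```

The same scores pass or fail depending on the approach chosen, which is why knowing your purpose matters before combining measures.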
Using Multiple Measures for Educational Decisions

Examples of each type of measure:
• Measures of different constructs: MEAP measures different constructs in mathematics (i.e., measurement, numbers and operations, geometry, algebra, probability)
• Different measures of the same construct: retelling, constructed responses, and cloze tasks are different measures of the same construct (comprehension)
• Multiple opportunities to pass the same test: some students use multiple opportunities to take the ACT (i.e., for scholarships, NCAA eligibility)
Using Multiple Measures for Educational Decisions

• Measures of different constructs
– Conjunctive: school accreditation ratings based upon student achievement meeting identified targets in Reading, Math, Science, and Social Studies
– Compensatory: an outside agency identifies the “best schools” by computing an index of weighted scores
– Complementary: AYP “Safe Harbor,” achieved when the percentage of students scoring below proficiency decreases by ten percentage points from the previous year

• Different measures of the same construct
– Conjunctive: students must pass a reading comprehension test on two stories at the same reading level before being allowed to read stories at the next higher reading level
– Compensatory: teachers determine standards-based grades in a course using scores on multiple assessments measuring the same GLCE or HSCE
– Complementary: teachers allow student choice of assessment tasks to demonstrate understanding of the learning targets for a unit

• Multiple opportunities to pass the same test
– Conjunctive: students meeting all requirements graduate after passing an exit exam, no matter how many opportunities it takes
– Compensatory: teachers allow students to retake a unit test to demonstrate mastery of the unit’s outcomes
– Complementary: students must pass one mathematics test in order to graduate; students can choose the state test or an end-of-course exam in either Algebra I or Geometry
Suggestions for Using Multiple Measures for Decision Making
• Classroom assessments linked to the same construct to determine mastery
• Granting credit for graduation requirements
• Teacher evaluations
Questions?
Stan Masters
Coordinator of Instructional Data Services
Lenawee Intermediate School District
4107 N. Adrian Highway
Adrian, Michigan 49221
Phone: 517-265-1606
Email: [email protected]
ID: stan.masters
Data Warehouse webpage: www.lisd.us/links/data
Session 2 Essential Questions
• DataDirector Functions
– What are the important aspects of various DataDirector functions?
– What decisions must be made in sharing DataDirector products with others?
– How will we build our 2011-2012 assessment calendar?
Session 2 Outcomes
• Describe the functions of DataDirector tabs
• Identify the permissions in sharing a DataDirector product
• Create an assessment calendar for the 2011-2012 school year
How do you develop a monitoring plan?
• Identify specific learning indicators
• Create data collection templates
• Schedule the assessment calendar
– collaborative collection and analysis
Source: “Developing a Monitoring Plan”. Maryland Department of Education. Accessed May 25, 2010 from http://mdk12.org/data/progress/developing.html
Video Source: Reeves, D. (2009). “Planning for the New Year”. Accessed May 25, 2010 from http://www.leadandlearn.com/webinars
Assessment Calendars
Time Elements of an Assessment Calendar
Source: White, S. H. (2005). Beyond the Numbers: Making Data Work for Teachers and School Leaders. Englewood, CO: Lead and Learn Press.
• When will we administer the assessment?
• When will we collect the data?
• When will we disaggregate the data?
• When will we analyze the data?
• When will we reflect upon the data?
• When will we make recommendations?
• When will we make the decisions about the recommendations?
• When will we provide written documentation about the decisions?
• When will we share the data with other stakeholders?
Components of an Assessment Calendar
Source: White, S. H. (2005). Beyond the Numbers: Making Data Work for Teachers and School Leaders. Englewood, CO: Lead and Learn Press.
• Norm-referenced tests
• State assessments
• Criterion-referenced tests
• Writing assessments
• End-of-course assessments
• Common assessments
• Performance assessments
• Unit tests
• Other
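The time elements and components above can be combined into one structured calendar entry per assessment. The sketch below is hypothetical: the assessment name and dates are invented placeholders, not an actual LISD schedule.

```python
from datetime import date

# One hypothetical calendar entry: a component (the assessment) paired
# with dates for several of White's time elements.
calendar_entry = {
    "assessment": "Common assessment - Grade 5 mathematics",
    "administer": date(2011, 10, 3),     # when the assessment is given
    "collect": date(2011, 10, 7),        # when the data are collected
    "disaggregate": date(2011, 10, 10),  # when the data are disaggregated
    "analyze": date(2011, 10, 14),       # when the data are analyzed
    "recommend": date(2011, 10, 21),     # when recommendations are made
    "share": date(2011, 10, 28),         # when data are shared with stakeholders
}

# Sanity check: each step should be scheduled no earlier than the one before it.
steps = ["administer", "collect", "disaggregate", "analyze", "recommend", "share"]
dates = [calendar_entry[s] for s in steps]
assert dates == sorted(dates), "calendar steps are out of order"
print("calendar entry is in chronological order")
```

Recording the calendar this way makes it easy to check that analysis and sharing are actually scheduled, not just intended.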
Questions?