RtI: Big Ideas and Key Elements
VPA 2011
Julie J. Benay
The Big Ideas of RtI
• High-quality instruction
• Frequent assessment
• Data-based decision making
Key Components to RtI
• High-quality, research-based core instruction
• Universal screening and benchmark testing
• Continuous progress monitoring
• Research-based interventions
• Interventions adjusted based on data, including frequency, intensity, and fidelity
• Collaboration, teaming, shared responsibility
RtI: Procedures
• Universal benchmark screening
• Ongoing progress monitoring
• Interventions provided with sufficient frequency, fidelity, and intensity
• Instructional adjustments made based on data
Why Universal Screening and Benchmarks?
• Universal screening in the fall provides a quick measure of gains or losses over the summer months
• Winter benchmark prevents students from “falling through the cracks” while there is still time to intervene
• Spring benchmark provides a picture of annual growth, useful information for summer programming, and a starting point for comparison in the fall
Research on “Curriculum Based Measures”
• 30 years of strong research indicate the reliability and predictive value of CBM (Fuchs and Fuchs)
• More than 200 empirical studies published in peer-reviewed journals:
– (a) provide evidence of CBM’s reliability and validity for assessing the development of competence in reading, spelling, and mathematics, and
– (b) document CBM’s capacity to help teachers improve student outcomes at the elementary grades.
Mastery Measurement
• With mastery measurement, teachers test for mastery of a single skill and, after mastery is demonstrated, they assess mastery of the next skill in a sequence
• Scores in mastery measurement cannot be compared over the course of a year, making it impossible to quantify rates of progress
• Many tests of mastery are teacher designed and lack validity and reliability
Curriculum Based Measures
• Each CBM test assesses all the different skills covered in the annual curriculum.
• CBM samples the many skills in the annual curriculum in such a way that each weekly test is an alternate form (with different test items, but of equivalent difficulty).
• Therefore, scores earned at different times during the school year can be compared to determine whether a student’s competence is increasing.
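Because each weekly form is an alternate form of equivalent difficulty, "is competence increasing?" reduces to looking at the trend across the scores. A minimal Python sketch, using illustrative scores rather than norms from any published measure:

```python
# Sketch: trend across equivalent-difficulty weekly CBM scores.
# A positive slope suggests competence is increasing over the year.
# The scores below are illustrative, not real student data.

def weekly_slope(scores):
    """Least-squares slope of score vs. week (points gained per week)."""
    n = len(scores)
    weeks = range(n)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

scores = [40, 42, 41, 45, 47, 48]   # e.g., words read correctly per minute
slope = weekly_slope(scores)
print(slope > 0)  # True: this student's scores are trending upward
```

Because every form is equally difficult, this comparison would not be valid with teacher-made mastery tests, where each test measures a different skill.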
Curriculum Based Measures
CBM makes no assumptions about instructional hierarchy for determining measurement
CBM incorporates automatic tests of retention and generalization
CBM is distinctive:
– Each CBM test is of equivalent difficulty
– Each test samples the year-long curriculum
– CBM is highly prescriptive and standardized
– Scores are reliable and valid
CBM Basics
CBM monitors student progress throughout the school year
Students are given probes at regular intervals (weekly or bi-weekly), depending on the intervention
Teachers use CBM data along with other data to quantify short- and long-term goals that will meet end-of-year goals
Using CBM
CBM tests are brief and easy to administer
All tests are different, but assess the same skills at the same difficulty level
CBM scores are graphed for teachers to use in making decisions about instructional programs and teaching methods for each student
CBM data management solutions are available commercially
Taking the Temperature
• CBMs are highly sensitive to learning but they are not perfect assessments
• They do one thing, and they do it well. They take the temperature of student learning
• A value in CBM is that they can be given again if the student had a bad day or testing procedures were invalidated
• Results of CBM should be triangulated with other forms of assessment to develop a well-rounded approach to making instructional decisions
CBM Results
False positive
• Student CBM score indicates a learning problem, but all other data sources give the team confidence that no problem exists
False negative
• Student performs within the average range on the CBM, but other data sources indicate the student is not fully understanding learning objectives
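The two outcomes above amount to a simple cross-check between the CBM flag and the team's read of the other data sources. A minimal sketch (the function name and boolean inputs are hypothetical, chosen just to illustrate the decision logic):

```python
# Hypothetical sketch of triangulating a CBM result with other data.
# Inputs are simplified to booleans for illustration.

def triangulate(cbm_flags_problem, other_data_flag_problem):
    """Compare the CBM flag against the rest of the team's evidence."""
    if cbm_flags_problem and not other_data_flag_problem:
        return "possible false positive"   # retest or review before acting
    if not cbm_flags_problem and other_data_flag_problem:
        return "possible false negative"   # dig into classroom evidence
    return "sources agree"

print(triangulate(True, False))   # possible false positive
print(triangulate(False, True))   # possible false negative
```

In practice "other data" is a team judgment across many sources, not a single flag; the point is only that no single measure should drive the decision.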
Using CBMs: Devil in the Details
• Determine who, what, where, and when to administer
• Train staff to administer and prepare to monitor fidelity (less of an issue with computer-based administration such as Renaissance)
• Decide how to organize and access data
• Establish times to meet as teams to discuss data
• Train leaders to guide discussions
Screening Tools Chart
• The National Center on RtI has developed a “tools” chart to assist schools in selecting benchmark screening and progress monitoring tools.
• The chart allows schools to consider validity and reliability in measuring both proximal and distal results
• Other factors to consider include expense of the product and complexity of administration
Aimsweb
• Aimsweb started in Eden Prairie, MN, as a small nine-employee company called Edformation
• The tool was so useful that it was quickly purchased by Harcourt Assessment
• Following that purchase, Pearson purchased the product and now owns it
• This year, Aimsweb became SIF (Schools Interoperability Framework) compatible, allowing the information to be integrated with PowerSchool or other SIF-compatible SIS modules
Aimsweb Demonstration
The easiest way for me to illustrate the use of CBMs in planning for instruction is to demonstrate using Aimsweb. However, all the principles of practice I will demonstrate can be done with pencil and graph paper!
The principles and practice include:
• Establishing and recording benchmark scores
• Determining which students need supplemental (Tier II) instruction
• Establishing a goal or target to be achieved within the timeframe (usually 10-20 weeks) and setting an aimline; rate of improvement (ROI) can be used to calculate the goal mathematically
• Setting up progress monitoring charts for each student
• Tracking data using progress monitoring probe scores
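The goal-setting and aimline steps above are simple arithmetic. A minimal Python sketch, assuming a hypothetical baseline score and ROI (the numbers are illustrative, not published growth norms):

```python
# Sketch of CBM goal-setting with a rate of improvement (ROI).
# Baseline, ROI, and window are illustrative values, not Aimsweb norms.

def cbm_goal(baseline, roi_per_week, weeks):
    """End-of-period target: baseline plus expected weekly growth."""
    return baseline + roi_per_week * weeks

def aimline(baseline, goal, weeks):
    """Expected score at each week: a straight line from baseline to goal.
    Weekly probe scores are then plotted against this line."""
    slope = (goal - baseline) / weeks
    return [round(baseline + slope * w, 1) for w in range(weeks + 1)]

baseline = 42   # e.g., words read correctly per minute at benchmark
roi = 1.5       # hypothetical expected gain per week
weeks = 10      # intervention window (typically 10-20 weeks)

goal = cbm_goal(baseline, roi, weeks)
print(goal)                      # 57.0
line = aimline(baseline, goal, weeks)
print(line[0], line[-1])         # 42.0 57.0
```

This is exactly the pencil-and-graph-paper version: plot the baseline, draw a line to the goal, and compare each progress monitoring probe against the line.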
The really essential question
Access to another tier of instruction is a potentially life changing decision for a student. How and when do you make that decision?
RtI Decision Making Models

Standard treatment
• Description: the same treatment for all learners with similar profiles; decisions based on response to the standard treatment
• Analysis: more rigorous for identifying special needs than the problem-solving model; may miss identifying some students

Problem solving
• Description: process oriented; decisions made by a problem-solving team that looks at ecological factors
• Analysis: less rigorous, yet does include most/all students with special needs; risks over-identification

Combined
• Description: elements of both; the problem-solving team considers both standard treatment and other related information
• Analysis: draws on strengths of both; time consuming; requires skilled facilitators
Decision Rules in a Problem Solving Model
We need guidance in regard to decision rules. We should NEVER be encouraged to use one source of data, and we need to ask:
• What measures are you using and what information are you getting from that data?
• What source of data did you examine first?
• What was the target, and who established it?
• When you make a decision about who will be with an intervention teacher, how did you decide? Think about how you considered your boundaries (e.g., group size) and how you prioritized the needs of the learners.
Using CBM for Systems Analysis
“Screening and progress monitoring data can be aggregated and used to compare and contrast the adequacy of the core curriculum as well as the effectiveness of different instructional and behavioral strategies for various groups of students within a school. For example, if 60% of the students in a particular grade score below the cut point on a screening test at the beginning of the year, school personnel might consider the appropriateness of the core curriculum or whether differentiated learning activities need to be added to better meet the needs of the students in that grade.”
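The aggregation in the quoted example is a simple cut-point calculation. A sketch in Python, with hypothetical scores and a hypothetical cut point (not published benchmarks):

```python
# Sketch of aggregating screening scores for systems-level analysis.
# Scores and the cut point below are illustrative, not real benchmarks.

def percent_below_cut(scores, cut_point):
    """Share of students scoring below the screening cut point."""
    below = sum(1 for s in scores if s < cut_point)
    return 100 * below / len(scores)

grade3_fall = [28, 35, 41, 19, 52, 33, 26, 47, 30, 22]
cut = 40  # hypothetical fall benchmark cut point

pct = percent_below_cut(grade3_fall, cut)
print(pct)  # 70.0
```

With 70% of the grade below the cut point, as in the quoted 60% scenario, the question shifts from individual students to the adequacy of the core curriculum itself.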
Response to Instruction
“How will we respond when a student isn’t learning?” – Mike Schmoker
“Find out what the child is thinking and intervene accordingly” – Dr. Lillian Katz
RtI is more than “response to intervention.” Digging deeper means considering the quality of instruction at all tiers, building in quality assessments, and helping teachers to have meaningful conversations about their practice.
How do we respond to the data?
“Students in the same school who experience difficulty in learning will be subject to very different responses based upon the beliefs and practices of individual teachers.” (DuFour et al)
“In short, a primary difference between the historical and contemporary approaches is the emphasis on proper instruction first, rather than believing there is something wrong within the student.” (John Hoover)
RtI: The Really Big Ideas

Shared Ownership for Learning
How can we work together to design a coherent, cohesive plan of instruction? How can all teachers share ownership for student success across grade levels and tiers of instruction?

Accountability Among Adults
How do we create a climate where it is safe to admit what you don’t know? How can we create communities of professional practice that capitalize on the strengths within our own system?

Response to Instruction
How do we help all teachers become better at “finding out what the child is thinking and responding accordingly” (Dr. Lillian Katz)? How do we move from checklists of symptoms and a focus on eligibility to identifying and implementing effective teaching strategies?
The Big Ideas

High-quality instruction: deep curriculum and assessment work, supported by teacher learning communities where teachers openly and honestly participate in collaborative work; coordinated, coherent instructional plans for striving learners

Frequent assessment: universal screening and benchmark testing to ensure that no student “falls through the cracks”; reliable progress monitoring data from more than one source; ongoing, quality formative assessments

Data-based decision making: thoughtful decisions focused on coherent plans of instruction rather than eligibility; using the data to change instruction at all tiers rather than spending time admiring or discussing the data itself
Shared ownership
Adults are accountable
Response to instruction