
So What? What’s Next?


Page 1: So What? What’s Next?

1

So What? What’s Next?

Scott Marion

Center for Assessment
Wyoming Formative Assessment Institute

February 4, 2008

Page 2: So What? What’s Next?

Scott Marion. Center for Assessment. WY FAI February 4, 2008

Some Questions

Does formative assessment Improve teaching? Improve learning? If so, how and why? If not, why not?

Does interim assessment Improve teaching? Improve learning? If so, how and why? If not, why not?

Page 3: So What? What’s Next?


My Focus Question

What is the mechanism for turning an assessment result—broadly speaking—into a useful instructional action?

With all of the claims for the instructional benefits of a variety of assessments—value-added, interim/benchmark, formative—I see little attention to the black box by which teachers and others are supposed to turn these results into a useful instructional action

Page 4: So What? What’s Next?


What’s our plan for this morning?

This is all about figuring out what to do once we learn something from the assessments:
- Peeking inside the “black box”
- Key attributes of formative assessment, with a major focus on learning progressions
- Analyzing student work with an eye toward “what comes next”
- Relating this to interim and summative score reports
- Data-based decision-making approaches
- Analyzing score reports with an eye toward program evaluation

Page 5: So What? What’s Next?


What’s inside the black box?

Data → Information → Decision(s) → Action(s)

Valid inferences are required for each transition
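The data-to-action chain above can be sketched as a tiny pipeline. This is an illustration only: all function names, scores, and cut points below are hypothetical, and each function stands in for an inference that could be wrong.

```python
def to_information(data):
    """Data -> Information: reduce raw scores to a summary."""
    return sum(data) / len(data)  # e.g., a mean scale score

def to_decision(information, proficiency_cut=700):
    """Information -> Decision: compare the summary to a criterion."""
    return "reteach" if information < proficiency_cut else "extend"

def to_action(decision):
    """Decision -> Action: map the decision to an instructional move."""
    actions = {"reteach": "small-group reteaching of the target concept",
               "extend": "enrichment tasks that deepen the concept"}
    return actions[decision]

raw_scores = [640, 710, 680, 655, 725]   # hypothetical scale scores
info = to_information(raw_scores)        # 682.0
decision = to_decision(info)             # "reteach"
print(to_action(decision))
```

The point of writing it this way is that every arrow is a place where a validity check belongs; none of the three steps is a concrete certainty.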

Page 6: So What? What’s Next?


What do I mean by valid inferences?

- Each transition (e.g., from data to information) is inferential
  - In other words, it is NOT a concrete certainty, which means we can be wrong!
- We are creating hypotheses about how things are working and what makes sense
- Stronger hypotheses, compared with weaker ones, are grounded in theory (e.g., learning theory) and/or previous research
- Hypotheses are falsifiable, and we must consider plausible alternative explanations for the phenomena we observe
  - Try to protect against a “confirmationist bias” (e.g., trying to support your favorite program)

All of these transitions are really hard!!

Page 7: So What? What’s Next?


Data to Information

On a sheet of paper, jot down your answers to the following:
- What’s the difference between data and information? What’s the same?
- What are some criteria that we might use to distinguish between the two concepts?
- How might we facilitate the transition from data to information?

Page 8: So What? What’s Next?


Data to Information-2

- We are bombarded with data, but we can’t and shouldn’t act on all of it
- Moving from data to information often involves transformations and reductions
  - For example, some might argue that everything going on in your classroom is a formative assessment opportunity—and it might be—but you can’t attend to all of it
  - Using targeted probes and/or observations will allow you to focus on important areas…and you still will not use all of it
  - Similarly, you probably do not spend a lot of time looking at lists of student raw scores from the state assessment; you look instead at some type of aggregation or disaggregation, such as percent in category or mean scale scores
- But these can also mislead!! That’s why the validity checks are critical
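As an illustration of the kind of reduction described above, here is a sketch that turns a hypothetical list of scale scores into percent-in-category and a mean scale score. The scores and cut points are invented, not any state’s actual performance levels.

```python
from collections import Counter

scores = [612, 645, 700, 731, 688, 659, 702, 755]  # hypothetical scale scores

def category(score):
    # Hypothetical cut scores for four performance levels
    if score < 630:
        return "below basic"
    if score < 670:
        return "basic"
    if score < 720:
        return "proficient"
    return "advanced"

# Reduce the raw list to the aggregations a score report typically shows
counts = Counter(category(s) for s in scores)
pct = {level: 100 * n / len(scores) for level, n in counts.items()}
mean_scale = sum(scores) / len(scores)

print(pct)         # percent of students in each category
print(mean_scale)  # mean scale score
```

Note how much the reduction throws away: two very different score distributions can yield the same mean, which is one way these summaries can mislead.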

Page 9: So What? What’s Next?


Information to Decisions

Once you have reduced the data to information, the next step is to make decisions about this information

What are some of the things we want to consider when making decisions about how and what we might act on regarding this information?

Page 10: So What? What’s Next?


Information to Decisions-2

Some things to consider:
- Is what the information is telling us something within our control on which to act?
- Do we know enough about what the information means so that we can make valid decisions?
- Are we sure about the validity of the information? What is the cost if we are wrong?
  - Replacing a math program is expensive!
  - Providing students with some differentiated practice can be cheap
- Have we prioritized our decisions? How?
  - Social justice/equity
  - Resource (time, money, etc.) constraints

Page 11: So What? What’s Next?


Decisions to Actions

- Once we decide what needs to be done, do we know how to do it?
- Once we choose an action, how will we know if it is the right action?

Start the cycle again…

Page 12: So What? What’s Next?


A Little Diversion (already?)

Dena M. Bravata and her colleagues just published a meta-analysis in JAMA documenting the positive correlation among pedometer use, increases in physical activity (as much as 1 extra mile of walking per day), and decreases in body mass index (BMI)
- Setting a goal—such as 5,000 or 10,000 steps/day—was also a key to increased activity

So, is this really a diversion?

Page 13: So What? What’s Next?


Key Attributes of Formative Assessment

The Council of Chief State School Officers’ (CCSSO) Formative Assessment work group (SCASS) recently released a draft paper describing six essential attributes of formative assessment:
- Learning progressions
- Learning goals
- Descriptive feedback
- Embedded in instruction
- Self and peer assessment
- Collaboration and culture

Based on what I have seen, these should not be new to you

Page 14: So What? What’s Next?


From FAST SCASS Key Attributes

Learning Progressions: The formative assessment process should be grounded in learning progressions that provide a clear understanding of the inter-relationships between major concepts and principles in a discipline and how they develop.

Learning Goals: Learning goals and criteria for success should be clearly identified and communicated to students.

Page 15: So What? What’s Next?


Learning Progressions

In the following slides on learning progressions, I draw largely on the work of my colleague, Brian Gong, and to a lesser extent, Margaret Heritage from CRESST.

Learning progressions should serve as the basis for formative assessment, but we argue that they should also serve as the basis for most other assessments, especially those intended to measure student growth

Page 16: So What? What’s Next?


Challenge: Supporting increased student learning

- Assertion 1: Current large-scale assessment and accountability are insufficient to produce the student learning we desire.
  - We can make accountability and large-scale assessment better, but they will always be insufficient because they are too distal to teaching and learning (that’s why you’re here!).
- Assertion 2: Learning progressions and instructional assessments can support learning and teaching more powerfully (but they are still only part of the solution).

Page 17: So What? What’s Next?


Learning Progressions, Standards, and Educational Reform

Learning progressions are an extension of the standards-assessment model for educational reform:
- Clear learning targets (content and performance), all students, feedback to guide instruction

HOWEVER—content standards and performance descriptors typically lack the detail, developmental coherence, and skills (reasoning, etc.) needed to present sufficient learning targets and learning progressions

Page 18: So What? What’s Next?


Content standards are not enough

We are starting to see a limited number of good examples of state grade-level content standards showing some development of knowledge, skills, or complexity over time:
- NRC Science (2007)
- New England Common Assessment Program

Page 19: So What? What’s Next?


Learning Progressions: Multiple, related perspectives that are not new!

“Descriptions of the successively more sophisticated ways of thinking about an idea that follow one another as students learn” (Wilson & Bertenthal, 2005)

“Descriptions of successively more sophisticated ways of reasoning within a content domain” (Smith et al., in press)

“A picture of the path students typically follow as they learn... A description of skills, understandings and knowledge in the sequence in which they typically develop.” (Masters & Forster, 1996)

The organization of learning experiences designed to “help students go ahead in different subjects as rapidly as they can”… “to learn how to learn.” “Any subject can be taught effectively in some intellectually honest form to any child at any stage of development.” (Bruner, 1960)

Page 20: So What? What’s Next?


Some Key Traditions Informing Learning Progressions

- Developmental psychology
- Cognitive psychology
  - Misconceptions/naïve theories
  - Expertise
- Curriculum
  - Structure (content & skills)
  - Scope & sequence
- Task analysis
- Instructional technology
- Assessment

Page 21: So What? What’s Next?


Developmental Psychology: Features and examples

Features:
- Inherent
- Universal
- Invariant*

Examples:
- Piaget: preoperational, concrete, symbolic
- ACER: Spelling
- Applebee: Children’s concept of story

Page 22: So What? What’s Next?


Curriculum: Scope & Sequence: Features and examples

Features:
- Lays out content knowledge—can be a content web rather than linear

Examples:
- AAAS: Atlas of Science Literacy
- Most state content standards

Page 23: So What? What’s Next?


Cognitive: Misconceptions: Features and examples

Features:
- Incorrect schema (concept, procedure)
- Rule-oriented (“mal-rules”)—well-structured content with a “grammar”

Examples:
- Brown & Burton: “buggy subtraction”
- Minstrell: Inertial motion

Page 24: So What? What’s Next?


Curriculum: Structure: Features and examples

Features:
- Focuses on “essence” and “structure”

Examples:
- Bruner: NSF “learning progressions” projects
- Ginsburg: Multiplication deep structure, multiple representations

Page 25: So What? What’s Next?


Cognitive: Expertise: Features and examples

Features:
- Content plus cognitive skills (habits of mind, heuristics)
- Plus structure, models, extensive schemata

Examples:
- Wilson: density; Siegler: balance beam
- Conant: Sea of Air
- Applebee: Child’s concept of story

Page 26: So What? What’s Next?


Example 1: AAAS Atlas of Science Literacy (is there enough detail?)


Page 27: So What? What’s Next?


Example 2: A Counting and Ordering Progress Map (lots of detail and very sequential)

From Curriculum Corporation. (1994). Mathematics Profile for Australian Schools. Carlton: Curriculum Corporation. In Masters, G., & Forster, M. (1997). Developmental Assessment. Victoria, AU: The Australian Council for Educational Research Ltd.

Page 28: So What? What’s Next?


Page 29: So What? What’s Next?


Example 3: Graphic Organizer for Conceptual Flow. From DiRanna, K. & Topps, J. (2005). What’s the Big Idea? San Francisco: K-12 Alliance/WestEd.

Page 30: So What? What’s Next?


Example 4: Multiplication Acquisition—movement from addition to multiplication

- Multiplication: the problem of finding the total quantity of objects contained in a given number of groups with the same number of elements
- Cognitive challenges:
  - The learner has to know and operate with two different grouping systems (number of groups and number of items in a group)—not like addition or subtraction
  - The operational number system is different than the place value system (e.g., 12 is one ten and two ones)
  - Generalization of learned representations (e.g., quantity-per-set model; area model; number line model)
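The three representations named above can be sketched in code for 3 groups of 5, to show that they all encode the same quantity. The function names are mine, not Ginsburg’s, and each body deliberately mirrors how a learner would enact that representation.

```python
def quantity_per_set(groups, per_group):
    """Quantity-per-set model: repeated addition, the bridge from addition."""
    total = 0
    for _ in range(groups):
        total += per_group
    return total

def area_model(rows, cols):
    """Area model: count the unit cells in a rows-by-cols rectangle."""
    return sum(1 for _ in range(rows) for _ in range(cols))

def number_line(jumps, jump_size):
    """Number line model: equal jumps starting from zero."""
    position = 0
    for _ in range(jumps):
        position += jump_size
    return position

# All three representations agree on 3 groups of 5
assert quantity_per_set(3, 5) == area_model(3, 5) == number_line(3, 5) == 15
```

The cognitive challenge in the slide shows up directly: each function has to keep the two grouping systems (number of groups, size of a group) in separate roles, which addition never requires.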

Page 31: So What? What’s Next?


Multiplication – cont.

Multiplication Example (Ginsburg)

(Figure: a number line from 0 to 15)

Page 32: So What? What’s Next?


What is intended to develop

- Deep content representation
- Other dimensions:
  - Skills; complexity
  - Independence
  - Generalization, application, reflection (= “habits of mind”)
- Together = proficiency

Page 33: So What? What’s Next?


Complexity continua

- Rote recall to strategic thinking (Webb)
- Memorize, perform routine procedures, communicate understanding, perform nonroutine problems, conjecture/generalize/prove (Porter & Smithson)
- Concrete to abstract (Dienes)
- Global to analytic to deductive (van Hiele)
- Pre-operational to operational (Piaget & Beth)
- Concepts to rules to problem-solving (Gagne)
- Enactive to symbolic (Bruner)
- External to internal (Vygotsky)
- Situated to decontextualized (Cole & Griffin; Greeno)
- Facts/skills to applications to analysis/synthesis/evaluation (Bloom)
- Naïve interpretations (based on superficial characteristics) to scientific models (focused on key attributes and underlying regularities) (Steen)
- Application, learning potential, metacognition, beliefs and values, whole (Ginsburg et al.)

Page 34: So What? What’s Next?


Developing Learning Progressions and Assessments

- Don’t wait for conclusive research and consensus about “the right way”
  - For instance, researchers might argue about whether the “map”-type progressions or the list type are better, but you need to figure out what works best for your purposes
- Do understand the conceptual frameworks, and decide what emphasis you’ll make (and why)
- Do collect as many examples of different approaches as you can
- Do clarify your purpose(s)
- Consider revising/extending your state content standards
- Analyze curricula (frameworks, syllabi, maps)
- Think about creating developmental rubrics compared with or in addition to achievement-level rubrics (I know this is a major shift)
- Consider supporting teacher benchmarking to develop learning progressions at some level (state, multi-district, district)

Page 35: So What? What’s Next?


References

AAAS. (2001, 2007). Atlas of Science Literacy (Vols. I and II). Washington, DC: American Association for the Advancement of Science and the National Science Teachers Association.

Applebee, Arthur. (1978). The Child’s Concept of Story: Ages Two to Seventeen. Chicago: University of Chicago Press.

Bruner, Jerome. (1960). The Process of Education. Cambridge, MA: Harvard University Press.

Gong, Brian, Venezky, Richard, & Mioduser, David. (1992). Instructional assessments: Lever for systemic change in science education classrooms. Journal of Science Education and Technology, 1(1), 157-176.

Kennedy, Cathleen & Wilson, Mark. (2007). Using progress variables to map intellectual development. Presentation at the MARCES Conference, University of Maryland-College Park.

Lesh, Richard, Lamon, Susan J., Gong, Brian, & Post, Thomas R. (1992). Using learning progress maps to improve instructional decision making. In R. Lesh & S.J. Lamon (Eds.), Assessment of Authentic Performance in School Mathematics. Washington, DC: AAAS Press.

Masters, Geoff & Forster, Margaret. (1996). Progress Maps. (Part of the Assessment Resource Kit.) Melbourne, Australia: The Australian Council for Educational Research.

Mariotti, Arleen S. & Homan, Susan P. Linking Reading Assessment to Instruction (4th ed.).

National Science Foundation solicitation: http://www.nsf.gov/pubs/2005/nsf05612/nsf05612.htm

Smith, Wiser, Anderson, & Krajcik (in press).

Wilson, Mark & Bertenthal, Meryl (Eds.). (2005). Systems for State Science Assessment. Board on Testing and Assessment, Center for Education, National Research Council of the National Academies. Washington, DC: National Academies Press.

Page 36: So What? What’s Next?


Break

Let’s take a 5 minute break

Page 37: So What? What’s Next?


Student Work: Materials

I’ve given you samples of student work from:
- Wyoming Activities Assessment Consortium
  - Mathematics
  - Language arts
- NAEP
  - 4th grade science
  - 8th grade science

We’re going to focus on just one aspect of the student work protocol provided to you by Barb and Janet in November…

Page 38: So What? What’s Next?


‘Looking at Student Work’ Form and Protocol

Section #5 is the most important section:
- List strategies that the teacher can use to address the students’ needs (individual or small-group instruction)
- Learning needs for ALL students (whole-class instruction): strategies that could be implemented with the “whole” group

Page 39: So What? What’s Next?


Student Work: Instructions

- At your table, please examine as many of the sets of work as you have time for, but at least two different sets
- I’m a nice guy, so I left the annotations and comments attached; you can see—and you learned last time—that making rubric- or target-related comments/notes on the student work can be an important source of feedback
- You can use the annotations, but I think you’ll need more than that to figure out what you would do next with each of the students. That includes students whose work is satisfactory

Page 40: So What? What’s Next?


Student Work: Instructions-2

- Try to think about the section or component of the learning progression that is represented by the particular work sample, and where in the progression you would want to help the student move next
  - Avoid the problem that Lorrie Shepard calls the “1000 mini lessons”
- Write notes for each particular work sample about:
  - What you would do next in terms of feedback to the student (the annotations help with this) and your instructional strategies
  - What additional information do you need or want?
  - Your analytic processes—how are you thinking about this? (This can help your colleagues)
  - What are the implications for assessment design?
- 45 minutes or so

Page 41: So What? What’s Next?


Debrief

- Math activity
- Science activities
- Language arts activity

What you would do next in terms of feedback to the student and your instructional strategies

What additional information would you need or want?

Your analytic processes—how are you thinking about this?

Page 42: So What? What’s Next?


Debrief-cont.: Implications for assessment design

How can the design of our tasks and rubrics facilitate feedback to teachers and students?

How can the design of tasks and rubrics facilitate helping us to figure out what to do next?

What are the differences in information received from the short compared with the longer tasks?

Page 43: So What? What’s Next?


In-the-Moment Formative Assessment

Many argue—and I tend to agree—that true formative assessment occurs “in the moment,” such as:
- Questioning in a whole-class situation
- Observing and probing students working on a group project
- Having students “think aloud” when they are putting math problems on the board

How would you apply the kind of thinking and acting we just did to these in-the-moment activities?

Page 44: So What? What’s Next?


This is really hard, so how do you get smart about this stuff?

- Disciplinary understanding
- Intense collaboration
  - Lesson study-type approaches

Page 45: So What? What’s Next?


Larger scale assessment

- One of the major values of formative assessment is that the information is descriptive…some have argued that it isn’t formative once it is scored
- Interim and summative assessments almost always come with score reports. Some are good and most are pretty poor.
- Using these results to inform decisions and actions is often called “data-based decision making” (DBDM)

Page 46: So What? What’s Next?


The black box still applies…

We still have to turn data into information, decisions, and actions

What kinds of information would you like to be able to get from score reports?
- Interim assessments
- Summative assessments

Do you think you’ll be able to get this information?

Page 47: So What? What’s Next?


Skills and knowledge Knowledge is power!

What kinds of skills do you need to be able to turn assessment reports into useable actions?

Page 48: So What? What’s Next?


Knowledge and skills

- Some basic statistics—central tendency, variability, correlation
- The concept of error
  - An issue for all assessments
  - Much larger for subscores
  - Comparisons can be tricky
- A little bit about test design, construction, and analysis (psychometrics)

It is easier to get smarter about these skills because so much has been written
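The basic statistics named above, plus the concept of error, can be sketched on a hypothetical set of scale scores. Everything here is invented for illustration: the scores, the interim measures, and the two-standard-error band as a rough uncertainty interval around the mean.

```python
import math
import statistics as stats

scores  = [612, 645, 700, 731, 688, 659, 702, 755]  # hypothetical scale scores
interim = [55, 60, 71, 74, 68, 62, 70, 78]          # hypothetical interim scores

mean = stats.mean(scores)               # central tendency
sd   = stats.stdev(scores)              # variability (sample standard deviation)
sem  = sd / math.sqrt(len(scores))      # standard error of the mean
band = (mean - 2 * sem, mean + 2 * sem) # rough ~95% uncertainty band: the
                                        # "concept of error" made visible

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = stats.mean(x), stats.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x) *
                           sum((b - my) ** 2 for b in y))

r = pearson_r(scores, interim)
print(f"mean={mean:.1f}  sd={sd:.1f}  sem={sem:.1f}  r={r:.2f}")
```

The error band is the key habit: a mean reported without it invites over-interpretation, and the problem is far worse for subscores built from only a handful of items.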

Page 49: So What? What’s Next?


Examining score reports

Let’s try to use the same approach for looking at these score reports as we did with student work:
- What are the actions that you might propose as a result of these scores?
- What is the level targeted by these proposed actions? Student, classroom, school, district?
- Why do you think these actions are justified?
- What are some possible alternative explanations for these results?

Page 50: So What? What’s Next?


What’s Common and What’s Not

What are the similarities and differences about the thought processes and skills necessary to operate within the black box for classroom formative assessment compared with interim and summative assessments?

Page 51: So What? What’s Next?


Last thoughts

- This is hard!
- Continue to work on learning progressions
- Don’t be satisfied with just administering and scoring assessments—there’s a lot more to learn!
- If there were simple solutions, we would have found them already
- Continue to operate using a hypothesis-testing approach
  - Avoid a confirmationist bias
  - Search for alternative explanations
- Embrace complexity and uncertainty

Page 52: So What? What’s Next?


For more information

Scott Marion, Center for Assessment
[email protected]
www.nciea.org