Page 1

Robert J. Mislevy

University of Maryland
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)

NCME, San Diego, CA
April 15, 2004

Some Observations on Cognitive Psychology and Educational Assessment

Page 2

Outline of the talk

Themes from cog psych

How cog psych informs what we assess and how we might assess it (esp. school & work)

How cog psych helps us understand and organize what we do in assessment

Page 3

Themes

Capabilities & limitations

Reasoning in terms of patterns

Psychological perspectives

Acquiring expertise

Forms of knowledge representation

Page 4

Capabilities & limitations

Ways we are the same / different / unique

Experiential & reflective cognition

• Optical illusions / cognitive illusions

• Limited working memory & attention

• Can think about our thinking (metacognition)

• Benefit from procedures, methods, tools

Page 5

An Optical Illusion (http://www.optillusions.com)

Page 6

Reasoning in terms of patterns

Perception combines input from environment and patterns from experience

• Chi, Feltovich, Glaser example

Narratives / schemas / scripts / mental models

• This is how we make sense of the world

• Some “wired in”

• Some learned informally and experientially

• Some through instruction and conscious effort

Page 7

What is this a picture of? (http://www.optillusions.com)

Page 8

Reasoning in terms of patterns

Simultaneous use of patterns at many levels

Perception / Meaning / Action

Key role of interacting with situation

• Inquiry cycle / model-based reasoning

• Interactive tasks (construction, simulation)

• Even in static tasks, focus on perception / explanation / action

Page 9

Reasoning in terms of patterns
Assessment as Evidentiary Argument

What complex of knowledge, skills, or other attributes should be assessed … ?

What behaviors or performances should reveal those constructs [broadly construed]?

What tasks or situations should elicit those behaviors?

(Messick, 1994)

Page 10

Psychological perspectives

Trait/Differential (Spearman, Carroll)

• Origin of machinery of psychometrics

Behaviorist (e.g., CRTs of 1970s)

Developmental (Piaget)

Information-processing (Newell & Simon)

Sociocultural/situative (Vygotsky, Lave)

(Greeno, Pearson, & Schoenfeld, 1996)

Page 11

Psychological perspectives

A perspective shapes…

• what you pay attention to;

• what entities and relationships you use in explanations;

• what you see as problems and solutions.

A perspective both enables and constrains thinking.

Page 12

Psychological perspectives

For assessment, perspective shapes…

• Inferences you target – patterns that shape students’ actions (Meaning & Action)

• What you look for in what students say, do, or make (Perception)

• The features of the situation that evoke the evidence you need

A perspective both enables and constrains what you can learn from an assessment.

Page 13

What will Jimmie’s path be if he steps off the merry-go-round right now?

Page 14

Psychological perspectives

Hydrive

• Info-processing + sociocultural

AP Studio Art

• Sociocultural; interpretational

• Note interpretation of variables in model

Task-based language assessment

• All perspectives relevant

• Target language use (Bachman & Palmer)

• What to stress, how to design situations

Page 15

Acquiring expertise

Expertise as overcoming human cognitive processing limitations

• Patterns for perceiving, understanding, acting (incl. sociocultural)

• Use of knowledge representations

• Automating processes to varying degrees

• Metacognitive skills

Page 16

Acquiring expertise

Examples in assessment

• Katz, re NCARB simulations as example for “design under constraint” (assessment is another such domain!)

• Embretson as example for differential perspective measurement

• Marshall & Derry as example for assessment design based on schemata

• Stevens re ordered pairs of actions

Page 17

Forms of knowledge representation

Symbol sets & manipulation

Forms of knowledge representation (KRs)

• Maps, diagrams, object models, flow charts

Central to expertise

Mediated cognition

Distributed cognition

Nexus between info-processing & sociocultural perspectives

Page 18

Forms of knowledge representation

Some forms of knowledge representation for designing and using assessments:

Measurement models & representations

Argument structures

Evidence-centered design structures

Design patterns, templates, object models

IMS/QTI standards

Page 19

Three basic models that embody the assessment argument

Forms of knowledge representation

Page 20

Forms of knowledge representation

Measurement models: multivariate models for different aspects of knowledge / skill / propensities (MRCMLM)

Integration of statistical inference with task design (Tatsuoka, Embretson): Cognitive diagnosis, mixed strategies, multilevel models

Conditional dependence (re interaction)

Re-interpretation of variables (propensities to act in situations with certain features; rater models)
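As a minimal sketch of the item response models this slide gestures at (the Rasch model, plus an invented multidimensional logit meant only to echo multidimensional models like the MRCMLM, not reproduce them), the probability of a correct response can be written as a logistic function of proficiency minus difficulty:

```python
import math

def rasch_p(theta: float, b: float) -> float:
    """Rasch model: probability of a correct response for a person
    with proficiency theta on an item with difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def multidim_p(thetas, weights, b) -> float:
    """Toy multidimensional extension (illustrative only): the logit is a
    weighted sum of several proficiency dimensions minus item difficulty."""
    logit = sum(w * t for w, t in zip(weights, thetas)) - b
    return 1.0 / (1.0 + math.exp(-logit))

# A person whose proficiency equals the item's difficulty
# answers correctly half the time.
print(rasch_p(0.0, 0.0))                          # → 0.5
print(multidim_p([1.0, -0.5], [0.7, 0.3], 0.0))   # two skill dimensions
```

The multivariate form is what lets a single task contribute evidence about several distinct aspects of knowledge or skill at once.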

Page 21

Example: HYDRIVE

Student-model variables in HYDRIVE

• Motivated by cognitive task analysis

• Scope shaped by purpose

• Grain-size determined by instructional options

A Bayes net fragment (figure; node labels): Overall Proficiency, Procedural Knowledge, Power System, System Knowledge, Strategic Knowledge, Use of Gauges, Space Splitting, Electrical Tests, Serial Elimination, Landing Gear Knowledge, Canopy Knowledge, Electronics Knowledge, Hydraulics Knowledge, Mechanical Knowledge

Page 22

HYDRIVE, continued

A Bayes Net Measurement Model, docked with Student Model

(Figure: a measurement-model fragment for the situation "Canopy Situation -- no split possible," docked with student-model nodes Use of Gauges, Serial Elimination, Canopy Knowledge, Hydraulics Knowledge, and Mechanical Knowledge; fragments are drawn from a library of measurement-model fragments.)
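A minimal sketch of the docking idea, with invented numbers (these are not HYDRIVE's actual probabilities, and the real system uses richer multi-node Bayes nets): each measurement-model fragment supplies conditional probabilities of an observed troubleshooting action given a skill state, and Bayes' rule updates the student-model variable as each piece of evidence arrives.

```python
def update(prior, likelihoods, observation):
    """One Bayesian update of a binary skill variable.
    prior: {'expert': p, 'novice': 1 - p}
    likelihoods: {state: {observation: P(observation | state)}}
    Returns the posterior over states after seeing `observation`."""
    unnorm = {s: prior[s] * likelihoods[s][observation] for s in prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

# Hypothetical "Canopy Knowledge" node and two evidence fragments,
# one per troubleshooting situation (numbers invented for illustration).
prior = {"expert": 0.5, "novice": 0.5}
frag1 = {"expert": {"serial_elim": 0.8, "other": 0.2},
         "novice": {"serial_elim": 0.3, "other": 0.7}}
frag2 = {"expert": {"uses_gauges": 0.7, "other": 0.3},
         "novice": {"uses_gauges": 0.4, "other": 0.6}}

post = update(prior, frag1, "serial_elim")   # dock fragment 1
post = update(post, frag2, "uses_gauges")    # dock fragment 2
print(round(post["expert"], 3))              # → 0.824
```

Docking a fragment, absorbing its evidence, and discarding it leaves the updated student model ready for the next situation, which is what makes a reusable library of fragments workable.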

Page 23

Conclusion

Assessment is a particular kind of narrative:

• An evidentiary argument about aspects of what students know and can do, based on a handful of particular things they have said, done, or made.

Assessment integrates perceiving, understanding, and acting.

Assessment forms both enable and constrain thinking about students.

Page 24

Conclusion

Cognitive psychology helps us understand what to make inferences about, what we need to see, and what situations can provide us with clues.

• Conceiving targets of assessment

• Explicating and improving the design and use of assessments