Carl Wieman, Assoc. Director for Science, White House Office of Science and Technology Policy
Measuring Impact in STEM Ed: Are they thinking like experts?


Page 1:

Carl Wieman
Assoc. Director for Science
White House Office of Science and Technology Policy

Measuring Impact in STEM Ed: Are they thinking like experts?

Page 2:

The White House perspective

“Maintaining our leadership in research and technology is crucial to America’s success. But if we want to win the future – if we want innovation to produce jobs in America and not overseas – then we also have to win the race to educate our kids.”  B. Obama

Major Policy questions

What is effective teaching, particularly in STEM?
Can it be developed? How?
How can we achieve better learning? (evidence!)

Page 3:

Switching hats: to science education researcher

What is the broad goal of your project? → how to measure

What is the learning that matters to you? (30 s)

A bunch of facts & solution techniques? May be useful, but students use only a tiny fraction of what they learn in school, and in their careers they need vastly more than they learned in school.

Want them to understand _____! [DNA, relativity, pH…]

What does "understand" mean? How to measure whether it is achieved?

Think about and use ____ like a scientist/engineer.

Page 4:

“Think like a scientist/engineer.”

I. What does that mean? Expert thinking (cog. psych.)

II. Development of expert thinking

III. More details on expert thinking

IV. Measuring -- developing tools

Page 5:

[Diagram: cognitive psychology, brain research, and science classroom studies converge]

Major advances in the past 1-2 decades → a consistent picture of achieving learning.

→ Principles of learning help design experiments and make sense of results. Understand both what and why.

Page 6:

Expert competence research* (historians, scientists, chess players, doctors, ...)

Expert competence =
• factual knowledge
• mental organizational framework (patterns, relationships, scientific concepts) → retrieval and application
• ability to monitor own thinking and learning ("Do I understand this? How can I check?")

New ways of thinking -- require MANY hours of intense practice to develop

*Cambridge Handbook of Expertise and Expert Performance

Page 7:

Significantly changing the brain, not just adding bits of knowledge.

Building proteins, growing neurons, enhancing neuron connections, ...

Brief digression on research on the development of expertise.

Page 8:

Essential element of developing expertise*: "deliberate practice" (A. Ericsson)
• task at a challenging but achievable level that requires explicit expert-like thinking; intensely engaged
• reflection and guidance on the result
• repeat & repeat & repeat, ...

10,000 hours later -- very high-level expertise. A different brain, developed with "exercise."

* accurate, readable summary in "Talent Is Overrated" by Colvin

CEW interpretation -- "formative assessment," "constructivism," and "self-regulated learning" are all contained in the "deliberate practice" framework.

Page 9:

“Think like a scientist/engineer.”

I. What does that mean? Expert thinking (cog. psych.)

II. Development of expert thinking

III. More details on expert thinking

IV. Measuring -- developing tools

Page 10:

How experts solve a problem -- "cognitive task analysis" (and how it differs from non-experts)

"How Scientists Think in the Real World: Implications for Science Education," K. Dunbar, Journal of Applied Developmental Psychology 21(1): 49–58 (2000)

• concepts and mental models (analogies)
• testing these and recognizing when they apply or not
• distinguishing relevant & irrelevant information
• established criteria for checking the suitability of a solution method or final answer ("sense-making and self-checking")

features in your discipline? (1 min)

Page 11:

Lots of complex pattern recognition

What features and relationships are important? Which are not? ("surface features" vs. "underlying structure")

Often heard -- "Novice problem solvers just do pattern matching; experts use more sophisticated concept-based strategies."

CEW unproven claim (not official WH position) -- it is all pattern matching; experts just look for and recognize different patterns.

Page 12:

Non-cognitive elements of thinking like a scientist: perceptions/attitudes/beliefs

(important, but changed more quickly; an essential precursor to "deliberate practice")

Page 13:

Perceptions about science (& how it is learned and used)*

Novice:
• Content: isolated pieces of information to be memorized.
• Handed down by an authority. Unrelated to the world.
• Problem solving: simple matching to memorized recipes.

Expert:
• Content: coherent structure of concepts.
• Describes nature, established by experiment.
• Problem solving: systematic concept-based strategies, widely applicable.

*adapted from D. Hammer

Consistent views across scientists in a discipline (physics, chem, bio).

Page 14:

[Figure: Student perceptions/beliefs. Distribution of CLASS overall score (measured at start of 1st term of college physics), from novice to expert, for All Students (N=2800), Intended Majors (N=180), and Actual Majors (N=52). Kathy Perkins, M. Gratny]

Page 15:

[Figure: Student beliefs. Distribution of CLASS overall score (measured at start of 1st term of college physics), from novice to expert, comparing actual majors who were originally intended physics majors with actual majors who were NOT originally intended physics majors.]

Page 16:

[Figure: Course grade in Phys I or Phys II (beliefs a more important factor than grades). Distribution over grades (DFW, C, B, A) in the 1st term of college physics for All Students (avg. 2.7/4), Intended Majors (avg. 2.7/4), and Actual Majors (avg. 3.0/4).]

Page 17:

Creating tests to measure expert thinking as different from non-expert (technical details)
A. Cognitive

Things to look for:
• What mental models?
• How are decisions made?
• What resources are called upon (or not)?

Must understand student thinking! No substitute for interviews. Cognitive: think-aloud solution to a task. Look for consistent features that appear.

Code the interviews, with independent coders, to make the coding objective. (BEWARE CONFIRMATION BIAS!)
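One way to check that independent coding really is objective is an inter-rater agreement statistic; the talk does not name one, but Cohen's kappa is a common choice. A minimal Python sketch, with hypothetical codes assigned by two coders to eight interview segments:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Agreement between two coders, corrected for chance agreement."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement: probability both coders assign the same label,
    # given each coder's own label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    labels = set(codes_a) | set(codes_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical codes for 8 interview segments
coder1 = ["model", "recipe", "model", "check", "recipe", "model", "check", "model"]
coder2 = ["model", "recipe", "recipe", "check", "recipe", "model", "model", "model"]
print(f"kappa = {cohens_kappa(coder1, coder2):.2f}")  # prints kappa = 0.60
```

Raw percent agreement here is 75%, but kappa drops that to 0.60 after correcting for the agreement two coders would reach by chance alone, which is why it is the more honest check.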

Page 18:

Creating tests to measure expert thinking as different from non-expert

Example -- testing use of an expert mental model

"Troubleshooting": Your laser suddenly put out only half as much light as it had before. What change may have produced this result?

"Redesign": What are all the ways you could double the power coming out of your laser?

You would like to … (e.g. build a bridge across this river). What information do you need to solve this problem?

Page 19:

Steps in test development

1. Interview faculty.
2. Interview students -- understand student thinking.
3. Open-ended survey questions to probe.
4. Create a multiple-choice test -- answer choices reflect actual student thinking.
5. Validation interviews on the test -- experts and sample population.
6. Administer to classes -- run statistical tests on results.

Often iterate and/or skip steps, then refine. "Reasonable" data is much better than no data!
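The slides do not say which statistical tests; classical item analysis (per-item difficulty plus a KR-20 internal-consistency estimate) is one conventional option. A minimal sketch with made-up score data:

```python
# Classical item analysis for a multiple-choice concept test.
# scores[s][i] = 1 if student s answered item i correctly, else 0.
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

n_students, n_items = len(scores), len(scores[0])
totals = [sum(row) for row in scores]

# Item difficulty: fraction of students answering each item correctly.
difficulty = [sum(row[i] for row in scores) / n_students for i in range(n_items)]

# KR-20 reliability: do the items hang together as one instrument?
p = difficulty
q = [1 - x for x in p]
mean_t = sum(totals) / n_students
var_t = sum((t - mean_t) ** 2 for t in totals) / n_students
kr20 = (n_items / (n_items - 1)) * (1 - sum(pi * qi for pi, qi in zip(p, q)) / var_t)

print("difficulty:", [round(d, 2) for d in difficulty])
print("KR-20 reliability:", round(kr20, 2))
```

Items everyone gets right (difficulty 1.0, like the last item here) carry no information about expert vs. novice thinking and are candidates for replacement in the next iteration.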

Page 20:

Measuring perceptions: same basic approach. Interview students, capture perceptions in their own words.

Then survey for level of agreement.

~40 statements, rated from strongly agree to strongly disagree, e.g.:

Understanding physics basically means being able to recall something you've read or been shown.

I do not expect physics equations to help my understanding of the ideas; they are just for doing calculations.
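Surveys of this kind (e.g. CLASS) are typically scored as the percentage of statements on which a student sides with the expert consensus. The exact rubric is not given in the talk, so the scoring rule, statement names, and responses below are all assumptions for illustration:

```python
# Score a Likert perception survey against expert consensus.
# Responses: 1 = strongly disagree ... 5 = strongly agree.
# expert_view: +1 if experts agree with the statement, -1 if they disagree.
expert_view = {"recall": -1, "equations": -1, "real_world": +1}

student = {"recall": 4, "equations": 2, "real_world": 5}

def expert_like_score(responses, expert_view):
    """Percent of statements where the student sides with the experts.
    Neutral responses (3) count as neither agreement nor disagreement."""
    favorable = 0
    scored = 0
    for stmt, resp in responses.items():
        if resp == 3:
            continue  # neutral: excluded from the percentage
        scored += 1
        student_agrees = +1 if resp > 3 else -1
        if student_agrees == expert_view[stmt]:
            favorable += 1
    return 100 * favorable / scored if scored else None

print(f"{expert_like_score(student, expert_view):.0f}% expert-like")  # 67%
```

This student agrees that understanding physics means recall, which experts reject, so that statement counts against the expert-like score; the other two responses match the expert view.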

Page 21:

Conclusion

Important educational goal: "thinking like a scientist"

Requires careful analysis to make it explicit, and to distinguish it from the thinking of non-experts.

Straightforward process to create tests that measure it. More sensitive and meaningful than typical exams.

W. Adams and C. Wieman, "Development and validation of instruments to measure learning of expert-like thinking," Int. J. Sci. Ed. (in press). Covers the last part of the talk and technical details.

Page 22:

Tips for developing assessment tools.

1. Interview the largest possible range of people. Patterns and expert-novice differences become more obvious.

2. 100+ student classes at a large university don't vary year-to-year. A good way to get test-retest reliability and find out if you can measure changes.

3. Best questions: a) measure an important aspect of student thinking and learning; b) measure an aspect that instructors care about & are shocked by poor results on.

4. Hard and not so useful to measure expert-like thinking on everything. Sample as a proxy.

Page 23:

Key elements of a good concept inventory:
• created by physicists; key concepts where student failure is shocking (not probed by standard exams)
• easy to administer pre & post → learning from this course
• set of hard-to-learn topics (not everything)
• proxy for broader learning (mastery & application of concepts)
• suitable for use with a wide range of institutions and students

Page 24:

How to administer?

Attitude surveys -- online, 1st and last week of class; small bonus mark for completion. 80-98% completion.

Concept inventories -- Pre: in class, 1st week, on paper with scantron. Students do not keep the test.

Post: in class, last week ("guide to in-class review and study for the final exam"). No effect on course mark. Occasional question on the final. 90+% completion.

Page 25:

Summary:

Data to drive educational improvement

Requirements:
• measure value added (pre-post)
• easy to use (more important than perfection)
• test "expert thinking" of obvious value to the instructor
• validated (measures what is claimed)
• need many such instruments to use across the curriculum (collaborate)

Instruments & research papers:
class.colorado.edu
CWSEI.ubc.ca

Page 26:

Measuring conceptual mastery

• Force Concept Inventory -- basic concepts of force and motion, 1st semester university physics. Simple real-world applications.

Ask at the start and end of the semester -- what % was learned? (100's of courses)

[Figure: fraction of unknown basic concepts learned, average per course: 16 traditional lecture courses vs. improved methods]

On average, students learn <30% of the concepts they did not already know. Lecturer quality, class size, institution, ... doesn't matter! Similar data for conceptual learning in other courses.

R. Hake, "…A six-thousand-student survey…" AJP 66, 64-74 ('98).
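The "% learned of what was not already known" is the average normalized gain from the Hake reference above:

\[
\langle g \rangle = \frac{\langle \text{post} \rangle - \langle \text{pre} \rangle}{100\% - \langle \text{pre} \rangle}
\]

For example, a class averaging 45% on the pre-test and 60% on the post-test gives \( \langle g \rangle = 15/55 \approx 0.27 \): under 30% of the initially unknown concepts were learned.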

Page 27:

In nearly all intro classes, the average shifts to be 5-10% less like a scientist.

Explicit connection with real life → ~0% change

+ Emphasize process (modeling) → +10%!! (new)

Page 28:

What every teacher should know
Components of effective teaching/learning apply to all levels, all settings

1. Motivation (lots of research)

2. Connect with prior thinking

3. Apply what is known about memory:
a. short-term limitations (relevant to you)
b. achieving long-term retention

retrieval and application-- repeated & spaced in time

*4. Explicit authentic practice of expert thinking. Extended & strenuous

basic cognitive & emotional psychology, diversity

Page 29:

Measuring student (dis)engagement (Erin Lane). Watch a random sample group (10-15 students); check against a list of disengagement behaviors every 2 min.

[Figure: example data from an earth science course -- disengagement vs. time (minutes)]
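A minimal sketch of how such observations might be tallied into a disengagement-vs.-time curve; the protocol is Lane's, but the data layout, sample size, and numbers below are hypothetical:

```python
# Tally classroom disengagement observations.
# Every 2 minutes, an observer records how many of the sampled
# students show any behavior on the disengagement checklist.
observations = [
    (0, 1), (2, 1), (4, 2), (6, 5), (8, 7),   # (minute, # disengaged)
    (10, 8), (12, 3), (14, 2),                # drop after an activity starts
]

SAMPLE_SIZE = 12  # hypothetical size of the watched group

for minute, n_disengaged in observations:
    frac = n_disengaged / SAMPLE_SIZE
    bar = "#" * round(frac * 20)  # crude text plot of the time series
    print(f"t={minute:2d} min  {frac:5.0%}  {bar}")
```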

Page 30:

Design principles for classroom instruction

1. Move simple information transfer out of class. Save class time for active thinking and feedback.

2. "Cognitive task analysis" -- how does an expert think about these problems?
3. Fill class time with problems and questions that call for explicit expert thinking, address novice difficulties, are challenging but doable, and are motivating.
4. Frequent, specific feedback to guide thinking.

DP (deliberate practice)