Innovations in Higher Education Learning Bror Saxberg CLO, Kaplan, Inc. February 8, 2012

BISG's MIP for Higher Ed 2012 -- SAXBERG



What’s new for learning?

• Overview
• Cognitive Task Analysis
• Kaplan Way – Kaplan University Course Redesigns
• Q and A

Bror Saxberg, Chief Learning Officer, Kaplan, Inc.

• Integrating the design, building, monitoring, and improvement of learning environments; individualizing learning experiences at scale; and, ultimately, driving greater student career success
• Former CLO for K12, Inc. – structured use of technology, cognitive science, and online and offline materials for 1,700 teachers and 55,000 students
• Former Publisher and General Manager for DK Multimedia, Inc.
• Management consultant with McKinsey & Company
• Education:
  • Ph.D. in Electrical Engineering and Computer Science from MIT
  • M.D. from Harvard Medical School
  • M.A. in Electrical Engineering and Computer Science from MIT
  • M.A. in Mathematics from Oxford University
  • B.S. in Electrical Engineering and B.S. with Honors in Mathematics from the University of Washington

What Our Students Told Us They Want

Brand promise, brand pillars, and pillar definitions:

• We strive to make education as personalized to you as possible – tailoring our courses around your individual needs.
• We are dedicated to getting you the results that matter in the time that matters.
• We move quickly with constant innovation to better meet your needs.
• We are here to help you achieve success at critical milestones along your educational journey.

To respond, consider structuring key initiatives to take advantage of what's known about learning – and data: rapidly test and scale learning innovations.

What’s new for learning?

• Overview
• Cognitive Task Analysis
• Kaplan Way – Kaplan University Course Redesigns
• Q and A

Employers actually expect job applicants to lack the occupational/technical skills required to do the job…

• Slightly over half of all respondents (52.8%) expected that job applicants would lack occupational skills

• In healthcare, where occupational certifications and licensures are required, over 68% of respondents expect that job applicants would lack occupational skills

Do you expect job applicants to be lacking specific occupational skills or technical skills?

March 2011 Workforce Connections, Inc. survey of employers in western Wisconsin. Over 400 employers from all 8 counties responded to the survey. All sizes of businesses were represented, with the majority of responses coming from businesses with fewer than 50 employees.

Cognitive Task Analysis (CTA) may provide answers

• CTA is an interview strategy for capturing how highly successful experts perform complex tasks in a variety of settings

• Goal is to develop authentic demonstration and practice opportunities for how to perform at expert levels

• Experts are interviewed who 1) have recent (past 2–3 months) experience, 2) are consistently successful, and 3) are NOT trainers

• Interviews are done with 3–4 experts to unpack their strategies; these are merged to make an efficient approach suitable for training

• A range of problem examples or performance scenarios are collected from the experts for use in instruction as well
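The merging step described above can be sketched in code. Everything here is invented for illustration – the expert task lists, the task names, and the two-expert agreement threshold are assumptions, not Kaplan's actual CTA procedure:

```python
from collections import Counter

# Hypothetical task lists captured from three expert interviews.
expert_strategies = [
    ["verify patient ID", "take vital signs", "document in chart"],
    ["verify patient ID", "take vital signs", "flag abnormal readings"],
    ["take vital signs", "verify patient ID", "document in chart"],
]

def merge_strategies(strategies, min_agreement=2):
    """Keep steps mentioned by at least min_agreement experts, ordered by consensus."""
    counts = Counter(step for strategy in strategies for step in strategy)
    return [step for step, n in counts.most_common() if n >= min_agreement]

print(merge_strategies(expert_strategies))
# ['verify patient ID', 'take vital signs', 'document in chart']
```

Steps only one expert mentions ("flag abnormal readings") drop out, leaving a consensus sequence suitable as a starting point for training design.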

Medical Assistant current course content:

[Table: courses such as Pharmacology and Diseases – human body, mapped against content areas; X = substantial content, x = ancillary content]

MA CTA: Identifies key tasks/skills performed by experts (original content → new focus)

• Tie to domain tasks as identified by experts

MA Program: Skills addressed in new sequence (new focus)

• Tie to domain tasks as identified by experts

MA Program: Skills addressed in new sequence

• Tie to domain tasks as identified by experts
• Repeated use of skills across courses

B: Begin; A: Advanced; R: Reinforce

MA Program: New courses include previous content

• Tie to domain tasks as identified by experts
• Repeated use of skills across courses
• Original concepts spread across task instruction, not confined to courses
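The B/A/R sequencing idea above can be sketched as a skill-by-course map. Skill names, course codes, and the tagging are all invented for illustration, not the actual MA curriculum:

```python
# Each skill is tagged where it appears: B (Begin), A (Advanced), R (Reinforce).
skill_map = {
    "vital signs":         {"MA101": "B", "MA150": "A", "MA201": "R"},
    "medical terminology": {"MA101": "B", "MA201": "R"},
    "patient charting":    {"MA150": "B", "MA201": "A"},
}

def reinforced_skills(skill_map):
    """Skills that are revisited (Advanced or Reinforce) after being introduced."""
    return [skill for skill, courses in skill_map.items()
            if any(tag in ("A", "R") for tag in courses.values())]

print(reinforced_skills(skill_map))
```

A map like this makes it easy to audit that every skill introduced in one course is deliberately reused later, rather than confined to a single course.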

What’s new for learning?

• Overview
• Cognitive Task Analysis
• Kaplan Way – Kaplan University Course Redesigns
• Q and A

A lot is known about what drives learning now

Instructional Events (in the learning environment) shape Learning Events (hidden – inside students' minds), which show up in Student Performance (observable – indicates knowledge). Three elements matter:

Knowledge
• Instructional events: Explicit – information, explanation, examples, demos; Implicit – practice tasks/activities (prompts and response); diagnosis and feedback
• Learning events: Explicit/declarative/conceptual ("what"); implicit/procedural ("how"); knowledge components (procedures + facts, concepts, principles, processes)
• Student performance: Response accuracy/errors; response fluency/speed; number of trials; amount of assistance (hints); reasoning

Motivation
• Instructional events: Orientation/inoculation; monitoring; diagnosis and treatment – persuasion, modeling, dissonance
• Learning events: Value beliefs; self-efficacy beliefs; attribution beliefs; mood/emotion
• Student performance: Behavior related to starting, persisting, and mental effort; self-reported beliefs

Metacognition
• Instructional events: Structure; guidance
• Learning events: Planning, monitoring; selecting, connecting
• Student performance: Amount of guidance required/requested

See: Koedinger, K.R., Corbett, A.T., and Perfetti, C. (2010). The Knowledge-Learning-Instruction (KLI) Framework: Toward Bridging the Science-Practice Chasm to Enhance Robust Student Learning.

Task-centered instruction

• Move from simple to increasingly difficult tasks – NOT "PBL" sink or swim
• Teach everything needed for each task
• Fade coaching/support over time
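A minimal sketch of the fading idea: tasks ordered from simple to hard, with support withdrawn as the learner progresses. The tasks, thresholds, and support labels are all illustrative assumptions, not a prescribed schedule:

```python
# Hypothetical task sequence, simplest first.
tasks = ["take vital signs", "record patient history", "assist with minor procedure"]

def support_level(task_index: int, total: int) -> str:
    """Fade from worked examples, to hints on request, to independent performance."""
    progress = task_index / max(total - 1, 1)
    if progress < 0.34:
        return "worked example + step-by-step coaching"
    elif progress < 0.67:
        return "practice with hints on request"
    return "independent performance, feedback afterward"

for i, task in enumerate(tasks):
    print(f"{i + 1}. {task}: {support_level(i, len(tasks))}")
```

Each task still gets everything the learner needs; only the amount of scaffolding changes.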

ID can change instructional outcomes at scale

• Multimedia (effect size 1.5 s.d.): Use relevant graphics and text to communicate content
• Contiguity (1.1): Integrate text near the graphics on screen – avoid covering or separating integrated information
• Coherence (1.3): Avoid irrelevant graphics, stories, videos, media, and lengthy text
• Modality (1.0): Include audio narration where possible to explain graphic presentation
• Redundancy (0.7): Do not present words as both on-screen text and narration when graphics are present
• Personalization (1.3): Script audio in a conversational style using first and second person
• Segmenting (1.0): Break content into small topic chunks that can be accessed at the learner's preferred rate
• Pre-training (1.3): Teach important concepts and facts prior to procedures or processes
• Etc. (??): Worked examples, self-explanation questions, varied-context examples and comparisons, etc.

Source: E-learning and the Science of Instruction, Clark and Mayer, 2nd ed., 2008
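The effect sizes in the table are in standard-deviation units (Cohen's d: difference of group means divided by the pooled standard deviation). A minimal sketch with made-up scores:

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Difference of means divided by the pooled standard deviation."""
    n_t, n_c = len(treatment), len(control)
    pooled_var = ((n_t - 1) * stdev(treatment) ** 2 +
                  (n_c - 1) * stdev(control) ** 2) / (n_t + n_c - 2)
    return (mean(treatment) - mean(control)) / pooled_var ** 0.5

treatment = [78, 85, 90, 82, 88]   # hypothetical post-test scores
control   = [70, 75, 80, 72, 78]
print(round(cohens_d(treatment, control), 2))
```

An effect size of 1.0 or more, as several principles in the table report, is very large by the standards of educational interventions.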

Impact is not small! A gain of 1 s.d. moves a student from the 50th percentile to the 84th.
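The 50th-to-84th percentile claim follows from the standard normal distribution: Φ(1) ≈ 0.84. A stdlib-only check, assuming normally distributed scores:

```python
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    """P(Z <= z) for a standard normal variable."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# A student at the mean (50th percentile) who improves by 1 s.d.:
print(round(normal_cdf(1.0) * 100, 1))  # ~84.1
```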

Instructional Design process should follow evidence

Overviews → Information → Examples → Practice → Assessment → Learning Outcomes, with Guidance (for motivation and metacognition) throughout.

The evidence about learning points to a sequence of activities that optimizes learning. Design goes one way, delivery the other.

In 2011 KU and KLI launched a course redesign pilot

1. Apply "Kaplan Way" evidence-based instructional design to several Kaplan University courses (high volume, in need of improvement).
2. Deliver the courses in a simplified eCollege template.
3. Develop a replicable/scalable process, templates, and technology.
4. Evaluate the impact on student outcomes.

• Pilot 1: August 3 – October 12
• Pilot 2: October 19 – December 28

The student experience: before and after

Before – Read, Write, Discuss:
• Outcomes and content sometimes loosely aligned
• Limited demonstrations, worked examples, and practice
• General assessment rubrics
• Reliance on discussion boards
• Limited support for motivation

After – Prepare, Practice, Perform:
• Outcomes and content precisely aligned
• Frequent demonstrations, worked examples, practice, feedback
• Detailed scoring guides
• Evidence-based support for motivation
• Instructor coaching

Results: Significant learning and business impact from KU course redesigns – and more to come

Learning impact
• Higher instructor satisfaction: instructors see benefits of design; instructor materials/support for facilitator role
• Lower student satisfaction: courses more demanding and time-consuming
• Higher retention, fewer withdrawals: support for at-risk students to stay engaged
• More time-on-task: students in pilot versions of courses spend more time online in course
• Better learning outcomes: pilot students earn higher CLA scores and higher scores on common assessments

Financial impact
• 3% retention gain – significant benefit to learners and university
• 14% gain in "student success"

Pilots 1 and 2 combined analysis: group differences in student "success"

"Success" = CLA average >= 4 AND passed course AND retained to next term. Controlling for differences in course, students, instructors, and seasonality, the pilot courses delivered a 14-percentage-point difference in student success rate (42% vs. 28%) – a 50% increase over control courses.

Statistical significance – least squares means for effect grp; Pr > |t| for H0: LSMean(i) = LSMean(j); dependent variable: success

i/j (n)       1        2        3        4
1 (23,748)    –      0.9795   <.0001   0.914
2 (6,121)   0.9795     –      <.0001   0.9584
3 (508)     <.0001   <.0001     –      <.0001
4 (582)     0.914    0.9584   <.0001     –
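The success definition and the 14-point/50% arithmetic can be checked in a few lines. The 42% and 28% rates come from the slides; the student records below are made up:

```python
def is_success(cla_avg: float, passed: bool, retained: bool) -> bool:
    """'Success' = CLA average >= 4 AND passed course AND retained to next term."""
    return cla_avg >= 4 and passed and retained

pilot_rate, control_rate = 0.42, 0.28
absolute_gain = pilot_rate - control_rate      # 14 percentage points
relative_gain = absolute_gain / control_rate   # 50% increase over control

print(f"{absolute_gain:.0%} absolute, {relative_gain:.0%} relative")
print(is_success(cla_avg=4.2, passed=True, retained=True))   # True
print(is_success(cla_avg=3.5, passed=True, retained=True))   # False
```

The distinction matters when reading the slide: 14% is the absolute gap in success rates, while 50% is that gap relative to the 28% control baseline.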

Results: Significant learning and business impact from KU course redesigns – and more to come

Learning impact
• Higher instructor satisfaction: instructors see benefits of design; instructor materials/support for facilitator role
• Lower student satisfaction: courses more demanding and time-consuming
• Higher retention, fewer withdrawals: support for at-risk students to stay engaged
• More time-on-task: students in pilot versions of courses spend more time online in course
• Better learning outcomes: pilot students earn higher CLA scores and higher scores on common assessments

Financial impact
• 3% retention gain – KU OIE's team estimates a return of $1.5M in OI annually from an investment of $375K
• 14% gain in "student success" – "Success" = CLA average > 4, passed, and retained; analysis controlled for variations in course, student, instructor, and seasonality; translates to a 50% increase over control courses
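A back-of-envelope check of the financial figures quoted above – $1.5M in annual operating income from a $375K investment is a 4x annual return. Both figures are from the slide; the ratio is the only thing computed here:

```python
annual_return = 1_500_000   # estimated annual OI, per KU OIE's team
investment = 375_000        # redesign investment
roi_multiple = annual_return / investment
print(roi_multiple)  # 4.0
```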

Student feedback on the benefits of extra practice:

“Something I found to be interesting was the degree of understanding between me and another individual that wasn’t in this class. A girl I had met in a previous term that has a similar degree plan but ended up in a regular medical terminology course, still we would discuss the differences and similarities between are assigned classes. During our unit 8 test she called me hysterical about all the different elements of the final tests and couldn’t seem to grasp the concept of the 1st part of the test i.e., analysis diagram, creating new terms from word roots etc. I was mystified that something that had become 2nd nature to me mainly due to the time spent every week filling out the Analysis Tables was so difficult for her to comprehend. It was at that point I realized all the griping I had done was actually the reason my level of understanding is more evolved than somebody who never experienced it.”

What’s new for learning?

• Overview
• Cognitive Task Analysis
• Kaplan Way – Kaplan University Course Redesigns
• Q and A

Appendix: Initial readings for “learning engineers”

• Why Students Don’t Like School, Daniel Willingham – highly readable! ;-)
• Talent is Overrated, Geoffrey Colvin – highly readable! ;-)
• E-Learning and the Science of Instruction, Clark and Mayer, 3rd ed.
• “First Principles of Learning,” Merrill, D., in Reigeluth, C. M. & Carr, A. (Eds.), Instructional Design Theories and Models III, 2009
• How People Learn, John Bransford et al., eds.
• “The Implications of Research on Expertise for Curriculum and Pedagogy,” David Feldon, Educational Psychology Review (2007) 19:91–110
• “Cognitive Task Analysis,” Clark, R.E., Feldon, D., van Merrienboer, J., Yates, K., and Early, S., in Spector, J.M., Merrill, M.D., van Merrienboer, J.J.G., & Driscoll, M.P. (Eds.), Handbook of Research on Educational Communications and Technology (3rd ed., 2007), Lawrence Erlbaum Associates