
Assessment of Knowledge and Performance

John Littlefield, PhD

University of Texas Health Science Center at San Antonio

Goals: Assessment of Knowledge and Performance

1. Clarify 2 distinct uses for assessments of knowledge and performance

2. Define 3 aspects of validity for all knowledge and performance assessment methods

3. Compare and contrast 3 techniques for assessing clinical knowledge and performance

4. Identify poorly written multiple choice test items and write a key features test item

5. Describe 3 options for scoring OSCE performance

6. Describe three elements of a clinical performance assessment system

7. Critique a clinical performance assessment system that you use

Agenda: Assessment of Knowledge and Performance

1. Exercise: Warm-up for assessing clinical knowledge and performance

2. Presentation: Quality assurance when assessing clinical knowledge and performance

3. Exercise: Take then critique a multiple choice test

4. Presentation: Key features test items

5. Exercise: Write several key features test items

6. Presentation: Widening the lens on SP assessment

7. Exercise: Strengths & weaknesses of a clinical performance assessment system that you use

8. Presentation: Improving clinical performance assessment systems

9. Exercise: Critique your clinical performance assessment system

Recall a student/resident whose clinical performance made you uneasy

1. Was the student/resident aware of your concern? Yes No

2. What action did you take?

a. Talk with faculty colleagues about your concerns: Yes No

b. Write a candid performance assessment and send it to the clerkship/residency director: Yes No

3. Did any administrative action occur related to your concern? Yes No

4. Do you think the performance assessments in your clerkship/residency files reflect candid faculty appraisals of performance? Yes No

What concerns do you have about clinical knowledge and performance assessment?

1. Smart but not professional

2. Doesn't have technical skills

3. Heterogeneity of evaluator skills (fairness / accuracy)

4. How to motivate evaluators

5. Options for remediation

6. How to validate the exam

7. Are oral exams really worth it?

8. How many evaluations are needed before making a decision?

Uses for Assessment: Formative vs. Summative

Aspect                      Formative                   Summative

Purpose                     Feedback for Learning       Certification/Grading

Breadth of Scope            Narrow Focus on             Broad Focus on
                            Specific Objectives         General Goals

Scoring                     Explicit Feedback           Overall Performance

Learner Affective Response  Little Anxiety              Moderate to High Anxiety

Target Audience             Learner                     Society

Validity of Knowledge & Performance Assessments *

1. Content: Does the assessment method measure a representative cross section of student/resident competencies?

2. Reliability of scores: Does the student/resident perform at about the same level across 5 to 7 different patients / case problems? Does the student receive similar ratings from different faculty? (See the sketch below.)

3. Latent process: Does the context surrounding the assessment evoke the domain of cognitive processing used by a clinician?

* Lissitz RW & Samuelsen K. A suggested change in terminology and emphasis regarding validity and education. Educational Researcher, 36(8), 437-48, 2007.
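As a concrete illustration of the reliability aspect, here is a minimal sketch (hypothetical data, not from the session) that computes Cronbach's alpha over one score per student per case problem; a higher alpha means students' scores are more consistent from case to case.

# Minimal sketch: is a student's performance consistent across case
# problems? Computes Cronbach's alpha; all data are hypothetical.

def cronbach_alpha(scores):
    """scores: one row per student, one column per case problem."""
    n_items = len(scores[0])

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    item_vars = sum(variance([row[j] for row in scores]) for j in range(n_items))
    total_var = variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - item_vars / total_var)

# Hypothetical scores for four students on five patient cases (0-100 scale).
scores = [
    [82, 78, 85, 80, 79],
    [65, 70, 58, 62, 66],
    [90, 88, 93, 91, 89],
    [71, 64, 75, 70, 68],
]
print(f"alpha = {cronbach_alpha(scores):.2f}")  # closer to 1 = more consistent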

Content of a Clinical Performance Assessment

1. Which clinical competencies are addressed by the performance assessment?

2. How thoroughly will you assess each competency?

3. How will you score performance?

Reliability of Physician Performance Scores on Multiple Patients

[Chart: performance scores (0-120 scale) for Patients 1 through 5, comparing an Ideal series with the Actual series.]

Latent Process Aspect of Validity: Four Levels of Performance Assessment *

Knows

(Examination – Multiple-choice)

Knows How

(Examination – Oral)

Shows How

(OSCE)

Does

(Global Rating)

* Miller, GE. Assessment of clinical skills/competence/performance, Academic Medicine, 65(9), supplement, 1990, S63-7

Compare and Contrast Three Assessment Techniques (Multiple choice exam, OSCE, Global ratings)

                                M.C.E.   OSCE   Global rtgs.

1. Content                      +++      ++     +

2. Reliability
   - 5 to 7 case problems       +++      ++     +
   - agreement among raters     +++      ++     +

3. Latent process               +        ++     +++

+ = adequate   ++ = good   +++ = excellent

Interim Summary of Session

Session thus far:

– Two uses of knowledge and performance assessments: Formative and Summative

– Validity of all knowledge and performance assessment techniques

– Compare and contrast 3 assessment techniques

Coming up:

– Take and critique a 14-item multiple choice exam

– Presentation on Key Features items

How are Multiple Choice Items Selected for an Exam?

Sample Exam Blueprint based on Clinical Problems

Patient Life Span    % Total Patients   % Men/Women   Number of Problems
Pregnancy/Infant     5                  50/50         4
Pediatrics           16                 53/47         6
Adolescence          16                 31/69         6
Adults               47                 34/66         18
Geriatrics           16                 45/46         6
Total                100                39/61         40

Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.
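A blueprint like the one above can be drafted mechanically from the target percentages. The sketch below is a hypothetical illustration using largest-remainder rounding; note that the sample blueprint's counts also reflect expert judgment (Pregnancy/Infant gets 4 problems, not the proportional 2), so a proportional allocation is only a starting point.

# Sketch: allocate exam problems to blueprint categories in proportion
# to % of total patients, using largest-remainder rounding so the
# counts sum exactly to the problem total.

def allocate(percentages, total_problems):
    raw = {k: v * total_problems / 100 for k, v in percentages.items()}
    counts = {k: int(v) for k, v in raw.items()}
    leftover = total_problems - sum(counts.values())
    # Hand the remaining problems to the largest fractional remainders.
    for k in sorted(raw, key=lambda k: raw[k] - counts[k], reverse=True)[:leftover]:
        counts[k] += 1
    return counts

percentages = {"Pregnancy/Infant": 5, "Pediatrics": 16, "Adolescence": 16,
               "Adults": 47, "Geriatrics": 16}
print(allocate(percentages, 40))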

Key Features of a Clinical Problem 1

Definition: Critical steps that must be taken to identify and manage a patient's problem. A key feature:

– focuses on a step in which examinees are likely to make an error

– is a difficult aspect of identifying and managing the problem

Example: For a pregnant woman experiencing third-trimester bleeding with no abdominal pain, the physician should:

– generate placenta previa as the leading diagnosis

– avoid performing a pelvic examination (may cause bleeding)

– avoid discharging her from the clinic or emergency room

– order coagulation tests and a cross-match

1. Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

Test Items based on a Clinical Problem and its Key Features

Jennifer, a 24-year-old woman, G3, P2, 31 weeks pregnant, comes to the emergency room at 8:00 PM complaining of bright red vaginal bleeding for the past two hours. The three sanitary napkins that she used were completely soaked. Her pregnancy has been uneventful, as were the previous ones. She has not had any contractions or abdominal pain. The fetus is moving as usual.

Her BP is 110/70 mm Hg, and her pulse is 92/min. The examination of the abdomen reveals a uterine height of 31 cm. with a soft and nontender uterus. The fetus is in a breech position and has a heart rate of 150/min. No active bleeding has occurred since she arrived 25 minutes ago.

1. What is your leading diagnosis at this time? List only one diagnosis. Write “normal” if you judge Jennifer’s situation to be within normal limits.

2. What steps would you include in your immediate assessment and management of this patient? List as many as are appropriate.

Scoring the Placenta Previa Clinical Problem

Key Feature 1: To receive one point, the examinee must list placenta previa or one of the following synonyms: marginal placenta or low placental insertion.

Key Features 2-4: The examinee receives 1/3 point for listing each of the following: (1) avoid performing a pelvic exam, (2) avoid discharging from clinic, (3) order coagulation tests and cross-match.

Total Score for Problem: Add the scores for items 1 and 2 and divide by 2 (range: 0 - 1). A scoring sketch follows below.
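These scoring rules translate directly into code. A minimal sketch, assuming a rater has already mapped each free-text answer to a canonical phrase; the synonym and step lists mirror the scoring key above, and everything else is hypothetical.

# Sketch: scoring the placenta previa key-features problem per the
# rules above: 1 point for the diagnosis (or a synonym), 1/3 point for
# each of three management steps, averaged over the 2 items.

DIAGNOSIS_SYNONYMS = {"placenta previa", "marginal placenta",
                      "low placental insertion"}
MANAGEMENT_STEPS = {"avoid pelvic exam", "avoid discharge",
                    "order coagulation tests and cross-match"}

def score_problem(diagnosis, steps):
    item1 = 1.0 if diagnosis.strip().lower() in DIAGNOSIS_SYNONYMS else 0.0
    item2 = sum(1 / 3 for s in steps if s.lower() in MANAGEMENT_STEPS)
    return (item1 + item2) / 2  # range: 0 to 1

print(score_problem("Placenta previa",
                    {"avoid pelvic exam",
                     "order coagulation tests and cross-match"}))
# (1 + 2/3) / 2 = 0.83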

Steps to Develop Key Features Problems 1

1. Assemble a problem-writing group

2. Select a problem and define its key features
   - Usually chosen from an exam blueprint
   - Think of several real cases of the problem in practice
   - What are the essential steps in resolving this problem (must be done)?
   - Typical decisions or actions: elicit hx., interpret symptoms, make dx.
   - Define qualifiers such as urgency or decision-making priority

3. Select a case scenario

4. Select question formats

5. Specify the number of required answers

6. Prepare the scoring key

7. Pilot test the problems

1. Farmer & Page. A practical guide to assessing clinical decision-making skills using the key features approach. Medical Education, 39:1188-94, 2005.

Interim Summary of Session

Session thus far:

– Two uses of knowledge and performance assessments: Formative and Summative

– Validity of all assessment techniques

– Compare and contrast three assessment techniques

– Take and critique a 14-item multiple choice exam

– Write a Key Features item

Coming up:

– Scoring performance on an SP exam

Schematic Diagram of a 9 Station OSCE

[Diagram: nine numbered stations arranged in a circuit; examinees enter at Start, rotate through stations 1-9, and exit at End.]
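One practical consequence of the circuit layout is scheduling: with as many examinees as stations, each examinee can start at a different station and rotate each round. A hypothetical round-robin sketch (not from the session):

# Sketch: round-robin rotation through an OSCE circuit. With as many
# examinees as stations, every station is occupied in every round and
# each examinee visits each station exactly once.

def osce_schedule(n_stations, examinees):
    """Return one dict per round mapping station number -> examinee."""
    assert len(examinees) == n_stations
    return [{s + 1: examinees[(s + r) % n_stations] for s in range(n_stations)}
            for r in range(n_stations)]

for i, rnd in enumerate(osce_schedule(3, ["A", "B", "C"]), start=1):
    print(f"Round {i}: {rnd}")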

OSCE Stations: Standardized Patient or Simulation

http://www.med.unc.edu/oed/sp/welcome.htm http://www.simulution.com/product

Scoring OSCE Performance

Traditional scoring of SP assessment focuses on numerical data, typically from checklists.

Checklist scoring may not accurately assess the clinical performance quality of residents and expert clinicians 1

Dimensions of the SP exam 2

– basic science knowledge (organize the information)

– physical exam skills (memory of routines)

– establishing a human connection

– role of the student (appear knowledgeable)

– existential dimension of the human encounter (balance one's own beliefs with the patient's)

Clinical competence is a mixture of knowledge and feeling, information processing and intuition.

1. Hodges et al. OSCE checklists do not capture increasing levels of expertise. Acad. Med. 74(10), 1999, 1129-34. 2. Rose & Wilkerson. Widening the lens on SP assessment: What the encounter can reveal about development of clinical competence. Acad. Med. 76(8), 2001, 856-59.
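One way to act on this critique is to record a global rating and a narrative comment alongside the checklist rather than relying on checklist counts alone. A minimal sketch of such a station record; all names and fields are hypothetical.

# Sketch: an OSCE station record that keeps a global rating and a
# narrative comment alongside the checklist (names are hypothetical).

from dataclasses import dataclass

@dataclass
class StationResult:
    examinee: str
    checklist: dict          # checklist item -> performed? (True/False)
    global_rating: int       # e.g., 1 (poor) to 5 (excellent)
    narrative: str = ""      # examiner's free-text comment

    @property
    def checklist_pct(self):
        return 100 * sum(self.checklist.values()) / len(self.checklist)

result = StationResult(
    examinee="R. Jones",
    checklist={"washed hands": True, "asked about pain": True,
               "checked fetal heart rate": False},
    global_rating=4,
    narrative="Organized exam; established a genuine connection.",
)
print(f"checklist: {result.checklist_pct:.0f}%, global: {result.global_rating}/5")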

Interim Summary of Session

Session thus far:

– Two uses of knowledge and performance assessments: Formative and Summative

– Validity of all assessment techniques

– Compare and contrast three assessment techniques

– Take and critique a 14-item multiple choice exam

– Write a Key Features test item

– Use global ratings and narrative comments when scoring OSCE performance

Coming up:

– Improving clinical performance assessment systems

Bubble Diagram of a Resident Performance Assessment System

Diagnostic Checklist for Clinical Performance Assessment System

For each element below, identify problems and recommend improvements:

A. Organizational Infrastructure

B. Individual Evaluators' Role
   1. Communicate Expectations
   2. Observe Performance
   3. Interpret and Judge Performance
   4. Communicate Performance Info
   5. Coach Resident
   6. Complete PA Form

C. Program Director's Role
   1. Monitor & Interpret Appraisals
   2. Committee Decision
   3. Formally Inform Resident

Three Year Study to Improve the Quality of Resident Performance Assessment Data

1. What median percentage of each resident’s rotations returned one or more completed forms?

2. How precise were the scores marked on the returned forms?

3. What median percentage of each resident's rotations returned one or more forms with behaviorally specific written comments?

Littlefield, DaRosa, Paukert, et al. Improving Resident Performance Assessment Data: Numeric Precision & Narrative Specificity. Acad. Med. 80(5), 489-95, 2005.
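As a hypothetical illustration of the first research question, the sketch below computes, for each resident, the percentage of rotations returning at least one completed form, then takes the median across residents. The data layout and values are assumed, not taken from the study.

# Sketch: research question 1. For each resident, what percentage of
# rotations returned at least one completed form? Then take the median
# across residents. Data are hypothetical.

from statistics import median

forms_returned = {
    "resident_1": {"medicine": 2, "surgery": 0, "pediatrics": 1},
    "resident_2": {"medicine": 1, "surgery": 1, "pediatrics": 1},
    "resident_3": {"medicine": 0, "surgery": 0, "pediatrics": 3},
}

pct_per_resident = [
    100 * sum(1 for n in rotations.values() if n >= 1) / len(rotations)
    for rotations in forms_returned.values()
]
print(f"median % of rotations returning a form: {median(pct_per_resident):.0f}%")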

Results of the Study

Research Question            Site     Baseline     Intervene    Intervene    Stat. Sig.
                                      Year         Year 1       Year 2

% Rot. Returned Form         Site A   M = 71%      M = 100%     M = 96%      Χ² <.01
                             Site B   M = 75%      M = 100%     M = 100%     Χ² <.01

Gen. Coef. (95% Cnf. Int.)   Site A   .65 (±.78)   .85 (±.58)   .84 (±.61)   N/A
                             Site B   .58 (±.55)   .79 (±.55)   .77 (±.56)   N/A

% Rot. Beh. Spec. Cmnt.      Site A   M = 50%      M = 57%      M = 82%      Χ² <.01
                             Site B   M = 50%      M = 60%      M = 67%      Χ² <.01

Making Evaluation Decisions

                                  Student/Res. IS          Student/Res. IS NOT
                                  Competent                Competent

Pass (promote, graduate, etc.)    Correct Decision         False Positive Error

Fail (remediate, dismiss, etc.)   False Negative Error     Correct Decision

The columns reflect Complete Knowledge (God); the rows reflect decisions made with Partial Knowledge (Us).

Papadakis et al. Disciplinary action by medical boards and prior behavior in medical school. NEJM, 353:25, 2673-82, 2005.
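In code, the 2x2 table above is just a comparison between the decision we make and a truth we can never fully observe. A purely illustrative sketch:

# Sketch: classifying a pass/fail decision against true competence.
# True competence is unknowable in practice (the "God" column above),
# so this classification is purely illustrative.

def classify(passed, truly_competent):
    if passed and not truly_competent:
        return "false positive error"   # promoted but not competent
    if not passed and truly_competent:
        return "false negative error"   # held back though competent
    return "correct decision"

for passed in (True, False):
    for competent in (True, False):
        print(f"pass={passed}, competent={competent} -> "
              f"{classify(passed, competent)}")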

Goals: Assessment of Knowledge & Performance

1. Clarify 2 distinct uses for assessments of knowledge and performance

2. Define 3 aspects of validity for all knowledge and performance assessment methods

3. Compare and contrast 3 techniques for assessing clinical knowledge and performance

4. Identify poorly written multiple choice test items and write a key features test item

5. Describe 3 options for scoring OSCE performance

6. Describe three elements of a clinical performance assessment system

7. Critique a clinical performance assessment system that you use