
Page 1:

Assessment of Knowledge and Performance
John Littlefield, Ph.D.
University of Texas Health Science Center at San Antonio

Page 2:

Goals: Assessment of Knowledge and Performance

1. Clarify two distinct uses for assessments of clinical knowledge and performance

2. Define six aspects of validity for all assessment methods

3. Compare and contrast three techniques for assessing clinical knowledge and performance

4. Identify poorly written multiple choice test items

5. Write a key features test item

6. Describe a role for narrative comments in scoring interactions with Standardized Patients

7. Describe three elements of a clinical performance assessment system

8. Critique a clinical performance assessment system that you use

Page 3:

Agenda: Assessment of Knowledge and Performance

1. Exercise: Warm-up for assessing knowledge and performance

2. Presentation: Quality assurance when assessing clinical knowledge and performance

3. Exercise: Take, then critique, a multiple choice test

4. Presentation: Key features test items

5. Exercise: Write a key features test item

6. Presentation: Widening the lens on SP assessment

7. Exercise: Recommend program director actions based on faculty comments about a resident

8. Presentation: Improving clinical performance assessment systems

9. Exercise: Critique your clinical performance assessment system

Page 4:

Warm-up Exercise

1. What answer did you circle to questions 1.a and 1.b?

2. What adjectives did you list for the marginal student/resident?

3. What concerns do you have about assessing knowledge and performance that you would like addressed?

Page 5:

Uses for Assessment: Formative vs. Summative

                             Formative                              Summative
Purpose                      Feedback for Learning                  Certification/Grading
Breadth of Scope             Narrow Focus on Specific Objectives    Broad Focus on General Goals
Scoring                      Explicit Feedback                      Overall Performance
Learner Affective Response   Little Anxiety                         Moderate to High Anxiety
Target Audience              Learner                                Society

Page 6:

Validity of Knowledge and Performance Assessments [1]

Content – Does the assessment method measure a representative cross-section of student/resident competencies?

Internal structure – Do content and scoring focus on a specific clinical competency (e.g., patient care)?

Relation to other assessments – Do scores from this assessment correlate highly with other measures of the same student competency?

Consequences – Do various subgroups of students (e.g., different ethnic groups) score equally well on the assessment?

Generalizability – Does the student perform at about the same level across 5 to 7 different patients/case problems? Does the student receive a similar rating from different faculty?

Cognitive process – Does the context surrounding the assessment evoke the domain of cognitive processing used by a physician?

1. Standards for Educational and Psychological Testing. AERA, APA, NCME, 1999, pp. 11-16.

Page 7:

Six Aspects of Assessment Validity Viewed as a Cube

[Figure: cube whose faces are labeled Content, Internal Structure, Relation to Other Assessments, Consequences, Generalizability, and Cognitive Process]

Page 8:

Generalizability of Physician Performance on Multiple Patients

[Figure: bar chart of scores (0-120 scale) for Patients 1 through 5, comparing Ideal and Actual performance profiles]

Page 9:

Validity of Knowledge and Performance Assessments [1]

Content – Does the assessment method measure a representative cross-section of student/resident competencies?

Internal structure – Do content and scoring focus on a specific clinical competency (e.g., patient care)?

Relation to other assessments – Do scores from this assessment correlate highly with other measures of the same student competency?

Consequences – Do various subgroups of students (e.g., different ethnic groups) score equally well on the assessment?

Generalizability – Does the student perform at about the same level across 5 to 7 different patients/case problems? Does the student receive a similar rating from different faculty?

Cognitive process – Does the context surrounding the assessment evoke the domain of cognitive processing used by a physician?

1. Standards for Educational and Psychological Testing. AERA, APA, NCME, 1999, pp. 11-16.

Page 10:

Cognitive Process Aspect of Validity: Four Levels of Performance Assessment [1]

Knows (Examination – Multiple-choice)
Knows How (Examination – Oral)
Shows How (OSCE)
Does (Global Rating)

1. Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine, 65(9 Suppl), S63-S67, 1990.

Page 11:

Compare and Contrast Three Assessment Techniques (multiple choice exam, OSCE, performance appraisals)

                                 M.C.E.    OSCE    Perf. App.
Content                          +++       ++      +
Internal structure               +++       ++      +
Relation to other assessments    +         +       +
Consequences                     +         ++      +
Generalizability:
  5 to 7 case problems           +++       ++      +
  agreement among raters         +++       ++      +
Cognitive process                +         ++      +++

+ = adequate    ++ = good    +++ = excellent

Page 12:

Interim Summary of Session

Session thus far

– Two uses of knowledge and performance assessments: Formative and Summative

– Validity of all assessment techniques

– Compare and contrast 3 assessment techniques

Coming up

– Take and critique a 14-item multiple choice exam

– Presentation on Key Features items

Page 13:

How are Multiple Choice Items Selected for an Exam?

Page 14:

Sample Exam Blueprint based on Clinical Problems

Patient Life Span    % Total Patients    % Men/Women    Number of Problems
Pregnancy/Infant     5                   50/50          4
Pediatrics           16                  53/47          6
Adolescence          16                  31/69          6
Adults               47                  34/66          6
Geriatrics           16                  45/46          6
Total                100                 39/61          40

Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.
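A blueprint like this translates directly into a sampling rule for assembling an exam. The sketch below is a minimal illustration in Python, assuming a hypothetical case bank keyed by life-span category; the quotas come from the "Number of Problems" column above, and draw_exam is an illustrative helper, not software from the cited paper.

```python
import random

# Problems to draw per life-span category (the blueprint's last column).
BLUEPRINT_QUOTAS = {
    "Pregnancy/Infant": 4,
    "Pediatrics": 6,
    "Adolescence": 6,
    "Adults": 6,
    "Geriatrics": 6,
}

def draw_exam(case_bank, quotas=BLUEPRINT_QUOTAS, seed=None):
    """Sample each category's quota of clinical problems from a case bank.

    case_bank: dict mapping life-span category -> list of problem names.
    Returns a flat list of selected problems; raises ValueError if any
    category cannot fill its quota.
    """
    rng = random.Random(seed)  # seeded for a reproducible exam form
    exam = []
    for category, quota in quotas.items():
        pool = case_bank.get(category, [])
        if len(pool) < quota:
            raise ValueError(f"Need {quota} problems for {category}, found {len(pool)}")
        exam.extend(rng.sample(pool, quota))
    return exam
```

A men/women balance check could be layered on in the same way, rejecting draws that stray too far from the blueprint's percentages.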

Page 15:

Key Features of a Clinical Problem

Definition: Critical steps that must be taken to identify and manage a patient’s problem
– focuses on a step in which examinees are likely to make an error
– is a difficult aspect of identifying and managing the problem

Example: For a pregnant woman experiencing third-trimester bleeding with no abdominal pain, the physician should:
– generate placenta previa as the leading diagnosis
– avoid performing a pelvic examination (may cause bleeding)
– avoid discharging her from the clinic or emergency room
– order coagulation tests and a cross-match

Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

Page 16:

Test Items based on a Clinical Problem and its Key Features

Jennifer, a 24-year-old woman, G3, P2, 31 weeks pregnant, comes to the emergency room at 8:00 PM complaining of bright red vaginal bleeding for the past two hours. The three sanitary napkins that she used were completely soaked. Her pregnancy has been uneventful, as were the previous ones. She has not had any contractions or abdominal pain. The fetus is moving as usual. Her BP is 110/70 mm Hg, and her pulse is 92/min. The examination of the abdomen reveals a uterine height of 31 cm with a soft and nontender uterus. The fetus is in a breech position and has a heart rate of 150/min. No active bleeding has occurred since she arrived 25 minutes ago.

1. What is your leading diagnosis at this time? List only one diagnosis. Write “normal” if you judge Jennifer’s situation to be within normal limits.

2. What steps would you include in your immediate assessment and management of this patient? List as many as are appropriate.

Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

Page 17:

Scoring the Placenta Previa Clinical Problem

Key Feature 1: To receive one point, the examinee must list placenta previa or one of the following synonyms: marginal placenta or low placental insertion

Key Features 2-4: Receive 1/3 point for listing each of the following:
1. Avoid performing a pelvic exam
2. Avoid discharging from clinic
3. Order coagulation tests and cross-match

Total Score for Problem: Add the scores for items 1 and 2 and divide by 2 (range: 0-1)
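These rules are mechanical enough to script. Below is a minimal scoring sketch, not the authors' software: the synonym list and management steps are taken from this slide, while the exact-phrase matching is a simplifying assumption (a real answer key would accept many more wordings).

```python
# Accepted answers for Key Feature 1 (the leading diagnosis), per the slide.
KF1_SYNONYMS = {"placenta previa", "marginal placenta", "low placental insertion"}

# Key Features 2-4 (management steps), each worth 1/3 point.
KF2_4_STEPS = [
    "avoid performing a pelvic exam",
    "avoid discharging from clinic",
    "order coagulation tests and cross-match",
]

def score_problem(diagnosis, management_steps):
    """Return the 0-1 score for the placenta previa problem."""
    # Item 1: one point if the single listed diagnosis is an accepted synonym.
    item1 = 1.0 if diagnosis.strip().lower() in KF1_SYNONYMS else 0.0

    # Item 2: 1/3 point for each key management step the examinee listed.
    listed = {step.strip().lower() for step in management_steps}
    item2 = sum(1 / 3 for step in KF2_4_STEPS if step in listed)

    # Total for the problem: average of the two items (range 0-1).
    return (item1 + item2) / 2

# Example: score_problem("Placenta previa", ["Avoid performing a pelvic exam"])
# returns (1 + 1/3) / 2, about 0.67.
```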

Page 18:

Steps to Develop a Clinical-Problem-Based Exam

Define the domain of clinical problems to be sampled by the exam
Develop an exam blueprint to guide selection of clinical problems
Develop a key-feature problem for each clinical problem selected:
– define the clinical situation for the problem (e.g., single typical problem, life-threatening situation, etc.)
– define the key features of the problem
– select a clinical case to represent the problem and write the scenario
– write exam items for the case; in general, one item for each key feature
– select a suitable format for each item (e.g., write-in or MCQ)
– develop a scoring key for each item
– pilot test items to obtain item analysis data to guide refinement
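To make these steps concrete, a key-feature problem can be stored as a small record that keeps the clinical situation, key features, scenario, items, and scoring keys together. A sketch with hypothetical field names of my own choosing, not a schema from the cited paper:

```python
from dataclasses import dataclass, field

@dataclass
class ExamItem:
    stem: str          # question text shown to the examinee
    fmt: str           # item format: "write-in" or "mcq"
    scoring_key: dict  # accepted answer -> points awarded

@dataclass
class KeyFeatureProblem:
    clinical_situation: str  # e.g., "single typical problem", "life-threatening"
    key_features: list       # critical steps that define the problem
    scenario: str            # case vignette written to represent the problem
    items: list = field(default_factory=list)  # usually one item per key feature
```

Pilot-test statistics (difficulty, discrimination) can then be attached per ExamItem, supporting the refinement step above.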

Page 19:

Interim Summary of Session

Session thus far

– Two uses of knowledge and performance assessments: Formative and Summative

– Validity of all assessment techniques

– Compare and contrast three assessment techniques

– Take and critique a 14-item multiple choice exam

– Write a Key Features test item

Coming up

– Scoring performance on an SP exam

Page 20:

Schematic Diagram of a 9 Station OSCE

[Figure: circuit of nine numbered stations; examinees enter at Start, rotate through Stations 1-9, and exit at End]
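The schematic implies a simple modular rotation: each examinee starts at a different station and advances one station per timed interval. A minimal sketch, assuming simultaneous starts at all nine stations (an assumption; the slide does not say how starts are staggered):

```python
def osce_rotation(n_stations=9):
    """Round-by-round station assignments for a circuit OSCE.

    Examinee e (0-indexed) starts at station e + 1 and moves one station
    forward each round, wrapping around the circuit, so every examinee
    visits every station exactly once after n_stations rounds.
    """
    return [
        [(examinee + rnd) % n_stations + 1 for rnd in range(n_stations)]
        for examinee in range(n_stations)
    ]

# Example: osce_rotation()[3] -> [4, 5, 6, 7, 8, 9, 1, 2, 3]
```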

Page 21:

Widening the Lens on SP Assessment [1]

Traditional scoring of SP assessment focuses on numerical data, typically from checklists

Dimensions of the SP exam:
– basic science knowledge (organize the information)
– physical exam skills (memory of routines)
– establishing a human connection
– role of the student (appear knowledgeable)
– existential dimension of the human encounter (balance one’s own beliefs with the patient’s)

Clinical competence – a mixture of knowledge and feeling, information processing and intuition

1. Rose, M. & Wilkerson, L. Widening the Lens on Standardized Patient Assessment: What the Encounter Can Reveal about the Development of Clinical Competence, Acad. Med. 76(8), 2001.

Page 22:

Interim Summary of Session

Session thus far
– Two uses of knowledge and performance assessments: Formative and Summative
– Validity of all assessment techniques
– Compare and contrast three assessment techniques
– Take and critique a 14-item multiple choice exam
– Write a Key Features test item
– Use narrative comments as part of assessment via SP

Coming up
– Improving clinical performance assessment systems

Page 23:

Dr. Tough’s Memo regarding Dr. Will E. Makit (PGY 2)

“The performance of Dr. Makit in General Surgery this month has been completely unsatisfactory. Every member of the clinical faculty who has had any contact with him tells me of his gross incompetence and irresponsibility in clinical situations. This person is an embarrassment to our school and our training program. I spoke to him about his performance after I talked with you several weeks ago and he told me that he would improve it. There was no evidence that he made any effort to improve. There is no way this can be considered a completed or satisfactory rotation in General Surgery. In fact, he is the most unsatisfactory resident who has rotated through our service in the last five years, and his behavior is an appalling example to the rest of our housestaff.”

Your Action?
1. Refer the problem to the Resident Education Committee for an administrative decision.
2. Assign Dr. Makit to a rotation with Dr. Insightful as the Attending Faculty.

Page 24:

Dr. Insightful’s Phone Comments regarding Dr. Makit

“Judging from this week, Will has a hesitant style of speech that sometimes conveys a sense of confusion. He needs to carefully outline what he wants to say before speaking. He knows what he is doing, but sometimes does not get credit for it. Rounds were well-organized. He allowed students full rein to discuss their cases before he contributed.”

Your Action?
1. Request that Dr. Insightful continue observing Dr. Makit.
2. Request that Dr. Insightful coach Dr. Makit on making patient presentations.
3. Request that Dr. Insightful write an appraisal of Dr. Makit’s performance now.

Page 25:

Resident Performance Assessment System

[Diagram: Organizational Infrastructure]

Page 26:

A. Department’s Organizational Infrastructure

– Department head emphasizes completing and returning PA forms
– Consequences for evaluators who don’t complete PA forms
– PA form is brief (< 10 competencies)
– Don’t request pass/fail judgment by individual faculty
– Evaluators trained to use PA form & criteria
– Evaluators believe they will be supported when writing honest appraisals
– Specific staff assigned to monitor compliance in returning forms
– Program Director alerted immediately when a returned form reflects cautionary info

Page 27:

B. Elements of Individual Evaluator Role in PA System

1. Communicate expectations

2. Observe performance

3. Interpret and judge performance

4. Communicate performance information

5. Coach resident

6. Complete performance appraisal form

Page 28:

B. Evaluator Role: Communicate Expectations and Observe Performance

1. Communicate Expectations
– Consensus among evaluators about service and education expectations
– Residents are crystal clear about service and education expectations

2. Observe Performance
– Evaluators observe resident multiple times before completing PA form
– Appraise only performance directly observed
– Other staff (e.g., nurses) complete PA forms

Page 29:

B. Evaluator Role: Interpret and Judge Performance

Evaluators agree on what behaviors constitute outstanding, ‘average’, and marginal performance

When facing a marginal resident, evaluators record rationale for their judgment and info surrounding the event

Evaluators record their interpretation of the performance soon after behavior occurs

[Diagram note: diagnose performance – is quality within normal limits (WNL)?]

Page 30:

B. Evaluator Role: Coach Resident

Evaluators aware of difference between corrective feedback, criticism and compliments

Faculty actively coach residents in timely manner

Residents are encouraged to ask for feedback

Evaluators regularly invite self-assessment from residents before giving feedback


Page 31:

B. Evaluator Role: Communicate Performance Information and Complete PA Form

5. Communicate Performance Info
– Communicate incidents that are significantly negative or positive
– Document in writing even single instances of poor or inappropriate behavior

6. Complete PA Form
– Evaluators write specific narrative comments on PA forms
– Evaluators forward completed PA forms to the Director in a timely way

[Diagram note: communicate performance information – to whom?]

Page 32:

C. Elements of Program Director Role in PA System

1. Monitor and interpret performance appraisals

2. Committee decision

3. Inform resident

Page 33:

C. Program Director’s Role: Monitor and Interpret Appraisals

Recognize evaluator rating patterns (stringent vs. lenient) to accurately interpret PAs (see the sketch at the end of this list)

Contact evaluators to elicit narrative info when it is absent but needed to substantiate a marginal PA

Store PA forms in residents’ files in a timely manner

Summarize PA data to facilitate decision making by Resident Education Committee

Keep longitudinal records of PA data to develop norms for the PA form
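Recognizing stringent vs. lenient rating patterns can be supported with simple descriptive statistics. A minimal sketch, assuming the program stores ratings as (evaluator, resident, rating) records; z-scoring within each evaluator is one common normalization, not a method prescribed in this deck:

```python
from collections import defaultdict
from statistics import mean, pstdev

def evaluator_adjusted_ratings(records):
    """Re-express each rating as a z-score within its evaluator.

    records: iterable of (evaluator, resident, rating) tuples.
    A lenient rater's habitual 8 and a stringent rater's habitual 6
    both land near z = 0, making cross-evaluator comparison fairer.
    """
    records = list(records)
    by_evaluator = defaultdict(list)
    for evaluator, _, rating in records:
        by_evaluator[evaluator].append(rating)

    # Each evaluator's mean and spread; guard against zero spread.
    stats = {ev: (mean(rs), pstdev(rs) or 1.0) for ev, rs in by_evaluator.items()}

    return [
        (ev, res, (rating - stats[ev][0]) / stats[ev][1])
        for ev, res, rating in records
    ]
```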

Page 34:

C. Program Director’s Role: Committee Decision

PA decisions are a collaborative process involving multiple faculty

Seven or more PA forms per resident are available when admin decisions are made

Sufficient written narrative documentation is available when admin decisions are made

Page 35:

C. Program Director’s Role: Inform Resident

Residents are given a summary of their performance every six months

Evaluators have written guidelines outlining what must legally be in a probation letter

Evaluators know what documentation is needed to ensure adequate due process

Each resident receives an end of program performance evaluation


Page 36:

Formative Evaluation: Diagnostic Checklist for Resident Performance Assessment System

For each element below, identify problems and recommend improvements:

A. Organizational Infrastructure
B. Evaluator's Role
   1. Communicate Expectations
   2. Observe Performance
   3. Interpret and Judge Performance
   4. Communicate Performance Info
   5. Coach Resident
   6. Complete PA Form
C. Program Director's Role
   1. Monitor & Interpret Appraisals
   2. Committee Decision
   3. Formally Inform Resident

Page 37:

Research: Improving Resident Performance Appraisals [1]

Organizational Infrastructure
– Discussed PA problems at department meetings
– Appointed a task force to review PA problems and propose solutions
– Revised the old appraisal form
– Pilot-tested and adapted the new appraisal form

Evaluator Role
– Provided evaluators with examples of behaviorally-specific comments

Results
– Increased the number of forms returned, the number of forms with behaviorally-specific comments, and the number of administrative actions by the program

1. Littlefield, J. and Terrell, C. Improving the quality of resident performance appraisals, Academic Medicine, 1997: 72(10) Supplement, S45-47.

Page 38:

Research: Improving Written Comments by Faculty Attendings [1]

Organizational Infrastructure

– Conducted a 20-minute educational session on evaluation and feedback
– Provided a 3-by-5 reminder card and diary

Results

– Increased written comments specific to defined dimensions of competence

– Residents rated quantity of feedback higher and were more likely to make changes in clinical management of patients

1. Holmboe E, et al. Effectiveness of a focused educational intervention on resident evaluations from faculty. J Gen Intern Med. 2001; 16:427-34.

Page 39:

Research: Summative Evaluation of a PA System [1,2]

Research Question                                     Pre-intervention      Post-intervention
1. % of rotations returning PA forms                  mean: 73%             mean: 97%
                                                      range: 31-100%        range: 67-100%
2. % of PA forms communicating performance problems   3.6% (n = 17/479)     5.9% (n = 64/1085)
3. Probability program will take admin action         .50 (5/10)            .47 (8/17)
4. Reproducibility of numerical ratings               Gen. coef. = .59      Gen. coef. = .80
                                                      (10 forms)            (10 forms)

1. Littlefield, J.H., Paukert, J., Schoolfield, J. Quality assurance data for resident global performance ratings, Academic Med., 76(10), supp., S102-04, 2001.

2. Paukert J, et al. Improving quality assurance data for resident subjective performance assessment, manuscript in preparation.
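The generalizability coefficients in row 4 can be estimated from a residents-by-forms matrix of numerical ratings. Below is a minimal textbook sketch of a one-facet crossed G study (a standard formulation, not the computation reported in the cited studies):

```python
def g_coefficient(scores):
    """Relative G coefficient for a residents x forms matrix of ratings.

    scores: list of equal-length lists (rows = residents, columns = PA
    forms), treated as a one-facet crossed design (resident x form).
    """
    n_p, n_f = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n_p * n_f)
    p_means = [sum(row) / n_f for row in scores]
    f_means = [sum(col) / n_p for col in zip(*scores)]

    # Mean squares for residents and for the residual (interaction/error).
    ms_p = n_f * sum((m - grand) ** 2 for m in p_means) / (n_p - 1)
    ss_res = sum(
        (scores[p][f] - p_means[p] - f_means[f] + grand) ** 2
        for p in range(n_p) for f in range(n_f)
    )
    ms_res = ss_res / ((n_p - 1) * (n_f - 1))

    # Resident variance component; G = signal / (signal + noise / n_forms).
    var_p = max((ms_p - ms_res) / n_f, 0.0)
    rel_error = ms_res / n_f
    return var_p / (var_p + rel_error) if (var_p + rel_error) > 0 else 0.0
```

By common conventions, the post-intervention coefficient of .80 indicates ratings consistent enough to rank residents reliably, while the pre-intervention .59 falls short of the usual 0.7-0.8 benchmarks.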

Page 40:

Recall a medical student or resident whose performance made you uneasy.

1. What behavior or event made you uneasy?

2. What action did you take?

a. Talk with faculty colleagues about your concerns

b. Write a candid performance appraisal and send it to the course/residency director

3. If you wrote a candid appraisal, did an administrative action occur related to the student/resident?

Page 41:

Goals: Assessment of Knowledge & Performance

1. Clarify two distinct uses for assessments of clinical knowledge and performance

2. Define six aspects of validity for all assessment methods

3. Compare and contrast three techniques for assessing clinical knowledge and performance

4. Identify poorly written multiple choice test items

5. Write a key features test item

6. Describe a role for narrative comments in scoring interactions with Standardized Patients

7. Describe three elements of a clinical performance assessment system

8. Critique a clinical performance assessment system that you use