Page 1:

Ramesh Mehay
Programme Director (Bradford VTS)

Originally written 2007, updated Jan 2009

Page 2:

Aims and objectives

Aims
- Increase our understanding of nMRCGP
- Help us feel more prepared for the assessments
- And therefore feel better!

Objectives
- Provide an overview of nMRCGP
- Share understanding
- Share concerns (and address them?)
- Practise COT

Page 3:

Session plan

- Overview of the nMRCGP and its components
- Share fears and concerns
- Practise some COTs in groups, ?modelling
- (IS2 – practise some CBDs)

Page 4:

Background to nMRCGP

- nMRCGP replaces both the old MRCGP and Summative Assessment (SA)
- Based on the new GP curriculum

The new curriculum was developed through a review of the literature, very extensive consultation with doctors and patients, etc.

All components of nMRCGP are mapped to the competencies in the curriculum

GP training now overseen by PMETB, like all other medical specialties (JCPTGP is dead)

Page 5:

A programme of assessment…

Page 6:

Components

AKT (Applied Knowledge Test): machine-marked test, 3x/year, at various venues

CSA (Clinical Skills Assessment): OSCE-type exam, 3x/year, Croydon

WPBA (Workplace Based Assessment): recorded in an e-portfolio held by the GP trainee throughout the 3 years

Page 7:

Clinical Skills Assessment

- ‘Integrative assessment’ with 3 domains: data gathering, technical and assessment skills; clinical management; interpersonal skills
- 13 stations, 10 mins each, balanced selection of cases
- Graded: clear pass, marginal pass, marginal fail, clear fail, ‘serious concerns’
- Significant failure rate, so take it early enough to have time to retake

Page 8:

Workplace Based Assessment: WPBA

Workplace based assessment: the assessment of actual working practices undertaken in the working environment

Page 9:

Overview of WPBA

- What the trainee actually does
- Competencies demonstrated ‘when ready’
- Assessment of developmental progression – guides decisions about future learning
- Recorded in an electronic portfolio
- Process is learner led – the trainee has to ensure their e-portfolio covers the e-curriculum

Page 10:

WPBA: compulsory components

- Case Based Discussion (CBD)
- Consultation Observation Tool (COT) or Mini-Clinical Evaluation Exercise (Mini-CEX)
- Multi Source Feedback (MSF)
- Patient Satisfaction Questionnaire (PSQ)
- Direct Observation of Procedural Skills (DOPS)

Page 11:

WPBA: local subunits

- OOH work booklet
- Clinical Supervisor’s Report (CSR)
- Naturally Occurring Evidence (NOE)
- Significant Event Review (SER)
- Referrals analysis
- Audit (Case Review, Personal Learning, Complaints)

Page 12:

Who makes judgements?

- The Trainer/Clinical Supervisor, as (s)he does the assessments
- The Educational Supervisor, as (s)he reviews the ‘whole’ thing with the trainee
- ARCP panels, who review the whole thing when a trainee is moving up an ST grade

Page 13:

Case Based Discussion (CBD)

- Structured interview designed to explore professional judgement in clinical cases
- Professional judgement = the ability to make holistic, balanced and justifiable decisions in situations of complexity and uncertainty

Attributes tested:
- Application of medical knowledge
- Application of ethical frameworks
- Ability to prioritise, consider implications, justify decisions
- Recognising complexity and uncertainty

Page 14:

CBD Competency areas

CBD looks at 10 of the 12 competencies

- Practising holistically
- Data gathering and interpretation
- Making decisions/diagnoses
- Clinical management
- Managing medical complexity
- Primary Care Administration (IMT)
- Working with colleagues
- Community orientation
- Maintaining an ethical approach
- Fitness to practise

(not assessed by CBD: communication skills AND maintaining performance/learning/teaching)

Page 15:

CBD - the process

- Trainee selects 3 cases and gives the material to the trainer 1 week in advance
- Need a balance of cases and contexts
- Trainer selects 2 and plans structured questions in advance
- 1-hour session covers 2 cases: 20 mins per case, 10 mins feedback
- Trainer records evidence and judges the level of performance (insufficient evidence / needs development / competent / excellent)
- Need to do a MINIMUM of 6 per post
- All 6 before the ES meeting! (really, within 4 months)

Page 16:

Key Points on CBD

- It is a STRUCTURED oral interview
- On what the trainee actually did
- And why they did that
- And whether they considered anything else at the time

So, don’t ask “what if” questions like you do in Random Case Analysis

- Stick to the ‘here and now’ of the case
- Use the question maker framework on www.bradfordvts.co.uk (click nMRCGP, then click CBD)

Page 17:

CBD: What’s the Experience So Far?

Trainees
- Initially anxious, but less stressful than the current SA
- Valued the feedback
- Found it realistic
- Some concern re the relationship with the trainer

Trainers
- Time consuming; need extra protected time
- Helpful structure
- May be more helpful for difficult trainees
- Concern re the relationship with trainees

Page 18:

Consultation Observation Tool (COT)

- Single consultation per session
- Trainee and trainer view it together
- Trainer assesses the consultation on a 4-point rating scale (similar to the old MRCGP/SA)
- No rule about consultation length
- Ideally, at least one consultation is assessed by someone other than the trainer
- Ideally, a wide range of contexts is required, including at least one child, one older person, and one mental health problem

Page 19:

What was wrong with the old MRCGP or Summative Assessment?

Page 20:
Page 21:
Page 22:

Miller’s Pyramid or Prism of Clinical Competence

Page 23:

What is Authentic Performance?

“Testing should be as close as possible to the situation in which one attacks the problem.”

“Ill-structured problems are not found in simulated and/or standardized tests.”

“The variation inherent in professional practice will always elude capture by a set of rules.”

Wiggins, Assessing Student Performance: Exploring the Purpose and Limits of Testing, Jossey-Bass, Inc. 1993

Page 24:

Relationship between tools and competency areas

Page 25:

Good Assessment Instruments have:
- Reliability (R)
- Validity (V)
- Educational impact (E)
- Acceptability (A)
- Cost (C)

(Mnemonic: CARVE)

Van der Vleuten, The assessment of professional competence: developments, research and practical implications, Adv Health Sci Educ 1 (1996)

Page 26:

Why WPBA?

- High validity = authenticity
- High educational impact
- Reliability depends on how many you do; also some built-in triangulation
- Reconnects assessment with learning and the workplace
- Assessment over the entire training envelope
- Cost effective, and now accepted!

Page 27:

And it gives continuous feedback

“a process of monitoring student’s progress through an area of learning so that decisions can be made about the best way to facilitate future learning”

Page 28:

The Problem With WPBA

- Inter-observer variation
- Intra-observer variation
- Case specificity

Page 29:

Requirements of a high-stakes performance assessment:
- Specification
- Calibration
- Moderation
- Training
- Verification and audit

(Baker, O’Neil, Linn 1991)

Page 30:

Rough Guide to Rating Scale

Excellent – Smooth and efficient. Able to use knowledge, judgment and skills to adjust management appropriately to the specific patient and operative procedure.

Competent – Lacks smoothness and efficiency but is able to use knowledge, judgment and skills to adjust management appropriately to the specific patient and operative procedure.

NEEDS FURTHER DEVELOPMENT:

Beginner – Lacks smoothness and efficiency. Able to manage the case but exhibits limited use of personal judgment and responsiveness to the specifics of the patient and operative procedure. Requires some limited coaching or attending intervention.

Novice – Can only manage the case with extensive coaching and attending intervention.