

2013; 35: 883–890

HOW WE . . .

How to develop a competency-based examination blueprint for longitudinal standardized patient clinical skills assessments

SOMNATH MOOKHERJEE1, ANNA CHANG2, CHRISTY K. BOSCARDIN2 & KAREN E. HAUER2

    1University of Washington School of Medicine, USA, 2University of California, San Francisco, USA

    Abstract

Background: Objective Structured Clinical Exams (OSCEs) with standardized patients (SPs) are commonly used in medical education to assess learners' clinical skills. However, assessments are often discrete rather than intentionally developmentally sequenced.

Aims: We developed an examination blueprint to optimize assessment of and feedback to learners through a purposefully sequenced series of longitudinally integrated assessments based on performance milestones. Integrated and progressive clinical skills assessments offer several benefits: assessment of skill development over time, systematic identification of learning needs, data for individualized feedback and learning plans, and baseline reference points for reassessment.

Methods: Using a competency-based medical education (CBME) framework, we translated pre-determined competency milestones for medical students' patient encounters into a four-year SP-based OSCE examination blueprint.

Results: Initial evaluation of cases using the blueprint revealed opportunities to target less frequently assessed competencies and to align assessments with milestones for each year.

Conclusions: The examination blueprint can guide ongoing SP-based OSCE case design. Future iterations of examination blueprints can incorporate lessons learnt from evaluation data and student feedback.

    Introduction

Student assessment must evolve from discrete assessments that do not contextualize learners' prior or future performance to a strategy that is intentionally and longitudinally integrated with students' overall development. The Association of American Medical Colleges has recommended that "Medical schools adopt an explicit, developmental approach to the design of the skills education curriculum, including expected levels of skill performance proficiency throughout the four-year curriculum" (Danoff 2005). The International Competency-Based Medical Education (CBME) Collaborators similarly state, "Greater emphasis should be placed on the developmental progression of abilities and on measures of performance" (Frank et al. 2010). Progression through medical school is typically organized into pre-clinical and clinical years; this division creates a natural barrier to designing developmentally-integrated clinical skills curricula. As educators endeavor to re-design undergraduate medical education curricula to meet the goal of a more developmental sequence, similar efforts are needed to keep pace in the domain of student assessment.

Objective Structured Clinical Exams (OSCEs) have been used for student assessment of clinical competence for over three decades (Harden et al. 1975). An OSCE is a series of stations in which students are required to perform clinical tasks with objective assessment of competence (Turner & Dankoski 2008). OSCEs typically incorporate standardized patients (SPs): individuals who have been trained to depict clinical cases, interact with trainees during simulated clinical encounters, and in some instances participate in trainee assessment and feedback. However, not all OSCEs include SPs, and SPs are used in other educational applications such as teaching without evaluation. SP-based OSCEs are particularly valuable in competency assessment because they challenge students to demonstrate skills in authentically simulated encounters. In this article, we use the term "SP-based OSCEs" to specifically identify this subset of OSCEs. Figure 1 presents a glossary of commonly used terms in this article.

Practice points

• Clinical skill examinations must reflect the expected level of competence at each stage of training and be part of an integrated, developmentally-sequenced series of assessments.

• A competency-based examination blueprint for standardized patient clinical skill examinations can improve examination validity, facilitate alignment with competencies and milestones, enable progressive clinical skills assessments, and allow for targeted feedback.

Correspondence: Somnath Mookherjee, MD, Assistant Professor, Department of Medicine, Division of General Internal Medicine, University of Washington School of Medicine, Magnuson Health Sciences Center, Room B-503, Box 356429, Seattle, WA 98195-6429, USA. Tel: (206) 744-3391; fax: (206) 221-8732; email: [email protected]

ISSN 0142-159X print/ISSN 1466-187X online. 2013 Informa UK Ltd. DOI: 10.3109/0142159X.2013.809408


CBME frameworks are based on competence in the activities and attributes of practicing physicians (Harris et al. 2010). SP-based OSCEs are uniquely suited for operationalizing assessment in a CBME framework, because they emphasize observable behaviors that are required for patient care in the context of realistic learning and assessment experiences. For example, using Miller's pyramid as a model, SP-based OSCEs require that students "show" or "do" rather than simply "know" or "know how" across multiple competency levels (Miller 1990). In SP-based OSCEs, student behavior is observed and assessed in simulated clinical environments with a standardization and fidelity that are difficult to duplicate in actual clinical practice or with other modalities such as written or oral examinations, which target the cognitive rather than the performance level of competency (Patricio et al. 2009). Although SP-based OSCEs have been used in undergraduate medical education for nearly 40 years (Harden et al. 1975), they are typically used for summative assessment at the end of blocks of curricula (such as at the end of a clinical skills course or years of undergraduate training).

As with any assessment, to be assured that SP-based OSCEs are measuring targeted aspects of performance, educators must establish the validity of examinations. Psychometric theories of assessment explicate the reliability and validity of examinations, and emphasize the importance of the precision and reproducibility of measurements (Brannick et al. 2011; Schuwirth & van der Vleuten 2011). However, the concept of construct validity is increasingly accepted as a framework for considering the overall validity of an examination. Simply put, a valid examination is one in which the outcomes can be trusted because the examination measures what it is intended to measure. What is intended to be measured is defined as the construct. Providing evidence for content validity is one of several important components of establishing the overall construct validity of an SP-based OSCE. Other sources of evidence for validity include response process, internal structure, relation to other variables, and consequences (Messick 1995; Cook & Beckman 2006). To evaluate the evidence for content validity, one should ask: "How do I know exactly what competencies the examination authors are trying to assess, and how do I know that this is what the examination is actually assessing?" Evidence for content validity includes clearly defining the competencies being targeted in the examination and detailing the process used to develop examination items (i.e. the qualifications of item authors, methods used to select test items) (Cook & Beckman 2006).

However, methods used to establish the evidence for content validity of SP-based OSCE exams are seldom published (Patricio et al. 2009). Published examples of SP-based OSCE blueprinting are rare (Tombleson et al. 2000; Boulet et al. 2003), and methods used to develop OSCE cases and assessment checklists are usually not reported (Gorter et al. 2000). Consequently, medical educators have advocated for strengthening content validity evidence for SP-based OSCEs by using an examination blueprint as a framework for designing the examination (Newble 2004; Turner & Dankoski 2008). The validity of the examination is strengthened when it reflects explicitly defined competency milestones; an examination blueprint facilitates this step (Coderre et al. 2009; Sales et al. 2010).

In this article, we describe a systematic process for developing an examination blueprint for our medical school's four-year SP-based OSCE clinical skills assessment program. We use a CBME framework to align the blueprint with pre-defined medical school competency milestones for each year. Our goal is to produce a developmentally sequenced, longitudinal blueprint to serve as a guide in creating SP-based OSCE examinations. This blueprint supports the validity of the SP-based OSCE series, enables clinical skills examinations to be intentionally aligned with competencies, allows for progressive clinical skills assessment, enhances provision of developmentally-sequenced feedback, and serves as the foundation for longitudinal assessment/reassessment of students.

Competency: Skills or attributes required of physicians. Competence refers to achieving the level of ability required to practice as a physician.

Competency-Based Medical Education (CBME) frameworks: Educational frameworks emphasizing competence in the activities and attributes of practicing physicians (Harris et al. 2010). Examples include the American Accreditation Council for Graduate Medical Education (ACGME) competencies (Swing 2007), the Canadian CanMEDS physician competency framework (Frank & Danoff 2007), and the European Tuning Project (Cumming & Ross 2007).

Construct validity: How well that which is intended to be measured (the construct) is actually measured by the examination scores (Messick 1995; Cook & Beckman 2006).

Content evidence: One aspect of establishing the construct validity of an assessment; shows how well the assessment represents the construct (Cook & Beckman 2006).

Examination blueprint: A template used to define the content of an examination (Sales et al. 2010). This can take the form of a table in which the axes are labeled by content area and competency area. In a longitudinal blueprint, one of the axes is the developmental stage of the learner.

Milestones: Descriptions of expected performance in a competency at a given level of training or development.

Objective Structured Clinical Examination (OSCE): An examination of competence in clinical tasks, usually comprised of multiple stations using standardized patients (Turner & Dankoski 2008).

Standardized Patient (SP): Individual trained to depict a clinical condition or scenario, interact with students during a simulated clinical encounter, and in some cases participate in trainee evaluation and feedback.

Figure 1. Glossary of terms.



    What we did

At our institution, medical students participate in six SP-based OSCE examinations through the four years of medical school. Three of these sessions (once in year one and twice during year three) are designed primarily for formative assessment and to provide feedback. In those settings, students receive feedback on communication skills from the patient perspective as well as feedback on their history taking, physical examination, and clinical problem-solving skills. At the end of each of the first three years, there are high-stakes summative assessments, which are presented as data for student advancement decisions. Each of the OSCEs consists of three to eight SP-based stations. Students are assessed using checklists developed by educators and completed by the SPs, and receive feedback from SPs, peers, and/or faculty. Students also answer post-encounter written medical knowledge and clinical reasoning questions called "interstation exercises" that include multiple choice questions, short answer questions, written medical notes, or oral presentations. While OSCEs are designed to meet the learning objectives of the pre-clerkship clinical skills course during the first two years and the core clerkships during the latter two years, the OSCEs were originally not designed as a series of longitudinally sequenced assessments. In the following sections, we detail the process that we undertook to develop the blueprint. Figure 2 outlines a streamlined six-step blueprint development method based on the lessons we learned from this process.

    Development of institutional milestones

The first step in developing an examination blueprint is to define the curricular objectives being targeted for assessment (Coderre et al. 2009; Sales et al. 2010). We began with existing milestones established by education leaders at the School of Medicine. At our institution, a Committee on Student Assessment had been charged with recommending strategies for improving the quality of assessment of student performance. The committee recommended that assessment be based on competencies specific to each of the four years of medical school and modeled after the six American Accreditation Council for Graduate Medical Education (ACGME) competencies for graduate medical education. Based on graduation competencies, the process of authoring milestones specific to each year of medical school began with the third year core clerkships, since clerkship directors were already familiar with the ACGME competencies. The clerkship directors worked in small groups by competency domain to define core clerkship year milestones. The small group work was approved by the Committee, which included faculty and students. Next, the course committee for the pre-clerkship curriculum and a committee representing the fourth year authored milestones for students in those years. The pre-clerkship, clerkship, and fourth-year committees worked collaboratively and iteratively to review proposed milestones, ensure a developmental trajectory, and discuss the relevance of each area until consensus was reached. These institutional milestones were used as the foundation for developing our SP-based OSCE examination blueprint.

Development of the competency-based, longitudinal, developmentally-sequenced blueprint

The pre-defined institutional milestones provided a framework with which to develop a longitudinal competency-based blueprint for SP-based OSCE exams. The blueprint development process consisted of four steps over a period of months, each discussed below with examples from our experience: (A) analysis of the institutional competencies and milestones in relation to OSCEs; (B) determination of guiding principles for developing the blueprint; (C) creation of the blueprint; and (D) evaluation of blueprint usability.

1. Select a competency-based medical education (CBME) framework.
2. Dichotomize appropriate competencies into (a) knowledge and comprehension and (b) application, analysis, synthesis and evaluation.
3. Use the CBME framework to define milestones for each level of training.
4. Select milestones appropriate for assessment using OSCEs.
5. Prepare for writing the blueprint by mapping existing checklist items to selected milestones.
6. Compose final blueprint using guiding principles.

Guiding principles: (1) reflect the intent of institutional milestones and the chosen CBME framework (if applicable), with performance descriptions that are longitudinal, progressive, and successive; (2) use descriptors of observable, low-inference student behaviors; (3) ensure that the blueprint is interpretable by case authors to guide the development of case content and checklist items.

Figure 2. Method to develop a competency-based, longitudinal, developmentally-sequenced blueprint for objective structured clinical examinations.



A. Analysis of the institutional competencies and milestones in relation to OSCEs

We analyzed how the institutional milestones, developed to encompass the entire educational experience of the student, could be applied in SP-based OSCEs. The goal of this analysis was to determine whether student competence in selected milestones could be demonstrated in an OSCE and to create a compilation of milestones that are the most salient for clinical skills assessments. For a sample of OSCE cases from each year of medical school, we reviewed the narrative description of the case, instructions to the SP, assessment checklist items, and interstation exercises. For each case, we asked: can a student's competence in a milestone be assessed using this OSCE case? Four major lessons that were key to the creation of the examination blueprint emerged during this initial analysis:

(1) We learnt that some milestones, particularly those in the affective domain of Bloom's taxonomy (Bloom 1956), were unsuitable for assessment in an OSCE. Examples include the first and second year milestone under the professionalism competency: "Recognize when the needs of others diverge from one's own needs, and develop strategies to balance these." Some milestones were more readily assessed with other methods, for example, "Use a portfolio to document professional and personal development" and "Navigate the campus [electronic curriculum management platform] and other university technology systems."

(2) We found that the milestones most suitable for assessment with an SP-based OSCE examination described specific student behaviors that demonstrate competence in clinical skills. For example, the milestone "Perform a full physical exam on an adult patient in a logical sequence" is one that a well-designed SP checklist could assess.

(3) We realized that one of the major advantages of SP-based OSCEs is simultaneous demonstration of performance in multiple inter-related competency domains, such as medical knowledge and patient care. Conceptually, it can be useful to consider these competencies together in patient encounters, as competence in both domains is required for success in demonstrating competence in either domain. Therefore, we categorized these two domains together in a combined Medical Knowledge/Patient Care domain.

(4) Finally, we realized that milestones in the combined Medical Knowledge/Patient Care competency domain are best conceptualized as representing two separate levels of cognitive skills. These two categories correspond to the components of the cognitive domain of Bloom's taxonomy: knowledge and comprehension are the more basic and foundational skills, whereas application, analysis, synthesis, and evaluation are the more advanced and analytical skills. Dichotomizing the history taking, physical examination, and patient management subdomains of the combined Medical Knowledge/Patient Care competency domain makes explicit the need for progressive learning in both acquisition of knowledge and development of the more advanced analytical skills. For example, a third year Medical Knowledge/Patient Care milestone regarding physical examination states that the student should be able to perform a "clinically relevant, focused physical exam relevant to the discipline, patient complaint, and differential diagnosis," while the fourth year milestone states, "Focus or expand the physical exam based upon clinical presentation and differential diagnosis with a variety of patient presentations in a time efficient manner." Thus the level of knowledge and comprehension required could be similar, but more advanced skill is required in the fourth year milestone in application, analysis, synthesis and evaluation. On this basis, we dichotomized milestones under the Medical Knowledge/Patient Care domain into two separate levels: (a) knowledge and comprehension and (b) application, analysis, synthesis, and evaluation.

On the basis of these lessons, we operationalized the selected and integrated institutional milestones as a framework for creating the OSCE blueprint (Table 1).
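As one way to make this dichotomization explicit, the sketch below encodes milestones as records tagged with a competency domain, subdomain, training year, and cognitive level. The class and field names are hypothetical rather than part of the authors' blueprint, and the two example descriptions paraphrase the physical-examination milestones quoted above.

from dataclasses import dataclass
from enum import Enum

class CognitiveLevel(Enum):
    # The two Bloom's-taxonomy groupings used to dichotomize
    # Medical Knowledge/Patient Care milestones.
    KNOWLEDGE_COMPREHENSION = "knowledge and comprehension"
    APPLICATION_ANALYSIS = "application, analysis, synthesis and evaluation"

@dataclass(frozen=True)
class Milestone:
    domain: str           # e.g. "Medical Knowledge/Patient Care"
    subdomain: str        # e.g. "Physical examination"
    year: int             # year of medical school (1-4)
    level: CognitiveLevel
    description: str      # expected performance, from institutional milestones

# Two milestones paraphrased from the article's physical-exam example:
pe_year3 = Milestone(
    "Medical Knowledge/Patient Care", "Physical examination", 3,
    CognitiveLevel.APPLICATION_ANALYSIS,
    "Perform a clinically relevant, focused physical exam relevant to the "
    "discipline, patient complaint, and differential diagnosis.",
)
pe_year4 = Milestone(
    "Medical Knowledge/Patient Care", "Physical examination", 4,
    CognitiveLevel.APPLICATION_ANALYSIS,
    "Focus or expand the physical exam based upon clinical presentation and "
    "differential diagnosis with a variety of patient presentations in a "
    "time efficient manner.",
)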

B. Determination of guiding principles for developing the blueprint

With the blueprint framework in place, we determined guiding principles to map the existing milestones onto the longitudinal OSCE blueprint. Over multiple, iterative discussions among the authors and other core educators, three principles emerged.

(1) The blueprint must align closely with institutional milestones. This maintains concordance with the sequential nature of the milestones. The key characteristics were that the examination content must be progressive (a skill required to be demonstrated by a first year student should not be more advanced than one required of a fourth year student) and cumulative (when possible, it should be evident that competence in milestones from year to year builds on the skills of the prior year). Furthermore, to maximize the legitimacy of the blueprint, the blueprint content must reflect the intent of the institutional milestones: whenever possible, the blueprint should contain text and descriptors similar to those used in the original institutional milestones.

(2) The blueprint must use descriptions of low-inference behavior. It is important to translate the milestones into descriptions of behaviors that can be objectively demonstrated by the student in the course of the OSCE whenever possible. A potential source of variability and decreased reliability in OSCEs is rubrics that require a high degree of subjectivity for determination of competence (Brannick et al. 2011). While assessment reliability improves with multiple raters and multiple assessments of the same competencies across stations, checklist items that raters can easily understand and use are desirable. Discrete, low-inference behaviors, referred to as "critical actions" in one study, have been shown to correlate with performance (Payne et al. 2008). Therefore, assessment rubrics should be as low-inference as possible; optimally, the rater should be able to observe the presence or absence of a behavior that represents some aspect of competence. For example, under the Interpersonal and Communication Skills competency domain, a first year milestone states that the student "Listens to patient, avoids jargon, and expresses empathy when appropriate." In an SP-based OSCE in which the SP shares that she is in pain, one of the observable critical actions could be whether the student verbally expresses empathy to the patient. In contrast, a high-inference criterion that is less effective would read: "the student is empathetic" (see the sketch following this list).

(3) The blueprint must be of practical utility. Our final guiding principle stated that the blueprint must be interpretable by SP case authors to guide the development of case content as well as student assessment checklist items that define low-inference behaviors.

Figure 2 summarizes our method of applying the three guiding principles to the development of a competency-based, longitudinal, developmentally-sequenced OSCE blueprint.

    C. Creation of the blueprint

One author (SM) created the initial draft of the blueprint. Grounded in the pre-defined institutional milestones, one- to three-sentence descriptions of expected student performance for each year of medical school were composed using the blueprint guiding principles. Each description was then evaluated by three authors (SM, AC, and KEH) to determine alignment with the guiding principles. Over 10 in-person meetings, in addition to electronic written exchanges, the authors revised the blueprint until arriving at a final version that met all three principles (Table 2). The final examination blueprint fills in the last four columns of the blueprint framework, as shown in Table 2. Reading across each row shows a student's expected development of competence in each domain as they move through medical school; reading down each column provides an overview of competence in each domain for a given year of medical school.

    D. Evaluation of blueprint usability

We evaluated the usability of the blueprint by mapping the checklist items of the 14 current OSCE cases. Each existing checklist item was placed on the blueprint by examining the competency assessed and the fit with milestones by year of medical school. We found that the examination blueprint could easily be used to map checklist items to a competency domain and year of medical school. We scored each OSCE case for the number of assessment items in each year in each competency. Using this framework, we identified cases that appeared to be too advanced or too easy for the year in which they were used. For example, one first-year case which was known to be challenging had 74% of items mapped to third year milestones. We also found several competencies that were under-represented in the original cases, including Systems-Based Practice and Practice-Based Learning and Improvement.
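As an illustration of this scoring step, the sketch below computes, for each case, the fraction of checklist items mapped to each milestone year and flags cases whose items mostly map to a different year than the one in which the case is administered. The case names, item counts, and 50% threshold are invented for illustration; the article reports only that one first-year case had 74% of items mapped to third year milestones.

from collections import Counter

# Hypothetical mapping: for each case, the milestone year assigned to each
# of its checklist items during blueprint review.
cases = {
    "first_year_abdominal_pain": {"intended_year": 1,
                                  "item_years": [3, 3, 1, 3, 3, 2, 3, 3]},
    "third_year_chest_pain": {"intended_year": 3,
                              "item_years": [3, 3, 3, 2, 3, 4]},
}

def year_distribution(item_years):
    """Fraction of checklist items mapped to each milestone year."""
    counts = Counter(item_years)
    total = len(item_years)
    return {year: n / total for year, n in sorted(counts.items())}

def flag_mismatched_cases(cases, threshold=0.5):
    """Flag cases in which at least `threshold` of items map to a different
    year than the year the case is administered (an arbitrary cutoff)."""
    flagged = []
    for name, case in cases.items():
        for year, frac in year_distribution(case["item_years"]).items():
            if year != case["intended_year"] and frac >= threshold:
                flagged.append((name, year, frac))
    return flagged

for name, year, frac in flag_mismatched_cases(cases):
    print(f"{name}: {frac:.0%} of items map to year {year} milestones")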

    What to do next?

Future plans include comparing performance by advanced students (i.e. graduating) versus novice students (i.e. entering) on the same cases to confirm that, when applied, the examination blueprint reflects the developmentally-sequenced milestones. For example, beginning medical students on average should not score as well as senior medical students on cases that map to the fourth year milestones on the blueprint. The blueprint is also currently being used to develop a longitudinal case with multiple parts occurring over time, in which students will see the same standardized patient in different clinical scenarios across four years of medical school, with progressively more challenging expectations targeting milestones for each year. The blueprint is also being used to develop a systematized competency-based feedback tool to target areas in which students under-perform. By using the blueprint, this feedback tool will provide a personalized profile.

Table 1. Blueprint framework. Rows are competency domains and subdomains; columns MS1–MS4 hold the descriptions of competency for each year of medical school.

Patient Care/Medical Knowledge:
  History taking: knowledge and comprehension
  History taking: application, analysis, synthesis and evaluation
  Physical examination: knowledge and comprehension
  Physical examination: application, analysis, synthesis, and evaluation
  Patient management: knowledge and comprehension
  Patient management: application, analysis, synthesis, and evaluation
  Oral Case Presentation and Medical Notes

Practice-Based Learning and Improvement:
  Evidence-Based Medicine
  Reflection and Self-Improvement

Interpersonal and Communication Skills:
  Doctor-Patient Relationship
  Communication and Information Sharing with Patients and Families

Professionalism:
  Work Habits, Appearance, and Etiquette
  Professional Relationships
  Ethical Principles

Systems-Based Practice:
  Healthcare Delivery Systems
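One possible programmatic rendering of this framework treats the blueprint as a grid keyed by (domain, subdomain) rows and year columns, supporting the row-wise and column-wise readings described in section C. All names here are hypothetical, and milestone text is abbreviated since the completed blueprint (the article's Table 2) is not reproduced in this excerpt.

# Hypothetical skeleton of the Table 1 framework: rows are (domain, subdomain)
# pairs, columns are years MS1-MS4. Descriptions would be filled in from the
# institutional milestones.
YEARS = ("MS1", "MS2", "MS3", "MS4")

blueprint = {
    ("Patient Care/Medical Knowledge",
     "Physical examination: application, analysis, synthesis, and evaluation"): {
        "MS3": "Perform a clinically relevant, focused physical exam ...",
        "MS4": "Focus or expand the physical exam based upon clinical "
               "presentation and differential diagnosis ...",
    },
    # ... remaining (domain, subdomain) rows elided ...
}

def read_across(row_key):
    """Reading across a row: a student's expected development in one
    subdomain over the four years of medical school."""
    row = blueprint.get(row_key, {})
    return [(year, row.get(year, "")) for year in YEARS]

def read_down(year):
    """Reading down a column: an overview of expected competence in every
    subdomain for one year of medical school."""
    return {key: row[year] for key, row in blueprint.items() if year in row}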

