JBI Database of Systematic Reviews & Implementation Reports 2014;12(11) 221 - 272
Sears et al. Measuring competence in healthcare learners and healthcare professionals by comparing self-assessment with objective structured clinical examinations: a systematic review © the authors 2014 doi:10.11124/jbisrir-2014-1605 Page 221
Measuring competence in healthcare learners and healthcare professionals by comparing self-assessment with objective structured clinical examinations: a systematic review
Kim Sears, RN, PhD1
Christina M. Godfrey, RN, PhD1
Marian Luctkar-Flude, RN, MScN2
Liane Ginsburg, RN, PhD3
Deborah Tregunno, RN, PhD2
Amanda Ross-White, MLIS, AHIP4
1 The Queen's Joanna Briggs Collaboration for Patient Safety: a Collaborating Center of the Joanna
Briggs Institute; Queen's University, Kingston, Ontario, Canada
2 School of Nursing, Queen's University, Kingston, Ontario, Canada
3 York University, Toronto, Ontario, Canada
4 Bracken Library, Queen's University, Kingston, Ontario, Canada
Corresponding author:
Kim Sears
Executive summary
Background
The measurement of clinical competence is an important aspect of the education of healthcare
professionals. Two methods of assessment are typically described: the objective structured
clinical examination and self-assessment.
Objectives
To compare the accuracy of self-assessed competence of healthcare learners and healthcare
professionals with the assessment of competence using an objective structured clinical
examination.
Inclusion criteria
Types of participants
All healthcare learners and healthcare professionals including physicians, nurses, dentists,
occupational therapists, physiotherapists, social workers and respiratory therapists.
Types of intervention
Studies in which participants were first administered a self-assessment (related to
competence), followed by an objective structured clinical examination; the results of which
were then compared.
Types of outcomes
Competence, confidence, performance, self-efficacy, knowledge and empathy.
Types of studies
Randomized controlled trials, non-randomized controlled trials, controlled before and after
studies, cohort, case control studies and descriptive studies.
Search strategy
A three-step search strategy was utilized to locate both published and unpublished studies.
Databases searched were: Medline, CINAHL, Embase, ERIC, Education Research Complete,
Education Full Text, CBCA Education, GlobalHealth, Sociological Abstracts, Cochrane,
PsycInfo, Mosby’s Nursing Consult and Google Scholar. No date limit was used.
Methodological quality
Full papers were assessed for methodological quality by two reviewers working independently
using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review
Instrument (JBI-MAStARI).
Data collection
Details of each study included in the review were extracted independently by two reviewers
using an adaptation of the standardized data extraction tool from JBI-MAStARI.
Data synthesis
Meta-analysis was not possible due to methodological and statistical heterogeneity of the
included studies. Hence, study findings are presented in narrative form. The data were also
analyzed using ‘The Four Stages of Learning’ model by Noel Burch.
Results
The search strategy located a total of 2831 citations and 18 studies were included in the final
review. No articles were removed based on the critical appraisal process. For both
competence and confidence, the majority of studies did not support a positive relationship
between self-assessed performance and performance on an OSCE.
Conclusions
Study participants’ self-assessed competence or confidence was not confirmed by
performance on an objective structured clinical examination. Accurate self-assessment
may be threatened by overconfidence, and high performers tend to underestimate their ability.
It is theorized that this disparity may in part be due to the stage that the learner or professional
is in, with regard to knowledge and skill acquisition. Educators need to examine their
evaluation methods to ensure that they are offering a varied and valid approach to
assessment and evaluation. Notably, if self-assessment is to be used within programs, then
learners need to be taught how to perform consistent and accurate self-assessments.
Implications for practice
It is important that educators understand the limitations within the evaluation of competence.
Key aspects are the recognition of the stage that the learner is in with regard to skill
acquisition and equipping both learners and professionals with the ability to perform consistent
and accurate self-assessments.
Implications for research
There is a need for standardization of how outcomes are identified and measured in the area
of competence. Further, research is required to identify the appropriate level of an OSCE and
the appropriate number of stations.
Keywords
Self-assessment; objective structured clinical examinations; education; healthcare learners;
healthcare professionals
Background
Establishing the effectiveness of the health professional education process is complex and requires a
multifaceted approach to assess the outcomes.1 Typically, outcomes are assessed in terms of the
competence of the professional, level of confidence, performance and/or skills. Throughout the
literature on this topic, these terms are used interchangeably, but there is overlap and some terms
may encompass others. Descriptions/definitions of these terms are provided as follows.
Competence
In their paper that discusses the definition and assessment of professional competence, Epstein and
Hundert define professional competence as: “the habitual and judicious use of communication,
knowledge, technical skills, clinical reasoning, emotions, values and reflection in daily practice for the
benefit of the individual and community being served”.2(p.226)
Confidence
Holland’s concept analysis of professional confidence describes four components: affect
(feelings associated with action), reflection (thoughtfully examining one’s actions and intentions), higher
cognitive functioning (which includes aspects such as learning and integration of concepts, decision
making, attention, motivation and memory) and action.3
Performance
The online Merriam-Webster dictionary defines performance as the execution of an action or
something accomplished: a deed or feat.4
Skill
Skill is defined as proficiency, facility, or dexterity that is acquired or developed through training or
experience.5
Self-efficacy
Self-efficacy is defined as people's beliefs about their capabilities to produce designated levels of
performance that exercise influence over events that affect their lives.6
Knowledge
Knowledge is defined as: (1) the fact or condition of knowing something with familiarity gained
through experience or association; (2) acquaintance with or understanding of a science, art, or
technique.7
Empathy
Empathy is defined as the ability to understand and share the feelings of another.8 Empathy is an
essential quality for health professionals and is often measured along with other skills and
competence.
Looking at the above definitions of competence, confidence and performance, it is clear that there is
considerable overlap. The term competence was found to be the most inclusive. Given the nuances
involved in each term, this review refers to two main concepts: that of competence (including
knowledge and performance) and that of confidence (including self-efficacy). Further, the findings
were examined for healthcare learners and healthcare professionals.
There are a variety of ways to measure health professionals’ competence. Although Objective
Structured Clinical Examinations (OSCEs) are considered to be valid and reliable for assessing
clinical skills, they are labor-intensive and costly to design and implement.9,10
By comparison, self-assessment requires fewer resources to implement; however, it is unclear whether self-assessment is
an effective measure of competence and how well it correlates with actual performance.
Self-assessment has been defined as: “the evaluation or judgment of ‘the worth’ of one’s performance
and the identification of one’s strengths and weaknesses with a view to improving one’s learning
outcomes.”11(p.146) For example, self-reported patient safety competence may provide data about
learners’ insights into, and the likely safety of, their own practice,12 and about their perceived strengths or
limitations.13,14
The value of using more objective methods to assess competence is unclear. Recent
studies examining self- versus expert assessment of technical and non-technical skills have produced
mixed results. Surgeons seem to be able to accurately assess their own technical skills but not their
non-technical skills;15 however, an earlier study of junior medical officers found no correlation between
their self-assessments of confidence and their measured competencies on routine procedural skills.16
A 2006 systematic review examined how accurately physicians subjectively evaluated their own
competence compared with external observations of their competence.17
Davis and colleagues
concluded that physicians have a limited ability to accurately self-assess.17
This may be particularly
true among those rated as the least skilled and those who were the most confident. These results
were found to be consistent with other professions.18
The OSCE is another method that has been shown to be a useful means of assessing the competence of a
learner. Typically, an OSCE consists of a specific scenario established by the examiners that requires
the learner to demonstrate their proficiency in that area. The evaluator can control the environment
and standardize the patient and in this manner use the OSCE to objectively assess competencies (i.e.
knowledge, attitudes and behaviors). There is growing recognition that OSCEs are appropriate for
evaluating the interpersonal skills associated with breaking bad news or cross-cultural interviewing.19
The use of the OSCE to assess physician communication skills is also becoming more common.20-23
In the realm of patient safety, there is a small but emerging body of literature encouraging the use of
OSCEs to assess aspects of patient safety competence among medical trainees.24-30 In this area,
most OSCEs assess the technical aspects of patient safety or quality improvement competence,27,29-31
or clinical aspects of patient safety such as hand hygiene compliance and medication labeling.27 Few
studies describe the use of OSCEs to assess socio-cultural aspects of patient safety,26,28 and those
that do tend to focus on communicating/disclosing an error and are discipline-specific in nature.32,33
In nursing, a recent integrative review by Walsh and colleagues located 41 papers and identified
major gaps regarding the psychometrics of nursing OSCEs.34
In concluding their review, the
researchers highlighted the need for additional research on using the OSCE as an evaluative tool in
nursing.
The OSCE is thought to be a more objective measure than self-assessment. However, the few
examinations of the extent to which OSCE performance predicts outcomes on other performance
metrics are somewhat equivocal. Some studies have failed to detect a significant positive relationship
between OSCE performance and other forms of summative evaluations of health profession
learners.25
A study by Tamblyn found that scores achieved in a patient-physician communication and
clinical decision-making OSCE that was part of a national licensing examination predicted complaints
to medical regulatory authorities up to 10 years later.35
In an environment where providing optimal student learning and quality patient care is the goal, there is
a need to explore whether a link exists between self-assessment scores and OSCE scores, with a view
to providing the best learning by the most affordable means. It has been noted that some studies
comparing self- and external assessments of competence (such as the OSCE) have had several
methodological problems. Davis and colleagues report that fewer than half of the studies they
included in their systematic review: (1) used pretested or validated OSCEs or standardized patients or
assessment instruments, or (2) described objective criteria for performance assessment.17
Others
have noted that there is insufficient methodological detail in most published research involving
standardized patients (SP), in particular details pertaining to SP characteristics and their training.36
An examination of the Cochrane Library of Systematic Reviews, the JBI Database of Systematic
Reviews and Implementation Reports and the PROSPERO database indicates that no systematic
reviews have been completed (or proposed) on this topic since the Davis review in 2006.17
Building
on the Davis review, which focused solely on physicians, this systematic review explored research that
included all healthcare learners and healthcare professionals and examined the relationship between
self-assessed competence and objective assessments of competence using the OSCE. As this
review overlaps the time period covered by Davis, it is worth noting that three studies (Fox,37
Barnsley,16 and Leopold38) from the Davis systematic review also met the inclusion criteria for this study. The
proposed synthesis is part of a broader program of research which builds on recommendations from
numerous international bodies regarding the need to restructure health professional education to
ensure it equips learners with the knowledge, skills and attitudes they need to function safely.1,39-42
Notably, there is also recognition that what is evaluated drives what is taught and learnt.43,44
Accordingly, development of an OSCE for adoption by various health professional education
programs may be crucial for truly integrating patient safety into health professional education. Just as
written examinations and OSCEs assess different things,45,46 so do subjective and objective
assessments; however, both are understood to yield important data.12
The objectives, inclusion criteria and methods of analysis for this review were specified in advance
and documented in a protocol.47
Review question/objective
The objective of this systematic review was to compare the use of self-assessment instruments to an
OSCE to measure the competence of healthcare learners and healthcare professionals. The question
used to guide the review was: when measuring the competence of healthcare learners and healthcare
professionals, is the evaluation of competence using self-assessment instruments comparable to
using an OSCE?
Inclusion criteria
Types of participants
This review considered all healthcare learners and healthcare professionals including but not limited
to physicians, nurses, dentists, occupational therapists, physiotherapists, social workers and
respiratory therapists.
Types of intervention
This review considered studies in which participants were first administered a self-assessment
(related to competence), followed by an OSCE; the results of which were then compared.
Types of outcomes
This review considered studies that included the following outcome measures: competence,
confidence, performance, self-efficacy, knowledge and empathy as defined above.
Types of studies
This review considered both experimental and epidemiological study designs including randomized
controlled trials, non-randomized controlled trials, quasi-experimental, before and after studies,
prospective and retrospective cohort studies, case control studies and analytical cross-sectional
studies for inclusion.
Descriptive epidemiological study designs including case series, individual case reports and
descriptive cross-sectional studies were also considered for inclusion.
Search strategy
The search strategy aimed to find both published and unpublished studies (Appendix I). A three-step
search strategy was utilized in this review. An initial limited search of MEDLINE and CINAHL was
undertaken, followed by an analysis of the text words contained in the title and abstract and of the
index terms used to describe the article. A second search using all identified keywords and index
terms was then undertaken across all included databases. Thirdly, the reference lists of all identified
reports and articles were searched for additional studies. This review only included studies published
in English. In order to provide a broader picture of all available literature on this topic, the
non-English literature was tallied (but not translated). Although this review builds on a review
published in 2006, the search included articles from the inception of each database in order to be thorough.
The databases searched included:
Medline, CINAHL, Embase, ERIC, Education Research Complete, Education Full Text, CBCA
Education, GlobalHealth, Sociological Abstracts, Cochrane, Mosby’s Nursing Consult and PsycInfo.
The search for unpublished studies included Dissertation Abstracts and Google Scholar.
Initial keywords used were: OSCE; objectiv$ structur$ clinic$ exam$; self-assessment; self-report;
competence; confidence; self-efficacy.
Method of the review
Assessment of methodological quality
Quantitative papers selected for retrieval were assessed by two independent reviewers for
methodological validity prior to inclusion in the review using standardized critical appraisal instruments
from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-
MAStARI) (Appendix II). Any disagreements that arose between the reviewers were resolved through
discussion, or with a third reviewer. Studies reporting experimental/ quasi-experimental designs were
assessed using the JBI Critical Appraisal Checklist for Randomized Control/Pseudo-randomized
Trials. Studies reporting descriptive designs were assessed using the JBI Critical Appraisal Checklist
for Descriptive/Case Series.
Data collection
Data were extracted from papers included in the review using an adapted form of the standardized
data extraction tool from JBI-MAStARI (Appendix III). The adapted tool (Appendix IV) included
specific details about the interventions, populations, study methods and outcomes of significance to
the review question and specific objectives.
Data synthesis
Statistical pooling was not possible; therefore, the findings are presented in narrative form, including
tables and figures to aid data presentation where appropriate. The strength of the reported
correlations (r values) was interpreted in accordance with the method described by Munro, applied to
both positive and negative correlations: little if any correlation (r = .00 to .25); low (r = .26 to .49);
moderate (r = .50 to .69); high (r = .70 to .89); and very high (r = .90 to 1.00).48 The data were also
analyzed using ‘The Four Stages of Learning’ model by Noel Burch.49
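Munro's bands can be expressed as a simple lookup over the absolute value of r. The following is a minimal sketch of that mapping (the function name and structure are illustrative, not part of the review's method):

```python
def munro_strength(r: float) -> str:
    """Classify a correlation coefficient using Munro's bands.

    The absolute value is used so that positive and negative
    correlations of the same magnitude are treated alike.
    """
    if not -1.0 <= r <= 1.0:
        raise ValueError("r must lie between -1 and 1")
    magnitude = abs(r)
    if magnitude < 0.26:
        return "little if any"   # r = .00 to .25
    elif magnitude < 0.50:
        return "low"             # r = .26 to .49
    elif magnitude < 0.70:
        return "moderate"        # r = .50 to .69
    elif magnitude < 0.90:
        return "high"            # r = .70 to .89
    else:
        return "very high"       # r = .90 to 1.00
```

For example, `munro_strength(-0.55)` returns `"moderate"`, reflecting that the bands apply to negative correlations as well.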
Results
Characteristics of included studies
The search strategy located a total of 2831 citations (Figure 1). Of this set, 698 duplicates were
removed and 127 articles were retrieved for full review. One hundred and nine articles were excluded
on the basis of not meeting the inclusion criteria. No articles were removed based on the critical
appraisal process, leaving a final set of 18 included studies.
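The screening flow reported above (and in Figure 1) is internally consistent, as a quick arithmetic check shows (a sketch; the variable names are illustrative, not from the review):

```python
# Arithmetic check of the screening flow (counts taken from the review).
retrieved = 2831          # citations located by the search
duplicates = 698          # duplicates removed
screened = retrieved - duplicates          # reviewed at title/abstract level
excluded_metadata = 2006  # excluded at metadata level: not on topic
full_text = screened - excluded_metadata   # retrieved for full-text review
excluded_full_text = 109  # excluded: not meeting inclusion criteria
included = full_text - excluded_full_text  # final included set

assert screened == 2133
assert full_text == 127
assert included == 18
```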
Eighteen studies compared learners’ self-assessment of competence with performance on an OSCE
(Appendix IV and Appendix V). Within the review, there was a wide range of study designs:
randomized controlled trials (two studies),50,51 quasi-experimental (five studies),52-56 and descriptive
(11 studies).16,37,38,57-64 Of the descriptive studies, six were cross-sectional16,57-59,61,63 and two were
correlational.37,38 Eight studies were conducted in the U.S.A.,38,50,51,57-59,61,62 four in the U.K.,37,53,60,64
three in Canada,52,54,55 and one each in Australia,16 Malaysia,56 and the Netherlands.63 No non-English
studies were located on this topic. Publication dates spanned 2000-2012.

The studies focused on a variety of healthcare learners and healthcare professionals. Thirteen studies
examined medical learners,16,37,51,53,56,57,59,61,62 of which five evaluated residents;16,37,54,58,62 two studies
examined nursing learners,52,55 and one explored pre-registration nurses and midwives.60
Of the studies that explored healthcare professionals, one examined nurses and medical assistants,50
and one examined multidisciplinary practitioners.38 One study examined both healthcare learners and
practitioners.63 Self-reported competence was measured by a range of instruments, including: an
11-point scale,64 a 10-point scale,38 a 7-point scale,52,53,57,59 a 6-point scale,61 a 5-point scale,37,51,54,55
a 4-point scale,16,58,60,62 a four-category 17-item behavioral checklist,58 two sub-scales with 18 items,50
a 40-item inventory,56 and a five-item scale.63
Figure 1: Search decision flow diagram

Records retrieved per database: Medline 801; CINAHL 323; Embase 898; PsycInfo 159; ERIC 267;
Education Full Text 179; CBCA Education 1; GlobalHealth 47; Sociological Abstracts 150; Cochrane 2;
hand search/grey literature 4.

Total number of articles retrieved from searching: 2831. Duplicates removed: 698. Articles reviewed at
metadata level (title/abstract): 2133. Excluded at metadata level (not on topic): 2006. Articles reviewed
in full text: 127. Excluded after full-text review (not meeting inclusion criteria): 109. Articles reviewed
for critical appraisal: 18. Excluded for not meeting critical appraisal criteria: 0. Final number of articles
included in review: 18.
Assessment of methodological quality
Given the paucity of research in this area, a cut-off score of four was established for each critical
appraisal checklist (Tables 1a and 1b). Critical appraisal scores ranged from four to six out of nine for
descriptive studies, or four to six out of 10 for experimental studies; hence all eligible studies were
included. Only 22% of the included studies on this topic were experimental/quasi-experimental in
design; therefore, implications for inference and inherent bias must be considered. Four studies
reported experimental/quasi-experimental methods involving comparisons between an experimental
and control group.50,51,53,55
Studies were evaluated according to the research design specified by the
author. However, regardless of the design, the results that pertained to the comparison of self-
assessment and OSCEs tended to be descriptive in nature.
Table 1a: Assessment of methodological quality for experimental studies
Citation                     Q1  Q2  Q3  Q4  Q5  Q6  Q7   Q8   Q9   Q10
Ault et al. (2002)           Y   U   U   N   Y   U   Y    Y    Y    Y
Dornan et al. (2003)         N   N   N   U   N   N   Y    Y    Y    Y
Doyle et al. (2010)          U   N   U   U   Y   Y   Y    Y    Y    Y
Luctkar-Flude et al. (2012)  N   N   N   N   U   U   Y    Y    Y    Y
%                            25  0   0   0   50  25  100  100  100  100
Table 1b: Assessment of methodological quality for descriptive studies
Citation                           Q1   Q2   Q3  Q4   Q5   Q6   Q7   Q8   Q9
Barnsley et al. (2004)             N    Y    U   Y    N/A  N/A  N    Y    Y
Baxter & Norman (2011)             N    Y    Y   Y    N/A  N/A  U    Y    Y
Berg et al. (2011)                 N    Y    U   Y    Y    N/A  N    Y    Y
Biernat et al. (2003)              N    Y    N   Y    N/A  N/A  N/A  Y    Y
Chen et al. (2009)                 N    Y    Y   Y    Y    N/A  N/A  Y    Y
Fox et al. (2000)                  N    Y    U   Y    N/A  N/A  N    Y    Y
Langhan et al. (2009)              N    Y    N   Y    N/A  Y    N    Y    Y
Lauder et al. (2008)               N    Y    N   Y    Y    N/A  N/A  Y    Y
Leopold et al. (2005)              N/A  Y    U   Y    N/A  N/A  N    Y    Y
Lukman et al. (2009)               N    Y    N   Y    N/A  Y    N    Y    Y
Mavis et al. (2001)                N    Y    Y   Y    N/A  N/A  Y    Y    Y
Parish et al. (2006)               N    Y    N   Y    N/A  N/A  N    Y    Y
Turner et al. (2009)               N    Y    N   Y    N/A  N/A  N/A  Y    Y
Vivekenanda-Schmidt et al. (2007)  Y    Y    Y   Y    N/A  N/A  N    Y    Y
%                                  7    100  29  100  21   14   7    100  100
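The percentage row of Table 1b appears to be the share of studies rated "Y" on each question out of all 14 descriptive studies, with N/A responses counted in the denominator. The following sketch checks this interpretation (ratings transcribed from the table; study labels abbreviated):

```python
# Recompute the % row of Table 1b from the per-study appraisal ratings
# (Y = yes, N = no, U = unclear, N/A = not applicable).
ratings = {
    "Barnsley 2004":            "N Y U Y N/A N/A N Y Y",
    "Baxter & Norman 2011":     "N Y Y Y N/A N/A U Y Y",
    "Berg 2011":                "N Y U Y Y N/A N Y Y",
    "Biernat 2003":             "N Y N Y N/A N/A N/A Y Y",
    "Chen 2009":                "N Y Y Y Y N/A N/A Y Y",
    "Fox 2000":                 "N Y U Y N/A N/A N Y Y",
    "Langhan 2009":             "N Y N Y N/A Y N Y Y",
    "Lauder 2008":              "N Y N Y Y N/A N/A Y Y",
    "Leopold 2005":             "N/A Y U Y N/A N/A N Y Y",
    "Lukman 2009":              "N Y N Y N/A Y N Y Y",
    "Mavis 2001":               "N Y Y Y N/A N/A Y Y Y",
    "Parish 2006":              "N Y N Y N/A N/A N Y Y",
    "Turner 2009":              "N Y N Y N/A N/A N/A Y Y",
    "Vivekenanda-Schmidt 2007": "Y Y Y Y N/A N/A N Y Y",
}
n_studies = len(ratings)  # 14
percentages = []
for q in range(9):  # Q1..Q9
    yes = sum(1 for row in ratings.values() if row.split()[q] == "Y")
    percentages.append(round(100 * yes / n_studies))
print(percentages)  # matches the table's % row: [7, 100, 29, 100, 21, 14, 7, 100, 100]
```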
In the unlikely event that the OSCE evaluators were aware of students’ self-assessed grades prior to
the OSCE, the results could have been compromised; however, no studies provided this level
of detail. Further, the quality of the evidence may have been limited, as some studies did not provide
in-depth descriptions of methods, outcomes and use of statistical tests; an omission also noted by
Davis et al.17 Additionally, some studies did not report exact correlation data for their findings, but
rather used terms such as ‘limited’ or ‘moderate’.
Many of the assessment tools were descriptive in nature, and only six of the 18 studies reported on
the validity of their tools.52,53,55,60,63,64 Five studies reported on the reliability of the OSCE
assessment,16,37,60,62,64 and 10 studies reported on the reliability of at least one tool.50-55,57,60,61,64
Additionally, not all studies reported precise correlation results; some reported the correlation
as ‘limited’ or ‘mild’ but did not provide the data.37,53,56,62 Correlational analysis is the appropriate
statistical analysis to ascertain the relationship between self-assessment and OSCE performance.
When studies were not precise, the preset limits proposed by Munro were used to quantify the
levels of limited (negligible, r < 0.26), mild (low, r = 0.26-0.49), or moderate (r = 0.50-0.69)
correlations.48 One study conducted a subgroup analysis to compare learners to professionals,63 and
two studies compared genders,38,64 which are potential confounders of the overall study results. Criteria 5
and 6 of the appraisal tool for descriptive studies were not applicable to many studies in the review, as
these studies did not involve between-group comparisons and only reported cross-sectional correlations with
no follow-up conducted.
Findings of the review
As described in the background, there is an overlap between the concepts of confidence and
competence and these terms are frequently used interchangeably in the literature. As a consequence,
this review referred to two main concepts: that of competence (including knowledge and performance)
and confidence (including self-efficacy) (see Appendix VI for characteristics of included studies). For
both competence and confidence the majority of studies did not support a positive relationship
between self-assessed performance and performance on an OSCE.
Competence
The concept of competence included both knowledge and performance. Of the 18 studies involved in
this review, seven focused on competence, including performance and skill,37,53,54,57-59,62 and two
focused on both competence and confidence.52,60 Of the studies that compared self-report to the
results of an OSCE in the area of competence, two identified a negative correlation,52,58 four identified
no correlation,37,53,60,62 two identified a negligible (small) correlation,57,59 and only one reported low
(mild) to moderate correlations (Table 2).54
Confidence
The concept of confidence included self-efficacy. Of the 18 studies involved in this review, nine
focused on confidence,16,38,50,51,55,56,61,63,64 and two examined both competence and confidence.52,60
Five studies compared OSCE performance to learner self-assessment of confidence, and six
compared OSCE performance to learner self-assessment related to self-efficacy. Of the studies that
compared self-report to the results of an OSCE in the area of
confidence, three identified a negative correlation,38,52,63 four no correlation,16,50,60 three a negligible
(small) correlation,38,51,60 and three a low (mild) correlation.55,63,64 No studies reported moderate or
high correlations between confidence and OSCE performance.
Competence and Confidence
Of the three studies that examined both self-reported competence and confidence as compared to the
results from an OSCE, two identified a negative correlation,38,52 and one identified no correlation.60
Self-Assessment
A variety of concerns about the process of self-assessment were raised. For example, Lauder et al.
identified concerns related to the measurement of self-reported competence.60 In particular, they
identified issues related to variation in the reliability of instruments, and suggested expanding the
range of components included in a self-reported assessment. The inability of participants to identify
their own weaknesses,60 or to form an accurate perception of their abilities,62 has also raised
concerns.16,37,38,50,52,58,60 Ultimately, Lauder et al. cast doubt on the value of self-assessment and the
benefit of correlating self-assessment with the assessment of performance on an OSCE.60
Baxter and Norman52
caution that the use of a self-assessment tool in nursing education to evaluate
clinical competence requires serious examination and a clear rationale for its use. They note that
overconfidence related to inaccurate self-assessment may lead to negative outcomes for new
graduates and that overconfident new graduates may threaten patient care.
Examination of healthcare professionals and healthcare learners
As the review performed by Davis et al. focused solely on physician competence, a post hoc analysis was conducted to explore whether results would differ for other healthcare professionals or healthcare learners.17 Two studies focused on healthcare professionals: one found no correlation50 and one demonstrated a negative correlation.38 Turner et al. examined both healthcare learners and healthcare professionals but did not differentiate between the findings of the learners and the professionals;63 they did, however, report that there was no correlation between self-efficacy and the OSCE.
In total, 15 studies examined healthcare learners. The results for healthcare learners indicated either no correlation,37,50,53,56,61,62 a negative correlation,16,52,58,59 a weak64 or limited correlation,51 a low (mild) to moderate correlation,54 a moderate correlation,55,57 or no significant association between self-assessment and OSCE performance.60

An inference may be drawn from the healthcare professionals' and healthcare learners' data that insight into one's competence or confidence does not increase with experience in the profession. However, this interpretation needs to be accepted with caution given the small number of studies examining healthcare professionals and the quality of the evidence.
Table 2: Comparison of self-assessment and OSCE (n=18)

| Author/year/country | Domain | Healthcare professionals/learners | Review results |
|---|---|---|---|
| Luctkar-Flude et al.55 (2012) Canada | Confidence | Learners: undergraduate nursing | Low correlation between self-confidence & performance: significant correlations with secondary medication scores (Pearson correlation: r=.309, p=.044) & total performance scores (r=.368, p=.015). |
| Baxter & Norman52 (2011) Canada | Confidence & competence | Learners: undergraduate nursing | Negative correlation: all but one of 16 correlations were negative (inversely related to actual performance). Only two of the self-assessment items were significantly correlated with OSCE total scores: post-test level of confidence dealing with acute care situations (Pearson correlation: r=-0.405, p<0.01); ability to manage a crisis situation (r=-0.430, p<0.01). |
| Berg et al.57 (2011) USA | Competence (empathy) | Learners: undergraduate medical | Negligible correlation between self-reported empathy & assessments by standardized patients (Pearson correlation: r=0.19, p<0.05). |
| Chen et al.59 (2010) USA | Competence (empathy) | Learners: medical | Negligible overall correlation between self-reported empathy (JSPE-S) scores and OSCE empathy (r=0.22; p<0.001). Learners in their 2nd year had a higher perceived self-assessment compared to the OSCEs; by their 3rd year their self-assessment was lower than their OSCEs. |
| Doyle et al.50 (2010) USA | Confidence (self-efficacy) | Healthcare professionals (nursing, medical) | High magnitude of self-efficacy improvement did not correlate with improved performance: on self-assessment there was significant improvement for self-efficacy (ANCOVA: F=24.43, p<0.001); on performance there was no significant improvement between intervention and control groups (F=3.46, p=0.073). |
| Langhan et al.54 (2009) Canada | Competence (knowledge & performance) | Learners: medical residents | Low to moderate correlations between expert assessment total rating scores and residents' self-assessed knowledge (Pearson correlation: r=0.40, p<0.05) & clinical skills scores (r=0.51, p<0.01). |
| Lukman et al.56 (2009) Malaysia | Confidence (self-efficacy) | Learners: undergraduate medical | No correlation between self-efficacy & performance (actual correlational data not provided). |
| Turner et al.63 (2009) Netherlands | Confidence (self-efficacy) | Practitioners: pediatric & anaesthesia trainees & specialists | Little correlation between self-efficacy & OSCE performance. No differentiation made between healthcare learners & professionals. Significant correlation between self-reported self-efficacy and only two of seven OSCE performance scores: global resuscitation score (Spearman correlation: r=0.467, p=0.002) and time to intention to intubate (r=-0.642; p<0.001). |
| Lauder et al.60 (2008) UK | Competence & confidence (self-efficacy) | Learners: pre-registration nursing & midwifery | Negligible significant association between confidence (self-efficacy) and the drug calculation OSCE (Spearman correlation: r=0.239, p=0.028); no significant associations between self-reported competence & any of the OSCE scores. |
| Vivekenanda-Schmidt et al.64 (2007) UK | Confidence | Learners: undergraduate medical | Correlations between self-assessment & OSCE scores were non-significant: total scores (Pearson correlation: r=0.13, p=0.11), shoulder assessment scores (r=0.16, p=0.15), knee scores (r=0.13, p=0.21); however, significant but weak correlations were noted for two measures in female learners only: total scores (r=0.22, p=0.04) & shoulder scores (r=0.29, p=0.03). |
| Parish et al.62 (2006) USA | Competence | Learners: medical residents | No correlations between summary OSCE scores & either interest or competence (actual correlational data not provided). High performers underestimated their competence. Residents' self-assessed competence was not associated with OSCE performance. |
| Leopold et al.38 (2005) USA | Confidence | Practitioners: multidisciplinary | Negative correlation: before instruction, participants' confidence was significantly but inversely related to competent performance (r=-0.253, p=0.02); that is, greater confidence correlated with poorer performance. Both men & physicians displayed higher pre-instruction confidence (p<0.01) that was not correlated with better performance. After instruction, confidence correlated with objective competence in all groups (r=0.24, p=0.04); men & physicians disproportionately overestimated their skills both before & after training, a finding that worsened as confidence increased. |
| Barnsley et al.16 (2004) Australia | Confidence | Learners: medical residents | No correlation: application of the Wilcoxon signed rank test confirmed a true difference between self-reported confidence scores and the assessed competence scores for seven procedural skills: venipuncture (Z=14.556, p=0.001), IV cannulation (Z=-3.598, p=0.001), CPR (Z=-4.371, p=0.001), ECG (Z=-3.306, p=0.001), catheterization (Z=-4.769, p=0.001), NG tube insertion (Z=2.737, p=0.01), blood cultures (Z=-3.974, p=0.001). Thus there was no association between any of the self-assessments and actual performance. For all skills, OSCE performances were lower than self-reported confidence scores. |
| Biernat et al.58 (2003) USA | Competence | Learners: medical residents | Mixed results: self-assessment and rater assessments of competence were congruent for six of seven categories (communication, history of present illness, past medical history, social history, mini mental status and geriatric depression scale) and significantly different for the seventh, functional assessment (p<0.01), in which residents rated themselves higher than the assessors. However, when self-ratings were compared to assessors' individual item ratings within the categories, large discrepancies were revealed. There was a negative association between the self-assessment scores & the results from the OSCE. |
| Dornan et al.53 (2003) UK | Competence (real patient learning) | Learners: undergraduate medical | Assessment results did not correlate with real patient learning (actual correlational data not provided). |
| Ault et al.51 (2002) USA | Confidence (self-efficacy) | Learners: undergraduate medical | Limited association between self-report examination & OSCE. Students who participated in the breast exam workshop reported higher self-efficacy related to their breast exam skills (t=10.72, p<0.05) and performed significantly higher in clinical exam skills (t=-2.99, p<0.05) than students who did not attend the workshop (actual correlational data not provided). |
| Mavis et al.61 (2001) USA | Confidence (self-efficacy) | Learners: undergraduate medical | Learners with high self-efficacy were more likely to score above the mean OSCE performance compared to low self-rated learners (71% versus 51%); however, self-efficacy was not significantly correlated with OSCE performance (r=0.12, p>0.05). |
| Fox et al.37 (2000) UK | Competence | Learners: medical residents | There were no significant correlations between skills performed at OSCE stations & participants' self-ratings (actual correlational data not provided). |
Discussion
The findings of this review suggest that the evaluation obtained by self-assessment instruments is not
comparable to performance on an OSCE. Notably, there are concerns emerging from the literature as
to the validity and reliability of self-report, as well as the standardization of measurement using an
OSCE.
Currently, in the practice of health education there is no gold standard method for the evaluation of competence. The OSCE is thought to provide a closer depiction of a student's competence, but there is no standardization across programs, years or segments of learning to identify when it is appropriate to use OSCEs, how many stations should be involved, and what content lends itself best to this form of evaluation. There is no consistency in terms of measurement for either an OSCE or self-assessment. Self-assessment was measured by different instruments, including scales with a variety of categories and items (ranging from a 4-point scale to a 40-item inventory). Similarly, OSCEs lacked standardization in the number of stations used and the method of evaluation (range: 1-17 stations).
Using the criteria described by Munro, as previously outlined in the data synthesis section, the overall findings from this review demonstrate that of the 18 studies examined, four found negative correlations, seven found no correlation, and seven found positive correlations that ranged from negligible to moderate in strength.48 This finding is similar to the review conducted by Davis et al., who found that of the 20 studies on physicians included in their review, 13 demonstrated little, no or an inverse relationship between self-assessment and other indicators, while seven demonstrated a positive relationship between self-assessment and external observations.
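The Munro-based synthesis above can be sketched as a simple banding rule applied to each reported coefficient. The numeric cutoffs below are assumptions chosen to approximate the descriptors used in this review (negligible, low, moderate, high), not Munro's published table:

```python
def classify_correlation(r: float) -> str:
    """Band a correlation coefficient using cutoffs assumed to approximate
    the Munro-style descriptors applied in this review (the exact bands
    are an assumption, not taken from the review)."""
    if r < 0:
        return "negative"
    if r < 0.26:
        return "negligible (small)"
    if r < 0.50:
        return "low (mild)"
    if r < 0.70:
        return "moderate"
    return "high"

# Coefficients reported in Table 2, banded with the assumed cutoffs:
for study, r in [("Berg et al.", 0.19), ("Luctkar-Flude et al.", 0.368),
                 ("Langhan et al.", 0.51), ("Baxter & Norman", -0.405)]:
    print(study, "->", classify_correlation(r))
```

With these assumed cutoffs, the banding reproduces the labels the review attaches to those coefficients (negligible, low, moderate and negative, respectively).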
Interpretation according to the four stages of learning model
The findings were also interpreted using the ‘Four Stages of Learning’ theory developed by Noel
Burch (Gordon Training International) in the early 1970s.49
The model comprises four stages: stage 1, unconscious incompetence (do not know what they do not know); stage 2, conscious incompetence (know what they do not know); stage 3, conscious competence (know what they know); and stage 4, unconscious competence (knowledge becomes a part of one's being, almost unconscious – the expert). According to Burch, everyone progresses through the same four stages
regardless of the skill that needs to be acquired. From the beginning stage of acquiring a new skill the
learner progresses from stage 1 towards stage 4 as they gain experience and their level of
competence increases. This systematic review focused on the assessment of competence, thus this
model provided a useful framework to interpret the levels of competence demonstrated by the
participants in the included studies. Three reviewers read through each article independently and
interpreted the study findings with respect to the Burch framework in order to align the findings with
the stages of the framework. Alignment was based on the stage of competence of the participant
learners. Consensus was reached for interpretation of each article (Table 3).
The stage of competence assigned by the reviewers was based on the level of agreement between
the participants’ self-assessed level of competence and their actual performance on an OSCE. For
example, if participants’ self-assessment scores were high but their performance scores were low,
these findings were interpreted as being representative of the unconscious incompetence stage of the
framework.
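The reviewers' alignment rule can be sketched as a decision table over the agreement between self-assessed and observed performance. The binary high/low coding below is a hypothetical simplification of the reviewers' consensus process, not their actual coding scheme:

```python
def burch_stage(self_assessed_high: bool, osce_high: bool) -> str:
    """Map agreement between self-assessment and OSCE performance onto a
    stage of Burch's four-stage model (illustrative interpretation only)."""
    if self_assessed_high and not osce_high:
        # Overrates own ability: unaware of shortcomings.
        return "stage 1: unconscious incompetence"
    if not self_assessed_high and osce_high:
        # Strong performer who underrates self.
        return "stage 2: conscious incompetence"
    if self_assessed_high and osce_high:
        # Self-assessment congruent with skilled performance.
        return "stage 3: conscious competence"
    # Low/low agreement is accurate but unskilled; no stage is assigned here.
    return "stage of model unclear"

print(burch_stage(True, False))  # the pattern interpreted as stage 1
```

Stage 4 (unconscious competence) cannot be distinguished by this rule alone, which matches the review's observation that no included study followed participants to that point.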
According to Burch, the skill that one is attempting to acquire is irrelevant, as everyone has to progress through the same four stages. Burch identified that by being aware of these stages, one can better anticipate and accept that learning can be a slow and frequently uncomfortable process.

In stage 1 (unconscious incompetence), learners are unaware of their own shortcomings.49 In this stage, learners have little knowledge of the extent of the skills and abilities they will need to master and do not fully realize the limits of their existing knowledge base. According to the interpretation of the findings of this review, more novice learners thought they were doing better than they were and lacked a deeper level of self-awareness. In six studies, the participants were at this stage of competence.16,37,38,50,52,58,60
Stage 2 (conscious incompetence) - learners are aware of their former state of ignorance/naïveté and
begin to process how much material/skill/knowledge they will need to absorb.49
Learners may
experience some emotional distress during this stage as they realize the consequences of their
actions or become overwhelmed at the process ahead. When interpreting the review findings in the
context of the framework, it was noted that as learners’ knowledge increased, their self-assessment
became more accurate. Further, stronger learners underrated themselves, which may be related to
the fact that although they had mastered the theoretical content of a skill, they appeared to have not
mastered the practical application or recognized the level of complexity that was involved to become
competent. Examples of this stage are provided by Parish et al.62
, and Baxter and Norman52
, who
found that high performers underestimated their competence.
Stage 3 (conscious competence) - learners begin absorbing, processing and practicing the skills and
abilities they will need to advance to the unconscious competence stage.49
In the conscious
competence stage learners become proficient, effective and reliable at completing required tasks and
assignments. It is at this conscious competence stage that learners change from unskilled amateurs
to skilled professionals. Learners at this stage still need to consciously think about performing the
correct skill/ability. In three studies, the participants were at this stage of competence as
demonstrated by congruence between their self-assessments and their OSCE performance.55,57,60
Stage 4 (unconscious competence) - following best practice becomes routine and learners no longer
need to consciously think about performing the skill/ability properly.49
Learners are comfortable with
their skill level and feel confident in their abilities and this is verified by others’ responses to their work.
As healthcare learners or healthcare professionals were not followed to the point where they would
have been performing at a stage of unconscious competence, it is not possible to link the review
findings to this stage of the model.
Several studies illustrated progressions between the stages. In the study by Chen et al., second-year learners had a higher perceived self-assessment compared to the OSCEs (in other words, they were in stage 1, unconscious incompetence); however, in their third year, their self-assessment decreased while their ability on the OSCE increased (they progressed to stage 2, conscious incompetence).59 Participants in the study by Leopold et al.38 demonstrated a progression from stage 1 (unconscious incompetence) prior to an educational session to stage 3 (conscious competence) following instruction. The participants in the study by Mavis61 trended towards conscious competence, but the finding was not significant, whereas Langhan et al.'s54 and Ault et al.'s51 participants progressed from stage 1 to stage 3. Further, the findings of Turner et al.,63 Lukman et al.56 and Dornan et al.53 were unclear in relation to the model.
Table 3: Interpretation and alignment of review results according to the four stages of learning model (n=18)

| Author/year/country | Domain | Healthcare professionals/learners | Alignment with the four stages of learning model: i) unconscious incompetence; ii) conscious incompetence; iii) conscious competence; iv) unconscious competence |
|---|---|---|---|
| Luctkar-Flude et al.55 (2012) Canada | Confidence | Learners: undergraduate nursing | Conscious competence |
| Baxter & Norman52 (2011) Canada | Confidence & competence | Learners: undergraduate nursing | Unconscious incompetence; conscious incompetence |
| Berg et al.57 (2011) USA | Competence (empathy) | Learners: undergraduate medical | Conscious competence |
| Chen et al.59 (2010) USA | Competence (empathy) | Learners: medical | Progression: unconscious incompetence (learners in their 2nd year) to conscious incompetence (learners in their 3rd year) |
| Doyle et al.50 (2010) USA | Confidence (self-efficacy) | Healthcare professionals (nursing, medical) | Unconscious incompetence |
| Langhan et al.54 (2009) Canada | Competence (knowledge & performance) | Learners: medical residents | Progression from unconscious incompetence to conscious competence |
| Lukman et al.56 (2009) Malaysia | Confidence (self-efficacy) | Learners: undergraduate medical | Stage of model unclear |
| Turner et al.63 (2009) Netherlands | Confidence (self-efficacy) | Practitioners: pediatric & anaesthesia trainees & specialists | Stage of model unclear |
| Lauder et al.60 (2008) UK | Competence & confidence (self-efficacy) | Learners: pre-registration nursing & midwifery | Unconscious incompetence (competence); conscious competence (confidence) |
| Vivekenanda-Schmidt et al.64 (2007) UK | Confidence | Learners: undergraduate medical | Conscious competence |
| Parish et al.62 (2006) USA | Competence | Learners: medical residents | Conscious incompetence |
| Leopold et al.38 (2005) USA | Confidence | Practitioners: multidisciplinary | Unconscious incompetence (pre-instruction); conscious competence (post-instruction); unconscious incompetence (pre & post instruction) |
| Barnsley et al.16 (2004) Australia | Confidence | Learners: medical residents | Unconscious incompetence |
| Biernat et al.58 (2003) USA | Competence | Learners: medical residents | Unconscious incompetence |
| Dornan et al.53 (2003) UK | Competence (real patient learning) | Learners: undergraduate medical | Stage of model unclear |
| Ault et al.51 (2002) USA | Confidence (self-efficacy) | Learners: undergraduate medical | Progression from unconscious incompetence to conscious competence |
| Mavis et al.61 (2001) USA | Confidence (self-efficacy) | Learners: undergraduate medical | Trend towards conscious competence but not significant |
| Fox et al.37 (2000) UK | Competence | Learners: medical residents | Unconscious incompetence |
Limitations of the review
There are several limitations to this review. First, there is a lack of conceptual clarity in the terms competence, performance, confidence and self-efficacy. These four terms were often used interchangeably in the literature, with some studies focusing on more than one concept. Secondly, a variety of assessment tools were used to establish the level of competence; hence, it was difficult to establish a consistent measure of competence across the studies. Thirdly, not all studies identified the number of OSCE stations, and there is no consensus in the literature regarding how many stations, or what leveling of skills at those stations, is needed to ensure a comprehensive assessment and consistency of measurement. To be an effective measure of competence, the OSCE needs to be aligned with the skills, knowledge or performance being tested. These limitations highlight the need for caution when exploring and measuring the concept of competence. This review used the study authors' definitions of the term and the multiple descriptors associated with it; the results may therefore not be generalizable to other research or situations in which competence is examined in a broader or narrower manner.
Conclusion
In this review, it was found that the evaluation of competence obtained by self-assessment instruments was not comparable to an OSCE. Accurate self-assessment was shown to be threatened by overconfidence, and high performers were found to underestimate their ability. It is theorized that this is due, in part, to the learner's stage of skill acquisition according to the four stages of learning.49 It is also important to note that neither healthcare professionals nor healthcare learners were accurate when assessing their own abilities and competencies. The goal of healthcare education is to provide optimal student learning, thereby facilitating quality patient care. The findings of this review are therefore important, as the quality of care delivered to patients and families is heavily affected by the competence of healthcare professionals. Professional programs are being held increasingly accountable for the competence of the individuals they graduate. Educators therefore need to examine evaluation methods to ensure that a varied and valid approach to assessment and appraisal is offered. A standardized method of evaluation is required for both self-assessments and OSCEs, and evaluations need to be standardized across the years of a program to confirm that the degree of complexity is appropriate for learners. Notably, learners need to be taught how to perform accurate self-assessments if these evaluations are to be used within programs. The findings of this review are drawn from evidence rated at level three (3) (observational analytic designs).
Implications for practice
The following implications are based on a level of evidence rated as three (3). Preparing novice healthcare professionals to enter their chosen profession with an entry level of competence is complicated; ensuring that they are functioning at this level is even more complex. The findings of this review indicate that self-assessment is threatened by the overconfidence often seen in very novice learners. Furthermore, high performers were seen to underestimate their ability. Educators need to understand this limitation of the evaluation approach and apply varied methods of evaluation to gather a holistic picture of a learner's competence. Key messages were
the need for educators to understand the current stage of the learner with regard to skill acquisition
and the importance of teaching learners to accurately assess their own competence.
Implications for research
For knowledge in this area to advance, there is a need for standardization in how outcomes are identified and measured in the area of competence. It is important to clarify the constructs to be measured and to identify a standard tool with which to measure them. Further, the appropriate number of stations for an OSCE needs to be identified.
Conflicts of interest
The reviewers declare no conflict of interest.
Acknowledgements
The authors would like to acknowledge the support received from the staff of the Queen’s Joanna
Briggs Collaboration.
References
1 Lucian Leape Institute. Unmet needs: Teaching physicians to provide safe patient care. Boston: National Patient Safety Foundation; 2010.
2 Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA 2002 Jan 9;287(2):226-35.
3 Holland K, Middleton L, Uys L. Professional confidence: a concept analysis. Scandinavian
Journal of Occupational Therapy 2012 Mar;19(2):214-24.
4 Merriam-Webster, Incorporated. Performance. 2013 [cited 2013 Jun 9]. Available from: http://www.merriam-webster.com/dictionary/performance
5 Houghton Mifflin Company. Skill. The American Heritage Dictionary of the English Language. Fourth edition. 2009 [cited 2013 Jun 9]. Available from: http://www.thefreedictionary.com/skill
6 Bandura A. Self-efficacy. In: Ramachaudran VS, editor. Encyclopedia of human behavior
Volume 4.New York: Academic Press; 1994. p. 71-81.
7 Merriam-Webster, Incorporated. Knowledge. 2013 [cited 2013 Jun 9]. Available from: http://www.merriam-webster.com/dictionary/knowledge
8 Oxford Dictionaries. Empathy. 2013. Available from: http://www.oxforddictionaries.com/definition/english/empathy
9 Gormley G. Summative OSCEs in undergraduate medical education. Ulster Medical Journal
2011;80(3):127-32.
10 Palese A, Bulfone G, Venturato E, Urli N, Bulfone T, Zanini A, et al. The cost of the
objective structured clinical examination on an Italian nursing bachelor's degree course.
Nurse Education Today 2012;32:422-6.
11 Klenowski V. Student Self-evaluation Processes in Student-centred Teaching and Learning Contexts of Australia and England. Assessment in Education: Principles, Policy & Practice 1995;2(2):145-63.
12 Carlisle C. Reflecting on levels of confidence and competence in skills acquisition. Med
Educ 2000;34(11):886-7.
13 Ginsburg LR, Tregunno D, Norton PG. Self-reported patient safety competencies among
new graduates in medicine, nursing and pharmacy. BMJ quality and safety 2013;22(2):147-
54.
14 Regehr G, Hodges B, Tiberius R, Lofchy J. Measuring self-assessment skills: an innovative
relative ranking model. Acad Med 1996;71(10 Suppl):S52-S54.
15 Arora S, Miskovic D, Hull L, Moorthy K, Aggarwal R, Johannsson H, et al. Self vs expert
assessment of technical and non-technical skills in high fidelity simulation. Am J Surg
2011;202(4):500-6.
16 Barnsley L, Lyon PM, Ralston SJ, Hibbert EJ, Cunningham I, Gordon FC, et al. Clinical
skills in junior medical officers: a comparison of self-reported confidence and observed
competence. Med Educ 2004;38(4):358-67.
17 Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of
physician self-assessment compared with observed measures of competence: a systematic
review. JAMA 2006;296(9):1094-102.
18 Hodges B, Regehr G, Martin D. Difficulties in recognizing one's own incompetence: novice
physicians who are unskilled and unaware of it. Academic Medicine 2001
Oct;76(10:Suppl):Suppl-9.
19 Casey PM, Goepfert AR, Espey EL, Hammoud MM, Kaczmarczyk JM, Katz NT, et al. To
the point: reviews in medical education--the Objective Structured Clinical Examination. Am
J Obstet Gynecol 2009;200(1):25-34.
20 Huntley CD, Salmon P, Fisher PL, Fletcher I, Young B. LUCAS: a theoretically informed
instrument to assess clinical communication in objective structured clinical examinations.
Med Educ 2012;46(3):267-76.
21 Iramaneerat C, Myford CM, Yudkowsky R, Lowenstein T. Evaluating the effectiveness of
rating instruments for a communication skills assessment of medical residents. Adv Health
Sci Educ Theory Pract 2009;14(4):575-94.
22 Ponton-Carss A, Hutchison C, Violato C. Assessment of communication, professionalism,
and surgical skills in an objective structured performance-related examination (OSPRE): a
psychometric study. Am J Surg 2011;202(4):433-40.
23 Van Nuland M, Van den Noortgate W, van der Vleuten C, Goedhuys J. Optimizing the utility of
communication OSCEs: omit station-specific checklists and provide students with narrative
feedback. Patient Educ Couns 2012;88(1):106-12.
24 Battles JB, Wilkinson SL, Lee SJ. Using standardised patients in an objective structured
clinical examination as a patient safety tool. Qual Saf Health Care 2004;13 Suppl 1:i46-i50.
25 Varkey P, Natt N. The Objective Structured Clinical Examination as an educational tool in
patient safety. Jt Comm J Qual Patient Saf 2007;33(1):48-53.
26 Daud-Gallotti RM, Morinaga CV, Arlindo-Rodrigues M, Velasco IT, Martins MA, Tiberio IC.
A new method for the assessment of patient safety competencies during a medical school
clerkship using an objective structured clinical examination. Clinics (Sao Paulo)
2011;66(7):1209-15.
27 Pernar LI, Shaw TJ, Pozner CN, Vogelgesang KR, Lacroix SE, Gandhi TK, et al. Using an
Objective Structured Clinical Examination to test adherence to Joint Commission National
Patient Safety Goal--associated behaviors. Jt Comm J Qual Patient Saf 2012;38(9):414-8.
28 Singh R, Singh A, Fish R, McLean D, Anderson DR, Singh G. A patient safety objective
structured clinical examination. J Patient Saf 2009;5(2):55-60.
29 Varkey P, Natt N, Lesnick T, Downing S, Yudkowsky R. Validity evidence for an OSCE to
assess competency in systems-based practice and practice-based learning and
improvement: a preliminary investigation. Acad Med 2008;83(8):775-80.
30 Wagner DP, Hoppe RB, Lee CP. The patient safety OSCE for PGY-1 residents: a
centralized response to the challenge of culture change. Teach Learn Med 2009;21(1):8-14.
31 Gupta P, Varkey P. Developing a tool for assessing competency in root cause analysis.
Jt Comm J Qual Patient Saf 2009;35(1):36-42.
32 Posner G, Naik V, Bidlake E, Nakajima A, Sohmer B, Arab A, et al. Assessing residents'
disclosure of adverse events: traditional objective structured clinical examinations versus
mixed reality. J Obstet Gynaecol Can 2012;34(4):367-73.
33 Stroud L, McIlroy J, Levinson W. Skills of internal medicine residents in disclosing medical
errors: a study using standardized patients. Acad Med 2009;84(12):1803-8.
34 Walsh M, Bailey PH, Koren I. Objective structured clinical evaluation of clinical competence:
an integrative review. J Adv Nurs 2009;65(8):1584-95.
35 Tamblyn R, Abrahamowicz M, Dauphinee D, Wenghofer E, Jacques A, Klass D, et al.
Physician scores on a national clinical skills examination as predictors of complaints to
medical regulatory authorities. JAMA 2007;298(9):993-1001.
36 Howley L, Szauter K, Perkowski L, Clifton M, McNaughton N, Association of Standardized
Patient Educators (ASPE). Quality of standardised patient research reports in the medical
education literature: review and recommendations. Med Educ 2008;42(4):350-8.
37 Fox RA, Ingham Clark CL, Scotland AD, Dacre JE. A study of pre-registration house
officers' clinical skills. Med Educ 2000 Dec;34(12):1007-12.
38 Leopold SS, Morgan HD, Kadel NJ, Gardner GC, Schaad DC, Wolf FM. Impact of
educational intervention on confidence and competence in the performance of a simple
surgical task. J Bone Joint Surg Am 2005;87(5):1031-7.
39 American Association of Colleges of Nursing. Hallmarks of quality and patient safety:
recommended baccalaureate competencies and curricular guidelines to ensure high-quality
and safe patient care. J Prof Nurs 2006;22(6):329-30.
40 Cronenwett L, Sherwood G, Barnsteiner J, Disch J, Johnson J, Mitchell P, et al. Quality and
Safety Education for Nurses. Nurs Outlook 2007;55(3):122-31.
41 Department of Health. Modernising medical careers: The new curriculum for the foundation
years in postgraduate education and training. London United Kingdom: Department of
Health; 2007.
42 Frank JR, Brien S, on behalf of The Safety Competencies Steering Committee. The safety
competencies: Enhancing patient safety across the health professions. Ottawa, Ontario:
Canadian Patient Safety Institute; 2013.
43 Seiden SC, Galvan C, Lamm R. Role of medical students in preventing patient harm and
enhancing patient safety. Qual Saf Health Care 2006 Aug;15(4):272-6.
44 van der Vleuten C. Validity of final examinations in undergraduate medical training. BMJ
2000;321(7270):1217-9.
45 Dong T, Saguil A, Artino AR Jr, Gilliland WR, Waechter DM, Lopreiato J, et al.
Relationship between OSCE scores and other typical medical school performance
indicators: a 5-year cohort study. Mil Med 2012;177(9 Suppl):44-6.
46 Probert CS, Cahill DJ, McCann GL, Ben-Shlomo Y. Traditional finals and OSCEs in
predicting consultant and self-reported clinical skills of PRHOs: a pilot study. Med Educ
2003;37(7):597-602.
47 Sears K, Godfrey CM, Luctkar Flude M, Ginsburg L, Tregunno D, Ross-White A. Measuring
competence in healthcare learners and healthcare professionals by comparing self-
assessment with objective structured clinical examinations (OSCEs): a systematic review
protocol. JBI Database of Systematic Reviews & Implementation Reports 2014;12(3):24-38.
48 Munro BH. Statistical methods for health care research. 5th Edition ed. Philadelphia:
Lippincott Williams & Wilkins; 2005.
49 Burch N. Four stages of learning. Gordon Training International; 1970. Available from:
http://www.gordontraining.com/free-workplace-articles/learning-a-new-skill-is-easier-said-than-done/
50 Doyle D, Copeland HL, Bush D, Stein L, Thompson S. A course for nurses to handle
difficult communication situations. A randomized controlled trial of impact on self-efficacy
and performance. Patient Educ Couns 2011;82(1):100-9.
51 Ault GT, Sullivan M, Chalabian J, Skinner KA. A focused breast skills workshop improves
the clinical skills of medical students. J Surg Res 2002;106(2):303-7.
52 Baxter P, Norman G. Self-assessment or self deception? A lack of association between
nursing students' self-assessment and performance. J Adv Nurs 2011;67(11):2406-13.
53 Dornan T, Scherpbier A, Boshuizen H. Towards valid measures of self-directed clinical
learning. Med Educ 2003;37(11):983-91.
54 Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training
in critical resuscitation procedures improves residents' competence. CJEM 2009;11(6):535-9.
55 Luctkar-Flude M, Pulling C, Larocque M. Ending infusion confusion: evaluating a virtual
intravenous pump educational module. Clinical Simulation in Nursing 2012;8(2):e39-e48.
56 Lukman H, Beevi Z, Yeap R. Training future doctors to be patient-centred: efficacy of a
communication skills training (CST) programme in a Malaysian medical institution. Med J
Malaysia 2009;64(1):51-5.
57 Berg K, Majdan JF, Berg D, Veloski J, Hojat M. A comparison of medical students' self-
reported empathy with simulated patients' assessments of the students' empathy. Med
Teach 2011 May;33(5):388-91.
58 Biernat K, Simpson D, Duthie E Jr, Bragg D, London R. Primary care residents self-assessment
skills in dementia. Adv Health Sci Educ Theory Pract 2003;8(2):105-10.
59 Chen DC, Pahilan ME, Orlander JD. Comparing a self-administered measure of empathy
with observed behavior among medical students. J Gen Intern Med 2010;25(3):200-2.
60 Lauder W, Holland K, Roxburgh M, Topping K, Watson R, Johnson M, et al. Measuring
competence, self-reported competence and self-efficacy in pre-registration students. Nurs
Stand 2008;22(20):35-43.
61 Mavis B. Self-efficacy and OSCE performance among second year medical students. Adv
Health Sci Educ Theory Pract 2001;6(2):93-102.
62 Parish SJ, Ramaswamy M, Stein MR, Kachur EK, Arnsten JH. Teaching about Substance
Abuse with Objective Structured Clinical Exams. J Gen Intern Med 2006;21(5):453-9.
63 Turner NM, Lukkassen I, Bakker N, Draaisma J, ten Cate OT. The effect of the APLS-
course on self-efficacy and its relationship to behavioural decisions in paediatric
resuscitation. Resuscitation 2009;80(8):913-8.
64 Vivekananda-Schmidt P, Lewis M, Hassell AB, Coady D, Walker D, Kay L, et al. Validation
of MSAT: an instrument to measure medical students' self-assessed confidence in
musculoskeletal examination skills. Med Educ 2007;41(4):402-10.
Appendix I: Medline search strategy
Database: Ovid MEDLINE(R) without Revisions <1996 to Present with Daily Update>
1 osce.mp. (813)
2 Objectiv$ structur$ clinic$ exam$.mp. [mp=title, abstract, original title, name of substance word,
subject heading word, keyword heading word, protocol supplementary concept, rare disease
supplementary concept, unique identifier] (837)
3 1 or 2 (1046)
4 Medical Errors/(11311)
5 Diagnostic Errors/(13804)
6 Medication Errors/(7033)
7 Accidental Falls/(11859)
8 Safety Management/(15117)
9 Safety/(23717)
10 Risk Management/(10836)
11 Truth Disclosure/(7668)
12 Accidents/(3736)
13 (adverse adj3 event$).ab,ti. (63913)
14 (health care adj3 error$).ab,ti. (198)
15 (medic$ adj3 error$).ab,ti. (5291)
16 (nurs$ adj3 error$).ab,ti. (307)
17 (patient$ adj3 safe$).ab,ti. (21061)
18 (physician$ adj3 error$).ab,ti. (236)
19 (safety adj3 climate$).ab,ti. (308)
20 (safety adj3 culture$).ab,ti. (974)
21 (human adj3 error$).ab,ti. (1197)
22 "root cause analysis".ab,ti. (320)
23 "near miss".ab,ti. (457)
24 Patient Safety/(2045)
25 4 or 5 or 6 or 7 or 8 or 9 or 10 or 11 or 12 or 13 or 14 or 15 or 16 or 17 or 18 or 19 or 20 or 21 or
22 or 23 or 24 (170951)
26 3 and 25 (42)
Appendix II: Appraisal instruments
MAStARI appraisal instrument
Appendix III: Data extraction instruments
MAStARI data extraction instrument
Appendix IV: Data extraction form
Author: author name; publication date; country of lead author; full reference details
Demographics: sample size; sample detail; profession; year of study; gender
Study details: research method; research design; study purpose
Self-assessment instrument: validated instrument; researcher developed & piloted; number of questions
OSCE: number of stations; available OSCE details
Domains: confidence; competence; self-efficacy; empathy; knowledge; clinical skills/performance; other
Appendix V: List of included studies
(1) Ault GT, Sullivan M, Chalabian J, Skinner KA. A focused breast skills workshop improves the clinical
skills of medical students. J Surg Res 2002;106(2):303-7.
(2) Barnsley L, Lyon PM, Ralston SJ, Hibbert EJ, Cunningham I, Gordon FC, et al. Clinical skills in junior
medical officers: a comparison of self-reported confidence and observed competence. Med Educ
2004;38(4):358-67.
(3) Baxter P, Norman G. Self-assessment or self deception? A lack of association between nursing
students' self-assessment and performance. J Adv Nurs 2011;67(11):2406-13.
(4) Berg K, Majdan JF, Berg D, Veloski J, Hojat M. A comparison of medical students' self-reported
empathy with simulated patients' assessments of the students' empathy. Med Teach 2011 May;33(5):388-
91.
(5) Biernat K, Simpson D, Duthie E Jr, Bragg D, London R. Primary care residents self-assessment skills
in dementia. Adv Health Sci Educ Theory Pract 2003;8(2):105-10.
(6) Chen DC, Pahilan ME, Orlander JD. Comparing a self-administered measure of empathy with
observed behavior among medical students. J Gen Intern Med 2010;25(3):200-2.
(7) Dornan T, Scherpbier A, Boshuizen H. Towards valid measures of self-directed clinical learning. Med
Educ 2003;37(11):983-91.
(8) Doyle D, Copeland HL, Bush D, Stein L, Thompson S. A course for nurses to handle difficult
communication situations. A randomized controlled trial of impact on self-efficacy and performance. Patient
Educ Couns 2011;82(1):100-9.
(9) Fox RA, Ingham Clark CL, Scotland AD, Dacre JE. A study of pre-registration house officers' clinical
skills. Med Educ 2000 Dec;34(12):1007-12.
(10) Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical
resuscitation procedures improves residents' competence. CJEM 2009;11(6):535-9.
(11) Lauder W, Holland K, Roxburgh M, Topping K, Watson R, Johnson M, et al. Measuring competence,
self-reported competence and self-efficacy in pre-registration students. Nurs Stand 2008;22(20):35-43.
(12) Leopold SS, Morgan HD, Kadel NJ, Gardner GC, Schaad DC, Wolf FM. Impact of educational
intervention on confidence and competence in the performance of a simple surgical task.
J Bone Joint Surg Am 2005;87(5):1031-7.
(13) Luctkar-Flude M, Pulling C, Larocque M. Ending infusion confusion: Evaluating a virtual intravenous
pump educational module. Clinical Simulation in Nursing 2012 May;8(2):e39-e48.
(14) Lukman H, Beevi Z, Yeap R. Training future doctors to be patient-centred: efficacy of a
communication skills training (CST) programme in a Malaysian medical institution. Med J Malaysia
2009;64(1):51-5.
(15) Mavis B. Self-efficacy and OSCE performance among second year medical students. Adv Health Sci
Educ Theory Pract 2001;6(2):93-102.
(16) Parish SJ, Ramaswamy M, Stein MR, Kachur EK, Arnsten JH. Teaching about Substance Abuse with
Objective Structured Clinical Exams. J Gen Intern Med 2006;21(5):453-9.
(17) Turner NM, Lukkassen I, Bakker N, Draaisma J, ten Cate OT. The effect of the APLS-course on self-
efficacy and its relationship to behavioural decisions in paediatric resuscitation. Resuscitation
2009;80(8):913-8.
(18) Vivekananda-Schmidt P, Lewis M, Hassell AB, Coady D, Walker D, Kay L, et al. Validation of MSAT:
an instrument to measure medical students' self-assessed confidence in musculoskeletal examination
skills. Med Educ 2007;41(4):402-10.
Appendix VI: Characteristics of included studies
Luctkar-Flude et al.55 (2012), Canada
Demographics: N=17 experimental; N=26 control. Undergraduate Y3 nursing learners.
Study details: Quasi-experimental, non-equivalent control group. Purpose: to develop an online virtual intravenous pump educational module for undergraduate nursing learners & to evaluate the module in terms of student confidence, performance & satisfaction.
Self-assessment instrument: Self-confidence: 10 items, 5-point scale. Internal reliability α=0.83. Validity: peer review by lab & clinical instructors.
OSCE: Intravenous (IV) pump skills; 1 station with 3 tasks; trained raters.
Domains: Confidence.
Results: Low correlation between self-confidence & performance: significant correlations with secondary medication scores (Pearson correlation: r=.309, p=.044) & total performance scores (r=.368, p=.015).
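Several results in this appendix report Pearson product-moment correlations between self-assessment and OSCE scores. As a minimal illustrative sketch (not part of the review; the data below are invented), the coefficient can be computed as:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: self-confidence ratings vs OSCE performance scores
confidence = [3.2, 4.1, 2.8, 3.9, 4.5, 3.0]
osce_score = [62, 70, 58, 75, 80, 60]
r = pearson_r(confidence, osce_score)
```

The studies in this table pair such an r value with a p value from a significance test of the null hypothesis r=0, which libraries such as scipy compute alongside the coefficient.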
Baxter & Norman52 (2011), Canada
Demographics: N=27. Undergraduate Y4 nursing learners, aged 21-23 years, no prior experience with high-fidelity simulation.
Study details: Quasi-experimental one-group pre-post study. Purpose: to examine senior-year nursing learners' ability to self-assess their performance when responding to simulated emergency situations.
Self-assessment instrument: 7 items, 7-point scale. Internal reliability: pre-test α=0.70, post-test α=0.81. Content & face validity: tool pre-tested by 8 nursing faculty & 2 Y4 nursing learners.
OSCE: Response to emergency situations; 3 stations. IRR=0.67; correlation across stations=0.48; internal consistency=0.98; overall test reliability=0.73.
Domains: Confidence, competence & abilities.
Results: Negative correlation: all but 1 of 16 correlations were negative (inversely related to actual performance). Only 2 of the self-assessment items were significantly correlated with OSCE total scores: post-test level of confidence dealing with acute care situations (Pearson correlation: r=-0.405, p<0.01); ability to manage a crisis situation (r=-0.430, p<0.01).
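Internal reliability throughout this appendix is reported as Cronbach's α (e.g. pre-test α=0.70 above), the ratio of summed item variance to total-score variance scaled by the item count. A minimal sketch of the computation, using invented item scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists
    (one list per item, same respondents in the same order)."""
    k = len(items)
    n = len(items[0])

    def pvar(xs):
        # Population variance of a sample
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var = sum(pvar(item) for item in items)
    return k / (k - 1) * (1 - item_var / pvar(totals))

# Hypothetical 3-item, 4-respondent self-assessment scale
scores = [
    [4, 3, 5, 2],
    [4, 2, 5, 3],
    [3, 3, 4, 2],
]
alpha = cronbach_alpha(scores)  # ≈ 0.9 for this invented data
```

Values near 1 indicate that the items move together (high internal consistency), which is why instruments in this review report α alongside validity evidence.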
Berg et al.57 (2011), U.S.A.
Demographics: N=248. Undergraduate Y3 medical learners (125 males, 128 females).
Study details: Descriptive cross-sectional. Purpose: to examine the association between standardized patients' assessment of medical learners' empathy & the learners' self-reported empathy.
Self-assessment instrument: Jefferson Scale of Physician Empathy (JSPE), 20 items, 7-point scale; construct, criterion & predictive validity; internal & test-retest reliability have been established. Jefferson Scale of Patient Perceptions of Physician Empathy (JSPPPE), 5 items, 7-point scale; discussed correlations between ratings on this scale & observations. Global rating of empathy by standardized patients, 5-point scale.
OSCE: Empathy; 10 stations.
Domains: Competence (empathy).
Results: Negligible correlation between self-reported empathy & assessments by standardized patients (Pearson correlation: r=0.19, p<0.05).
Chen et al.59 (2010), U.S.A.
Demographics: N=163 Y2 progressed to Y3; N=159 class of Y3. Undergraduate medical learners. Compared ratings of Y2 & Y3 related to empathy.
Study details: Descriptive cross-sectional. Purpose: to explore a self-reported measure & observed empathy in medical learners.
Self-assessment instrument: Jefferson Scale of Physician Empathy - Student version (JSPE-S), 20 items, 7-point scale; higher values = higher empathy. Ratings were compared to 6 OSCE stations.
OSCE: Clinical skills; 3 (Y2) or 6 (Y3) stations.
Domains: Competence (empathy).
Results: Negligible overall correlation between self-reported empathy (JSPE-S) scores and OSCE empathy (r=0.22; p<0.001). Learners in their 2nd year had a higher perceived self-assessment compared to their OSCE scores; by their 3rd year their self-assessment was lower than their OSCE scores.
Doyle et al.50 (2010), U.S.A.
Demographics: N=21 experimental; N=20 control. Healthcare professionals: nurses (RNs, LPNs & certified medical assistants) from 2 primary care settings, 3 nursing homes, 1 hospital & 1 private practice. Work experience: 3.8 yrs (experimental), 3.5 yrs (control).
Study details: Randomized controlled trial; randomized by work team, not individually. Purpose: to evaluate the impact of a course on self-efficacy & performance of participating nurses.
Self-assessment instrument: Self-efficacy scale with 2 sub-scales, 18 items. Internal reliability: confidence α=0.88; extent of difficulty α=0.90.
OSCE: Communication skills; 5 stations; 18-item performance rating scale; IRR=0.70; agreement with the expert=0.4-0.5.
Domains: Confidence (self-efficacy).
Results: High magnitude of self-efficacy improvement did not correlate with improved performance: on self-assessment there was significant improvement for self-efficacy (ANCOVA: F=24.43, p<0.001); on performance there was no significant improvement between intervention and control groups (F=3.46, p=0.073).
Langhan et al.54 (2009), Canada
Demographics: N=28 PGY1-PGY3 residents (mean age 30.2 years; 54% male; 36% Y1, 32% Y2, 32% Y3; 43% family medicine, 14% internal medicine, 43% emergency medicine).
Study details: Quasi-experimental, longitudinal. Purpose: to assess the impact of a simulation-based procedural skills training course on residents' competence in the performance of critical resuscitation procedures.
Self-assessment instrument: 25 items, 5-point scale. Internal reliability α=0.94.
OSCE: Resuscitation procedures; 1 station; 16-item checklist (α=0.79) & 5-point global rating scale (r=0.78, p<0.001).
Domains: Competence (knowledge & performance).
Results: Low to moderate correlations between expert assessment total rating scores and residents' self-assessed knowledge (Pearson correlation: r=0.40, p<0.05) & clinical skills scores (r=0.51, p<0.01).
Lukman et al.56 (2009), Malaysia
Demographics: N=185 Y1 undergraduate medical learners; mean age 19.56; 54% female, 45.5% male; 66.1% Chinese.
Study details: Quasi-experimental, longitudinal. Purpose: to evaluate the efficacy of the preclinical communication skills training (CST) program at the International Medical University in Malaysia. Efficacy indicators include learners' (1) perceived competency, (2) attitude, (3) conceptual knowledge, & (4) performance with regard to patient-centred communication.
Self-assessment instrument: Interpersonal Communication Inventory (ICI), 40 items.
OSCE: Patient-centred communication; 1 station.
Domains: Confidence (self-efficacy).
Results: No correlation between self-efficacy & performance (actual correlational data not provided).
Turner et al.63 (2009), Netherlands
Demographics: N=55. Healthcare learners & professionals: pediatric & anaesthesia trainees & specialists.
Study details: Descriptive cross-sectional. Purpose: to investigate the relationship between self-efficacy & measured performance during a simulated resuscitation, & the effect of death of a simulated patient on self-efficacy.
Self-assessment instrument: Self-efficacy rating, 7 items; validated 100 mm VAS.
OSCE: Pediatric resuscitation skills; sim test & OSCE with 2 tasks; ICCs 0.561-0.910, p<0.001.
Domains: Confidence (self-efficacy).
Results: Significant correlation between self-reported self-efficacy and only 2 of 7 OSCE performance scores: global resuscitation score (Spearman correlation: r=0.467, p=0.002) and time to intention to intubate (r=-0.642; p<0.001). No differentiation made between healthcare learners & professionals.
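Some studies in this table (e.g. Turner et al., Lauder et al.) report Spearman rather than Pearson correlations. Spearman's ρ is a Pearson correlation computed on the ranks of the scores, which suits ordinal self-efficacy ratings. A minimal sketch with invented data, assuming no tied ranks:

```python
def spearman_rho(x, y):
    """Spearman rank correlation for samples with no tied values."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    # With untied ranks, Spearman's rho has the closed form
    # 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical: any monotonically increasing relationship gives rho = 1
rho = spearman_rho([1.2, 2.5, 3.1, 4.8], [10, 40, 90, 160])
```

Because only rank order matters, ρ is insensitive to the nonlinear spacing of rating-scale points that would distort a Pearson r.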
Lauder et al.60 (2008), U.K.
Demographics: N=99 pre-registration healthcare learners: nursing & midwifery.
Study details: Descriptive. Purpose: to measure competence, self-reported competence & self-efficacy, & explore the relationship between competence, self-reported competence, support & self-efficacy.
Self-assessment instrument: Self-reported competence: SNCQ (Watson et al, 2002), 18 items, 4-point scale. Self-efficacy: GPSE (Schwarzer & Born, 1997), 10 items, 4-point scale. Reliable with good convergent & discriminatory validity.
OSCE: Nursing competencies; 3 stations: drug calculations, 24 items (Wright, 2004); hand decontamination, 10 items (Major, 2005); communication skills, 11 items (Arthur, 1999).
Domains: Competence & confidence (self-efficacy).
Results: Negligible significant association between confidence (self-efficacy) and the drug calculation OSCE (Spearman correlation: r=0.239, p=0.028); no significant associations between self-reported competence & any of the OSCE scores.
Vivekananda-Schmidt et al.64 (2007), U.K.
Demographics: N=345 Y3 undergraduate medical learners; 66% female; aged 19-25.
Study details: Descriptive, longitudinal. Purpose: to validate the 15-item musculoskeletal self-assessment tool (MSAT), addressing its construct validity, internal consistency, responsiveness, repeatability & relationship with competence.
Self-assessment instrument: MSAT, 15 items, 11-point scale. Internal reliability α=0.31-0.94; α=0.89 across shoulder items, α=0.88 across knee items.
OSCE: Musculoskeletal examination skills; 2 or 1 stations; previously shown to be reliable; α=0.56 across shoulder items, α=0.31 across knee items; ICC=0.97.
Domains: Confidence.
Results: Correlations between self-assessment & OSCE scores were non-significant: total scores (Pearson correlation: r=0.13, p=0.11), shoulder examination scores (r=0.16, p=0.15), knee examination scores (r=0.13, p=0.21); however, significant but weak correlations were noted for 2 measures in female learners only: total musculoskeletal examination scores (r=0.22, p=0.04) & shoulder examination scores (r=0.29, p=0.03).
Parish et al.62 (2006), U.S.A.
Demographics: N=131 PGY3 residents, medical learners: internal medicine & family medicine.
Study details: Descriptive. Purpose: to determine associations between performance & self-assessed interest & competence in substance abuse, & assessed learning during the OSCE.
Self-assessment instrument: 4-point scale. Faculty & standardized patients (SPs) assessed residents' general communication, assessment, management & global skills using 4-point scales. Residents completed a pre-OSCE survey of experience, interest & competence in substance abuse, & a post-OSCE survey evaluating its educational value. Learning during the OSCE was also assessed by measuring performance improvement from the first to the final OSCE station.
OSCE: Substance abuse management skills; 5-station OSCE, including different substance abuse disorders & readiness-to-change stages, administered during postgraduate year-3 ambulatory rotations for 2 years. Interstation reliability was moderate (α=0.64).
Domains: Competence.
Results: No correlations between summary OSCE scores & either interest or competence (actual correlational data not provided). High performers underestimated their competence. Residents' self-assessed competence was not associated with OSCE performance.
Author/date/country: Leopold et al.38 (2005), U.S.A.
Demographics: N=93 healthcare professionals; multidisciplinary practitioners: 43 medical doctors, 3 osteopathic doctors, 35 registered NPs, & 12 physician's assistants
Study details: Descriptive correlational. Purpose: To determine the relationship between self-reported confidence & observed competence.
Self-assessment instrument: Confidence: 10-point scale; Performance: 10-point scale
OSCE: 1 station: injecting a knee joint. Trained preceptors used a 10-point scale to grade performance.
Domains: Confidence & Competence (Performance)
Results: Negative correlation. Before instruction, participants' confidence was significantly but inversely related to competent performance (r=-0.253, p=0.02); that is, greater confidence correlated with poorer performance. Both men & physicians displayed higher pre-instruction confidence (p<0.01) that was not correlated with better performance. After instruction, confidence correlated with objective competence in all groups (r=0.24, p=0.04); however, this correlation was weaker than the correlation between participants' confidence & their self-assessment of performance (r=0.72, p=0.001). Men & physicians disproportionately overestimated their skills both before & after training, a finding that worsened as confidence increased.
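Several entries in this table report Pearson correlation coefficients (r) between confidence and performance scores, such as the inverse pre-instruction relationship above. As an illustration only (the ratings below are hypothetical, not study data), a minimal Python sketch of how such a coefficient is computed:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 10-point ratings: confidence running opposite to observed
# performance yields a negative r, the pattern reported before instruction.
confidence = [9, 8, 8, 7, 6, 5, 4, 3]
performance = [2, 3, 4, 4, 6, 6, 8, 9]
r = pearson_r(confidence, performance)
```

With these made-up ratings the coefficient is strongly negative; the study's reported r=-0.253 reflects a much weaker (though still significant) inverse relationship.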
Author/date/country: Barnsley et al.16 (2004), Australia
Demographics: N=30 PGY1 residents; medical learners
Study details: Descriptive cross-sectional. Purpose: To determine the relationship between self-reported confidence & observed competence for a number of routine procedural clinical skills.
Self-assessment instrument: Self-reported confidence for procedures on a 4-point Likert rating scale
OSCE: 7-part observed structured clinical examination; pilot testing of criteria checklist; observers blinded. Acceptable interrater reliability.
Domains: Confidence
Results: No correlation. Application of the Wilcoxon signed-rank test confirmed a true difference between self-reported confidence scores and assessed competence scores for 7 procedural skills: venipuncture (Z=14.556, p=0.001), IV cannulation (Z=-3.598, p=0.001), CPR (Z=-4.371, p=0.001), ECG (Z=-3.306, p=0.001), catheterization (Z=-4.769, p=0.001), NG tube insertion (Z=2.737, p=0.01), & blood cultures (Z=-3.974, p=0.001). Thus there was no association between any of the self-assessments and actual performance. For all skills, OSCE performances were lower than self-reported confidence ratings.
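The Wilcoxon signed-rank test used in the entry above compares paired self-ratings and OSCE scores without assuming normality. A minimal Python sketch, using hypothetical paired ratings rather than the study's data, of how the statistic and its normal-approximation z score are obtained:

```python
import math

def wilcoxon_signed_rank(xs, ys):
    """Wilcoxon signed-rank test for paired samples.
    Returns (W, z): the smaller signed-rank sum and a normal-approximation
    z score (the approximation is reasonable for roughly n >= 20)."""
    diffs = [x - y for x, y in zip(xs, ys) if x != y]  # drop zero differences
    n = len(diffs)
    # Rank the absolute differences, averaging ranks across ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_pos = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_neg = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_pos, w_neg)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return w, (w - mu) / sigma

# Hypothetical 4-point confidence ratings consistently above paired
# competence ratings, mimicking the direction of the study's finding.
conf = [4, 4, 3, 4, 3, 4, 4, 3, 4, 4]
comp = [2, 3, 2, 2, 3, 1, 3, 2, 2, 3]
w, z = wilcoxon_signed_rank(conf, comp)
```

When confidence exceeds competence in every non-tied pair, W is 0 and z is strongly negative, i.e. a systematic difference between the paired scores.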
Author/date/country: Biernat et al.58 (2003), U.S.A.
Demographics: N=12 residents; medical learners (3 Y1, 4 Y2, 5 Y3; 6 male, 6 female)
Study details: Descriptive cross-sectional. Purpose: To assess residents' ability to accurately self-assess their competencies related to a commonly presenting problem in geriatrics.
Self-assessment instrument: 4 categories (communication, history of present illness, social history, functional assessment); 17-item behavioural checklist developed by an expert panel of geriatric health professionals & pilot tested with a group of Y2 medical learners
OSCE: 1 standardized-patient station rated by 2 faculty assessors. Interrater reliability: Cohen's kappa 0.62-0.75.
Domains: Competence
Results: Mixed results. Self-assessment and rater assessments of competence were congruent for 6 out of 7 categories (communication, history of present illness, past medical history, social history, mini-mental status and geriatric depression scale) and were significantly different for the 7th, functional assessment (p<0.01), in which residents rated themselves higher than the assessors. However, when self-ratings were compared to assessors' individual item ratings within the categories, large discrepancies were revealed. There was a negative association between the scores on the self-assessment & the results from the OSCE.
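Interrater reliability in the entry above is reported as Cohen's kappa, which corrects the raw agreement between two assessors for the agreement expected by chance. A minimal Python sketch with hypothetical pass/fail ratings (not the study's data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels on the same cases."""
    n = len(a)
    cats = set(a) | set(b)
    p_obs = sum(x == y for x, y in zip(a, b)) / n            # observed agreement
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical pass/fail judgements by two assessors on 10 encounters.
rater1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
rater2 = ["pass", "pass", "fail", "pass", "pass", "pass", "pass", "fail", "fail", "pass"]
kappa = cohens_kappa(rater1, rater2)
```

Here 8/10 raw agreement shrinks to a kappa of about 0.52 once chance agreement is removed; the 0.62-0.75 reported above therefore indicates substantial assessor agreement.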
Author/date/country: Dornan et al.53 (2003), U.K.
Demographics: N=33 experimental; N=31 historical control; N=33 contemporary control. Undergraduate medical learners.
Study details: Quasi-experimental, after-only. Purpose: To compare the validity of different measures of self-directed clinical learning.
Self-assessment instrument: 51 items; 7-point scale; internal reliability α=0.90 & face validity. The measures were: (1) a 23-item quantitative instrument measuring satisfaction with the learning process & environment; (2) free-text responses to 2 open questions about the quality of learners' learning experiences; (3) a quantitative self-report measure of real patient learning; & (4) objective structured clinical examination (OSCE) & progress test results.
OSCE: Clinical skills; 14 stations
Domains: Competence (Real Patient Learning)
Results: Assessment results did not correlate with real patient learning (actual correlational data not provided).
Author/date/country: Ault et al.51 (2002), U.S.A.
Demographics: N=67 experimental; N=57 control. Undergraduate medical learners.
Study details: Experimental; randomized prospective study. Purpose: To determine the effectiveness of a breast skills workshop in teaching medical learners physical examination & diagnostic skills for breast disorders in comparison to a traditional ambulatory setting.
Self-assessment instrument: Self-efficacy rating scale; 5 items; 5-point scale; internal reliability α=0.797
OSCE: Cognitive knowledge: MCQ exam, 6 items on breast disease, internal reliability α=0.732. Breast exam skills: 2 stations (1 SP & 1 paper case), α=0.707.
Domains: Confidence (Self-efficacy)
Results: Limited association between self-report examination & OSCE. Students who participated in the breast exam workshop reported higher self-efficacy related to their breast exam skills (t=10.72, p<0.05) and performed significantly higher in clinical exam skills (t=-2.99, p<0.05) than students who did not attend the workshop. (Actual correlational data not provided.)
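The internal reliability values (α) reported throughout these entries are Cronbach's alpha, computed from the item variances and the variance of the total scores. A minimal Python sketch on a hypothetical 5-item scale (values illustrative only, not taken from any included study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score lists (one list per item,
    scores aligned by respondent). Uses sample variance (n-1 denominator)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    item_var = sum(var(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return k / (k - 1) * (1 - item_var / var(totals))

# Hypothetical 5-item, 5-point self-efficacy scale answered by 6 respondents;
# items that rise and fall together across respondents give a high alpha.
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [3, 5, 2, 4, 3, 5],
    [4, 4, 3, 4, 2, 5],
    [5, 5, 3, 4, 2, 4],
]
alpha = cronbach_alpha(items)
```

Values near the 0.797 and 0.96 reported in these entries indicate that the scale items are measuring a single underlying construct consistently.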
Author/date/country: Mavis et al.61 (2001), U.S.A.
Demographics: N=113 undergraduate Y2 medical learners (mean age 25.9 years; 44% female)
Study details: Descriptive cross-sectional. Purpose: To examine the self-efficacy of second-year medical learners regarding their OSCE performance, focusing on establishing the relationship between self-efficacy beliefs & clinical performance.
Self-assessment instrument: Self-efficacy scale; 31 items; 6-point scale; internal reliability α=0.96
OSCE: Clinical skills; 11 stations
Domains: Confidence (Self-efficacy)
Results: Learners with high self-efficacy were more likely to score above the mean OSCE performance than learners with low self-ratings (71% versus 51%); however, self-efficacy was not significantly correlated with OSCE performance (r=0.12, p>0.05).
Author/date/country: Fox et al.37 (2000), U.K.
Demographics: N=22 residents; medical learners
Study details: Descriptive correlational. Purpose: To determine whether PRHOs have deficiencies in basic clinical skills & to determine whether the PRHOs or their consultants are aware of them.
Self-assessment instrument: Competence: 5-point Likert scale
OSCE: Clinical skills: 17-station OSCE including 2 communication stations. Rating by faculty supervisor of resident performance on pretested & validated OSCE stations (scored by a trained observer). Reliability α=0.73.
Domains: Competence
Results: There were no significant correlations between skills performed on OSCE stations & participants' self-ratings. (Actual correlational data not provided.)