Assessment: the gap between theory and practice
We have, in fact, two kinds of morality side by side: one which we preach but do not practise, and another which we practise but seldom preach.

Bertrand Russell, Sceptical Essays (1928)
Assessment is one of the major themes running through this issue of the journal but, according to Prystowsky and Bordage from Chicago, this shouldn't surprise any of us (see pages 331–336). In a content analysis of three major medical education journals between 1996 and 1998, they found that papers relating to assessment formed by far the commonest type of report. In fact, for this journal at least, we receive enough manuscripts describing aspects of assessment practice each year to launch a new journal devoted solely to its study. One of the reasons we haven't done this is our belief that such a step would alienate technical and academic thinking about assessment from its practical application.
A recent survey of assessment practices in UK medical schools was critical of some for their failure to use assessment blueprints or test matrices in the planning of their examinations.1 Such matrices enable assessors to match course objectives more closely to the items and processes used in assessments – a vital step in ensuring the validity of an examination (a point emphasized by Dauphinee and Blackmore in their Commentary on page 317). In his helpful Short Report, John Hall from Perth, Western Australia, describes how the development of detailed matrices aided the construction of a new integrated undergraduate curriculum (see pages 345–347). The matrices encouraged teachers to reconsider wordy, untestable objectives, to weigh the relative values of differing aspects of the course and, as a result, to achieve a better overall balance between curriculum themes and specific learning objectives. Two other important points can be found in the report: the analysis process stimulated by the matrix exercise led to the acceptance of portfolios as an assessment tool, and initial uncertainty about the complexity of the approach led to face-to-face discussions. The power of talk is an often-underused curriculum development tool, even in successful schools. This paper should be compulsory reading for members of examination boards in medical schools – it is commendably short but its message is strong: good assessments measure what learners know and do, not what was taught or even what was intended.
In the same vein, and echoing other findings in the UK medical school assessment report, Manogue and his colleagues (see pages 364–370) investigated the espoused values of dental teachers for a variety of assessment techniques and then compared these values with actual practice. The authors write that the 'values [of the dental teachers] were in line with evidence-based good practice, but their practices are not in line with their values'. Objective, structured testing, more feedback, and the use of self-assessment and portfolios figured highly on the wish list but did not appear often in practice (the most common methods being based on implicit judgements and checklist schedules).
It appears that assessment is an example of a subject with two camps: one full of well-meaning, earnest teachers and researchers immersed in the language and culture of assessment practice (validity, generalizability and psychometrics are examples of the words they commonly use); the other full of well-meaning, earnest teachers facing the day-to-day practical problems of running assessments, in full awareness of what should be done but only too well aware of what can be achieved in their circumstances. Clearly, there is a role here for the General Medical and Dental Councils in helping to set requirements for modern assessment practice, but perhaps an even more fundamental step is providing training in assessment methodology for all teachers in medical and dental schools. One of the incidental findings of the Manogue study was the recognition by teachers of the need for such training, but this was confounded by a lack of suitable training opportunities – a situation mirrored in medical education. We would be interested to hear of experiences with faculty development activities relating to assessment, and will gladly publish notices of courses and training events related to this important set of skills.
John Bligh
Peninsula Medical School
Plymouth, UK
Reference

1 Fowell SF, Maudsley G, Maguire P, Leinster S, Bligh J. Student assessment in undergraduate medical education in the United Kingdom 1998. Med Educ 2000;34 (Suppl. 1):1–80.
Editorials

312 © Blackwell Science Ltd MEDICAL EDUCATION 2001;35:312