This article was downloaded by: [The UC Irvine Libraries] on 17 December 2014, at 23:55.
Publisher: Routledge. Informa Ltd, registered in England and Wales, registered number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

New Review of Information and Library Research
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/rilr20

Bridging the Research-Practice Gap? The Role of Evidence Based Librarianship
Andrew Booth, Senior Lecturer in Evidence Based Healthcare Information, School of Health and Related Research, University of Sheffield, Regent Court, 30 Regent Street, S1 4DA, Sheffield. Published online: 12 Jul 2010.

To cite this article: Andrew Booth (2003) Bridging the Research-Practice Gap? The Role of Evidence Based Librarianship, New Review of Information and Library Research, 9:1, 3-23, DOI: 10.1080/13614550410001687909

To link to this article: http://dx.doi.org/10.1080/13614550410001687909
Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions
Bridging the Research-Practice Gap? The Role of Evidence Based Librarianship

Andrew Booth
Senior Lecturer in Evidence Based Healthcare Information, School of Health and Related Research, University of Sheffield, Regent Court, 30 Regent Street, Sheffield, S1 4DA, UK
e-mail: [email protected]
Librarianship has had a long preoccupation with the research-practice gap. Practitioner-led research is criticised for its lack of rigour, academic research for its lack of relevance. Evidence based practice is a pragmatic approach to bridging this gap. This review starts by charting the development of evidence based practice from its origins in medicine through healthcare to other disciplines. It then examines the context for the development of evidence based librarianship, focusing on examples from the wider library literature and on the health information literature from 2002 onwards.

The review examines each stage of the evidence based practice process, examining the legacy from the wider paradigm as it specifically relates to information practice. Tools and methods developed within evidence based information practice are briefly summarised. The review concludes by outlining the challenges that remain if evidence based information practice is to be adopted within the profession at large.
1. INTRODUCTION
Librarianship shares with social work (1) and education (2), among other professions, a longstanding preoccupation with the research-practice gap (3, 4). Published research typically originates from academic institutions and is criticised for a perceived lack of relevance to day-to-day practice (5). Turner (6) examined information professionals' perceptions concerning their use of research, why they consult research, how often they do so and personal factors affecting usage. She found that applied research (addressing operational concerns and consistently comprising about 50% of published output (7, 8)) is most widely used. She observes that most research is not consulted because it seems divorced from the real concerns of practice or is presented in ways that impair understanding and application (6). Practitioners, on the other hand, are frequently accused of failing to engage with the findings from research that might shape their future practice (9). Cullen describes practitioners' innate suspicion of research, concluding that "we do not make enough use of research to improve services or practice" (10).
The New Review of Information and Library Research 2003
© 2003 Taylor & Francis Ltd
DOI: 10.1080/13614550410001687909
Straddling this divide, with an uncertain foot in both camps, are the
practitioner-researchers. Practitioner-researchers tend to use such designs
as survey research, action research and secondary data analysis (11) which
are more likely to struggle for acceptance by bona fide academic
researchers. Time spent in acquiring research skills and conducting the
research itself may be at the expense of their continuing to be viewed by
colleagues as ‘‘real’’ practitioners.
Over the years proposed solutions to bridge the research-practice gap have
included mentors, secondments and collaborative research networks (12).
Such measures seek to address the organisational and structural barriers
while doing little to challenge the prevailing culture of librarianship.
Achieving a real difference requires a paradigm shift. Over recent years
many have claimed that that paradigm is ‘‘evidence based practice’’:
Evidence-based librarianship represents something different: a deliberate approach to change. (13)
This review is believed to be the first in the general library literature to survey developments and outputs from that specific branch of evidence based practice labelled, not uncontroversially, as "evidence based librarianship" (14). It focuses primarily on literature produced since a health-specific review appeared in Medical Reference Services Quarterly in 2002 (15), augmented by a wider survey of materials targeted at audiences outside the health domain.
2. EVIDENCE BASED PRACTICE
Evidence based practice advocates the collection, interpretation, and integration of valid, important and applicable evidence (16). Such evidence may be reported by a user/patient/client/parent, observed by a librarian/clinician/social worker/teacher, or derived from rigorously conducted research. Irrespective of its origin, the best available evidence, moderated by sensitivity to a user/client/patient's values and preferences, is harnessed to improve the quality of day-to-day decision-making (17, 18).

This model of knowledge management promotes the use of research in making decisions that benefit individuals or whole populations. In doing so, evidence based practice seeks to address information overload (19), information delay (20) and information entropy (21). These problems are particularly manifest in, and indeed critical to, the field of medicine, from within which evidence based practice first appeared as "evidence based medicine".
Evidence based medicine emerged from McMaster University, Canada, in the early 1990s (22). Having previously enjoyed a modest incarnation as clinical epidemiology (23) (literally the application of results gained from the study of populations to the care of an individual patient), evidence based medicine demonstrated both a greater relevance and increasing sophistication in applying research at the bedside. It is no accident that its growth coincided with revolutionary developments in information and communications technologies (24). The paradigm soon encompassed specific branches of medicine such as psychiatry (25) and dentistry (26) and related domains such as nursing (27), pathology (28) and pharmacotherapy (29). By the mid-1990s a broader term, evidence based healthcare (30), was a portmanteau for wide ranging activities promoted within and outside medicine.
The late 1990s saw evidence based healthcare spread to contiguous fields
such as education (31), social work (32, 33), human resource management
(34) and criminology (35). An even broader term, evidence based practice, captures the commonality of approaches across a broad spectrum of professional endeavour (36).
Ford and colleagues (37) have proposed a number of characteristics that identify a discipline as suited to an evidence based practice model. These include a substantive knowledge base and a requirement for informed decision-making. Library and information science seems to possess all these purported characteristics.
3. THE POLICY CONTEXT
While evidence based practice has a fundamental appeal for any practi-
tioner who wishes to develop skills for lifelong learning in pursuit of
professional excellence, it is the political imperative that has added
significant weight to adoption of the paradigm. This growing interest in
using research evidence to inform policy and practice is reflected in
documents from such sources as the Cabinet Office (38, 39), with an emphasis on the identification, synthesis and application of rigorous
evidence to problem-solving (40). The main tool for such evidence
synthesis is the systematic review (41). Unlike traditional reviews,
systematic reviews of the literature include a detailed appraisal of the
research studies identified (42). In healthcare, an emphasis on evaluating
evidence of effectiveness for inclusion in systematic reviews has led to the
emergence of distinct critical appraisal criteria, embodied in checklists.
Booth and Haines (43) were among the first to highlight the value of the systematic review for the general library practitioner.
4. EVIDENCE BASED LIBRARIANSHIP
With their increased participation within multidisciplinary teams involved
in critical appraisal and methodological assessment of clinical articles (44),
librarians and information specialists have become increasingly adept at
recognising and critiquing experimental studies of drugs, operations and
similar interventions. From here it is a natural progression to applying such techniques to their own professional literature in seeking to uncover
enduring, generic truths that form the foundations of evidence-based
librarianship (45). Respondents to the CILIP-funded study The LIS
research landscape: a review and prognosis (46) have observed, with regard
to evidence based librarianship, that
as this practice was being adopted by an increasing number of professions, LIS should take a similar approach to improve its image and professional standing. (47)
The immaturity of the paradigm is reflected in the absence of a consensual
definition of evidence based librarianship (EBL). At least four published
attempts at defining EBL currently exist. Three of these originate from the
diverse health librarianship communities of the US, the UK and Canada
while the fourth derives from the school librarianship sector. Each definition has characteristics to recommend it.
Eldredge stresses the pragmatic context for EBL, emphasising the multiplicity of important research designs:
Evidence-Based Librarianship (EBL) seeks to improve library practice by utilising the best available evidence in conjunction with a pragmatic perspective developed from working experiences in librarianship. The best available evidence might be produced from quantitative or qualitative research designs, although EBL encourages more rigorous forms over less rigorous forms of evidence when making decisions. (48)
Booth adapts a pre-existing definition of Evidence Based Healthcare,
coined by Anne McKibbon from McMaster University (17):
Evidence-based librarianship (EBL) is an approach to information science that promotes the collection, interpretation and integration of valid, important and applicable user-reported, librarian observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgements. (49)
A recent online poll on the competing health sector definitions of EBL,
part of an international EBL continuing education course, assigned this
definition an overwhelming majority of votes, primarily because of its focus
on the user (seen in ‘‘user-reported’’ evidence and the requirement for
evidence to be ‘‘moderated by user needs and preferences’’).
Crumley and Koufogiannakis populate their definition by itemising the stages of the evidence based practice process:
Evidence-based librarianship (EBL) is a means to improve the profession of librarianship by asking questions, finding, critically appraising and incorporating research evidence from library science (and other disciplines) into daily practice. It also involves encouraging librarians to conduct research. (50, 51)
Within the context of research utilisation the emphasis on the librarian as
practitioner-researcher is perhaps the most noteworthy contribution of this
definition.
Todd (52), a US academic working within school librarianship, identifies
two dimensions to evidence based librarianship:
First . . . it focuses on the conscientious, explicit and carefully chosen use of current best research evidence in making decisions about the performance of the day-by-day role. Second, evidence-based practice is where day-by-day professional work is directed towards demonstrating the tangible impact and outcomes of sound decision making and implementation of organizational goals and objectives.
Todd's second dimension (53), wherein the local librarian gathers "meaningful and systematic evidence of the impact of the librarian's instructional
initiatives on student learning outcomes’’ echoes, albeit in connection with
evaluation rather than research, Crumley and Koufogiannakis’ emphasis
on the practitioner researcher.
Recently Booth and Brice (54) have attempted, hitherto unsuccessfully, to hasten the adoption of the term "evidence based information practice" in preference to "evidence based librarianship" (55). This term has its early origins in a statement by the then NHS Libraries Adviser, Margaret Haines, in the newsletter Evidence Based Purchasing:

What I find particularly appealing about the [Anglia and Oxford Librarian of the 21st Century] programme is that it will not only help librarians to support evidence-based practice of their users but it will also develop their own evidence-based information practice [italics added] which should result in more cost-effective and higher quality information support to the NHS. (56)
Evidence based information practice exhibits several advantages such as its facility to include the wider context of informatics, information literacy and
information systems (57, 58). It capitalises, by association, on the evidence
based practice movement offering the flexibility to include the role of
information workers in supporting the practice of others (59) together with
their role in incorporating the evidence base of their own practice (55).
Opponents of the term ‘‘evidence based information practice’’ counter that
these two distinct roles should remain separate. Others manifest a dogged,
albeit understandable, attachment to the ‘‘librarianship’’ label. Add to this
debates about whether information practice is "based", "supported" or "informed" by evidence, offensives already waged unsuccessfully within the
wider arena (60, 61), and it becomes clear that there is far to be travelled before consensus is reached.
5. THE PROCESS OF EVIDENCE BASED PRACTICE
Evidence based practice emphasises four requisite information management processes: problem specification (focusing the question); searching the literature; filtering search results; and critical appraisal (assessing retrieved items for validity, reliability and applicability). These are then followed by two implementation-related tasks: applying the results and evaluating performance. The classical model of evidence based practice conflates the searching and filtering processes but, as the former focuses on subject relevance and the latter accentuates considerations of quality, it is more accurate, within an information management context, to consider these separately. Significantly, the original adherents of evidence based medicine have been forced to acknowledge the inherent impracticality of performing the entire process when encountering an information need. Recent years have seen an emphasis on practitioners (users of evidence) benefiting from the preappraised (preassessed) products of others (62).
Proponents of evidence based librarianship have contributed to all stages of the process, taking the techniques of the wider paradigm and replicating or modifying them before applying them to their own practice.
5.1. Focusing the Question
The first stage of evidence based practice is to convert information needs
from practice into ‘‘focused, structured questions’’ (63, 64). The goal of this
primary stage, variously called focusing or formulating your question (65), is
to convert a pressing, yet possibly vaguely expressed, information need into
an ‘‘answerable question’’. Advocates of evidence based healthcare have
given this stage considerable attention (66, 67) and this is increasingly being
mirrored within information practice. Eldredge (68) comments:
Questions drive the entire EBL process. EBL assigns highest priority to posed questions with greatest relevance to library practice. The wording and content of the questions will determine what kinds of research designs are needed to secure answers.
A foreground question, featuring a decision between two competing
choices of action (in contrast to an unspecified information need),
commonly contains up to four elements (65):
• A Population: those who are recipients or potential beneficiaries of a service or intervention;
• An Intervention: the service or planned action that is being delivered to the population;
• The Outcomes: the ways in which the service or action can be measured to establish whether or not it has had a desired effect, and, optionally
• A Comparison: an alternative service or action that may or may not achieve similar outcomes.
Such focused questions have been shown to elicit more information from requesters, thereby yielding more precision in specifying search
requests (69, 70). This PICO model from evidence based practice has been
adapted as ECLIPSE (Expectation, Client Group, Location, Impact,
Professionals, SErvice) for use by librarians in specifying managers’
information needs (71). Of more general applicability, however, is a
corresponding SPICE framework (55) for specifying the questions of
evidence based information practice:
• SETTING
• PERSPECTIVE
• INTERVENTION
• COMPARISON
• EVALUATION
In this case Population from the PICO model has been subdivided into the Setting or context of the service and the Perspective (user, manager, carer, information professional, etc.) which combine to moderate the impact of any intervention. This recognises that, unlike a drug intervention with pharmacodynamic properties that generally apply across multiple settings and populations, library and information science is a far more complex human-mediated discipline. So, from the perspective of an undergraduate student (PERSPECTIVE) in a university library (SETTING), is provision of a short term loan collection (INTERVENTION) more effective than a general collection (COMPARISON) in terms of the percentage availability of recommended texts (EVALUATION)?
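As a hedged illustration only, the SPICE elements lend themselves to a simple structured representation. The sketch below (class and field names are illustrative, not drawn from any published tool) assembles the undergraduate loan-collection example above into a single answerable question:

```python
from dataclasses import dataclass

@dataclass
class SpiceQuestion:
    """Illustrative container for the SPICE framework (55).

    The field names mirror the framework's five elements; the class
    itself is a hypothetical sketch, not an implementation from the
    literature.
    """
    setting: str        # context of the service
    perspective: str    # user, manager, carer, information professional, etc.
    intervention: str   # the service or planned action
    comparison: str     # an alternative service or action
    evaluation: str     # how the effect is to be measured

    def as_question(self) -> str:
        # Assemble the elements into one focused, answerable question.
        return (
            f"From the perspective of {self.perspective} in {self.setting}, "
            f"is {self.intervention} more effective than {self.comparison} "
            f"in terms of {self.evaluation}?"
        )

q = SpiceQuestion(
    setting="a university library",
    perspective="an undergraduate student",
    intervention="a short term loan collection",
    comparison="a general collection",
    evaluation="the percentage availability of recommended texts",
)
print(q.as_question())
```

Representing the question this way simply makes explicit which element each phrase of the question is playing, in the spirit of the framework.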
Another contribution of evidence based practice has been in characterising
question types thereby making it easier to map such questions to
appropriate study designs to answer them (72, 73). Similarly Crumley
and Koufogiannakis have attempted to characterise those domains that
typify library activity (50, 51). Initially they proposed six such domains
(51):
• Reference/Enquiries
• Education
• Collections
• Management
• Information Access and Retrieval
• Marketing/Promotions
Following a systematic content analysis of the library and information
studies literature published in 2001, Koufogiannakis, Crumley and Slater
(74) suggest revising their categories with the addition of Library History
and Professional Issues. Further research is needed to look at how optimal
search strategies might be developed to retrieve articles for each domain.
5.2. Finding the Evidence
As indicated by the breadth of evidence based librarianship domains above, information practice draws on an evidence base that covers an almost
unparalleled range of disciplines. This observation is supported by a
Library Association Health Libraries Group funded study which looked at
the feasibility of systematic reviews of the library literature (75-77).
Literature searches on a sample topic of end user searching found relevant
evidence not just in obvious sources (Library Literature, Information
Science Abstracts and Library and Information Science Abstracts) but also
in the three main biomedical databases MEDLINE, EMBASE and CINAHL, the computing databases INSPEC and COMPENDEX and
the multidisciplinary indices, Science and Social Science Citation Indexes.
Given evidence based practice’s focus on research methodology it is not
surprising to see the practitioners of evidence based librarianship calling
for the use of structured abstracts (78) as already widely provided in
medical journals (79). Such abstracts are structured around such pre-specified sections as subjects, intervention, outcomes, methods, results and
conclusions. They have been shown to improve retrieval (80) and to enable
rapid assimilation. Research commissioned by the British Library (81)
suggests that structured abstracts are feasible for the research literature of social sciences, including librarianship. Whilst not necessarily more
accurate than traditional abstracts, they are significantly longer, more
readable and more informative (82, 83). Fears that they take up too much
space appear misplaced (84) and there is growing support for their more
widespread introduction (85, 86).
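One practical reason structured abstracts aid retrieval is that their labelled sections are machine-separable. The sketch below is purely illustrative (the section labels follow those listed above, and the sample abstract is invented for the example; real journals vary in the headings they mandate):

```python
import re

# Illustrative section labels, drawn from the list in the text above.
SECTION_LABELS = ["Subjects", "Intervention", "Outcomes",
                  "Methods", "Results", "Conclusions"]

def parse_structured_abstract(text: str) -> dict:
    """Split a structured abstract into its labelled sections.

    A hypothetical sketch: it assumes each section starts with
    'Label:' at the beginning of a line, with continuation lines
    appended to the current section.
    """
    pattern = r"^(%s):" % "|".join(SECTION_LABELS)
    sections = {}
    current = None
    for line in text.splitlines():
        m = re.match(pattern, line.strip())
        if m:
            current = m.group(1)
            sections[current] = line.strip()[len(current) + 1:].strip()
        elif current:
            sections[current] += " " + line.strip()
    return sections

# Invented example abstract for demonstration only.
abstract = """Subjects: 120 undergraduate library users.
Intervention: A short term loan collection.
Outcomes: Percentage availability of recommended texts.
Conclusions: Availability improved under short term loans."""

parsed = parse_structured_abstract(abstract)
print(sorted(parsed))
```

A database that indexes such sections separately can, for example, restrict a search to the Methods section, which is one mechanism behind the improved retrieval reported in (80).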
5.3. Filtering Search Results
Growing interest in evidence based practice, encouraging practitioners to
base decisions on sound research evidence, has stimulated the development of so-called "methodological filters" (87). Such methodological search
filters initially arose out of researchers' concerns in locating "Randomized Controlled Trials" (RCTs) to avoid publication bias, associated with flawed results and invalid conclusions. The term "methodological search filter"
was coined by Wilczynski et al. (88) to signify ‘‘a search term or terms
(such as ‘random allocation’ for sound studies of medical intervention) that
select studies that are at the most advanced stages of testing for clinical
application’’. White and colleagues define them further as: ‘‘collections of
search terms intended to capture frequently sought research methods’’ (89).
Methodological search filters have risen to prominence over the last 20 years (90). They are distinguished from hedges, lists of terms designed to
retrieve information on selected subjects, in that they focus on the design of
a study, not its subject content. They are therefore combined with subject
terms to increase precision, not as subject permutations or variants to
increase recall. McKibbon and colleagues (91) have identified terms to be
used for retrieving different study designs from the major health biblio-
graphic databases. Beverley (92) has briefly suggested some corresponding
terms for use with the library and information science literature.
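The distinction between a hedge and a filter can be made concrete: subject terms are ORed together to increase recall, then ANDed with the ORed filter terms so that only records matching both subject and study design are retrieved. The sketch below is illustrative only; the search terms and the generic boolean syntax are assumptions, not tied to any particular database:

```python
def build_query(subject_terms, filter_terms):
    """Combine subject terms with a methodological filter.

    Subject terms are ORed together (a hedge-like recall device),
    then ANDed with the ORed filter terms so only records matching
    both subject and study design are retrieved. Terms and syntax
    are illustrative, not specific to any one database.
    """
    subject = " OR ".join('"%s"' % t for t in subject_terms)
    methods = " OR ".join('"%s"' % t for t in filter_terms)
    return "(%s) AND (%s)" % (subject, methods)

# Hypothetical example: the end user searching topic from the text,
# restricted to trial-like study designs.
query = build_query(
    ["end user searching", "online searching"],
    ["randomized controlled trial", "random allocation"],
)
print(query)
```

The filter clause here increases precision rather than recall, which is exactly why it is ANDed with, rather than added to, the subject terms.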
5.4. Appraising the Literature
In order to ask the all-important questions (is this research valid, reliable and applicable to my practice?) librarians need to be able to appraise each
study. Mirroring early developments in EBM, whereby a series of Users’
Guides to the Medical Literature was produced (93) to address particular question types, Booth and Brice (94) initiated a Critical Skills Training in
Appraisal for Librarians (CriSTAL) programme to develop similar guides
to address common types of questions asked by librarians (95). The first
two guides, prioritised according to the prevalence and importance of their
topics, cover Use Studies and Information Needs Analyses. Although such
guides are well-established for quantitative research designs such as clinical
trials and cohort studies, it is comparatively recently that qualitative
research has been similarly served. For example, within information systems Atkins and Sampson (96) have suggested guidelines to appraise
case study research, a commonly occurring research design within
information practice. Clyde (97) has conducted a small-scale empirical
study to look at how evaluation of research evidence might be operationalised among school librarians.
5.4.1. Hierarchy of Study Design
Central to selection and assessment of research studies is the concept of the
hierarchy of evidence. A not uncontroversial concept, this idea is criticised
as being "discipline centric" with "research methodologies employed by social scientists . . . ranked at the bottom of the hierarchy" (98). In actuality
the supporters of evidence based practice are not advancing a quantitative hegemony, having previously acknowledged that it is the question that ultimately should determine the chosen design (99).
One way of reconciling these paradigm wars is to consider the hierarchy
illustrated in Figure 1 as more properly representing a hierarchy of
effectiveness. Other question types, for example user views, require a
markedly different hierarchy. A variant of this approach is advocated by
Eldredge who divides questions into those dealing with Prediction,
Intervention and Exploration (100).
In contrast, Edwards et al. (101), circumventing the acknowledged
limitations of such hierarchies, suggest the superiority of a ‘‘signal to
noise’’ approach. Here the strength of a research signal or message is
ranged against the possible noise resulting from inadequacies of research
design.
At the top of the hierarchy of effectiveness is the systematic review.
Systematic reviews address sharply defined questions, use explicit and
rigorous methods to identify, critically appraise and synthesize studies
meeting explicit inclusion and exclusion criteria. Increasingly such reviews
offer a synthesis of messages from research to date, spelling out the
implications of these for practitioners and highlighting the future research
agenda. Examples from recent information practice include those by
Winning and Beverley (102), Beverley et al. (103), Brettle (104), and
Tenopir (105). Weller (106), for example, extends the evidence based model
to scientific publishing, providing a systematic review of empirical studies
on the editorial peer review process from 1945 to 1997. Other reviews, while
not necessarily accommodating a full systematic reviews approach, employ
increasingly systematic literature searching and/or appraisal techniques.
FIG. 1: Hierarchy of evidence (101).

Such reviews, with information as both subject and object, offer many
opportunities for librarians. Beverley and colleagues (107) map the stages of the systematic review process to the varying competencies that
information professionals can contribute to a review team.
Meanwhile Eldredge has initiated a process of identifying, documenting
and illustrating occurrences of more rigorous designs in librarianship
starting with randomised controlled trials (108) and cohort studies (109).
5.5. Applying the Results in Practice
While finding and appraising the evidence base for information practice
carry their own challenges it is implementation that poses a greater
challenge to the evidence based practice movement. In his previous review
(15) Booth contrasts individual approaches to using research findings,
based on enthusiasm and personal motivation, with organisational
mechanisms, harnessing systems and resources. Individual enthusiasm
may drive the establishment of journal clubs (110) such as those described
by Koufogiannakis and Crumley (111) and Doney and Stanton (112). However, such approaches lie outwith the organisational agenda. In
contrast, guidelines specifying best practice, based on research evidence,
usually receive organisational endorsement without any guarantee that
they will be read and used by practitioners. More recently, Younger (113)
has opened up the prospect of a ‘‘third way’’ by extending thinking on the
organisational contribution beyond obvious ‘‘bolt-on’’ evidence based
approaches to full integration with existing activities. For example she
observes that
performance evaluation criteria do not include the expectation that university librarians should be using their research, or that of others, in carrying out their responsibilities . . . academic libraries should be looking at rewarding individuals not just for producing research but also for incorporating research into operational decisions when warranted. (113)
Another suggestion, made by the same speaker at the 2nd Evidence Based Librarianship Conference, is the inclusion of an explicit requirement to
practise EBL within job advertisements and their corresponding job
descriptions (113). Oberg (114) identifies how teacher librarians can make a difference in terms of measurable gains in student achievement,
both by using research findings of others and by generating research
findings (first, analysing the results of national, provincial or local testing
programmes and then using local statistical data that is available
or easily obtained). Perhaps the greatest contribution to evidence based
librarianship can come, not from the comparatively thinly-populated
library and information science evidence base itself but rather from the
increasing body of reviews that address organisational change and dissemination of good practice (115).
5.6. Evaluating Your Performance
Although benchmarking and performance indicators have developed
outside evidence based librarianship it is clear that there are many
synergies to be enjoyed (116). Objective means of measurement are
required to evaluate whether the introduction of an evidence based
intervention has made the anticipated difference (10). Indicators selected must themselves be based on evidence. For example, accreditation checklists may have the "ring of truth" in terms of a practitioner's assessment of
their validity but are the criteria that are being assessed derived from the
research evidence? As Scott (117) remarks:
By using Performance Indicators it is possible to judge what the LRC [School LibraryResource Centre] is doing well; to identify where improvements are needed to raiseeffectiveness; to develop the service and to justify funding bids for these developments andimprovements. Importantly, using PIs is objective not subjective �/ not ‘‘I think’’ but ‘‘Theevidence shows’’.
Plutchak (118) warns, however, that:
The evidence-based movement pushes us in the direction of measurable goals and objectives. We need to be careful, however, not to confuse measurement with efficacy.
Notwithstanding such cautions, it is fitting that interest in evidence based librarianship comes at a time when the attention of the profession at large is moving from inputs and outputs towards outcomes. In this respect, recent work attempting to quantify the benefits of library services using contingent valuation heralds a new focus on cost effectiveness (119).
6. CURRENT DEVELOPMENTS
For a movement in its infancy, the achievements of Evidence Based Information Practice to date are impressive. Having featured at health information conferences, and even general library association conferences, the movement has already produced two conferences with international speakers and delegates, in Sheffield, UK (120, 121) and Edmonton, Canada (98). Discussion and debate have been supported via a JISCmail electronic list-server ([email protected]). Add to this evidence based librarianship workshops, a special supplement on evidence based health information practice and a multi-author publication entitled Evidence based practice for information professionals, and the impact of the paradigm is apparent (122, 123).
6.1. Issues to be Resolved
One issue generating considerable heat at the 2nd Evidence Based Librarianship conference was whether those present should focus on
advancing evidence based librarianship as a health information movement or aspire to influence the wider profession (124). The issue here was not
one of inclination but rather one of opportunity costs. This conference
included delegates from outside health librarianship, although the speakers
were predominantly drawn from the health sector. Some present felt that
the challenge confronting the handful of opinion leaders in impacting upon
the health information profession was great enough without diffusing
energies to the wider information community. A practical way forward
involves the opportunistic use of relatively ‘‘high impact, low energy’’ interventions such as publications and conference platforms to spread the
principles abroad more widely in the hope that local product champions
will, in turn, cascade them within their own sector. In this regard, the presence within other sectors of librarianship, such as special librarianship (125), public librarianship (126, 127) and school librarianship (128), of significant movements to stimulate the use of research in practice suggests that it is the label, not the philosophy, that needs to be the focus of our efforts.
7. THE WAY FORWARD
A priority for the evidence based information practice movement has to be
an authoritative consensus statement capturing progress to date and outlining a shared vision and values. While the ‘‘fission’’ brought about by juxtaposing the individual energies of evidence based librarianship communities across several continents has led to several exciting collaborations, this can only come a poor second best to the ‘‘fusion’’ resulting from
agreed objectives and priorities. Parallels with the Cochrane and Campbell Collaborations, international networks of researchers systematically identifying, analysing and synthesising the evidence, have been drawn on several occasions, but an equivalent remains tantalisingly elusive (44).
Much remains to be learnt about the characteristics of the information practice evidence base. Mainstream studies characterising the types of
outputs produced by the information science research community have yet
to be analysed for their implications with regard to such mechanisms as the
hierarchy of evidence and critical appraisal checklists. Experience with
those few checklists to emerge to date suggests that they have a valuable
role in supporting the conduct of systematic reviews in information topics.
In a recent article in Vine, Booth (55) posits the idea that moves to make
practitioners more responsive to the messages from research are actually
just one contribution to an overarching theme of reflective practice. This perspective on evidence based information practice, where research is
merely one stimulant for reflection (as compensation for previous neglect), finds welcome echoes in the following comment from Todd:
A profession without reflective practitioners willing to learn about the advances in research in the field is a blinkered profession, one that is disconnected from best practice and best thinking, and one which, by default, often resorts to advocacy and position as a bid for survival. (52)
Todd’s focus on the concept of the ‘‘reflective practitioner’’ (129, 130), as
originally proposed by Schon (131), might allay the concerns of those who
favour dropping the term evidence based for a term with less of a medical connotation. This step is seen by West (98) as ‘‘prerequisite to eliciting
greater acceptance by the profession as a whole of the idea of basing
professional decisions and actions on best available evidence’’.
Schmidt (132) offers an optimistic vision of the route by which the research-practice gap might be bridged in the future:
Library practitioners can improve their research and contribute to good practice and good research by developing skills, publishing and communicating results, developing sound proposals and seeking funds imaginatively. Academic researchers can benefit from the results of good practice through collaborative involvement in real projects.
Will the next few years witness enough evidence of such an outcome
becoming a reality (133)? The jury is still out!
REFERENCES
1. HUDGINS, C. A. and ALLEN-MEARES, P. Translational research: a new solution to an old problem? Journal of Social Work Education, 36(1), 2000, 2–4.
2. GORARD, S. A changing climate for educational research? The role of research capability building. ESRC Teaching and Learning Research Programme Research Capacity Building Network Occasional Paper Series, Paper 45. Cardiff: Cardiff University School of Social Sciences, 2002.
3. CLAYTON, P. Bridging the gap: research and practice in librarianship. In: P. Clayton and R. McCaskie, eds, Priorities for the future: proceedings of the first National Reference and Information Service Section conference and the university, college and research libraries section workshop on research. Deakin, ACT: ALIA and D.W. Thorpe, 1992, 73–76.
4. BOOTH, A. Research column: turning research priorities into answerable questions. Health Information and Libraries Journal, 18(2), 2001, 130–132.
5. WILSON, T. D. Philosophical foundations and research relevance: issues for information research. Journal of Information Science, 29(6), 2003, 445–452.
6. TURNER, K. Do information professionals use research published in LIS journals? Paper presented at the 68th IFLA Council and General Conference, August 18–24, 2002. http://www.ifla.org/IV/ifla68/prog02.htm
7. FEEHAN, P. E., GRAGG, W. L., HAVENER, W. M. and KESTNER, D. D. Library and information science research: an analysis of the 1984 journal literature. Library and Information Science Research, 9, 1987, 173–185.
8. HARTER, S. P. and HOOTEN, P. Information science and scientists: Journal of the American Society for Information Science 1972–1990. Journal of the American Society for Information Science, 43, 1992, 583–593.
9. MCLURE, C. and BISHOP, A. The status of research in Library and Information Science. College & Research Libraries, 40, 1989, 127–143.
10. CULLEN, R. Does performance measurement improve organizational effectiveness? A postmodern analysis. Performance Measurement and Metrics, 1(1), 1999, 9–30.
11. WATSON-BOONE, R. Academic librarians as practitioner-researchers. Journal of Academic Librarianship, 26(2), 2000, 85–93.
12. MCNICOL, S. LIS researchers and practitioners: creating a research culture. Library and Information Research News, 26(83), 2002, 10–16.
13. HOIVIK, T. Why is quality control so hard? Reference studies and reference quality in public libraries: the case of Norway. Paper presented at World Library and Information Congress: 69th IFLA General Conference and Council, 1–9 August 2003, Berlin. http://www.ifla.org/IV/ifla69/papers/131e-Hoivik.pdf
14. ELDREDGE, J. D. Evidence-based librarianship: an overview. Bulletin of the Medical Library Association, 88(4), 2000, 289–302.
15. BOOTH, A. From EBM to EBL: two steps forward or one step back? Medical Reference Services Quarterly, 21(3), 2002, 51–64.
16. MCKIBBON, K. A. Evidence-based practice. Bulletin of the Medical Library Association, 86(3), 1998, 396–401.
17. MCKIBBON, K. A., WILCZYNSKI, N., HAYWARD, R. S., WALKER-DILKS, C. J. and HAYNES, R. B. The medical literature as a resource for evidence based care. McMaster University: Health Information Research Unit, 1996. http://hiru.mcmaster.ca/hiru/medline/asis-pap.htm
18. BOOTH, A. Evidence based healthcare. In: J. Feather and P. Sturges, International Encyclopaedia of Information and Library Science, 2nd edn. London: Routledge, 2003, 189–190.
19. DAVIDOFF, F., HAYNES, R. B., SACKETT, D. L. and SMITH, R. Evidence-based medicine: a new journal to help doctors identify the information they need. British Medical Journal, 310(6987), 1995, 1085–1086.
20. ANTMAN, E. M., LAU, J., KUPELNICK, B., MOSTELLER, F. and CHALMERS, T. C. A comparison of results of meta-analyses of randomized control trials and recommendations of clinical experts. Treatments for myocardial infarction. JAMA, 268(2), 1992, 240–248.
21. STRAUS, S. E. and SACKETT, D. Using research findings in clinical practice. British Medical Journal, 317(7154), 1998, 339–342.
22. EVIDENCE BASED MEDICINE WORKING GROUP. Evidence-based medicine: a new approach to teaching the practice of medicine. JAMA, 268(17), 1992, 2420–2425.
23. SACKETT, D., HAYNES, R. B., TUGWELL, P. and GUYATT, G. Clinical epidemiology: a basic science for clinical medicine, 2nd edn. Boston: Little, Brown and Company, 1991.
24. COIERA, E. Evidence-based medicine, the internet, and the rise of medical informatics. Hewlett-Packard Laboratories Technical Report HPL-96-26, 1996.
25. GOLDNER, E. M. and BILSKER, D. Evidence-based psychiatry. Canadian Journal of Psychiatry, 40(2), 1995, 97–101.
26. RICHARDS, D. and LAWRENCE, A. Evidence based dentistry. British Dental Journal, 179(7), 1995, 270–273.
27. WHITE, S. J. Evidence-based practice and nursing: the new panacea? British Journal of Nursing, 6(3), 1997, 175–178.
28. FLEMING, K. A. Evidence-based pathology. Journal of Pathology, 179(2), 1996, 127–128.
29. ETMINAN, M., WRIGHT, J. M. and CARLETON, B. C. Evidence-based pharmacotherapy: review of basic concepts and applications in clinical practice. Annals of Pharmacotherapy, 32(11), 1998, 1193–1200.
30. GRAY, J. A. Evidence-based healthcare: how to make health policy and management decisions. London: Churchill Livingstone, 1997.
31. HAMMERSLEY, M. Evidence-based practice in education and the contribution of educational research. In: L. Trinder and S. Reynolds, eds, Evidence-based practice: a critical appraisal. Oxford: Blackwell Science, 2000, 163–183.
32. TRINDER, L. Evidence-based practice in social work and probation. In: L. Trinder and S. Reynolds, eds, Evidence-based practice: a critical appraisal. Oxford: Blackwell Science, 2000, 138–162.
33. ROSEN, A. Evidence-based social work practice: challenges and promise. Social Work Research, 27(4), 2003, 197–208.
34. BRINER, R. Evidence-based human resource management. In: L. Trinder and S. Reynolds, eds, Evidence-based practice: a critical appraisal. Oxford: Blackwell Science, 2000, 184–211.
35. PETROSINO, A., BORUCH, R., FARRINGTON, D., SHERMAN, L. and WEISBURD, D. Toward evidence-based criminology and criminal justice: systematic reviews, the Campbell Collaboration, and its Crime and Justice Group. Justice Research & Statistics Association Forum, 19, 2001.
36. TRINDER, L. and REYNOLDS, S., eds, Evidence-based practice: a critical appraisal. Oxford: Blackwell Science, 2000.
37. FORD, N., MILLER, D., BOOTH, A., O'ROURKE, A., RALPH, J. and TURNOCK, E. Information retrieval for evidence-based decision making. Journal of Documentation, 55(4), 1999, 385–401.
38. CABINET OFFICE. Modernising government. Cm 4310. London: Stationery Office, 1999, 66 pp. http://www.cabinet-office.gov.uk/moderngov/whtpaper/index.htm
39. STREATFIELD, D. Sifting the grist in the research mill? Information Research Watch International, March, 2000, 3–4.
40. BOAZ, A. and ASHBY, D. Fit for purpose? Assessing research quality for evidence based policy and practice. ESRC UK Centre for Evidence Based Policy and Practice. London, January 2003. 18 pp. (Working Paper 11).
41. BOAZ, A., ASHBY, D. and YOUNG, K. Systematic reviews: what have they got to offer evidence based policy and practice? ESRC UK Centre for Evidence Based Policy and Practice. London, March 2002. 26 pp. (Working Paper 2).
42. PETTICREW, M. Systematic reviews from astronomy to zoology: myths and misconceptions. British Medical Journal, 322(7278), 2001, 98–101. http://www.bmj.com/cgi/content/full/322/7278/98
43. BOOTH, A. and HAINES, M. Room for a review? Library Association Record, 100, August 1998, 411–412.
44. BOOTH, A. Research column: [systematic reviews of health information services and systems]. Health Information and Libraries Journal, 18(1), 2000, 60–63.
45. BOOTH, A. and ELDREDGE, J. D. Evidence-based librarianship: a Socratic dialogue. Bibliotheca Medica Canadiana, 23(4), 2002, 136–140.
46. MCNICOL, S. and NANKIVELL, C. The LIS research landscape: a review and prognosis. Birmingham: Centre for Information Research (CIRT), University of Central England, 2003. http://www.cie.uce.ac.uk/cirt/projects/past/LISlandscape_final report.pdf
47. MCNICOL, S. LIS researchers and practitioners: creating a research culture. Library and Information Research News, 26(83), 2002, 10–16.
48. ELDREDGE, J. D. Evidence-based librarianship: what might we expect in the years ahead? Health Information and Libraries Journal, 19(2), 2002, 71–77.
49. BOOTH, A. Exceeding expectations: achieving professional excellence by getting research into practice. LIANZA 2000, Christchurch, New Zealand, October 15–18, 2000. http://www.conference.co.nz/lianza2000/papers/AndrewBooth.pdf
50. CRUMLEY, E. and KOUFOGIANNAKIS, D. Developing evidence based librarianship in Canada: six aspects for consideration. Hypothesis, 15, 2001, 9–10.
51. CRUMLEY, E. and KOUFOGIANNAKIS, D. Developing evidence-based librarianship: practical steps for implementation. Health Information and Libraries Journal, 19(2), 2002, 61–70.
52. TODD, R. J. School librarians as teachers: learning outcomes and evidence-based practice. 68th IFLA Council and General Conference, August 18–24, 2002. http://www.ifla.org/IV/ifla68/papers/084-119e.pdf
53. TODD, R. J. Learning in the information age school: opportunities, outcomes and options. International Association of School Librarianship (IASL) 2003 Annual Conference, Durban, South Africa, 7–11 July 2003. http://www.iasl-slo.org/conference2003-virtualpap.html
54. BOOTH, A. and BRICE, A., eds, Evidence based practice: a handbook for information professionals. London: Facet Publishing, 2004.
55. BOOTH, A. Where systems meet services: towards evidence-based information practice. Vine, 33(2), 2003, 65–71.
56. HAINES, M. Librarians and evidence based purchasing. Evidence Based Purchasing, 8, 1995, 1.
57. ATKINS, C. F. and LOUW, G. Reclaiming knowledge: a case for evidence-based information systems. Proceedings of the 8th European conference on information systems, Vienna, 3–5 July, 2000, 39–45.
58. KAPLAN, R. B. and WHELAN, J. S. Buoyed by a rising tide: information literacy sails into the curriculum on the currents of evidence based medicine and professional competency objectives. Journal of Library Administration, 36(1/2), 2002, 219–235.
59. BEAVEN, O. and MCHUGH, J. An introduction to evidence-based health care and the opportunities it presents for information professionals – clinical evidence as an example. Vine, 33(4), 2003, 179–183.
60. DESHPANDE, N., PUBLICOVER, M., GEE, H. and KHAN, K. S. Incorporating the views of obstetric clinicians in implementing evidence-supported labour and delivery suite ward rounds: a case study. Health Information and Libraries Journal, 20(2), 2003, 86–94.
61. ENTWISTLE, V. A., SHELDON, T. A., SOWDEN, A. and WATT, I. S. Evidence-informed patient choice. Practical issues of involving patients in decisions about health care technologies. International Journal of Technology Assessment in Health Care, 14(2), 1998, 212–225.
62. GUYATT, G. H., MEADE, M. O., JAESCHKE, R. Z., COOK, D. J. and HAYNES, R. B. Practitioners of evidence based care. British Medical Journal, 320(7240), 2000, 954–955.
63. ROSENBERG, W. and DONALD, A. Evidence based medicine: an approach to clinical problem-solving. British Medical Journal, 310(6987), 1995, 1122–1126.
64. SACKETT, D. L. and ROSENBERG, W. M. Need for evidence-based medicine. Journal of the Royal Society of Medicine, 88(11), 1995, 620–624.
65. RICHARDSON, W., WILSON, M., NISHIKAWA, J. and HAYWARD, R. The well-built clinical question: a key to evidence-based decisions. ACP Journal Club, 123, 1995, A12–13.
66. FLEMMING, K. Asking answerable questions. Evidence-Based Nursing, 1(2), 1998, 36–37.
67. BOOTH, A. Formulating the question. In: A. Booth and G. Walton, eds, Managing knowledge in health services. London: Library Association, 2000, 197–206.
68. ELDREDGE, J. D. Evidence-based librarianship: formulating EBL questions. Bibliotheca Medica Canadiana, 22, 2000, 74–77.
69. BOOTH, A., O'ROURKE, A. J. and FORD, N. J. Structuring the pre-search reference interview: a useful technique for handling clinical questions. Bulletin of the Medical Library Association, 88(3), 2000, 239–246.
70. VILLANUEVA, E. V., BURROWS, E. A., FENNESSY, P. A., RAJENDRAN, M. and ANDERSON, J. N. Improving question formulation for use in evidence appraisal in a tertiary care setting: a randomised controlled trial [ISRCTN66375463]. BMC Medical Informatics and Decision Making, 1, 2001, 4. http://www.biomedcentral.com/1472-6947/1/4
71. WILDRIDGE, V. and BELL, L. How CLIP became ECLIPSE: a mnemonic to assist in searching for health policy/management information. Health Information and Libraries Journal, 19(2), 2002, 113–115.
72. GORMAN, P. N. and HELFAND, M. Information seeking in primary care: how physicians choose which clinical questions to pursue and which to leave unanswered. Medical Decision Making, 15, 1995, 113–119.
73. BARRIE, A. R. and WARD, A. M. Questioning behaviour in general practice: a pragmatic study. British Medical Journal, 315(7143), 1997, 1512–1515.
74. KOUFOGIANNAKIS, D., CRUMLEY, E. and SLATER, L. A content analysis of librarianship research. Journal of Information Science (in press).
75. BOOTH, A. Testing the LORE of research. Library Association Record, 100(12), 1998, 654.
76. BOOTH, A. Library-LORE (Literature Oriented Reviews of Effectiveness). Hypothesis, 14, 2000, 11–12.
77. BOOTH, A. Librarian heal thyself: evidence based librarianship, useful, practicable, desirable? Proceedings of the 8th international congress on medical librarianship, London, UK, July 2–5, 2000. http://www.icml.org/tuesday/themes/booth.htm
78. BAYLEY, L. and ELDREDGE, J. The structured abstract: an essential tool for researchers. Hypothesis, 17(1), 2003, 1, 11–13.
79. AD HOC WORKING GROUP FOR CRITICAL APPRAISAL OF THE MEDICAL LITERATURE. A proposal for more informative abstracts of clinical articles. Annals of Internal Medicine, 106(4), 1987, 598–604.
80. BOOTH, A. and O'ROURKE, A. J. The value of structured abstracts in information retrieval from MEDLINE. Health Libraries Review, 14(3), 1997, 157–166.
81. HARTLEY, J. Is it appropriate to use structured abstracts in social science journals? Learned Publishing, 10(4), 1997, 313–317. http://cogprints.ecs.soton.ac.uk/archive/00000589/00/199801003.html
82. HARTLEY, J. Are structured abstracts more or less accurate than traditional ones? A study in the psychological literature. Journal of Information Science, 26(4), 2000, 273–277.
83. HARTLEY, J. Improving the clarity of journal abstracts in psychology: the case for structure. Science Communication, 24(3), 2003, 366–379.
84. HARTLEY, J. Do structured abstracts take more space? And does it matter? Journal of Information Science, 28(5), 2002, 417–422.
85. TAYLOR, B. J., DEMPSTER, M. and DONNELLY, M. Hidden gems: systematically searching electronic databases for publications for social work and social care. British Journal of Social Work, 33(4), 2003, 423–439.
86. GRAYSON, L. and GOMERSALL, A. A difficult business: finding the evidence for social science reviews. ESRC UK Centre for Evidence Based Policy and Practice, Department of Politics, Queen Mary, University of London, Mile End Road, London E1 4NS, 2003. 23 pp. (Working Paper 19). http://www.evidencenetwork.org/Documents/wp19.pdf
87. HAYNES, R. B., WILCZYNSKI, N., MCKIBBON, K. A., WALKER, C. J. and SINCLAIR, J. C. Developing optimal search strategies for detecting clinically sound studies in MEDLINE. Journal of the American Medical Informatics Association, 1(6), 1994, 447–458.
88. WILCZYNSKI, N. L., WALKER, C. J., MCKIBBON, K. A. and HAYNES, R. B. Quantitative comparison of pre-explosions and subheadings with methodologic search terms in MEDLINE. Proceedings – the annual symposium on computer applications in medical care, 1994, 905–909.
89. WHITE, V. J., GLANVILLE, J. M., LEFEBVRE, C. and SHELDON, T. A. A statistical approach to designing search filters to find systematic reviews: objectivity enhances accuracy. Journal of Information Science, 27(6), 2001, 357–370.
90. MARSHALL, J. Sizzling search strategies: how to put some methods terms in your Medline searches. Bibliotheca Medica Canadiana, 5(3), 1983, 88–90.
91. MCKIBBON, A., EADY, A. and MARKS, S. PDQ evidence-based principles and practice. Hamilton, Ontario, Canada: BC Decker Inc, 1999.
92. BEVERLEY, C. Searching the literature. In: A. Booth and A. Brice, eds, Evidence based practice: a handbook for information professionals. London: Facet Publishing, 2004, 89–103.
93. GUYATT, G. and DRUMMOND, R., eds, Users' guides to the medical literature: a manual for evidence-based clinical practice. Chicago: AMA Press, 2002.
94. BOOTH, A. and BRICE, A. Research. Health Information and Libraries Journal, 18(3), 2001, 175–177.
95. BOOTH, A. and BRICE, A. Clear-cut?: facilitating health librarians to use information research in practice. Health Information and Libraries Journal, 20(suppl 1), 2003, 45–52.
96. ATKINS, C. and SAMPSON, J. Critical appraisal guidelines for single case study research. ECIS 2002, June 6–8, Gdansk, Poland.
97. CLYDE, L. A. Evidence based practice in school librarianship: evaluating the research evidence. Access, 17(4), 2003, 26–29.
98. WEST, K. The librarianship conference report: convincing evidence. Information Outlook, December, 2003, 12–14.
99. SACKETT, D. L. and WENNBERG, J. E. Choosing the best research design for each question: it's time to stop squabbling over the ‘‘best’’ methods. British Medical Journal, 315(7123), 1997, 1636.
100. ELDREDGE, J. D. Evidence-based librarianship: levels of evidence. Hypothesis, 16(3), 2002, 10–13. http://gain.mercer.edu/mla/research/hypothesis.html
101. EDWARDS, A. G. K., RUSSELL, I. T. and STOTT, N. C. H. Signal versus noise in the evidence base for medicine: an alternative to hierarchies of evidence. Family Practice, 15(4), 1998, 319–322.
102. WINNING, M. A. and BEVERLEY, C. A. Clinical librarianship: a systematic review of the literature. Health Information and Libraries Journal, 20(suppl 1), 2003, 10–21.
103. BEVERLEY, C. A., BATH, P. A. and BOOTH, A. Health information needs of visually impaired people: a systematic review of the literature. Health and Social Care in the Community, 12(1), 2004, 1–24.
104. BRETTLE, A. Information skills training: a systematic review of the literature. Health Information and Libraries Journal, 20(suppl 1), 2003, 3–9.
105. TENOPIR, C. (with the assistance of Hitchcock, B. and Pillow, A.) Use and users of electronic library resources: an overview and analysis of recent research studies. Washington: Council on Library and Information Resources, 2003. 72 pp. http://www.clir.org/pubs/abstract/pub120abst.html
106. WELLER, A. C. Editorial peer review: its strengths and weaknesses. Medford, NJ: Information Today, 2001. 342 pp. (ASIS&T Monograph Series).
107. BEVERLEY, C. A., BOOTH, A. and BATH, P. A. The role of the information specialist in the systematic review process: a health information case study. Health Information and Libraries Journal, 20(2), 2003, 65–74.
108. ELDREDGE, J. D. The randomised controlled trial design: unrecognized opportunities for health sciences librarianship. Health Information and Libraries Journal, 20(suppl 1), 2003, 34–44.
109. ELDREDGE, J. Cohort studies in health sciences librarianship. Journal of the Medical Library Association, 90(4), 2002, 380–392.
110. GRANT, M. J. Journal clubs for continued professional development. Health Information and Libraries Journal, 20(suppl 1), 2003, 72–73.
111. KOUFOGIANNAKIS, D. and CRUMLEY, E. Facilitating evidence-based librarianship: a Canadian experience. Health Information and Libraries Journal, 20(suppl 1), 2003, 73–75.
112. DONEY, L. and STANTON, W. Facilitating evidence-based librarianship: a UK experience. Health Information and Libraries Journal, 20(suppl 1), 2003, 76–78.
113. YOUNGER, J. Integrating evidence and practice. Presentation at 2nd International EBL Conference, Edmonton, Alberta, June 4–5, 2003.
114. OBERG, D. Looking for evidence: do school libraries improve student achievement? School Libraries in Canada, 22(2), 2002, 10–13, 44.
115. BOOTH, A. On a cautious adoption of innovative projects. Health Information and Libraries Journal, 19(4), 2002, 239–242.
116. DAVIES, J. E. What gets measured, gets managed: statistics and performance indicators for evidence based management. Journal of Librarianship and Information Science, 34(3), 2002, 129–133.
117. SCOTT, E. S. How good is your school library resource centre? An introduction to performance measurement. 68th IFLA Council and General Conference, August 18–24, 2002, p. 2.
118. PLUTCHAK, T. S. The art and science of making choices. Journal of the Medical Library Association, 91(1), 2003, 1–3.
119. BRITISH LIBRARY. Measuring our value: results of an independent economic impact study commissioned by the British Library to measure the Library's direct and indirect value to the UK economy. London: British Library, 2003.
120. BOOTH, A. Mirage or reality? Health Information and Libraries Journal, 19(1), 2002, 56–58.
121. BOOTH, A. Evidence-based librarianship: one small step. Health Information and Libraries Journal, 19(2), 2002, 116–119.
122. BOOTH, A. Spotlight on evidence-based librarianship. Bibliotheca Medica Canadiana, 23(3), 2002, 84–85.
123. KOUFOGIANNAKIS, D. and CRUMLEY, E. Evidence based librarianship. Feliciter, 48(3), 2002, 112–114.
124. STREATFIELD, D. Towards evidence-based library and information work? Information Research Watch International, October, 2002, 3–4.
125. MARSHALL, J. Influencing our professional practice by putting our knowledge to work. Information Outlook, 7(1), 2003, 40–44.
126. STREATFIELD, D. From action to research: and back again? Information Research Watch International, August, 2002, 3–4.
127. DAHLGREEN, M. Better libraries through research: using research to inform library practice. OLA Quarterly, 8(3), 2002, 2–9.
128. WILLIAMS, I. Evidence based practice: the sustainable future for teacher librarians. School Library Bulletin, 8(2), 2002. http://www.education.tas.gov.au/0278/issue/022/sustainablefuture.htm
129. TODD, R. J. Evidence-based practice I: the sustainable future for teacher-librarians. Scan, 21(1), 2002, 30–37.
130. TODD, R. J. Evidence-based practice II: getting into the action. Scan, 21(2), 2002, 34–41.
131. SCHON, D. The reflective practitioner: how professionals think in action. Avebury: Ashgate Publishing Ltd, 1991.
132. SCHMIDT, J. Library and information research in practice. Text of a paper presented at: survival, improvement, innovation: how research makes good practice; how practice makes good research. Canberra, October 2000. http://www.library.uq.edu.au/papers/survival_improvement_innovation.pdf
133. ELDREDGE, J. D. Evidence-based librarianship: what might we expect in the years ahead? Health Information and Libraries Journal, 19(2), 2002, 71–77.