© Blackwell Science Ltd 2002. Health Information and Libraries Journal, 19, pp. 181–184.

Research
Libraries without walls still need windows
Andrew Booth, Graham Walton, Veronica Fraser, Christine Urquhart & John van Loo, for the Health Libraries Confederation (HeLiCon) Research Task Finish Group
The original concept behind this special theme issue is 'libraries without walls', a not uncontroversial label in a profession always striving to classify. Perhaps 'librarians sans frontières' is more fitting, given the essential person-based nature of many such services. Whatever the 'mot juste', and here we support our guest editor's judgement, the concept itself endures. Libraries without walls include, on the one hand, the growth and dramatic development of virtual libraries locally, nationally and indeed internationally. On the other hand, they include a plethora of outreach activities, with such innovative roles as the clinical librarian, the primary care knowledge officer and the peripatetic librarian. In fact, Yeoman and colleagues promote the term 'virtual outreach services' to include 'any services that enable healthcare professionals to access pertinent information without physically coming into the library premises'.[1] In doing so they emphasize the similarities, rather than the differences, between such flexible, user-centred responses.
Despite contrasting positions on a technology–people continuum, both types of activity share at least one common need: a requirement for appropriate evaluation. Interestingly, within the former Trent NHS Region a portfolio of five such initiatives is currently being evaluated together as a single package (contact lfalzon@sheffield.ac.uk). These include one Web-based clinical guidelines project (SEEK, http://www.s-e-e-k.org.uk), one clinical librarian project (Leicester) and three primary and community health outreach projects in Leicester, Nottingham and Rotherham. Clearly the challenge is to achieve consistency in evaluating such projects whilst remaining sensitive to the individual variations and circumstances that contextualize each project. Such an observation is as true when evaluating a number of similar projects (e.g. five clinical librarian projects) as it is when exploring a more heterogeneous grouping, as above.
Exploring complexities
If all the above conveys a misleading picture of evaluation as a 'one-off' activity, a snapshot, it must be remembered that this is a function not of evaluation itself but rather of the short-term, limited life-span projects that so often characterize our health information sector. Within the context of a continuing service, evaluation itself should be continuous: a perpetually running videotape that allows manager and researcher alike to 'freeze the frame'. Both can thus gain unique and meaningful insights regarding either the service itself or the information behaviours of its users and potential users. The challenge for all library and information services is to develop processes and systems that produce data with minimal overheads, in terms of time, effort and resource, to enable continuing evaluation of services. Such an approach goes a long way towards ensuring that service developments and enhancements are informed by evidence: the true rationale for evidence-based librarianship.
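The kind of low-overhead, always-on data collection described above can be sketched very simply: log each service transaction as it happens, then 'freeze the frame' on demand. This is an illustrative sketch only; the user groups and enquiry categories are hypothetical, not drawn from any of the projects discussed here.

```python
import csv
import io
from collections import Counter
from datetime import datetime, timezone

def log_transaction(stream, user_group, enquiry_type):
    """Append one enquiry record as it occurs; negligible overhead per transaction."""
    writer = csv.writer(stream)
    writer.writerow([datetime.now(timezone.utc).isoformat(),
                     user_group, enquiry_type])

def summarize(stream):
    """'Freeze the frame': counts of enquiries by type recorded so far."""
    stream.seek(0)
    return Counter(row[2] for row in csv.reader(stream))

# In practice the stream would be a file kept by the service;
# an in-memory buffer keeps the sketch self-contained.
log = io.StringIO()
log_transaction(log, "GP", "clinical question")
log_transaction(log, "nurse", "literature search")
log_transaction(log, "GP", "clinical question")
print(summarize(log)["clinical question"])  # 2
```

Because the log is written as a by-product of normal service delivery, evaluation data accumulates continuously rather than requiring a separate, one-off collection exercise.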
An added complexity, if more were needed, is the fact that evaluation of innovative projects invariably takes place against a backdrop of existing service provision. So clinical librarian projects usually rely on a support infrastructure housed within an existing library service. Similarly, primary care knowledge services have sprung up to serve a clientele that has previously been served,
albeit inappropriately, by services based in acute hospitals. Finally, in considering the 'virtual library', we must recognize that, for the short to medium term at least, libraries will continue to be delivered in a hybrid format. Thus the SEEK Project provides not simply web-based access to evidence-based resources but supplements this by enhanced provision of evidence-based journals and books.

For the above reasons, a well-considered evaluation will examine the current mix of both printed and electronic resources. For example, the Northern NHS Workforce Development Confederation is currently funding research to evaluate how pre-registration health students access printed and electronic resources when on clinical placement in the NHS. This project (Health and Education Northumbria Students' Access to Learning resources, HENSAL), conducted by the Information Management Research Institute at Northumbria University, is due to report its findings at about the time that this column appears. Clearly, in the fast-developing field of information provision, evaluation should include both the effectiveness of the intervention itself and its impact on the surrounding environment. This is particularly the case given current initiatives centred on e-learning and, in the UK, the NHS University (NHSU).
On evaluation and research
At this point some readers will be experiencing some confusion, if not outright antipathy. After all, is this not a research column? What does evaluation have to do with research? In the first edition[2] of his excellent book for practitioners and researchers alike, Real World Research (which has recently appeared in a second edition[3]), Colin Robson advances the following hypothesis:

evaluations are essentially indistinguishable from other research in terms of design, data collection techniques and methods of analysis.
Although such a viewpoint is by no means universally accepted, it has led to use of the term 'evaluation research'. Such a concept is extremely important in the context of a research column aimed at practitioners rather than academics. Whereas many library managers might take issue with a potential role as researchers, few would quibble with a contribution to evidence-based librarianship in the form of robust evaluation of their own services.
Patton asserts that:

the practice of evaluation involves the systematic collection of information about the activities, characteristics and outcomes of programs, personnel and products for use by specific people to reduce uncertainties, improve effectiveness and make decisions with regard to what those programmes, personnel or products are doing and affecting.[4]
Placing evaluation within such an information management context is extremely reassuring for a would-be librarian-evaluator. However, such an all-encompassing definition admits many other techniques (knowledge harvesting, for example) which are neither research nor evaluation, although they might be considered useful to both.
In contrast to the jargon-laden context of research, the terminology of evaluation is comparatively accessible: at least as an entrée, if not as a less digestible main course. Fundamental to any understanding of evaluation, however, is an awareness of the distinction between two forms of evaluation: formative and summative. As Marshall explains, formative evaluation is 'a type of evaluation that continues throughout the life of the programme with the goal of providing monitoring and feedback'. Summative evaluation 'typically results in a written report assessing the extent to which the programme's objectives have been attained in the specified time period'.[5]
Of course, such approaches are not mutually exclusive, and optimally an evaluation will include both. A report from the Australian Clinical Information Access Program (CIAP) states that formative evaluation is typically carried out internally, with ongoing feedback and product improvement. Summative evaluation, in contrast, will often require commissioning an external team to determine whether the project has achieved its
goals.[6] Banwell, in a chapter entitled Evaluating Information Services, gives an example of the HyLife project, which employed both formative and summative approaches to good effect.[7]
Evaluation shares with research a concern with data collection, and tends to employ similar techniques and instruments. We should not, however, fall into the trap of equating data, in either case, simply with facts and figures. The CIAP report rehearses the importance of selecting the appropriate study method for the question in hand, illustrating a range of questions with ethnographic methods, through randomised controlled trial, to health economic costing study.[6] As an example, the recent clinical librarian conference hosted in Leicester featured a series of anecdotal 'morning case reports' that probably did more to convey the cultural environment in which the role operates than any amount of quantitative data could ever aspire to. Evaluation should therefore include the entire organizational context, as studied by 'health services research', not merely the effectiveness of individual interventions, as captured by 'health technology assessment'. An example of the growing influence of 'mixed methods' techniques from health services research is seen in the Value and Impact of Virtual Outreach Services (VIVOS) project, which used interviews primarily for qualitative data, questionnaires primarily for quantitative data, and a range of other methods including critical incident technique, vignettes and even cost–benefit analysis.[1] Not untypically of such an approach, the subsequent analysis required familiarity with a corresponding range of software, from … for qualitative data to SPSS and Excel for quantitative data.
What is required?
What, then, are the current 'drivers' to command the attention of ongoing evaluations? The perennial 'has it made any difference at all?' continues to occupy base position. Increasingly, however, we see concern with:

• What has the project's impact been on existing library services?
• Can the project demonstrate added value over and above existing services?
• Is what we are doing value for money?
• How has the project affected information provision to the general public, either directly or indirectly?

and, the Holy Grail:

• What has the project's impact been on direct patient care?
Such a complex evaluation agenda places extraordinary demands on project staff and evaluation teams alike. The ever-increasing catalogue of skills required includes:

• project management
• questionnaire design and development
• ongoing data collection and monitoring
• stakeholder analysis
• marketing and Web design
• people skills

A major lesson from the aforementioned clinical librarian conference was that the last of these, the people skills, are undoubtedly more crucial to the success of such projects than the broadest array of technical skills. Evaluation may produce outcomes that stakeholders do not wish to accept, or would like to suppress. Project staff may need to be able to manage the resultant conflict. In addition to a focused approach to project objectives, they need to be able to step back and take a strategic look at how the evaluation outcomes may be used to change a situation, or to go in a direction not previously identified.
Finally, those conducting the evaluation must be aware that different stakeholders respond to different means of presentation. Evaluation outcomes must therefore be presented in a format and style that ensures maximum stakeholder impact, whether that be stories, pound signs, large numbers or some other more imaginative way of securing commitment. To this end, the CIAP report provides a useful tabulation of its stakeholders, together with the questions that each might want answered.[6]
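A stakeholder-by-question tabulation of that kind can be modelled as a simple mapping. The groups and questions below are illustrative inventions for the sketch, not those in the CIAP report.

```python
# Illustrative sketch: pair each stakeholder group with the evaluation
# questions it is likely to want answered. Groups and questions here
# are hypothetical, not taken from the CIAP report.
stakeholder_questions = {
    "funders": ["Is the project value for money?"],
    "library managers": ["What impact has it had on existing services?"],
    "clinicians": ["Has it affected direct patient care?"],
}

def questions_for(stakeholder):
    """Return the questions tabulated for one stakeholder group."""
    return stakeholder_questions.get(stakeholder, [])

print(questions_for("funders"))  # ['Is the project value for money?']
```

Even this trivial structure makes the presentational point above concrete: the evaluator looks up the audience first, and only then chooses which findings, and which format, to lead with.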
There have been attempts to meet, even in part, such an ambitious staff development agenda. In parallel to the Trent evaluation project, project managers and officers from the five sites, together with evaluation staff, took part in a series of monthly action learning set meetings around specific needs of the group. A problem-based learning approach was used to crystallise learning opportunities and encourage cross-project thinking and working. While great care must
be taken not to exaggerate the effectiveness of such an approach, it can nevertheless be seen as a natural way forward from the task-focused project review meetings that more typically characterize the life cycle of such limited-duration initiatives.
Conclusion
Formative and summative, outcomes and processes: the terminology of evaluation seems destined to occupy the agendas of healthcare librarians for some considerable time. The real challenge, it seems, is not to create an elite super-breed of practitioner-researchers but rather to assist every information professional to become a librarian-evaluator. This will require a sizeable investment of time, effort and (dare we say) money in continuing professional development. Whether the NHS University is the answer, or merely the object of further questions, is an issue that only further evaluation can address.
References
1 Yeoman, A., Urquhart, C. et al. The Value and Impact of Virtual Outreach Services (VIVOS). Library and Information Commission Research Report 111. London: Resource: the Council for Museums, Archives and Libraries, 2001. Available online from http://users.aber.ac.uk/cju [accessed 23 May 2002].
2 Robson, C. Real World Research: A Resource for Social Scientists and Practitioner-Researchers. Oxford: Blackwell, 1993: 174.
3 Robson, C. Real World Research: A Resource for Social Scientists and Practitioner-Researchers, 2nd edn. Oxford: Blackwell, 2002.
4 Patton, M. Q. Practical Evaluation. London: Sage, 1982: 15.
5 Marshall, J. G. Using evaluation research methods to improve quality. Health Libraries Review 1995, 12, 159–72.
6 Collaborative Projects Planning Committee. Development of an Evaluation Methodology for the NSW Health Clinical Information Access Program (CIAP). Sydney: NSW Health Department, 2001. Available online from http://www.health.nsw.gov.au [accessed 7 May 2002].
7 Banwell, L. Evaluating information services. In: Booth, A. & Walton, G. (eds) Managing Knowledge in Health Services. London: Library Association, 2000: 173–81.