
Page 1

Evaluation of Informatics Tools in Primary Care

Frank Sullivan, Prof. of R&D TCGP, Dundee
Liz Mitchell, Research Fellow, Glasgow

Claudia Pagliari, Lecturer in Psychology, Dundee

[email protected] [email protected] [email protected]

Page 2

Informatics

The study of the acquisition, processing and use of information.

Friedman CP, Wyatt JC. Evaluation methods in medical informatics. New York: Springer 1997.

Informatik (German); Informatique (French)

Page 3

Primary Care in the Information Age

Moving from
• Popper world 2
  – Notions, intuitions, judgements, mystique
to
• Popper world 3
  – Objective reality open to criticism and logical correction

Weed LL. Clinical judgement revisited. Methods Inf Med 1999;38:279-86.

Page 4

GP Computer Screens and Prompts

Page 5

Information Age Consultations

[Diagram: the information age consultation, showing clinician and patient in the consultation linked to the electronic medical record (recall, retrieval), reference information including guidelines, and education. Sullivan FM, et al.]

Page 6

Problem 1: Limited evaluation of informatics tools

• Failure to evaluate new resources is a major problem

• Often top-down, technologist/manager-driven development, with little involvement of end-users in the process.

• Expensively developed tools often discarded due to unanticipated technical difficulties or ‘people and organizational issues’.

Page 7

Problem 2: Inappropriate evaluation

RCTs aimed at measuring ‘hard’ clinical and economic outcomes may not always be appropriate for informatics systems because

a) they are not drugs but multifaceted procedural interventions and

b) the types of questions asked of an informatics evaluation are broader, dealing as much with end-users’ acceptance and use of the system as with external outcomes.

Page 8

Thinking about the WHOLE

• The RCT may provide useful information but it can only give part of the story

• Comprehensive evaluation of health informatics tools requires a broader range of research methods, involving both quantitative and qualitative approaches.

• The ideal method, or combination of methods, will be determined by the research questions and by the context and timeframe in which the evaluation takes place.

Page 9

Which research questions/whose perspective?

[Diagram: stakeholder perspectives on evaluation. The developer, user, patient and purchaser each bring different questions: Does it work? Will they use it? Is it fast? Is it fun? What is the cost:benefit? Is it safe?]

Page 10

2 Classes of research method for informatics evaluations

Objectivist: concerned with objective assessment of clearly defined variables, usually measured quantitatively (e.g. via experimental or correlational studies).

Subjectivist: based on the judgements of expert evaluators, system users, potential users or other stakeholders. Often rely on qualitative, anthropological research methods.

Friedman and Wyatt, 1997

Page 11

Objectivist approaches 1: Comparison-Based

Employs experiments and quasi-experiments. Comparisons are based on a small number of outcome variables.

e.g. Hypothesis: “Compliance with guideline recommendations to check diabetics’ feet annually will increase following introduction of a computer-based reminder system.”
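A minimal sketch of how such a comparison-based hypothesis might be tested: audit counts of annual foot checks before and after the reminder system, compared with a chi-square test on a 2x2 table. The counts below are hypothetical assumptions, not data from the slides.

```python
# Illustrative sketch only: hypothetical audit counts, not data from the slides.
# Compares the proportion of diabetic patients with an annual foot check recorded
# before vs. after introduction of a computer-based reminder system.
from scipy.stats import chi2_contingency

before_compliant, before_total = 142, 400   # made-up pre-reminder audit
after_compliant, after_total = 231, 410     # made-up post-reminder audit

table = [
    [before_compliant, before_total - before_compliant],
    [after_compliant, after_total - after_compliant],
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"Compliance before: {before_compliant / before_total:.1%}")
print(f"Compliance after:  {after_compliant / after_total:.1%}")
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
```

A real study randomised by practice would also need to allow for clustering, as noted under Step 6.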

Page 12

Objectivist approaches 2: Objectives-Based

Aim is to determine whether the resource meets its designer’s objectives.

e.g. Are fully integrated patient records accessible to the GP within 2 minutes?
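As a sketch of how that 2-minute objective might be checked, the snippet below simply times a record-retrieval call and compares it with the target; fetch_integrated_record is a hypothetical placeholder, not a real system API.

```python
# Illustrative sketch: fetch_integrated_record is a hypothetical placeholder,
# standing in for whatever call the real system uses to assemble the record.
import time

MAX_SECONDS = 120  # the designer's objective: record accessible within 2 minutes

def fetch_integrated_record(patient_id: str) -> dict:
    time.sleep(0.5)  # simulate the lookup and assembly of the record
    return {"patient_id": patient_id, "record": "..."}

start = time.perf_counter()
fetch_integrated_record("demo-0001")
elapsed = time.perf_counter() - start

print(f"Record retrieved in {elapsed:.1f}s; "
      f"objective {'met' if elapsed <= MAX_SECONDS else 'not met'}")
```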

Page 13

Objectivist approaches 3: Decision Facilitation

Focuses on answering questions important to developers and administrators. Usually used in formative studies when developing new resources.

e.g. A systematic study of various formats for presenting guideline information on-screen, conducted as part of the process of resource development.
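One way such a formative format comparison might be analysed is sketched below: made-up timings for how long clinicians take to locate a target recommendation under two hypothetical screen formats, compared with an independent-samples t-test. None of the figures come from the slides.

```python
# Illustrative sketch with made-up timings (seconds to locate a recommendation)
# under two hypothetical on-screen guideline formats.
from scipy.stats import ttest_ind

format_a = [34, 41, 29, 38, 45, 31, 36, 40]   # e.g. full-text guideline page
format_b = [22, 27, 30, 25, 28, 24, 26, 29]   # e.g. stepwise prompt layout

t_stat, p_value = ttest_ind(format_a, format_b)
print(f"Mean A: {sum(format_a) / len(format_a):.1f}s, "
      f"mean B: {sum(format_b) / len(format_b):.1f}s")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```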

Page 14

Objectivist approaches 4: Goal-Free

Evaluators are blinded to the intended effects of the resource and must chart all its effects. Aims to reduce reporting bias and to uncover both intended and unintended effects.

e.g. Conducting patient chart reviews before and after introduction of an information resource without telling the reviewer anything about the nature of the resource.

Page 15

Subjectivist approaches 1: Responsive-Illuminative

Focuses on the reports of users, e.g. feedback following a demonstration or a period of hands-on familiarisation with the tool. Useful for technical troubleshooting and for examining contextual factors which may affect implementation.

e.g. Observation of prototypical users in a laboratory setting, followed by one-to-one interviews about the advantages and disadvantages of the resource and discussion of what has been observed.

Page 16

Subjectivist approaches 2: Art Criticism

Analysis and review of a resource by a generic expert.

e.g. A software review in a technical magazine, or inviting a noted consultant on user interface design to spend a day on site offering suggestions on the prototype of a new system.

Page 17

Subjectivist approaches 3: Professional Review

A management-consultancy-style approach using extended site visits by experienced peers to the environment in which the resource is installed. May employ a combination of methods, including speaking to users, observing the system in operation, etc.

e.g. A site visit by a government review team to several research groups competing to have their patient management screens for asthma adopted nationally.

Page 18

Subjectivist approaches 4: Quasi-Legal

A mock trial or other formal adversarial procedure used to judge a resource. Rarely used.

e.g. Staging a mock debate at a research group retreat.

Page 19

Tailoring methods to the problem

Comprehensive evaluation may require a combination of research methods involving both objectivist and subjectivist approaches.

The choice will relate to the specific research questions and the stage of the evaluation.

Page 20

General steps in informatics evaluations

• Define and prioritise study questions
• Define the "system" to be studied
• Select or develop reliable, valid measurement methods
• Design the demonstration study
• Choose the appropriate methodology
• Ensure that study findings can be generalized
• Carry out the evaluation study

(NB. Demonstration and evaluation phases may overlap)

Page 21

Step 1: Define and prioritise your study questions

• Decide exactly what you want to find out and specify your objectives.

• Ideally, questions should be agreed between the research team, system developers, clinical & non-clinical users & patients.

• Find out what has been done before

Page 22

Step 2: Define the "system" to be studied

• Is the system simple or multifaceted? Is it one component or the system as a whole that is of interest? If the former, can you isolate and evaluate that part alone? (e.g. a diabetes web-suite)

• Develop a model for the evaluation to test. Results can be compared with the model to define the place of the new technology and further refine the model.

Page 23

Step 3: Select or develop reliable, valid measurement methods

• The aim of so-called ‘measurement studies’ is to ensure that the tools you use to assess outcomes are of as high quality as the methodology allows.

• Try to use established measurement tools (e.g. questionnaires) if available. If not, there are clear procedures for developing them (see Friedman & Wyatt, p71).

• It may be necessary to consult widely, interviewing potential system users individually or in groups, in order to determine the key variables to be studied.
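For instance, if a new attitude questionnaire has to be developed, its internal-consistency reliability can be estimated with Cronbach’s alpha. The sketch below applies the standard formula to made-up responses; it is an illustration only, not a procedure taken from Friedman & Wyatt.

```python
# Illustrative sketch: Cronbach's alpha for a small, made-up set of questionnaire
# responses (rows = respondents, columns = items on a 1-5 scale).
import numpy as np

responses = np.array([
    [4, 5, 4, 3],
    [2, 3, 3, 2],
    [5, 5, 4, 5],
    [3, 4, 3, 3],
    [4, 4, 5, 4],
])

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```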

Page 24

Step 4: Design the demonstration study

• Leaving the evaluation until after a system is in place restricts the degree to which the results of the evaluation can be used to modify the system, resulting in less-than-ideal implementation (meaning not only access but also acceptance and use).

• Gold standard approach to evaluation involves a ‘prototyping’ phase or ‘demonstration study’, in as realistic a context as possible, followed by one or more user-informed iterations of the system (i.e. ‘the evaluation-development cycle’).

• May assess several objective and subjective variables including usability; attitudes; ideas for change; barriers to implementation.

Page 25

Step 5: Choose the appropriate methodology

• Approaches to evaluation that examine informatics resources from multiple perspectives, using several methodologies, are likely to produce more valuable results

• Tailor methods to research questions & stakeholder perspectives

• Ensure methodological rigor. See checklists by Johnston et al. & Sullivan & Mitchell for assessment criteria for experimental and non-experimental studies.

Page 26

Step 6: Ensure that study findings can be generalized

• Difficult to achieve in informatics research. Study effects can be context-dependent (‘People don’t use computers: organisations do’)

• Qualitative research will focus on small (selected) samples, although may indicate wider issues which could affect generalisability of results

• Experimental research may be more generalizable, but it is important to build in safeguards, e.g. increasing sample sizes when randomising by practice to correct for intra-cluster correlation.
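A small worked sketch of that correction, using the standard design effect 1 + (m - 1) * ICC; the patient numbers and the ICC value below are assumptions chosen purely for illustration.

```python
# Illustrative sketch: inflating a sample size for cluster (practice-level)
# randomisation using the design effect DEFF = 1 + (m - 1) * ICC.
import math

n_individual = 300   # hypothetical: patients needed if individually randomised
cluster_size = 20    # hypothetical: average patients recruited per practice
icc = 0.05           # hypothetical intra-cluster correlation coefficient

design_effect = 1 + (cluster_size - 1) * icc
n_required = math.ceil(n_individual * design_effect)
practices = math.ceil(n_required / cluster_size)

print(f"Design effect: {design_effect:.2f}")
print(f"Patients required: {n_required} (about {practices} practices)")
```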

Page 27

Step 7: Carry out the evaluation study

Preparation

• Decide whether continuation is justified.
• Firm up the methodology.
• Convince the ethics committee.
• Identify and liaise with key stakeholders.
• Consider commercial implications and intellectual property rights.

Page 28

Evaluation study (continued)

• Recruitment

• Remember: enthusiasts may not be representative.
• Design strategies for recruiting patients (and gaining consent).

• Detailed study planning

• Create written manual of study procedures. (Focuses on the fine detail of who does what at the different stages of the project. May change over time.)

Page 29

Evaluation study (continued)

• Pilot as much of your study procedure as possible.
• Think carefully about where you intend to do the pilot work. Sites need to be representative of those you intend to use in the main study. Use a small number of test-bed sites to learn of the problems with the resource.

• Other issues to consider during the study
• Respond immediately to any technical problems or concerns expressed by participants.
• Study sites and participants should be kept informed of progress.
• Reward participating practices if possible.