
APPENDICES / RESOURCES (July 2017 Revision)

APPENDIX 1 – Deadlines

APPENDIX 2 – Slides from Academic Day presentation outlining the expectations of the audit.

APPENDIX 3 – List of QI Leads at Sites

APPENDIX 4 – Topics for Consideration

APPENDIX 5 – Security and Storage of Personal Health Information

APPENDIX 6 – Eight Steps to Chart Audit for Quality Improvement

APPENDIX 7 – PDSA Cycle Systematic Review

APPENDIX 8 – Research Guide Review – Example from RCPSC

APPENDIX 9 – Building Queries in Accuro

APPENDIX 10 – Example of flow diagram for Results section

APPENDIX 11 – Article – Preparing and delivering a 10-minute presentation at a scientific meeting


APPENDIX 1 – DEADLINES

Deadlines for the 2017/18 Academic Year:

PGY1 Topic Deadline – check with Stream/Clinic

PGY1 Written Report – check with Stream/Clinic

PGY1 Oral Presentation – check with Stream/Clinic

PGY2 Topic Deadline – due March 16, 2018

PGY2 Written Report – due May 14, 2018

PGY2 Oral Presentation – June 4, 2018


APPENDIX 2 – ACADEMIC DAY INFORMATION

Slides from the Academic Day presentation outlining the expectations of the audit:


Family Medicine Scholarly Projects

Alexander Singer MB BAO BCh CCFP | QI/Informatics Stream Lead | Associate Professor

Part 1: Defining the Requirements of the Scholarly Project

R1 Scholarly Project
• Written paper and presentation
• Completed as a small group (2-3)
• Done during Family Medicine Block Time (FMBT)
• Quality Improvement (QI) using chart audit
• Project must be unique to the stream: the same topic cannot be repeated within your stream in the same year. Get approval from your Education Director/Stream Lead.
• Relevant to Family Medicine and your stream


R1 Scholarly Project
• Identify a potential "care gap"
• Explain and justify "benchmarks" for "best care" derived from:
  • Literature review & collaboration with your health care team
• Define your population-of-interest
• Outline your EMR search/chart audit strategy
• Use findings to develop a QI plan
  • What SYSTEM changes must occur to move towards the benchmark?
  • Who needs to be involved in the change?
  • Barriers to implementation?

R1 Scholarly Project
• Expectations – Written Paper
  • Title page, Introduction, Methods, Results, Discussion, References and Appendix
• Expectations – Oral Presentation
  • Group presentation at home clinic
  • 20-30 minutes with 5 minutes for questions


R2 Scholarly Project

• Written paper and presentation

• Completed individually

• QI or Research based on chart audit

• Project must be unique to stream

• Relevant to Family Medicine and local site

R2 Scholarly Project - QI
• Identify "care gap"
• Explain and justify benchmarks
  – Literature review, collaboration with health care team
• Define population
• Outline search strategy
• Conduct chart audit
• Use findings to develop QI plan
  – How do findings compare to benchmarks?
  – What system changes need to be put in place to encourage practice improvement?


R2 Scholarly Project - Research
• Identify suspected "care gap"
• Conduct chart audit for preliminary data
  – Proof for further study
• Identify specific research question
• Conduct literature review for background
• Describe population
• Define data sources and research methods
• Proposed Statistical Analysis & rationale
• Knowledge Translation plan

R2 Scholarly Project
• Expectations – Written Paper
  – QI: Title page, Introduction, Methods, Results, Discussion, References and Appendix
  – Research: ICMJE format – Abstract, Introduction, Methods, Results, Discussion, References, Tables and Illustrations


R2 Scholarly Project
• Expectations – Oral Presentation
  – Individual presentation at end of R2 year (FM Scholar Day – May 2017)
  – 10 minutes with 5 minutes for questions
• You must present on FM Scholar Day; if not, it is YOUR responsibility to make alternative arrangements!

Completion of the R1 and R2 projects is a program requirement for graduation!

Facilitate the education of patients, families, trainees, other health professional colleagues, and the public, as appropriate

Contribute to the creation, dissemination, application, and translation of new knowledge and practices

Critically evaluate medical information, its sources, and its relevance to their practice, and apply this information to practice decisions


What Do We Mean by Family Medicine Scholarship?

• The Family Medicine Laboratory = The Clinic

• Engagement with “evidence” at various levels

• Implementing evidence in patient-centred and appropriate ways

Quality Improvement


What is Quality Improvement?

Plan, Do, Study, Act (PDSA)


QI initiatives
• Small-scale cycles of interventions linked to assessment.

• Goal of improving the process, outcome, and efficiency of complex systems of healthcare.

• Chart audits are often used as part of QI efforts.

• Not done to root out bad quality, but rather to measure quality.

Purposes of Quality Improvement
To measure quality of care in order to improve it:
• Document something
• Determine if the outcome is what is wanted
• Find a defect in the process
• Fix it
• Re-measure to determine if the fix worked

Practices and health systems that agree upon initiatives and processes of care can use audits to determine how well they are following them.


Agreeing on processes?

Clinicians are idiosyncratic individuals
• "How consistently are you caring for your PRACTICE POPULATION'S DIABETICS?" as opposed to "How are you caring for MRS SMITH?"
• What are the determinants of:
  • First choice of anti-hypertensive?
  • A labour floor's cesarean section rate?
  • A clinic's rate of checking ACRs in diabetics?

What is Chart Audit?

A chart audit is an examination of medical records (electronic and/or hard copy), to determine what has been done, and see if it can be done better.


What should the audit be on?
• Can be virtually any aspect of healthcare.

• The data being reviewed should be ACCURATE and must be AVAILABLE in the medical record.

• The data is confidential – issue of ethics approvals.

Planning a chart audit
• Identify a clinically significant care gap
• Choose, explain, and justify your external benchmark
• Identify, explain, and justify your internal benchmark
• Explain and justify the specific measurement outcomes
• Describe the population you plan to sample
• Collect data
• Report findings
• Make specific recommendations for system change


Clinically Significant Care Gap
• Intervention/tests that will significantly benefit patients if implemented.
• Requires "good quality" evidence of benefit:
  • Statistically significant.
  • Sufficient magnitude to be clinically significant.
  • Patient-oriented outcome, meaning something patients will notice.


External Benchmark
• Evidence-informed frequency of provision of some important aspect of care that has already been achieved in a real-world setting:
  – per individual (e.g. occurs at each visit, occurs a minimum of twice a year)
  – across a whole practice population (e.g. at least 95% of all people over the age of 65)
  – across a specific population (e.g. at least 90% of patients on opioids for more than three months have a signed opioid contract)


Internal Benchmark
• Target level of implementation which you have determined is most appropriate for your local practice setting.
• Done in consultation with local clinicians.
• Must have already made some attempt to implement the intervention.
• May equal the external benchmark.
• If the external benchmark is unrealistic, too difficult to measure, or otherwise inappropriate, it may be different.
• Might be a proxy measure for the external benchmark.

Specific Outcome Measure
• The actual variable assessed in the chart audit which is compared to the external and internal benchmarks.
• Examples of specific outcome measures are clinical variables such as blood pressure, lab values, or characteristics of care provision (i.e. hospitalization or mortality).


Any Questions?


Family Medicine Scholarly Projects

Part 2: How to choose a question and write your queries

Alexander Singer MB BAO BCh CCFP | QI/Informatics Stream Lead | Associate Professor

R1 Scholarly Project
• Written paper and presentation
• Completed as a small group (2-3)
• Done during Family Medicine Block Time (FMBT)
• Quality Improvement (QI) using chart audit
• Project must be unique to the stream: the same topic cannot be repeated within your stream in the same year. Get approval from your Education Director/Stream Lead.
• Relevant to Family Medicine and your stream


R1 Scholarly Project
• Identify a potential "care gap"
• Explain and justify "benchmarks" for "best care" derived from:
  • Literature review & collaboration with your health care team
• Define your population-of-interest
• Outline your EMR search/chart audit strategy
• Use findings to develop a QI plan
  • What SYSTEM changes must occur to move towards the benchmark?
  • Who needs to be involved in the change?
  • Barriers to implementation?

R1 Scholarly Project
• Expectations – Written Paper
  • Title page, Introduction, Methods, Results, Discussion, References and Appendix
• Expectations – Oral Presentation
  • Group presentation at home clinic
  • 20-30 minutes with 5 minutes for questions


R2 Scholarly Project

• Written paper and presentation

• Completed individually

• QI or Research based on chart audit

• Project must be unique to stream

• Relevant to Family Medicine and local site

R2 Scholarly Project - Research
• Identify suspected "care gap"
• Conduct chart audit for preliminary data
  – Proof for further study
• Identify specific research question
• Conduct literature review for background
• Describe population
• Define data sources and research methods
• Proposed Statistical Analysis & rationale
• Knowledge Translation plan


R2 Scholarly Project
• Expectations – Oral Presentation
  – Individual presentation at end of R2 year (FM Scholar Day – May 2017)
  – 10 minutes with 5 minutes for questions
• You must present on FM Scholar Day; if not, it is YOUR responsibility to make alternative arrangements!

Completion of the R1 and R2 projects is a program requirement for graduation!

Plan, Do, Study, Act (PDSA)


What is Chart Audit?

A chart audit is an examination of medical records (electronic and/or hard copy), to determine what has been done, and see if it can be done better.

Planning a chart audit (discussed in previous session)
• Identify a clinically significant care gap
• Choose, explain, and justify your external benchmark
• Identify, explain, and justify your internal benchmark
• Explain and justify the specific measurement outcomes
• Describe the population you plan to sample
• Collect data
• Report findings
• Make specific recommendations for system change


Asking the question: The PICO approach

Population
Intervention or Issue
Comparator
Outcome

Asking a "Good" Question
• Hot topics?
• Clinical practice guidelines
• Chronic Disease Management (PCQI)
• Data quality
• Choosing Wisely
• Data Quality (data completeness and capture) for the chosen question needs to be considered


Identify patient population

• You need to define the population you wish to assess. Consider: age, gender, disease status, treatment status, etc.

• In many cases, the topic itself will help to define this. If you want to look at cervical cancer screening rates, the population has to be limited to women. You may wish to exclude those who have had hysterectomies…


Identify Measures (and intervention you are comparing)

• Define EXACTLY what you will measure.
• What is a YES (criteria met)? What is a NO (not met)?
• Do a literature review to help define and justify measures – ideally variables that have been used successfully in the past or have clear clinical relevance.

Data Discipline


Determine a sample size

• For optimal results calculate necessary power and statistical significance.

• A common rule of thumb is to try for 10-20% of the eligible charts.

• Multiple queries can also lead to interesting breakdowns of the information.

• Expect to assess ~30-50 charts or have 10+ queries to provide data that is granular enough (a quick arithmetic sketch follows below).
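As a rough illustration of the rule of thumb above (this sketch is not part of the program requirements; the function name and the 30-chart floor are assumptions chosen to echo the "~30-50 charts" expectation):

```python
# Illustrative sketch only: turn the "10-20% of eligible charts" rule of thumb
# into a concrete review target. The 30-chart floor is an assumption, not a rule.
def sample_target(eligible_charts, low=0.10, high=0.20, floor=30):
    lower = max(floor, round(eligible_charts * low))
    upper = max(lower, round(eligible_charts * high))
    return lower, upper

print(sample_target(250))  # e.g. 250 eligible charts -> review roughly (30, 50)
```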

Collect Data

• Review each chart to determine if the individual meets the selection criteria (e.g., correct age, gender, etc.)

• Complete one audit tool (paper or line in the spreadsheet) for each individual you include in the sample.


How to conduct a chart audit in Accuro?

How to make queries work
• Use coded fields as much as possible
  • Free text is the enemy
  • Examples: Labs, Demographics, Problem List (for the most part)
• Ask answerable questions
• Build cohorts with manageable numbers
• Let the computer do the heavy lifting


Overview of the Query Builder

What is the question you want to answer?


Storing Query Data in Accordance with Policies and Security & Storage of PHI

Printing Results: Perform only if you will need ongoing access to the output of your query at a given point in time.
• Make sure you know which printer you are connected to.
• Retrieve the print job immediately.
• Store all written PHI in a secure file and keep files in a secure place at all times.

Exporting Results: Perform only if you need to do work on the data.

Export -> Computer -> C$ (\\Client) (V:) -> Documents & Settings -> Your username (i.e. mkrahn3) -> Select a Folder -> Save.
– Do not save queries on the desktop.
– De-identify if possible and delete information not required: keep only the minimum amount of information you need in order to analyze the data (see the sketch below).
– Save the document as an Excel file & encrypt the workbook.
– Delete the original export (CSV file).
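A minimal sketch of the trimming step, assuming a pandas environment; the file name and column list are hypothetical and must be adapted to your actual export, and workbook encryption still has to be applied in Excel afterwards:

```python
# Minimal sketch (assumed file and column names): keep only the minimum data
# needed for analysis, save as an Excel workbook, and delete the original CSV.
# Encrypt the resulting workbook in Excel afterwards, per the policy above.
import os
import pandas as pd

EXPORT_CSV = "query_export.csv"              # hypothetical export file name
KEEP = ["StudyID", "A1C", "LastVisitYear"]   # hypothetical minimum columns

df = pd.read_csv(EXPORT_CSV)
df[KEEP].to_excel("audit_data.xlsx", index=False)  # requires openpyxl
os.remove(EXPORT_CSV)                        # delete the original export
```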


What Kind of Data is Best?

• The accuracy of query results depends on data correctness and completeness in the EMR.

• Many fields in the EMR that are ‘queryable’ are not always up-to-date (e.g. patient problem list).

CAUTION: DATA QUALITY FOR SOME FIELDS IN THE EMR IS POOR


WRHA Completeness – Problem List compared with Billing (October 30, 2013):
• COPD: 48.55%
• CHF: 49.49%
• CAD: 67.10%

Collecting Data - Create Audit Tools

A spreadsheet format is ideal for record keeping.

• For those more comfortable with paper-based systems, a sheet of paper pre-printed with key points/questions to check in each chart serves well as an audit tool.

• Another sheet that compiles and summarizes your findings is also helpful.

• Appropriately report relevant data in the form of a rate, percentage, mean, or other statistical measure.
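To illustrate the kind of summary meant here, a minimal sketch using only the standard library; the file name and the "criteria_met" column are assumptions about how a spreadsheet audit tool might be laid out (one row per audited chart):

```python
# Minimal sketch (assumed file and column names): tally a YES/NO audit column
# into a rate and a percentage.
import csv

def audit_summary(path, column="criteria_met"):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    met = sum(1 for row in rows if row[column].strip().upper() == "YES")
    pct = 100.0 * met / len(rows) if rows else 0.0
    return met, len(rows), pct

met, total, pct = audit_summary("audit_tool.csv")
print(f"{met}/{total} charts met the criterion ({pct:.1f}%)")
```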


Reporting Findings

• Summarizing the data is a little more complex than just counting up all the data sheets. You must consider how the data will be used, and make sure the information is presented in a way that it will be usable.

• The trickiest part is often determining the proper denominator to use for percentages. Inconsistencies here can lead to confusing or uninterpretable data.
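A small numerical illustration of the denominator point; the numbers are invented for the example and show how the same count can read very differently depending on which population you divide by:

```python
# Invented numbers: how the choice of denominator changes the headline rate.
charts_meeting_criterion = 40
all_diabetics_on_roster = 200        # denominator A: every diabetic in the practice
diabetics_seen_in_audit_period = 80  # denominator B: only those seen during the audit window

print(charts_meeting_criterion / all_diabetics_on_roster)         # 0.20 -> "20%"
print(charts_meeting_criterion / diabetics_seen_in_audit_period)  # 0.50 -> "50%"
```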

Analyze and apply findings/results

• Compare and contrast to internal and external benchmarks
• Explain the rationale for proxy measures you may have used
• Justify why the data is above or below pre-determined benchmarks
• Remember a rate of 100% or 0% is usually unrealistic
• There are usually explanations as to why clinical data deviates from expectations/recommendations


Making Specific Recommendations

• Describe interventions to correct the ‘defect’

• What can be done with the current organizational system to make things “better” beyond mere ‘raising of awareness’?

• Be sure to involve stakeholders in the design of a proposed solution or course of action
  • What does 'success' look like for the next audit?
  • Is it REASONABLE and ACHIEVABLE?

• WHO needs to be tasked with follow-through?

Examples of How to Use your Queries

If data quality is good, the EMR will support efforts to leverage it for your quality improvement "intervention":
• Tasks to request individual phone calls to patients
• Letter generation to mail reminders to patients
• Reminders (in the patient chart) to mention requirements to patients when next in the office


Taking action on query results

Taking this action will create tasks for the selected user with the note as entered for each patient in the query results

Taking action on query results (2)
• You may also choose to create reminders in patient charts. The reminder will appear the next time you are in the chart.


And what about research?

What is CPCSSN?
• CPCSSN - Canadian Primary Care Sentinel Surveillance Network
• The first pan-Canadian primary care based multi-disease surveillance system
• Collaboration of Clinicians and Researchers
• A network of networks securely collecting health information from Electronic Medical Records (EMRs) in the offices of primary care providers since 2008.


How can you get involved?

What is CPCSSN?

11 Primary-Care Based Research Networks across Canada

7 Provinces, 1 Territory

11 distinct EMR systems

MaPCReN is Manitoba’s network in the CPCSSN

How can you get involved?

Sites / Sentinels / Patients (as of Apr. 3, 2017):
• CPCSSN: 217 sites, 1,189 sentinels, 1,500,000+ patients
• MaPCReN: 48 sites, 251 sentinels, 287,961 patients

• Each local CPCSSN network recruits local primary care providers who consent to extraction of data from their EMRs.

• MaPCReN (Manitoba Primary Care Research Network) is currently extracting EMR data from practices utilizing Accuro and JonokeMed


CPCSSN - Focus
Initial focus:
• 5 chronic health conditions: Hypertension, Osteoarthritis, Diabetes, COPD, Depression
• 3 neurologic conditions: Alzheimer's and related dementias, Epilepsy, Parkinson's Disease

CPCSSN EMR Data
• Provider characteristics
• Practice characteristics
• Patient demographics
• Health conditions
• Patient encounters
• Risk factors
• Medications
• Diagnostic test results

Structured Database



APPENDIX 3 – LIST OF QI LEADS AT SITES

Each clinic will identify individuals who are skilled at creating queries within the electronic medical record in order to identify the charts you want to find. Each site/stream/unit will have a representative on the Quality Improvement and Informatics working group who can assist in mentorship regarding the project.

The following is the list of the QI/Informatics leads for the various sites and streams:

FMC: Alexander Singer

KMC: Jamie Falk and Allison Paige

NCMC: Roger Suss

BILINGUAL: Kheira Jolin-Dahel

PARKLAND: Scott Kish

BRANDON: Joanne Maier

BOUNDARY TRAILS: Bob Menzies

PORTAGE LA PRAIRIE: Mike Omichinski

STEINBACH: Karen Toews


APPENDIX 4 – TOPICS FOR CONSIDERATION


PREVIOUS RESIDENT TOPICS

2017
LAST NAME FIRST NAME TOPIC

Barnes Allyson Quality Control for Missed Appointments

Barnes Daniel Vaccination Rates Among Patients at FMC with COPD

Bhullar Ashley High Risk Falls – A look at falls risk assessment in the osteoporotic elderly

Brar Kiran Identifying Childhood Obesity at FMC: Are we weighing in enough?

Butterworth Stephanie Otitis Media – Are We Treating Patients Appropriately?

Carriere Chantal MDI + Spacer vs Nebulizer Administered Salbutamol in the Treatment of Asthma Exacerbations

Cashman Stephen Statins: Are we checking lipid levels unnecessarily for patients on statin therapy?

Chadha Natasha Post-Partum Depression: a look at screening patterns at Dauphin Medical Clinic

Chan David Cardiovascular Risk: Are we following up?

Chan Jessica Improving anticoagulant prescription and adherence: A Chart Review at Access River East

Colosimo Chantal Breastfeeding Counselling by Residents at FMC: How Are We Doing?

Conrad Kyle Implementation of Tdap Immunization in Pregnancy through amendment of the Manitoba prenatal form in the CW Wiebe Medical Centre

Dargie Andrew COPD and the Utilization of Pulmonary Rehabilitation

Downey Angelle The Challenges of Deprescribing PPIs

Felsh Sheila Is my fetus trying to kill me? Serial B-hCG Measurements in Early Pregnancy Vaginal Bleeding

Friesen Brittney Monitoring Antiepileptic Drugs and Mood Stabilizers at Kildonan Medical Centre

Funk Aaron Thyroid Function Testing in Hypothyroidism

Funk Christy Implementation of Tdap Immunization in Pregnancy through amendment of the Manitoba prenatal form at Agassiz Medical Centre

Govender Prashen How often are Opiate Prescriptions entered into the EMR at ARE?

Gravelle Steven Neonatal Hyperbilirubinemia Screening Practices at Bethesda Hospital

Gray Regan Management of the Type II Diabetic Inpatient

Gray Steven Update on ACP Status Documentation in EMR at Family Medical Centre

Gullane Kira The use and abuse of TSH: a retrospective look at lab testing patterns at FMC

Janke Alyssa A1c Targets in the Elderly

Jaramillo Carlos Chronic Kidney Disease Screening Practices in At-Risk Patients at Kildonan Medical Centre

Jattan Aaron Are Rural Residents Missing Out? – A comparison of the teaching opportunities for urban and rural FM residents at the U of M

Klus Stephanie Potentially Inappropriate Prescribing of Dual Antiplatelet Therapy at Steinbach Family Medical: A Chart Review

Lesperance Sarah Tracking Health Determinants in Accuro: Can We Do Better?

Loewen Marie Percentage of Rural Patients Age 18 and Above with Major Depressive Disorder Receiving Combined Therapy

Majd Shiva Diagnostic Imaging Ordering Patterns for Back Pain in Primary Care

Marantz Jesse Primary Care Quality Indicators at FMC

Marion Valérie Are we meeting the goal blood pressure of 130/80 for Diabetic Patients aged 65-85 at NCMC?

McKay Savanna Pharmacological vascular protection for primary prevention in diabetics ≥ 40 yrs of age at Dauphin Medical Clinic

McPhee Stacy Duration of dual anti-platelet therapy in post-PCI patients at KMC


Mendoza Ken Analysis of Anticoagulation Management and Adherence to CSS guidelines of KMC Patients with Atrial Fibrillation

Menzies Kathryn H. Pylori Testing

Meredith Trevor Assessment of Rate Control in Patients with Atrial Fibrillation

Morrow Christopher At NCMC are we retesting lipids in patients already on a statin?

Nataros Alexander Acute Back Pain in Walk-In Clinic: Quality Improvement Project on Treatment Approaches after the Age of Opioids

Nelko Serena Polypharmacy in the Elderly

Nichol Darrin Screening for Chronic Kidney Disease in Diabetic Patients at the Western Medical Clinic

Paige Dennis Colorectal Cancer Screening and the Endoscopy Central Referral Form

Paradoski Samantha Atrial fibrillation and anticoagulation practices at NCMC

Penner Brittany Benzodiazepine Prescribing and the Elderly at Steinbach Family Medical Centre

Quinn Kelsi HIV Testing in STI-positive Patients in Churchill, MB

Rae James A comparison of prescription of benzodiazepines for treatment of anxiety in Indigenous and settler populations in the Parkland region

Sarpong Simon Consistency of A1C Monitoring in Churchill, MB

Senez Michelle Statin Use in Chronic Kidney Disease

Stasiuk Allison Chronic Kidney Disease Identification and Appropriate Prescribing

Sun Weiyun The Duration of DAPT Treatment in Post-PCI Patient: How are we doing at FMC?

Thompson Dylan Vitamin B12 Deficiency in Portage la Prairie

Treloar Kelby What If? Advanced Care Directives in Primary Care

Vanderhooft Rebecca Colorectal Cancer Screening

Wallace Marc Antithrombotic Therapy in Access River East Patients with Atrial Fibrillation

Wanigasekara Nilupama Screening for Diabetic Retinopathy

Wareham Kristen Childhood Obesity: Are we doing enough?

Xi Zheng Rate of Oral Contraception Use Among Teenagers in Churchill, MB

Xi Maya Left in the Dark? An audit of test result communication to patients

York Ryan Blood pressure measurement prior to antihypertensive prescriptions

2016

Agostinho Andrea Capturing and Counseling of Weight Gain in Patients at the Family Medical Centre

Allen Jessica SFMC Immunization Status: Are we decreasing the Manitoba average?

Alto Meaghan Management of Hypertension in the very elderly

Beaton Emily Returning birth to northern communities: numbers from Norway House, MB

Boon Laurie Incidence of Breast Cancer Screening at a Primary Care Clinic

Boparai Taran Otitis Media in Children: Are we treating appropriately?

Braun Chantel Influenza Vaccination in Pregnancy

Brown Jonathan AAA Screening

Bruin Sonja Diabetes, Cancer and Steroids

Burnside Tyler Type 2 Diabetes Screening at the Dauphin Medical Clinic

Casaclang Natalie STI Screening at Access River East

Clendenan Jessica Reducing Rates of Lipid Level Testing in Individuals on High Dose Statins

Coleman Nathan Trends and Implications of Continuity of Care

Delaquis Alyssa Comparison of Satisfactory Pap Smear Samples Using Spatula and Cytobrush vs. Cytobroom at the Portage Clinic

Fajobi Abiola Choosing Wisely at NCMC: a Chart Audit of Thyroid Stimulating Hormone (TSH) Testing


Gosselin Timothy Screening Guidelines for Osteoporosis and OST Measurement in a Rural Academic Practice

Grexton Travis Dietary Advice for Patients with Diabetes

Huisma Felicity Statin prescribing practices for primary prevention by primary care practitioners in Manitoba

Kallos Stéphane Influenza Vaccination Rates in Pregnancy at Centre Medical Seine

Keeper Edward PSA Screening

Kroeker Bryan Who Are We Missing? – A Review of Portage Clinic Missed Appointments

Lenoski Stephane Vitamin D Testing: Changing Clinical Practice

Magnusson Joshua PPI Use at Family Medical Centre

Martin Kathryn Osteoporosis Screening in Men

McElhoes Jason Investigating Tdap Immunization Rates in Adult Patients at Kildonan Medical Centre

McGuire Catherine Review of Diabetic Retinal Care in Churchill

McNamee David Polypharmacy in the Community Geriatric Population

Melnyk Steven Quality of Follow-Up Among Chronic Kidney Disease Patients at FMC

Meradje Katyoun Glycemic Control in DM with HbA1c

Narang Birinder ACP: Primary Care Discussions…?

Nichol Mike Antipsychotics for treatment of Neuropsychiatric Symptoms in the Personal Care Home

Nickel Jarrod Duration of Bisphosphonate Use in Patients with Osteoporosis

Obara Robert Human Papillomavirus Vaccination at Family Medical Centre

Olfert Janna Dual Antiplatelet Therapy: Do we stop when we should?

Omichinski Lisa Type 2 Diabetes and Pneumococcal Vaccine

Perche Jason Billing Practices at KMC. Are we getting it right?

Peters Leah Monitoring Cholesterol after starting a Statin – Are we following the guidelines?

Pohl Blane Penicillin Allergy Documentation: The Importance of a Thorough History

Poole Cody Allopurinol and Uric Acid Targets in Gout

Renkas Rebecca Identifying Care Gaps in Colorectal Cancer Screening

Rist Jamie Use of Objective Measures of Pulmonary Function in Diagnosis and Continued Management of Asthma

Rondeau Jocelyne Potentially Inappropriate Medication Use in the Elderly

Ruremesha Delphine Osteoporosis Screening at the Northern Connections Medical Clinic

Saper Jonathan Clinical Assessment of Concussion

Schroeder Francis CT Scan Guidelines for Lung Cancer and their Impact on the KMC Patient Population

Sprange Ashleigh Opioid Prescribing Practices for Chronic Non-Cancer Pain at Kildonan Medical Centre

Victor Samuel Alcohol Use Disorder in the North: Chart Audit of the Churchill Health Centre Practice

Warrack Christopher Deprescribing PPI's in Adults with GERD

Wong Jason De-prescribing PPI’s

Worden Tyler Pertussis Vaccination during Pregnancy at the Family Medical Centre

Zhang Jason PSA Screening

2015

Andani Rafiq Review of Nephropathy Screening in Type 2 Diabetics at Churchill, Manitoba

Ansari Stephanie Churchill Quality Improvement Project: Cervical Cancer Screening in Northern Communities

Barnard Alicia Rotavirus Vaccination at Steinbach Family Medical Centre


Carpenter Jean-Loup A review of E. coli resistance in uncomplicated urinary tract infections at Centre Médical Seine

Charette Miranda Addressing Smoking Status and Smoking Cessation in Patients with COPD in the Primary Care Clinic

Chew Darren Monitoring Uric Acid Levels in Patients with Gout on Allopurinol

Chudley David AAA Screening at FMC

Coudière Émilie Urgent and semi-urgent colonoscopies: are we making it under 28 days?

de Moissac Pierre Transfer Time for Emergent Obstetrical Transfers from Ste-Anne Hospital to Bethesda Regional Health Centre

Dueck Jane Rates of Influenza Vaccination Among Patients with Known Coronary Artery Disease

Ewonchuk Marie Statin use in patients over the age of 75

Fine Alison COPD: Primary Care Spirometry and Smoking Cessation

Fung Adrian Are Hypertension Goals being met in Churchill?

Gacutan Sherwin Do Polar Bears Affect Diabetes Screening?

Humniski Kirstyn ACP Planning in a Primary Care Clinic

Jack Megan Analysis of Initial Chronic Kidney Disease Monitoring at FMC

Jhooty Jason Hypothyroidism Monitoring at the Churchill Health Centre

Jolin-Dahel Kheira Physician Integrated Network project. Can it pay for itself?

Koetting Leah Incidence of Cervical Cancer Screening in a Rural Primary Care Clinic

Kornelsen Victoria Advanced Care Planning in the Elderly

Langlais Blaire An Analysis of Abdominal Aortic Aneurysm Screening in Males Aged 65 to 75

Lehn Paeta Childhood Obesity. More than just being cute - A Norway House Chart Audit

Macdonald Lindsey Osteoporosis Screening and Management at the Family Medical Centre

McNaughton Leslie Weight Management in the Care of Patients with Hip and Knee Osteoarthritis

Mejia Ana Pap Smear procedural opportunities by gender, 2016 grads, at NCMC

Murphy Brady Effective Antibiotic Prescribing For UTI at SFMC

Neufeld John Diabetics with A1Cs Greater than 10 at FMC

Paige Allison Keeping our Elderly Healthy: Polypharmacy in the oldest of the elderly at FMC

Paniak Anita Consistency of Hba1c Monitoring: Implications on EMR as a Health Delivery aid in Yellowknife

Plett Jeremy Rates of Pneumococcal Polysaccharide Vaccine in Elderly Patients at FMC

Polan Michael Sedative Medications And Their Use In The Elderly Population At The Family Medical Centre

Riel Stefan Hgb A1C Monitoring in Patients with Diabetes

Sharma Anish An Audit into Ultrasound Screening for Abdominal Aortic Aneurysms Based on Age at KMC

Smith Stephen Waist Circumference: Is Family Medical Centre’s Population being measured?

Stitt Laurel Pneumococcal Immunization in Patients Aged 65 Years and Older at a Primary Care Clinic

Swain Kristina Guideline Adherence for the diagnosis and treatment of acute uncomplicated UTIs

Utko Pawel AAA Screening

Virk Pablo The Appropriate Use of Prostate Specific Antigen at Thompson Medical Clinic

Waye Leon Preventative Medicine: How good are we at keeping our Patients Healthy?

Wettig Kara Antibiotic Regime and Group A Streptococcal Pharyngitis


Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendations that Apply to Primary Care

Stratified by Quality Improvement (QI) Pillar | Searchable by keyword | June 2017

Dr. Kimberly Wintemute at Choosing Wisely Canada and Dr. Alex Singer at the University of Manitoba have compiled a list of recommendations that may pertain to primary care; it draws from the recommendations put forward by all medical societies. It can be helpful in identifying quality improvement opportunities in any practice.

Suggestions for family physicians, primary care teams, and / or trainees using this list:

1. Review recommendations to identify those that have relevance for your community, team or practice (Yes or No column).

2. Review the "Yes" recommendations for ease of implementation and measurement. Consider how easy it will be to galvanize interest and energy for the topic, and where you will get the data (e.g. EMR? Community or hospital lab? Hospital or other database? Manual tabulations done in your office?). Indicate in this column if the project is "do-able" – Yes or No.

3. Recommendations that have two "Yes" responses are potential QI projects (a minimal filtering sketch follows below).
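If the worksheet is kept as a spreadsheet/CSV rather than on paper, the two-"Yes" filter in step 3 can be applied mechanically; a minimal sketch with assumed file and column names ("Recommendation", "Relevant", "Feasible"):

```python
# Minimal sketch (assumed CSV layout): list recommendations marked "Yes" in both
# the "Relevant for us?" and "Easy to Implement & Measure?" worksheet columns.
import csv

with open("choosing_wisely_worksheet.csv", newline="") as f:  # hypothetical file
    for row in csv.DictReader(f):
        if (row.get("Relevant", "").strip().lower() == "yes"
                and row.get("Feasible", "").strip().lower() == "yes"):
            print(row.get("Recommendation", ""))
```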

Note:

Patient Safety and Effectiveness are the relevant QI pillars for most recommendations. However, the point can be made that the pillars of Timeliness (Access) and Efficiency are important in all potential situations that entail clinicians spending time in ways that do not add value to patient care. Unnecessary clinical activity has two negative consequences: it increases wait times for patients who genuinely require access to the system; and it occupies clinicians, taking them away from activities that have true impact.

Page 46: APPENDICES / RESOURCES · APPENDIX 7 – PDSA Cycle Systematic Review . APPENDIX 8 – Research Guide Review – Example from RCPSC . APPENDIX 9 – Building Queries in Accuro . APPENDIX

Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendation Society List(s) QI Pillar

Relevant for us?

Yes/No

Easy to Implement &

Measure? Don’t perform stress cardiac imaging or advanced non-invasive imaging in the initial evaluation of patients without cardiac symptoms unless high-risk markers are present.

Cardiology Effectiveness Access

Don’t perform echocardiography as routine follow-up for mild, asymptomatic native valve disease in adult patients with no change in signs or symptoms.

Cardiology Effectiveness Access

Don’t order annual electrocardiograms (ECGs) for low-risk patients without symptoms. Cardiology Effectiveness

Efficiency

Don’t prescribe antibiotics in adults with bronchitis/asthma and children with bronchiolitis.

Emergency Medicine

Effectiveness Patient Safety

Don’t prescribe antibiotics after incision and drainage of uncomplicated skin abscesses unless extensive cellulitis exists.

Emergency Medicine

Effectiveness Patient Safety

Don’t use antibiotics in adults and children with uncomplicated sore throats

Emergency Medicine

Effectiveness Patient Safety

Don’t use antibiotics in adults and children with uncomplicated acute otitis media

Emergency Medicine

Effectiveness Patient Safety

Don’t recommend routine or multiple daily self-glucose monitoring in adults with stable type 2 diabetes on agents that do not cause hypoglycemia.

Endocrinology and Metabolism

Family Medicine

Effectiveness

Patient-Centredness

Don’t routinely order a thyroid ultrasound in patients with abnormal thyroid function tests unless there is a palpable abnormality of the thyroid gland.

Endocrinology and Metabolism

Effectiveness Access

Don’t use Free T4 or T3 to screen for hypothyroidism or to monitor and adjust levothyroxine (T4) dose in patients with known primary hypothyroidism.

Endocrinology and Metabolism

Effectiveness

Don’t routinely test for Anti-Thyroid Peroxidase Antibodies (anti – TPO).

Endocrinology and Metabolism Effectiveness

Page 47: APPENDICES / RESOURCES · APPENDIX 7 – PDSA Cycle Systematic Review . APPENDIX 8 – Research Guide Review – Example from RCPSC . APPENDIX 9 – Building Queries in Accuro . APPENDIX

Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendation Society List(s) QI Pillar

Relevant for us?

Yes/No

Easy to Implement &

Measure?

Don’t do imaging for lower-back pain unless red flags are present

Family Medicine Emergency Medicine

Occupational Medicine Radiology

Spine Society Physical

Medicine & Rehabilitation

Access Effectiveness Patient Safety

Patient-Centredness

Don’t use antibiotics for upper respiratory infections that are likely viral in origin, such as influenza-like illness, or self-limiting, such as sinus infections of less than seven days of duration

Family Medicine Patient Safety Effectiveness

Don’t order screening chest X-rays and ECGs for asymptomatic or low risk outpatients Family Medicine Effectiveness

Patient Safety

Don’t screen women with Pap smears if under 21 years of age or over 69 years of age Family Medicine Effectiveness

Don’t do annual screening blood tests unless directly indicated by the risk profile of the patient Family Medicine Effectiveness

Don’t routinely measure Vitamin D in low risk adults

Family Medicine Pathology Effectiveness

Don’t routinely do screening mammography for average risk women aged 40 – 49. Individual assessment of each woman’s preferences and risk should guide the discussion and decision regarding mammography screening in this age group

Family Medicine

Patient-Centredness Effectiveness Patient Safety

Don’t do annual physical exams on asymptomatic adults with no significant risk factors Family Medicine Effectiveness

Access

Don’t order DEXA (Dual-Energy X-ray Absorptiometry) screening for osteoporosis on low risk patients

Family Medicine Effectiveness Access

Don’t order thyroid function tests in asymptomatic Family Medicine Effectiveness

Page 48: APPENDICES / RESOURCES · APPENDIX 7 – PDSA Cycle Systematic Review . APPENDIX 8 – Research Guide Review – Example from RCPSC . APPENDIX 9 – Building Queries in Accuro . APPENDIX

Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendation Society List(s) QI Pillar

Relevant for us?

Yes/No

Easy to Implement &

Measure? patients Patient-

Centredness Don’t maintain long term Proton Pump Inhibitor (PPI) therapy for gastrointestinal symptoms without an attempt to stop/reduce PPI at least once per year in most patients.

Gastroenterology Effectiveness Patient Safety

Avoid using an upper GI series to investigate dyspepsia. Gastroenterology Effectiveness

Patient Safety

Avoid colorectal cancer screening tests on asymptomatic patients with a life expectancy of less than 10 years and no family or personal history of colorectal neoplasia.

General Surgery

Effectiveness Patient-

Centredness Access

Don’t use antimicrobials to treat bacteriuria in older adults unless specific urinary tract symptoms are present.

Geriatrics Hospital Medicine Pathology Urology

Effectiveness Patient Safety

Don’t use benzodiazepines or other sedative-hypnotics in older adults as first choice for insomnia, agitation or delirium.

Geriatrics Hospital Medicine

Psychiatry

Effectiveness Patient Safety

Patient-Centredness

Don't use antipsychotics as first choice to treat behavioural and psychological symptoms of dementia.

Geriatrics Psychiatry

Long Term Care

Effectiveness Patient Safety

Patient-Centredness

Avoid using medications known to cause hypoglycemia to achieve hemoglobin A1c <7.5% in many adults age 65 and older; moderate control is generally better.

Geriatrics Patient Safety

Don’t order neuroimaging or sinus imaging in patients who have a normal clinical examination, who meet diagnostic criteria for migraine, and have no “red flags” for a secondary headache disorder.

Headache Radiology

Effectiveness Access

Patient Safety

Don’t prescribe acute medications or recommend Headache Effectiveness

Page 49: APPENDICES / RESOURCES · APPENDIX 7 – PDSA Cycle Systematic Review . APPENDIX 8 – Research Guide Review – Example from RCPSC . APPENDIX 9 – Building Queries in Accuro . APPENDIX

Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendation Society List(s) QI Pillar

Relevant for us?

Yes/No

Easy to Implement &

Measure? an over-the-counter analgesic for patients with frequent migraine attacks without monitoring frequency of acute medication use with a headache diary.

Don’t prescribe opioid analgesics or combination analgesics containing opioids or barbiturates as first line therapy for the treatment of migraine.

Headache

Effectiveness Patient-

Centredness Patient Safety

Don’t prescribe alternate second-line antimicrobials to patients reporting non-severe reactions to penicillin when beta-lactams are the recommended first-line therapy.

Infectious Disease Patient Safety

Don’t use steroids (e.g., prednisone) for maintenance therapy in inflammatory bowel disease (IBD)

Inflammatory Bowel Disease

Effectiveness

Patient Safety

Don’t use abdominal computed tomography (CT) scan to assess inflammatory bowel disease (IBD) in the acute setting unless there is suspicion of a complication (obstruction, perforation, abscess) or a non-IBD etiology for abdominal symptoms

Inflammatory Bowel Disease Patient Safety

Don’t do a urine dip or urine culture unless there are clear signs and symptoms of a urinary tract infection (UTI)

Long Term Care Effectiveness Patient Safety

Don’t continue or add long-term medications unless there is an appropriate indication and a reasonable expectation of benefit in the individual patient

Long Term Care Effectiveness Patient Safety

Don’t order screening or routine chronic disease testing just because a blood draw is being done Long Term Care Effectiveness

Don’t use non-invasive prenatal detection of fetal aneuploidies by cell-free DNA as a diagnostic test.

Medical Genetics Effectiveness

Don’t make medical decisions based on results of Medical Genetics Effectiveness

Page 50: APPENDICES / RESOURCES · APPENDIX 7 – PDSA Cycle Systematic Review . APPENDIX 8 – Research Guide Review – Example from RCPSC . APPENDIX 9 – Building Queries in Accuro . APPENDIX

Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendation Society List(s) QI Pillar

Relevant for us?

Yes/No

Easy to Implement &

Measure? direct to consumer genetic testing (DTC-GT) without a clear understanding of the limitations and validity of the test Don’t collect urine specimens for culture from adults who lack symptoms localizing to the urinary tract or fever unless they are pregnant or undergoing genitourinary instrumentation where mucosal bleeding is expected

Medical Microbiology

Effectiveness Patient Safety

Patient-Centredness

Don’t routinely collect or process specimens for Clostridium difficile testing when stool is non-liquid (i.e., does not take the shape of the specimen container) or when the patient has had a prior nucleic acid amplification test result within the past 7 days

Medical Microbiology Effectiveness

Don’t obtain swabs from superficial ulcers for culture as they are prone to both false positive and false negative results with respect to the cause of the infection

Medical Microbiology Effectiveness

Don’t prescribe nonsteroidal anti-inflammatory drugs (NSAIDS) in individuals with hypertension or heart failure or CKD of all causes, including diabetes.

Nephrology Patient Safety

Don’t prescribe angiotensin converting enzyme (ACE) inhibitors in combination with angiotensin II receptor blockers (ARBs) for the treatment of hypertension, diabetic nephropathy and heart failure.

Nephrology Patient Safety

Don’t use nuclear medicine thyroid scans to evaluate thyroid nodules in patients with normal thyroid gland function.

Nuclear Medicine

Effectiveness Patient Safety

Don’t repeat DEXA scans more often than every two years in the absence of high risk or new risk factors.

Nuclear Medicine

Effectiveness Access

Page 51: APPENDICES / RESOURCES · APPENDIX 7 – PDSA Cycle Systematic Review . APPENDIX 8 – Research Guide Review – Example from RCPSC . APPENDIX 9 – Building Queries in Accuro . APPENDIX

Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendation Society List(s) QI Pillar

Relevant for us?

Yes/No

Easy to Implement &

Measure? Don’t repeat dual energy X-ray absorptiometry (DEXA) scans more often than every 2 years.

Rheumatology

Avoid the use of routine episiotomy in spontaneous vaginal births. Obstetrics &

Gynecology Effectiveness Patient Safety

Don’t do electronic fetal monitoring for low risk women in labour; use intermittent auscultation. Obstetrics &

Gynecology Effectiveness Patient Safety

Don’t perform routine urinalysis (protein, glucose) at every antenatal visit (in low risk normotensive women).

Obstetrics & Gynecology

Effectiveness Efficiency

Don’t use meperidine for labour analgesia due to its long-acting active metabolites and negative effects on neonatal behaviours.

Obstetrics & Gynecology Patient safety

Don’t routinely screen women with Pap smears if under 21 years of age or over 69 years of age. Obstetrics &

Gynecology Effectiveness

Don’t screen for ovarian cancer in asymptomatic women at average risk. Obstetrics &

Gynecology Effectiveness

Don’t endorse clinically unnecessary absence from work.

Occupational Medicine

Patient-Centredness Effectiveness

Don’t repeat chest X-rays when screening exposed workers for asbestosis unless clinical indications are present.

Occupational Medicine

Effectiveness Patient Safety

Don’t perform routine cancer screening, or surveillance for a new primary cancer, in the majority of patients with metastatic disease.

Oncology Patient-Centredness

Don’t delay or avoid palliative care for a patient with metastatic cancer because they are pursuing disease-directed treatment.

Oncology Palliative Care

Patient-Centredness

Don’t deliver care (e.g., follow-up) in a high-cost Oncology Patient-

Page 52: APPENDICES / RESOURCES · APPENDIX 7 – PDSA Cycle Systematic Review . APPENDIX 8 – Research Guide Review – Example from RCPSC . APPENDIX 9 – Building Queries in Accuro . APPENDIX

Last Reviewed: 22/06/2017 | Dr. Kimberly Wintemute MD CCFP FCFP | Primary Care Co-Lead, Choosing Wisely Canada

Choosing Wisely Canada Recommendation Society List(s) QI Pillar

Relevant for us?

Yes/No

Easy to Implement &

Measure? setting (e.g., inpatient, cancer center) that could be delivered just as effectively in a lower-cost setting (e.g., primary care).

Centredness Effectiveness

Access Don’t use glucosamine and chondroitin to treat patients with symptomatic osteoarthritis of the knee.

Orthopedics Effectiveness

Don’t use oral antibiotics as a first line treatment for patients with painless ear drainage associated with a tympanic membrane perforation or tympanostomy tube unless there is evidence of developing cellulitis in the external ear canal skin and pinna

Otolaryngology Effectiveness Patient Safety

Don’t order a routine ultrasound for umbilical and/or inguinal hernia. Pediatric Surgery Effectiveness

Don’t order a routine ultrasound for children with undescended testes.

Pediatric Surgery Urology Effectiveness

Don’t delay referral for undescended testes beyond 6 months of age. Pediatric Surgery Effectiveness

Don’t image a midline dimple related to the coccyx in an asymptomatic infant or child.

Pediatric Neurosurgery Effectiveness

Don’t routinely use acid blockers or motility agents for the treatment of gastroesophageal reflux in infants

Pediatrics Effectiveness Patient Safety

Don’t administer psychostimulant medications to preschool children with Attention Deficit Disorder (ADD), but offer parent-administered behavioural therapy.

Pediatrics Effectiveness Patient Safety

Don’t routinely do a throat swab when children present with a sore throat if they have a cough, rhinitis, or hoarseness as they almost certainly have viral pharyngitis.

Pediatrics Effectiveness Efficiency

Don’t recommend the use of cough and cold remedies in children under six years of age Pediatrics Effectiveness

Patient Safety

Don’t delay advance care planning conversations. Palliative Care Patient-Centredness

Don’t use stool softeners alone to prevent opioid induced constipation. Palliative Care Effectiveness Patient-Centredness

Don’t treat asymptomatic urinary tract infections in catheterized patients. Physical Medicine & Rehabilitation Effectiveness Patient-Centredness

Don’t regularly prescribe bed rest and inactivity following injury and/or illness unless there is scientific evidence that harm will result from activity. Physical Medicine & Rehabilitation Effectiveness Patient-Centredness

Don’t order prescription drugs for pain without considering functional improvement. Physical Medicine & Rehabilitation Effectiveness Patient Safety Patient-Centredness

Don’t recommend carpal tunnel release without electrodiagnostic studies to confirm the diagnosis and severity of nerve entrapment. Physical Medicine & Rehabilitation Effectiveness Patient Safety Patient-Centredness

Do not use SSRIs as the first-line intervention for mild to moderately depressed teens. Psychiatry Effectiveness Patient-Centredness Patient Safety

Do not use psychostimulants as a first-line intervention in preschool children with ADHD. Psychiatry Effectiveness Patient-Centredness

Do not routinely use antipsychotics to treat primary insomnia in any age group. Psychiatry Effectiveness Patient Safety

Don’t initiate long-term maintenance inhalers in stable patients with suspected COPD if they have not had confirmation of post-bronchodilator airflow obstruction with spirometry. Respiratory Medicine Effectiveness Patient Safety

Don’t perform CT screening for lung cancer among patients at low risk for lung cancer. Respiratory Medicine Effectiveness Efficiency Patient Safety

Don’t treat adult cough with antibiotics even if it lasts more than 1 week, unless bacterial pneumonia is suspected (mean viral cough duration is 18 days). Respiratory Medicine Effectiveness Patient Safety

Don’t initiate medications for asthma (e.g., inhalers, leukotriene receptor antagonists, or other) in patients ≥ 6 years old who have not had confirmation of reversible airflow limitation with spirometry, and in its absence, a positive methacholine or exercise challenge test, or sufficient peak expiratory flow variability. Respiratory Medicine Effectiveness Patient Safety

Don’t use antibiotics for acute asthma exacerbations without clear signs of bacterial infection. Respiratory Medicine Patient Safety Effectiveness

Don’t order ANA as a screening test in patients without specific signs or symptoms of systemic lupus erythematosus (SLE) or another connective tissue disease (CTD). Rheumatology Effectiveness

Don’t order an HLA-B27 unless spondyloarthritis is suspected based on specific signs or symptoms. Rheumatology Effectiveness

Don’t prescribe bisphosphonates for patients at low risk of fracture. Rheumatology Effectiveness Patient Safety

Don’t order an MRI for suspected degenerative meniscal tears or osteoarthritis. Sport & Exercise Medicine Effectiveness Efficiency Patient-Centredness

Don’t prescribe opiates as first line treatment for tendinopathies. Sport & Exercise Medicine Patient Safety

Don’t order orthotics for asymptomatic children with pes planus (flat feet). Sport & Exercise Medicine Effectiveness Patient-Centredness

Don’t order an MRI as an initial investigation for suspected rotator cuff tendinopathy. Sport & Exercise Medicine Effectiveness Efficiency Patient-Centredness

Don’t perform unnecessarily frequent ultrasound examinations in asymptomatic patients with small abdominal aortic aneurysms. Aneurysms smaller than 4.5cm in diameter should undergo ultrasound surveillance every 12 months. Vascular Surgery Effectiveness


Primary Care Quality Indicators: Preventative Care (screening), Diabetes, Hypertension, Coronary Artery Disease, Congestive Heart Failure, Asthma. The following link provides more information: http://www.gov.mb.ca/health/primarycare/providers/pin/qm.html


APPENDIX 5 – SECURITY AND STORAGE OF PERSONAL HEALTH INFORMATION


Approved by Teaching Clinic Council, October 2014

Appendix B: Key Policy Points re: Security and Storage of Personal Health Information: Note:

All staff, faculty and learners working in the Teaching Clinics are accountable to abide by Regional policies.

The following summary is not a substitute for reviewing the full set of policies on access, use and management of personal health information. Please see: http://home.wrha.mb.ca/corp/policy/policy.php. For questions concerning the application of these policies, consult your clinic’s manager or Unit Director.

1. Personal health information (PHI) is to be collected, used, disclosed or accessed only by individuals who are authorized for that purpose.

2. All written PHI shall be placed in a secure file, and the file must be kept in a secure place at all times.

3. PHI shall not be removed from the trustee’s premises, unless for a purpose specifically authorized by the trustee. In such cases, the authorized person shall carry the file/electronic media with them or ensure secure storage at all times.

4. Individuals who sign on to a computer must not leave the computer on in accessible areas when they leave their workstations. User password protocols must be in place and utilized. For electronic medical records, users must appropriately lock their workstation when leaving it, without exception. In the Accuro EMR, the “fuzzy lock” is the recommended approach and users should customize their own fuzzy lock. For instructions on how to do this, please see: http://home.wrha.mb.ca/prog/csis/files/training/fuzzylock16.pdf

5. Personal health files/electronic media shall be returned to their designated secured storage location and not allowed to accumulate or be left unattended on desktops or any other location in a non-secured place. Any PHI contained within the computer hardware or on electronic storage media shall be secured or removed prior to removal of the equipment/media from any office.

6. PHI stored in electronic form shall be properly secured from unauthorized access. PHI stored on electronic media, e.g., flash drives, shall be kept in a secured place at all times and shall be used only by authorized personnel having access to a protected system.

7. Personal health information shall not be transmitted via electronic mail without appropriate safeguards such as encryption or transmittal within a secure firewall where practicable.

8. Simply erasing restricted data from electronic storage media does not completely remove the data. In order to destroy the data, overwriting, degaussing or physical destruction of the storage media is required.

Overwriting: achieved by writing random data to the media a minimum of three (3) times.

Degaussing: use of a powerful magnet to destroy the magnetic disk by completely randomizing its magnetic field.

Physical Destruction: crushing, shredding, incinerating, perforating or otherwise rendering the medium physically unable to be used.


APPENDIX 6 – EIGHT STEPS TO CHART AUDIT FOR QUALITY IMPROVEMENT


8 Steps to a Chart Audit for Quality

For many family physicians, the idea of a chart audit conjures up images of federal investigators or insurance company representatives descending on their offices to look for evidence of wrongdoing. For the most part, however, a chart audit is not so scary. A chart audit is simply a tool physicians can use to check their own performance, determine how they’re doing and identify areas where they might improve. The purpose of this article is to describe some scenarios in which a chart audit might be helpful and to offer step-by-step instructions for doing one.

Why a chart audit?

Chart audits can serve many purposes, from compliance to research to administrative to clinical. You can conduct a chart audit on virtually any aspect of care that is ordinarily documented in the medical record. Practices frustrated with clinical processes that don’t work well can use chart audits to document that something is wrong, find the defect in the process and fix it.

Perhaps the most beneficial use for a chart audit is to measure quality of care so that you can improve it. Chart audits are often used as part of a quality improvement initiative. For example, a practice might review charts to see how often a particular vaccine is offered, given or declined. If the audit determines that the vaccine is not being offered or given as recommended, then there is room for improvement. The same practice could review the panels of individual physicians within the group to see if they differ in performance on this measure and to give focus to their improvement efforts (for additional chart audit ideas, see page A4).

Barbara H. Gregory, MPH, MA, Cheryl Van Horn, RN, and Victoria S. Kaprielian, MD


A simple chart review can help your group answer the question on everyone’s mind:

“How are we doing?”

Downloaded from the Family Practice Management Web site at www.aafp.org/fpm. Copyright© 2008 American Academy of Family Physicians. For the private, noncommercial

use of one individual user of the Web site. All other rights reserved. Contact [email protected] for copyright questions and/or permission requests.


A chart audit is one of numerous data sources available for quality improvement efforts. Others include patient surveys, discharge summary reviews, billing/claims data and employee feedback.

How to do it

Below we describe eight steps to a formal chart audit. Although the process is not necessarily linear, we will discuss each step in the order it might typically occur, using the example of a breast cancer screening audit to illustrate each step. Because the audit will involve reviewing confidential data, it is important to check your institutional guidelines regarding patient confidentiality before you get too far into the planning process.

Step 1: Select a topic. The focus of your audit must be clear, neither too narrow nor too broad, and measurable using data available in the medical record. If possible, choose an area that interests you. You will find that you are more able to recognize nuances in your study when you have personal interest in the topic. Of course, your topic should also be of interest to the practice, perhaps a problem or aspect of care that the providers have identified as needing improvement. The Joint Commission recommends studying issues that are high frequency, high risk or both.

You should also consider early in the process how important external comparison is to your purpose. If it is quite important, then choose a topic that has an existing, well-defined measure and available benchmark data – even one you might not choose otherwise – because this will be more practical than developing your own standard for comparison.

Chart auditing is an iterative process – don’t be discouraged if you change directions several times before settling on a topic.

Example: Your practice wants to measure how well it’s doing on meeting recommendations for preventive care. Since the insurance carriers in the area are focusing heavily on women’s health, the group decides to focus its chart review on screening for breast cancer (mammography).

Step 2: Identify measures. Once you’re set on a topic, you need to define exactly what you will measure. Criteria must be outlined precisely, with specific guidelines as to what should be counted as a “yes” (criteria met) and

About the Authors: Barbara Gregory is coordinator for the Duke Center for Community Research Virtual Library in Durham, N.C. Cheryl Van Horn is quality analyst for the Department of Community and Family Medicine at Duke University. Dr. Kaprielian is professor and vice-chair for education in the Department of Community and Family Medicine at Duke University, where she also serves as director for quality improvement and continuing medical education. Author disclosure: nothing to disclose.

POTENTIAL TOPICS FOR QUALITY AUDITS

Preventive care

Percentage of women ages 21-64 who have had a Pap smear within the past three years

Percentage of adults ages 51-80 who have had colon cancer screening

Percentage of children age 2 who have completed all recommended immunizations

Percentage of elderly adults with documented fall risk assessment within the past year

Chronic disease management

Percentage of patients with hypertension whose last blood pressure reading was < 140/90

Percentage of patients with diabetes with an A1C level recorded in the last year

Percentage of patients with diabetes whose A1C is < 7.0

Percentage of patients with diabetes with a documented eye exam within the last year

Percentage of patients with persistent asthma who are on an anti-inflammatory agent

Note: Any of these metrics would have to be defined with greater specificity before use.

Chart audits can be done for compliance, research, administrative or clinical purposes.

Chart audits are often used as part of quality-improvement initiatives.

The first step in a chart audit for quality is to identify a clear, measurable topic in an area that interests your practice.


DETERMINING SAMPLE SIZE

Calculating a statistically valid sample size for a chart review follows steps adapted from statistical techniques used for descriptive studies. The process uses a nomogram, or table, to identify the desired number:

1. Estimate the expected proportion within the population that will have the measure of interest.

If you have a benchmark from literature or prior studies, use it. Otherwise, consult with colleagues or experts in the field to determine an estimate. The tables generally require this proportion to be 50 percent or less. If more than 50 percent of the population is expected to have the characteristic, then base your sample size calculation on the proportion without the characteristic.

2. Specify the width of the confidence interval you wish to use.

All empirical estimates based on a sample have a certain degree of uncertainty associated with them. It is necessary, therefore, to specify the desired width of the confidence interval (W). This gives a range of values that you can be confident contains the true value. In most cases, an appropriate width is 0.20 (that is, plus or minus 10 percent).

3. Set the confidence level. This is a measure of the precision or level of uncertainty. Typically 95 percent is used, meaning that we are 95 percent certain that the interval includes the true value. This is arbitrary, however, and other levels of confidence can be used. The table shown below is for a 95-percent confidence level. The narrower the width of the confidence interval and the higher the confidence level, the larger the sample size.

4. Use the nomogram (below) to estimate sample size.

Sample size for a descriptive study of a dichotomous variable, 95-percent confidence interval

Expected          Width of the confidence interval (W)
proportion (P)     0.10    0.15    0.20    0.25    0.30
0.10                138      61       -       -       -
0.15                196      87      49      31       -
0.20                246     109      61      39      27
0.25                288     128      72      46      32
0.30                323     143      81      52      36
0.40                369     164      92      59      41
0.50                384     171      96      61      43

Adapted with permission from Hulley SB, et al. Designing Clinical Research, 3rd ed. Philadelphia: Wolters Kluwer Health; 2006:91.

The second step is to determine exactly what you will measure and to define inclusion and exclusion criteria.

A pilot audit may help to uncover issues that need to be clarified before beginning the full audit.

The sample audit described in the article focuses on whether a mammogram was completed or recommended within the last 24 months.

AN EXAMPLE

According to HEDIS 2007 Audit Means, Percentiles and Ratios, the NCQA’s annual report of health plan performance data, 68.9 percent of women age 40 to 69 had a mammogram during 2006. This makes the expected proportion of those without screening 31.1 percent. We choose a width of the confidence interval of 0.20 (plus or minus 10 percent) and a confidence level of 95 percent. This means that we want to be 95 percent confident that the result falls between 58.9 percent and 78.9 percent. Using the nomogram to determine the sample size, we read down the left column of figures for the expected proportion without the characteristic (0.30 is the closest value to 31.1 percent) and then across to the chosen width of the confidence interval (0.20). When we follow the column down, we find the required sample size (81). If the number required is too large to be completed, we can recalculate with a lower confidence level or wider interval; this will produce a smaller sample size.
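
The nomogram values can also be reproduced with the standard formula for the sample size needed to estimate a proportion, n = 4 x z^2 x P(1 - P) / W^2, where z is 1.96 for a 95-percent confidence level; this appears to be the calculation underlying the Hulley et al. table cited above, and it matches the nomogram within rounding. A minimal sketch of the calculation (illustrative only, in Python):

    # sample_size.py - illustrative sketch of the nomogram calculation
    def sample_size(p, width, z=1.96):
        """Charts needed to estimate a proportion p to within +/- width/2,
        at the confidence level implied by z (1.96 = 95 percent)."""
        return round(4 * z ** 2 * p * (1 - p) / width ** 2)

    # Worked example from the article: ~31% of women expected to be unscreened,
    # width 0.20 (plus or minus 10 percent), 95-percent confidence.
    print(sample_size(0.30, 0.20))   # 81 charts, as in the nomogram
    print(sample_size(0.50, 0.10))   # 384, the largest value in the table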


what should be counted as a “no” (not met). For example, if you decided to review the

rate at which foot exams were performed on patients with diabetes in the last year, you would need to decide what qualifies as an adequate foot exam. Is it monofilament testing for sensation? Visual inspection? Palpation of pulses? Many would say all three are necessary for a complete foot exam. If only two of the three are documented, how will you count that?

It may be worthwhile to do a literature review to help you define your measures or consult measures used by insurers or accrediting bodies; adopting measures that have been used successfully in the past will make your work easier. A literature review may also help you identify benchmarks for comparison.

Once you’ve chosen measures that seem workable, it can be helpful to conduct a pilot audit. Just going through a few charts will help to identify issues that need to be clarified before starting a full audit.

Example: For your audit on breast cancer screening, the group considers several measures, including the following:

• Time since last mammogram. This provides the most specific information but would require more analysis.

• Mammogram completed within last year. This measure attempts to assess compliance with clinical guidelines. The U.S. Preventive Services Task Force recommends screening mammography every one to two years for women age 40 and older. However, the Healthcare Effectiveness Data and Information Set (HEDIS) measures, which most health plans use for National Committee for Quality Assurance (NCQA) accreditation purposes, require at least one mammogram completed within the past 24 months.

• Mammogram ordered within last year. Do you want to measure only whether the study was done, or whether it was recommended or ordered by the provider? Should providers be held accountable when patients decline to have the test?

After considerable discussion, the group decides to measure whether a mammogram was completed or recommended within the last 24 months.

CHART AUDIT FOR BREAST CANCER SCREENING

Columns: Patient identification (Patient name; MRN); Inclusion criteria (Age 42-69 as of 12/31/07; 3 visits in past 3 years; 1 visit in past 13 months); Exclusion criteria (Bilateral mastectomy; Left practice, terminally ill, expired); Mammogram in past 24 months (Locally; Elsewhere); No mammogram in past 24 months (No discussion documented; Discussed, patient declined; Mammogram ordered, not completed)

Jane D A2345 53 yes yes no no yes no
Sue S B2345 62 yes yes no no yes no
Ann J C2345 59 yes yes no no no no yes no no
Betty M D2345 65 yes yes yes no
Julie J E2345 57 yes yes no yes
Bonnie B F2345 52 no
Alice G G2345 55 yes yes no no yes no
Kate H H2345 61 yes no
Dana T I2345 63 yes yes right side only no no yes
Doris B J2345 40
Helen P K2345 64 yes yes no no yes no
Evelyn C L2345 51 yes yes no no yes no
Paula T M2345 49 yes yes no no yes no
Mary S N2345 69 yes yes no no yes no
Beverly C P2345 56 yes yes no no yes no

The patient population that is the focus of the audit is generally defined by the measure.

Calculating a statistically valid sample size is aided by the use of a nomogram.

Note: Shading indicates that the patient has not met the exclusion or inclusion criteria.


Step 3: Identify the patient population. To determine which records to review, you need to define the population you want to assess. Characteristics to consider may include age, gender, disease status and treatment status. In many cases, the focus of the audit and even the measure itself will help to define the population. You’ll also need to develop specific inclusion or exclusion criteria.

Example: In keeping with the HEDIS breast cancer screening measure that your group decided to follow, your patient population will be women age 40 to 69. Because you’ll be looking for evidence of a mammogram in the past 24 months, the lower age limit for the sample will be 42. Only those patients with at least three visits in the last two years and one in the last 13 months will be included. You decide to exclude women who have had bilateral mastectomies or are terminally ill.
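
As a rough illustration of how inclusion and exclusion criteria like these can be applied to a chart list exported from the EMR, the sketch below filters a list of records; the record layout and field names are hypothetical, not taken from the article or from any particular EMR:

    # eligibility.py - illustrative filter for building the audit sample
    charts = [
        # one dict per chart, e.g. from a spreadsheet export (fields are made up)
        {"mrn": "A2345", "sex": "F", "age": 53, "visits_last_2y": 4,
         "visit_last_13m": True, "bilateral_mastectomy": False, "terminal_or_left": False},
        {"mrn": "J2345", "sex": "F", "age": 40, "visits_last_2y": 1,
         "visit_last_13m": False, "bilateral_mastectomy": False, "terminal_or_left": False},
    ]

    def eligible(c):
        """Apply the audit's inclusion and exclusion criteria to one chart."""
        included = (c["sex"] == "F" and 42 <= c["age"] <= 69
                    and c["visits_last_2y"] >= 3 and c["visit_last_13m"])
        excluded = c["bilateral_mastectomy"] or c["terminal_or_left"]
        return included and not excluded

    audit_sample = [c for c in charts if eligible(c)]
    print(f"{len(audit_sample)} of {len(charts)} charts meet the selection criteria")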

Step 4: Determine sample size. A manual audit of all charts meeting your inclusion criteria will not be feasible in most situations. That’s where sampling comes in. For an informal, or “quick and dirty,” audit designed to give you a sense of whether a more sophisticated audit is warranted, you may find it useful to sample a minimum of 20 charts. For better results, a common rule of thumb is to try for 10 percent of the eligible charts. Or you may choose to use a convenience sample: the patients from a single day or all the charts on a single shelf in the records room.

If you want to track a measure over time, or if you want your results to be statistically valid, your sample size is critical. If the sample is too small, the random variability will be too large, and the results will be limited in their applicability.

Example: Using the process outlined on page A5, your group determines that its sample should total 81 charts.

Step 5: Create audit tools. To complete your chart audit, you will need instruments on which to record your findings. How they are structured and the details they include will affect the analysis you can do and the eventual usability of your findings. Data should be collected in a format that keeps all individual records separate but allows for easy compiling.

Many chart audits involve the calculation of a rate, percentage, mean or other statistical measurement. An electronic spreadsheet format can be customized to do these calculations for you. For those more comfortable with paper-based systems, a preprinted form that lists the specific items to check in each chart serves well as an audit tool. One form is completed for each chart, and the forms can


BREAST CANCER SCREENING RESULTS # %

Total charts reviewed 100

Patients included in audit 81

Patients who received mammogram 46 57

Received mammogram locally 25 31

Received mammogram elsewhere 10 12

Patients with no documentation of completed mammogram 35 43

Documented declined mammography 6 17

Documented mammogram ordered, not completed 4 13

No documentation of discussion of mammography 25 71

A confidence level of 95 percent, with a confidence interval of plus or minus 10 percent, is often used.

The tools used for recording the audit data must be clear, simple and well understood by the auditors.


then be sorted and counted as desired. A separate form can be used to tabulate results.

Creating clear, simple audit tools will make it possible for nonclinical staff to perform many audits effectively. Once you’ve developed the forms, if someone other than you will be doing the actual chart reviews, go over a few examples together to be sure the reviewer understands the criteria exactly as you intend.

Example: Your group decides to use paper forms for the chart audit (see the completed forms on page A6).

Step 6: Collect data. Select the date or dates on which you will collect data. Be sure to coordinate the specifics (date, time and number of charts to be pulled) with the medical records staff. Review each chart to determine if the patient meets the selection criteria. The reviewer should complete one audit tool (paper form or row in the electronic spreadsheet) for each patient that meets the criteria. To protect patient confidentiality, patient names should not be included on the review forms.

Example: You instruct your office staff to pull the charts of roughly 100 adult female patients. Once you’ve identified 81 that meet the selection criteria, your nursing supervisor fills out the audit tool for each one, reserving questionable cases for physician review.

Step 7: Summarize results. Summarizing the data is a little more complex than just counting up all the data sheets. You must consider how the data will be used and make sure the information is presented in a way that will make it meaningful. Inconsistencies here can produce data that can’t be interpreted.

Example: Your breast cancer screening audit results show that 57 percent of your sample received mammograms (see the results table on page A7).
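
For groups that record their audit forms in a spreadsheet or a short script rather than on paper, the same kind of summary table can be produced directly from the per-chart rows. A minimal sketch, with made-up field names rather than anything prescribed by the article:

    # summarize.py - illustrative tally of completed audit rows
    rows = [
        {"mammogram_done": True,  "declined": False, "ordered_not_done": False},
        {"mammogram_done": False, "declined": True,  "ordered_not_done": False},
        # ... one dict per audited chart
    ]

    def pct(part, whole):
        return round(100 * part / whole) if whole else 0

    n = len(rows)
    done = sum(r["mammogram_done"] for r in rows)
    not_done = [r for r in rows if not r["mammogram_done"]]
    declined = sum(r["declined"] for r in not_done)
    ordered = sum(r["ordered_not_done"] for r in not_done)

    print("Received mammogram:", done, f"({pct(done, n)}%)")
    print("No documented mammogram:", len(not_done), f"({pct(len(not_done), n)}%)")
    print("  declined:", declined, f"({pct(declined, len(not_done))}% of those without)")
    print("  ordered, not completed:", ordered, f"({pct(ordered, len(not_done))}% of those without)")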

Step 8: Analyze and apply results. Once you have compiled your data and calculated the results, you can compare them to local or national benchmarks. There may be multiple benchmarks, depending on your topic and the performance measure you calculated. You should take into account the differences between your population and those you’re comparing it with, as appropriate. If the measure is truly important to the group, you may wish to set a performance goal based on what the group feels is appropriate and reasonable and make it the focus of a quality improvement initiative.

Example: At 57 percent, your group’s breast screening rate is less than the national benchmark of 68.9 percent. This benchmark is the mean for commercial HMO patients, according to the HEDIS 2007 Audit Means, Percentiles and Ratios, the NCQA’s annual report of health plan performance data (view it at http://www.ncqa.org/tabid/334/default.aspx). Of the 35 patient charts that had no documentation of a mammogram, only 10 records showed that the physician had discussed the need for a mammogram with the patient. The challenge is now to drill down to figure out whether the issue was discussed but not documented in those other charts or whether it was simply overlooked. Telephone contact with the 25 identified patients might help you begin to clarify this so that an appropriate intervention can be designed.

Make it count

Chart audits can be useful tools in improvement and safety efforts. It is essential to define precisely what you want to measure and the criteria by which you will measure it. (If you’re floundering, you probably haven’t defined this well enough.) Sample sizes can be chosen informally or determined in a statistically valid fashion. Summarize your data in a way that makes sense for the problem you’re addressing. Make sure to act on problems you find, and remeasure later to see that your changes made a difference. You and your patients will be glad you did.

Editor’s note: An expanded and interactive version of this content is available at http://patientsafetyed.duhs.duke.edu.

Send comments to [email protected].

Make sure to act on problems you find, and remeasure later to see that your changes made a difference.

Patient names should be omitted from review forms to protect confidentiality.

The audit results must be carefully summarized and compared to benchmarks.

Once the results are fully understood, an improvement initiative can be designed and implemented.


APPENDIX 7 – PDSA CYCLE SYSTEMATIC REVIEW


Systematic review of the application of the plan–do–study–act method to improve quality in healthcare

Michael J Taylor,1,2 Chris McNicholas,2 Chris Nicolay,1 Ara Darzi,1

Derek Bell,2 Julie E Reed2

▸ Additional material is published online only. To view please visit the journal online (http://dx.doi.org/10.1136/bmjqs-2013-001862).

1Department of Surgery and Cancer, Imperial College London, London, UK
2National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) for North-West London, London, UK

Correspondence to Michael J Taylor, Academic Surgical Unit, 10th Floor, QEQM building, St Mary’s Hospital, Paddington, London W2 1NY, UK; [email protected]

Received 29 January 2013. Revised 25 June 2013. Accepted 4 July 2013.

To cite: Taylor MJ, McNicholas C, Nicolay C, et al. BMJ Qual Saf Published Online First: [please include Day Month Year] doi:10.1136/bmjqs-2013-001862

ABSTRACT
Background: Plan–do–study–act (PDSA) cycles provide a structure for iterative testing of changes to improve quality of systems. The method is widely accepted in healthcare improvement; however there is little overarching evaluation of how the method is applied. This paper proposes a theoretical framework for assessing the quality of application of PDSA cycles and explores the consistency with which the method has been applied in peer-reviewed literature against this framework.
Methods: NHS Evidence and Cochrane databases were searched by three independent reviewers. Empirical studies were included that reported application of the PDSA method in healthcare. Application of PDSA cycles was assessed against key features of the method, including documentation characteristics, use of iterative cycles, prediction-based testing of change, initial small-scale testing and use of data over time.
Results: 73 of 409 individual articles identified met the inclusion criteria. Of the 73 articles, 47 documented PDSA cycles in sufficient detail for full analysis against the whole framework. Many of these studies reported application of the PDSA method that failed to accord with primary features of the method. Less than 20% (14/73) fully documented the application of a sequence of iterative cycles. Furthermore, a lack of adherence to the notion of small-scale change is apparent and only 15% (7/47) reported the use of quantitative data at monthly or more frequent data intervals to inform progression of cycles.
Discussion: To progress the development of the science of improvement, a greater understanding of the use of improvement methods, including PDSA, is essential to draw reliable conclusions about their effectiveness. This would be supported by the development of systematic and rigorous standards for the application and reporting of PDSAs.

INTRODUCTION
Delivering improvements in the quality and safety of healthcare remains an international challenge. In recent years, quality improvement (QI) methods such as plan–do–study–act (PDSA) cycles have been used in an attempt to drive such improvements. The method is widely used in healthcare improvement; however there is little overarching evaluation of how the method is applied. This paper proposes a theoretical framework for assessing the quality of application of PDSA cycles and explores the quality and consistency of PDSA cycle application against this framework as documented in peer-reviewed literature.

Use of PDSA cycles in healthcareDespite increased investment in researchinto the improvement of healthcare,evidence of effective QI interventionsremains mixed, with many systematicreviews concluding that such interven-tions are only effective in specific set-tings.1–4 To make sense of these findings,it is necessary to understand that deliver-ing improvements in healthcare requiresthe alteration of processes within complexsocial systems that change over time inpredictable and unpredictable ways.5

Research findings highlight the influentialeffect that local context can have on thesuccess of an intervention6 7 and, as such,‘single-bullet’ interventions are not antici-pated to deliver consistent improvements.Instead, effective interventions need to becomplex and multi-faceted8–11 and devel-oped iteratively to adapt to the localcontext and respond to unforeseen obsta-cles and unintended effects.12 13 Findingeffective QI methods to support iterativedevelopment to test and evaluate

SYSTEMATIC REVIEW


BMJ Quality & Safety Online First, published on 11 September 2013 as 10.1136/bmjqs-2013-001862

Copyright Article author (or their employer) 2013. Produced by BMJ Publishing Group Ltd under licence.


interventions to care is essential for delivery of high-quality and high-value care in a financially constrainedenvironment.PDSA cycles provide one such method for structur-

ing iterative development of change, either as a stan-dalone method or as part of wider QI approaches,such as the Model for Improvement (MFI), TotalQuality Management, Continuous QI, Lean, SixSigma or ‘Quality Improvement Collaboratives’.3 4 14

Despite increased use of QI methods, the evidencebase for their effectiveness is poor and under-theorised.15–17 PDSA cycles are often a central com-ponent of QI initiatives, however few formal objectiveevaluations of their effectiveness or application havebeen carried out.18 Some PDSA approaches have beendemonstrated to result in significant improvementsin care and patient outcomes,19 while others havedemonstrated no improvement at all.20–22

Although at the surface level these results appeardisheartening for those involved in QI, there is a needto explore the extent to which the PDSA method hasbeen successfully deployed to draw conclusions fromthese studies. Rather than see the PDSA method as a‘black box’ of QI,23 it is important to understand thatthe use of PDSA cycles is, itself, a complex interven-tion made up of a series of interdependent steps andkey principles that inform its application5 24 25 andthat this application is also affected by local context.26

To interpret the results regarding the outcome(s) fromthe application of PDSA cycles (eg, whether processesor outcomes of care improved) and gauge the effect-iveness of the method, it is necessary to understandhow the method has been applied.No formal criteria for evaluating the application or

reporting of PDSA cycles currently exist. It is only inrecent years, through SQUIRE guidelines, that frame-works for publication have been developed that expli-citly consider description of PDSA application.27 28

We consider that such criteria are necessary tosupport and assess the effective application of PDSAcycles and to increase their legitimacy as a scientificmethod for improvement. We revisited the origins andtheory of the method to develop a theoretical frame-work to evaluate the application of the method.

The origins and theory of PDSA cyclesThe PDSA method originates from industry andWalter Shewhart and Edward Deming’s articulation ofiterative processes which eventually became known asthe four stages of PDSA.25 PDCA (plan–do–check–act) terminology was developed following Deming’searly teaching in Japan.29 The terms PDSA and PDCAare often used interchangeably in reference to themethod. This distinction is rarely referred to in the lit-erature and for the purpose of this article we considerPDSA and PDCA but refer to the methodologies gen-erally as ‘PDSA’ cycles unless otherwise stated.

Users of the PDSA method follow a prescribed four-stage cyclic learning approach to adapt changes aimed atimprovement. In the ‘plan’ stage a change aimed atimprovement is identified, the ‘do’ stage sees thischange tested, the ‘study’ stage examines the success ofthe change and the ‘act’ stage identifies adaptations andnext steps to inform a new cycle. The MFI30 andFOCUS31 (see figure 1) frameworks have been devel-oped to precede the use of PDSA and PDCA cycles30 31

respectively (table 1).In comparison to more traditional healthcare

research methods (such as randomised controlledtrials in which the intervention is determined inadvance and variation is attempted to be eliminatedor controlled for), the PDSA cycle presents a prag-matic scientific method for testing changes in complexsystems.32 The four stages mirror the scientific experi-mental method33 of formulating a hypothesis, collect-ing data to test this hypothesis, analysing andinterpreting the results and making inferences toiterate the hypothesis.The pragmatic principles of PDSA cycles promote

the use of a small-scale, iterative approach to testinterventions, as this enables rapid assessment andprovides flexibility to adapt the change according tofeedback to ensure fit-for-purpose solutions are devel-oped.10 12 13 Starting with small-scale tests providesusers with freedom to act and learn; minimising riskto patients, the organisation and resources requiredand providing the opportunity to build evidence forchange and engage stakeholders as confidence in theintervention increases.In line with the scientific experimental method, the

PDSA cycle promotes prediction of the outcome of atest of change and subsequent measurement over time(quantitative or qualitative) to assess the impact of anintervention on the process or outcomes of interest.Thus, learning is primarily achieved through interven-tional experiments designed to test a change. In recog-nition of working in complex settings with inherentvariability, measurement of data over time helpsunderstand natural variation in a system, increaseawareness of other factors influencing processes oroutcomes, and understand the impact of anintervention.As with all scientific methods, documentation of

each stage of the PDSA cycle is important to supportscientific quality, local learning and reflection and toensure knowledge is captured to support organisa-tional memory and transferability of learning to othersettings.This review examines the application of PDSA

cycles as determined by these principle features of thePDSA method described above. We recognise that anumber of health and research related contextualfactors may affect application of the method but thesefactors are beyond the scope of this review. Thereview intends to improve the understanding of


whether the PDSA method is being used and reportedin line with the literature informed criteria and there-fore inform the interpretation of studies that haveused PDSA cycles to facilitate iterative development ofan intervention.

METHODS
A systematic narrative review was conducted in adherence to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement.34

Search
The search was designed to identify peer-reviewed publications describing empirical studies that applied the PDSA method. Taking into account the development of the method and terminology, the search terms used were ‘PDSA’, ‘PDCA’, ‘Deming Cycle’, ‘Deming Circle’, ‘Deming Wheel’ and ‘Shewhart Cycle’. No year of publication restrictions were imposed.

Information sources
The following databases were searched for articles: Allied and Complementary Medicine Database (AMED; 1985 to present), British Nursing Index (BNI; 1985 to present), Cumulative Index to Nursing and Allied Health Literature (CINAHL; 1981 to present), Embase (1980 to present), Health Business Elite (EBSCO Publishing, Ipswich, Massachusetts, USA), the Health Management Information Consortium (HMIC), MEDLINE from PubMed (1950 to present) and PsycINFO (1806 to present) using the NHS Evidence online library (REF), and the Cochrane Database of Systematic Reviews. The last search date was 25 September 2012.

Data collection process and study selectionData were collected and tabulated independently byMJT, CM and CN in a manner guided by theCochrane Handbook. Eligibility was decided inde-pendently, in a standardised manner and disagree-ments were resolved by consensus. If an abstract wasnot available from the database, the full-text referencewas accessed.Inclusion criteria for articles were as follows: pub-

lished in peer-reviewed journal; describes PDSAmethod being applied to improve quality in a health-care setting; published in English. Editorial letters,conference abstracts, opinion and audit articles wereexcluded from the study selection.

Data itemsA theoretical framework was constructed by compart-mentalising the key features of the PDSA method intoobservable variables for evaluation (table 2). Thisframework was developed in accordance with recom-mendations for PDSA use cited in the literature,describing the origins and theory of the method. Facevalidity of the framework was achieved through dis-cussion among authors, with QI facilitators and atlocal research meetings.Data were collected regarding application of the

PDSA method in line with the theoretical framework.Other data collected included first author, year ofpublication, country, area of healthcare, use of PDSAor PDCA terminology, and use of MFI or FOCUS as

Figure 1 The Model for Improvement; FOCUS.


supporting frameworks. Ratios were used to analysethe results regarding the majority of variables, andmean scores regarding data associated with length ofstudy, length of PDSA cycle and sample size were alsoused for analysis. Data were analysed independentlyby MJT and CM. Discrepancies (which occurred inless than 3% of data items) were resolved byconsensus.

Risk of bias in individual studiesThe present review aimed to assess the reported appli-cation of the PDSA method and the results of individ-ual studies were not analysed in this review.

Risk of bias across studiesDespite our review being focused on reported applica-tion, rather than success of interventions, it may stillbe possible that publication bias affected the results ofthis study. Research that used PDSA methodology, butdid not yield successful results, may be less likely toget published than reports of successful PDSAinterventions.

RESULTSStudy selectionA search of the databases yielded 942 articles. Afterremoval of duplicates, 409 remained; 216 and 120

Table 2 Theoretical framework based on key features of the plan–do–study–act (PDSA) cycle method

Feature of PDSA Description of feature How this was measured

Iterative cycles To achieve an iterative approach, multiple PDSA cycles mustoccur. Lessons learned from one cycle link and inform cycles thatfollow. Depending on the knowledge gained from a PDSA cycle,the following cycle may seek to modify, expand, adopt orabandon a change that was tested

▸ Were multiple cycles used?▸ Were multiple cycles linked to one another (ie, does

the ‘act’ stage of one cycle inform the ‘plan’ stage ofthe cycle that follows)?

▸ When isolated cycles were used were future actionspostulated in the ‘act’ stage?

Prediction-based testof change

A prediction of the outcome of a change is developed in the‘plan’ stage of a cycle. This change is then tested and examinedby comparison of results with the prediction

▸ Was a change tested?▸ Was an explicit prediction articulated?

Small-scale testing As certainty of success of a test of change is not guaranteed,PDSAs start small in scale and build in scale as confidence grows.This allows the change to be adapted according to feedback,minimises risk and facilitates rapid change and learning

▸ Sample size per cycle?▸ Temporal duration of cycles?▸ Number of changes tested per cycle?▸ Did sequential cycles increase scale of testing?

Use of data over time Data over time increases understanding regarding the variationinherent in a complex healthcare system. Use of data over time isnecessary to understand the impact of a change on the process oroutcome of interest

▸ Was data collected over time?▸ Were statistics used to test the effect of changes

and/or understand variation?

Documentation Documentation is crucial to support local learning andtransferability of learning to other settings

▸ How thoroughly was the application of the PDSAmethod detailed in the reports?

▸ Was each stage of the PDSA cycles documented?

Table 1 Description of the plan–do–study–act (PDSA) cycle method according to developers and commentators

Deming (1986)25: Original description of the method relating to manufacturing
Langley (1996)30: How the PDSA method may be adapted for use in healthcare contexts
Speroff and O’Connor (2004)33: How the PDSA method is analogous to scientific methodology

Plan Plan a change or test aimed at improvement ▸ Identify objective▸ Identify questions and predictions▸ Plan to carry out the cycle (who, when,

where, when)

Formation of a hypothesis for improvement

Do Carry out the change or test (preferably ona small scale)

▸ Execute the plan▸ Document problems and unexpected

observations▸ Begin data analysis

Conduct study protocol with collection of data

Study Examine the results. What did we learn?What went wrong?

▸ Complete the data analysis▸ Compare data to predictions▸ Summarise what was learnt

Analysis and interpretation of the results

Act Adopt the change, abandon it or runthrough cycle again

▸ What changes are to be made?▸ What will the next cycle entail?

Iteration for what to do next


were further discarded following review of abstractsand full texts, respectively. Excluded articles did notapply the PDSA method as part of an empirical studyor coincidently used the acronyms PDSA or PDCA fordifferent terms, or were abstracts for conferences orposter presentations. A total of 73 articles met theinclusion criteria and were included in the review (seefigure 2).

General study characteristicsCountry of studyThe retrieved articles describe studies conducted inthe USA (n=46), the UK (n=13), Canada (n=3)Australia (n=3), the Netherlands (n=2) and one eachfrom six other countries (see online supplementaryappendix A for complete synthesis of results).

Healthcare discipline to which method was appliedThis varied across acute and community care and clin-ical and organisational settings. The most commonsettings were those of pain management and surgery(six articles each).

Method terminologyOf the 73 articles identified, 42 articles used ‘PDSA’as terminology and 31 referred to the method as‘PDCA’. Eight of these reported using the MFI.Thirty-one articles used ‘PDCA’ terminology, with 20using the preceding FOCUS framework. One articledescribed use of FOCUS and MFI. Over time therewas an increase in the prevalence of PDSA use with

PDCA use diminishing (see online supplementaryfigure S1). The earliest reported use of PDCA andPDSA in healthcare was 1993 and 2000, respectively.

DocumentationThe following four categories were used to describethe extent to which cycles were documented in arti-cles (n=73): no detail of cycles (n=16); themes ofcycles (but no additional details) (n=8); details ofindividual cycles, but not of stages within cycles(n=8); details of cycles including separated informa-tion on stages of cycles (n=41).Analysis of articles against the developed framework

was dependent on the extent to which the applicationof PDSA cycles was reported. Articles that providedno details of cycles or only themes of cycles wereinsufficient for full review and excluded for analysisagainst all features. Articles that provided furtherdetails of cycles completed (n=49) were included foranalysis against the remaining four features of theframework. A full breakdown of findings can beviewed in online supplementary appendix B.

Application of methodIterative cycles (n=49)Fourteen articles described a sequence of iterativecycles (two or more cycles with lessons learned fromone cycle linking and informing a subsequent cycle),33 described isolated cycles that are not linked, and 2articles described cycles that used PDSA stages in theincorrect order (in one article, one plan, one do, twochecks and three acts were described, PDACACA35; afurther study did not report use of a ‘check’ stage;PDA36) and are excluded from further review. Of the33 articles that described non-iterative cycles, 29reported a single cycle being used, and 4 describedmultiple, isolated (non-sequential) cycles. Althoughfuture actions are often suggested in articles thatreported a single cycle, only three explicitly men-tioned the possibility of further cycles taking place. Atotal of 13.6% (3/22) of PDCA studies described theapplication of iterative cycles compared with 44%(11/25) of PDSA studies describing the application ofiterative cycles (see figure 3).

Prediction-based testing of change (n=47)
The aims of the cycles adhered to one of two themes: tests of a change; and collection or review of data without a change made. Of the 33 articles with single cycles, 30 aimed to test a change while 3 used the PDSA method to collect or review data. Of the 14 articles demonstrating sequential cycle use, 8 solely used their cycles to test change while 5 began with a cycle collecting or reviewing data followed by cycles testing change. One article described a mixture of cycles testing changes and cycles that involved collection/review of data. Four of the 47 studies contained an explicit prediction regarding the outcome of a

Figure 2 PRISMA diagram.


change; all 4 aimed to test a change (see online supplementary table S1).

Small-scale testing (n=47)
Scale was assessed in three ways: sample size, duration and complexity. Sample size refers to quantity of observations used to measure the change; duration refers to the length of PDSA cycle application; and complexity refers to the quantity of changes administered per cycle.

Sample size

Patient data, staff data and case data were used assamples within PDSA cycles. Twenty-seven articlesreported a sample size from at least one of theircycles. Twenty-one of these were isolated cycle studieswith sample size ranging from 7 to 2079(mean=323.33, SD=533.60). The remaining sixstudies reporting individual cycle sample sizes usediterative cycles; the sample size of the first cycles ofthese ranged from 1 to 34 (mean=16.75, SD=11.47).Two of these studies described the use of incrementalsample sizes across cycles, three used non-incrementalsample sizes across cycles, and one changed the typeof sample. Of the eight iterative cycle articles that didnot report individual cycle sample sizes, two did notdifferentiate sample sizes between cycles and insteadgave an overall sample for the chain of cycles and sixdid not report sample size.

Duration

Reported study duration of isolated cycles ranged from 2 weeks to 5 years (mean=11.91 months, SD=12.81). Only five articles describing iterative cycles explicitly reported individual cycle duration. Individual cycle duration could be estimated from the total duration of the PDSA cycle chain and the number of cycles conducted, resulting in approximate cycle lengths ranging from three cycles in 1 day to one cycle in 16 months (mean=5.41 months, SD=4.80, see online supplementary figure S2). The total PDSA cycle duration for series of iterative cycles (first to last cycle of one chain) ranged from 1 day to 4 years (mean=20.38, SD=20.39 months).

Complexity

Twenty-two articles reported more than one change being tested within a single cycle. Of the articles describing iterative cycles, 42% administered more than one change per cycle compared with 48% of the articles describing non-iterative PDSA cycles.

Data over time (n=47)

All studies used a form of qualitative or quantitative data to assess cycles. Studies were categorised according to four types of reporting quantitative data: regular (n=15), three or more data points with consistent time intervals; non-regular (n=16), before and after or per PDSA cycle; single data point (n=8), a single data point after PDSA cycle(s); and no quantitative data reported (n=8). Of the 15 articles that used regular data, only 7 used monthly or more frequent data intervals (see online supplementary figure S3 for full frequency of regular quantitative data reporting). No studies reported using statistical process control to analyse data collected from PDSA cycles. Eleven included analysis of data using inferential statistical tests (five of these studies collected isolated data, six involved continuous data collection).

Of the eight articles that did not report any quantitative data, two reported that quantitative analyses had taken place but did not present the findings and six described the use of qualitative feedback only (one non-regular, five single data point). Qualitative data were gathered through a range of mechanisms from informal staff or patient feedback to structured focus groups.

DISCUSSION

PDSA cycles offer a supporting mechanism for iterative development and scientific testing of improvements in complex healthcare systems. A review of the historic development and rationale behind PDSA cycles has informed the development of a theoretical framework to guide the evaluation of PDSA cycles against use of iterative cycles, initial small-scale testing, prediction-based testing of change, use of data over time and documentation.

Using these criteria to assess peer-reviewed publications of PDSA cycles demonstrates an inconsistent approach to the application and reporting of PDSA cycles and a lack of adherence to key principles of the method. Only 2/73 articles [37, 38] demonstrated compliance with criteria in all five principles. Assessment of compliance was problematic due to the marked variation in reporting of this method, which reflects a lack of standardised reporting requirements for the PDSA method.

[Figure 3: Iterative nature of cycles for all articles and split by plan–do–check–act and plan–do–study–act terminology.]

From the articles that reported details of PDSA cycles it was possible to ascertain that variation is inherent not just in reporting standards, but in the conduct of the method, implying that the key principles of the PDSA method are frequently not followed. Less than 20% (14/73) of reviewed articles reported the conduct of iterative cycles of change, and of these, only 15% (2/14) used initial small-scale tests with increasing scale as confidence in the intervention developed. These results suggest that the full benefits of the PDSA method would probably not have been realised in these studies. Without an iterative approach, learning from one cycle is not used to inform the next cycle, and therefore it is unlikely that interventions will be adapted and optimised for use in a particular setting. Furthermore, large-scale cycles risk significant resource investment in an intervention that has not been tested and optimised within that environment and risk producing 'false' negatives.

Only 14% (7/47) of articles reported use of regular data over time at monthly or more frequent intervals, indicating a lack of understanding around the use of the PDSA method to track change within a 'live' system, and limiting the ability to interpret the results from the study. Cycles that included an explicit prediction of outcomes were reported in only 9% (4/47) of articles, suggesting that PDSA cycles were not used as learning cycles to test and revise theory-based predictions.

Overall these results demonstrate poor compliance with key principles of the PDSA method, suggesting that it is not being used optimally. The increasing trend in using PDSA (as opposed to 'PDCA') cycles in recent years, however, does seem to have been accompanied by an increase in compliance with some key principles, such as use of iterative cycles. Deming was cautious over the use of the 'PDCA' terminology and warned it referred to an explicitly different process, referring to a quality control circle for dealing with faults within a system, rather than the PDSA process, which was intended for iterative learning and improvement of a product or a process [39]. This subtle difference in terminologies may help to explain the better compliance with key methodological principles in studies that refer to the method as 'PDSA'.

One of the articles identified in the search included comments by the authors that the PDSA method should be 'more realistically represented' [40], as ineffective cycles can be 'abandoned' early on, making it needless to go through all four stages in each iteration. These comments may provide insight into an important potential misunderstanding of the PDSA methodology. Ineffective changes will result in learning, which is a fundamental principle behind a PDSA cycle. However minor this abandoned trial may have been, it can still be usefully described as a PDSA cycle. A minor intervention may be planned (P) and put into practice (D). A barrier may be encountered (S), resulting in a decision being made to retract the intervention, and to do something differently (A).

The theoretical framework presented in this paper highlights the complexity of PDSA cycles and the underpinning knowledge required for correct application. The considerable variation in application observed in the reported literature suggests that caution should be taken in interpreting results from evaluations in which PDSAs are used in a controlled setting and as a 'black box' of QI. This review did not compare the effectiveness of use to reported outcomes and therefore this study does not conclude whether better application of the PDSA method results in better outcomes, but instead draws on theoretical principles of PDSAs to rationalise why this would be expected. Prospective mechanistic studies exploring the effective application of the method as well as study outcomes would be of greater use in drawing conclusions regarding the effectiveness of the method. The framework presented in this paper could act as a good starting point for such studies.

The fact that only peer-reviewed publications were assessed in this study means that results may be affected by publication bias. This is anticipated both in terms of what is accepted for publication but also the level and type of detail that is requested and allowed in typical publications (eg, before and after studies are more common than presenting data over time and this may make these types of studies easier to publish). Though QI work may be easier to publish now through recent changes in publication guidelines [27], possible publication outlets continue to be relatively limited.

To support systematic reporting and encourage appropriate usage, we suggest that reporting guidelines be produced for users of the PDSA method to increase transparency as to the issues that were encountered and how they were resolved. While PDSA is analogous to a scientific method, it appears to be rarely used or reported with scientific rigour, which in turn inhibits perceptions of PDSA as a scientific method. Such guidelines are essential to increase the scientific legitimacy of the PDSA method as well as to improve scientific rigour of application and reporting. Although the SQUIRE guidelines make reference to the potential use of PDSA cycles, further support to users and teachers, and publication of this improvement method, seems necessary. Consistent reporting of PDSA structure would allow meta-evaluation and systematic reviews to further build the knowledge of how to use such methods effectively and the principles to apply to increase chances of success.

It is clear from these findings that there is much room for improvement in the application and use of the PDSA method. Previous studies have discussed the influence of different context factors on the use of QI methods, such as motivation, data support infrastructure and leadership [20, 22, 41–43]. Understanding how high-quality usage can be promoted and supported needs to become the focus of further research if such QI methods are going to be used effectively in mainstream healthcare.

CONCLUSIONS

There is varied application and reporting of PDSAs and lack of compliance with the principles that underpin its design as a pragmatic scientific method. The varied practice compromises its effectiveness as a method for improvement and cautions against studies that view QI or PDSA as a 'black box' intervention.

There is an urgent need for greater scientific rigour in the application and reporting of these methods to advance the understanding of the science of improvement and efficacy of the PDSA method. The PDSA method should be applied with greater consistency and with greater accordance to guidelines provided by founders and commentators [25, 30, 44, 45].

Acknowledgements: The authors would like to thank Dr Thomas Woodcock for his valuable input into the theoretical framework and data analysis.

Contributors: All listed authors qualify for authorship based on making one or more of the substantial contributions to the intellectual content: conceptual design (MJT, CM, CN, DB, AD and JR), acquisition of data (MJT, CM and CN) and/or analysis and interpretation of data (MJT, CM, CN and JR). Furthermore all authors participated in drafting the manuscript (MJT, CM, CN, DB, AD and JR) and critical revision of the manuscript for important intellectual content (MJT, CM, CN, DB, AD and JR).

Disclaimer: This article presents independent research commissioned by the National Institute for Health Research (NIHR) under the Collaborations for Leadership in Applied Health Research and Care (CLAHRC) programme for North West London. The views expressed in this publication are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Competing interests: The authors declare no conflict of interest.

Provenance and peer review: Not commissioned; externally peer reviewed.

Open Access: This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 3.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/3.0/

REFERENCES

1. Øvretveit J. Does improving quality save money? A review of evidence of which improvements to quality reduce costs to health service providers. London: The Health Foundation, 2009.
2. Walshe K, Freeman T. Effectiveness of quality improvement: learning from evaluations. Qual Saf Health Care 2002;11:85–7.
3. Schouten LMT, Hulscher MEJL, van Everdingen JJE, et al. Evidence for the impact of quality improvement collaboratives: systematic review. BMJ 2008;336:1491–4.
4. Nicolay CR, Purkayastha S, Greenhalgh A, et al. Systematic review of the application of quality improvement methodologies from the manufacturing industry to surgical healthcare. Br J Surg 2012;99:324–35.
5. Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med 1998;128:651–6.
6. McCormack B, Kitson A, Harvey G, et al. Getting evidence into practice: the meaning of context. J Adv Nurs 2002;38:94–104.
7. Kaplan HC, Brady PW, Dritz MC, et al. The influence of context on quality improvement success in health care: a systematic review of the literature. Milbank Q 2010;88:500–59.
8. Oxman AD, Thomson MA, Davis DA, et al. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ 1995;153:1423.
9. Department of Health. Report of the High Level Group (HLG) on clinical effectiveness. London: Department of Health, 2007.
10. Greenhalgh T, Robert G, Macfarlane F, et al. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004;82:581–629.
11. Plsek PE, Wilson T. Complexity science: complexity, leadership, and management in healthcare organisations. BMJ 2001;323:746.
12. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50.
13. Powell AE, Rushmer RK, Davies HTO. A systematic narrative review of quality improvement models in health care. NHS Quality Improvement Scotland, 2009. Report No. 1844045242.
14. Boaden R, Harvey J, Moxham C, et al. Quality improvement: theory and practice in healthcare. NHS Institute for Innovation and Improvement, 2008.
15. Walshe K. Understanding what works—and why—in quality improvement: the need for theory-driven evaluation. Int J Qual Health Care 2007;19:57–9.
16. Shojania KG, Grimshaw JM. Evidence-based quality improvement: the state of the science. Health Aff 2005;24:138–50.
17. Auerbach AD, Landefeld CS, Shojania KG. The tension between needing to improve care and knowing how to do it. N Engl J Med 2007;357:608–13.
18. Ting HH, Shojania KG, Montori VM, et al. Quality improvement science and action. Circulation 2009;119:1962–74.
19. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med 2006;355:2725–32.
20. Benning A, Ghaleb M, Suokas A, et al. Large scale organisational intervention to improve patient safety in four UK hospitals: mixed method evaluation. BMJ 2011;342:d195.
21. Landon BE, Wilson IB, McInnes K, et al. Effects of a quality improvement collaborative on the outcome of care of patients with HIV infection: the EQHIV study. Ann Intern Med 2004;140:887–96.
22. Vos L, Duckers ML, Wagner C, et al. Applying the quality improvement collaborative method to process redesign: a multiple case study. Implement Sci 2010;5:19.
23. Grol R, Baker R, Moss F. Quality improvement research: understanding the science of change in health care. Qual Saf Health Care 2002;11:110–11.
24. Walshe K. Pseudoinnovation: the development and spread of healthcare quality improvement methodologies. Int J Qual Health Care 2009;21:153–9.
25. Deming WE. Out of the crisis, 1986. Cambridge, MA: Massachusetts Institute of Technology Center for Advanced Engineering Study xiii, 1991;507.
26. Øvretveit J. Understanding the conditions for improvement: research to discover which context influences affect improvement success. BMJ Qual Saf 2011;20(Suppl 1):i18–23.
27. Davidoff F, Batalden P, Stevens D, et al. Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care 2008;17(Suppl 1):i3–9.
28. Ogrinc G, Mooney S, Estrada C, et al. The SQUIRE (Standards for Quality Improvement Reporting Excellence) guidelines for quality improvement reporting: explanation and elaboration. Qual Saf Health Care 2008;17(Suppl 1):i13–32.
29. Imai M. The key to Japan's competitive success. New York: McGraw-Hill, 1986.
30. Langley GJ. The improvement guide: a practical approach to enhancing organizational performance. 1st edn. San Francisco: Jossey-Bass Publishers, 1996.
31. Batalden P. Building knowledge for improvement: an introductory guide to the use of FOCUS-PDCA. Nashville, TN: Quality Resource Group, Hospital Corporation of America, 1992.
32. Moen R, Norman C. Evolution of the PDCA cycle. 2006.
33. Speroff T, O'Connor GT. Study designs for PDSA quality improvement research. Qual Manag Health Care 2004;13:17–32.
34. Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med 2009;6:e1000100.
35. Bader MK, Palmer S, Stalcup C, et al. Using a FOCUS-PDCA quality improvement model for applying the severe traumatic brain injury guidelines to practice: process and outcomes. Reflect Nurs Leadersh 2002;28:34–5.
36. Reid D, Glascott G, Woods D. Improving referral information in community mental health. Nurs Times 2005;101:34–5.
37. Lynch-Jordan AM, Kashikar-Zuck S, Crosby LE, et al. Applying quality improvement methods to implement a measurement system for chronic pain-related disability. J Pediatr Psychol 2010;35:32–41.
38. Varkey P, Sathananthan A, Scheifer A, et al. Using quality-improvement techniques to enhance patient education and counselling of diagnosis and management. Qual Prim Care 2009;17:205–13.
39. Moen R, Norman C. Circling back: clearing up the myths about the Deming cycle and seeing how it keeps evolving. Qual Progress 2010;42:23–8.
40. Tomolo AM, Lawrence RH, Aron DC. A case study of translating ACGME practice-based learning and improvement requirements into reality: systems quality improvement projects as the key component to a comprehensive curriculum. Postgrad Med J 2009;85:530–7.
41. Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med 1998;128:651.
42. Benn J, Burnett S, Parand A, et al. Perceptions of the impact of a large-scale collaborative improvement programme: experience in the UK Safer Patients Initiative. J Eval Clin Pract 2009;15:524–40.
43. Parand A, Burnett S, Benn J, et al. Medical engagement in organisation-wide safety and quality-improvement programmes: experience in the UK Safer Patients Initiative. Qual Saf Health Care 2010;19:e44.
44. Berwick D. Broadening the view of evidence-based medicine. Qual Saf Health Care 2005;14:315–16.
45. Speroff T, James BC, Nelson EC, et al. Guidelines for appraisal and publication of PDSA quality improvement. Qual Manag Health Care 2004;13:33–9.


Article citation: Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, Reed JE. Systematic review of the application of the plan–do–study–act method to improve quality in healthcare. BMJ Qual Saf, published online September 11, 2013. doi:10.1136/bmjqs-2013-001862 (Open Access, CC BY-NC 3.0; supplementary material available from the journal).

APPENDIX 8 – RESEARCH GUIDE REVIEW – Example from RCPSC


APPENDIX 9 – BUILDING QUERIES IN ACCURO


Approved by Teaching Clinic Council, October 2014

Appendix C: Query Building and Testing: A How-To Guide

1. Procedure:

Step 1: Open the query builder: From the Menu Bar, go to Reports -> Alerts (Query Builder). The Alert Definition window will open.

Step 2: Name your query:

o Click the Add icon in the top-left corner under Definitions -> type a name for the query you are creating -> click OK.

The query name should include 3 elements:

o Clinic initials (FMC, KMC, NCMC)
o Name of query
o Something to identify it as your own, e.g., your initials, last name, etc.
E.g., FMC-Resident Experience Audit MK

o How to copy existing queries: Use the icon that represents two sheets of white paper -> re-name the query appropriately -> Click OK.

Step 3: Adding rules: Select the type of data you wish to query:

o Select a Category from the drop-down list under New Rule.
o Choose the Type of Data to constrain your rule by from the list of options below the category you selected.
o Click Add Rule.
o The rule will be added followed by the word Exists (e.g. Category EMR -> Encounter Note -> Encounter Note Exists).

Step 4: Add constraints:

o Click on New to add a constraint.
o Click Update Rule.
o The constraint should be visible in the Current Rules box (e.g. Encounter Note Title Contains 'Complete Physical').
o If you would like to add more constraints, click New again, specify the constraint, and update the rule.
o See section 2, "Key Constraints to Consider".


Step 5: Run the report:

o When you are ready to run the report, click Run Report.
o Under Fields to Display, double-click any of the green check marks next to fields that you do NOT want to be displayed in the results; they will change to red Xs.
o Under Match Types to View, leave the 'Unassigned' setting selected.
o Click Run.
o Wait for the results of your report to display in the output window.

Step 6: Manage your results/report:

From the results window there are several options available to you.

o Select an action from the drop-down list: The action you select will be applied to every patient that appears on the list. Examples include: create task, create bill, assign patient flag, etc.

o Completed: Sets the status for the task to be completed.

o Rerun: Takes you back to the previous screen where you can change Fields to Display and Match Types to View and then rerun the report.

o Print All: Allows you to print all results displayed.

o Export: Allows you to export the query results to Microsoft Excel.

2. Key Constraints to Consider:

a) Patient ‘Active’ Status:

In the EMR, an 'Active' status refers to a code in the field 'patient status', which is located in the patient section of Accuro under demographics. If you include this line in your query, the query will exclude any patients whose charts are inactive, such as patients who are deceased or who have left the practice. See PCOG <name, number> for definitions of patient status.

You may or may not want to include this as a constraint depending on your question.

To get 'truly active' patients, you would need to add a line for an appointment in the last 18 months.

b) Patient Records Only: If this box is selected, the query output constitutes a list of unique patients that meet the conditions of the query. This list does NOT take into consideration the number of instances that each patient meets the conditions of the query.

E.g. you want to figure out how many appointments Dr. Singer has provided in the last 6 months.

Patient Records Only Selected: there is a risk of undercounting because some patients may have seen Dr. Singer multiple times in the past six months, but if this box is selected the query will only pull their name once.


Patient Records Only NOT Selected: The results represent the total number of appointments Dr. Singer provided in the last six months. There may be some patient names that appear more than once on the list because of multiple return visits in that time period.

NOTE: When querying billing data, be sure to select the Patient Records Only box. In clinics with residents, every visit with a resident will have two bills associated with it [1]: one bill in the resident's name, which later gets 'no charged', and another bill in the preceptor's name, which gets sent to MB Health. If you do not check off Patient Records Only when querying billing data, you will get duplicate results for all visits with residents.

E.g. If you are looking for the number of patients at FMC that have hypertension and you use billing codes in your query to generate the list, make sure you select the 'Patient Records Only' box on every single line of your query; otherwise the query will retrieve every single bill that has ever been submitted with the code for hypertension, which could have a doubling or tripling effect on your results.
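The effect of the Patient Records Only box can be illustrated outside Accuro. The sketch below is a minimal, hypothetical Python illustration (made-up bill rows, not Accuro output): counting every row answers "how many bills/visits?", while counting unique patient identifiers mimics Patient Records Only and answers "how many patients?".

```python
# Minimal illustration of "Patient Records Only" using made-up data.
# Each row represents one bill; patient 101 was seen twice, and one of those
# visits generated both a resident bill and a preceptor bill.
bills = [
    {"patient_id": 101, "provider": "Resident A", "code": "401"},
    {"patient_id": 101, "provider": "Singer, Alexander", "code": "401"},
    {"patient_id": 101, "provider": "Singer, Alexander", "code": "401"},
    {"patient_id": 102, "provider": "Singer, Alexander", "code": "401"},
]

total_bills = len(bills)                                     # counts every bill, duplicates included
unique_patients = len({row["patient_id"] for row in bills})  # mimics "Patient Records Only"

print(f"All rows (bills): {total_bills}")      # 4
print(f"Unique patients:  {unique_patients}")  # 2
```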

c) Appointment Provider, Office Provider, and Bill Provider:

Appointment Provider: The provider who actually saw the patient.

Bill Provider: Selecting this will pull any bills generated for visits with residents that Dr. X precepted.

Office Provider: The provider to whom the patient is assigned.

[1] Based on teaching clinic workflows. This depends on clinic workflow for scheduling patients with residents.


Key points:

A query constrained based on office provider will retrieve all patients who have Dr. Singer as their assigned provider in the EMR, and who have a claim that was billed as 8400, 8401, or 8402.

A query constrained based on appointment provider will retrieve the number of patients that Dr. X saw and whose visits were billed as 8400, 8401 or 8402.

Run comparison: Part I with Part II: 86% of maternity related visits provided by Dr. Singer at FMC were to patients assigned to him in the EMR.
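The Run Comparison figure is simply the overlap between two result sets expressed as a percentage. A minimal sketch with invented counts (not real query output) shows the arithmetic:

```python
# Hypothetical counts only - the real numbers come from the two Accuro queries being compared.
appointment_provider_matches = 50  # visits where the provider was the appointment provider
also_office_provider_matches = 43  # subset of those visits for patients assigned to that provider

share = also_office_provider_matches / appointment_provider_matches
print(f"{share:.0%} of visits were to patients assigned to the provider in the EMR")  # 86%
```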

d) Doesn’t Match:

If this box is selected, under ‘Current Rules’ the rule will appear highlighted in red. This means that the query will only pull results that do not match the highlighted rule.

Check this box for each rule in your query that you do not wish results to be displayed for.

e) ‘Contains’ vs. ‘Equals’:

For most Types of Data listed under Category, there are additional options that can be used for further specification.

E.g. Encounter Note Title Contains ‘phone’ -> The query would pull only encounter notes that have titles which include the word ‘phone’, such as ‘telephone note’, ‘telephone call’, ‘phone call’ etc.

E.g. Encounter Note Title Equals ‘encounter note’ -> the query would pull only encounter notes whose title is exactly encounter note.


The ‘contains’ constraint can be helpful to hone in on results within a specific subtype of data with many possibilities, such as encounter note titles. It is recommended to use this constraint if the field being queried is not fixed in the EMR, and therefore the spelling or order of words could differ depending on the user.

The ‘equals’ constraint is more limiting because the query only retrieves results that exactly match what is written in the ‘equals’ box, taking into consideration capitalization, spelling, spacing and punctuation.

E.g. Encounter Note Title Equals ‘encounter note’ -> the query would only pull encounter notes whose title is exactly encounter note.

In the example shown above, the query would retrieve a much smaller number of encounter notes than expected, because many residents and preceptors modify the title of their encounter notes to reflect things such as the type of visit (e.g. complete physical male) or the reason for the visit (e.g. pneumonia). To retrieve all encounter notes in the EMR, it would be better not to constrain the query based on encounter note title.
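If it helps to see the difference outside the EMR, the sketch below mimics 'Contains' versus 'Equals' on a handful of invented note titles. Accuro's exact matching behaviour may differ in detail, so treat this only as an illustration of the concept.

```python
# Invented encounter note titles for illustration.
titles = ["Encounter Note", "Telephone Note", "phone call", "encounter note - pneumonia"]

# 'Contains' behaves like a substring match (case-insensitive here).
contains_phone = [t for t in titles if "phone" in t.lower()]

# 'Equals' behaves like an exact match, sensitive to capitalization and extra text.
equals_encounter = [t for t in titles if t == "encounter note"]

print(contains_phone)    # ['Telephone Note', 'phone call']
print(equals_encounter)  # [] - nothing matches the exact string 'encounter note'
```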

3. Using Advanced Logic (AND vs. OR conditions):

It is important to note that without modifying any of the settings in the query builder, there is already a built in ‘AND’ condition between each line that is added to the Current Rules box.

E.g. if a query consists of four lines, the results will reflect only patients who meet the conditions specified on line 1 and 2 and 3 and 4.

To build 'OR' conditions into the same line of a query: click on the line you wish to add more options to -> click 'New' -> select the type of data (e.g. 'Bill Procedure Code' = '8400') -> repeat these steps to continue adding new options (e.g. '8401' OR '8402').

To build a query that has an 'AND' and an 'OR' condition specified between separate lines:

USE ADVANCED LOGIC.

E.g. if you have five lines in your query and you want the results to display only patients that meet the conditions specified on lines 1 and 2 and 3 AND either 4 OR 5, you would need to use advanced logic.


1 AND 2 AND 3 AND (4 OR 5)

1. Office Contains 'Office Name'
2. Office Provider = 'Last name, First name'
3. Patient Status = 'Active'
4. Prescription Start Date Between 2011-Dec-18 and 2013-Jun-18 AND Prescription Classification = 'VARENICLINE' (Patient Records Only)
5. Lifestyle History Contains 'Tobacco Use' (Patient Records Only)

o Under Options…, check the box that says Use Advanced Logic.
o Click OK.
o Click the pencil icon beside the rule at the top of the query window.
o Write out the logic that you would like the query builder to follow, using brackets "( )" to specify 'OR' conditions.
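The advanced-logic expression is ordinary Boolean logic. The sketch below walks through how "1 AND 2 AND 3 AND (4 OR 5)" evaluates for a single hypothetical patient; the record and field names are invented for illustration and are not Accuro's internal names.

```python
# Hypothetical patient record; field names are illustrative only.
patient = {
    "office": "Family Medical Centre",
    "office_provider": "Singer, Alexander",
    "status": "Active",
    "has_varenicline_rx_in_window": False,
    "lifestyle_history_tobacco_use": True,
}

rule1 = "Family Medical" in patient["office"]            # Office Contains 'Family Medical'
rule2 = patient["office_provider"] == "Singer, Alexander"
rule3 = patient["status"] == "Active"
rule4 = patient["has_varenicline_rx_in_window"]
rule5 = patient["lifestyle_history_tobacco_use"]

# 1 AND 2 AND 3 AND (4 OR 5)
matches = rule1 and rule2 and rule3 and (rule4 or rule5)
print(matches)  # True - rule 5 satisfies the bracketed OR even though rule 4 is False
```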

4. Testing queries and building confidence:

Data quality varies in the EMR and is generally poor at this time. As with any database, there are limitations. Therefore, no matter how careful you are, your query may not retrieve all patients who fit your inclusion criteria and/or your results may include patients that should have been excluded.

The following tips may help build confidence into your query results:

Use your knowledge of the clinic demographic - do the results seem reasonable?

Verify that no results are double counted in the query output.

Take a sample of records and manually check them against what is recorded in the EMR for each patient.

Where possible, build another query to help validate results.

o Select ‘Run Comparison’

o From the drop down menu, select the two queries you would like to compare against each other (checking for similarities in results).

o Click ‘Ok’

o A message will display the results of the comparison.

Consult a colleague who may have more experience in query building – go over your query and the results together.
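If you have exported your results to CSV (see "How to Export Queries" below), a few lines of Python can support two of the checks above: spotting double-counted patients and drawing a small random sample for manual chart verification. The file name and column name in this sketch are assumptions; adjust them to match your own export.

```python
import csv
import random

# Assumed file and column names - change these to match your own export.
with open("query_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Check for double counting: the same patient appearing on more than one row.
ids = [row["Patient ID"] for row in rows]
duplicates = {pid for pid in ids if ids.count(pid) > 1}
print(f"{len(rows)} rows, {len(set(ids))} unique patients, duplicates: {duplicates or 'none'}")

# Draw a small random sample to verify manually against the EMR.
sample = random.sample(rows, k=min(10, len(rows)))
for row in sample:
    print(row["Patient ID"])
```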

5. Additional Examples: Hypertension:


a) Part I: Question: How many of Dr. Singer’s patients have been billed for hypertension in the year 2014?

*this doesn’t necessarily mean that Dr. Singer’s name was on all of those bills.*

Office Contains ‘Family Medical’ (Patient Records Only)

Office Provider = ‘Singer, Alexander’ (Patient Records Only)

Status = ‘Active’ (Patient Records Only)

Bill Date Between 2014-Jan-01 and Today’s Date AND Bill Diagnosis Code Starts with ‘401’ (Patient Records Only)

# Matches:

b) Part II: Question: How many of Dr. Singer’s patients have been billed for hypertension in the year 2014 by Dr. Singer or one of the residents that he precepted?

Office Contains ‘Family Medical’ (Patient Records Only)

Office Provider = ‘Singer, Alexander’ (Patient Records Only)

Status = ‘Active’ (Patient Records Only)

Bill Date Between 2014-Jan-01 and Today’s Date AND Bill Diagnosis Code Starts with ‘401’ AND Bill Provider = ‘Singer, Alexander’. (Patient Records Only)

# Matches:

c) Part III: Question: How many of Dr. Singer’s patients who have been billed for hypertension in the year 2014, also have it recorded in their problem list?

Office Contains ‘Family Medical’ (Patient Records Only)

Office Provider = ‘Singer, Alexander’ (Patient Records Only)

Status = ‘Active’ (Patient Records Only)

Bill Date Between 2014-Jan-01 and Today’s Date AND Bill Diagnosis Code Starts with ‘401’ (Patient Records Only)

Diagnosis Diagnosis Code Starts With ‘401’ (Patient Records Only)
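Part III is effectively the overlap between "billed for hypertension" and "hypertension recorded on the problem list". If you export the two patient lists, the overlap (and the documentation gap) can be checked with basic set logic; the patient IDs below are invented for illustration only.

```python
# Invented patient ID sets - in practice these come from exporting the Part I query
# and a problem-list query, then comparing the Patient ID columns.
billed_htn = {101, 102, 103, 104, 105}   # billed with a 401 diagnosis code in 2014
problem_list_htn = {102, 103, 105, 108}  # hypertension recorded on the problem list

both = billed_htn & problem_list_htn          # Part III: meets both conditions
billed_only = billed_htn - problem_list_htn   # billed but not documented on the problem list

print(f"Billed and on problem list: {sorted(both)}")                   # [102, 103, 105]
print(f"Billed but missing from problem list: {sorted(billed_only)}")  # [101, 104]
```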

6. Screen Shots:

Example Query - Hypertension

1. From the Menu Bar, click on Reports -> Alerts (Query Builder). The Alert Definitions window opens.

2. Click the Add icon in the top-left corner under Definitions.


3. Type a name for the definition.

4. Click OK.

5. From the Category drop-down list, select Demographics.

6. From the Category list, select Office.

7. Click the Add Rule button to add the rule to the Current Rules for the definition -> (Office ‘Exists’)

8. Click the New button to specify the name of the office.

9. Select Contains from the drop-down menu -> Type the name of the office (Family Medical Centre) and check off the Patient Records Only box.

10. Click Update Rule.

11. Repeat steps 5-10 to add Office Provider(s) AND Patient Status = ‘Active’

12. From the Category drop-down list, select Billing.

13. From the Category list, select Bill.

14. Click the Add Rule button to add the rules to the Current Rules for the definition -> (Bill 'Exists').

15. Click the New button to specify the Bill Date (e.g. Between Jan 1, 2014 and Jun 30, 2014) and check off the Patient Records Only box.

16. Click Update Rule

17. Click the New button to specify the Bill Diagnosis Code.

18. Pick Starts With from the drop-down list and type ‘401’.

19. Click Update Rule

20. Click on the Run Report button. The Check for Alert Matches dialogue window will open.

21. Ensure that Unassigned is checked off under Match Types to View.


22. Click the Run button. The Alert Matches dialogue window opens with the results of your query.

How to Export Queries

1. Click Export from the results window.

2. Click Computer.

3. Double Click on C$(\\Client) (V:).

4. Open Documents & Settings folder.

5. Find and open the folder with your Username.

6. Save the query as a CSV file type to a folder within your username folder (e.g. My Documents). DO NOT SAVE THE QUERY ON THE DESKTOP.

NOTE: Your folder options might be different, e.g., "Users", then <your name>.

How to Manage Queries

1. Open the query export.


2. To comply with PHIA policies, keep only the minimum amount of information you need in order to analyze the data. Delete all columns of patient data that you do not require.

3. De-identify the information if possible.

4. Save the file as an Excel Workbook.

5. Delete the original CSV Export.

6. Encrypt the Excel workbook: under the File Tab -> Info -> Permissions -> Protect Workbook -> Encrypt with Password -> type password -> Save.
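If you prefer to script the clean-up, the sketch below performs the PHIA minimum-necessary step (keeping only the columns you need) on the CSV export; the file and column names are assumptions to adapt. Saving as an Excel workbook, encrypting, and deleting the original should still be done as described in steps 4–6 above.

```python
import csv

# Assumed file and column names - adjust to match your own export.
KEEP = ["Patient ID", "Bill Diagnosis Code", "Bill Date"]  # minimum needed for the analysis

with open("query_results.csv", newline="") as src, \
     open("query_results_minimal.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=KEEP)
    writer.writeheader()
    for row in reader:
        # Drop every column not explicitly listed in KEEP (PHIA minimum-necessary principle).
        writer.writerow({col: row[col] for col in KEEP})
```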


APPENDIX 10 – FLOW DIAGRAM EXAMPLE

EXAMPLE: Your results could include the following in a flow diagram:

o 900 patient charts identified by inclusion criteria.
o 100 patient charts excluded based on exclusion criteria.
o 800 patient charts remaining – identifying the eligible population.
o 100-chart sample extracted by quasi-random process to form the study population.
o 30 patient charts identified by field-specific EMR markers.
o 20 patient charts without field-specific EMR markers but with free-text documentation of adequate foot exams.
o 50 patient charts without any identified documentation of adequate foot exams.
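The flow-diagram counts should be internally consistent, with each stage summing to the one above. The short sketch below simply reproduces the example numbers and checks that they add up; substitute your own audit counts.

```python
# Example counts from the flow diagram above - replace with your own audit numbers.
identified = 900
excluded = 100
eligible = identified - excluded   # 800 charts remaining
sampled = 100                      # quasi-random sample drawn from the eligible population
emr_marker = 30                    # documented via field-specific EMR markers
free_text_only = 20                # documented in free text only
no_documentation = 50              # no documentation of an adequate foot exam

assert eligible == 800
assert emr_marker + free_text_only + no_documentation == sampled  # every sampled chart is accounted for

documented = emr_marker + free_text_only
print(f"Documented adequate foot exams: {documented}/{sampled} ({documented / sampled:.0%})")  # 50/100 (50%)
```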


APPENDIX 11 – ARTICLE – PREPARING AND DELIVERING A 10-MINUTE PRESENTATION AT A SCIENTIFIC MEETING


Paediatric Respiratory Reviews 12 (2011) 148–149

Research: from concept to presentation

Preparing and delivering a 10-minute presentation at a scientific meeting

Trisha Greenhalgh a,*, Johan C. de Jongste b, Paul L.P. Brand c

a Professor of Primary Health Care and Director, Healthcare Innovation and Policy Unit, Centre for Health Sciences, Barts and The London School of Medicine and Dentistry, Abernethy Building 2, Newark Street, London E1 2AT
b Professor of Pediatric Respiratory Medicine, Department of Pediatrics, Erasmus University Medical Center – Sophia Children's Hospital, PO Box 2060, 3000 CB Rotterdam, The Netherlands
c Consultant paediatrician (respiratory/allergy), Princess Amalia Children's Clinic, Isala klinieken, PO Box 10400, 8000 GK Zwolle, The Netherlands


Know your audience. Craft your message. Use technology creatively. Engage your listeners.

KNOW YOUR AUDIENCE

It is tempting to start with the question "What am I going to say?". Resist this temptation. Start with the question "Who are my audience and what would they want to know about this topic?". Ideally, think of a single, known individual who typifies the sort of person who will be in your audience, and write your presentation for that person.

If your session is a relatively low-key, generic 'research-in-progress' session, your audience may consist mainly of PhD students or junior researchers. They may be unaware of the policy background and basic science behind your particular study. On the other hand, if you are presenting your study in a themed session at a national meeting, it is likely that most people in your audience will be aware of the state of research in the field and may already have heard the policy background from other speakers. If your session is a presentation to clinicians, they may be relatively uninterested in the research literature but keen to hear how your findings could affect the management of patients in their clinic. At a local or national seminar, you can probably assume that the audience will know how the NHS works and what common acronyms mean; at an international seminar you cannot assume this.

CRAFT YOUR MESSAGE

Ten minutes means a maximum of ten slides, including your introduction and conclusion. You will find it extremely difficult to keep to ten slides, but keep to them you must, or you will go over time and engender resentment in your audience. If people in your audience are interested in your work, they will email you and you can send more details. Don't worry at this stage about graphics or layout, just get some words down and return to re-format them later. If you prefer, do your outline with pencil and paper or as a Word document, and transfer it to Powerpoint later.


Slide One. Include a title for your presentation and your name, position, organisation and collaborators. Even when you are presenting alone, briefly acknowledge people who have contributed to the work. If pushed for time, you do not need to read their names out. If you are part of a large collaboration, say something like "full list of collaborating centres available on request". Try to convey in your title that your study is important, original and could change practice. This tactic can help draw people to attend your talk – after all, would you prefer to attend a session called "A study of A versus B in children with pneumonia" or "Keeping kids out of hospital: Early results from a randomised trial of a new antibiotic, drug B, in community acquired pneumonia"?

Slide Two. Describe in very broad terms the background to your work and why the study matters. If you are speaking to clinicians, talk about a common patient problem and how your work could improve outcomes in this area. If speaking to senior researchers, refer very briefly to leading studies in the field (they will be familiar with these so you only need give the study acronym, lead author name and year, e.g. BITS trial, Patel 2009). Do not list every relevant study.

Slide Three. Give one aim and three or four key objectives of the study you are presenting. The aim should be very general (e.g. "To build the evidence base for management of community-acquired pneumonia in school-age children"). The objectives should list specific tasks you have undertaken or plan to do (e.g. "In collaboration with general practitioners, recruit a sample of children with community-acquired pneumonia", "Randomise them to receiving Antibiotic A or Antibiotic B", "Measure duration of illness and complication rate").

Slides Four and Five. Give more details on the methods. If your study is a clinical trial, for example, use the PICOT (population-intervention-comparison-outcome-time) acronym and list the inclusion and exclusion criteria. A flow chart diagram is a good way to convey a lot of data in a small space.

Slides Six and Seven. Give your main results. Start with a table of participants (don't call them "subjects" as it's disrespectful), showing age, gender, ethnicity and relevant baseline variables (e.g. body mass index, peak flow rate). Show the results of your primary outcome measure (e.g. mean duration of illness in the example above). Highlight any unexpected or alarming findings, e.g. new adverse reactions.


Slide Eight. Give one or two key conclusions (e.g. "This study has shown that Antibiotic A and Antibiotic B appear to be therapeutically equivalent in the management of community-acquired pneumonia in children."). Be restrained here: you will come across as a poor scientist if you conclude too much from your findings.

Slide Nine. There are two options here. If you are sure where you are heading next, call this slide 'Future Directions' and suggest one possible follow-on study. Do not use the clichéd expression "more research is needed". If you are an MSc student, flag an idea for a PhD – and if you are a PhD student, flag an idea for a postdoctoral research bid. If you are presenting to a work-in-progress meeting and are not sure where to take your work next, it is quite in order to head the slide "Questions/Issues for discussion" and be honest about your uncertainties. For example, list "Concerns about losses to follow-up", "Possible further analysis by subgroup", "Unsure whether correct statistical tests were used", and invite discussion.

Slide Ten. Put the message "Thank you for your attention" and list up to five key references. This slide will remain on the screen while you answer questions, so add information such as your email address or phone number if you want to build contacts.

USE TECHNOLOGY CREATIVELY

Having outlined your slides, you now need to make them. If you cannot use Powerpoint, you need to learn. Ask your boss or a senior colleague whether there is a 'house style' for your department or organisation. If there is, you will be given some ready-formatted slides with a standardised background and header and/or footer. If not, you might like to select a 'Slide design' from the Powerpoint menu.

There are five don’ts and five do’s in using Powerpoint in ten-minute presentations.

Five Powerpoint don’ts:

1. Don't overuse the functionality of the software. Too many technological tricks will distract attention from the science of your presentation and make you look like a show-off.

2. Don't set your bullet points to uncover line by line. The human brain is designed to take in an overview while at the same time focusing on the detail of what is being attended to. Denying your audience this opportunity will make them feel frustrated and patronised.

3. Don't assume that the computer from which you will run your presentation will have the same version of Powerpoint as the one you are currently using. Many lecture theatres are using obsolete versions of Microsoft Office. Hence, heed any pop-up warnings that certain functions may not work as expected. Try out your slides on a lowest-common-denominator programme.

4. Don't reduce the font size if you can't fit the words on the slide. Instead, edit your text down. You should not put down every word you want to say. If you wish, make a separate set of notes to read out.

5. Don’t use clip-art. It’s tatty.

Five Powerpoint do’s:

1. Keep font size strictly standardised. Your presentation will irritate people if each slide has a different font size for the text. Use 36-40 pt for headings and 24-28 pt for text.

2. Use simple functionality to add a professional touch. For example, select 'slide transition', 'fade through black' and 'apply to all'.

3. Use as many high-quality images as you can. People remember 10 percent of what they hear but 50 percent of what they see. Browse websites of royalty-free images (put this term into Google), and see if you can find photographs, maps or other images to accompany your message. Most such images will cost you no more than a pound or two to download. Add a photograph of yourself and your team to the first or last slide. Take your own pictures, e.g. of consenting patients holding bottles of medication. Get patient consent to reproduce their X-rays or scans, and use black rectangles from the drawing toolbar to remove identifying details.

4. If you have published a paper on this topic, open a pdf file of the paper and use the snapshot tool ('Tools', 'Select', 'Snapshot') to clip a rectangle from the front page (e.g. the title and a few lines of the abstract). Reduce this in size so you can include it on your last slide.

5. Make simple graphs by entering data directly into the Powerpoint programme. The keystrokes 'Insert', 'Chart' and 'Chart type' will get you to a menu where you can select a pie chart, histogram, scatter plot and so on. Putting in some data from scratch will produce a much more professional image than cutting and pasting the output of a programme such as SPSS.

ENGAGE YOUR LISTENERS

Most people attending most conferences are already bored. Passive listening is tiring. A speaker who engages directly with the audience will be particularly welcome. A good opening move, after thanking your chairperson for introducing you, is to ask the audience for a show of hands on a simple question – for example, "Can I just ask how many of you prescribe Drug B in your own practice?". You can then personalise your talk as you go along, adding phrases like "As some of you will know, this drug colours the urine…".

If you are confident (e.g. if you've given this presentation before and know the lines off by heart), you could engage the audience in interpreting the graphs or tables. Use a laser pointer and ask "Can anyone suggest what's happening here?". If you plan to try this approach, test it out on some colleagues beforehand in case their suggestions are not what you were expecting. Over-zealous efforts at audience participation can backfire. People generally want to contribute briefly, but not to play contrived games.

When you invite questions, thank anyone who offers one. If you have no idea of the answer, be honest and admit it, and then quickly ask whether anyone in the audience can help you out. There may be a professor in the back row who is itching to demonstrate his or her superior knowledge, and you will get the reflected glory of having invited the answer! Some people ask questions because they are genuinely interested in your work, and you will find their questions easy and interesting to answer. Others put their hands up mainly to flag their own work – and tend to ask very long and slightly irrelevant questions. A good response to a question which appears irrelevant is "I'm afraid that was beyond the scope of my study". If you think someone is using question time to score points off you, give the chair a sidelong glance – or say "Thanks very much for that comment, does anyone else have any questions".

At the end of your talk, remember to thank the chair. If you don't have business cards, make some beforehand by printing your name, affiliation and email along with a short abstract of your talk. You will probably find that a few people are interested in following up. The conference bar that same evening might be a good place to begin networking!