
Hospital Report

Jim Dameron, Administrator
Leslie Ray, PhD, Field Coordinator
Sydney Edlund, Data Analyst

August 2010


INTRODUCTION

NORTH STAR: A PROGRESS REPORT
  Table 1: North Star Summary 2009-2010
  North Star Measure #1: Eliminate Harm
    Figure 1: Number of Retained Objects
  North Star Measure #2: Adopt Evidence-Based Guidelines
    Table 2: Percent of Patients Whose Care Met All 3 SCIP Antibiotic Measures
    Table 3: Oregon's SCIP Scores Compared to National Average
  North Star Measure #3: Assess Risk; Learn from Experience
    Figure 2: Adverse Event Reports Submitted by Year
  North Star Measure #4: Improve Cultures of Safety within Hospitals
  North Star Measure #5: Role of Patients and Consumers
  North Star Measure #6: Create Interconnected Systems of Care
    Table 4: Electronic Medical Record (EMR) Adoption Score

ADVERSE EVENT REPORTING
  Hospital Participation
  Frequency of Adverse Event Reporting
    Figure 3: Number of Submitted Adverse Event Reports, by Year
    Figure 4: Consistency of Hospital Reporting Over Time
  Types of Adverse Events
    Table 5: Adverse Events by Type, 2009
  CMS-Defined Healthcare Acquired Conditions (HAC)
    Table 6: Comparison of CMS and Oregon Patient Safety Events
  What Causes Adverse Events: Contributing Factors
    Table 7: Contributing Factor Categories by Most Common Event Types
  Harm Levels of Reported Events
    Figure 5: Harm Levels
    Figure 6: Harm Levels of Reported Events
  Adverse Events Associated with Death
    Table 8: Reports That Resulted in Death, by Year
    Figure 7: Number and Types of Death Events Reported by Hospitals in 2009
  Characteristics of Adverse Event Reports
    a) The Hospital Review Team
      Table 9: Review Team Members
      Table 10: Review Teams Including a Physician or Pharmacist, by Hospital Size
      Table 11: Review Teams Including a Physician or Pharmacist, by Event Type
    b) Leadership Engagement
      Figure 8: Leadership Involvement in Adverse Events
    c) Report Quality
      Figure 9: Report Acceptability and Quality
    d) Action Plans
      Strategy – "Do better"
      Strategy – "Rewrite the policy"
      Strategy – "Change the practice environment"
    e) Written Notification
      Figure 10: Frequency with Which Written Notification Was Provided When Required, by Facility Size
  Using Reporting to Drive Change
    a) Preventing Retained Objects
    b) Improving the Use of Medical Devices
    c) Eliminating Falls

OTHER PATIENT SAFETY INITIATIVES
  Safe Surgery Checklist
  National Surgical Quality Improvement Program (NSQIP)
  Hospital Collaborative to Reduce Health Care Acquired Infections

NEXT STEPS
  1. Champion North Star
  2. Improve the Usefulness of Adverse Event Reporting
  3. Increase the Use of Evidence-Based Best Practices
  4. Work with the Nine Hospitals in the Commission's HAI Collaborative
  5. Transfer Quality Improvement Ideas from Hospitals to Other Healthcare Settings

APPENDIX A: NORTH STAR MEASURES

APPENDIX B: MEASUREMENTS


Oregon Patient Safety Commission Hospital Report

Introduction

The Patient Safety Commission has a single mission: to improve safety by reducing the risk of patient harm in Oregon's healthcare system. We offer practical ways to learn from mistakes, to eliminate unnecessary variation, and to introduce and champion best practices. We work with hospitals, nursing homes, ambulatory surgery centers, renal dialysis centers, and community-based pharmacies.

This report offers a partial snapshot of hospital patient safety efforts in Oregon. While hospitals typically have their own strong quality agendas, we will focus on five initiatives organized (or co-organized) by the Patient Safety Commission. We’ll assess whether we are working on the right problems and whether we have seen any results. The five initiatives are:

1. Promoting an audacious “North Star” goal as a way to challenge Oregon hospitals to become the safest in the nation.

2. Collecting, maintaining, and mining data from the Commission’s voluntary and confidential adverse event reporting program in order to improve patient safety practices.

3. Championing specific patient safety initiatives, such as implementation of the Safe Surgery Checklist.

4. Providing technical assistance to the eight Oregon hospitals that participate in the National Surgical Quality Improvement Program (NSQIP).

5. Running a nine-hospital collaborative to reduce targeted infections by up to 30%.


North Star: A Progress Report

In 2008 the Patient Safety Commission’s Board of Directors issued its North Star goal: Oregon will have the safest healthcare delivery system in the nation. The goal is audacious; it is meant to inspire, to declare progress is possible, and to provide a clear sense of direction. North Star, simple as it is, boldly offers a new strategy:

• First, North Star creates a sense of urgency. Too many people are harmed when seeking care. By most quality and safety metrics, Oregon offers ‘average’ care at best. We can do better.

• Second, the Commission seeks to develop a shared ownership of the problem and the solutions. Behind North Star is a belief in complex adaptive systems. It attempts to define "simple rules, good enough vision, and a wide space for innovation." This is shared work. To succeed, Oregon's healthcare leaders must embrace the challenge.

• Third, North Star puts a focus on patient-centered care. We all need to live and breathe 'What's best for the patient, independent of healthcare setting.'

• Fourth, we are helping to align efforts. North Star sets priorities.

• Fifth, North Star provides a framework for benchmarking progress.

With regard to benchmarking, the Commission created a measurement tool that ‘unpacks’ safety into six dimensions, each of which can be expressed as a question:

1. Can we demonstrate that we are eliminating preventable harm events?

2. Are we using evidence-based best practices?

3. Are we assessing risk and learning from experience?

4. Do we have a culture that supports learning and improvement?

5. Are patients and consumers playing an active role?

6. Are we working to create connected systems of care?

By addressing each question we make progress toward becoming the safest state in the country. The Commission’s approach offers a unified philosophy that balances outcome and process measures, highlights the importance of organizational culture and learning, places the patient/consumer in the middle of the equation, and offers a vision of connected systems replacing stand-alone silos.

Following is a progress report for five of the six patient safety dimensions (we currently lack data for patient/consumer involvement). In general, our North Star findings suggest:

• Oregon's patient safety scores have improved. However, we have not made much progress relative to many other states (almost all states are improving, and Oregon is not improving any faster than they are).

• Overall, the Patient Safety Commission’s measurement scheme makes good sense. Many hospitals have adopted the framework for their own use.


• There is a delicate balance between inspiring people to be the best and frustrating them with unobtainable goals.

• North Star, in part because of the measurement scheme, is playing a useful role in promoting change.

Table 1: North Star Summary 2009-2010

North Star Dimension | Measure | Results
1. Eliminate preventable harm | Number of retained objects | No Improvement
2. Use evidence-based practice | Scores for SCIP 1-3 | Improvement
3. Assess risk | Volume of reporting to Commission | Uncertain
4. Have culture of safety | % Hospitals using culture survey | Improvement
6. Create connected systems | Electronic Medical Record Adoption Score | Improvement

Note: Please see Appendix A for a more complete technical discussion of these North Star measures, including an assessment of the limitations of current measures and plans for future measures.

North Star Measure #1: Eliminate Harm

The goal of patient safety is to eliminate harm. Therefore, our measurement approach begins with a concrete outcome measure. We’ve chosen to track the number of ‘retained objects’ accidentally left after surgery. These events are relatively easy to count and represent a large class of events reported to the Patient Safety Commission.

Figure 1: Number of Retained Objects

Definition/Data Source: "Foreign body accidentally left during a procedure not elsewhere classified" (ICD-9-CM 998.4). Office of Health Policy and Research, hospital discharge data.

[Chart: number of retentions per year, 2002-2009; y-axis: Number of Retentions, scale 20-60.]


Results: While the number of events has declined since 2007, the longer-term pattern is less clear. The goal is zero retained objects. Hospitals need to address this problem more forcefully by implementing best practices (see "Preventing Retained Objects" later in this report).

North Star Measure #2: Adopt Evidence-Based Guidelines

Evidence-based best practices represent "medical decisions, guidelines, and policies consistent with good evidence of effectiveness and benefit." Evidence-based means that clinical decisions are grounded in scientific research, not organizational convenience or professional custom. For our North Star measure we've used three best practices aimed at eliminating surgical site infections within hospitals. These are called SCIP measures (for Surgical Care Improvement Project):

• SCIP #1: Percent of patients who receive preventive antibiotic within one hour prior to surgical incision;

• SCIP #2: Percent of surgery patients who receive the appropriate preventive antibiotic(s) for their surgery;

• SCIP #3: Percent of patients who have preventive antibiotic discontinued within 24 hours after surgery.

To be most effective, it is important that each patient receive all three best practices.
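As an illustration of this all-or-none way of scoring (a hypothetical sketch with invented patient records, not the Commission's or Acumentra Health's actual calculation), a patient counts toward the composite only if every one of the three SCIP measures was met:

# Hypothetical illustration of an all-or-none composite for the three SCIP
# antibiotic measures; the patient records below are invented.

patients = [
    # (SCIP #1 met, SCIP #2 met, SCIP #3 met)
    (True, True, True),
    (True, True, False),   # antibiotic not discontinued within 24 hours
    (True, False, True),   # wrong antibiotic selected
    (True, True, True),
]

def all_or_none_rate(records):
    """Share of patients for whom every SCIP measure was met."""
    compliant = sum(1 for r in records if all(r))
    return compliant / len(records)

def per_measure_rates(records):
    """Compliance rate for each individual SCIP measure."""
    n = len(records)
    return [sum(r[i] for r in records) / n for i in range(3)]

print(f"All-or-none compliance: {all_or_none_rate(patients):.0%}")      # 50%
print("Individual measure rates:",
      [f"{rate:.0%}" for rate in per_measure_rates(patients)])          # 100%, 75%, 75%

Note how the composite in this toy example (50%) sits well below each individual measure's rate; an all-or-none score is a stricter test than the separate SCIP rates.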

Results: As Table 2 demonstrates, Oregon hospitals have made real improvements, moving from 75% compliance with all three SCIP measures at the beginning of 2008, to 87% by the end of 2009.

Table 2: Percent of Patients Whose Care Met All 3 SCIP Antibiotic Measures

Hospital Group | 2008 Q1 | 2009 Q1 | 2009 Q2 | 2009 Q3 | 2009 Q4
All Oregon PPS and CAH Hospitals | 75% | 75% | 85% | 87% | 87%
Oregon Benchmark | 91% | 91% | 97% | 98% | 98%

Data Source: Acumentra Health

However, national data also suggest that Oregon isn’t among the country’s best performers. According to Hospital Compare, Oregon ranks in the middle of the pack.

Table 3: Oregon’s SCIP scores compared to National Average

SCIP Measure | Oregon's Score | National Average | Number of States with Better Performance than Oregon
SCIP #1 - Percent of patients who receive preventive antibiotic within one hour prior to surgical incision | 92% | 92% | 29
SCIP #2 - Percent of surgery patients who receive the appropriate preventive antibiotic(s) for their surgery | 96% | 95% | 18
SCIP #3 - Percent of patients who have preventive antibiotic discontinued within 24 hours after surgery | 90% | 90% | 25


North Star Measure #3: Assess Risk; Learn from Experience

A critical tenet of the patient safety movement is that we can only improve if we identify errors, systematically work to find root causes, and then openly discuss solutions. This requires a reporting program. Willingness to identify/share findings from adverse events is a powerful force for change.

Our North Star measure monitors hospital participation in the Commission's reporting program. Success means that more hospitals report more events. It is worth noting that consumers often find this "more-is-better" logic counterintuitive. However, the Commission views an increase in reporting as an indicator of a hospital's commitment to patient safety and of its ability to integrate our program into its everyday routine.

Results: In the four years the Commission's hospital reporting program has been operational, we have seen a slow but steady increase in the number of reports submitted. 2009 was no exception, with an 8% increase in the number of reports compared to 2008. That said, we need to inject a cautionary note: so far in 2010 we have seen a decline in reporting (see the Frequency of Adverse Event Reporting section below).

Figure 2: Adverse Event Reports Submitted by Year
[Chart: 53 reports in 2006, 85 in 2007, 118 in 2008, and 127 in 2009.]

North Star Measure #4: Improve Cultures of Safety within Hospitals

A fundamental issue for patient safety is whether organizations can quickly learn from their own mistakes. This requires better teamwork, a willingness to engage in open discussion, and a sense of trust on the part of health care workers that they will not be punished for speaking out or for challenging colleagues. One way hospitals can measure their own progress is by using a “culture of safety” survey. As an initial measure of statewide progress, we determined the number of hospitals that routinely monitor their safety culture by using a facility-wide survey.

Results: According to the Oregon Association of Hospitals and Health Systems, 57 of 61 hospitals (including Vibra, Shriners and Cedar Hills) have completed or are planning a culture survey in 2010. This 93% is up from an estimated 80% in 2008. While this represents real improvement, our target remains 100%.



North Star Measure #5: Role of Patients and Consumers

We currently lack data for this measure. Please see Appendix A.

North Star Measure #6: Create Interconnected Systems of Care

This dimension highlights the need for coordinated care. Patients treated in isolation, without complete medical information, are more likely to suffer harm. In addition, without integrated information it is difficult for organizations to learn from their mistakes.

Having an Electronic Medical Record (EMR) is a critical component of a connected healthcare delivery system. With a functioning EMR, hospitals can quickly share clinical information with all necessary providers and with all appropriate entities within a network. A fully integrated EMR also allows hospitals to actively share health and wellness information with consumers. And it encourages sophisticated mining of data to improve care protocols and decision support tools.

Our North Star measure is the Electronic Medical Record Adoption Score (HIMSS Analytics). This 0 to 7 scale assesses hospital progress toward implementing electronic medical records as follows:

Table 4: Electronic Medical Record (EMR) Adoption Score

Stage | Description | National (Q2 2008) | Oregon (Q2 2008) | National (Q4 2009) | Oregon (Q4 2009)
Stage 0 | Some clinical automation may exist, but Stage 1 not yet met | 17.7% | 10.0% | 11.5% | 6.5%
Stage 1 | Laboratory, pharmacy and radiology installed | 12.6% | 15.0% | 7.2% | 6.5%
Stage 2 | Major ancillary clinical systems feed data to clinical data repository | 33.9% | 41.0% | 16.9% | 8.1%
Stage 3 | Clinical documentation installed. First level of clinician decision support is implemented to conduct error checking with order entry. Some level of medical image access | 32.0% | 31.0% | 50.9% | 61.3%
Stage 4 | Computerized practitioner order entry (CPOE) for use by any clinician is added to nursing and others. Second-level of clinical decision support implemented | 1.8% | 3.0% | 7.4% | 9.7%
Stage 5 | Closed loop medication administration environment is fully implemented in at least one patient care area | 1.3% | 0.0% | 3.8% | 6.5%
Stage 6 | Full physician documentation/charting implemented for at least one patient care area. Full radiology system implemented | 0.9% | 0.0% | 1.6% | 1.6%
Stage 7 | Hospital has a paperless EMR environment. Clinical information is readily shared via electronic transactions with all appropriate entities and providers | 0.0% | 0.0% | 0.7% | 0.0%
Total Hospitals | | 5,073 | 59 | 5,235 | 62
Oregon's Median Score (based on 0-7) | | | 2.25 | | 3.13
Oregon's National Rank | | | 17th | | 23rd

Data source: HIMSS Analytics


Results: In 2008, Oregon’s hospital median score was 2.25 (based on a 0 to 7 scale with 7 representing complete EMR adoption). This ranked Oregon 17th in the nation. In 2009 the median score was 3.13, a real improvement. At the end of last year, five Oregon hospitals had reached Stage 5 (up from 0 in the prior year). In the view of experts this is a critical step toward improving overall safety (having a “closed loop” medication administration process). But since other states have moved more quickly, Oregon’s overall ranking fell to 23rd.


Adverse Event Reporting

The backbone of the Commission’s patient safety efforts has been its adverse event reporting program. Oregon’s approach, based on voluntary participation and confidential reporting of adverse events, is unique in the United States. Such events provide an opportunity to discover previously hidden flaws in the delivery of patient care. The Commission’s goal is to create a “safe table” to more quickly identify these flaws and find improvement opportunities that will decrease risk of future harm.

An adverse event includes “any unanticipated, usually preventable consequence of patient care that results in death or serious physical injury.” This definition focuses attention on care management problems that may have contributed to the outcome. It specifically excludes poor outcomes due to the patient’s underlying condition or from known medical complications.

Hospitals agree to report all Serious Adverse Events. Such events include:

• Sentinel events, as defined by The Joint Commission
• 25 specific events included on the National Quality Forum's List of Serious Adverse Events (the Commission has explicitly excluded criminal events that appear on NQF's list)
• Any other event that meets the Commission's general definition (as cited above)

Hospital Participation

Even though the Patient Safety Commission’s reporting program is voluntary, participating hospitals have signed agreements stating they will report all qualifying events using common definitions and a standardized report form. Currently 56 of Oregon’s 58 hospitals participate. These 56 include all large hospitals, all medium-sized hospitals and all but two of Oregon’s small hospitals. Together these 56 provide more than 99% of all hospital care in Oregon.

Frequency of Adverse Event Reporting

From June 2006, when the Commission received its first report, through the end of 2009, we received 383 adverse event reports. The following graphic shows the year-by-year trend.

Figure 3: Number of Submitted Adverse Event Reports, by Year

[Chart: number of reports by year, 2006-2010 (y-axis 0-150); series: Total Submissions and Projected reporting; the 2010 value is projected from the annualized number of reports received to date.]


In the first four years, the number of adverse event reports increased steadily. This increase represents an important success, indicating that hospitals have gained trust in the program. (See North Star discussion, above).

We have some cause for concern, however. In the first half of 2010, we have received only 41 reports, compared to 55 reports for the same period in 2009. These numbers are hard to interpret. For example, some hospitals tell us that they have made progress in eliminating certain types of adverse events. If this progress is significant and widespread, it would result in fewer reports (and would be very good indeed). Further, trends can be masked by reporting activity that goes in spurts. Still, given the preponderance of evidence, the Commission believes that adverse events are under-reported by Oregon hospitals.
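For a rough sense of scale, a simple straight-line annualization of the mid-year count (the projection in Figure 3 is described as annualized, though the exact method is not spelled out here) would be:

\[
\text{projected 2010 reports} \approx 41 \times \frac{12}{6} = 82 \quad \text{versus 127 reports submitted in 2009.}
\]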

Another important measure of participation is the number of hospitals that routinely report their adverse events to the Commission. Through 2009, 47 of 56 hospitals have reported at least one event to the Commission. Twenty-three hospitals have reported consistently every year since submitting their first report in 2006, 2007 or 2008. Six additional hospitals submitted their first report in 2009. Nine hospitals had never reported an adverse event by year end 2009 (although four of these have since submitted their first report in 2010). Overall, we are mildly encouraged by these numbers, even as we acknowledge that much work remains.

Figure 4: Consistency of Hospital Reporting Over Time

[Chart: for each hospital, the first year it submitted a report (2006-2009) and the subsequent years in which it continued to report; y-axis: Number of Hospitals.]

Types of Adverse Events

As Table 5 shows, we've classified hospital reports for 2009 into 18 general categories, with three of these categories accounting for almost half (46%) of the events. Falls were the most frequently reported event (19% of reported events in 2009). Hospitals reported more falls than in previous years, perhaps because of increased national attention on decreasing the number and severity of falls in hospitals. Unintentionally retained objects represented the second most frequently reported event type (15%). Medication errors were the third most frequently reported event type, representing 12% of adverse events.

Table 5: Adverse Events by Type, 2009

Type of Event | Small Hospitals | Medium Hospitals | Large Hospitals | All Hospitals
Fall | 3 (10.7%) | 6 (19.4%) | 17 (22.1%) | 26 (19.1%)
Retained object | 5 (17.9%) | 5 (16.1%) | 11 (14.3%) | 21 (15.4%)
Medication error | 4 (14.3%) | 4 (12.9%) | 8 (10.4%) | 16 (11.8%)
Ulcer | - | 1 (3.2%) | 9 (11.7%) | 10 (7.4%)
Wrong site surgery | 2 (7.1%) | 1 (3.2%) | 6 (7.8%) | 9 (6.6%)
Healthcare associated infection* | 3 (10.7%) | 2 (6.5%) | 3 (3.9%) | 8 (5.9%)
Injury | 4 (14.3%) | 2 (6.5%) | 2 (2.6%) | 8 (5.9%)
Care Delay | - | 3 (9.7%) | 5 (6.5%) | 8 (5.9%)
Other** | 2 (7.1%) | 2 (6.5%) | 3 (3.9%) | 7 (5.1%)
Unexpected Death | 2 (7.1%) | - | 3 (3.9%) | 5 (3.7%)
Perinatal | - | - | 5 (6.5%) | 5 (3.7%)
Equipment | 2 (7.1%) | 1 (3.2%) | - | 3 (2.2%)
Suicide | - | 1 (3.2%) | 2 (2.6%) | 3 (2.2%)
Burn | 1 (3.6%) | 0 (0.0%) | 1 (1.3%) | 2 (1.5%)
Hypoglycemia | - | 2 (6.5%) | - | 2 (1.5%)
Air Embolism | - | 1 (3.2%) | - | 1 (0.7%)
Hemolytic reaction | - | - | 1 (1.3%) | 1 (0.7%)
Wrong procedure | - | - | 1 (1.3%) | 1 (0.7%)
Total Events | 28 (100.0%) | 31 (100.0%) | 77 (100.0%) | 136 (100.0%)
Total Reports | 25 | 28 | 74 | 127

(Cells show the number of events and, in parentheses, the percent of that group's events; "-" indicates none reported.)

* Hospitals reported many fewer infections to the Patient Safety Commission than to the state's independent infection reporting program. See the explanation in the text that follows.

**Other events include: wrong graft tissue in 30 year old male; cancelled surgery in 73 year old female (missing equipment); expired substance used in surgery in 60 year old male; O2 discontinued without order in 71 year old male; wrong discharge instructions in 70 year old male; sexual assault in 19 year old female; and DVT in 86 year old female.

In 2009, hospitals reported eight healthcare-associated infections (HAI) to the Commission (6% of all reports). In contrast, hospitals reported 199 infections to the state's separate (and mandatory) infection reporting program during the same time period (75 central-line associated blood stream infections; 62 knee prosthesis surgical site infections; and 62 coronary artery bypass graft infections). It is important to note that the two reporting programs have very different definitions of 'reportable events.' The state's program is designed to capture rate-based information using definitions developed by the Centers for Disease Control and Prevention (CDC). The Commission's program represents an effort to understand why serious adverse events occur. As such it asks that hospitals report 'serious harm' infections only. In addition, given the purpose of the Commission's reporting program, hospitals are required to complete a root cause analysis, which can take 60 hours or more of staff time. In other words, the two programs serve different purposes, and are structurally distinct. That said, given the volume of reports submitted to the state for just three specific types of infections, we believe that serious infections are under-reported to the Commission. We intend to explore this issue further.

CMS-Defined Healthcare Acquired Conditions (HAC)

Healthcare acquired conditions represent poor clinical outcomes that the Centers for Medicare & Medicaid Services (CMS) believes could be reasonably prevented through the application of evidence-based guidelines. To date, CMS has defined ten such conditions, including certain types of hospital acquired infections. As of October 2008, CMS stopped paying for the additional care required to treat an HAC, unless the hospital can demonstrate that the condition was present on admission to the hospital. This is why HACs are commonly called 'no-pay' events.

The Commission’s reporting template does not explicitly highlight the ten CMS conditions. Instead, the Commission’s definition of a reportable event is built around the National Quality Forum’s list of serious adverse events. The following table offers a crosswalk between the Commission’s definition and CMS’s definitions.

Table 6: Comparison of CMS and Oregon Patient Safety Events

CMS Healthcare Acquired Conditions | OPSC Reported Event | 2009 | '06-'09
Stage III, IV pressure ulcers | Pressure ulcer - Stage 3 or 4 | 9 | 16
Fall or trauma resulting in serious injury | Fall | 26 | 59
Foreign object retained after surgery | Retention (unintended) of a foreign object in a patient after surgery or other procedure | 21 | 62
Vascular catheter-associated infection | As part of infections generally | 0 | 2
Catheter-associated urinary tract infection | As part of infections generally | 1 | 0
Certain surgical site infections | As part of infections generally | 1 | 2
Air embolism | Intravascular air embolism | 1 | 1
Blood incompatibility | Hemolytic reaction | 1 | 5
Certain manifestations of poor blood sugar control | Hypoglycemia | 2 | 4
Certain deep vein thromboses or pulmonary embolisms | As 'Other' event | 1 | 2
TOTAL | | 63 | 153

While matching is imperfect, at least 63 of 127 events reported to the Commission in 2009 correspond to CMS healthcare acquired conditions (50%). Falls, stage III and IV pressure ulcers, and retained foreign objects are easily mapped between the two lists. The three infection-related HACs correspond in general terms to the Commission's single HAI event type (see Table 6). The remaining HACs are not explicitly included in the Commission's list, but by searching text fields, we were able to identify reported events that included deep vein thrombosis and pulmonary embolism (which were not present on admission).

What Causes Adverse Events: Contributing Factors

When hospitals report adverse events to the Patient Safety Commission, they include information about their internal investigation of the event, including details about why the event occurred. Because health care is complex, hospitals can choose from among 57 contributing factors grouped into eight general categories. For the purposes of this report, we offer a partial summary (Table 7).

Table 7: Contributing Factor Categories by Most Common Event Types

Contributing Factor Category | Retained object | Falls | Medication error | Ulcer | Wrong site surgery | Healthcare acquired infection | Total Reports
Communication | 76.2% | 57.7% | 75.0% | 70.0% | 88.9% | 75.0% | 72.4%
Policies/Procedures | 76.2% | 38.5% | 75.0% | 30.0% | 66.7% | 75.0% | 59.1%
Patient Management | 14.3% | 65.4% | 25.0% | 90.0% | 55.6% | 37.5% | 50.4%
Organizational Factors | 38.1% | 46.2% | 43.8% | 20.0% | 66.7% | 50.0% | 45.7%
Equipment, software, or material defects | 38.1% | 50.0% | 50.0% | 20.0% | 33.3% | 50.0% | 44.9%
Training and supervision | 28.6% | 34.6% | 50.0% | 30.0% | 33.3% | 62.5% | 40.9%
Patient Factors | 28.6% | 46.2% | 31.3% | 30.0% | 55.6% | 50.0% | 37.0%
Work area/environment | 42.9% | 23.1% | 37.5% | 20.0% | 66.7% | 50.0% | 31.5%

Note: Each report could mark multiple contributing factors. These percentages should be read as follows: "76.2% of Retained Object reports indicated that communication was a problem."

Overall, Communication and Policies/Procedures are the two most commonly cited contributing factor categories. Communication played a role in 72% of 2009 reports (a rate similar to that reported by The Joint Commission). Policies/Procedures contributed to 59% of reported adverse events. Within these categories, Communication – Hand-offs/Shift Reports and Policies/Procedures – Not Followed are the most frequently mentioned factors, appearing in 34% and 31% of adverse event reports respectively.
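For readers who want to see how percentages of this kind can be derived from multi-select data, here is a hypothetical sketch (the sample records and field names are invented, not the Commission's reporting format): each report carries one event type and any number of contributing-factor categories, and each Table 7 cell is the share of that event type's reports that cite the category.

# Hypothetical sketch of computing multi-select contributing-factor percentages.
# Each report lists one event type and a set of cited factor categories.

from collections import defaultdict

reports = [
    {"event": "Fall", "factors": {"Communication", "Patient Management"}},
    {"event": "Fall", "factors": {"Patient Management"}},
    {"event": "Retained object", "factors": {"Communication", "Policies/Procedures"}},
]

totals = defaultdict(int)                        # number of reports per event type
cites = defaultdict(lambda: defaultdict(int))    # citations per (event type, category)

for report in reports:
    totals[report["event"]] += 1
    for category in report["factors"]:
        cites[report["event"]][category] += 1

for event, categories in cites.items():
    for category, count in sorted(categories.items()):
        # Percentages can sum to more than 100% because a report may cite
        # several categories, as the note under Table 7 explains.
        print(f"{event} / {category}: {count / totals[event]:.1%}")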

Harm Levels of Reported Events

The Patient Safety Commission uses a scale of 1-9 to describe levels of harm. The scale accounts for degree of harm (not detectable, minimal, moderate, and serious) and duration of harm (temporary or permanent). Serious harm (Levels 7, 8 and 9) means that the event severely affects a patient’s physical status or functional ability. Indicators of seriousness include transfer to a higher level of care, severe allergic reaction, return to surgery, significant medical intervention, and increased length of stay.
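As a small illustration of the grouping used in Figures 5 and 6 (the 1-9 levels and the category cut points come from this report; the helper function itself is hypothetical):

# Illustrative mapping of the Commission's 1-9 harm levels to the groupings
# used in Figures 5 and 6. The groupings come from this report; the function
# and the example data are hypothetical.

from collections import Counter

def harm_category(level: int) -> str:
    """Return the reporting category for a harm level from 1 to 9."""
    if not 1 <= level <= 9:
        raise ValueError("harm level must be between 1 and 9")
    if level >= 7:
        return "Serious Harm (7-9)"
    if level >= 5:
        return "Moderate Harm (5-6)"
    if level >= 3:
        return "Minimal Harm (3-4)"
    return "No Harm (1-2)"

# Example: tally categories for a handful of hypothetical reported events.
example_levels = [8, 7, 9, 5, 3, 2, 7]
print(Counter(harm_category(level) for level in example_levels))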


Figure 5: Harm Levels
[Pie chart of 2009 reported events by harm level: Serious Harm (7-9) 66%; the remaining segments of 12%, 15%, and 7% correspond to Moderate Harm (5-6), Minimal Harm (3-4), and No Harm (1-2).]

In 2009, serious harm events accounted for two-thirds of all reported events (66%). When we started the reporting program, we put an explicit emphasis on serious harm events, but extended to hospitals 'an invitation to report' lower harm events. We expected, over time, to see a slow increase in the proportion of such events. But as the following graphic suggests, the proportions of serious harm, moderate harm, and less serious harm events have remained fairly stable.

Figure 6: Harm Levels of Reported Events
[Stacked bar chart: percent of reports by year, 2006-2009, grouped as Serious Harm (7-9), Moderate Harm (5-6), and Less Serious Harm (1-4).]

Adverse Events Associated with Death

Of the 127 adverse event reports submitted to the Commission in 2009, 32 resulted in the death of a hospital patient (25%). These deaths took place in 14 separate facilities.

Table 8: Reports That Resulted in Death, By Year

Year | Number | % of Reports
2006 | 17 | 32.1%
2007 | 26 | 30.6%
2008 | 27 | 22.9%
2009 | 32 | 25.2%
Overall | 102 | 26.6%



Figure 7: Number and Types of Death Events Reported by Hospitals in 2009
[Bar chart of 2009 death events by type (Perinatal, Fall, Medication Error, Unexpected Death, Infection, Care Delay, Suicide, Injury, Wrong Site Surgery, Air Embolism); axis: number of events, 0-6.]

Note: while 32 of 127 submitted reports resulted in the patient’s death, those 32 reports involved 34 of 136 reported events, as a single report may include multiple adverse events.

Taken together, perinatal events, falls, medication errors, and unexpected deaths account for nearly 60% of the events.

In general, the factors contributing to death events were similar to those for events overall, though in slightly different proportions. Two contributing factors stand out in their association with death events. Compared to non-death adverse events, problems with Communication Among Hospital Personnel occur more frequently in events resulting in death (47% compared to 21%). Similarly, problems with Patient Management – Response to Changing Conditions are more frequently involved with events resulting in death than in non-death events (38% compared to 11%).

Characteristics of Adverse Event Reports

To improve safety in Oregon, serious adverse events must be completely investigated by hospitals and fully reported to the Patient Safety Commission. In the following section we discuss the reporting process itself and how well it is done.

a) The Hospital Review Team

In the aftermath of an adverse event, hospitals typically put together review teams. Such teams work best when they are multi-disciplinary, fine-tuned to the needs of a specific event, have physicians actively involved in making sense of what went wrong, and have an administrative champion to assure appropriate resources and effective action plans.

Hospitals have varied approaches to team composition for reviewing adverse events. Some change membership with every review; others have a core team for each review and add members as needed. Some teams include staff who were involved in the event, while others do not ask those involved to participate directly. There is no single, perfect formula for constituting a review team.



Table 9: Review Team Members

Review Team Member | Count | %
Nursing Staff | 91 | 92.9%
Physician | 76 | 77.6%
Nursing Management | 89 | 90.8%
Quality Management | 73 | 74.5%
Risk Management | 90 | 91.8%
Pharmacist | 15 | 15.3%
Senior Administration | 61 | 62.2%
Engineering Staff | 1 | 1.0%
Other | 38 | 38.8%
TOTAL | 98 | 100.0%

In 2009, representatives from nursing staff, nursing management, and risk management were on 81% of the teams; 98% of the teams had at least one of the three. Only two event reviews occurred in which neither nursing staff nor nursing management participated. Physicians participated in 78% of reported reviews; pharmacists participated in only 15%. Thirty-nine percent of the review teams included representatives from other departments, such as physical therapy or laboratory services.

Because physicians and pharmacists can play such important roles on review teams, we took a closer look at their participation. Table 10 summarizes event reviews in which pharmacists and physicians participated, by size of hospital.

Table 10: Review Teams Including a Physician or Pharmacist, by Hospital Size

Hospital Size | Physician | Pharmacist
Small Hospitals | 73.3% | 6.7%
Medium Hospitals | 76.0% | 16.0%
Large Hospitals | 79.3% | 17.2%
ALL HOSPITALS | 77.6% | 15.3%

Table 11 shows similar data specifically for falls and medication errors. Given the number of medications that may cause drowsiness or otherwise affect motor coordination (resulting in a fall), the lack of a pharmacist to offer a medication review can limit full understanding and development of effective intervention strategies.

Table 11: Review Teams Including a Physician or Pharmacist, by Event Type

Event Type | Physician Count | Physician % | Pharmacist Count | Pharmacist % | TOTAL
Medication Error | 6 | 60.0% | 7 | 70.0% | 10
Fall | 9 | 39.1% | 3 | 13.0% | 23


b) Leadership Engagement

Strong leadership is a critical component of a robust safety/quality program. Without management engagement, quality efforts will be starved of resources and shuffled among competing priorities. The Commission tracks management engagement by asking hospitals to identify who participated in the root cause analysis (RCA), whether senior management was notified of the event, whether the hospital's Board of Directors was notified, and whether senior management received a post-review briefing.

Figure 8: Leadership Involvement in Adverse Events
[Bar chart: percent of reports with leadership notified, board notified, a post-review briefing, and review participation by senior management, shown for small, medium, large, and all hospitals.]

In 2009, there were 98 qualifying adverse event reports for which hospitals provided information on leadership involvement. In all cases, leadership received notification of the adverse event. Some form of management follow-up occurred 94% of the time, either through senior management presence on the review team or a post-review briefing. Of the six events that had no reported follow-up with leadership (representing 6% of the total), one occurred in a small hospital, two in medium hospitals, and three in large hospitals. Senior leadership of medium-sized hospitals were less likely to participate in reviewing adverse events than leadership of large and small hospitals. Overall, board notification occurred in 58% of cases. Such notification was markedly lower in medium and small hospitals, occurring only 40% of the time, while it occurred 71% of the time in large hospitals.

c) Report Quality

The Commission reviews each report soon after submission to assure consistency in harm levels and event categorization. Beginning with 2009 as a baseline year, Commission staff also review each adverse event using Joint Commission criteria: completeness, thoroughness, credibility, and system-oriented action plans.



Criteria Used in Report Review

Complete All information asked for is provided. The event description includes detailed and relevant information.

Thorough The report considers system-level contributing factors and identifies the root cause in the findings.

Credible The report is internally consistent; leadership is involved or notified.

Action Plans The report includes system-level plans that are likely to prevent future occurrence, and those plans address identified causes, as listed in the findings.

If an adverse event report meets the minimum standard in each of the criteria we judge that report acceptable. In addition, staff use the Public Health Officer’s quality scoring method to assign a quality value. In general, the concept of acceptability measures whether or not a report meets the most basic requirements; report quality is a measure of the depth and strength of the root cause analysis (RCA).

Eighty-nine percent of 2009 adverse event reports met the acceptability standard. Eighty-seven percent earned the designation of "good quality." Looking at hospitals by size, small hospitals had the largest gap between acceptable and good quality (92% acceptable; 76% good quality), but, overall, the variations by size of hospital were not great.

Figure 9: Report Acceptability and Quality
[Bar chart: percent of reports rated Acceptable and Good Quality, by hospital size (small n=25, medium n=29, large n=73, overall n=127).]

Of the 11% that were judged unacceptable (14 reports), common problems included: a) the review team's inability to identify the root causes of the event; and b) low-yield action plans (a problem that also plagued many of the high-scoring reports; see below).



d) Action Plans

Action plans put the insights derived from root cause analysis into play. They offer a remedy to the identified problem. For this reason, a good analysis is a necessary prerequisite of a good action plan. But because even a complete and thorough investigation does not always lead to a useful action plan, we review hospital action plans and judge their effectiveness using criteria derived from the Veterans Administration's Patient Safety Center. Only 30% of 'good quality' reports received a perfect score for their action plans. Our review indicates that action plans fall into three general categories, ranging in strength from weak to very strong. In general, the Commission believes that hospitals need to improve in this area.

Strategy – “Do better”

Many action plans address individual hospital staff using words such as remind, counsel, educate (or re-educate), reinforce, encourage. These plans are limited in their effectiveness because of their over-reliance on changing individual behavior and because they assume (usually incorrectly) that staff did not know or have forgotten the correct action. To implement such plans, hospitals typically rely on individual discussions, staff meetings, newsletters, education sessions, or distributing literature. The impact of such plans is minimal. In contrast, some hospitals hold practice drills. While this is also a ‘do better’ strategy, it is usually more effective because it involves more than one type of learning (cognitive/experiential) and reinforces the need for teamwork.

Strategy – “Rewrite the policy”

The second most common type of action plan involves making or revising policies and procedures. Whether such plans are effective depends on how they are developed. The strategy is weak when it merely produces a policy that outlines expectations for individual behavior; it is stronger when policies are standardized or procedures simplified.

Strategy – “Change the practice environment”

The most effective action plans are those that change the practice environment. Among the best are those that engineer safety into the design of a process. As one approach, forcing functions preclude any action except the correct action. For example, one hospital altered its electronic health record to limit the dosage of high-alert or risky medications and established strict protocols about who could override the limit. Other 'hard wired' action plans included architectural redesign and the purchase of new equipment, such as infusion pumps or beds. In general, such action plans are not common.

e) Written Notification

As part of their participation in the adverse event reporting program, hospitals agree to notify the patient (or their representative) in writing after any serious adverse event (serious harm, Level 7, 8, or 9). This notification is meant to supplement the oral disclosure of such events. The only requirements for the written notification are that it is timely and consistent with the hospital’s policy on disclosure.

Over the course of the reporting program, hospitals have varied in the extent to which they put written notification into practice. For 2009, hospitals provided written notification in just over half of the events (43 out of 80) for which it was required.[1]

While compliance is far short of optimal, a few hospitals have successfully integrated written notification into their disclosure process. Since reporting began in 2006, eight hospitals have notified patients of adverse events in writing for every serious harm event. Nine other hospitals have done so over half of the time.

Figure 10 offers a summary of how often hospitals met the written notification standard in 2009.

Figure 10: Frequency with Which Written Notification Was Provided When Required, by Facility Size
[Bar chart: number of small, medium, and large hospitals that provided required written notification Always*, Half of the time or more, Less than half of the time, or Never; based on 26 hospitals and 80 reports.]

*Includes four hospitals that submitted only one serious harm report in 2009

[1] Even though hospitals struggle with written notification, hospitals also have verbal disclosure policies. Evidence suggests that hospitals do a better job in providing verbal disclosure.

Using Reporting to Drive Change

Following are three examples of how the Patient Safety Commission has used reporting to encourage quality improvement within Oregon’s hospitals.

a) Preventing Retained Objects

As discussed in the North Star section of this report, foreign bodies inadvertently retained after surgery are an ongoing problem in Oregon hospitals. The Commission's reporting program has brought this issue to the fore. To address the issue, the Commission convened an expert panel; championed the use of a surgical checklist (see below); and, on two occasions, issued guidelines and recommendations. The latest recommendations include the need to:

• Revise current work processes to assure consistency with recommendations endorsed by the Association of periOperative Registered Nurses (AORN) for prevention of retained objects;

• Include pre-closure cavity sweeps as routine activity;
• Eliminate/minimize intraoperative staff changes;
• Institute a safe surgery checklist to support team responsibility for counts.

b) Improving the Use of Medical Devices

Recently we issued an alert to all hospitals based on a near-miss adverse event submitted to the Commission. This case involved the use of a defibrillator as a cardiac monitor. In places like the emergency room, cardiac monitors are not always available, and in some instances staff use defibrillators as monitors instead. In the reported case, the device's alarm, which is not needed when it is used as a defibrillator, had been turned off. Without the alarm, nursing staff were not alerted to a patient in trouble.

Based on this report and on additional investigation, the Patient Safety Commission issued a series of recommendations to Oregon hospitals:

Recommendations

• Assure that appropriate monitors are available and accessible for each clinical area, removing any that are not used as intended.

• Add alarm checks to equipment checks as appropriate.
• Determine degree to which alarm fatigue, false alarms, and nuisance alarms present risk in your institution and implement risk reduction strategies.
• Inform staff of risk and encourage recommendations for risk reduction.
• Assure staff are competent in the use of alarms.

c) Eliminating Falls

Falls are another common problem in hospitals. In one case reported to the Commission, a patient with recent below-the-knee amputations of both legs was admitted for treatment of an infection. Several days into his hospitalization, he was maneuvering to transfer from the bed to his personal wheelchair when it tipped over; he hit his head and sustained a small subdural hematoma.

This is a rare event, caused by the unusual combination of two separate factors occurring together: a recent amputation that changed the patient's center of gravity, and a personal wheelchair without an anti-tipping device. To the hospital's credit, it looked at the event and asked how nurses might integrate unusual risk factors into their thinking about patients' needs and the care they provide; in short, how to prompt critical thinking. The Commission reported the results (and recommendations) of this analysis to all hospitals in Oregon.


Other Patient Safety Initiatives

Safe Surgery Checklist

The Safe Surgery Checklist, first introduced in the United States in 2008, is designed to standardize basic operating room protocols among members of a surgical team. The goal is to reduce surgical complications. This approach—which borrows heavily from the aviation industry—has been shown to be especially effective in reducing infections, assuring safe anesthesia, and improving surgical team communication.

Because of its proven effectiveness, and because more than 20% of reported adverse events involve preventable surgical complications, the Patient Safety Commission—in partnership with seven other organizations2—has championed the use of the checklist in Oregon. As a result of various promotional and training efforts, 81% of the Oregon hospitals that perform surgical procedures report that they now use a checklist. As of April 2010, only eleven hospitals had yet to adopt the checklist.

This is an excellent outcome. As a next step, the Commission and its partners wanted to learn more about the specific checklists being implemented. Toward that end, we recently analyzed 37 checklists (out of the 43 submitted) and found that many hospitals had modified their checklists in ways that call their effectiveness into question. In all, the checklists that we reviewed had 65 different elements, only six of which were included on all of the checklists.
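As a hypothetical illustration of the comparison involved, the sketch below (Python; the hospital names and checklist elements are invented placeholders, not the submitted checklists) finds the elements common to every checklist by intersecting each checklist’s element set:

    # Illustrative sketch only: find elements common to all reviewed checklists.
    # Checklist contents below are hypothetical placeholders.
    checklists = {
        "Hospital A": {"site marked", "counts are correct", "antibiotics given"},
        "Hospital B": {"site marked", "counts are correct", "team introductions"},
        "Hospital C": {"site marked", "counts are correct", "imaging available"},
    }

    # Union of every element seen anywhere (the actual review found 65).
    all_elements = set().union(*checklists.values())

    # Intersection: elements present on every checklist (the actual review found 6).
    common_elements = set.intersection(*checklists.values())

    print(f"{len(all_elements)} distinct elements; "
          f"{len(common_elements)} appear on every checklist: {sorted(common_elements)}")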

Elements Present in All Reviewed Checklists

Phase 1

● Patient ID appears four times (patient, procedure, site, consent)

● Site marked

Phase 2

● “This is [patient’s name].”

● “[Procedure/site/laterality] as stated on the consent.”

● “Is essential imaging available and accessible?”

Phase 3

● Counts are correct

The most frequently omitted concepts were those that related to team communication. These omissions raise concerns: effective communication is a hallmark of a strong safety culture. For example, data from the adverse events reported to the Commission in 2009 corroborate a need for improved surgical communication. Three surgical event types (retained objects, wrong procedures, and wrong-site surgeries) comprise 23% of all reported events, with retained objects the second most commonly reported event. Further, while 72% of the 2009 reports noted communication as a contributing factor, 81% of reports for surgical events did so.

2 Acumentra Health, CareOregon, Oregon Association of Hospitals and Health Systems, Oregon Medical Association, Oregon Nurses Association, Oregon Office of Rural Health, Oregon Rural Healthcare Quality Network


National Surgical Quality Improvement Program (NSQIP)

NSQIP is a research and quality improvement tool developed by the American College of Surgeons. The hospitals that participate in NSQIP agree to collect clinically-detailed, patient-specific data about surgical complications, using a common data format. These data are outcomes-based and risk adjusted. As a result, NSQIP hospitals can offer new ideas about the next generation of surgical improvement initiatives. Eight Oregon hospitals participate in NSQIP:

• Kaiser Sunnyside Medical Center

• Legacy Emanuel Medical Center

• Legacy Good Samaritan Medical Center

• Oregon Health & Science University Hospital

• Providence Portland Medical Center

• Providence St. Vincent Medical Center

• Sacred Heart Medical Center at RiverBend

• Salem Hospital

In partnership with the Oregon Patient Safety Commission, these eight hospitals have formed a Consortium dedicated to improving surgical care across the state by publishing clinical and quality improvement findings. According to one participant, “NSQIP helps cut through the complexity of modern medicine. This complexity creates a fog that isolates individual care givers from determining the most effective treatment for their patients.”

As one example of the Consortium’s work, last year the group published a Call to Action on how to reduce surgical complications for patients with diabetes or at risk for diabetes. In Oregon, initial findings have demonstrated a 54% reduction in complications for patients with known diabetes. These complications include, but are not limited to, infections, cardiac arrest, pneumonia, and pulmonary embolism. NSQIP data also suggest that surgical complications can be lowered for patients who do not have diabetes but are at risk for elevated blood sugars.

Recommendations included:

1. Screening surgical patients for poorly controlled diabetes and undiagnosed diabetes using established guidelines.

2. Measuring perioperative blood glucose levels (before, during, just after surgery) for patients with diabetes or known risk factors.

3. Treating hyperglycemia (high blood sugar) by using safe and effective control strategies.
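The NSQIP data behind findings like these are outcomes-based and risk adjusted. As a rough, hypothetical sketch of how risk-adjusted comparison is commonly done (not a description of the Consortium’s actual methodology), a hospital’s observed complication count can be compared with the count expected from each patient’s modeled risk:

    # Hypothetical sketch of an observed-to-expected (O/E) ratio.
    # predicted_risk values would come from a risk model fit on a larger data set;
    # the records below are placeholders.
    patients = [
        {"had_complication": True,  "predicted_risk": 0.12},
        {"had_complication": False, "predicted_risk": 0.05},
        {"had_complication": False, "predicted_risk": 0.20},
        {"had_complication": True,  "predicted_risk": 0.30},
    ]

    observed = sum(p["had_complication"] for p in patients)
    expected = sum(p["predicted_risk"] for p in patients)

    oe_ratio = observed / expected  # >1 suggests more complications than the patient mix predicts
    print(f"Observed: {observed}, Expected: {expected:.2f}, O/E ratio: {oe_ratio:.2f}")

An observed-to-expected ratio above 1.0 suggests more complications than the patient mix would predict; below 1.0, fewer.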

Hospital Collaborative to Reduce Health Care Acquired Infections

In 2009 the Patient Safety Commission received a federal grant to organize a hospital collaborative designed to significantly reduce the occurrence of hospital-acquired infections through December 2011. This work is well underway. Nine Oregon hospitals have joined the collaborative (three are critical access hospitals, three are mid-sized, and three are large).

With the help of an expert panel convened by the Commission, the nine hospitals have targeted three types of infection: surgical site infections; central line infections; and Clostridium difficile (C. diff). In addition, these nine are working to improve their programs for hand hygiene, environmental cleaning, and antibiotic stewardship.

The first learning session took place on June 10 and 11, 2010. The second session is scheduled for September 15th. The group has phone conferences every week and routinely shares information about challenges and successes. As a recent example of the effectiveness of the collaborative approach, participating hospitals are currently sharing materials related to hand hygiene. These materials include how to implement ‘secret shopper’ programs (to gain accurate information about compliance with hand hygiene standards), tools to engage staff in awareness efforts, and data display templates to help communicate progress. All hospitals have now identified baseline hand hygiene rates and are actively working to improve them.
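As a simple, hypothetical sketch of how a baseline hand hygiene rate might be tallied from ‘secret shopper’ observations (the units and observation records below are invented for illustration):

    # Hypothetical sketch: baseline hand hygiene compliance from direct observation.
    from collections import defaultdict

    observations = [
        {"unit": "ICU", "compliant": True},
        {"unit": "ICU", "compliant": False},
        {"unit": "Med/Surg", "compliant": True},
        {"unit": "Med/Surg", "compliant": True},
    ]

    totals = defaultdict(lambda: {"compliant": 0, "observed": 0})
    for obs in observations:
        totals[obs["unit"]]["observed"] += 1
        totals[obs["unit"]]["compliant"] += obs["compliant"]  # True counts as 1

    for unit, t in totals.items():
        rate = 100 * t["compliant"] / t["observed"]
        print(f"{unit}: {rate:.0f}% compliance ({t['compliant']} of {t['observed']} observations)")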


Next Steps

The Patient Safety Commission can point to specific interventions and unifying philosophies that are beginning to make a difference in reducing harm to patients in Oregon hospitals. That said, we have much work to do. The following is an outline of some of the Commission’s next steps.

1. Champion North Star

The Commission’s efforts to create an audacious ‘stretch’ goal have not yet obtained a critical mass of participation from hospitals, consumers, purchasers, insurers and government. North Star remains a useful measurement scheme, but it has not yet spurred the delivery system to improve more rapidly. To improve the likelihood of success, the Commission will:

• Publish periodic progress reports based on North Star metrics.

• Revise/improve the specific North Star measures (such as using infection rates as an outcome measure, in addition to retained objects) – See Appendix A.

• Encourage purchasers to include specific North Star language in their vendor contracts.

• Encourage consumers/patients to ask questions of their hospital and their physician about how each is helping to create a safer delivery system.

• Align our own quality improvement efforts around North Star objectives.

2. Improve the Usefulness of Adverse Event Reporting

• The Commission will set clear expectations about the overall volume of reporting it expects to see (based on national norms and on current Oregon patterns).

• The Commission will meet with each of the five hospitals that have not yet submitted adverse event reports to clarify the reasons why. The Commission will do everything it can to remove barriers to reporting, provide additional clarification, and offer mentoring and peer-to-peer consultation. If all else fails, the Commission will consider terminating participants for non-compliance.

• The Commission will work with hospitals to align the definition of a reportable adverse event more closely with emerging patient safety indicators. We will:

o Revise reportable events to include CMS healthcare acquired conditions.

o Consider including low harm events of interest.

o Evaluate when ‘bundled’ root cause analysis is an appropriate reporting strategy (for example, when a hospital experiences a cluster of similar events).

• The Commission will develop consensus-based guidance on the depth and breadth of investigation needed for different types of adverse events, including:


o Definitions of when a ‘full bore’ root cause analysis is required.

o Descriptions of more focused or limited root cause analyses.

• The Commission will provide hospital-specific analyses to participants at more frequent intervals.

• The Commission will work with small/medium sized hospitals to improve the involvement of their hospital boards in adverse event review.

• The Commission will improve compliance with written notification requirements by:

o Working with the OAHHS and national researchers (including Thomas Gallagher, MD, University of Washington) to better understand the dynamics of disclosure and notification.

o Providing hospitals with case studies of hospitals that have succeeded.

o Providing additional information about model notification letters. One size doesn’t fit all, and we will provide context on how such letters might be used.

o Developing a ‘commonsense’ exception policy.

o Working with healthcare purchasers to add contract language that reinforces the need for vendors to comply with written notification requirements.

o Developing a termination policy that could lead to hospitals being dropped from the reporting program for continued non-compliance. Such a policy would include a fair chance to explain actions and to seek accommodation.

3. Increase the Use of Evidence-Based Best Practices

• The Commission will continue to champion the use of the safe surgery checklist. Our goal is for 100% of hospitals to adopt, at a minimum, the core elements of the Oregon checklist.

• The Commission will explore the concept of ‘safe harbor’ legislation as a means of increasing the use of best practices. This work is being funded by a one-year federal planning grant to explore innovative ways to reform medical liability and improve safety.

• The Commission will continue working with the Oregon NSQIP Consortium to develop (and share) emerging best practices.

4. Work with the Nine Hospitals in the Commission’s HAI Collaborative

We will continue to work with the nine hospitals in the Commission’s collaborative to reduce healthcare-acquired infections. The Commission will use the collaborative model (and the skills we’ve learned) to create other learning opportunities.

5. Transfer Quality Improvement Ideas from Hospitals to Other Healthcare Settings

• Continue to train nursing homes in the use of root cause analysis.

• Encourage the spread of NSQIP surgical insights to ambulatory surgery centers.


APPENDIX A : North Star Measures

The Patient Safety Commission’s North Star measures are based on easily defined rules:

• Simplicity: For each of the six North Star dimensions we’ve selected only one or two measures. As such, some of the measures are proxies or only partial indicators. For example, to measure harm events, we selected retained objects. While we strive to eliminate all harm events, we begin by measuring one.

• Align measures: We selected existing measures that are commonly in use. We believe that North Star can help align organizations around a short list of appropriate measures.

• National comparisons: Our goal is to be the safest state in the nation. To measure our progress we needed to include a few measures that allow for national benchmarks.

• Ease of collection: We wanted to make sure that we didn’t add to the already-high administrative burden of hospitals. In some cases, when data didn’t exist, Patient Safety Commission staff collected it.

• Ever-improving: Even as we assembled an initial measurement scheme, we began sifting through possible improvements and additions. As such we have a group of measures in development.

• Consensus-based: We developed our measures in partnership with hospitals, public health, and others.

• Narrow focus on safety: Quality is more than safety. Overall quality measures should include (at a minimum) indicators of safety, clinical effectiveness, timeliness of care, access to care, efficiency of care delivery, and equity in providing care. Our measurements focus exclusively on patient safety.


APPENDIX B: Measurements

Dimension 1: Outcomes. Can we demonstrate that we are eliminating preventable harm events?

Context: The goal of patient safety is to eliminate harm. Therefore, our measurement approach must begin with a concrete outcome measure.

Measure: “Foreign body accidentally left during a procedure not elsewhere classified” (ICD-9-CM 998.4).

Source of data: Office of Health Policy and Research, hospital discharge data.

Why this measure: Foreign bodies, accidentally left during a procedure, represent a very large class of events reported to the Patient Safety Commission.

Data limitations: Retained foreign bodies represent a large cluster of reports submitted to the Commission, in part because they are easier to identify than other types of events. Still, we’ve received only about half as many reports as identified by the state’s Office of Health Policy and Research (OHPR) from their administrative data set. We do not completely understand why. For this reason we’ve reported OHPR numbers instead of our own.

Future outcome measures: CMS healthcare acquired conditions; hospital-acquired infection rates as reported to the state’s mandatory infection reporting program.
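As a hypothetical sketch of how this outcome measure can be tallied from discharge data (the record layout below is illustrative, not OHPR’s actual file format), discharges carrying ICD-9-CM 998.4 are simply counted by year:

    # Hypothetical sketch: count discharges coded with ICD-9-CM 998.4
    # ("foreign body accidentally left during a procedure") by year.
    from collections import Counter

    discharges = [
        {"year": 2008, "diagnosis_codes": ["998.4", "401.9"]},
        {"year": 2009, "diagnosis_codes": ["486"]},
        {"year": 2009, "diagnosis_codes": ["998.4"]},
    ]

    retained_by_year = Counter(
        d["year"] for d in discharges if "998.4" in d["diagnosis_codes"]
    )
    for year in sorted(retained_by_year):
        print(f"{year}: {retained_by_year[year]} discharges coded 998.4")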

Dimension #2: Safe Practices. Are we using evidence-based best practices?

Context: Evidence-based best practices represent “medical decisions, guidelines, and policies consistent with good evidence of effectiveness and benefit.” Evidence-based means that clinical decisions are grounded in scientific research, not convenience or custom.

Measure: Adoption of Surgical Care Improvement Project (SCIP) infection prevention measures:

• Percent of surgery patients who received a preventive antibiotic within one hour prior to surgery (SCIP #1);

• Percent of surgery patients who received the appropriate preventive antibiotic(s) for their surgery (SCIP #2);

• Percent of surgery patients who have preventive antibiotics discontinued within 24 hours after surgery (SCIP #3).

Note: To be most effective, it is important that all three actions are carried out for each patient.
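As a hypothetical sketch of why the bundled rate matters (the patient records below are invented), the “all three” rate can be computed alongside the individual SCIP rates; it can never exceed any single measure’s rate:

    # Hypothetical sketch: individual SCIP rates vs. the "all three" bundled rate.
    # Each record flags whether the patient's care met SCIP #1, #2, and #3.
    patients = [
        {"scip1": True,  "scip2": True,  "scip3": True},
        {"scip1": True,  "scip2": True,  "scip3": False},
        {"scip1": True,  "scip2": False, "scip3": True},
    ]

    n = len(patients)
    for measure in ("scip1", "scip2", "scip3"):
        rate = 100 * sum(p[measure] for p in patients) / n
        print(f"{measure}: {rate:.0f}% of patients")

    bundled = 100 * sum(all(p.values()) for p in patients) / n
    print(f"All three measures met: {bundled:.0f}% of patients")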

Why these measures: Eliminating healthcare acquired infections (HAIs) is a top priority of the country’s health care system; HAIs represent a large category of preventable harm events. The three measures selected by the Commission define HAI-elimination strategies endorsed by the American College of Surgeons, the Institute for Healthcare Improvement, Centers for Medicare and Medicaid Services (CMS) and others. These measures are publicly reported by CMS (Hospital Compare). The measures also align with Oregon’s infection reporting program.

Definitions/sources: Acumentra Health; Hospital Compare (CMS website).


Data limitations: These measures are defined by CMS. There is some controversy about the causal link between compliance with SCIP measures and reduction in infections. In addition, hospitals across the country are continually improving their SCIP scores. At some point we will see compression at the upper end of the scale—if all states are in the 95% to 100% range, the differences among them will be virtually meaningless. Finally, experts agree that the single most important SCIP measure is the percentage of patients who receive all three preventive steps. However, national and state-to-state comparisons are not readily available for this bundled measure.

Future Safe Practices Measures: Likely candidates include compliance with a checklist for reducing central line infections and compliance with hand-hygiene protocols.

Dimension #3: Risk Assessment. Are we assessing risk and learning from experience?

Context: A critical tenet of the patient safety movement is that we can only improve if we identify errors, understand them, and systematically work to find root causes. This requires a reporting program.

Measure: a) Number of hospitals that have reported at least one event to the Commission in the last two years; b) Number of events reported to the Commission.

Why this measure: Willingness to identify/share findings from adverse events is a powerful force for change. In the near term an increase in reports suggests improvement.

Definition/source: Patient Safety Commission data.

Data limitations: The Commission believes it sees only a percentage of the actual number of adverse events that occur in Oregon hospitals. However, it is difficult to determine the ‘true’ number of such events. In addition, we don’t really know how many reports to expect from big versus small hospitals, or from different units of the same hospital. Perhaps more importantly, the reports we do receive don’t completely mirror the true ‘epidemiology of errors.’ For example, we receive fewer reports about medication errors than expected. Finally, we need to ensure that the targets for Dimension #1 (fewer retained objects reported) and for Dimension #3 (more overall reports) do not contradict each other. One seems to create the expectation for fewer reports, the other for more.

Future Risk Assessment Measures: We believe reporting to the Commission is a useful step in improving safety in Oregon, and therefore a useful measure of organizational willingness to learn from mistakes. But there are other, more direct tools available that we need to explore. For example, IHI’s "Trigger Tool" might be an innovation worth championing. Or we might find ways to encourage higher-quality root cause analyses by hospitals as they investigate their own adverse events.


Dimension #4: Culture of Safety. Do we have a culture that supports learning and improvement?

Context: A fundamental issue for patient safety is whether organizations can quickly learn from their own mistakes. This requires, among other factors, better teamwork, a willingness to engage in open discussion, and a sense of trust on the part of health care workers that they will not be punished for speaking out or for challenging colleagues.

Measure: The percentage of hospitals that have completed a “culture of safety” survey within the last three years.

Why this measure: Over the past few years hospitals have begun to systematically survey their employees and clinical staff to learn more about the organization’s climate. Using a survey is one very positive way for organizations to address the fundamental willingness of staff to align with and champion patient safety priorities.

Definition/Source: In 2008, the Patient Safety Commission independently fielded a web-based survey of Oregon hospitals asking quality managers three questions:

• Has your hospital completed a ‘culture of safety’ survey within the last three years?

• [if yes] When?

• [If yes] Which culture survey tool did you use?

In 2010 the OAHHS provided similar data to the Commission.

Data limitations: The Commission’s 2008 benchmark was based on a response rate of 53% (30 of 57 hospitals responded to our survey). This relatively low response rate raises questions about the generalizability of the findings. In 2010, the Commission did not attempt to independently validate the findings of the OAHHS. Aside from questions about the accuracy of the data, the critical factor in using a culture survey is whether a hospital acts on the results. This measure does not address that issue.

Future Culture of Safety measure: As noted, the presence/absence of a culture survey represents an important, but limited benchmark of a hospital’s organizational ability to drive quality improvements. In the future, we hope to benchmark data from specific questions within those surveys. In addition we will benchmark progress in complying with the written notification standard. And, we think there is merit in benchmarking the efforts of hospital Boards of Directors to improve patient safety.

Dimension #5: Patient Empowerment. Are patients and consumers playing an active role?

Context: In order for safety efforts to be effective patients must become integral members of their own health care team.

Measure: Number of hospitals that encourage patients to report concerns (derived from the Joint Commission’s 2009 Patient Safety Goal #13).

Why this measure: Provides a way to capture information about a hospital’s willingness to offer patients a venue for suggestions and criticism.


Definition/source: In 2008 the Commission fielded a web-based survey of Oregon hospitals asking quality managers two questions about patient empowerment.

• Do you encourage patients to report patient safety concerns?

• [If yes] Please briefly describe how you do this.

Data limitations: The response rate for the 2008 survey was 53%. So far in 2010 the Commission has not re-run that survey. More fundamentally, this indicator does not directly measure active patient engagement.

Future Patient Empowerment Measures: The Commission will explore the use of HCAHPS consumer survey data. Also, we will attempt to measure the effectiveness of empowerment programs such as Speak Up, especially as such programs might relate to the patient’s role in making sure that hospitals follow basic hand washing protocols.

Dimension #6: Community-wide Connectivity. Are we working to create connected systems of care?

Context: This dimension highlights the need for coordinated care.

Measure: Electronic Medical Record (EMR) Adoption Score, from HIMSS Analytics (a wholly-owned, not-for-profit subsidiary of the Healthcare Information and Management Systems Society). Based on a 0 to 7 scale, as follows:

Stage 7 – Hospital has a paperless EMR environment. Clinical information is readily shared via electronic transactions with all appropriate entities and providers.

Stage 6 – Full physician documentation/charting implemented for at least one patient care area. Full radiology system implemented.

Stage 5 – Closed loop medication administration environment is fully implemented in at least one patient care area.

Stage 4 – Computerized practitioner order entry (CPOE) for use by any clinician is added to the nursing and clinical documentation environment. Second level of clinical decision support implemented.

Stage 3 – Clinical documentation installed. First level of clinical decision support is implemented to conduct error checking with order entry. Some level of medical image access.

Stage 2 – Major ancillary clinical systems feed data to clinical data repository.

Stage 1 – Laboratory, pharmacy and radiology installed.

Stage 0 – Some clinical automation may exist, but Stage 1 not yet met.
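As a hypothetical illustration only (the capability names below are placeholders, not HIMSS survey fields), a hospital’s adoption score can be thought of as the highest stage for which the requirements of that stage and all lower stages are met:

    # Hypothetical sketch: derive an EMR adoption stage from capability flags.
    # Stage requirements are paraphrased; the names are illustrative only.
    STAGE_REQUIREMENTS = {
        1: ["lab_installed", "pharmacy_installed", "radiology_installed"],
        2: ["clinical_data_repository"],
        3: ["clinical_documentation", "first_level_decision_support"],
        4: ["cpoe", "second_level_decision_support"],
        5: ["closed_loop_medication_administration"],
        6: ["full_physician_documentation", "full_radiology_pacs"],
        7: ["paperless_emr", "external_data_sharing"],
    }

    def emr_stage(capabilities):
        """Return the highest stage whose requirements, and those of all lower stages, are met."""
        stage = 0
        for level in sorted(STAGE_REQUIREMENTS):
            if all(req in capabilities for req in STAGE_REQUIREMENTS[level]):
                stage = level
            else:
                break
        return stage

    # A hospital with only Stage 1 and Stage 2 capabilities scores 2.
    print(emr_stage({"lab_installed", "pharmacy_installed", "radiology_installed",
                     "clinical_data_repository"}))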

Why this measure: Having an EMR is a critical component of having a connected healthcare delivery system. With a functioning EMR, hospitals can quickly share clinical information with all necessary providers and with all appropriate entities within a network. A fully integrated EMR also allows hospitals to more actively share health and wellness information with consumers, and it encourages sophisticated mining of data to improve care protocols and decision support tools.

Definition/Source: Data are self-reported to HIMSS using an on-line survey. 2008 benchmark based on responses from 5,073 hospitals across the United States (59 in Oregon). 2009 data based on 5,235 hospitals (62 in Oregon).

Data limitations: Data are self-reported. State scores are bunched together, with little spread. Smaller hospitals will be slower to adopt EMRs, and the data do not adjust for the large proportion of critical access hospitals in Oregon.

Future Connectivity Measures: The HIMSS Analytics measures of EMR adoption are still very new. We need to continue to seek ways to verify the utility of this approach.