
Page 1: BIG Dot, Little Dot

BIG Dot, Little Dot:

Defining Quality

Michael Heenan, BA, MBA
Haajra Khan, BBA, MBA

Dorothy Binkley, BSc, MBA Candidate
Quality & Mission Committee of the Board

June 2009

Page 2: BIG Dot, Little Dot

Presentation Overview

• Background: How We Got Here
• Methodology
• Defining Quality
• Big Dots: The Literature
• Big Dots: Specific Indicators or Categories?
• Connecting the Dots: SJHH Examples
• Proposed SJHH Process

Page 3: BIG Dot, Little Dot

How We Got Here: Board-Exec-MAC Retreat

Page 4: BIG Dot, Little Dot

Adoption of IHI-Orlikoff Approach

• Boards adopt Definition of Quality
• Boards adopt Big Dots
• Boards frame data in Patient Credo
  – Don’t Hurt Me
  – Heal Me
  – Be Nice to Me

• SJHH Work Plan Adopted by Governance

Page 5: BIG Dot, Little Dot

Methodology

• Literature review
  – Institute for Healthcare Improvement white papers
  – Quality and Safety in Health Care journal
  – Ontario Hospital Association
  – Canadian Institute for Health Information

• Analysis of existing quality definitions:
  – Ontario Teaching Hospitals
    • Toronto General
    • Ottawa Hospital
  – US Teaching Hospitals and other Healthcare Institutions
    • Beth Israel Deaconess (USA)
    • Kaiser Permanente (USA)
    • Institute of Medicine (USA)

Page 6: BIG Dot, Little Dot

Defining Quality: Policy/Ministry Level Definitions vs. Provider Level Definitions

Page 7: BIG Dot, Little Dot

Why we need to ‘Define’ Quality

The problem with the quality business has always been the lurking impression that we're talking about varying degrees of "goodness." In the secular world, people refer to "high-quality" restaurants and "low-quality" products and everyone pretends to know what that means.

But those of us who have to make quality happen must have a definition that's manageable and measurable. "Goodness" is neither.

– Philip Crosby

Presenter Notes:
Why do we need a definition for quality? The problem is that quality is considered to be varying degrees of goodness. In the words of quality guru Philip Crosby, "those of us who have to make quality happen must have a definition that's manageable and measurable. 'Goodness' is neither." If we are on the path to providing high-quality healthcare, we MUST define what quality means, because we cannot achieve our goals if we do not know what we are aiming for.
Page 8: BIG Dot, Little Dot

What is a definition?

A statement expressing the essential nature of something.

• Variation across Industries
  – Products
  – Services

• Variation in Policy/Ministry and Provider Definitions

Presenter Notes:
Before moving towards creating a definition for quality, let us look at what a definition is: "a statement expressing the essential nature of something." To express the essential nature of quality, then, we must identify the dimensions that create quality. In Philip Crosby's words, quality cannot be defined by varying degrees of "goodness." Yet we cannot have one generic definition across industries: good quality in a pen does not mean the same thing as good quality in a car, which does not mean the same thing as good quality in healthcare. Even within an industry such as healthcare, definitions vary. We are bound to see differences as we move from a systems perspective to a provider perspective, because the patient's interaction with each is different in nature.
Page 9: BIG Dot, Little Dot

What others say—Policy/Ministry Level Definitions

Systems definitions span the dimensions: Safe, Timely, Effective, Efficient, Equitable, Patient-Centred, Capacity, Outcomes, Value

[Table: dimensions covered by each of the following organizations]
• Institute of Medicine (USA)
• Agency for Healthcare Research & Quality (USA)
• U.S. Department of Health & Human Services (USA)
• Care Quality Commission (2009) (UK)
• NHS Quality Improvement (Scotland)

Presenter Notes:
To understand the essential nature of quality as it pertains to the healthcare industry, we looked at various systems definitions as well as provider definitions, and identified the dimensions each includes. One of the most commonly cited definitions of healthcare quality is the one created by the Institute of Medicine. The IoM defines six dimensions of healthcare quality (STEEEP):
• Safe: patients should not be harmed by the care that is intended to help them
• Timely: unnecessary waits and harmful delays should be reduced
• Effective: care should be based on sound scientific knowledge
• Efficient: care should not be wasteful
• Equitable: quality should not vary because of patient characteristics
• Patient-Centred: care should be responsive to individual preferences, needs, and values
Most organizations use this as the default definition and modify it for their own purposes. For example, as a public system, NHS Scotland added the dimension of capacity. Our research showed that not many institutions have defined quality; even the IHI does not have a definition for quality.
Page 10: BIG Dot, Little Dot

Examples of Policy/Ministry Level Definitions

• Institute of Medicine:
  – The degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge

Page 11: BIG Dot, Little Dot

What others say—Provider Definitions

Provider definitions span the dimensions: Safe, Timely, Effective, Efficient, Equitable, Patient-Centred, Environment, Outcomes

[Table: dimensions covered by each of the following organizations]
• University of California San Francisco Medical Center
• Ottawa Hospital
• Massachusetts General Hospital
• Grand River Hospital

Mayo Clinic: "At Mayo Clinic, quality is not just a simple measure. Quality is a comprehensive look at all aspects of a patient's experience…Quality at Mayo Clinic involves the totality of a patient's experience — from the first phone call to the last appointment."

Presenter Notes:
Looking at provider definitions, we see that they are less comprehensive. Even though there is variation in what is included in the definition of quality, the three dimensions we see across the board are Safe, Equitable, and Patient-Centred.
Page 12: BIG Dot, Little Dot

Examples of Provider Definitions

• Ottawa Hospital:
  – Quality is providing the patient with appropriate, consistent health care in a clean and safe environment in which the patient is treated with respect

• University of California San Francisco:
  – At UCSF, we define quality as:
    • Superior care and outcomes
    • Outstanding patient safety
    • Excellent service and patient satisfaction

Page 13: BIG Dot, Little Dot

Moving towards an SJHH Definition

• At SJHH, quality means…?
  – Safety
  – Patient Centredness
  – Equity
  – The environment
  – Respect/Dignity
  – Best possible outcome

Presenter Notes:
Moving forward, we need to define for ourselves what the dimensions of quality healthcare at St. Joseph's Healthcare Hamilton will be. Will it be safety, patient centredness, equity: one of these, or all of these? What will we be promising our patients when we say we will provide them with high-quality healthcare?
Page 14: BIG Dot, Little Dot

Connecting the Dots

How do we operationalize the big dots?

An operational definition is a procedure agreed upon for the translation of a concept into measurement of some kind.

– W. Edwards Deming

Presenter Notes:
Once we have a definition of quality, that definition will itself help drive our quality agenda. In short, we will move from an abstract idea of "good quality" to a tangible definition of "quality at St. Joseph's" to concrete measures and indicators in the form of our Big Dots.
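To make Deming's point concrete, here is a minimal Python sketch of an operational definition: the abstract concept "timely emergency care" is translated into an agreed measurement procedure. The four-hour threshold and the data shape are illustrative assumptions, not SJHH measures.

```python
from datetime import datetime

# Hypothetical operational definition of "timely emergency care":
# the share of ED visits whose total length of stay is under 4 hours.
# Threshold and field layout are illustrative assumptions.
def percent_timely(visits, threshold_hours=4.0):
    """visits: list of (arrival, departure) datetime pairs."""
    timely = sum(
        1 for arrival, departure in visits
        if (departure - arrival).total_seconds() / 3600 < threshold_hours
    )
    return 100.0 * timely / len(visits)

visits = [
    (datetime(2009, 6, 1, 9, 0), datetime(2009, 6, 1, 12, 30)),  # 3.5 h stay
    (datetime(2009, 6, 1, 10, 0), datetime(2009, 6, 1, 16, 0)),  # 6.0 h stay
]
print(f"{percent_timely(visits):.0f}% of visits met the threshold")  # 50%
```

The point is not the code itself but the agreement it encodes: once the procedure is fixed, two observers measuring "timeliness" will get the same number.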
Page 15: BIG Dot, Little Dot

Big Dots: The Literature

Page 16: BIG Dot, Little Dot

What are Big Dots?

• Whole-system measures used to evaluate overall organizational performance and the effectiveness of strategies

• Core processes or functions that patients expect from the organization

• Do not replace the Mission Excellence Scorecard

[Diagram: the six IOM dimensions (Safety, Effectiveness, Patient-centeredness, Timeliness, Efficiency, Equity) arranged around patient expectations]

Source: Institute of Medicine

Presenter Notes:
The Institute of Medicine's six dimensions of patient expectations are represented in the diagram. According to the IOM, they represent the core processes or functions that patients expect from the organization. The IOM recommends emphasizing these dimensions in the selection of system-level indicators.
Page 17: BIG Dot, Little Dot

Big Dot Approaches: Specific Indicators or Categories

Page 18: BIG Dot, Little Dot

If it targets a specific program, it’s not a Big Dot

• Big Dots reflect the overall quality of the healthcare system

• System-level is not disease or program specific

• Trustees must tackle system-level aims rather than individual projects

• Improving quality across the system cannot be achieved by creating one single “island of excellence”

One of the best ways for the board to drive quality and safety is to watch its dots, by continuously asking the question "Are we getting better – are we on pace to achieve our aims?"

– James L. Reinertsen, IHI Senior Fellow

Presenter Notes:
Big Dots reflect the overall quality of the system, rather than specific programs. They enable the Board to improve quality across the organization without creating "islands of excellence," where resources are concentrated in specific projects to the detriment of others. The quote by James Reinertsen speaks to the use of Big Dots by the Board.
Page 19: BIG Dot, Little Dot

Big Dot Criteria

1. Institution-wide (not program specific)

2. Outcome driven (not a process indicator)

3. Connect to other “little” dots or processes (multifaceted)

4. Reflect the organization’s strategic priorities

5. Reflect the organization’s quality definition

Dot Assumption: Data can or will be collected regularly

Presenter Notes:
These Big Dot criteria will facilitate the selection of appropriate indicators. Criterion 2 means the indicator must speak to results or effects, rather than compliance with a procedure or practice standard. Criterion 3 (multifaceted, i.e., many-sided and comprehensive): Little Dots "operationalize" Big Dots, because a Big Dot is too large an indicator to run an improvement effort against; a Big Dot is moved or changed by running a variety of projects at the Little Dot level. The Dot assumption highlights that data can or will be collected regularly for the indicator, so that performance can be measured over time.
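As an illustration of how these criteria might be applied in practice, the short Python sketch below screens candidate indicators with an all-or-nothing checklist. The candidates and flag names are invented for the example; this is not an actual SJHH tool.

```python
# Illustrative Big Dot screen: a candidate qualifies only if it meets
# all five criteria plus the data-availability assumption.
CRITERIA = [
    "institution_wide",         # 1. not program specific
    "outcome_driven",           # 2. not a process indicator
    "connects_to_little_dots",  # 3. multifaceted
    "reflects_strategy",        # 4. strategic priorities
    "reflects_quality_defn",    # 5. quality definition
    "data_collected_regularly", # Dot assumption
]

def is_big_dot(candidate: dict) -> bool:
    return all(candidate.get(c, False) for c in CRITERIA)

hsmr = dict.fromkeys(CRITERIA, True)
cataract_wait = {**dict.fromkeys(CRITERIA, True),
                 "institution_wide": False}  # program specific

print(is_big_dot(hsmr))           # True
print(is_big_dot(cataract_wait))  # False: a Little Dot at best
```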
Page 20: BIG Dot, Little Dot

Big Dot Approaches

• Themed Categories (Patient Credo): Don’t Hurt Me, Heal Me, Be Nice To Me

• Clinical Categories (McLeod Health, South Carolina): Mortality, Readmission, Complications

• Strategic Categories: Patient Safety, Financial Stewardship, Patient Flow, Mission Excellence

Presenter Notes:
Currently, the literature does not recommend any particular indicators, but there is a movement toward using categories. Big Dot categories are used to frame the Big Dot indicators.
Page 21: BIG Dot, Little Dot

Connecting the Dots: How a Big Dot Leads to Little Dots

Page 22: BIG Dot, Little Dot

How to connect the dots?

• Each Big Dot can be broken down into Little Dots

• Little Dots are defined and measurable process indicators, which help to operationalize Big Dots

• Little Dots are attached to the organization’s improvement projects

[Diagram: Big Dots (outcomes) linked to Little Dots (processes/projects)]

Presenter Notes:
Big Dots are "whole-system measures" that reflect the overall quality of the healthcare system, and all other, smaller measures flow to and from them. If the system is performing well at the highest level of aggregation, then it is likely performing well at the lower levels.
Page 23: BIG Dot, Little Dot

The Big Dot Cascade

[Diagram: the cascade of responsibility, with Little Dot reporting measures flowing up to the Big Dots]

• Board: Big Dots
• Exec Team and MAC/PAC: Little Dot reporting measures
• 25 Clinical Programs (bedside): Standard Scorecard Kit & Non-Clinical Scorecards

Presenter Notes:
The diagram represents the cascade of responsibility within the organization, from Big Dots down to Little Dots. At the top of the diagram is the Board, which reviews the system-level measures. Moving down the diagram, management is responsible for overseeing the Little Dots and their associated projects or processes, which are executed at the front line.
Page 24: BIG Dot, Little Dot

Example

Board (Big Dot) → Executive/MAC/PAC (Little Dots)

• Hospital Standardized Mortality Ratio: Infections, Medication Error, Falls
• Emergency Department Wait Times: Time to lab results, Time to DI results, ALC patients
• Margin: Volumes, Bed turns, Sick time

Presenter Notes:
You will recall my description of Big Dot categories, such as the Patient Credo. These are not included in this example, because it is intended to show the types of Little Dot indicators found under a Big Dot indicator and which group is responsible for their oversight. At the clinical program level, the Little Dots are represented by projects, which help translate the indicator into something that speaks to clinicians.
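In data terms, this cascade is just a mapping from each Board-level Big Dot to its Little Dot process indicators. The minimal Python sketch below mirrors the example above; the structure is illustrative only, not an SJHH system.

```python
# Big Dot (Board-level outcome) -> Little Dots (Exec/MAC/PAC process
# indicators), mirroring the example slide above.
DOTS = {
    "Hospital Standardized Mortality Ratio":
        ["Infections", "Medication error", "Falls"],
    "Emergency Department wait times":
        ["Time to lab results", "Time to DI results", "ALC patients"],
    "Margin":
        ["Volumes", "Bed turns", "Sick time"],
}

for big_dot, little_dots in DOTS.items():
    print(f"{big_dot}: {', '.join(little_dots)}")
```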
Page 25: BIG Dot, Little Dot

Big Dot, Little Dot: How the SJHH Board has used dots to date

Page 26: BIG Dot, Little Dot

Big Dot Dynamics

St. Joseph's Healthcare Hamilton Mission Excellence Scorecard
PERIOD: FY 2008-09 Q1 (April-June 2008)
All indicators shown as Previous Q / Current Q / Target.

Service and Mission Excellence (Living our CARE commitment through Compassion, Attitude, Responsiveness, and Excellence)

Patient Satisfaction (Overall Care Received)
• Patient Satisfaction - Acute Care: 91.1% / 94.3% / 93.4%
• Patient Satisfaction - Surgical Care: 91.9% / 95.7% / 93.4%
• Patient Satisfaction - Emergency Care: 79.2% / 73.8% / 82.4%

Wait Times (in days)
• Cancer Surgery (see graphs for detailed information): 75 / 74 / 84
• Cataract Surgery: 104 / 93 / 182
• Hip Replacement: 168 / 87 / 182
• Knee Replacement: 248 / 137 / 182
• MRI: 149 / 74 / 28
• CT Scans: 28 / 29 / 28

Emergency Department
• Left without being seen: 5.2% / 4.9% / 2.0%
• ER LOS less than 8 hours (CTAS Levels I, II): 55.9% / 57.6% / 60.9%
• ER LOS less than 6 hours (CTAS Level III): 59.9% / 63.0% / 63.0%
• ER LOS less than 4 hours (CTAS Levels IV, V): 58.9% / 66.7% / 64.1%
• Wait time to inpatient bed (admitted patients): 10.1 / 10.4 / 6.0
• % of patients with ED LOS beyond 24 hrs: 5.3% / 4.7% / 2.0%

Research
• Total Research Funding: $19,819,214.00 / - / TBD
• Percentage of External Peer-Reviewed Funding: 54.00% / 0.00% / TBD
• Research Staff Repatriated to Campus: 4 / 0 / 26

Excellence in Patient Care (Providing improved access to safe and high quality care through innovation and evidence based practice)

Patient Access & Quality
• Volumes: Inpatient Cases: 3,612 / 3,705 / 3,626
• Volumes: Day Surgery Cases: 9,238 / 10,246 / 8,894
• Volumes: Emergency Department (ED) Visits: 26,120 / 27,623 / 25,667
• Acute Average Length of Stay (LOS): 4.80 / 4.70 / 5.00
• Total Average Length of Stay (ALOS): 6.2 / 6.1 / 5
• Readmission Rate: 3.5% / 3.3% / 3.3%
• Average Resource Intensity Weight: 1.66 / 1.67 / 1.91
• Number of ALC Equivalent Beds: 98 / 107 / 88

Patient Safety
• Hospital Standardized Mortality Ratio (HSMR): 98 / 66 / 76
• Ventilator-Associated Pneumonia Rates: 7.77 / 7.34 / 15
• CCRT - Rate of Inpatient Codes per 1,000 Admissions: 2.48 / 4.83 / 5.00
• Infection Rate - MRSA: 0.40 / 0.50 / 0.70
• Infection Rate - VRE: 0.90 / 2.10 / 0.15
• Infection Rate - C. difficile: 0.50 / 0.10 / 0.77
• Central Line Infection Rates: 3.2 / 1.4 / 6.9
• Surgical Site Infection Rates: 1.45% / 0.00% / 1.45%
• % of chronic patients with new stage 2 or greater skin ulcers: 8.5% / 0.0% / 8.6%
• Number of Reported Patient Incidents: 572 / 523 / 500

Mental Health
• Acute Inpatient Volumes: 438 / 436 / 430
• Acute Average Length of Stay (LOS): 15.97 / 15.51 / 16.00
• Acute Readmission Rate: 4.38% / 6.77% / 7.50%
• Specialized MH Inpatient Volumes: 168 / 189 / 198
• Specialized MH Average LOS: 80.53 / 79.85 / 72.00
• Specialized MH Average LOS (excl. Forensics): 80.46 / 78.43 / 72.00
• Number of Physical/Chemical Restraints and Seclusions: 25/208/88 (Prev Q), 682/75/368 (Current Q), target TBD

Financial Health (Providing excellence in care through sound fiscal management)
• Total Margin per GAAP: -0.15% / -3.34% / -2.36%
• Revenue: $118,612,020 / $104,246,488 / $103,381,095
• Expenses: $118,784,428 / $107,733,346 / $105,815,728
• Current Ratio: 0.25 / 0.18 / 0.23
• Total Margin (per Hospital Operations): -1.33% / -2.22% / 0.00%

Work Life and Learning (Promoting a healthy workplace environment, employee engagement, and continuous learning)
• Average Sick Days per Full-Time Employee: 4.10 / 3.98 / 2.59
• Turnover Rate: 1.70% / 2.28% / 2.00%
• HAA Target: % of Full-Time Nurses: 73.5% / 72.3% / 70.0%
• Overall Average Vacancy Rate - Nursing: 7.81% / 9.04% / 0.00%
• Number of Employee Incident Reports: 247 / 260 / 0

SJHH is dedicated to providing compassionate, sensitive care to our patients and their families, and to achieving clinical, research, and academic excellence in health care through integrated health services and an ongoing commitment to education and research.
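The scorecard mixes indicators where higher is better (satisfaction, volumes) with ones where lower is better (infections, HSMR). As a hedged illustration of how such a report might be checked automatically, the Python sketch below flags current-quarter values that miss target; the direction field is an assumption I introduce, and the three sample rows are taken from the scorecard above. This is a sketch, not an SJHH system.

```python
# Flag indicators whose current-quarter value misses target.
# "direction" says whether higher or lower values are better.
INDICATORS = [
    # (name, current_q, target, direction)
    ("Patient Satisfaction - Emergency Care", 73.8, 82.4, "higher"),
    ("Hospital Standardized Mortality Ratio", 66.0, 76.0, "lower"),
    ("Infection Rate - VRE", 2.10, 0.15, "lower"),
]

def off_target(current, target, direction):
    return current < target if direction == "higher" else current > target

for name, current, target, direction in INDICATORS:
    status = "OFF TARGET" if off_target(current, target, direction) else "on target"
    print(f"{name}: {current} vs target {target} -> {status}")
```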

Page 27: BIG Dot, Little Dot

HSMR – Board Quality Example

• Jan 2008: HSMR score of 88 reported to Board
  – Question: Why is the target 76 if the benchmark is 100?
  – Answer: Top quartile
  – Question: How do we get there?

• Mar 2008: Exec & MAC discussions

• Spring 2008: Internal analysis
  – QPPIP examines data by death type and unit location, and researches other peers across the globe
  – QPPIP findings: sepsis and safety campaigns, including infections and hand-washing, are key to lowering HSMR
  – Number 1 cause of death at SJHH: sepsis

• Fall 2008: MAC & Program discussions

• Jan 2009: Action: Sepsis Management Project in ER & ICU
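For context on the numbers in this example: HSMR is conventionally the ratio of observed deaths to expected (risk-adjusted) deaths, scaled so that the benchmark population scores 100; a score below 100 means fewer deaths than expected. The minimal Python sketch below shows the arithmetic. The case counts are invented for illustration, chosen only to reproduce the score of 88 reported to the Board.

```python
# HSMR = 100 * observed deaths / expected deaths, where "expected"
# comes from a risk-adjusted reference population (benchmark = 100).
def hsmr(observed_deaths: int, expected_deaths: float) -> float:
    return 100.0 * observed_deaths / expected_deaths

score = hsmr(observed_deaths=440, expected_deaths=500.0)  # invented counts
print(f"HSMR = {score:.0f}")  # 88, the score first reported to the Board
print(score <= 76)            # False: still above the top-quartile target
```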

Page 28: BIG Dot, Little Dot

Proposed Process: Selection, Definition & Targets for our Big Dots

Page 29: BIG Dot, Little Dot

Big Dot Definition, Selection and Target Timeline 2009 (DRAFT)

• May: Literature Review (QPPIP Research)
• June-August: Initial Consultation (Board-Exec-MAC-PAC Working Group)
• September: Organizational Approval (Executive, MAC, PAC)
• Oct 15: Board Approval

Page 30: BIG Dot, Little Dot

Board Mission Excellence Scorecard
Mission Excellence Scorecard: Big Dot Report
PERIOD: FY 2008-09 Q3 (Oct-Nov-Dec 2008)

Columns: Indicators | Previous Q | Current Q | Target | Target Type

Categories:
• DON'T HURT ME (Infection Rates)
• DON'T HURT ME (Mortality & Incidents)
• HEAL ME (Access & Quality)
• BE NICE TO ME (Patient Satisfaction)

Target Types: A - Internal; B - Benchmark; C - Ministry; D - Peer

** Indicators publicly reported by the Ministry of Health & Long-Term Care. Indicators are graphed; raw case counts & definitions attached with each graph.

Page 31: BIG Dot, Little Dot

References

Baker, M.A., Corbett, A., and J.L. Reinertsen. 2008. "Quality and patient safety: Understanding the role of the board." Governance Centre of Excellence and the Ontario Hospital Association. <http://www.oha.com/client/oha/oha_lp4w_lnd_webstation.nsf/page/Publications+for+Sale>

Beasley, C. 2003. "Leading Quality: Thoughts for Trustees and Other Leaders." Institute for Healthcare Improvement, April 9, 2003. <http://www.vahhs.org/EventDocs/Leading%20Quality.ppt>

Campbell, S.M., Braspenning, J., Hutchinson, A., and M. Marshall. 2002. "Research methods used in developing and applying quality indicators in primary care." Quality and Safety in Health Care 11: 358-364.

Care Quality Commission (2009), England. Retrieved April 9, 2009. <http://www.cqc.org.uk/_db/_documents/About_CQC.pdf>

Institute for Healthcare Improvement. 2003. Move Your Dot™: Measuring, Evaluating, and Reducing Hospital Mortality Rates (Part 1). IHI Innovation Series white paper. Boston: Institute for Healthcare Improvement.

Mayo Clinic website (2009). Retrieved April 8, 2009. <http://www.mayoclinic.org/quality/>

NHS Quality Improvement Scotland. 2008/09-2010/11 Delivery Plan. Retrieved April 9, 2009. <http://www.nhshealthquality.org/nhsqis/files/NHSQIS_DeliveryPlan_200809-201011.pdf>

Peters, W.A. 2008. "Ten Elements for Creating Solid Health Care Indicators." Quality Digest, August 30, 2008. <http://www.qualitydigest.com/magazine/2008/sep/article/ten-elements-creating-solid-health-care-indicators.html>

Pugh, M., and J.L. Reinertsen. 2007. "Reducing harm to patients." Healthcare Executive 22(6): 62, 64-65.

Pulcins, I. 2008. "The State of Big Dot Performance Measures in Canada." Canadian Institute for Health Information, October 14, 2008. <http://www.qhn.ca/events/OH_Patientsafety_CIHI.ppt>

QReview. 2007. "RHA boards must lead the charge on improving quality, safety: Reinertsen." Health Quality Council, April 2007. <https://www.hqc.sk.ca/portal.jsp;jsessionid=5lh532fh11?q45RfFNvzgvL+j+W57aw9TBIzBf0QfLQkUwK4QBZaJsNclR3Gq3AV1VvI5thiwzu>

VHA Georgia. 2006. How Do Hospitals Become "Best in Class?" Material originally presented by senior fellows from the Institute for Healthcare Improvement at a hospital CEO Summit on Quality, February 9-10, in Georgia.

Page 32: BIG Dot, Little Dot

Contact Information

Michael Heenan, BA, MBA
Director, Quality Planning & Performance Improvement

Haajra Khan, BBA, MBA
Performance Improvement Consultant

Dorothy Binkley, BSc, MBA Candidate
Quality Associate

St. Joseph’s Healthcare Hamilton
Tel: (905) 522-1155
Fax: (905) 308-7221
Email: [email protected]