CQC’s use of risk analysis: Intelligent Monitoring
Alex Baylis, Head of Acute Sector Policy; Lisa Annaly, Head of Provider Analytics
17th EPSO conference, Porto, Portugal, 8th May 2014
Overview
• An update on CQC’s purpose and strategic changes
• Overview of our new operational model
• Development of intelligent monitoring – progress and learning
Our Purpose and Role
Our purpose: We make sure health and social care services provide people with safe, effective, compassionate, high-quality care, and we encourage care services to improve.
Our role: We monitor, inspect and regulate services to make sure they meet fundamental standards of quality and safety, and we publish what we find, including performance ratings to help people choose care.
Strategy 2013-2016: key changes overall
FROM: Focus on yes/no 'compliance'; a low and unclear bar
TO: Professional, intelligence-based judgements; ratings and clear reports that talk about safe, effective, caring, well-led and responsive care
FROM: 28 regulations, 16 outcomes
TO: Five key questions
FROM: CQC as part of the system, with responsibility for improvement
TO: On the side of people who use services; providers and commissioners clearly responsible for improvement
FROM: Generalist inspectors
TO: Specialists with teams of experts; longer, thorough and people-focused inspections
FROM: Corporate body and registered manager held to account for the quality of care
TO: Individuals at Board level also held to account for the quality of care
CQC’s New Operating Model
Framework for our use of information
Tier 1 indicators:
• Indicators we have prioritised for routine monitoring in Intelligent Monitoring
• Prompt action, which can include a request for further information or an inspection of a site
Tier 2 indicators:
• A wider set of indicators that are examined along with tier 1 to provide "key lines of enquiry" for inspection
• Do not cause regulatory action if a single indicator, or a combination of several indicators, breaches thresholds
Tier 3 indicators:
• "Horizon scanning" to identify future indicators
• Indicators being developed, in association with professional bodies (e.g. Royal Colleges), that are not yet nationally comparable
• Devised/updated through engagement with providers, Royal Colleges, specialist societies, academic institutions and international best practice
Qualitative intelligence:
• Indicators that are available to CQC about a provider across all five domains: safety, effectiveness, caring, responsiveness and well-led
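The tiered framework rests on comparing each monitored indicator against thresholds. As a minimal sketch of that idea, assuming a three-level outcome and fixed cut-offs (the indicator name, threshold values and outcome labels here are invented for illustration; CQC sets thresholds per indicator using its own statistical methodology, not the fixed cut-offs shown):

```python
# Illustrative sketch: flag a provider's indicator value against thresholds.
# The thresholds and example values below are assumptions for illustration,
# not CQC's published cut-offs.

def flag_indicator(value, risk_threshold, elevated_threshold):
    """Classify an indicator observation into one of three outcomes."""
    if value >= elevated_threshold:
        return "elevated risk"
    if value >= risk_threshold:
        return "risk"
    return "no evidence of risk"

# Example: a hypothetical standardised mortality ratio, with invented
# thresholds of 110 (risk) and 120 (elevated risk).
print(flag_indicator(105, 110, 120))  # no evidence of risk
print(flag_indicator(115, 110, 120))  # risk
print(flag_indicator(125, 110, 120))  # elevated risk
```

In this sketch, only a breach of the higher threshold escalates the flag, which mirrors the tiering idea above: a single breach raises a question rather than triggering regulatory action by itself.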
Intelligent monitoring tool (tier 1)
• Uses a prioritised set of indicators, relating to the five key questions, to raise questions about providers
• Focuses on identifying risks; does not analyse the full distribution of performance
• Uses nationally available quantitative and qualitative information sources
• Produces an overall weighted summary for each provider ('banding') to help prioritise inspection activity
• Results of the analysis and banding are available to the public
• Plan to refresh the data every quarter
• Plan to roll out to all of the sectors that we regulate
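The weighted summary banding can be pictured as a simple score over the flagged indicators. A minimal sketch, assuming 'elevated risk' flags weigh double ordinary 'risk' flags and assuming illustrative band cut-offs (the weights and cut-offs are not CQC's published values):

```python
# Illustrative sketch of a weighted risk score and priority banding.
# The weights (risk = 1, elevated risk = 2) and the band cut-offs are
# assumptions for illustration, not CQC's published methodology.

WEIGHTS = {"no evidence of risk": 0, "risk": 1, "elevated risk": 2}

def risk_score(flags):
    """Weighted score as a proportion of the maximum possible score."""
    if not flags:
        return 0.0
    max_score = 2 * len(flags)  # every indicator at 'elevated risk'
    return sum(WEIGHTS[f] for f in flags) / max_score

def priority_band(score, cutoffs=(0.5, 0.4, 0.3, 0.2, 0.1)):
    """Map a score to a band: 1 = highest inspection priority, 6 = lowest."""
    for band, cutoff in enumerate(cutoffs, start=1):
        if score >= cutoff:
            return band
    return 6

flags = ["elevated risk", "risk", "no evidence of risk", "risk"]
score = risk_score(flags)           # (2 + 1 + 0 + 1) / 8 = 0.5
print(score, priority_band(score))  # 0.5 1
```

Normalising by the maximum possible score lets providers with different numbers of applicable indicators be compared on one scale before banding.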
Outcome measures and safety events:
• Mortality indicators
• Readmission rates
• Results from National Clinical Audits
• Patient Reported Outcome Measures
• A&E, cancer and elective waiting times
• Infection rates: MRSA, C. difficile, puerperal infections
• Never events and patient safety incidents
• VTE risk assessment
Information from patients and the public:
• Responses from the National Inpatient Survey
• Friends and Family inpatient scores and responses
• People's experiences shared with CQC
• Feedback left on NHS Choices and Patient Opinion
• Complaints
Information from and about staff:
• Concerns raised by staff to CQC
• Responses from the National Staff Survey
• Staffing indicators: turnover, stability, sickness rates
• Flu vaccination uptake
• Junior doctor satisfaction
Other indicators included: management of discharge from hospital, consistency of reporting and data quality measures, and Monitor and NTDA risk/escalation scores.
Intelligent Monitoring: NHS Acute and Specialist Trusts
Progress:
• Tested prototype – June 2013
• Version 1 published – October 2013
• Version 2 published – March 2014
What have we learned so far?
• Improved communications to providers
• Extended the time for trusts and inspectors to review their reports to two weeks
• Early results show some agreement between priority banding and findings from inspection, but some clear 'false positives' have been identified
• Application of the model to other sectors is in development, leading up to October 2014
• Not proposing to create a 'priority for inspection' banding for sectors where national information is lacking
Thanks for listening
Questions?