
Page 1: Designing a Cross Model Quality Assurance System: Mission Impossible?

Designing a Cross Model Quality Assurance System:

Mission Impossible?

Page 2: Designing a Cross Model Quality Assurance System: Mission Impossible?

Introductions

Julia Heany, Principal Investigator

Cynthia Zagar, MHVQAS Lead, HFA Reviewer

Jennifer Torres, Study Coordinator

Tiffany Kostelec, Home Visiting Unit Manager

Page 3: Designing a Cross Model Quality Assurance System: Mission Impossible?

Agenda for Today

• Provide a broad understanding of the evidence base behind this work
• Discuss the laws relevant to this project
• Share the story of the journey
• Discuss the study and preliminary findings
• Share initial thinking about next steps for making use of the tool moving forward

Page 4: Designing a Cross Model Quality Assurance System: Mission Impossible?

The Research Base: Our Guiding Light


Page 5: Designing a Cross Model Quality Assurance System: Mission Impossible?

Finding clarity: What’s QA & what’s QI & why do we need both?

Quality Assurance: Reviewing performance against a set of standards
  • Determines whether a program meets expectations for quality
  • Uses staff expertise to ensure the program provides quality services
  • Relies on external review and internal monitoring
  • Responds to expectations established by external bodies
  • Expectation of MIECHV grant recipients and of Michigan's Home Visiting Law (PA 291)

Quality Improvement: Adjusting processes to better meet the needs of customers, both internal and external
  • Determines whether adjustments to a program result in improvements
  • Uses staff experience to improve the processes by which the program is implemented
  • Relies on internal teams
  • Responds to needs of customers and stakeholders
  • Expectation of MIECHV grant recipients

Page 6: Designing a Cross Model Quality Assurance System: Mission Impossible?

Current QA & QI Activities

Quality Assurance activities:
  • Model reviews
  • Contractual monitoring
  • Benchmark reporting (under MIECHV)

Quality Improvement activities:
  • MHVI Learning Collaborative
  • Participation in HV Collaborative Innovation and Improvement Network (CoIIN)
  • MHVI LIA, Local, and State CQI projects
  • Training & TA

Gaps in quality assurance:
  • Does not communicate consistent expectations for home visiting program implementation from the State's perspective
  • Does not monitor the quality of home visiting implementation across Michigan's HV system
  • Does not provide a foundation for quality improvement
  • Does not meet legislative or funding requirements

Page 7: Designing a Cross Model Quality Assurance System: Mission Impossible?

The Law and You


Page 8: Designing a Cross Model Quality Assurance System: Mission Impossible?

Building a System that Supports Quality

Quality Assurance expectations for Michigan are established through:

Michigan's Home Visiting Law - PA 291 of 2012:
  • Defines the Home Visiting System as the "infrastructure and programs that support and provide home visitation."
  • Built on the idea that "evidence-based programs have comprehensive home visitation standards that ensure high-quality service delivery and continuous quality improvement..."
  • Requires that state-funded HV programs "operate with fidelity to the program or model."

MIECHV funding requirements:
  • "States must provide a plan for ongoing monitoring of implementation quality"
  • Includes the "State's overall approach to home visiting quality assurance" and the State's approach to "maintaining quality assurance"

Page 9: Designing a Cross Model Quality Assurance System: Mission Impossible?

The Journey of the Development of the Quality System


Page 10: Designing a Cross Model Quality Assurance System: Mission Impossible?

The Shock

Page 11: Designing a Cross Model Quality Assurance System: Mission Impossible?

Building a System that Supports Quality

To build an integrated approach to assuring and improving the quality of Evidence-Based and Promising Home Visiting Programs in Michigan:
  • Formed a workgroup, including model representatives and state partners
  • Explored the evidence base and existing tools
  • Received consultation from national & model experts
  • Identified domains of quality in home visiting implementation
  • Developed cross-model standards and measures under each domain
  • Developed a tool and process for assessing quality
  • Received funding to field test the tool

Page 12: Designing a Cross Model Quality Assurance System: Mission Impossible?

Quality Domains

Selected domains based on indicators of quality and fidelity that were supported by the research and common across models.

Domains include:
  • Recruitment and Enrollment
  • Home Visitor Caseloads
  • Assessment of Family Needs and Referral to Services
  • Dosage and Duration
  • Home Visit Content
  • Staff Qualifications and Supervision
  • Professional Development
  • Organizational Structure and Support

Page 13: Designing a Cross Model Quality Assurance System: Mission Impossible?

Quality Standards and Measures

Page 14: Designing a Cross Model Quality Assurance System: Mission Impossible?

Quality Standards and Measures

Page 15: Designing a Cross Model Quality Assurance System: Mission Impossible?

MHVQAS: Supplementary Materials

Model Specific Guidance: Provides guidance to reviewers for measures where models have specified requirements.

Worksheets: Assist reviewers with reviewing site documentation prior to the site visit, and sites with preparing data for the site review.

Document Checklist: List of self-assessment materials to be sent prior to the site visit.

Document Gathering Questions: Documentation of issues with gathering materials.

Site Visit Document Review: List of materials to have ready on the day of the site visit.

Interview Questions: Questions sites will be asked during the interview portion of the site visit.

Page 16: Designing a Cross Model Quality Assurance System: Mission Impossible?

Quality Report

On-site review summary – rating for each standard


Page 17: Designing a Cross Model Quality Assurance System: Mission Impossible?

Quality Report

• Ratings for each measure, including opportunities for improvement and/or special recognition


Page 18: Designing a Cross Model Quality Assurance System: Mission Impossible?

The Study Framework and Preliminary Findings


Page 19: Designing a Cross Model Quality Assurance System: Mission Impossible?

Study Questions

• Do the tool & procedure produce reliable and valid results?
    • Are the results similar when the tool is completed by different reviewers?
    • Do the tool & procedure really measure implementation quality?
• Can the tool & procedure be applied across models implemented in Michigan?
• What are the costs to the state and to local home visiting programs associated with preparing for and completing the review procedure?
• How can the results of the tool be used to improve quality in home visiting at the local and state levels?

Page 20: Designing a Cross Model Quality Assurance System: Mission Impossible?

Participants

Home visiting implementation experts (n=6)
  • Provide feedback through an online survey on whether the tool captures key concepts in quality

Reviewers (n=5)
  • Conduct site reviews, track time and resources, and complete satisfaction surveys

Home visiting sites (n=8)
  • Participate in site reviews, coordinate home visit observations, provide a recent model review, track time and resources, and complete satisfaction surveys

Page 21: Designing a Cross Model Quality Assurance System: Mission Impossible?

Study Tools

• Michigan's Home Visiting Quality Assurance System Tool
• Implementation Expert Survey
• Home Visit Rating Scale
• Satisfaction Surveys
    • Reviewer satisfaction
    • Site satisfaction
    • 6-month follow-up
• Cost Tracking Tool
    • Reviewer
    • Site

Page 22: Designing a Cross Model Quality Assurance System: Mission Impossible?

Procedures

Training
  • Reviewers
  • LIAs

Scheduling
  • On-site review
  • HOVRS observations

Review Preparation
  • LIA collects documentation
  • Submitted 3 weeks prior to review

On-site Review
  • Documentation
  • Interview
  • Tour
  • HOVRS (completed within a month of the review)

Quality Report
  • Developed through reviewer consensus process
  • Provided to LIA
  • Phone conference to discuss

Cost Tracking
  • Completed throughout the process
  • Submitted online 1 week after review

Surveys
  • LIAs and reviewers: day after the quality report is sent
  • LIAs: 6 months after review
  • Experts
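All of the procedure's deadlines above are offsets from the on-site review date: documentation three weeks before, cost tracking one week after, the follow-up survey six months after. A minimal sketch of how that timeline could be encoded, using a hypothetical review date:

```python
from datetime import date, timedelta

# Hypothetical on-site review date; each deadline in the MHVQAS
# procedure is an offset from this date.
review_date = date(2016, 5, 2)

deadlines = {
    "Site documentation due": review_date - timedelta(weeks=3),
    "Cost tracking submitted online": review_date + timedelta(weeks=1),
    "6-month follow-up survey to LIAs": review_date + timedelta(days=183),
}

# Print the schedule in chronological order.
for task, due in sorted(deadlines.items(), key=lambda item: item[1]):
    print(f"{due.isoformat()}: {task}")
```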

Page 23: Designing a Cross Model Quality Assurance System: Mission Impossible?

Findings

Focus on research question #2: Can the tool and procedure be applied across models implemented in Michigan?
  • To what degree does model review guidance overlap with MHVQAS standards?
  • To what degree were model review findings consistent with MHVQAS findings?
  • Were there areas of the MHVQAS tool where models performed differently?
  • Did models differ in their perceptions of the MHVQAS tool's reliability and validity, or in their satisfaction with the tool?
  • Did models differ in the cost of preparing for and completing the MHVQAS process?
  • What successes and challenges were associated with testing a cross-model set of standards and measures?

Page 24: Designing a Cross Model Quality Assurance System: Mission Impossible?

To what degree does model review guidance overlap with MHVQAS standards?

Columns: MHVQAS, MIHP, HFA, EHS, PAT (X = standard included)

Recruitment and Enrollment
  • Recruitment and Enrollment: X X X X

Caseloads
  • Caseloads: X X X

Assessment of Need and Tailoring of Services
  • Assessment of Need and Referrals: X X X X X
  • Developmental Screening: X X X X X
  • Care Coordination: X
  • Design of Services to Address Needs: X X X X
  • Goal Setting and Visit Planning: X X X X

Page 25: Designing a Cross Model Quality Assurance System: Mission Impossible?

To what degree does model review guidance overlap with MHVQAS standards?

Columns: MHVQAS, MIHP, HFA, EHS, PAT (X = standard included)

Dosage and Duration
  • Dosage: X X X X X
  • Retention: X X X
  • Family Exit and Transition Plans: X X X X

Home Visit Content
  • Use of a Curriculum: X X X X X
  • Nutrition Services: X
  • Documentation of Home Visits: X X X

Family Rights
  • Family Feedback: X X X X X
  • Accommodations: X X
  • Voluntary Services: X
  • Consent: X X

Page 26: Designing a Cross Model Quality Assurance System: Mission Impossible?

To what degree does model review guidance overlap with MHVQAS standards?

Columns: MHVQAS, MIHP, HFA, EHS, PAT (X = standard included)

Staff Qualifications and Supervision
  • Staff Qualifications: X X X X X
  • Supervision: X X X

Professional Development
  • Training: X X X

Organizational Structure and Support
  • Environmental Health & Safety: X
  • Infrastructure: X X X
  • Quality Assurance: X X X X
  • Integration into Service System: X X X
  • Advisory Group: X X X X

Page 27: Designing a Cross Model Quality Assurance System: Mission Impossible?

To what degree do model standards overlap with MHVQAS standards?

• There is not a perfect overlap between the MHVQAS standards and any one model's standards
• The MHVQAS tool does not include any standards that are not included in at least one model

Take Home Message: The MHVQAS standards are comprehensive by design, overlapping with and supplementing each of the other models reviewed.
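These two take-home properties are easy to state precisely. A minimal sketch, using hypothetical standard names and coverage sets rather than the actual review guidance:

```python
# Hypothetical standards and coverage sets, for illustration only.
mhvqas = {"Recruitment and Enrollment", "Caseloads", "Dosage", "Family Feedback"}

model_standards = {
    "MIHP": {"Recruitment and Enrollment", "Dosage", "Family Feedback"},
    "HFA":  {"Caseloads", "Dosage", "Family Feedback"},
    "EHS":  {"Recruitment and Enrollment", "Caseloads", "Dosage"},
    "PAT":  {"Dosage", "Family Feedback"},
}

# No single model's standards overlap perfectly with the MHVQAS set...
assert all(standards != mhvqas for standards in model_standards.values())

# ...but every MHVQAS standard is covered by at least one model.
covered = set.union(*model_standards.values())
assert mhvqas <= covered
```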

Page 28: Designing a Cross Model Quality Assurance System: Mission Impossible?

To what degree were model review findings consistent with MHVQAS findings?

Model review findings were compared with MHVQAS findings to see if the two reviews found consistent areas of strength and opportunities for improvement.
  • Findings were reviewed at the standard level rather than at the measure level

BIG limitations:
  • The reviews were conducted at different times, so there could have been improvement or drift between each review
  • EHS home-based and center-based programs are reviewed against the same standards, so a review comparison was not possible for EHS sites

Page 29: Designing a Cross Model Quality Assurance System: Mission Impossible?

To what degree were model review findings consistent with MHVQAS findings?

Model   Standards that overlapped   Review 1: standards with different findings   Review 2: standards with different findings
MIHP    18                          1                                             2
HFA     20                          2                                             7
PAT     12                          1                                             1

Take Home Message: There were some differences in ratings between reviews, but no standards were systematically different.
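The comparison behind this table is a simple one: for each standard covered by both reviews, count the standards where the two reviews reached different findings. A sketch with made-up findings:

```python
# Hypothetical standard-level findings from the two reviews.
model_review  = {"Dosage": "met", "Caseloads": "met",     "Retention": "not met"}
mhvqas_review = {"Dosage": "met", "Caseloads": "not met", "Retention": "not met"}

# Standards assessed by both reviews.
overlap = model_review.keys() & mhvqas_review.keys()

# Standards where the two reviews disagreed.
different = [s for s in overlap if model_review[s] != mhvqas_review[s]]

print(f"{len(overlap)} overlapping standards, "
      f"{len(different)} with different findings: {different}")
```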

Page 30: Designing a Cross Model Quality Assurance System: Mission Impossible?

Were there areas of the MHVQAS tool where models performed differently?

Scores on each standard were compared by model to identify whether some models performed better on some standards than others.
  • There was not a statistically significant difference between home visiting models in ratings on any of the standards
  • Used a non-parametric test (Kruskal-Wallis H)

Take Home Message: One model is not more likely than any other to perform well on any part of the tool.
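A minimal sketch of that test with scipy and made-up ratings; the Kruskal-Wallis H test suits this comparison because standard ratings are ordinal and the per-model samples are small:

```python
from scipy.stats import kruskal

# Made-up ratings on one standard, grouped by model (ordinal scale).
ratings_by_model = {
    "MIHP": [3, 4, 4],
    "HFA":  [4, 4, 5],
    "EHS":  [3, 3, 4],
    "PAT":  [4, 5, 4],
}

# One sample per model; Kruskal-Wallis compares the rank distributions.
h_stat, p_value = kruskal(*ratings_by_model.values())
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")  # p >= .05: no significant difference
```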

Page 31: Designing a Cross Model Quality Assurance System: Mission Impossible?

Did models differ in their perceptions of the MHVQAS tool’s reliability and validity, or in their satisfaction with the tool?

Perceptions of reliability
  • 13 items on a 1-6 scale focused on whether the tool could be applied consistently (e.g., "The measures assessed are clear")
  • LIA mean = 4.49
  • Differences between models were not statistically significant

Perceptions of validity
  • 3 items on a 1-6 scale focused on whether the tool measures what it intends to measure (e.g., "The standards reflect key drivers of quality in home visiting")
  • LIA mean = 4.44
  • Differences between models were not statistically significant

Page 32: Designing a Cross Model Quality Assurance System: Mission Impossible?

Did models differ in their perceptions of the MHVQAS tool’s reliability and validity, or in their satisfaction with the tool?

Satisfaction
  • 17 items on a 1-6 scale focused on the review process (e.g., "The onsite review process was efficient")
  • LIA mean = 4.81
  • Differences between models were not statistically significant

Take Home Message: Perceptions of the MHVQAS tool's reliability and validity were positive, satisfaction with the process was high, and none of the three differed by model.
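A small sketch of how such scale scores are conventionally computed, assuming each respondent's score is the mean of their item responses; the response values are made up:

```python
from statistics import mean

# Made-up 1-6 responses: one row per LIA respondent, one value per item.
satisfaction_responses = [
    [5, 4, 5, 6, 4],
    [4, 5, 5, 5, 5],
    [6, 5, 4, 5, 4],
]

# Each respondent's scale score, then the reported group mean.
scale_scores = [mean(row) for row in satisfaction_responses]
print(f"LIA mean = {mean(scale_scores):.2f}")
```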

Page 33: Designing a Cross Model Quality Assurance System: Mission Impossible?

Did models differ in the cost of preparing for and completing the MHVQAS process?

Site costs of preparing for and conducting the review:
  • Mean = $2,722
  • Differences between models were not statistically significant

Cost of Preparing for and Completing the Process by Site (Total n=62):

  MIHP 1: $2,685    MIHP 2: $4,204
  HFA 1:  $2,958    HFA 2:  $2,869
  EHS 1:  $2,291    EHS 2:  $1,166
  PAT 1:  $3,052    PAT 2:  $2,553
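The per-site figures above are enough to reproduce the reported overall mean. A short sketch using those values:

```python
from statistics import mean

# Per-site costs from the chart above (two sites per model).
costs = {
    "MIHP": [2685, 4204],
    "HFA":  [2958, 2869],
    "EHS":  [2291, 1166],
    "PAT":  [3052, 2553],
}

# Flatten across models for the overall mean.
all_sites = [c for sites in costs.values() for c in sites]
print(f"Overall mean: ${mean(all_sites):,.0f}")  # $2,722, matching the report
for model, sites in costs.items():
    print(f"{model} mean: ${mean(sites):,.0f}")
```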

Page 34: Designing a Cross Model Quality Assurance System: Mission Impossible?

Did models differ in the cost of preparing for and completing the MHVQAS process?

Most of the cost was in staff time spent preparing for the review.
  • Home visitors spent the fewest hours (mean = 6.05)
  • Supervisors (mean = 22.83) and directors (mean = 23) spent the most hours
  • Differences by model were not statistically significant

Mean Hours Staff Spent on the Review Process (Total n=68):

          Mean hours preparing per staff   Mean hours day of review
  Total   10.66                            2.43
  MIHP    12.79                            4.46
  HFA     13.24                            1.83
  EHS     8.88                             2.08
  PAT     8.38                             1.87

Page 35: Designing a Cross Model Quality Assurance System: Mission Impossible?

Did models differ in the cost of preparing for and completing the MHVQAS process?

There was a broad range in the amount of time LIAs spent on the total process.
  • Mean hours per LIA = 117.32, min = 76.00, max = 160.55
  • Differences between models were not statistically significant

Take Home Message: On average, sites spent 117 total staff hours on the process at an average cost of $2,722 per site; while some sites spent more time than others, time and money spent did not differ by model.

Page 36: Designing a Cross Model Quality Assurance System: Mission Impossible?

What opportunities for making the MHVQAS work better across models were identified?


Scope:
  • Balance achieving a comprehensive view of quality against the burden of documentation
  • Specify a requirement for each MHVQAS measure so that expectations are clear for models without a requirement under a particular measure

Translation:
  • Illustrate connections between MHVQAS standards and model standards
  • Identify alignment between MHVQAS documentation requirements and program documents
  • Clarify areas of the tool where there was more misalignment with model requirements (e.g., advisory groups, tracking referrals, timeframes for data)

Page 37: Designing a Cross Model Quality Assurance System: Mission Impossible?

Use of the QA System Moving Forward


Page 38: Designing a Cross Model Quality Assurance System: Mission Impossible?

Michigan Home Visiting Initiative

Once validation of the tool is achieved:
  • MHVI will incorporate this process into sub-recipient monitoring of MHVI grantees to assist in assuring model fidelity as required by both MIECHV and PA 291 of 2012.
  • This allows MHVI to provide documentation of ongoing monitoring, which can be shared with EBHV Model Representatives.
  • LIAs will be trained on the process to ensure they understand all components.
  • MHVI will work with state partners to spread the use of the tool across models and funding streams.

Page 39: Designing a Cross Model Quality Assurance System: Mission Impossible?

Questions and Wrap-up

Page 40: Designing a Cross Model Quality Assurance System: Mission Impossible?

Contact Information