Using Standardized Patients for Competency Assessment and Training of BHCs - Part Two
Neftali Serrano, PsyD, Suzanne Daub, LCSW, Travis Cos, PhD, Jeff Reiter, PhD,
Health Federation of Philadelphia
Collaborative Family Healthcare Association 17th Annual Conference
October 15-17, 2015 • Portland, Oregon, U.S.A.
Session #F4b, October 17, 2015
Faculty Disclosure

I/We currently have or have had the following relevant financial relationships (in any amount) during the past 12 months:
• Neftali Serrano, Lead Consultant, primarycareshrink.com
• Jeff Reiter, Consulting Associate, primarycareshrink.com
Learning Objectives
At the conclusion of this session, the participant will be able to:
• Describe the Standardized Patient (SP) methodology for assessing BHC core competencies.
• Identify the BHC competencies that were assessed and list the application of findings for quality improvement initiatives.
• Describe a variety of ways that this methodology can be replicated with or without access to an SP lab or grant funding.
Bibliography / References
• Rossetti J, Musker K, Smyth S, Byrne E, Maney C, Selig K, Jones-Bendel T. Creating a simulated mental health ward: lessons learned. J Psychosoc Nurs Ment Health Serv. 2014 Sep 12:1-7. doi: 10.3928/02793695-20140903-02.
• Matthias MS, Fukui S, Kukla M, Eliacin J, Bonfils KA, Firmin RL, Oles SK, Adams EL, Collins LA, Salyers MP. Consumer and relationship factors associated with shared decision making in mental health consultations. Psychiatr Serv. 2014 Sep 15. doi: 10.1176/appi.ps.201300563.
• Soklaridis S, Hunter JJ, Ravitz P. Twelve tips for asking and responding to difficult questions during a challenging clinical encounter. Med Teach. 2014 Sep;36(9):769-74. doi: 10.3109/0142159X.2014.916782. Epub 2014 Jul 14.
• Shochet R, King J, Levine R, Clever S, Wright S. 'Thinking on my feet': an improvisation course to enhance students' confidence and responsiveness in the medical interview. Educ Prim Care. 2013 Feb;24(2):119-24.
• Ravitz P, Lancee WJ, Lawson A, Maunder R, Hunter JJ, Leszcz M, McNaughton N, Pain C. Improving physician-patient communication through coaching of simulated encounters. Acad Psychiatry. 2013 Mar 1;37(2):87-93. doi: 10.1176/appi.ap.11070138.
Learning Assessment
• A learning assessment is required for CE credit.
• A question and answer period will be conducted at the end of this presentation.
Rationale
• Description of network: The Health Federation of Philadelphia is a regional network of community health centers
• Over 40 Behavioral Health Consultants across 20 organizations/33 sites participate in a community of practice
• Need for meaningful standardization of competencies
Project History
2014
• Partnerships developed between HFP, the Philadelphia College of Osteopathic Medicine (PCOM), and the Thomas Scattergood Foundation to enable the project
• 21 BHCs observed in the PCOM Standardized Patient (SP) lab by Neftali Serrano and Suzanne Daub
SP Project Structure
Pre-work
• Developed standardized patient cases
• Developed rating tool (revised 2015)
• Trained raters to establish inter-rater reliability (2015)
• PCOM trained standardized patients
Simulation Protocol
• Occurred at the PCOM lab over two days
• 3 actors: a PCP and 2 patients (1 male, 1 female)
• 2 raters
• Volunteer BHCs
• BHCs provided the rating scale ahead of time
• Oriented to flow
• Simulations were observed and recorded
Tools Used/ Created
• Case
• Mock medical record
• BHC Rating Scale
• Working Alliance Inventory (WAI) SP and self-rating tool
• Documentation template
• Post-simulation BHC feedback survey
2014 Key Findings
Positive:
• Experience was useful for competency development
• Patient feedback will help shape future practice
• Rater feedback targeted clinical skills for ongoing improvement

Negative:
• Poor inter-rater reliability
• The pass/fail format of the 2014 tool likely polarized the ratings (raters could not provide nuanced scores), and potentially "biased" raters may also have contributed to the inter-rater reliability problems

Limitations:
• Small sample size of BHCs
• Simulation may be best suited to non-novice BHCs
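The inter-rater reliability problem above can be quantified. A common statistic for two raters making pass/fail judgments is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch in Python, using hypothetical ratings rather than the project's actual data:

```python
# Cohen's kappa for two raters' pass/fail calls on the same encounters.
# All ratings below are hypothetical illustrations, not the project's data.

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same set of subjects."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of subjects both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal pass/fail rates.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pass/fail calls for 10 simulated encounters.
rater1 = ["pass", "pass", "fail", "pass", "fail",
          "pass", "pass", "fail", "pass", "pass"]
rater2 = ["pass", "fail", "fail", "pass", "fail",
          "pass", "fail", "fail", "pass", "pass"]
print(round(cohens_kappa(rater1, rater2), 3))  # → 0.6
```

A kappa near 0 means the raters agree no more than chance would predict even when their raw pass rates look similar, which is why a blunt pass/fail scale with differing rater thresholds can produce the divergent 2014 numbers.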
Part Two
Used results from 2014 to focus network training on:
• Motivational interviewing
• Integration of health behavior change and medical comorbidities
• SOAP note writing
• Use of standardized assessment tools

And to:
• Revise the BHC rating tool to include a rater rubric
• Train "expert" raters and establish inter-rater reliability on the BHC rating tool
Improved BHC Rating Scale and Rater Training
2015 SP Lab Experience
• 19 BHCs observed at the PCOM lab over two days
• 3 actors: a PCP and 2 patients (1 male, 1 female)
• 2 trained raters
• BHCs provided the rating scale ahead of time
• Oriented to flow
• Simulations were observed and recorded
Results of the 2015 Simulation
BHC Rating Tool
• Ratings were consistent between the expert raters, indicating the tool has overall functional value.
• Time spent training raters is likely critical to a successful outcome.
Results of the 2015 Simulation
• Functional analysis that is engaging for the patient and shows evidence of motivational interviewing core concepts (vs. an interview)

Pass Rate (Pass/Fail)    2014     2015
Rater 1                  81.8%    91.6%
Rater 2                  30%      100%
Results of the 2015 Simulation
Working Alliance Inventory (WAI)
Combined Years Analysis (including both years)
• No statistical differences in BHC ratings across items across years
• No statistical differences in SP ratings across items across years
• No statistical differences between BHC and SP ratings across items across years
• No statistical differences in ratings for BHCs who participated in both years
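The "no statistical differences" comparisons above are the kind of question a two-sample test addresses. A minimal sketch of Welch's t-test (which does not assume equal variances across groups), using hypothetical WAI item scores rather than the project's data; the statistic and degrees of freedom would then be compared against a t distribution for a p-value:

```python
# Welch's two-sample t-test statistic, e.g. for comparing WAI item
# scores across the 2014 and 2015 cohorts. Scores here are hypothetical.
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    se2 = va / na + vb / nb                          # squared standard error
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

wai_2014 = [5.2, 5.8, 6.1, 5.5, 6.3, 5.9, 6.0]  # hypothetical item scores
wai_2015 = [5.6, 6.0, 6.2, 5.7, 6.1, 6.2, 5.8]
t, df = welch_t(wai_2014, wai_2015)
print(f"t = {t:.2f}, df = {df:.1f}")
```

With samples this small, "no statistical difference" mostly reflects limited power, which is consistent with the small-sample limitation noted for the 2014 round.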
Results of the 2015 Simulation
BHC Performance Improvement
• Items connected to motivational interviewing rated well overall: BHCs scored "average" or "above average" on items related to defining the problem collaboratively and generating solutions that matched the patient's stage of change
• Improvement seen in the area of integrating the medical diagnosis and medication into the BHC encounter
Summary
• Patient raters: rated 14/19 BHCs average or above; only 1 item discrepant
• BHCs: self-ratings in line with patient and expert raters
• Expert raters: rated 14/19 BHCs average or above; only 3 items discrepant
Results of the 2015 Simulation
Opportunities for future training
• Confidence and collegiality rated "below pass" on average by both expert raters
• BHC/PCP interaction-related questions rated low
• Use of screening tools rated low by all
• SOAP note items were rated differently by the two raters
BHC receives:
• Direct feedback from the simulated patient
• Direct feedback from the expert rater

Network receives:
• Information on where to focus training
• Identification of high and low simulation performers in the BHC network
• Reinforcement of network-wide standards of core competencies/BHC behavior (consistency of service across a diversity of sites)
Value of the Simulation Experience
Rater Experience
• Importance of the use of behavioral anchors in the rating tool
  – Clear expectations for BHCs
  – Objective guide for raters
• Rating is helpful for the rater
  – Exposes the rater to a broad range of styles
  – Discussion of anchors solidifies expectations
  – Repeated rating of others reinforces norms
  – Could be valuable for BHCs to rate peers
• Performance of the SPs needs to be monitored
BHC Experience
• Expectations prior to the simulated patient exercise
• The reality of the simulation and the BHC experience of doing this exercise
• The feedback and reflection process
• The role of subsequent Health Federation training in the following year
• How did the collective experience inform the approach in the second simulated exercise and our ongoing work?
Post-Simulation BHC Survey
• Q1: I received an orientation to the experience and the workflow that was well organized and logical
  – 56% Strongly Agree; 44% Agree
• Q2: The standardized patient interview provides a useful structure in which to practice clinical skills, develop crucial communication abilities, and demonstrate achieved clinical competence
  – 44% Strongly Agree; 50% Agree; 1% Neutral
• Q3: I anticipate that the feedback I received from the standardized patient will shape my clinical practice going forward
  – 38% Strongly Agree; 50% Agree; 11% Neutral
• Q4: The rater feedback I received accurately targeted clinical skills that I would like to improve
  – 22% Strongly Agree; 66% Agree; 11% Neutral
• Q5: Overall, I am glad that I had the opportunity to test out my skills using a standardized patient
  – 50% Strongly Agree; 44% Agree; 5% Neutral
Next Steps
• BHC Rating Tool and WAI are potentially useful supervision/shadowing tools
• Ways to replicate the SP experience without a lab or sufficient funds:
  – Videotape
  – Shadowing
  – Role playing using patient/BHC/rater roles
Discussion and Questions