
Assessment in practice

Electronic management of practice assessment data
Brenda Stutsky, Faculty of Medicine, University of Manitoba, Winnipeg, Canada

SUMMARY
Background: The assessment of a practising physician's performance may be conducted for various reasons, including licensure. In response to a request from the College of Physicians and Surgeons of Manitoba (CPSM), the Division of Continuing Professional Development in the Faculty of Medicine, University of Manitoba, has established a practice-based assessment programme, the Manitoba Practice Assessment Program (MPAP), as the College needed a method to evaluate the competence and performance of physicians on the conditional register.

Context: Using a multifaceted approach and CanMEDS as a guiding framework, a variety of practice-based assessment surveys and tools were developed and piloted. Because of the challenge of collating data, the MPAP team needed a computerised solution to manage the data and the assessment process.

Innovation: Over a 2-year period, a customised web-based forms and information management system was designed, developed, tested and implemented. The secure and robust system allows the MPAP team to create assessment surveys and tools in which each item is mapped to Canadian Medical Education Directives for Specialists (CanMEDS) roles and competencies. Reports can be auto-generated, summarising a physician's performance on specific competencies and roles. Overall, the system allows the MPAP team to effectively manage all aspects of the assessment programme.

Implications: Throughout all stages, from design to implementation, a variety of lessons were learned that can be shared with those considering building their own customised web-based system. The key to success is active involvement in all stages of the process!


INTRODUCTION

The assessment of a practising physician's performance may be conducted for a variety of reasons, including identifying suboptimal practices, as a continuing professional development activity or for validating performance for a stakeholder.1 The College of Physicians and Surgeons of Manitoba (CPSM) required a method to validate the competence and performance of physicians on its conditional register, and enlisted the Faculty of Medicine at the University of Manitoba to develop a high-stakes mandatory practice-based assessment programme.

CONTEXT

In 2010, work began on an assessment programme designed for physicians who have completed postgraduate training, have practised for at least 2 years and, for a variety of reasons, have not achieved Royal College of Physicians and Surgeons of Canada (RCPSC) or The College of Family Physicians of Canada (CFPC) certification. The outcome of the assessment enables the CPSM to determine whether to grant these physicians full registration or not to allow them to continue to practise in the province. Based on best practices, it was determined that a multifaceted approach would be needed to assess all components of a physician's practice, using the Canadian Medical Education Directives for Specialists (CanMEDS) roles and competencies as the guiding framework.1-6 The established programme, called the Manitoba Practice Assessment Program (MPAP), was piloted in 2011 and involves five main components: (1) self-assessment; (2) multisource feedback or 360° feedback; (3) chart audit/chart-stimulated recall; (4) interviews; and (5) direct observation. A team of two physicians and one non-physician health care provider conducts the on-site assessment, which includes the last three components.

The self-assessment component involves the completion of two surveys. First, physician candidates provide their educational and practice history, and details of their current scope of practice. Next, via an 87-item reflective practice survey, physicians self-assess their performance on a five-point Likert scale (5, among the best; 4, top half; 3, average; 2, bottom half; 1, among the worst; 'unable to assess' is not scored) and write a reflective note for each of the CanMEDS roles. Specialists also complete a clinical skills checklist.

Multisource feedback is obtained from physician colleagues, interprofessional colleagues and patients. Colleagues rate the physician candidate on the same five-point scale used by candidates in the self-assessment (i.e. from 'among the best' to 'among the worst'; Figure 1), whereas patients use a strongly agree to strongly disagree scale. Multisource feedback surveys range in length from 36 to 57 items, and every question on these and other surveys is mapped to one or more CanMEDS competencies.

Data collected during the chart audit and chart-stimulated recall session are recorded using a satisfactory/unsatisfactory scale. The main evaluation criteria include legibility, record keeping, clinical assessment, diagnosis, investigation, referral, treatment and management, and follow-up.

Interviews are conducted with one or more medical colleagues, usually a supervisor, and medical students and residents, if applicable. Interview guides have been created, and responses are recorded on the guides.

During the on-site assessment, flow sheets are used to record patient interactions, treatments and procedures, and diagnostic imaging interpretations. Flow sheets range in length from 11 to 25 items and use a satisfactory/unsatisfactory scale. An additional 36-item final report tool is used to record overall performance on each of the competencies, taking into consideration assessment data from all tools.

Based on the pilot test, we identified that the collation of data to produce various reports based on the CanMEDS roles and competencies was a significant challenge, given the number and length of the surveys and tools. The solution was to computerise the entire process, convert all tools into web-based forms and auto-generate the reports.

INNOVATION

A variety of regulatory authorities and organisations offer physicians access to online systems designed to track continuing professional development activities, or for recertification or revalidation purposes;7-10 however, the purpose and functionalities of those systems differ from what was required. In reviewing the literature and completing an environmental scan, we did not find an appropriate system or an assessment team that was electronically managing high-stakes data that included mapping to competencies. Therefore, we hired a local computer programming company to work with us to build a customised software system to electronically manage our assessment data.

Our forms and information management system is separated into five main sections: (1) form management; (2) user management; (3) document management; (4) reporting; and (5) system settings. In terms of form management, we are able to create custom forms by manipulating the layout of the forms and incorporating a variety of response types, such as check boxes, radio buttons, drop-down lists or text boxes, to name a few (Figure 2). A key component of the forms, which is unique to this system, is the ability to map each item to one or more of the seven CanMEDS roles and the 154 RCPSC or 189 CFPC competencies (see Figure 3).
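The article does not describe the system's internal data model, but the item-to-competency mapping it describes could be sketched roughly as below; the class names, field names and competency codes are illustrative assumptions, not the actual MPAP schema.

```python
# Illustrative sketch only: the classes, field names and competency codes are
# hypothetical, not the actual MPAP database schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class FormItem:
    text: str                     # the survey question shown to the respondent
    response_type: str            # e.g. 'radio', 'checkbox', 'dropdown', 'textbox'
    roles: List[str] = field(default_factory=list)         # mapped CanMEDS roles
    competencies: List[str] = field(default_factory=list)  # mapped competency codes

@dataclass
class AssessmentForm:
    title: str
    items: List[FormItem] = field(default_factory=list)

# Example: one item on an interprofessional colleague survey, mapped to two
# roles and two placeholder competency codes (not real RCPSC numbering).
survey = AssessmentForm(title="Interprofessional colleague survey")
survey.items.append(FormItem(
    text="Communicates effectively with other members of the health care team",
    response_type="radio",
    roles=["Communicator", "Collaborator"],
    competencies=["COM-1.1", "COL-2.3"],
))
```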

User categories include administrators, assessors, candidates, colleagues and patients. Administrators have full access to all sections of the system, including the ability to log in as any other user. In preparation for an assessment, assessors can access self-assessment data entered directly into the system by the candidate, eliminating the need to copy and courier documents to assessors. For the on-site assessment, assessors are provided with laptops and mobile Internet sticks so that they can input assessment data. Assessors are also provided with hardcopy forms as a backup and, depending on the situation and location, may record observations on the hardcopy forms and later input the data into the system. Physician and interprofessional colleagues complete the 360° surveys online; however, patients are given hardcopy surveys that are returned to the MPAP office for data input.

The document management section of the system allows PDF documents to be uploaded, and access to each document can be assigned to one or more groups or to individual users.

A variety of reports can easily be generated from the system, including a multisource feedback report (see Figure 4), a breakdown of the chart audit, a summary of the on-site assessment, and a comprehensive report that lists each competency together with an average score from the tools measuring that competency (see Figure 5). General MPAP reports include the monitoring of candidate flow, progress at a glance and details of the assessors.
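As a rough illustration of how such a comprehensive report could aggregate scores, the sketch below averages item scores per competency while excluding 'unable to assess' responses, in line with the scoring legend in Figure 4; the data layout and function names are assumptions for illustration, not the MPAP implementation.

```python
# Minimal sketch of per-competency averaging for a comprehensive report.
# The 'X' (unable to assess) convention follows the article's scoring legend;
# the data layout and function names are illustrative assumptions.
from collections import defaultdict
from statistics import mean

# Each response: (competency_code, value), where value is 1-5 or 'X'.
responses = [
    ("COM-1.1", 4), ("COM-1.1", 5), ("COM-1.1", "X"),  # 'X' is excluded from scoring
    ("COL-2.3", 3), ("COL-2.3", 4),
]

def competency_averages(responses):
    """Average the scored responses for each competency, skipping 'X'."""
    scores = defaultdict(list)
    for competency, value in responses:
        if value == "X":
            continue
        scores[competency].append(value)
    return {c: round(mean(v), 1) for c, v in scores.items()}

print(competency_averages(responses))  # {'COM-1.1': 4.5, 'COL-2.3': 3.5}
```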

Another unique feature is that, through the system settings, any standards or competencies (not only CanMEDS) can be entered and mapped to items without any additional programming. E-mails can also be set to be sent automatically at each stage of the process: for example, when a physician candidate completes all self-assessment forms, an e-mail is automatically sent to an MPAP coordinator.
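The triggering mechanism behind these notifications is not described in the article; a minimal sketch of one way to wire stage-completion e-mails might look like the following, where the stage names, recipient address and send function are all hypothetical.

```python
# Hypothetical sketch of stage-completion notifications; the article does not
# describe the MPAP system's trigger mechanism, so the stage names, recipient
# address and send_email callable are all assumptions.
NOTIFICATIONS = {
    "self_assessment_complete": "mpap-coordinator@example.org",
    "onsite_assessment_complete": "mpap-coordinator@example.org",
}

def on_stage_complete(stage, candidate_id, send_email):
    """Look up the configured recipient for a stage and send a notification."""
    recipient = NOTIFICATIONS.get(stage)
    if recipient:
        send_email(
            to=recipient,
            subject=f"MPAP stage '{stage}' completed",
            body=f"Candidate {candidate_id} has completed the '{stage}' stage.",
        )

# Example usage with a stand-in send function that simply prints the message.
on_stage_complete(
    "self_assessment_complete",
    "C-0042",
    send_email=lambda to, subject, body: print(to, subject, body),
)
```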

Figure 1. Example of interprofessional colleague survey

Overall, the forms and information management system has proven to be a robust platform that allows us to effectively manage all aspects of our MPAP. Our end-users verbally report satisfaction with its general ease of use, and we rarely receive requests for assistance. As discovered during our pilot, it was too difficult to generate reports manually given the volume of information and number of competencies; however, we can now generate multiple reports with just a couple of clicks. The system has worked so well that it has been duplicated and will be used by another assessment-type programme in our faculty.

IMPLICATIONS

Throughout our 2-year design, development, testing and implementation period, we learned many lessons that can be shared with those who are considering building their own customised web-based system.

• Envision what you want the end product to look like, even if you do not build all options at one time. It is much easier for computer programmers to design the system as a total package than to add functionalities later.

• Select a reputable company, get references and ensure that you can trust the company to deliver the final product within the estimated budget. In our case, the company underestimated the time needed to build the system, but still honoured the original quote of $60,000.00 CAN.


Figure 3. View of competency selection drop-downs for each form field

Figure 2. View of form-building page


• Ensure all ownership, licensing, copyright and intellectual property issues are addressed prior to development.

• Work out hosting details with your computer programmers, as you may host the system in one place during development and then switch to another server at a later time. Clearly outline the security and privacy mechanisms, along with the data backup plan, which may include backup in another province, state or country.

• Have your entire process clearly outlined for the computer programmers. In our case, we had a binder of all forms and process steps that had been piloted. A bad process will not improve just because it becomes electronic.

• Establish clear timelines for each phase of the project and meet with the computer programmers regularly. Include a demonstration of work to date, so that feedback and direction can be provided throughout the development process. Expect development to take approximately 1 year.

• Ensure proper beta testing before going live. With any customised system, expect the debugging process to take up to 1 year.

• Despite knowing the computer literacy level of your end-users, plan on developing step-by-step user guides with screen captures and clear instructions for all end-users.

Figure 4. Example of multisource feedback report

Multisource feedback report (note: higher scores reflect more positive ratings). 360° survey scores and self-assessment scores by CanMEDS role; all columns range 1-5.

CanMEDS role       PC     IPC    PT     PT-E   SELF
Medical Expert     3.9    4.2    4.8    3.9    3.8
Communicator       3.8    3.9    4.6    3.8    3.8
Collaborator       4.0    4.2    4.7    3.7    4.3
Manager            3.4    4.0    4.1    4.4    3.3
Health Advocate    3.6    4.1    4.4    3.7    4.1
Scholar            3.5    3.9    4.3    2.4    3.3
Professional       4.1    4.1    4.9    4.1    3.7
Averages           3.8    4.1    4.5    3.7    3.8
Respondents        7      9      43     43     1

Column key: PC = physician colleagues (scale 1); IPC = interprofessional colleagues (scale 1); PT = patients (scale 2); PT-E = patients, empowerment items (scale 3); SELF = candidate self-assessment scores (scale 4).

Scoring legend:
Scale 1: X = Unable to Assess, 1 = Among the Worst, 2 = Bottom Half, 3 = Average, 4 = Top Half, 5 = Among the Best
Scale 2: 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree Nor Disagree, 4 = Agree, 5 = Strongly Agree
Scale 3: 1 = Almost Never, 2 = Once in a While, 3 = Fairly Often, 4 = Very Frequently, 5 = Almost Always
Scale 4: 1 = Among the Worst, 2 = Bottom Half, 3 = Average, 4 = Top Half, 5 = Among the Best

Figure 5. Example of full competency report


• User support is a continuing requirement for both the administrative and end-user sides of the system, so plan and budget for ongoing support and potential changes to the system.

CONCLUSION

There is a need for the effective electronic management of evaluation data of all types, whether applicable to assessment in undergraduate, postgraduate or continuing education. Whatever system is used, users need to be confident that it has been built with the end-user in mind to ensure optimal usability. Continuous input and testing throughout all phases of design, development, testing and implementation is key to producing a system that will meet the needs of all end-users and stakeholders.

REFERENCES

1. Scott A, Phelps G, Brand C. Assessing individual clinical performance: a primer for physicians. Intern Med J 2011;41:144-155.

2. Fromme HB, Karani R, Downing SM. Direct observation in medical education: review of the literature and evidence for validity. Mt Sinai J Med 2009;76:365-371.

3. Hauer KE, Ciccone A, Henzel TR, Katsufrakis P, Miller SH, Norcross WA, Papadakis MA, Irby DM. Remediation of the deficiencies of physicians across the continuum from medical school to practice: a thematic review of the literature. Acad Med 2009;84:1822-1832.

4. Khalifa KA, Ansari AA, Violato C, Donnon T. Multisource feedback to assess surgical practice: a systematic review. J Surg Educ 2013;70:475-486.

5. Frank JR, ed. The CanMEDS 2005 physician competency framework. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2005.

6. Stutsky BJ, Singer M, Renaud R. Determining the weighting and relative importance of CanMEDS roles and competencies. BMC Res Notes 2012;5:354.

7. Royal College of Physicians. CPD, education & revalidation. Available at http://www.rcplondon.ac.uk/cpd. Accessed on 6 August 2013.

8. Royal College of Physicians and Surgeons of Canada. Mainport. Available at http://www.royalcollege.ca/portal/page/portal/rc/members/moc/about_mainport. Accessed on 6 August 2013.

9. The College of Family Physicians of Canada. Continuing professional development (CPD). Available at http://www.cfpc.ca/CPD. Accessed on 6 August 2013.

10. The Foundation Programme. E-portfolio. Available at http://www.foundationprogramme.nhs.uk/pages/home/e-portfolio. Accessed on 6 August 2013.

Corresponding author's contact details: Brenda Stutsky, Faculty of Medicine, University of Manitoba, 260 Brodie Centre, 727 McDermot Avenue, Winnipeg, Manitoba, Canada, R3E 3P5. E-mail: [email protected]

Funding: The Division of Continuing Professional Development, Faculty of Medicine, University of Manitoba, provided the required funding for the forms and information management system referred to in the article.

Conflict of interest: The author has no competing interests to declare.

Ethical approval: An innovation is described in the article and ethical approval was not required.

doi: 10.1111/tct.12159
