Assessment in practice

Electronic management of practice assessment data
Brenda Stutsky, Faculty of Medicine, University of Manitoba, Winnipeg, Canada

SUMMARY
Background: The assessment of a practising physician’s performance may be conducted for various reasons, including licensure. In response to a request from the College of Physicians and Surgeons of Manitoba (CPSM), the Division of Continuing Professional Development in the Faculty of Medicine, University of Manitoba, has established a practice-based assessment programme – the Manitoba Practice Assessment Program (MPAP) – as the College needed a method to evaluate the competence and performance of physicians on the conditional register.
Context: Using a multifaceted approach and CanMEDS as a guiding framework, a variety of practice-based assessment surveys and tools were developed and piloted. Because of the challenge of collating data, the MPAP team needed a computerised solution to manage the data and the assessment process.
Innovation: Over a 2-year period, a customised web-based forms and information management system was designed, developed, tested and implemented. The secure and robust system allows the MPAP team to create assessment surveys and tools in which each item is mapped to Canadian Medical Education Directives for Specialists (CanMEDS) roles and competencies. Reports can be auto-generated, summarising a physician’s performance on specific competencies and roles. Overall, the system allows the MPAP team to effectively manage all aspects of the assessment programme.
Implications: Throughout all stages, from design to implementation, a variety of lessons were learned that can be shared with those considering building their own customised web-based system. The key to success is active involvement in all stages of the process!




INTRODUCTION

The assessment of a practising physician’s performance may be conducted for a variety of reasons, including identifying suboptimal practices, as a continuing professional development activity or for validating performance for a stakeholder.1 The College of Physicians and Surgeons of Manitoba (CPSM) required a method to validate the competence and performance of physicians on its conditional register, and enlisted the Faculty of Medicine at the University of Manitoba to develop a high-stakes mandatory practice-based assessment programme.

CONTEXT

In 2010, work began on an assessment programme designed for physicians who have completed postgraduate training, have practised for at least 2 years and, for a variety of reasons, have not achieved Royal College of Physicians and Surgeons of Canada (RCPSC) or The College of Family Physicians of Canada (CFPC) certification. The outcome of the assessment enables the CPSM to determine whether to grant these physicians full registration or to not allow them to continue to practise in the province. Based on best practices, it was determined that a multifaceted approach would be needed to assess all components of a physician’s practice, using the Canadian Medical Education Directives for Specialists (CanMEDS) roles and competencies as the guiding framework.1–6 The established programme, called the Manitoba Practice Assessment Program (MPAP), was piloted in 2011 and involves five main components: (1) self-assessment; (2) multisource feedback or 360° feedback; (3) chart audit/chart-stimulated recall; (4) interviews; and (5) direct observation. A team of two physicians and one non-physician health care provider conducts the on-site assessment, which includes the last three components.

The self-assessment component involves the completion of two surveys. First, physician candidates provide their educational and practice history, and details of their current scope of practice. Next, via an 87-item reflective practice survey, physicians self-assess their performance on a five-point Likert scale (i.e. 5, among the best; 4, top half; 3, average; 2, bottom half; 1, among the worst; ‘unable to assess’ is not scored) and write a reflective note for each of the CanMEDS roles. Specialists also complete a clinical skills checklist.

Multisource feedback is obtained from physician colleagues, interprofessional colleagues and patients. Colleagues rate the physician candidate on the same five-point scale used by candidates in the self-assessment (i.e. from ‘among the best’ to ‘among the worst’; Figure 1), whereas patients use a scale from strongly agree to strongly disagree. Multisource feedback surveys range in length from 36 to 57 items, and every question on these and the other surveys is mapped to one or more CanMEDS competencies.

Data collected during the chart audit and chart-stimulated recall session are recorded using a satisfactory/unsatisfactory scale. The main evaluation criteria include legibility, record keeping, clinical assessment, diagnosis, investigation, referral, treatment and management, and follow-up. Interviews are conducted with one or more medical colleagues, usually a supervisor, and with medical students and residents, if applicable. Interview guides have been created, and responses are recorded on the guides. During the on-site assessment, flow sheets are used to record patient interactions, treatments and procedures, and diagnostic imaging interpretations. Flow sheets range in length from 11 to 25 items and use a satisfactory/unsatisfactory scale. An additional 36-item final report tool is used to record overall performance on each of the competencies, taking into consideration assessment data from all tools.
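The scoring rule described above can be illustrated with a short sketch: the five verbal anchors map to the values 1–5, and ‘unable to assess’ is simply excluded from any averaging. This is a hypothetical illustration written for this summary, not the MPAP system’s actual code; the function and variable names are assumptions.

```python
# Hypothetical sketch of the five-point scoring rule described in the text:
# 5 = among the best ... 1 = among the worst; "unable to assess" is not scored.
from statistics import mean
from typing import Optional

LIKERT_SCORES = {
    "among the best": 5,
    "top half": 4,
    "average": 3,
    "bottom half": 2,
    "among the worst": 1,
}

def score_response(label: str) -> Optional[int]:
    """Return the numeric score for a response, or None if it is unscored."""
    if label.lower() == "unable to assess":
        return None  # excluded from all averages
    return LIKERT_SCORES[label.lower()]

def item_average(responses: list[str]) -> Optional[float]:
    """Average the scored responses for one item, ignoring 'unable to assess'."""
    scored = [s for s in (score_response(r) for r in responses) if s is not None]
    return round(mean(scored), 1) if scored else None

# Example: three colleague ratings for one item; the unscored response is dropped.
print(item_average(["Top Half", "Among the Best", "Unable to Assess"]))  # -> 4.5
```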


Based on the pilot test, we identified that the collation of data to produce various reports based on the CanMEDS roles and competencies was a significant challenge, given the number and length of the surveys and tools. The solution was to computerise the entire process, convert all tools into web-based forms and auto-generate the reports.

INNOVATION

A variety of regulatory authorities and organisations offer physicians access to online systems designed to track continuing professional development activities, or for recertification or revalidation purposes7–10; however, the purpose and functionalities of those systems differ from what was required. In reviewing the literature and completing an environmental scan, we did not find an appropriate system, or an assessment team that was electronically managing high-stakes data that included mapping to competencies. Therefore, we hired a local computer programming company to work with us to build a customised software system to electronically manage our assessment data.

Our forms and information management system is separated into five main sections: (1) form management; (2) user management; (3) document management; (4) reporting; and (5) system settings. In terms of form management, we are able to create custom forms by manipulating the layout of the forms and incorporating a variety of response types, such as check boxes, radio buttons, drop-down lists or text boxes, to name a few (Figure 2). A key component of the forms, which is unique to this system, is the ability to map each item to one or more of the seven CanMEDS roles and the 154 RCPSC or 189 CFPC competencies (see Figure 3).
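To make the mapping described above concrete, the sketch below models a form item that carries one or more CanMEDS role and competency mappings alongside its response type. It is a minimal illustration of the concept only – the classes, fields and competency identifiers are invented for the example and do not reflect the vendor’s actual schema.

```python
# Minimal sketch of a form item mapped to CanMEDS roles/competencies.
# Class, field and identifier names are assumptions made for illustration.
from dataclasses import dataclass, field
from enum import Enum

class ResponseType(Enum):
    CHECKBOX = "check box"
    RADIO = "radio button"
    DROPDOWN = "drop-down list"
    TEXTBOX = "text box"

@dataclass
class CompetencyMapping:
    role: str            # one of the seven CanMEDS roles, e.g. "Communicator"
    competency_id: str   # identifier within the RCPSC (154) or CFPC (189) competency set

@dataclass
class FormItem:
    text: str
    response_type: ResponseType
    mappings: list[CompetencyMapping] = field(default_factory=list)

# Example: a colleague-survey item mapped to two (placeholder) competencies.
item = FormItem(
    text="Communicates effectively with patients and families",
    response_type=ResponseType.RADIO,
    mappings=[
        CompetencyMapping(role="Communicator", competency_id="COM-1.1"),
        CompetencyMapping(role="Professional", competency_id="PRO-2.3"),
    ],
)
print(len(item.mappings))  # -> 2
```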

User categories include administrators, assessors, candidates, colleagues and patients. Administrators have full access to all sections of the system, including the ability to log in as any other user. In preparation for an assessment, assessors can access self-assessment data entered directly into the system by the candidate, eliminating the need to copy and courier documents to assessors. For the on-site assessment, assessors are provided with laptops and mobile Internet sticks so that they can input assessment data. Assessors are also provided with hardcopy forms as a backup and, depending on the situation and location, may record observations on the hardcopy forms and later enter the data into the system. Physician and interprofessional colleagues complete the 360° surveys online; however, patients are given hardcopy surveys that are returned to the MPAP office for data input.

Figure 1. Example of interprofessional colleague survey

The document management section of the system allows for the uploading of PDF documents, and document access can be assigned to one or more groups or to individual users. A variety of reports can easily be generated from the system, including a multisource feedback report (see Figure 4), a breakdown of the chart audit, a summary of the on-site assessment, and a comprehensive report that includes each competency and an average score from the tools measuring that competency (see Figure 5). General MPAP reports include the monitoring of candidate flow, progress at-a-glance and details of the assessors. Another unique feature is that, through the system settings, any standards or competencies (not only CanMEDS) can be entered and mapped to items without any additional programming. For each stage of the process, e-mails can be set to be sent automatically: for example, when a physician candidate completes all self-assessment forms, an e-mail is automatically sent to an MPAP coordinator.
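The comprehensive report described above can be pictured as a simple aggregation: collect every scored response that is mapped to a given competency, regardless of which tool it came from, and average the scores. The sketch below is an assumption-laden illustration of that idea, not the system’s actual report generator; the record layout and names are invented for the example.

```python
# Sketch of competency-level averaging across assessment tools (illustrative only).
# Each record is (tool, competency_id, numeric score); unscored responses are
# assumed to have been excluded already.
from collections import defaultdict
from statistics import mean

def competency_report(records: list[tuple[str, str, float]]) -> dict[str, float]:
    """Return the average score per competency across all tools that measured it."""
    by_competency: dict[str, list[float]] = defaultdict(list)
    for _tool, competency_id, score in records:
        by_competency[competency_id].append(score)
    return {cid: round(mean(scores), 1) for cid, scores in by_competency.items()}

# Example: one placeholder competency measured by three tools, another by one.
records = [
    ("self-assessment", "COM-1.1", 4.0),
    ("colleague survey", "COM-1.1", 3.5),
    ("patient survey", "COM-1.1", 4.6),
    ("chart audit", "ME-2.2", 3.0),
]
print(competency_report(records))  # -> {'COM-1.1': 4.0, 'ME-2.2': 3.0}
```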


Figure 2. View of form-building page

Figure 3. View of competency selection drop-downs for each form field

Overall, the forms and information management system has proven to be a robust platform that allows us to effectively manage all aspects of our MPAP. Our end-users verbally report satisfaction with general ease of use, and we rarely receive requests for assistance. As discovered during our pilot, it was too difficult to generate reports manually, given the volume of information and the number of competencies; however, we can now generate multiple reports with just a couple of clicks. The system has worked so well that it has been duplicated and will be used by another type of assessment programme in our faculty.

IMPLICATIONS

Throughout our 2-year design, development, testing and implementation period, we learned many lessons that can be shared with those who are considering building their own customised web-based system.
• Envision what you want the end product to look like, even if you do not build all options at one time. It is much easier for computer programmers to design the system as a total package, as opposed to adding functionalities later.
• Select a reputable company, get references and ensure that you can trust the company to deliver the final product within the estimated budget. In our case, the company underestimated the time needed to build the system, but still honoured the original quote of $60 000.00 CAN.


Multisource Feedback Report – Note: higher scores reflect more positive ratings. 360-degree survey scores by CanMEDS role (all scales range 1–5):

CanMEDS Role | Physician Colleagues¹ | Interprofessional Colleagues¹ | Patients² | Patients Empowerment³ | Self-Assessment⁴
Medical Expert | 3.9 | 4.2 | 4.8 | 3.9 | 3.8
Communicator | 3.8 | 3.9 | 4.6 | 3.8 | 3.8
Collaborator | 4.0 | 4.2 | 4.7 | 3.7 | 4.3
Manager | 3.4 | 4.0 | 4.1 | 4.4 | 3.3
Health Advocate | 3.6 | 4.1 | 4.4 | 3.7 | 4.1
Scholar | 3.5 | 3.9 | 4.3 | 2.4 | 3.3
Professional | 4.1 | 4.1 | 4.9 | 4.1 | 3.7
Averages | 3.8 | 4.1 | 4.5 | 3.7 | 3.8
Respondents | 7 | 9 | 43 | 43 | 1


Scoring Legend:
¹ X = Unable to Assess; 1 = Among the Worst; 2 = Bottom Half; 3 = Average; 4 = Top Half; 5 = Among the Best
² 1 = Strongly Disagree; 2 = Disagree; 3 = Neither Agree Nor Disagree; 4 = Agree; 5 = Strongly Agree
³ 1 = Almost Never; 2 = Once in a While; 3 = Fairly Often; 4 = Very Frequently; 5 = Almost Always
⁴ 1 = Among the Worst; 2 = Bottom Half; 3 = Average; 4 = Top Half; 5 = Among the Best

Figure 4. Example of multisource feedback report

Figure 5. Example of full competency report

• Ensure all ownership, licensing, copyright and intellectual property issues are addressed prior to development.
• Work out hosting details with your computer programmers, as you may host the system in one place during development and then switch to another server at a later time. Clearly outline the security and privacy mechanisms, along with the data backup plan, which may include backup in another province, state or country.
• Have your entire process clearly outlined for the computer programmers. In our case, we had a binder of all forms and process steps that had been piloted. A bad process will not improve just because it becomes electronic.
• Establish clear timelines for each phase of the project and meet with the computer programmers regularly. Include a demonstration of work to date, so that feedback and direction can be provided throughout the development process. Expect development time to take approximately 1 year.
• Ensure proper beta testing before going live. With any customised system, expect the debugging process to take up to 1 year.
• Despite knowing the computer literacy level of your end-users, plan on developing step-by-step user guides with screen captures and clear instructions for all end-users.


• User support is a continuing requirement for both the administrative and end-user sides of the system, so plan and budget for continuing support and potential changes to the system.

CONCLUSION

There is a need for the effective electronic management of evaluation data of all types, whether for assessment in undergraduate, postgraduate or continuing education. Whatever system is used, users need to be confident that it has been built with the end-user in mind to ensure optimal usability. Continuous input and testing throughout all phases of design, development, testing and implementation is key to producing a system that will meet the needs of all end-users and stakeholders.

REFERENCES
1. Scott A, Phelps G, Brand C. Assessing individual clinical performance: a primer for physicians. Intern Med J 2011;41:144–155.
2. Fromme HB, Karani R, Downing SM. Direct observation in medical education: review of the literature and evidence for validity. Mt Sinai J Med 2009;76:371–371.
3. Hauer KE, Ciccone A, Henzel TR, Katsufrakis P, Miller SH, Norcross WA, Papadakis MA, Irby DM. Remediation of the deficiencies of physicians across the continuum from medical school to practice: a thematic review of the literature. Acad Med 2009;84:1822–1832.
4. Khalifa KA, Ansari AA, Violato C, Donnon T. Multisource feedback to assess surgical practice: a systematic review. J Surg Educ 2013;70:475–486.
5. Frank JR, ed. The CanMEDS 2005 physician competency framework. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2005.
6. Stutsky BJ, Singer M, Renaud R. Determining the weighting and relative importance of CanMEDS roles and competencies. BMC Res Notes 2012;5:354.
7. Royal College of Physicians. CPD, education & revalidation. Available at http://www.rcplondon.ac.uk/cpd. Accessed on 6 August 2013.
8. Royal College of Physicians and Surgeons of Canada. Mainport. Available at http://www.royalcollege.ca/portal/page/portal/rc/members/moc/about_mainport. Accessed on 6 August 2013.
9. The College of Family Physicians of Canada. Continuing professional development (CPD). Available at http://www.cfpc.ca/CPD. Accessed on 6 August 2013.
10. The Foundation Programme. E-portfolio. Available at http://www.foundationprogramme.nhs.uk/pages/home/e-portfolio. Accessed on 6 August 2013.

Corresponding author’s contact details: Brenda Stutsky, Faculty of Medicine, University of Manitoba, 260 Brodie Centre, 727 McDermot Avenue, Winnipeg, Manitoba, Canada, R3E 3P5. E-mail: [email protected]

Funding: The Division of Continuing Professional Development, Faculty of Medicine, University of Manitoba, provided the required funding for the forms and information management system referred to in the article.

Conflict of interest: The author has no competing interests to declare.

Ethical approval: An innovation is described in the article and ethical approval was not required.

doi: 10.1111/tct.12159

