Insights

Teaching the breast examination using critiques

Kulamakan Kulasegaram 1,4, Nicole Woods 2,4, Kerry Knickle 3, Frances Wright 2 and Tulin Cil 2,4

1 Family and Community Medicine, University of Toronto, Toronto, Ontario, Canada
2 Department of Surgery, University of Toronto, Toronto, Ontario, Canada
3 The Standardized Patient Program, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada
4 The Wilson Centre, Toronto, Ontario, Canada


The intimate nature of sensitive physical examinations makes training for undergraduate students difficult. These examinations can induce anxiety in some undergraduate students.1 Self-reports of graduating medical students suggest that their training on the clinical breast examination (CBE) has been inadequate.2 More than 80 per cent felt that they required further training upon completion of undergraduate studies,2 and other studies have found that CBE skills deteriorated over time during undergraduate training.3

In this pilot study, we sought to address this issue by developing a novel learning tool that required students to engage in critical analysis of scripted clinical encounters, followed by standardised expert feedback. Previous research suggests that observational practice – learning by viewing performance – can be improved by critiquing or giving feedback on the performance.4 This article reports the development and piloting of an instructional tool – the Encounter Critique Module (ECM) – and an assessment process that can familiarise students with CBE best practices. The use of critical analysis is derived from the literature on observational practice, which suggests that viewing specific skills being performed can help novices to learn.

INNOVATION

Development of the ECM

A team of two experienced surgeons identified the necessary technical and non-technical components of the CBE using local experts and the clinical literature.5 This review led to a comprehensive list of best practices for the CBE and the components of the 'ideal' examination. Three expert clinicians in breast oncology independently verified this list of examination components (see Table 1). The list, combined with clinical expert experience, was developed into three scripts depicting the range of typical clinical encounters, labelled 'standard', 'exemplary' and 'poor'. The encounters included clinician–patient interaction during the performance of a CBE, simulating the communication and technical skill components of the examination. The encounters were filmed to create three 5–10-minute videos depicting a clinician examining three different patients according to the standard, exemplary and poor scripts. These videos were then edited for the purposes of the ECM.

The ECM was designed in two consecutive single-session phases. In phase 1, learners would view the standard encounter, followed by either the exemplary or the poor encounter in random order. For each encounter, learners had to identify all of the positive and negative features of the clinician's performance. Learners could pause and rewind the videos at their own pace. In phase 2, learners would view the same encounters, but with the videos edited to pause and visually identify the features scripted by experts to depict positive or negative clinical performance. Learners would then compare these features with their own lists from phase 1.

Table 1. List of clinical breast exam (CBE) components

Preparing the patient for the CBE
   Introduction to the patient
   Permission-seeking
   Explaining the process
   Proper draping

Exam
   Sitting and supine positioning
   Systematic palpation of the breast and nipple
   Palpation of regional lymphatics

Closing
   Re-draping
   Planning follow-up
   Opportunity for questions

Interaction
   Consistent eye contact
   Empathetic communication
   Assessment of patient needs

Assessment

An assessment tool was developed so that teachers could provide personalised feedback for future instruction. Each critique from phase 1 was evaluated on a seven-point scale for completeness, with a higher score indicating a more complete critique. Although the assessors were given the list of scripted features, they were also told to use their own judgement in assessing the critiques because of the variability in how students labelled the features. A global rating scale was therefore felt to be appropriate for the scoring.

Piloting

Thirty undergraduate medical students at the University of Toronto were recruited via e-mail for the pilot: 12 were in pre-clinical training (years 1 or 2) and 18 were in clinical training (years 3 or 4). The participants were given a brief questionnaire asking about the number of times they had observed or performed eight different clinical examinations, including the CBE, to compare experience and opportunities for practice. Participants were also asked to rate, on a five-point scale, their satisfaction with the current opportunities to practise the CBE and the other examinations. Three independent raters were recruited to evaluate the critiques.

156 © 2015 John Wiley & Sons Ltd. THE CLINICAL TEACHER 2016; 13: 156–158

EVALUATION

Both pre-clinical and clinical students reported little experience with sensitive examinations like the CBE, compared with other examinations such as the cardiovascular examination. The average number of cardiovascular examinations exceeded 30, whereas the median number of CBEs was three for clinical students (see Table 2). Participants in clinical years expressed dissatisfaction with CBE practice, with an average score of 2.2 on the satisfaction scale. The majority of participants took 40–60 minutes to complete both phases. The raters assessing the phase-1 critiques had acceptable inter-rater reliability (intraclass correlation, ICC, 0.73). There were no differences between pre-clinical and clinical learners in the completeness of their critiques, but participants' performance in identifying the key features improved between the standard encounter and the poor or exemplary encounters [F(2,54) = 13.9, p < 0.001; Cohen's d = 0.78]. Feedback from participants suggested that they found the opportunity to critique and receive corrective feedback valuable.


Table 2. Number of physical examinations performed, by type

Examination        Group           Median    SD
Breast             Pre-clinical    0         0.67
                   Clinical        3         2.45
Pelvic             Pre-clinical    0         0.30
                   Clinical        3         9.75
Rectal             Pre-clinical    0         0.69
                   Clinical        3         5.32
Respiratory        Pre-clinical    6         7.93
                   Clinical        20        40.63
Cardiovascular     Pre-clinical    4         6.79
                   Clinical        20        49.39
Abdominal          Pre-clinical    6         7.26
                   Clinical        20        36.88
Musculoskeletal    Pre-clinical    2         3.28
                   Clinical        12.5      44.49
Neurological       Pre-clinical    2         4.54
                   Clinical        10        11.18

IMPLICATIONS

The purpose of this pilot study was to examine a new tool for familiarising students with the CBE through the critiquing of simulated clinical encounters. Our questionnaire showed that even students in clinical training had minimal practice of, and satisfaction with, the CBE. These results suggest that learning the CBE is a challenge for undergraduate students. The ECM can be one tool for addressing this issue. Our pilot showed the tool to be feasible and efficient, and to have acceptable inter-rater reliability. Participants also showed some evidence of learning, as the quality of their critiques improved across encounters. Although this work is promising, there are limitations, including the question of transfer to CBE performance and the validation of the assessment against other evaluations. The synergy with other teaching methods also needs exploration. Future directions include longitudinal studies to assess transfer to clinical performance and integration into the curriculum.

REFERENCES

1. Sarikaya O, Civaner M, Kalaca S. The anxieties of medical students related to clinical training. Int J Clin Pract 2006;60:1414–1418.

2. Kann PE, Lane DS. Breast cancer screening knowledge and skills of students upon entering and exiting a medical school. Acad Med 1998;73:904–906.

3. Lee K, Dunlop D, Dolan NC. Do clinical breast examination skills improve during medical school? Acad Med 1998;73:1013–1019.

4. Grierson LEM, Barry M, Kapralos B, Carnahan H, Dubrowski A. The role of collaborative interactivity in the observational practice of clinical skills. Med Educ 2012;46(4):409–416.

5. Chalabian J, Formenti S, Russell C, Pearce J, Dunnington G. Comprehensive needs assessment of clinical breast evaluation skills of primary care residents. Ann Surg Oncol 1998;5:166–172.

Corresponding author’s contact details: Dr Tulin Cil, Division of General Surgery, University Health Network & Women’s College Hospital, 3–130 Dept of Surgical Oncology, 610 University Avenue, Toronto, Ontario, M5G 2M9, Canada. E-mail: [email protected]

Funding: Education Development Fund Grants provided by the Faculty of Medicine, the Undergraduate Medical Program and the Department of Surgery at the University of Toronto. The study also used resources and personnel from the Standardized Patient Program and the Wilson Centre at the University of Toronto.

Conflict of interest: None.

Acknowledgements: We wish to thank Cheryl Ku for her assistance in recruitment, and the Undergraduate Medical Education program as well as the Standardized Patient Program at the Faculty of Medicine, University of Toronto.

Ethical approval: Ethical approval was given by the University of Toronto Health Sciences Research Ethics Board (protocol reference #23691).

doi: 10.1111/tct.12368

