really good stuff


Recognising and rewarding clinical educator scholarship
Bruce Fisher & Liam Rourke

What problems were addressed? Ernest Boyer1 reminded us that scholarship is composed of four domains – teaching, integration, application and discovery – and urged that faculty members be equitably recognised for all four. Faculties of Medicine agree with this ideal, but have struggled with its implementation. Clinical educators continue to occupy lower ranks than those in research paths and are promoted more slowly. Faculty members may lack means of reporting their non-research contributions in ways that communicate the scholarship of their work, and department heads may lack the resources to evaluate these contributions. We posited that user-friendly resources and training might have a favourable impact on the recognition of clinical educators' scholarly contributions.

What was tried? We developed a web-based annual report and faculty development support system to promote more complete and standardised reporting for all domains of scholarly activity, with a focus on clinical education contributions. Development was informed by published effective practices for reporting scholarship and by wide consultation with stakeholders in the faculty. Major revisions included controlled text entry via drop-down pick-lists, which promoted reporting of contributions in a shared language and elicited complete, explicit descriptions of scholarship components (including the nature, quantity and quality of contributions, the scope of dissemination, and evidence of peer review). We also provided workshops for faculty staff that addressed the rationale for the report, explained how it could improve the quality, reliability and equity of the reporting and assessment of scholarship, and demonstrated how to use the form. In addition, we provided online manuals and a help line. A demonstration view of the report is available upon request.


What lessons were learned? The report is used in our annual faculty merit increment award process, in which a merit increment of 1 represents 'average' performance and greater values represent 'superior' performance. To gauge the report's impact, we reviewed merit increment award data for the 3 years before and 3 years after its implementation. We focused on full-time clinical faculty staff at the rank of assistant professor. 'Clinical' was applied to staff who spend at least 15% of their time in clinical activities. We further classified faculty staff as clinical educators or clinical researchers by requiring that the pertinent domain of contribution represent at least 30% of their job description. Total numbers of 'superior' merit increments were pooled for each group over the 3 years prior to and after the intervention. Changes in the frequency of recognition of 'superior' merit were assessed by comparing the percentage of each group to which it applied before and after the intervention.

A total of 1016 merit decisions for assistant professors were analysed. In the 3 years prior to the implementation of the new reporting system, the mean annual percentage of 'superior' merit increments was lower in the clinical educator group (17%) than in the clinical researcher group (29%). In the 3 years after implementation, this percentage increased significantly in the clinical educator group (39%; p = 0.003) to approximate that seen in clinical researchers (33%; non-significant increase). We believe that providing reporting tools and faculty development support such as ours can help achieve Boyer's vision of equitable recognition for all domains of scholarship.

REFERENCE
1 Boyer EL. Scholarship Reconsidered: Priorities of the Professoriate. Lawrenceville, NJ: Princeton University Press; 1990.
Correspondence: Bruce Fisher, Faculty of Medicine and Dentistry, University of Alberta, 5-112 Clinical Sciences Building, 11350-83 Avenue, Edmonton, Alberta, Canada T6G 2G3. Tel: 00 1 780 492 5111; E-mail: [email protected] doi: 10.1111/medu.12441

Near-peer facilitation: a win–win simulation
Marilina Antonelou, Sanjay Krishnamoorthy, Gemma Walker & Nick Murch

What problem was addressed? Peer and 'near-peer' learning are rapidly expanding areas of educational research across many disciplines. The development of teaching skills forms an essential part of the foundation programme competencies expected of junior doctors.1 We developed a novel method of ensuring the sustainability of a medical student simulation training programme by cascading the facilitation training of junior doctors from one year cohort to the next, thereby ensuring the continuing professional development of both groups.

What was tried? A medical student simulation programme was initially established by two junior doctors with the support of clinical teaching leads. The undergraduate curriculum was mapped to create simulation scenarios addressing technical and human factor skills. Weekly simulation sessions were held throughout the year in the hospital's simulation centre, using the Laerdal SimMan manikin. After the annual mass rotation of junior doctors the following year, we were challenged to sustain the programme by recruiting new junior doctors as facilitators. Two separate facilitator training courses, each consisting of 2-hour evening sessions, were established, recruiting a total of 35 potential facilitators. These courses focused on debriefing skills and the role of human factors, as well as the more technical skills of operating a classic Laerdal SimMan manikin.

Twenty-eight Year 3 medical students were invited to participate in the simulated management of an acutely unwell patient. Common scenario themes included upper gastrointestinal bleed, sepsis, myocardial infarction and acute asthma. Qualitative and quantitative data were collected using pre- and post-course Likert scale-based feedback questionnaires. Medical students rated the educational value of the simulation experience at a mean of 4.7 out of 5, and mean ratings of perceived student confidence in managing acutely unwell patients increased from 1.8 to 3.5 out of 5 after the course.
In free-text comments, students highlighted the use of the SBAR (situation, background, assessment, recommendation) protocol for effective handover, the ABCDE (airway, breathing, circulation, disability, exposure) approach for systematic patient assessment, and the importance of prompt escalation and prioritisation of tasks. Facilitators' feedback indicated that 28 of the 35 certified participants voluntarily facilitated in the undergraduate simulation programme; 86% of facilitators were Foundation Year doctors and 60% facilitated more than two sessions. In free-text comments, the junior doctors reported that being a facilitator altered aspects of their own clinical practice: it enhanced awareness of their own limitations, promoted interprofessional collaboration, and reinforced the use of a systematic approach to assessing an acutely unwell patient. Five of the 28 facilitators further explored their interest in medical education by undertaking research projects in medical simulation.

What lessons were learned? A near-peer simulation training programme for medical students, run by newly qualified doctors, can benefit both parties. For medical students, it can enhance the development of both technical and non-technical skills prior to qualification. For junior doctors, it can contribute to postgraduate professional development, provide opportunities to practise skills that may often be reserved for more senior colleagues, and possibly motivate doctors to become involved in medical education at an earlier stage in their careers.

REFERENCE
1 Qureshi Z, Ross M, Maxwell S, Rodrigues M, Parisinos C, Hall HN. Developing junior doctor-delivered teaching. Clin Teach 2013;10(2):118–23.

Correspondence: Marilina Antonelou, Department of Acute Medicine, Royal Free Hospital, Pond Street, London NW3 2QG, UK. Tel: 00 44 20 7794 0500; E-mail: [email protected] doi: 10.1111/medu.12443

Faculty development of an OSCE in an internal medicine clerkship
Marcelo Cruzeiro & Valdes Bollela

What problems were addressed? At our institution, the undergraduate internship used only a structured global assessment to evaluate clinical competence. The faculty coordinators therefore proposed to implement a cognitive and competence assessment rating system to address current issues, although most faculty staff had no previous experience in performance assessment.

What was tried? Faculty enablement workshops on the evaluation of clinical competence were held. These focused on the development of an objective structured clinical examination (OSCE) to give feedback to Year 5 undergraduate medical students. Weekly meetings were held over 7 months in which participants discussed the development of the OSCE, as well as a blueprint and the building of test stations. Our blueprint was based on a competence matrix which showed how

© 2014 John Wiley & Sons Ltd. MEDICAL EDUCATION 2014; 48: 522–548

