commentaries

The science of learning and medical education

William C McGaghie & Piero Marco Fisichella

Correspondence: Dr William C McGaghie, Ralph P Leischner Institute for Medical Education, Loyola University Chicago Stritch School of Medicine, Building 120, Room 316, 2160 South First Avenue, Maywood, Illinois 60153, USA. Tel: 00 1 708 216 6078; E-mail: wmcgaghie@luc.edu

doi: 10.1111/medu.12396

The traditional approach to medical curriculum design or revision involves work performed by a committee of medical and basic science experts, frequently in consultation with education professionals. Curriculum committee work is grounded in reason, experience and tradition. It may also be governed by rules that specify curriculum planning steps, which include the identification of problems and an assessment of general needs, a targeted needs assessment, the selection of educational strategies (e.g. lectures, problem-based learning, clinical experiences), and evaluation and feedback.1 The structure and content of standardised tests such as the US Medical Licensing Examination (USMLE) Steps 1, 2 and 3 also shape the design of medical curricula because students must pass these examinations to advance in their medical careers.2

The rational, experiential approach to medical curriculum design requires careful and detailed planning of curriculum components, with close attention to their integration and delivery. This curriculum development method has been used by medical schools worldwide for decades to plan educational programmes. It has also been employed by the US Accreditation Council for Graduate Medical Education (ACGME) to define categorical competencies (e.g. patient care, medical knowledge, systems-based practice) that postgraduate residents are expected to achieve3 and to specify measurement methods with which to evaluate these outcomes.4

The advent and rapid growth of simulation technology in medical education present an opportunity to use more rigorous, scientific methods to identify key medical skills and competencies and to engineer educational programmes for their acquisition and maintenance among medical learners. In this issue of Medical Education, Causer and colleagues describe ‘a systematic, evidence-based framework for measuring and analysing superior performance’.5 Their work employs ‘a three-stage framework for capturing and developing expertise’5 grounded in scientific principles. The three stages involve: (i) the capture of expert performance in laboratory or field settings; (ii) the identification of the underlying mechanisms that account for expert performance, and (iii) the examination of how the expertise developed. Effective use of the expert performance approach allows for ‘the systematic identification and improvement of quantifiable performance metrics’ that form the foundation of simulation-based educational interventions, which use deliberate practice with feedback to shape and refine medical clinical competence.5

The expert performance approach is congruent with other recent writing in this journal on applying the science of learning in medical education.6 It also conforms with the spirit and substance of most chapters in The Cambridge Handbook of Expertise and Expert Performance, which provides a comprehensive inventory of methods for studying expertise scientifically, with examples from the learned professions (including medicine), the arts, sports, games and other fields.7 Simulation-based mastery learning (SBML) in medical education featuring deliberate practice involves an array of high- and low-technology educational interventions that aim to shape, refine and maintain learners’ acquisition of cognitive, clinical and professionalism skills to high and uniform achievement standards. The SBML model is well suited to operate with the expert performance approach as its source of learning objectives and performance metrics. As we have stated elsewhere:

…mastery learning has the following seven complementary features: baseline, or diagnostic testing; clear learning objectives, sequenced as units in increasing difficulty; engagement in educational activities (e.g. skills practice, data interpretation, reading) focused on reaching the objectives; a set minimum passing standard (e.g. test score) for each educational unit; formative testing to gauge unit completion at a preset minimum passing standard for mastery; advancement to the next educational unit given measured achievement at or above the mastery standard, and continued practice or study on an educational unit until the mastery standard is reached. The goal in mastery learning is to ensure that all learners accomplish all educational objectives with little or no variation in outcome. The amount of time needed to reach mastery standards for a unit’s educational objectives varies among the learners.8

The SBML model has been used to help medical learners acquire and maintain an array of important clinical skills. In one study involving inexperienced first-year internal medicine residents, SBML increased the young doctors’ skills to high and uniform standards in four key categories: (i) recognition of physical examination findings (cardiac auscultation); (ii) performance of clinical procedures (paracentesis, lumbar puncture); (iii) management of critically ill patients (intensive care unit [ICU] skills), and (iv) communication with patients (leading a code status discussion).9


Internist Jeffrey Barsuk and colleagues have performed a thematic, sustained and cumulative programme of SBML research studies on the acquisition and maintenance of central venous catheter (CVC) insertion skills. This research programme not only shows that CVC insertion skills can be acquired by medical learners to mastery standards in the simulation laboratory,10 but also that the skills transfer to ‘downstream’ outcomes that include a reduction in patient complications in the ICU,11 an 85% reduction in catheter-related bloodstream infections in the ICU,12 the long-term retention of CVC insertion skills,13 and the cost-effectiveness of SBML training, expressed as a 7 : 1 rate of return on financial investment.14


These and other studies show that SBML, informed by the expert performance approach and the science of learning, including deliberate practice, can yield translational results in the form of improved patient care practices, better patient and public health outcomes, and other collateral effects.15–17 Rigorous science, embodied in the expert performance approach, SBML and educational research as translational science, complements the rational, experiential approach to medical curriculum planning and delivery at a more granular level. Future research will no doubt amplify and strengthen the science of learning in medical education.

REFERENCES

1 Kern DE, Thomas PA, Hughes MT, eds. Curriculum Development for Medical Education: A Six-Step Approach, 2nd edn. Baltimore, MD: Johns Hopkins University Press 2009.
2 Frederiksen N. The real test bias: influences of testing on teaching and learning. Am Psychol 1984;39:193–202.
3 Accreditation Council for Graduate Medical Education. Program Director Guide to the Common Program Requirements. http://www.acgme.org. [Accessed 26 September 2013.]
4 Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ 2009;1:278–86.
5 Causer J, Barach P, Williams AM. Expertise in medicine: using the expert performance approach to improve simulation training. Med Educ 2014;48:115–23.
6 Mayer RE. Applying the science of learning to medical education. Med Educ 2010;44:543–9.
7 Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, eds. The Cambridge Handbook of Expertise and Expert Performance. New York, NY: Cambridge University Press 2006.
8 McGaghie WC, Siddall VJ, Mazmanian PE, Myers J. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest 2009;135 (3 Suppl):62–8.
9 Cohen ER, Barsuk JH, Moazed F, Caprio T, Didwania A, McGaghie WC, Wayne DB. Making July safer: simulation-based mastery learning during intern boot camp. Acad Med 2013;88:233–9.
10 Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med 2009;4:397–403.
11 Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med 2009;37:2697–701.
12 Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med 2009;169:1420–3.
13 Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med 2010;85 (10 Suppl):9–12.
14 Cohen ER, Feinglass J, Barsuk JH, Barnard C, O’Donnell A, McGaghie WC, Wayne DB. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc 2010;5:98–102.
15 McGaghie WC. Medical education research as translational science. Sci Transl Med 2010;2:19cm8.
16 McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc 2011;6 (Suppl):42–7.
17 McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Translational educational research: a necessity for effective healthcare improvement. Chest 2012;142:1097–103.

Adopting programmatic feedback to enhance the learning of complex skills

Muhamad Saiful Bahri Yusoff, Siti Nurma Hanim Hadie & Ahmad Fuad Abdul Rahim

Correspondence: Muhamad Saiful Bahri Yusoff, Department of Medical Education, School of Medical Sciences, Universiti Sains Malaysia, Kubang Kerian, Kelantan, Kota Bharu 16150, Malaysia. Tel: 00 60 169 629640; E-mail: [email protected]

doi: 10.1111/medu.12403

Feedback is widely recognised as an important tool for enhancing performance and practice.1,2 Archer defined effective feedback as information provided to learners about their previous performances that can be used to facilitate desirable and favourable improvement.2 Although some have criticised such definitions as paying inadequate attention to the role of the recipient in the feedback process,3 the inclusion of the phrase ‘can be used’ may be read as an implicit indication that feedback, to be effective, cannot be a one-directional process. Although feedback can have a very powerful effect on learning,1 it is the quality of feedback that determines its power,4 and this quality is defined to a large extent by the way in which the recipient is able to engage with the feedback he or she receives. Existing algorithms suggest feedback should be focused and organised, and should provide just the right amount of information for it to be advantageous to learning.1 How such goals interact with the cognition of the learner, however, remains incompletely understood.


In this issue of Medical Education, Olupeliyawa and colleagues examine the influence of feedback delivered through the teamwork mini-clinical evaluation exercise (T-MEX) on students’ collaboration in health care teams.5 Their paper presents a promising instructional method that bears the potential to offer strong programmatic feedback. The T-MEX allows assessors to observe students’ practices and to focus feedback on the particular behaviours required for effective teamwork skills.5 The authors report that the T-MEX appears to facilitate situated learning (i.e. learning that occurs in the context in which it will be applied) and informed self-assessment (i.e. the use of external information as a means for self-evaluation and self-reflection).5 As the T-MEX incorporates many assessment encounters, it offers an opportunity to explore how programmatic feedback efforts might interact with the cognitive workings of the individuals receiving the feedback. Hence, we found it useful to reflect upon how cognitive load theory (CLT) might shed light on the reasons for these findings.
