The Role of Simulation in Continuing Medical Education for Acute Care Physicians: A Systematic Review*

P. Kristina Khanduja, MBChB, MEd1; M. Dylan Bould, MBChB, MEd2; Viren N. Naik, MD, MEd3,4,5; Emily Hladkowicz, BA5; Sylvain Boet, MD, PhD3,5,6

Objectives: We systematically reviewed the effectiveness of simulation-based education targeting independently practicing qualified physicians in acute care specialties. We also describe how simulation is used for performance assessment in this population.

Data Sources: Data sources included MEDLINE, Embase, the Cochrane Database of Systematic Reviews, the Cochrane CENTRAL Database of Controlled Trials, and the National Health Service Economic Evaluation Database. The last date of search was January 31, 2013.

Study Selection: All original research describing simulation-based education for independently practicing physicians in anesthesiology, critical care, and emergency medicine was reviewed.

Data Extraction: Data analysis was performed in duplicate, with further review by a third author in cases of disagreement until consensus was reached. Data extraction was focused on effectiveness according to Kirkpatrick's model. For simulation-based performance assessment, tool characteristics and sources of validity evidence were also collated.

Data Synthesis: Of 39 studies identified, 30 focused on the effectiveness of simulation-based education and nine evaluated the validity of simulation-based assessment. Thirteen studies (30%) targeted the lower levels of Kirkpatrick's hierarchy with reliance on self-reporting. Simulation was unanimously described as a positive learning experience with perceived impact on clinical practice. Of the 17 remaining studies, 10 used a single group or "no intervention comparison group" design. The majority (n = 17; 44%) were able to demonstrate both immediate and sustained improvements in educational outcomes. Nine studies reported the psychometric properties of simulation-based performance assessment as their sole objective. These predominantly recruited independent practitioners as a convenience sample to establish whether the tool could discriminate between experienced and inexperienced operators, and they concentrated on a single aspect of validity evidence.

Conclusions: Simulation is perceived as a positive learning experience, with limited evidence to support improved learning. Future research should focus on the optimal modality and frequency of exposure, the quality of assessment tools, and the impact of simulation-based education beyond the individual toward improved patient care. (Crit Care Med 2015; 43:186-193)

Key Words: continuing medical education; independent practitioner; maintenance of competence; performance assessment; review; simulation

*See also p. 254.
1Department of Anesthesia, Mount Sinai Hospital, Toronto, Ontario, Canada.
2Department of Anesthesia, The Children's Hospital of Eastern Ontario, Ottawa, Ontario, Canada.
3Department of Anesthesiology, The Ottawa Hospital, Ottawa, Ontario, Canada.
4Clinician Educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada.
5Department of Anesthesiology, The Ottawa Hospital Research Institute, Ottawa, Ontario, Canada.
6The Academy for Innovation in Medical Education, Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada.
Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's website (http://journals.lww.com/ccmjournal).
Dr. Boet is supported by the Department of Anaesthesiology of The Ottawa Hospital, University of Ottawa (Ottawa, Ontario, Canada). The remaining authors have disclosed that they do not have any potential conflicts of interest.
For information regarding this article, E-Mail: [email protected]
Copyright © 2014 by the Society of Critical Care Medicine and Lippincott Williams & Wilkins
DOI: 10.1097/CCM.0000000000000672

Participation in formal continuing medical education (CME) forms a central component of professional development for the independent practitioner in the acute medical specialties (1). In an era of growing public demand for the provision of high-quality care, the overarching goals of CME have moved from learning at the level of the individual practitioner toward improved patient outcomes and safety (2, 3). In an effort to link participation in CME to robust learning outcomes, maintenance of certification frameworks have explored weighting forms of CME that result in more effective learning (4, 5). In Canada, for example, traditional didactic sessions are rewarded with one credit per hour, whereas participation in workshops that provide the learner with assessment receives three credits per hour (6).



Despite the rapid adoption of simulation in undergraduate and postgraduate training, this has not translated into an equal number of simulation-based learning opportunities for independently practicing physicians (7). Simulation-based educational (SBE) interventions, especially those involving high-fidelity, mannequin-based simulations, are increasingly being recognized as facilitating active learning in a safe and reproducible environment (8-12). Simulation has the potential for exposure to a wide variety of patient problems and conditions, whilst tailoring the exercise to the individual learner's needs and experience, as well as offering an opportunity to provide participants with feedback (13, 14).

A recent survey conducted by the American Society of Anesthesiologists (ASA) showed that the overwhelming majority of independently practicing ASA members expressed an interest in simulation-based CME activities, with a large proportion of respondents indicating that they perceived simulation-based CME as superior to traditional, lecture-based CME. However, the cost and inconvenience of limited access to simulation were quoted as the most prominent obstacles to participating in simulation-based CME exercises (15). The ASA has since actively pursued opportunities to integrate simulation into its maintenance of competence (MOC) program specifically, mirroring trends in other specialties, such as surgery (16).

The role of SBE at the level of medical student and resident has been well studied (11, 14). However, simulation's effectiveness when aimed at the independently practicing acute care physician has not previously been systematically reviewed and analyzed. Therefore, we sought to determine the effectiveness of SBE when targeting the qualified independent practitioner for CME in anesthesia, critical care, and emergency medicine.
Regulatory bodies outside Canada have already adopted time-limited certification for independent practitioners in line with MOC programs. Depending on the specialty board, recertification not only is contingent on sufficient CME credits but also includes high-stakes, summative performance assessment (17). Although most examinations currently rely on written examination, simulation-based assessment may form an attractive complementary strategy, especially given the steadily mounting evidence to support its feasibility, reliability, and validity in the formative and summative assessment of less experienced practitioners (18). Therefore, we also systematically reviewed studies where participation in a simulation-based exercise formed part of an assessment of clinical competence.

METHODS

This systematic review was planned, conducted, and reported in line with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement (19).

Objective

In conducting this systematic review, we sought to answer two questions specifically:

1. Is participation in SBE interventions associated with a change in knowledge, skills, and behavior when targeting the independently practicing acute care specialist?


2. Is simulation being used solely for performance assessment of the independently practicing acute care specialist? If so, what competencies are being evaluated, and what are the psychometric properties of the tools employed?

Study Eligibility

We intentionally chose broad inclusion criteria to establish an overview of the current landscape of SBE for the independent practitioner. Studies published in English were included if they described the use of simulation for 1) teaching or formative assessment or 2) summative assessment of clinical performance for our target population. For the purpose of this systematic review, we defined the term "simulation" as any educational technique that allows the imitation of an aspect of patient care, including human simulation, mannequin-based simulation, virtual reality, and part-task trainers, but excluding case-based learning using only text, video, or discussion. Our target population consisted of independently practicing qualified physicians, either as individuals or as part of a multidisciplinary care team, in the following specialties: pediatric and adult anesthesiology, pediatric and adult critical care medicine (including neonatology), and pediatric and adult emergency medicine. Studies were eligible for inclusion if they described original research, but no other design restrictions were imposed. Studies reporting educational outcomes at any level of Kirkpatrick's adapted hierarchy (13) (Table 1) or describing the use of simulation-based assessment for formative or summative purposes were included.

We decided a priori to exclude conference abstracts, letters to the editor, and commentaries from our systematic review. We justified the exclusion of conference abstracts on the basis that a significant proportion of trials reported in this form never reach full publication (20). In addition, those abstracts that are eventually published in full are often systematically different from the original presentation (21). Our search strategy did identify a number of conference proceedings that were published as journal supplements. These were hand-searched and cross-referenced for evidence of subsequent full-text publication.

Study Identification

Under the leadership of an experienced research librarian, the authors designed a strategy to search MEDLINE In-Process & Other Non-Indexed Citations and Ovid MEDLINE (1948 to January week 3, 2013) and Embase (1980 to 2013 week 4), both through the OvidSP interface, and The Cochrane Database of Systematic Reviews (issue 8, 2011), the Cochrane CENTRAL Database of Controlled Trials (issue 3, 2011), and the National Health Service Economic Evaluation Database (issue 3, 2011) through the Wiley interface. Known relevant studies were used as seed articles for PubMed Related Citation searching. The search strategies are presented in Appendix 1 (Supplemental Digital Content 1, http://links.lww.com/CCM/B93). The last date of search was January 31, 2013.

Study Selection

All titles and abstracts identified by the literature search were screened for eligibility independently and in duplicate. Abstracts with limited or insufficient information were automatically promoted to full-text review. Disagreements were resolved by independent review by a third author.

Data Collection Process and Quality Assessment

A screening tool was developed for both the abstracts and the full-text articles based on a pilot search and modified through iterative testing and revision (Appendix 2, Supplemental Digital Content 2, http://links.lww.com/CCM/B94). This was completed independently and in duplicate, with further review by a third author in case of disagreement. The quality of the included studies was assessed using the Cochrane Effective Practice and Organisation of Care (EPOC) risk of bias criteria for randomized controlled trials (RCTs) (22, 23) (Appendix 3, Supplemental Digital Content 3, http://links.lww.com/CCM/B95).

Synthesis of Results

A metaanalysis was not performed because of heterogeneity of study design and outcome measures; instead, a narrative summary was conducted.

RESULTS

Study Selection

We identified 2,310 potentially relevant articles, of which 1,963 remained after duplicate references were removed. Of these, 222 full articles were retrieved for full-text screening. A total of 183 full articles were subsequently excluded based on the inclusion/exclusion criteria, resulting in 39 articles included in this systematic review. The study flow is shown in Figure 1.

Study Characteristics and Synthesis

Setting. A total of 46% (n = 18) (12, 24-40) of studies were conducted in the United States, with the remainder recruiting specialists practicing in Canada (n = 9) (41-49), Australia and New Zealand (n = 5) (50-54), the United Kingdom (n = 3) (55-57), Germany (n = 1) (58), Italy (n = 1) (59), the Netherlands (n = 1) (60), and the Asia Pacific region (n = 1) (61).

Professional Discipline. Of our target learners, anesthesiologists formed the overwhelming majority of participants (73%) (Table 2). Outside the domain of assessment, only 10 studies intentionally recruited independent practitioners in the acute medical specialties as their sole intervention group (27, 34, 36, 41, 42, 45-47, 59).

Study Design. Of the 39 articles describing a SBE intervention, 10 (26%) were RCTs (12, 32, 33, 39, 42, 46, 47, 49, 53, 57). The majority of the remaining studies were quasiexperimental in design; of these, only one (36) included a comparison group (Table 2; Supplemental Table 1, Supplemental Digital Content 4, http://links.lww.com/CCM/B96). Of the 29 remaining single group designs, 15 studies were repeated measures designs (25, 28, 30, 31, 36, 37, 40, 41, 43, 44, 48, 50, 51, 56, 60), and 14 studies used a posttest only (26, 27, 29, 34, 35, 38, 45, 54, 55, 58, 59, 61, 62). One study (26) integrated the use of mixed-methods analysis to explore clinical decision-making and counseling behaviors.


Figure 1. Study flow diagram.

Focus of Intervention. The focus of the teaching interventions included technical skills and crisis resource management (CRM) for both individuals and multidisciplinary healthcare teams. The remainder of studies in this area evaluated performance during simulated crises and procedural competence.

Type of Simulation. A detailed description of the different types of simulators is provided in Supplemental Table 1 (Supplemental Digital Content 4, http://links.lww.com/CCM/B96) and Supplemental Table 2 (Supplemental Digital Content 5, http://links.lww.com/CCM/B97).

Instructional Methods and Duration. Studies described a wide range of instructional methods in addition to the SBE intervention, including didactic lectures, small group discussions, interactive exercises, instructor-led feedback, and debriefing, as outlined in Supplemental Table 1 (Supplemental Digital Content 4, http://links.lww.com/CCM/B96). A study by Treloar et al (38) also addressed the feasibility of off-site or remote instructor-led simulation and debriefing.

Levels of Outcome Assessed. Educational outcomes primarily targeted the lower levels of Kirkpatrick's adapted hierarchy (Table 2; Supplemental Table 1, Supplemental Digital Content 4, http://links.lww.com/CCM/B96). Apart from studies employing survey-based methodology, the effectiveness of the SBE intervention was predominantly evaluated in the simulated setting (n = 13) to determine a change in the learner's knowledge and/or skills (12, 33, 37, 40-42, 46-49, 53, 57, 60). Here, outcomes were assessed with emphasis on process measures, such as global rating of performance, procedural success, failure to detect key abnormalities, and behavioral aspects with instructor rating of clinical competence. More than two thirds of the studies identified reported a positive effect on learning following participation in a SBE intervention (26, 27, 29, 34, 35, 37, 38, 40, 42, 45, 47, 48, 53-55, 58-62). All surveys describing the learner's reaction and self-perceived impact on subsequent clinical practice reported high satisfaction scores with associated improved preparedness and/or performance (26, 27, 29, 34, 35, 38, 45, 54, 55, 58, 59, 61, 62).

TABLE 1. Kirkpatrick's Adapted Hierarchy of Evaluating Educational Outcomes (13)

Level 1 (Reaction): Covers learners' views on the learning experience: its organization, presentation, content, teaching methods, instructional materials, and quality of instruction.

Level 2a (Learning: change in attitudes/perceptions): Modification of attitudes/perceptions; outcomes here relate to changes in the reciprocal attitudes or perceptions between participant groups toward the intervention/simulation.

Level 2b (Learning: modification of knowledge or skills): For knowledge, this relates to the acquisition of concepts, procedures, and principles; for skills, this relates to the acquisition of thinking/problem-solving, psychomotor, and social skills.

Level 3 (Behavior): Documents the transfer of learning to the workplace or the willingness of learners to apply new knowledge and skills.

Level 4a (Results: change in professional practice): Change in organizational practice, i.e., wider changes in the organizational delivery of care attributable to an educational program.

Level 4b (Results: benefits to patients): Any improvement in the health and well-being of patients/clients as a direct result of an educational program.

TABLE 2. Participant and Study Characteristics (No. of Studies / No. of Participants)

Study design
  Survey-based data collection: 14 / 980
  Posttest only or pretest-posttest: 15 / 417
  Randomized controlled trial: 10 / 190

Participants
  Anesthetists: 19 / 1,149
  Emergency physicians: 10 / 101
  Critical care physicians: 7 / 289
  Mixed group of acute care physicians: 3 / 41

Clinical topics
  Airway: 4 / 104
  Crisis resource management, for individuals: 9 / 805
  Crisis resource management, for healthcare teams: 8 / 82
  Clinical skills: 1 / 49
  Procedural skills: 6 / 292
  Communication skills: 2 / 26
  Assessment: 9 / 216

Summary of Findings by Kirkpatrick's Adapted Hierarchy

1. Level 1: participation in an educational experience and the learner's reaction (n = 7) (26, 27, 35, 38, 45, 59, 61)

At this level, data regarding participants' reaction and satisfaction were gathered using surveys and questionnaires, usually measured using Likert scales. Information was collected immediately following simulation-based learning. All studies reported positive responses. Few studies, however, commented on which aspect of the simulation was perceived as the most beneficial.

2. Level 2a: a change in the learner's attitudes and perceptions (n = 6) (29, 34, 52, 55, 58, 62)

Again, participant responses were primarily collected via surveys and questionnaires using Likert scales and relying on self-reported responses. Results unanimously demonstrated a perceived improvement in clinical preparedness and confidence with respect to procedural, technical, and nontechnical skills immediately following the SBE intervention. In addition, all studies focusing on CRM skills for multidisciplinary teams reported a perceived improvement in teamwork and communication skills.

3. Level 2b: a change in the learner's knowledge and/or skills (n = 14) (12, 33, 37, 40-42, 46-49, 51, 53, 57, 60)

Six of 14 studies integrated a comparison/control group into their study design and exposed this group to an alternative educational intervention. Eight studies were a single group design with no control. A change in knowledge and/or skills was evaluated by an immediate posttest in the simulated setting in 10 studies. Of these, eight studies (25, 28, 37, 40, 41, 48, 60) were associated with statistically significant improvements in learning outcomes. Only two studies (42, 57) examined the longer term retention of skills. Boet et al (42) demonstrated statistically significant improvements in the performance of advanced emergency airway procedures with retention testing at 1 year. Morton et al (57) were also able to demonstrate statistically significant improvements in communication skills for critical care physicians from pretest to posttest. These changes, however, were not maintained at 6 months.

4. Level 3: behavioral change (transfer of learning to the workplace; n = 2) (39, 63)

Two studies (28, 39) evaluated performance-based outcomes in the clinical setting. Although Steinemann et al (28) found a statistically significant improvement in teamwork ratings, clinical task speed, and completion rates following participation of emergency department teams in high-fidelity simulation-based team training, Shapiro et al (39) could only demonstrate a trend toward improved team interactions and behaviors when a similar intervention was compared with a control group undergoing didactic teaching only.

5. Level 4a: change in professional practice (change in organizational practice, i.e., wider changes in the organizational delivery of care attributable to an educational program; n = 1) (36)

We found only one study, conducted by Cooper et al (36), that examined differences in safety climate, as a surrogate marker for patient safety interventions, among four academic anesthesia departments that had participated in simulation-based CRM training versus two that had not. Faculty were asked to complete a 59-item survey prior to training and 3 years following course completion. Although climate scores varied significantly between hospitals, there were no differences between the trained and the untrained cohorts. The group concluded that safety climate may not adequately reflect the impact of CRM training and that SBE in isolation may be insufficient to alter engrained behaviors without complementary reinforcing actions.

6. Level 4b: benefits to patients (any improvement in the health and well-being of patients/clients as a direct result of an educational program)

We did not identify any studies that examined the effectiveness of SBE at this level.

Supplemental Table 3 (Supplemental Digital Content 6, http://links.lww.com/CCM/B98) summarizes the nine studies that targeted independent practitioners solely for simulation-based performance assessment (24, 25, 30-32, 43, 44, 50, 56). The primary objective of all studies was to establish validity evidence for the tools used. Areas under investigation included CRM skills (n = 5) and procedural skills. Evaluations generally related to process measures, with emphasis on key tasks, time-related endpoints, and ratings of global competence. A single study (25) reported on the properties of a tool designed to evaluate a nontechnical skill (situational awareness). All studies reported on aspects of construct validity (i.e., the degree to which a score can be interpreted as representing the intended underlying skill) (64), with emphasis on training level to define a group of experts versus intermediate or novice practitioners.

The majority of studies (n = 7) reported a positive association between level of experience and simulation-based performance scores. Two studies focusing on the assessment of knowledge and procedural skills (procedural sedation and epidural catheter insertion, respectively) (25, 50) were unable to demonstrate statistically significant differences in performance between independent practitioners and less experienced physicians. Additional outcomes reported included reliability data, most often in the form of interrater reliability. Only one study (32) addressed more than one aspect of validity evidence, by comparing performance scores on both animal and computer models. None of the studies identified evaluated the relationship between simulation-based performance and performance in real life.

Quality of Included Studies: Risk of Bias for RCTs

We used the EPOC criteria for the small subset of studies that used an RCT format (n = 10) (Appendix 3, Supplemental Digital Content 3, http://links.lww.com/CCM/B95). Only a few studies provided precise details of randomization. There were large variations in the reporting of blinding of participants, personnel, and outcome assessors. Studies were considered to be at high risk if study personnel were not blinded to group allocations. Most studies used human raters for outcome assessments. Although all of these employed at least two raters, few studies provided details on rater training. All studies posed a focused research question with adequate reports of outcome data. Other sources of bias included a poorly defined sampling strategy, whereby subjects were often enrolled on a first-come, first-served basis.

DISCUSSION

Acute care physicians perceive SBE as a positive experience with self-reported impact on knowledge, skills, and attitudes in a wide variety of clinical settings. Limited evidence also suggests that SBE is associated with improved learning outcomes, both immediately and in the longer term, when compared with no other intervention. Of more than 1,900 articles screened, only 10 studies recruited independent practitioners as their sole target group, highlighting that this population remains relatively understudied. Few studies compared the impact of participation in SBE with other educational strategies, which limits our ability to comment on the effectiveness of simulation when compared with alternatives.

Although simulation-based assessment appears to discriminate between experienced and inexperienced practitioners with respect to procedural and nontechnical skills, there is limited work on other important aspects of construct validity, such as internal structure, relation to other variables, response process, and, importantly, consequences.

Limitations and Strengths

Our results are limited primarily by both the quantity and the quality of the studies eligible for inclusion. We decided a priori to restrict our definition of acute care specialties to those related to acute medicine, thereby intentionally excluding surgical subspecialties, such as traumatology. In doing so, we still captured the breadth of SBE interventions while focusing on independent practitioners who practice within specialties with complementary, overlapping skills and competencies. Within our target population, a significant number of studies determined the perceived benefit of SBE by surveys, relying on self-reported changes to clinical practice. However, the ability of healthcare professionals to perform accurate self-assessments is often flawed when compared with more objective measures of performance (65, 66). Less than a quarter of the studies used an RCT design. Overall, few studies compared the effectiveness of the SBE intervention with an alternative approach.

Our broad inclusion criteria generated a large degree of variation with respect to participants, their backgrounds, and the type of educational intervention, which adds to the comprehensiveness, breadth of scope, and generalizability of our review. Other strengths include a systematic search strategy and literature search that allowed us to identify a large pool of eligible studies first, followed by rigorous and reproducible coding.

Integration With Existing Literature

To our knowledge, this systematic review is the first to describe the role of SBE for the independent practitioner while exploring its effectiveness as part of a CME strategy. Although we have no comparison in the form of previous reviews, our findings echo many of the themes highlighted in the simulation literature as a whole. The majority of studies included in this review used a single cohort design, with a small number including a "no intervention control" group. We agree with the authors of a recent systematic review and metaanalysis (67, 68) that this proves of little benefit in furthering understanding of when and how to use simulation in the most effective manner. When addressing the role of simulation-based assessment, our findings appear in keeping with those of Cook et al (69), whereby aspects of validity within our population of interest appear understudied. In addition, there is a need for a more structured approach to reporting that addresses multiple sources of validity rather than a single element.

The impact of SBE on clinical performance and patient outcome was associated with equivocal benefit when compared with no intervention or nonsimulation instruction. The number of studies identified, and the resultant number of subjects included in this systematic review, was small, making further meaningful synthesis difficult. The challenges inherent to conducting studies that evaluate learning outcomes at the highest level of Kirkpatrick's adapted hierarchy do not negate the need for strategically and rigorously designed studies, as suggested in a recent systematic review (70).

Although the role of simulation for independent practitioners outside acute medicine does not appear to have been formally reviewed, the drive to include SBE in CME in other clinical specialties appears equally strong. The American Board of Family Medicine integrated simulation-based self-assessment modules into its MOC program as early as 2004 (71). These are highly rated by participants, with self-reported positive changes to clinical practice (72). Within the surgical specialties, the American Board of Surgery requires "documentation of successful completion of Advanced Cardiovascular Life Support (ACLS), Advanced Trauma Life Support, and Fundamentals of Laparoscopic Surgery for initial certification." ACLS has made use of simulation-based learning and assessment for many years, with evidence to support increased retention of skills when compared with other teaching and testing modalities (73). Simulation appears particularly amenable to assessing procedural skill for laparoscopic surgery and has already been integrated into the credentialing process by the American Board of Surgery (74).

IMPLICATIONS FOR RESEARCH AND PRACTICE

This systematic review provides significant insight into the current landscape of SBE targeting specialists in acute medicine. The integration of simulation into CME and MOC for independent physicians in the acute medical specialties is not accompanied by high-quality evidence at this time. There is convincing evidence to support the role of simulation-based learning and, to some extent, assessment at the undergraduate and postgraduate level (75). Although some of the theoretical aspects may apply to our target population, the learning needs of experienced practitioners likely differ, making this area a high priority for future research efforts. Much work is needed before we can definitively answer our initial research question as to the effectiveness of SBE, its potential role in CME, and formative or summative assessment of competence.

Although some might claim that it is premature to advocate for more widespread implementation of simulation at this stage, we argue the opposite. More traditional education strategies, such as conferences, workshops, and in-house rounds (76), will unlikely suffice to maintain the high standards of care that the profession, regulatory bodies, and the public demand of us, and in this instance "putting the cart before the horse" may be a potent driver for change. The aviation industry has set a valuable example in this arena, whereby mandatory simulation training for airline crews at regular intervals has undoubtedly impacted safety. The Controlled Risk Insurance Company in New England, for example, followed suit by introducing an incentive for anesthesiologists to participate in simulation-based CRM training, with some evidence to support a reduction in subsequent malpractice claims (77). This sets an interesting precedent in linking simulation to patient benefits.
Although some evidence of positive learning outcomes with the potential for transferability into the clinical setting exists (78), simultaneous and ongoing study will ensure the implementation of robust CME programs. Setting a research agenda that includes direct comparison with other educational strategies and simulation modalities, addresses the appropriate duration and frequency of training, and examines the impact on clinical performance will not only further our understanding of simulation for CME but prove vital with respect to quality assurance. Similarly, educational outcomes need to be evaluated beyond self-reported and perceived effectiveness. In addressing Kirkpatrick's hierarchy at the level of healthcare systems and patient safety, we can establish the missing factor in the cost:benefit analysis of SBE.

Within the realm of assessment, simulation has a number of theoretical advantages over more traditional methods, such as self-reports and written tests alone (79). Clinical competence at the level of the experienced practitioner is multidimensional and nuanced and cannot be sufficiently addressed by a single modality. To gain more widespread acceptance among experienced practitioners, it will be vital to explore barriers, in particular cultural misconceptions, together with circumstances in which simulation may be complementary to other training modalities. This has to go hand in hand with dedicated and structured research into aspects of validity.

ACKNOWLEDGMENTS

We acknowledge the contribution of Margaret Sampson, MLIS, PhD, AHIP, for her guidance in designing and implementing the search strategy.

REFERENCES

1. Bashook PG, Parboosingh J: Recertification and the maintenance of competence. BMJ 1998; 316:545-548
2. Gallagher CJ, Tan JM: The current status of simulation in the maintenance of certification in anesthesia. Int Anesthesiol Clin 2010; 48:83-99
3. Iglehart JK, Baron RB: Ensuring physicians' competence - is maintenance of certification the answer? N Engl J Med 2012; 367:2543-2549
4. Peck C, McCall M, McLaren B, et al: Continuing medical education and continuing professional development: International comparisons. BMJ 2000; 320:432-435
5. Davis N, Davis D, Bloch R: Continuing medical education: AMEE Education Guide No 35. Med Teach 2008; 30:652-666
6. Royal College of Physicians and Surgeons of Canada: MOC Program "at a glance" summary. Available at: http://rcpsc.medical.org/opa/moc-program/documents/MOC_Insert_Flyer_ENG.pdf. Accessed September 25, 2011
7. Byrick RJ, Naik VN, Wynands JE: Simulation-based education in Canada: Will anesthesia lead in the future? Can J Anaesth 2009; 56:273-275
8. DeMaria S Jr, Samuelson ST, Schwartz AD, et al: Simulation-based assessment and retraining for the anesthesiologist seeking reentry to clinical practice: A case series. Anesthesiology 2013; 119:206-217
9. Boet S, Bould MD, Sharma B, et al: Within-team debriefing versus instructor-led debriefing for simulation-based education: A randomized controlled trial. Ann Surg 2013; 258:53-58
10. Riem N, Boet S, Chandra D: Setting standards for simulation in anesthesia: The role of safety criteria in accreditation standards. Can J Anaesth 2011; 58:846-852

11. Cook DA, Hatala R, Brydges R, et al: Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA 2011; 306:978-988
12. Frengley RW, Weller JM, Torrie J, et al: The effect of a simulation-based training intervention on the performance of established critical care unit teams. Crit Care Med 2011; 39:2605-2611
13. Issenberg SB, McGaghie WC, Petrusa ER, et al: Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005; 27:10-28
14. McGaghie WC, Siddall VJ, Mazmanian PE, et al; American College of Chest Physicians Health and Science Policy Committee: Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: Effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest 2009; 135:62S-68S
15. American Society of Anesthesiologists: ASA member survey on simulation. Available at: http://asatest.asahq.org/SIM/memberpollstats040606.pdf. Accessed September 25, 2011
16. Johnson KA, Sachdeva AK, Pellegrini CA: The critical role of accreditation in establishing the ACS Education Institutes to advance patient safety through simulation. J Gastrointest Surg 2008; 12:207-209
17. Cassel CK, Leatherman S, Black C, et al: Physicians' assessment and competence: USA and UK. Lancet 2006; 368:1557-1559
18. Boulet JR, Murray DJ: Simulation-based assessment in anesthesiology: Requirements for practical implementation. Anesthesiology 2010; 112:1041-1052
19. Liberati A, Altman DG, Tetzlaff J, et al: The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. Ann Intern Med 2009; 151:W65-W94
20. Scherer RW, Langenberg P, von Elm E: Full publication of results initially presented in abstracts. Cochrane Database Syst Rev 2007; MR000005
21. Hopewell S, Clarke M, Askie L: Reporting of trials presented in conference abstracts needs to be improved. J Clin Epidemiol 2006; 59:681-684
22. Cochrane Effective Practice and Organisation of Care Group: Suggested risk of bias criteria for EPOC reviews. Available at: http://epoc.cochrane.org/epoc-author-resources. Accessed November 2, 2013
23. Higgins JP, Altman DG, Gotzsche PC, et al; Cochrane Bias Methods Group; Cochrane Statistical Methods Group: The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ 2011; 343:d5928
24. Henrichs BM, Avidan MS, Murray DJ, et al: Performance of certified registered nurse anesthetists and anesthesiologists in a simulation-based skills assessment. Anesth Analg 2009; 108:255-262
25. Kobayashi L, Dunbar-Viveiros JA, Devine J, et al: Pilot-phase findings from high-fidelity in situ medical simulation investigation of emergency department procedural sedation. Simul Healthc 2012; 7:81-94
26. Boss RD, Donohue PK, Roter DL, et al: "This is a decision you have to make": Using simulation to study prenatal counseling. Simul Healthc 2012; 7:207-212
27. Stevens LM, Cooper JB, Raemer DB, et al: Educational program in crisis management for cardiac surgery teams including high realism simulation. J Thorac Cardiovasc Surg 2012; 144:17-24
28. Steinemann S, Berg B, Skinner A, et al: In situ, multidisciplinary, simulation-based teamwork training improves early trauma care. J Surg Educ 2011; 68:472-477
29. Bretholz A, Doan Q, Cheng A, et al: A presurvey and postsurvey of a web- and simulation-based course of ultrasound-guided nerve blocks for pediatric emergency medicine. Pediatr Emerg Care 2012; 28:506-509
30. Murray DJ, Boulet JR, Avidan M, et al: Performance of residents and anesthesiologists in a simulation-based skill assessment. Anesthesiology 2007; 107:705-713
31. Dong Y, Suri HS, Cook DA, et al: Simulation-based objective assessment discerns clinical proficiency in central line placement: A construct validation. Chest 2010; 137:1050-1056
32. Chapman DM, Marx JA, Honigman B, et al: Emergency thoracotomy: Comparison of medical student, resident, and faculty performances on written, computer, and animal-model assessments. Acad Emerg Med 1994; 1:373-381
33. Chapman DM, Rhee KJ, Marx JA, et al: Open thoracotomy procedural competency: Validity study of teaching and assessment modalities. Ann Emerg Med 1996; 28:641-647
34. Blum RH, Raemer DB, Carroll JS, et al: Crisis resource management training for an anaesthesia faculty: A new approach to continuing education. Med Educ 2004; 38:45-55

January 2015 • Volume 43 • Number 1

35. Holzman RS, Cooper JB, Gaba DM, et al: Anesthesia crisis resource management: Real-life simulation training in operating room crises. J Clin Anesth 1995; 7:675-687
36. Cooper JB, Blum RH, Carroll JS, et al: Differences in safety climate among hospital anesthesia departments and the effect of a realistic simulation-based training program. Anesth Analg 2008; 106:574-584
37. Subbarao I, Bond WF, Johnson C, et al: Using innovative simulation modalities for civilian-based, chemical, biological, radiological, nuclear, and explosive training in the acute management of terrorist victims: A pilot study. Prehosp Disaster Med 2006; 21:272-275
38. Treloar D, Hawayek J, Montgomery JR, et al; Medical Readiness Trainer Team: On-site and distance education of emergency medicine personnel with a human patient simulator. Mil Med 2001; 166:1003-1006
39. Shapiro MJ, Morey JC, Small SD, et al: Simulation based teamwork training for emergency department staff: Does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care 2004; 13:417-421
40. Reznek MA, Rawn CL, Krummel TM: Evaluation of the educational effectiveness of a virtual reality intravenous insertion simulator. Acad Emerg Med 2002; 9:1319-1325
41. Borges BC, Boet S, Siu LW, et al: Incomplete adherence to the ASA difficult airway algorithm is unchanged after a high-fidelity simulation session. Can J Anaesth 2010; 57:644-649
42. Boet S, Borges BC, Naik VN, et al: Complex procedural skills are retained for a minimum of 1 yr after a single high-fidelity simulation training session. Br J Anaesth 2011; 107:533-539

43. Devitt JH, Kurrek MM, Cohen MM, et al: Testing internal consistency and construct validity during evaluation of performance in a patient simulator. Anesth Analg 1998; 86:1160-1164
44. Devitt JH, Kurrek MM, Cohen MM, et al: The validity of performance assessments using simulation. Anesthesiology 2001; 95:36-42
45. Morgan PJ, Cleave-Hogg D: A Canadian simulation experience: Faculty and student opinions of a performance evaluation study. Br J Anaesth 2000; 85:779-781
46. Morgan PJ, Tarshis J, LeBlanc V, et al: Efficacy of high-fidelity simulation debriefing on the performance of practicing anaesthetists in simulated scenarios. Br J Anaesth 2009; 103:531-537
47. Morgan PJ, Kurrek MM, Bertram S, et al: Nontechnical skills assessment after simulation-based continuing medical education. Simul Healthc 2011; 6:255-259
48. Wong DT, Prabhu AJ, Coloma M, et al: What is the minimum training required for successful cricothyroidotomy? A study in mannequins. Anesthesiology 2003; 98:349-353
49. Chenkin J, Lee S, Huynh T, et al: Procedures can be learned on the Web: A randomized study of ultrasound-guided vascular access training. Acad Emerg Med 2008; 15:949-954
50. Lee RA, van Zundert TC, van Koesveld JJ, et al: Evaluation of the Mediseus epidural simulator. Anaesth Intensive Care 2012; 40:311-318
51. Douglas AE, Holley A, Udy A, et al: Can learning to sustain life be BASIC? Teaching for the initial management of the critically ill in Australia and New Zealand. Anaesth Intensive Care 2010; 38:1043-1051
52. Weller JM, Bloch M, Young S, et al: Evaluation of high fidelity patient simulator in assessment of performance of anaesthetists. Br J Anaesth 2003; 90:43-47
53. Martin KM, Larsen PD, Segal R, et al: Effective nonanatomical endoscopy training produces clinical airway endoscopy proficiency. Anesth Analg 2004; 99:938-944
54. Weller J, Wilson L, Robinson B: Survey of change in practice following simulation-based training in crisis management. Anaesthesia 2003; 58:471-473
55. Freeth D, Ayida G, Berridge EJ, et al: Multidisciplinary obstetric simulated emergency scenarios (MOSES): Promoting patient safety in obstetrics with teamwork-focused interprofessional simulations. J Contin Educ Health Prof 2009; 29:98-104
56. Rovamo L, Mattila MM, Andersson S, et al: Assessment of newborn resuscitation skills of physicians with a simulator manikin. Arch Dis Child Fetal Neonatal Ed 2011; 96:F383-F389
57. Morton J, Blok GA, Reid C, et al: The European Donor Hospital Education Programme (EDHEP): Enhancing communication skills with bereaved relatives. Anaesth Intensive Care 2000; 28:184-190
58. Stocker M, Allen M, Pool N, et al: Impact of an embedded simulation team training programme in a paediatric intensive care unit: A prospective, single-centre, longitudinal study. Intensive Care Med 2012; 38:99-104
59. Brazzi L, Lissoni A, Panigada M, et al: Simulation-based training of extracorporeal membrane oxygenation during H1N1 influenza pandemic: The Italian experience. Simul Healthc 2012; 7:32-34
60. Schilleman K, Witlox RS, Lopriore E, et al: Leak and obstruction with mask ventilation during simulated neonatal resuscitation. Arch Dis Child Fetal Neonatal Ed 2010; 95:F398-F402
61. Vincent DS, Berg BW, Ikegami K: Mass-casualty triage training for international healthcare workers in the Asia-Pacific region using manikin-based simulations. Prehosp Disaster Med 2009; 24:206-213
62. Weller J, Morris R, Watterson L, et al: Effective management of anaesthetic crises: Development and evaluation of a college-accredited simulation-based course for anaesthesia education in Australia and New Zealand. Simul Healthc 2006; 1:209-214
63. Steinemann S, Berg B, DiTullio A, et al: Assessing teamwork in the trauma bay: Introduction of a modified "NOTECHS" scale for trauma. Am J Surg 2012; 203:69-75
64. Cook DA, Beckman TJ: Current concepts in validity and reliability for psychometric instruments: Theory and application. Am J Med 2006; 119:166.e7-166.e16
65. Davis DA, Mazmanian PE, Fordis M, et al: Accuracy of physician self-assessment compared with observed measures of competence: A systematic review. JAMA 2006; 296:1094-1102
66. Eva KW, Regehr G: Self-assessment in the health professions: A reformulation and research agenda. Acad Med 2005; 80:S46-S54
67. Cook DA, Brydges R, Zendejas B, et al: Technology-enhanced simulation to assess health professionals: A systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013; 88:872-883
68. Cook DA: If you teach them, they will learn: Why medical education needs comparative effectiveness research. Adv Health Sci Educ Theory Pract 2012; 17:305-310
69. Cook DA, Brydges R, Zendejas B, et al: Technology-enhanced simulation to assess health professionals: A systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013; 88:872-883
70. Zendejas B, Brydges R, Wang AT, et al: Patient outcomes in simulation-based medical education: A systematic review. J Gen Intern Med 2013; 28:1078-1089
71. American Board of Family Medicine: Maintenance of Certification: Part II-Self-Assessment and Lifelong Learning. 2011. Available at: http://www.theabfm.org/moc/part2.aspx. Accessed November 1, 2013
72. Hagen MD, Ivins DJ, Puffer JC, et al: Maintenance of certification for family physicians (MC-FP) self assessment modules (SAMs): The first year. J Am Board Fam Med 2006; 19:398-403
73. Wayne DB, Butter J, Siddall VJ, et al: Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med 2006; 21:251-256
74. Vassiliou MC, Dunkin BJ, Marks JM, et al: FLS and FES: Comprehensive models of training and assessment. Surg Clin North Am 2010; 90:535-558
75. Okuda Y, Bryson EO, DeMaria S Jr, et al: The utility of simulation in medical education: What is the evidence? Mt Sinai J Med 2009; 76:330-343
76. Davis D, O'Brien MA, Freemantle N, et al: Impact of formal continuing medical education: Do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA 1999; 282:867-874
77. McCarthy J, Cooper JB: Malpractice insurance carrier provides premium incentive for simulation-based training and believes it has made a difference. APSF Newsl 2007; 22:17
78. Boet S, Bould MD, Fung L, et al: Transfer of learning and patient outcome in simulated crisis resource management: A systematic review. Can J Anaesth 2014; 61:571-582
79. Holmboe E, Rizzolo MA, Sachdeva AK, et al: Simulation-based assessment and the regulation of healthcare professionals. Simul Healthc 2011; 6:S58-S62

Copyright of Critical Care Medicine is the property of Lippincott Williams & Wilkins and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use.
