Technology-Enhanced Simulation and Pediatric Education: A Meta-analysis Adam Cheng, Tara R. Lang, Stephanie R. Starr, Martin Pusic and David A. Cook Pediatrics 2014;133;e1313; originally published online April 14, 2014; DOI: 10.1542/peds.2013-2139

The online version of this article, along with updated information and services, is located on the World Wide Web at: http://pediatrics.aappublications.org/content/133/5/e1313.full.html

PEDIATRICS is the official journal of the American Academy of Pediatrics. A monthly publication, it has been published continuously since 1948. PEDIATRICS is owned, published, and trademarked by the American Academy of Pediatrics, 141 Northwest Point Boulevard, Elk Grove Village, Illinois, 60007. Copyright © 2014 by the American Academy of Pediatrics. All rights reserved. Print ISSN: 0031-4005. Online ISSN: 1098-4275.

Downloaded from pediatrics.aappublications.org at Tufts Univ on September 25, 2014

REVIEW ARTICLE

Technology-Enhanced Simulation and Pediatric Education: A Meta-analysis

AUTHORS: Adam Cheng, MD, FRCPC,a Tara R. Lang, MD,b Stephanie R. Starr, MD,c Martin Pusic, MD,d and David A. Cook, MD, MHPEe

aDepartment of Pediatrics, Alberta Children’s Hospital and University of Calgary, Calgary, Canada; bDivision of Neonatology, Department of Pediatric and Adolescent Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota; cDepartment of Pediatric and Adolescent Medicine, and eDepartment of Medicine and Office of Education Research, Mayo Clinic College of Medicine, Rochester, Minnesota; and dOffice of Medical Education, Division of Educational Informatics, New York University School of Medicine, New York, New York

KEY WORDS: pediatric, technology-enhanced, simulation, systematic review, education

ABBREVIATIONS:
CI—95% confidence interval
ES—effect size
MERSQI—Medical Education Research Study Quality Instrument
NRP—neonatal resuscitation program
TES—technology-enhanced simulation

Dr Cheng was the principal investigator and conceptualized and designed the protocol, contributed to data collection, drafted the initial manuscript, and reviewed and revised the manuscript; he and Dr Cook had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis; Dr Lang contributed to designing the study protocol and data collection instruments, contributed to data collection, contributed to drafting the initial manuscript, and reviewed and revised the manuscript; Drs Starr and Pusic contributed to designing the study protocol and data collection instruments and reviewed and revised the manuscript; Dr Cook was the senior investigator and conceptualized and designed the protocol, contributed to data collection, carried out the statistical analysis, and reviewed and revised the manuscript; and all authors approved the final manuscript as submitted.

www.pediatrics.org/cgi/doi/10.1542/peds.2013-2139
doi:10.1542/peds.2013-2139

Accepted for publication Jan 29, 2014

Address correspondence to Adam Cheng, MD, FRCPC, Department of Pediatrics, Alberta Children’s Hospital, 2888 Shaganappi Trail NW, Calgary, AB, Canada T3B 6A8.
E-mail: [email protected]

abstract

BACKGROUND AND OBJECTIVE: Pediatrics has embraced technology-enhanced simulation (TES) as an educational modality, but its effectiveness for pediatric education remains unclear. The objective of this study was to describe the characteristics and evaluate the effectiveness of TES for pediatric education.

METHODS: This review adhered to PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) standards. A systematic search of Medline, Embase, CINAHL, ERIC, Web of Science, Scopus, key journals, and previous review bibliographies through May 2011 and an updated Medline search through October 2013 were conducted. Original research articles in any language evaluating the use of TES for educating health care providers at any stage, where the content solely focuses on patients 18 years or younger, were selected. Reviewers working in duplicate abstracted information on learners, clinical topic, instructional design, study quality, and outcomes. We coded skills (simulated setting) separately for time and nontime measures and similarly classified patient care behaviors and patient effects.

RESULTS: We identified 57 studies (3666 learners) using TES to teach pediatrics. Effect sizes (ESs) were pooled by using a random-effects model. Among studies comparing TES with no intervention, pooled ESs were large for outcomes of knowledge, nontime skills (eg, performance in simulated setting), behaviors with patients, and time to task completion (ES = 0.80–1.91). Studies comparing the use of high versus low physical realism simulators showed small to moderate effects favoring high physical realism (ES = 0.31–0.70).

CONCLUSIONS: TES for pediatric education is associated with large ESs in comparison with no intervention. Future research should include comparative studies that identify optimal instructional methods and incorporate pediatric-specific issues into educational interventions. Pediatrics 2014;133:e1313–e1323

PEDIATRICS (ISSN Numbers: Print, 0031-4005; Online, 1098-4275). Copyright © 2014 by the American Academy of Pediatrics FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose. FUNDING: No external funding. POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.

PEDIATRICS Volume 133, Number 5, May 2014


With changes in both training and practice environments, simulation has emerged as an integral tool in health professions education. Simulation permits mastery learning through deliberate practice1 of high-risk and/or low-frequency events or procedures without compromising patient safety.2 In particular, technology-enhanced simulation (TES), defined as an educational tool or device with which the learner physically interacts to mimic an aspect of clinical care,3 has been the subject of much research. TES may include computer-based virtual reality simulators, high-fidelity and static mannequins, plastic models, live animals, inert animal products, and human cadavers.

Pediatrics has embraced simulation as a learning modality, starting with the call for a paradigm shift in medical education,4 with subsequent integration of TES into pediatric residency and fellowship training programs.5–8 The effectiveness of TES as an educational intervention has been measured with various outcomes, spanning Kirkpatrick’s9 4 levels of learning evaluation: reactions (eg, what participants thought and felt about the training), learning (eg, increased knowledge/skills), behavior (eg, performance in clinical setting), and results (eg, impact on real patients). A recent review and meta-analysis of TES revealed that simulation has large beneficial effects on knowledge, skills, and behavior and moderate beneficial effects on patient outcomes when compared with no intervention.3 When compared with nonsimulation instruction, TES was found to have a small to moderate positive impact on outcomes.3 However, this review did not isolate simulation for pediatric education. Although several narrative reviews have described the use of simulation in pediatrics,10–14 these reviews had limitations, including a lack of a systematic search for articles, an analysis of the quality of existing research, or a quantitative synthesis of outcomes.

Furthermore, we cannot assume that the results from systematic reviews of TES in other fields3,15 can be extrapolated to the pediatric population. Pediatric patients vary from adults in size, physiology, and diseases treated; and the pediatric work environment and demographic characteristics of pediatric health care providers often differ from those of adult medicine. These differences could, in turn, influence the optimal methods of educating pediatric health care providers. A comprehensive review and synthesis of existing evidence in TES for pediatric education would provide educators guidance regarding the present uses of this instructional approach and identify gaps in the field that warrant attention in future research. The goal of this review was to describe the characteristics of pediatric-related TES studies and quantitatively evaluate the effectiveness of TES as an educational modality used for educating health professionals caring for neonates, children, and adolescents.

METHODS

This review was planned, conducted, and reported in adherence with PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) standards of quality for reporting meta-analyses.16 Additional details on study design and execution have been published previously.3

Questions

We sought to answer 3 questions:

1. What are the characteristics of pediatric studies involving TES?
2. What is the effectiveness of TES for teaching pediatric content to health care professionals in comparison with no intervention or with other nonsimulation instruction?
3. What instructional design features are associated with improved learning outcomes for pediatric TES education?

Study Eligibility

We included comparative studies published in any language that (1) investigated the use of TES for training health care providers at any stage of training or practice, including physicians, nurses, paramedics, respiratory therapists, and emergency medical technicians and (2) focused solely on education for the care of the pediatric patient (≤18 years of age). Studies with adult educational content were not included. We included single-group pretest-posttest, 2-group nonrandomized, and randomized studies. All studies making comparison with no intervention (eg, control arm), an alternate simulation modality, or a nonsimulation instructional modality were included.

Data Sources

We searched Medline, Embase, CINAHL, PsycINFO, ERIC, Web of Science, and Scopus, with the last search date of May 11, 2011. This search sought studies of TES broadly and included terms such as “simulation,” “computer simulation,” “training,” “skill,” “mannequin” or “manikin,” and “assessment,” among others; this strategy has been published in full.3 We also reviewed all articles published in 2 journals devoted to health professions simulation (Simulation in Healthcare and Clinical Simulation in Nursing) and the references of several key review articles on pediatric simulation.10–14

Study Selection

Study selection proceeded in 3 stages. First, to identify studies of simulation-based training broadly, 2 independent reviewers screened for inclusion both the titles and abstracts of all potentially eligible studies, with chance-adjusted interrater agreement determined by using the intraclass correlation coefficient of 0.69. Second, to identify studies focused on pediatric education, 3 reviewers screened all articles by using loose criteria (erring on the side of inclusion). Finally, from this subset, 2 reviewers identified articles meeting criteria for inclusion in the present review.
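The chance-adjusted agreement figures quoted for study screening (intraclass correlation coefficients of 0.69 here, and 0.77 for the updated search) can be computed from the two screeners' paired include/exclude decisions. The sketch below is illustrative only: the paper does not state which ICC formulation was used, so a one-way random-effects ICC(1,1) is assumed, and the function name is the author's invention.

```python
import numpy as np

def icc_one_way(ratings):
    """ICC(1,1): one-way random-effects intraclass correlation.

    ratings: (n_subjects, k_raters) array, eg 0/1 include/exclude
    decisions from 2 independent screeners for each article.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)
    # One-way ANOVA: between-subject and within-subject mean squares
    ss_between = k * np.sum((subj_means - grand) ** 2)
    ss_within = np.sum((ratings - subj_means[:, None]) ** 2)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

Perfect rater agreement yields an ICC of 1.0, and chance-level agreement yields values near 0, so the reported 0.69 and 0.77 indicate substantial, but imperfect, screening agreement.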

Data Extraction

We extracted information from each study independently and in duplicate, with all conflicts resolved by consensus. We abstracted information on the clinical topic, pediatric age subset (ie, content related to the following age groups: neonatal = <1 month, pediatric = 1 month to <13 years, adolescent = 13 years to 18 years), training level of learners, instructional design features of simulation training (eg, mastery learning, distributed practice, and multiple learning strategies), study design, method of group assignment, outcomes type, and methodologic quality.3 Methodologic quality was graded by using the Medical Education Research Study Quality Instrument (MERSQI)17 and an adaptation of the Newcastle-Ottawa scale18 for cohort studies. Data were abstracted separately for learner reactions (satisfaction) and for learning outcomes of knowledge, skills, behaviors with patients, and direct effects on patients. Skills outcomes were further classified as time (time to complete the task) and nontime (process [eg, performance rating in a simulated setting] or product [eg, successful task completion or major errors]), whereas behaviors with real patients were similarly classified as time (to complete task) and nontime (process; eg, performance rating in real patient setting).

Data Synthesis

We conducted both quantitative and qualitative syntheses of the evidence. A standardized mean difference (Hedges’ g effect size [ES]) was calculated for each comparison by using standard techniques.19–21 We used random-effects meta-analysis to quantitatively pool these results, organized by comparison (ie, comparison with no intervention, with another form of instruction, or between high and low physical realism) and outcome. For studies making a comparison with no intervention we conducted subgroup analysis on the basis of key study design and instructional design features. We also performed sensitivity analyses excluding the results of studies with imprecise ES calculations.3 We did not perform subgroup analyses for studies with active comparison due to few eligible articles per analysis.

For studies comparing different methods of TES, we identified the theme of high versus low physical realism and pooled the results of related studies by using meta-analysis. For the purposes of this study, we defined physical realism as the physical properties of the simulation mannequin and accoutrements.22 High physical realism simulators are those that provide physical findings, display vital signs, physiologically respond to interventions (via computer interface), and allow for procedures to be performed on them (eg, bag mask ventilation, intubation, intravenous insertion), whereas low physical realism simulators are static mannequins that are otherwise limited in these capabilities. Studies that could not be combined in a quantitative synthesis of results were analyzed and summarized in a narrative synthesis.

We quantified between-study inconsistency for analyses of ≥3 studies by using the I² statistic,21 which estimates the percentage of variability not due to chance. I² values >50% indicate large inconsistency or heterogeneity. We used SAS 9.3 (SAS Institute, Cary, NC) for all analyses. Statistical significance was defined by a 2-sided α of 0.05. Interpretations of clinical significance emphasized confidence intervals (CIs) in relation to Cohen’s ES classifications (>0.8 = large, 0.5–0.8 = moderate, 0.2–0.5 = small, and <0.2 = negligible).23

Updated Search

We conducted an updated search from May 1, 2011, to October 1, 2013, of Medline only, with the following search terms: (“pediatr*” OR “neonat*”) AND simulat* AND (educat* OR teach* OR learn*). Two reviewers screened for inclusion the titles, abstracts, and full articles (when necessary) of all potentially eligible studies, with a chance-adjusted interrater agreement determined by using the intraclass correlation coefficient of 0.77. The reviewers then extracted information on study design, clinical topic, pediatric age, training level of learners, outcomes type, and key results. We include in our meta-analytic or narrative synthesis all newly identified studies that made comparisons with other simulation. All newly identified studies (including studies making comparisons to no intervention) are summarized in Supplemental Table 3.
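The quantitative pipeline described under Data Synthesis — a Hedges' g effect size per comparison, pooled with a random-effects model, with I² quantifying between-study inconsistency — can be sketched as follows. This is a sketch under assumptions: the paper cites standard techniques but does not name its between-study variance estimator, so the common DerSimonian-Laird estimator is assumed, and the function names are illustrative.

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) and its variance."""
    # Pooled standard deviation across the two groups
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample bias correction
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

def random_effects_pool(es, var):
    """DerSimonian-Laird random-effects pooling with a 95% CI and I^2."""
    es, var = np.asarray(es, float), np.asarray(var, float)
    w = 1 / var                                   # fixed-effect weights
    fixed = np.sum(w * es) / np.sum(w)
    q = np.sum(w * (es - fixed) ** 2)             # Cochran's Q
    df = len(es) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1 / (var + tau2)
    pooled = np.sum(w_re * es) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```

Applied per outcome group (eg, the nontime-skill comparisons), a pipeline of this shape produces pooled ES, 95% CI, and I² figures of the kind reported in the Results.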

RESULTS

Trial Flow

As detailed in Fig 1, we identified 53 studies from our initial search that met our inclusion criteria. Thirty-nine studies compared TES with no intervention, 2 made comparisons with another instructional modality, and 14 made comparisons with another form of TES (2 studies reported 2 comparisons). Supplemental Table 2 provides detailed information on each study.

FIGURE 1 Trial flow.

A total of 292 candidate articles were identified from our updated search in October 2013, of which 59 studies met inclusion criteria. Of these, 55 made comparisons with no intervention and 4 made comparisons with alternate simulation (see Supplemental Table 3 and the text below). Fifty-seven studies (N = 3666 learners) in total were included in the meta-analysis: 53 studies from the initial search and 4 studies from the updated search (ie, those making comparisons with alternate simulation). Table 1 summarizes the key features of the studies included in our meta-analysis.

Study Characteristics

The learners in the 57 studies included postgraduate physician trainees (n = 1641), physicians in practice (n = 562), nurses or nursing students (n = 447), and medical students (n = 235). Forty-two (74%) studies included pediatric content and 17 (30%) included neonatal content in the educational intervention. Twenty-eight (49%) studies focused on resuscitation training, 10 (18%) on communication and teamwork skills, and 8 (14%) on various types of procedural skills training (eg, venous access, endoscopy). The 57 studies reported 76 distinct outcomes, including 35 nontime skills, 17 knowledge outcomes, 12 time skills/behaviors, and 9 outcomes in the context of actual patient care (Table 1).

Study Quality

The methodologic quality of the included studies is summarized in Supplemental Table 4. The mean (SD) Newcastle-Ottawa scale (maximum 6 points) and MERSQI (maximum 18 points) study quality scores were 2.6 (1.8) and 11.8 (2.0), respectively.


TABLE 1 Description of Included Studies

Study Characteristic        Level                                        No. of Studies (No. of Participants a)
All studies                 —                                            57 (3666)
Study design                Two groups                                   30 (1878)
Group allocation            Randomized                                   20 (1286)
Comparison b                No intervention                              39 (2280)
                            Nonsimulation training                       2 (101)
                            Alternate simulation training                18 (1360)
Participants b              Medical students                             4 (235)
                            Physicians in postgraduate training          36 (1641)
                            Physicians in practice                       11 (562)
                            Nurses and nursing students                  13 (447)
                            Emergency medical technicians or students    4 (78)
                            Other/ambiguous/mixed                        11 (703)
Clinical task c             Resuscitation                                27 (2278)
                            Team skills/communication                    10 (764)
                            Anesthesia (nonairway)                       4 (485)
                            Airway management                            3 (114)
                            Venous access                                3 (96)
                            Endoscopy                                    2 (50)
                            Minimally invasive surgery                   2 (19)
                            Office-based practice                        2 (123)
Simulated setting:          Neonates                                     17 (1184)
patient subsets             Children                                     42 (2930)
                            Adolescents                                  2 (221)
Outcomes b                  Reaction                                     3 (96)
                            Knowledge                                    17 (1311)
                            Time skill/behavior                          12 (161)
                            Nontime skill                                35 (1704)
                            Nontime behavior                             5 (145)
                            Patient effects                              4 (158)
Quality                     NOS ≥4 points                                21 (1340)
                            MERSQI ≥12 points                            35 (2305)

See Supplemental Table 2 for details on individual studies. NOS, Newcastle-Ottawa scale.
a Numbers reflect the number enrolled, except for outcomes that reflect the number of participants who provided observations for analysis.
b The number of studies and trainees in some subgroups (summing across rows or columns) may add up to more than the number for all studies because several studies included >1 comparison arm or >1 trainee group, fit within >1 clinical topic, or reported multiple outcomes.
c Other tasks were studied with lower frequency.

Synthesis: Comparison of Simulation Versus No Intervention

In summary, simulation training was associated with large pooled ESs for all but patient effects, but with substantial inconsistency from study to study (I² range = 65%–98%). We discuss these results by outcomes below.

Skill Outcomes

Figure 2 summarizes the meta-analysis comparing TES with no intervention for outcomes of time and nontime skills. Twenty-seven studies (N = 1027 participants) reported nontime skill outcomes, with a pooled ES of 1.23 (95% CI: 0.95–1.51; P < .001). We found large inconsistency between studies (I² = 87%), but all of the individual ESs favored the simulation intervention, indicating that studies varied in the magnitude but not the direction of benefit. Subgroup analysis exploring this inconsistency found that studies with a blinded outcome assessment had a lower pooled ES (ES = 0.69) than did unblinded assessments (ES = 1.51; P < .01), whereas other study quality features (randomization and overall quality scores) showed no statistically significant interactions (Fig 3). Other subgroup analyses (without statistical tests of significance) revealed pooled ESs of approximately the same magnitude across learner groups, pediatric age subsets, and simulated practice setting.

We found 5 studies reporting time in a simulation setting (time skills) and 1 reporting time in actual patient care (time behaviors). Pooling these time outcomes together revealed an ES of 1.24 (95% CI: 0.36–2.12; P < .01) (see Fig 2).

Learner Behavior Outcomes (Nontime Behavior and Patient Effects)

Among studies reporting outcomes with real patients, we found 5 nontime behaviors (pooled ES: 0.80; 95% CI: 0.08–1.53; P = .03) and 4 patient effects (pooled ES: 0.32; 95% CI: −0.16 to 0.80; P = .20) (see Fig 2). Thirteen studies assessed knowledge outcomes (see Fig 2 for details).

Synthesis: Comparison of Simulation Versus Other Forms of Instruction

Two studies (N = 101 participants) compared TES with nonsimulation instructional modalities. A meta-analysis was not conducted because of the few studies. One randomized study24 compared a computerized training simulator with an instructional video for neonatal resuscitation booster training, with results showing a small effect favoring simulator training for nontime skill outcomes (N = 60; ES = 0.37) and negligible effect for knowledge (ES = 0.08). A second randomized study25 involving undergraduate nursing students compared a clinical hybrid experience (one-third simulation and two-thirds traditional clinical experience) versus a traditional clinical experience (with no simulation) and found a negligible association for knowledge outcomes favoring those in the traditional group (N = 41; ES = −0.12).
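The subgroup interactions reported for the no-intervention comparisons (eg, blinded versus unblinded outcome assessment, P < .01) involve testing whether two pooled ESs differ from one another. The paper does not specify which test was used; a minimal sketch of one standard approach, a z-test on two independent pooled estimates, is shown below (the function name is illustrative).

```python
import math

def subgroup_interaction(es1, se1, es2, se2):
    """Two-sided z-test for a difference between two pooled effect sizes.

    es1/es2: pooled ES from each subgroup's meta-analysis;
    se1/se2: the corresponding standard errors of the pooled estimates.
    Returns (z, two-sided p value).
    """
    z = (es1 - es2) / math.sqrt(se1**2 + se2**2)
    # Two-sided p = 2 * (1 - Phi(|z|)), with Phi built from math.erf
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```

Intuitively, a gap such as 1.51 versus 0.69 between subgroup ESs is declared significant only when it is large relative to the subgroups' combined standard error.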


FIGURE 2 Random-effects meta-analysis of studies comparing TES training with no intervention. Positive numbers favor the simulation intervention. Time outcome includes 5 studies in a simulated setting and 1 study in actual patient care. Particip., participants.

Synthesis: Comparison of Simulation Versus Other Types of TES

Eighteen studies (14 from the original search and 4 from the updated search; N = 1360 participants) compared TES with other types of simulation instructional modalities. When inductively searching for conceptual themes shared across studies, we identified 7 studies comparing high with low physical realism simulation as part of the educational intervention, and conducted a meta-analysis pooling the results of these studies.26–32 We narratively discuss the other studies with nonrecurrent themes.33–43

Comparison of High Versus Low Physical Realism Simulation

Seven studies26–32 compared the use of high versus low physical realism simulators for TES as an educational modality (Fig 4). Meta-analysis of these studies revealed pooled effects of small to moderate magnitude favoring high physical realism for nontime skills (4 studies; ES = 0.49; P < .001) and learner reactions (3 studies; ES = 0.70; P < .01). Pooled effects likewise favored high physical realism for knowledge (2 studies; ES = 0.31; P = .32) and time skills (2 studies; ES = 0.31; P = .18), but neither finding was statistically significant.

FIGURE 3 Subgroup meta-analysis of TES compared with no intervention in pediatrics: nontime skill outcomes. Positive numbers favor the simulation intervention. P values reflect statistical tests exploring the differential effect of simulation training for study subgroups. Participant and pediatric age groups are not mutually exclusive. Some features could not be discerned for all studies. Subgroups do not always sum to the total number of studies. Particip., participants.

FIGURE 4 Random-effects meta-analysis of studies comparing high versus low physical realism simulators in TES where the simulation environment was otherwise standard between study groups. Positive numbers favor high physical realism simulators. Particip., participants.

Other Instructional Design Features

Several additional studies revealed further insight into other instructional design features that may affect the effectiveness of TES as an educational modality, as follows33–43:

• Instructor-led versus self-directed training: 2 randomized studies showed that video instruction followed by self-directed practice with a mannequin was more effective than instructor-led training and practice when measuring nontime skills (N = 44, ES = 0.096; N = 36, ES = 0.823).33,34

• Expert modeling: a 2-group nonrandomized study comparing expert modeling versus self-directed hands-on practice among experienced transport teams showed that self-directed learning was associated with improved nontime skills (N = 24, ES = 0.36).35 However, another nonrandomized study with less experienced learners (nursing students) showed an association between expert modeling and improved knowledge acquisition and nontime skills (N = 16; ES = 0.907 for knowledge, ES = 1.141 for nontime skills).36

• Timing of instruction: a 2-group nonrandomized study comparing neonatal resuscitation program (NRP) training conducted over 1 day versus 2 days showed a favorable but non–statistically significant association between 1 day of NRP training and improved knowledge among medical students (N = 50, ES = 0.131).37

• Team training: in a randomized study comparing standard NRP training with versus without the addition of team training during lectures and skills stations, interns in the team training group demonstrated improved teamwork skills (N = 32, ES = 1.199).38

• Virtual reality training: a 2-group nonrandomized study comparing high-fidelity simulation versus virtual reality for teaching airway endoscopy skills to otolaryngology residents and pediatric surgery fellows showed an association between high-fidelity simulation and enhanced nontime skills (N = 36, ES = 0.543).39

• Type of feedback: 1 randomized study compared the type of corrective feedback given to learners during cardiopulmonary resuscitation training. The combination of instructor and automated feedback (from the defibrillator) during training versus instructor feedback only demonstrated large effects favoring combined feedback for nontime skills immediately after the intervention (N = 69, ES = 1.454).40 A follow-up study made a similar comparison and assessed outcomes at 6 months, showing marginal effects in favor of automated feedback with an instructor when compared with instructor only for nontime skills (N = 40, ES = 0.04).41

• Debriefing and repetitive practice: only 1 nonrandomized study looked at the impact of debriefing practices by comparing debriefing alone versus debriefing followed by repeated practice during pediatric emergency medicine scenarios. There was a positive association noted between debriefing with repeated practice and higher learner satisfaction (N = 115, ES = 0.377).42

• Video-assisted debriefing: 1 randomized study compared the effectiveness of video-assisted debriefing with oral debriefing alone in the context of simulated neonatal resuscitation and found small or negligible effects favoring video-assisted debriefing for time skills (N = 30, ES = 0.12) and nontime skills (ES = 0.27).43

DISCUSSION Our results indicate that, in comparison with no intervention, simulation training for pediatrics is associated with effects that are uniformly favorable but somewhat variable in magnitude. Although between-study inconsistency was high, subgroup analyses revealed generally similar effects across studies of different methodologic quality, learner group, and pediatric age. Among the few studies comparing different approaches to simulation education, high physical

realism simulators were associated with small to moderate benefits when compared with low physical realism simulators. Among all included studies, only a small fraction included an interprofessional learner group, procedural skills training, or patient care– related outcomes. Limitations and Strengths We chose to include studies only if the educational content was pediatricfocused. We considered a less restrictive definition (eg, all studies enrolling pediatric health care providers) but felt that focusing only on pediatric content would allow for stronger inferences regarding simulation-based pediatric education. Our analysis revealed high inconsistency between studies, reflecting variation in instructional design, clinical topics, leaner groups, and outcome measures. Many of the included studies had methodologic limitations or failed to clearly describe the context, instructional design, or outcomes; and these deficits limit the strength of our inferences. Only a small fraction of studies measured outcomes on real patients, thus limiting our ability to comment on translation of outcomes from the simulated environment to the real clinical environment.44 Although our meta-analysis of TES with high versus low physical realism revealed favorable effects for high physical realism, we found only 7 studies to include in this analysis. We were unable to draw definitive conclusions related to the optimal

PEDIATRICS Volume 133, Number 5, May 2014

Downloaded from pediatrics.aappublications.org at Tufts Univ on September 25, 2014

e1319

instructional design for pediatric simulation education because of a paucity of studies. Strengths of our study include the following: the exhaustive initial literature search; reproducible inclusion criteria encompassing a large range of learners; duplicate, independent review at all stages; rigorous coding of methodologic quality; and both quantitative and narrative synthesis of findings.

Integration With Previous Work

Previously published narrative reviews described the scope and use of simulation in pediatrics10,12 and pediatric subspecialty education11,13,14 but did not systematically characterize the studies or quantitatively synthesize the impact of simulation. As in previous meta-analyses of TES for health care professionals in general,3 our results showed large effects for knowledge and skills and small to moderate effects for learner behaviors and patient effects when compared with no intervention. These findings support the notion that, as for nonpediatric studies, simulation education can be used effectively for teaching pediatric content. The quality of research reported in our study parallels that reported in previous systematic reviews of the TES literature.3,15,18,45,46 Previous reviews using the MERSQI have reported average scores ranging from 9.6 to 12.3, compared with a mean MERSQI score of 11.8 in our study.3,15,17,18,45,46

Our finding that high physical realism simulators are weakly associated with improved outcomes when compared with low physical realism simulators differs from the results of other systematic reviews, which suggested that degree of realism has a negligible impact on learning outcomes.47–50 These results suggest that the relationship between realism and learning is complex and nonlinear. In fact, the importance of realism likely depends on multiple factors, including the specific type or category of realism22 (eg, physical versus conceptual versus emotional), learner type (eg, novice versus experienced), learning objectives (eg, cognitive versus technical), and educational context (eg, assessment versus education). A recent commentary on realism in simulation-based education suggests that the degree of "functional task alignment," reflecting the accurate representation of key steps in a simulated task, may have greater impact on educational outcomes than overall realism.51 Realism is useful only to the extent that it fulfills a specific need in the functional design. For example, an enhanced physical feature such as capillary refill may be beneficial in a scenario for medical students whose key objective is assessing a child in septic shock, whereas realistic capillary refill may have minimal impact on learning in a scenario in which attending physicians are asked to manage a difficult airway. Future studies should test this hypothesis and explore the relative contributions of physical, conceptual, and emotional realism in different learning environments and with different levels of learners. To the degree that high realism is more expensive, such a refocusing of research and design efforts could lower the cost of simulation-based education.

Implications

We identified several implications for the field of pediatric simulation. First, our study indicates that TES is an effective educational modality for pediatrics and supports its continued implementation in pediatric educational programs. In the 2-year period between our initial search and updated search, more studies were published than had been published before May 2011. Although this growth in published research clearly indicates enthusiasm for simulation-based education in pediatrics, it is unfortunate that so many of these new studies make comparisons with no intervention and hence contribute little to advancing the field. Our efforts to identify clear recommendations for effective simulation instructional design were hampered by the paucity of studies comparing 1 method of TES with another. Future work should include studies comparing different forms of simulation-based education in an effort to identify the instructional design features associated with the greatest improvement in outcomes.

The methodologic limitations noted in many of the studies in this review may be related to the following: (1) lack of expertise and collaboration in simulation-based research; (2) lack of forethought when planning educational activities, resulting in simulation education that is never reported as a research product; and (3) local barriers to implementation of simulation research. To address these challenges, pediatric groups such as the International Pediatric Simulation Society (www.ipedsim.com) and the International Network for Simulation-based Pediatric Innovation, Research and Education (INSPIRE; www.inspiresim.com) have been established to increase research expertise, promote collaboration, and identify troubleshooting strategies for implementation of simulation-based research. Through the combined expertise of its members, the INSPIRE network is working toward establishing a set of guiding principles for conducting pediatric simulation research. These efforts will help to ensure that future studies are more methodologically sound and more likely to answer questions that will advance the field.
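For readers unfamiliar with the quantitative synthesis behind effect sizes such as those discussed in this review, the following is a minimal, illustrative sketch of how a standardized mean difference (Hedges g) and an inverse-variance pooled estimate are computed. The numbers are invented for illustration only and are not data from this review; the sketch also uses a fixed-effect pooling formula for simplicity, whereas this review used random-effects models that additionally estimate between-study variance.

```python
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between a treatment and a control group,
    with the small-sample (Hedges) correction applied to Cohen's d."""
    # Pooled standard deviation of the two groups
    sp = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                   / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp          # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)   # small-sample correction factor
    return j * d

def inverse_variance_pool(effects, variances):
    """Fixed-effect inverse-variance pooled estimate across studies
    (random-effects models add a between-study variance component)."""
    weights = [1 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Invented example: a simulation group outscores controls by 1 pooled SD,
# which by Cohen's conventions is a "large" effect (g of roughly 0.98 here)
g = hedges_g(82, 10, 20, 72, 10, 20)
```

By Cohen's conventions (cited in the Methods of this review), effects near 0.2 are small, 0.5 moderate, and 0.8 or more large, which is the sense in which the knowledge and skills effects above are described as large.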


Simulation has thus far been used predominantly to teach resuscitation and team skills in pediatrics. This important application of TES has been embraced by standardized resuscitation courses such as the Neonatal Resuscitation Program52 and the Pediatric Advanced Life Support53 course. Given the widespread use of simulation for this purpose, future studies should determine the relative importance and optimal use of variously sized pediatric simulators and consider the relative value of full-body simulators with built-in task-training capabilities (eg, chest tube insertion, needle decompression, surgical airway). Because health care providers struggle to maintain resuscitation skills after standardized courses, defining the optimal instructional design (eg, distributed learning versus annual recertification) for the simulation component of these courses becomes a high priority to ensure improved outcomes from critical illness in pediatric patients.54

Several topical gaps were identified in our study. Pediatric health care professionals must be skilled at performing procedures in pediatric patients of all ages, yet there was a paucity of studies examining procedural skills training. Although current clinical training environments afford pediatric trainees little opportunity to perform certain life-saving procedures, training objectives still require that these procedures be learned and mastered.55 Simulation researchers should conduct studies to identify the most effective methods of acquiring and retaining these skills, while also taking into account how differences in patient age affect provider confidence, complication rates, and success rates. Similarly, only a tiny fraction of studies involved an interprofessional group of participants. Identifying the ideal strategies for collaborative learning via simulation-based education will help to meet the needs of these diverse groups and ensure that interprofessional competencies are addressed in a structured manner.56

Last, no studies examined the impact of context (eg, simulation laboratory versus in situ simulation in a real clinical space) on learning outcomes. Exploring the impact of context will be important as more simulation research is conducted to identify strategies to enhance patient safety.57
CONCLUSIONS

TES for pediatrics is associated with large favorable effects in comparison with no intervention. Unfortunately, the current literature does little to help identify the optimal method of delivering simulation-based education for pediatrics because of a paucity of comparative studies. Defining these optimal features for simulation-based education in pediatrics will be essential to ultimately help translate learning into improved provider performance and patient outcomes.

ACKNOWLEDGMENTS

The authors thank Dr Ryan Brydges, Dr Stan Hamstra, Dr Rose Hatala, Dr Jason Szostek, Dr Amy Wang, and Ms Patricia Erwin for their efforts in initial study selection and data abstraction.

REFERENCES

1. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28
2. Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78(8):783–788
3. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978–988
4. Halamek LP, Kaegi DM, Gaba DM, et al. Time for a new paradigm in pediatric medical education: teaching neonatal resuscitation in a simulated delivery room environment. Pediatrics. 2000;106(4):E45
5. Sam J, Pierse M, Al-Qahtani A, Cheng A. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes. Paediatr Child Health (Oxford). 2012;17(2):e16–e20
6. Cheng A, Goldman R, AbuAish M, Kissoon N. A simulation-based acute care curriculum for pediatric emergency medicine fellowship training programs. Pediatr Emerg Care. 2010;26(7):475–480
7. Adler MD, Vozenilek JA, Trainor JL, et al. Development and evaluation of a simulation-based pediatric emergency medicine curriculum. Acad Med. 2009;84(7):935–941
8. Overly FL, Sudikoff SN, Shapiro MJ. High-fidelity medical simulation as an assessment tool for pediatric residents' airway management skills. Pediatr Emerg Care. 2007;23(1):11–15
9. Kirkpatrick DL. Evaluation of training. In: Craig RL, Bittel LR, eds. Training and Development Handbook. New York, NY: McGraw Hill; 1967
10. Cheng A, Duff J, Grant E, Kissoon N, Grant VJ. Simulation in paediatrics: an educational revolution. Paediatr Child Health (Oxford). 2007;12:465–468
11. Eppich WJ, Adler MD, McGaghie WC. Emergency and critical care pediatrics: use of medical simulation for training in acute pediatric emergencies. Curr Opin Pediatr. 2006;18(3):266–271
12. Weinberg ER, Auerbach MA, Shah NB. The use of simulation for pediatric training and assessment. Curr Opin Pediatr. 2009;21(3):282–287
13. Yager PH, Lok J, Klig JE. Advances in simulation for pediatric critical care and emergency medicine. Curr Opin Pediatr. 2011;23(3):293–297
14. Fehr JJ, Honkanen A, Murray DJ. Simulation in pediatric anesthesiology. Paediatr Anaesth. 2012;22(10):988–994
15. Ilgen JS, Sherbino J, Cook DA. Technology-enhanced simulation in emergency medicine: a systematic review and meta-analysis. Acad Emerg Med. 2013;20(2):117–127
PEDIATRICS Volume 133, Number 5, May 2014


16. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: the PRISMA statement. J Clin Epidemiol. 2009;62(10):1006–1012
17. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002–1009
18. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181–1196
19. Borenstein M. Effect sizes for continuous data. In: Cooper H, Hedges LV, Valentine JC, eds. The Handbook of Research Synthesis. 2nd ed. New York, NY: Russell Sage Foundation; 2009:221–235
20. Morris SB, DeShon RP. Combining effect size estimates in meta-analysis with repeated measures and independent-groups designs. Psychol Methods. 2002;7(1):105–125
21. Higgins JPT, Green S. Cochrane Handbook for Systematic Reviews of Interventions 5.1.0. 2011. Available at: www.cochrane.org/resources/handbook/index.htm. Accessed February 28, 2013
22. Rudolph JW, Simon R, Raemer DB. Which reality matters? Questions on the path to high engagement in healthcare simulation. Simul Healthc. 2007;2(3):161–163
23. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum; 1988
24. Curran VR, Aziz K, O'Young S, Bessell C. Evaluation of the effect of a computerized training simulator (ANAKIN) on the retention of neonatal resuscitation skills. Teach Learn Med. 2004;16(2):157–164
25. Parker RA, McNeill JA, Pelayo LW, Goei KA, Howard J, Gunter MD. Pediatric clinical simulation: a pilot project. J Nurs Educ. 2011;50(2):105–111
26. Campbell DM, Barozzino T, Farrugia M, Sgro M. High-fidelity simulation in neonatal resuscitation. Paediatr Child Health (Oxford). 2009;14(1):19–23
27. Donoghue AJ, Durbin DR, Nadel FM, Stryjewski GR, Kost SI, Nadkarni VM. Effect of high-fidelity simulation on Pediatric Advanced Life Support training in pediatric house staff: a randomized trial. Pediatr Emerg Care. 2009;25(3):139–144
28. Donoghue AJ, Durbin DR, Nadel FM, Stryjewski GR, Kost SI, Nadkarni VM. Perception of realism during mock resuscitations by pediatric housestaff: the impact of simulated physical features. Simul Healthc. 2010;5(1):16–20
29. Thomas EJ, Williams AL, Reichman EF, Lasky RE, Crandell S, Taggart WR. Team training in the neonatal resuscitation program for interns: teamwork and quality of resuscitations. Pediatrics. 2010;125(3):539–546
30. Butler KW, Veltre DE. Implementation of active learning pedagogy comparing low-fidelity simulation versus high-fidelity simulation in pediatric nursing education. Clinical Simulation in Nursing. 2009;5:e129–e136
31. Cheng A, Hunt EA, Donoghue A, et al; EXPRESS Investigators. Examining pediatric resuscitation education using simulation and scripted debriefing: a multicenter randomized trial. JAMA Pediatr. 2013;167(6):528–536
32. Finan E, Bismilla Z, Whyte HE, Leblanc V, McNamara PJ. High-fidelity simulator technology may not be superior to traditional low-fidelity equipment for neonatal resuscitation training. J Perinatol. 2012;32(4):287–292
33. Kaczorowski J, Levitt C, Hammond M, et al. Retention of neonatal resuscitation skills and knowledge: a randomized controlled trial. Fam Med. 1998;30(10):705–711
34. Lee JC, Boyd R, Stuart P. Randomized controlled trial of an instructional DVD for clinical skills teaching. Emerg Med Australas. 2007;19(3):241–245
35. LeFlore JL, Anderson M. Effectiveness of 2 methods to teach and evaluate new content to neonatal transport personnel using high-fidelity simulation. J Perinat Neonatal Nurs. 2008;22(4):319–328
36. LeFlore JL, Anderson M, Michael JL, Engle WD, Anderson J. Comparison of self-directed learning versus instructor-modeled learning during a simulated clinical experience. Simul Healthc. 2007;2(3):170–177
37. Bhat BV, Biswal N, Bhatia BD, Nalini P. Undergraduate training in neonatal resuscitation—a modified approach. Indian J Matern Child Health. 1993;4(3):87–88
38. Thomas EJ, Taggart B, Crandell S, et al. Teaching teamwork during the Neonatal Resuscitation Program: a randomized trial. J Perinatol. 2007;27(7):409–414
39. Deutsch ES, Christenson T, Curry J, Hossain J, Zur K, Jacobs I. Multimodality education for airway endoscopy skill development. Ann Otol Rhinol Laryngol. 2009;118(2):81–86
40. Sutton RM, Niles D, Meaney PA, et al. "Booster" training: evaluation of instructor-led bedside cardiopulmonary resuscitation skill training and automated corrective feedback to improve cardiopulmonary resuscitation compliance of Pediatric Basic Life Support providers during simulated cardiac arrest. Pediatr Crit Care Med. 2011;12(3):e116–e121
41. Sutton RM, Niles D, Meaney PA, et al. Low-dose, high-frequency CPR training improves skill retention of in-hospital pediatric providers. Pediatrics. 2011;128(1). Available at: www.pediatrics.org/cgi/content/full/128/1/e145
42. Auerbach M, Kessler D, Foltin JC. Repetitive pediatric simulation resuscitation training. Pediatr Emerg Care. 2011;27(1):29–31
43. Sawyer T, Sierocka-Castaneda A, Chan D, Berg B, Lustik M, Thompson M. The effectiveness of video-assisted debriefing versus oral debriefing alone at improving neonatal resuscitation performance: a randomized trial. Simul Healthc. 2012;7(4):213–221
44. Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: a systematic review. J Gen Intern Med. 2013;28(8):1078–1089
45. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach. 2013;35(1):e867–e898
46. McKinney J, Cook DA, Wood D, Hatala R. Simulation-based training for cardiac auscultation skills: systematic review and meta-analysis. J Gen Intern Med. 2013;28(2):283–291
47. Norman G, Dore K, Grierson L. The minimal relationship between simulation fidelity and transfer of learning. Med Educ. 2012;46(7):636–647
48. Matsumoto ED, Hamstra SJ, Radomski SB, Cusimano MD. The effect of bench model fidelity on endourological skills: a randomized controlled study. J Urol. 2002;167(3):1243–1247
49. Mundell WC, Kennedy CC, Szostek JH, Cook DA. Simulation technology for resuscitation training: a systematic review and meta-analysis. Resuscitation. 2013;84(9):1174–1183
50. Kennedy CC, Cannon EK, Warner DO, Cook DA. Advanced airway management simulation training in medical education: a systematic review and meta-analysis. Crit Care Med. 2014;42(1):169–178
51. Hamstra SJ, Hatala R, Brydges R, Zendejas B, Cook DA. Reconsidering fidelity in simulation-based medical education. Acad Med. Published online January 20, 2014
52. Kattwinkel J, ed. Textbook of Neonatal Resuscitation. 6th ed. Elk Grove Village, IL: American Academy of Pediatrics and American Heart Association; 2011


53. American Heart Association. Pediatric Advanced Life Support Provider Manual. Dallas, TX: American Heart Association; 2011
54. Bhanji F, Mancini ME, Sinz E, et al. Part 16: education, implementation, and teams: 2010 American Heart Association Guidelines for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care. Circulation. 2010;122(18 suppl 3):S920–S933
55. Al-Eissa M, Chu S, Lynch T, et al. Self-reported experience and competence in core procedures among Canadian pediatric emergency medicine fellowship trainees. CJEM. 2008;10(6):533–538
56. Wilhaus J, Palaganas J, Manos J, et al. Interprofessional Education and Healthcare Simulation Symposium. 2012. Available at: http://ssih.org/uploads/static_pages/ipefinal_compressed_1.pdf. Accessed December 1, 2013
57. LeBlanc VR, Manser T, Weinger MB, Musson D, Kutzin J, Howard SK. The study of factors affecting human and systems performance in healthcare using simulation. Simul Healthc. 2011;6(suppl):S24–S29
