Gearing up for milestones in surgery: Will simulation play a role? Aimee K. Gardner, PhD,a Daniel J. Scott, MD,a James C. Hebert, MD,b John D. Mellinger, MD,c Ariel Frey-Vogel, MD, MAT,d Raymond P. Ten Eyck, MD,e Bradley R. Davis, MD,f Lelan F. Sillin, III, MD,g and Ajit K. Sachdeva, MD,h Dallas, TX, Burlington, VT, Springfield and Chicago, IL, Boston and Burlington, MA, and Dayton and Cincinnati, OH

Background. The Consortium of American College of Surgeons–Accredited Education Institutes was created to promote patient safety through the use of simulation, develop new education and technologies, identify best practices, and encourage research and collaboration.
Methods. During the 7th Annual Meeting of the Consortium, leaders from a variety of specialties discussed how simulation is playing a role in the assessment of resident performance within the context of the Milestones of the Accreditation Council for Graduate Medical Education as part of the Next Accreditation System.
Conclusion. This report presents experiences from several viewpoints and supports the utility of simulation for this purpose. (Surgery 2015;158:1421-7.)

From the UT Southwestern Medical Center,a Dallas, TX; the University of Vermont College of Medicine, Residency Review Committee for Surgery,b Burlington, VT; the Southern Illinois University School of Medicine,c Springfield, IL; the Massachusetts General Hospital for Children,d Boston, MA; Wright State University,e Dayton, OH; the University of Cincinnati,f Cincinnati, OH; the Lahey Center for Professional Development and Simulation,g Burlington, MA; and the American College of Surgeons,h Chicago, IL

ACCURATE DOCUMENTATION and evaluation of resident competencies have always been a challenge for Program Directors. The introduction of the Next Accreditation System (NAS) by the Accreditation Council for Graduate Medical Education (ACGME) defines specific milestones regarding resident knowledge, skills, and other competencies along a continuum. The granularity of these milestones makes it likely that new assessment methods will need to be developed to inform faculty decisions regarding resident progression along the identified targets. Identification of the most practical, informative, and cost-effective methods to assess and document trainee improvement throughout residency is critical. Simulation should play a pivotal role in satisfying these requirements, but little is currently known about the application of simulation to the milestones in surgery.

Residency programs may be able to use simulated scenarios strategically to evaluate aspects of performance that would be impractical or burdensome to assess through traditional methods. The American College of Surgeons Program for Accreditation of Education Institutes (ACS-AEI) aims to advance simulation-based surgical education and training to address national imperatives. During the 7th Annual Meeting of the Consortium of ACS-AEIs in March 2014, a multidisciplinary panel was assembled to discuss how simulation may be used to address the new milestone requirements defined by the American Board of Surgery and the Residency Review Committee for Surgery (RRC) effectively. This article summarizes that panel discussion, beginning with an overview of the new requirements, followed by challenges for Program Directors and examples of how surgical and nonsurgical specialties are using simulation to evaluate and document achievement of various milestones.

OVERVIEW FROM THE RRC PERSPECTIVE

The ACGME recently launched the NAS, which focuses on educational outcomes rather than adherence to highly structured rules and processes; the goal was to better prepare physicians for practice in the future. The reasons for this
change are many and have been articulated previously.1 A cornerstone of NAS is the measurement and reporting of specialty-specific educational milestones that track the development of individual residents along a continuum within the framework of 6 competency domains (medical knowledge, patient care and procedural skills, interpersonal and communication skills, professionalism, practice-based learning and improvement, and systems-based practice).1 The development of milestones was undertaken under the auspices of both the ACGME and the American Board of Medical Specialties; this process involved a wide range of individuals, including representatives from the specialty boards, RRCs, the ACGME, program directors, and residents. Unique milestones and reporting templates are now available for all specialties, and all programs are expected to begin using and reporting milestone evaluations in July 2014.2 Residency training programs are required to have a Clinical Competence Committee (CCC) that will provide oversight for the assessment of milestone achievement for each trainee.3 To foster innovation and creativity, the ACGME has provided programs with sufficient freedom regarding the best way to approach the process of milestone assessment.4

For milestones to improve the assessment of trainee skills and to ensure competency of the graduating resident, these milestones must be linked to valid and reliable assessment tools. Best practices regarding how to achieve comprehensive and standardized assessments of surgery residents to inform milestone assessment have yet to be developed, but educators agree that simulation should play a pivotal role.5,6 The development of surgical simulators and their increased acceptance within the surgical community not only allow for effective training outside of the clinical environment, but also permit standardized and objective assessment of performance.6-8 Thus, performance in simulated settings can yield valuable information for the CCC. Simulation can also play an important role in remediation and in faculty development within the context of milestone assessment. Incorporation of simulation into the mainstream of all aspects of medical and surgical education needs to continue at an accelerated pace.

CHALLENGES FOR PROGRAM DIRECTORS

As with any major change in education, residency programs may experience challenges in addressing specific milestones; however, simulation can help
to alleviate some of these challenges. This section presents an overview of the stated goals of the milestones as outlined by the ACGME, along with a brief summary of how simulation needs to be integral to achieving those goals.

An expressed goal of the milestones was that they would "guide curriculum development of the residency." The growing number of national mandates, quality imperatives, compliance requirements within hospitals and health care systems, electronic record requirements, pressures on teaching faculty, and restricted resident work hours has affected resident training. These pressures have had a negative impact on junior resident operative experience9 and on resident and faculty confidence in the ability of trainees to perform operations deemed essential for the general surgeon.10,11 Simulation-based training can help to supplement clinical experiences by providing standardized training to perform key procedures and acquire requisite skills.

Specific, measurable areas of practice that define a profession are referred to as "entrustable professional activities" (EPAs).12 Identification of EPAs within surgery is valuable, because they may be used to drive curriculum development and to inform levels of supervision. For example, inserting a central venous catheter may be identified as an EPA for surgery residents, and specific simulation-based curricula may be developed to address training and assessment needs. As learners acquire the necessary knowledge and skills relating to insertion of a central venous catheter and management of potential complications, the level of required supervision should decrease from direct oversight to "entrustment," confirming that the learner is deemed skilled enough to perform the professional duty without oversight.13 Thus, incorporating simulation-based training and assessment of EPAs into residency can ensure that trainees are exposed uniformly to specific training standards and techniques and are able to practice their skills outside of patient care settings until they are ready to perform the procedure or carry out the task in real settings.

Another stated goal of the milestones is that they will "support better assessment practices." Considerable evidence shows that faculty members spend relatively little time directly observing resident behaviors,14 form their impressions of broader resident performance by globalizing their limited observations of clinical performance and professionalism,15 produce less meaningful global evaluations when a substantial time elapses between direct observation and evaluation,16 and lack insight into the dependence of their ratings on rater-dependent
variables rather than learner-dependent variables.17 Simulation-based training and assessment can be more objective, which should help in the development of new assessment tools, expand the scope of assessments, and address new domains.

The final stated goal of the milestones is to "enhance opportunities for early identification of struggling residents." Prior data have suggested that problems are generally identified early but are rarely remediated successfully by the end of residency.18 The milestones are created in such a way that programs need standardized ways to assess very granular aspects of resident performance from the beginning of training. For example, milestone assessments should focus on basic surgical skills (such as airway management, knot tying, and suturing), as well as on managing crises and delivering bad news. Placing residents in a simulated scenario in which each of these skills can be taught, learned, and assessed can be extremely valuable, especially because standardizing training and assessment in clinical settings would be extremely difficult. Simulation can be used effectively as a diagnostic tool to identify potential deficiencies across a range of both technical and nontechnical competencies.

Residency programs and program directors are being challenged in implementing the milestones. Simulation should help in this process by focusing on specific entrustable activities, identifying best practices for training and assessment, and detecting and correcting deficiencies in performance.

EXPERIENCES FROM PEDIATRICS

The pediatrics milestones were implemented in July 2013. Educators at the Massachusetts General Hospital for Children have explored the use of simulation for milestone assessment. This group created an intern simulation program using a core set of 6 pediatric cases. Resident assessment along the milestone continuum was investigated and compared with the information received from the traditional evaluation systems (end-of-rotation evaluations, direct observation encounters, and in-training examinations). Data were collected to determine whether simulations could be used for early and accurate detection of trainee deficits. The curriculum involved interns serving as leaders for real patient cases throughout the year and then participating individually in 6 back-to-back simulated cases at the end of the year. Checklist-based evaluations were created for each of the 6 simulated scenarios according to the milestone-based global evaluations. To obtain
consensus-based evaluations of each resident's performance, instructors participated in a group evaluation session, in which they watched each of the 6 video-recorded simulation scenarios, individually completed the evaluation tool, reported their ratings to the group, and discussed any discrepancies until group consensus was achieved. The evaluations from the group consensus were then compared with the end-of-year summative evaluation.

During the 2013–2014 academic year, all 24 pediatric interns participated in this curriculum, including interns in the categorical, med–peds, pediatric neurology, pediatric psychiatry, and preliminary programs. The simulation program was successful, and the experiences thus far have provided much insight into both the challenges and benefits of such a program. It was realized quickly that more faculty would need to be recruited and trained before the program could be expanded to all classes of pediatric residents. Additionally, although the milestone language is very specific, there remains a struggle to differentiate behaviors that relate to "patient care" from those that relate to "medical knowledge." Similarly, behaviors that relate to "interpersonal and communication skills" and "professionalism" are difficult to separate because of the obvious overlap. Thus, intensive faculty training is needed concerning the use of the cases and evaluation forms to ensure sufficient interrater reliability.

Nonetheless, simulation has provided invaluable insight into the clinical reasoning and fund of knowledge of the interns, which is typically not possible in a clinical setting. For example, personality plays a major role in evaluations derived from clinical environments, but not as much in simulated settings. Interns with good interpersonal skills presumably obtain better evaluations in clinical environments, and those evaluations do not correlate with their performance in simulated settings. In contrast, more reserved interns perform better in simulated scenarios than expected based on their clinical evaluations. Thus, assessments based on simulations may level the playing field by adding opportunities to glean unique and potentially more accurate data about resident performance.

At present, milestone-based evaluations based on simulation seem to add new and meaningful data to the current resident evaluation system; hence, these data will be useful to the CCCs with regard to assessing achievement of milestones. Simulation also seems to be useful in diagnosing deficiencies and evaluating improvement after educational interventions.
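
For readers interested in tooling, the group-consensus step described earlier in this section can be supported by very simple scripts. The sketch below is illustrative only; the milestone item labels, the 1-5 rating scale, and the disagreement threshold are invented for the example and do not reproduce the Massachusetts General Hospital for Children evaluation forms.

from statistics import mean

# Hypothetical milestone items for one simulated pediatric scenario.
# Ratings use an assumed 1-5 scale (1 = critical deficiency, 5 = aspirational).
MILESTONE_ITEMS = ["patient_care", "medical_knowledge", "communication", "professionalism"]

def flag_discrepancies(ratings_by_rater, max_spread=1):
    """Return items whose ratings differ by more than `max_spread` across
    raters; these would be discussed until group consensus is reached."""
    flagged = []
    for item in MILESTONE_ITEMS:
        scores = [ratings[item] for ratings in ratings_by_rater.values()]
        if max(scores) - min(scores) > max_spread:
            flagged.append(item)
    return flagged

def consensus_record(intern, scenario, ratings_by_rater):
    """Summarize ratings for one intern on one scenario: the mean score per
    item plus the items still needing face-to-face discussion."""
    summary = {
        item: round(mean(r[item] for r in ratings_by_rater.values()), 2)
        for item in MILESTONE_ITEMS
    }
    return {
        "intern": intern,
        "scenario": scenario,
        "mean_scores": summary,
        "needs_discussion": flag_discrepancies(ratings_by_rater),
    }

if __name__ == "__main__":
    # Three hypothetical faculty raters scoring one intern on one scenario.
    ratings = {
        "rater_A": {"patient_care": 3, "medical_knowledge": 3, "communication": 4, "professionalism": 4},
        "rater_B": {"patient_care": 3, "medical_knowledge": 2, "communication": 4, "professionalism": 4},
        "rater_C": {"patient_care": 4, "medical_knowledge": 4, "communication": 4, "professionalism": 5},
    }
    print(consensus_record("intern_01", "status_asthmaticus", ratings))

In practice, flagged items would simply prompt the face-to-face discussion described above rather than drive any automated decision.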

Additional investigations are planned to determine whether the formative and summative milestone-based assessments during the intern year predict performance in subsequent years. The introduction of the milestones has prompted a reconsideration of the possible uses of simulation and helped to unlock the potential of simulation in resident education and evaluation in pediatrics.

EXPERIENCES FROM EMERGENCY MEDICINE

Emergency medicine also implemented the milestones in July 2013, including 23 subcompetencies incorporating 227 milestones spread over 5 levels of performance from novice to expert.19,20 Although the milestones provide far more granularity for tracking resident progress, they also require more detailed documentation and assessment methods.21 A group of educators in emergency medicine at the Boonshoft School of Medicine at Wright State University initiated work to map the milestones onto their well-established resident simulation curriculum, which consists of >60 encounters per resident during the 3-year program. The curriculum uses both part-task trainers and high-fidelity, case-based simulations. The sessions using part-task trainers provide opportunities to demonstrate several patient care subcompetencies, starting with basic procedures during intern orientation (eg, tube thoracostomy, placement of a central venous line, lumbar puncture, and airway control) and progressing to more advanced procedures for senior residents (eg, pericardiocentesis, arterial lines, and placement of transvenous pacemakers). The case-based simulations consist mainly of patients presenting to the emergency department with a variety of undifferentiated chief complaints; these simulations incorporate patient care subcompetencies as well as many of the milestones relating to team management, professional values, practice-based performance improvement, patient-centered communication, systems-based management, and patient safety. Additional subcompetencies are addressed through interprofessional simulations conducted with local nursing schools and in other locations of care, including the intensive care unit and field settings. Overall, the simulation curriculum provides data for >60% of the milestones in emergency medicine, thus contributing to the overall assessment of each resident's progress.
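
To make the mapping concrete, the following is a minimal sketch of the kind of scenario-to-subcompetency linkage such a curriculum implies. The scenario names, subcompetency codes, and levels are invented for illustration; the actual Wright State tool, described below, was built in Microsoft Excel.

# Illustrative sketch of a scenario-to-subcompetency mapping.
# All names and codes below are invented placeholders.

# Each simulated scenario is linked to the subcompetencies it can assess.
SCENARIO_MAP = {
    "undifferentiated_chest_pain": ["PC1", "PC2", "ICS1", "SBP1"],
    "septic_shock_team_case":      ["PC3", "ICS2", "PROF1", "SBP2"],
    "intern_airway_station":       ["PC5"],
}

ALL_SUBCOMPETENCIES = ["PC1", "PC2", "PC3", "PC4", "PC5",
                       "ICS1", "ICS2", "PROF1", "SBP1", "SBP2"]

def curriculum_coverage(scenario_map, all_subcompetencies):
    """Fraction of subcompetencies addressed by at least one scenario."""
    covered = {s for subs in scenario_map.values() for s in subs}
    return len(covered & set(all_subcompetencies)) / len(all_subcompetencies)

def record_observation(report, resident, scenario, levels):
    """After a scenario, faculty record the level (1-5) reached for each
    linked subcompetency; only the linked ones are presented for rating."""
    for sub in SCENARIO_MAP[scenario]:
        if sub in levels:
            report.setdefault(resident, {}).setdefault(sub, []).append(levels[sub])
    return report

if __name__ == "__main__":
    print(f"curriculum covers {curriculum_coverage(SCENARIO_MAP, ALL_SUBCOMPETENCIES):.0%} of subcompetencies")
    report = {}
    record_observation(report, "resident_07", "undifferentiated_chest_pain",
                       {"PC1": 2, "ICS1": 3})
    print(report)

A spreadsheet accomplishes the same thing; the point is only that limiting each scenario to its linked subcompetencies keeps the rating task small while still allowing coverage and per-resident progress reports to be computed.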

To streamline the evaluation process for faculty, an electronic tool was developed using Microsoft Excel that linked each scenario to the relevant subcompetencies of the milestones. In this way, after observing each scenario, faculty could designate the level of performance achieved for the applicable subcompetency domains in a standardized fashion, rather than be bombarded with all 227 milestones at any one time. The format of this tool allowed evaluation data to be retrieved easily for progress reports and cumulative assessments.

Initial results of this program have been favorable. Pilot work involving the intern class was conducted 6 months before implementation of the milestones to evaluate feasibility and utility. The program was used successfully by faculty after a short training session. At 8 months into the NAS, milestones have been recorded consistently during resident simulations, and the tool has been able to produce all reports without additional transcription or manipulation of the data. As a work in progress, however, unresolved issues remain, including how frequently each milestone should be documented and whether the milestones linked with each simulation should be limited to those that match each resident's level of training. Future efforts include trying to limit the number of milestones assessed within the context of any one simulation. In addition, the current design of the simulation program is being modified to address more milestones.

EXPERIENCES FROM SURGERY

Most surgery programs have embraced simulation-based training for their residents, but very few programs have incorporated formal assessments other than providing immediate feedback to their learners during training in laboratory settings. With the implementation of the NAS, however, skills activities in the laboratory have been evaluated in the context of entrustment and EPAs.22 As noted, EPAs are the broad activities of practice that all physicians should be capable of performing; typically, EPAs may be observed as residents carry out their day-to-day practice. For example, professionalism is not observed directly; rather, it is inferred from observations of residents interacting with patients. Most educators view EPAs as encompassing a broad set of skills and competencies, such as the management of postoperative hypotension, but residents are also observed performing activities that are more focused and that represent some parts of the whole, like interpreting a blood gas in the context of the hypotensive patient.
These focused activities have been termed "observable professional activities" (OPAs).23 The purpose of mapping OPAs to milestones is to ensure enough points of reference so that resident performance can be assessed reliably. By mapping these observations onto the milestones, the CCC can use objective measurements of performance for each resident, which can be benchmarked against other residents at the same level. Assessment of skills in simulated settings may be particularly useful for junior residents who have not yet completed enough operative procedures to be assessed adequately.

With the milestones in mind, the University of Cincinnati revised the assessments performed at the conclusion of every skills laboratory, and a list of OPAs was constructed based on relevance and importance to surgery residents. The intern boot camp was an intuitive starting point, because such simulation-rich environments have proven effective in assessing and preparing interns to start residency training.24 Additionally, performance data from the boot camp were useful in providing the CCC with information regarding the technical skills of individuals at the beginning of residency. Intern boot camps were tailored to evaluate specific skills needed for the milestone evaluation and to ensure that the interns would surpass the "critical deficiency" threshold early in residency and reach the initial "level one" performance by a predetermined date. For simple procedures, such as wound debridement and central venous access, skills laboratories were used not only to determine competency, but also to establish metrics regarding when direct supervision could be converted to indirect supervision in the clinical setting.

The boot camp was also expanded to include 8 objective structured clinical examinations (OSCEs) that evaluated baseline performance of the residents in assessing right lower quadrant pain, postoperative fever, chest pain, change in mental status, hypotension, oliguria, hyperkalemia, and hand-offs. The OSCEs facilitated robust mapping; for instance, in the scenario of right lower quadrant pain, a resident could be assessed in the domains of patient care, knowledge, professionalism, and interpersonal and communication skills. The evaluator was asked to assess how the resident interacted with the patient using specific questions such as, "Did the resident introduce himself and treat the patient with respect?"; this OPA mapped to professionalism. Limiting the number of OPAs for a given activity to no more than 10 ensured that the evaluator was not overwhelmed.
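
As an illustration of how such a capped, domain-mapped checklist might be represented, the following sketch uses invented OPA wording, domain labels, and scores; it is not the University of Cincinnati instrument.

# Illustrative sketch of an OSCE checklist whose items (OPAs) map to
# competency domains. The OPA wording, domains, and scores are invented.

MAX_OPAS_PER_ACTIVITY = 10  # keep the form short so evaluators are not overwhelmed

# Each OPA for the "right lower quadrant pain" OSCE maps to one domain.
RLQ_PAIN_CHECKLIST = [
    ("Introduced self and treated the patient with respect", "professionalism"),
    ("Elicited a focused history of the presenting pain",    "patient_care"),
    ("Performed an appropriate abdominal examination",       "patient_care"),
    ("Listed a reasonable differential diagnosis",           "medical_knowledge"),
    ("Explained the plan to the patient in plain language",  "interpersonal_communication"),
]

assert len(RLQ_PAIN_CHECKLIST) <= MAX_OPAS_PER_ACTIVITY

def domain_summary(checklist, scores):
    """Collapse per-OPA scores (0 = not done, 1 = done) into per-domain
    averages that a CCC could review alongside other data."""
    totals, counts = {}, {}
    for (_, domain), score in zip(checklist, scores):
        totals[domain] = totals.get(domain, 0) + score
        counts[domain] = counts.get(domain, 0) + 1
    return {d: round(totals[d] / counts[d], 2) for d in totals}

if __name__ == "__main__":
    # One hypothetical intern's ratings for the five items above.
    intern_scores = [1, 1, 0, 1, 1]
    print(domain_summary(RLQ_PAIN_CHECKLIST, intern_scores))
    # -> {'professionalism': 1.0, 'patient_care': 0.5,
    #     'medical_knowledge': 1.0, 'interpersonal_communication': 1.0}

Keeping the checklist within the 10-item cap and tagging each item with a single domain is what allows per-domain summaries to feed the milestone report.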

Similarly, the education team mapped OPAs to the milestones to maximize accuracy and efficiency. Other simulations, including procedural and team-based exercises, allowed assessment of both technical and nontechnical skills. Team-based simulations were useful for milestones that are more difficult to observe in the clinical setting, such as leadership of the operating room team during a difficult dissection and communication with the staff.

In conclusion, the new accreditation system implemented recently by the ACGME includes the definition of specific milestones within each specialty. The milestones are aimed at creating a roadmap for residency training within the specialty and thereby provide good reference points for assessing the achievements of residents at each stage across the continuum of training. These milestones also support better assessment practices and identify gaps in resident performance that need remediation. The next step beyond the definition of milestones is to create EPAs that add clinical relevance to the milestones and allow assessment of the core competencies in a bundled fashion. EPAs also add the concept of resident entrustment to the assessments by informing decisions with regard to entrusting residents to perform certain activities or procedures with greater independence; thus, these EPAs are relevant to decisions about indirect and direct supervision. EPAs and milestones may be assessed in real environments as well as in simulated settings. Simulated settings have the unique advantage of helping to standardize the scenarios on which residents are assessed and of allowing residents to make appropriate decisions and take actions without risk to patients.

Simulations are being used in residency programs across the country to teach and assess specific clinical, technical, and nontechnical skills. The definition of EPAs and milestones in surgery provides the opportunity to advance these efforts. In addition, the sharing of experiences and best practices across various specialties, both surgical and nonsurgical, should be helpful in fostering innovation. The panel convened at the 2014 Meeting of the Consortium of ACS-AEIs articulated the importance of simulation in addressing EPAs and milestones and focused specifically on experiences from 3 specialties, namely pediatrics, emergency medicine, and surgery, each at a different institution. In pediatrics, the efforts were directed at the internship year and included development of a core set of 6 simulated cases.
Assessments in simulated environments demonstrated certain benefits over assessments performed in real settings and also highlighted difficulties in placing milestones within the framework of certain competencies because of the overlap with other related competencies. This finding is important because it underscores the problems associated with compartmentalization of skills into separate but related competencies. The experience from emergency medicine was different in that the milestones were mapped onto an established resident simulation curriculum that included encounters across the 3 years of training. Both part-task trainers and high-fidelity case simulations were used for this purpose. This complex endeavor was demonstrated to be feasible. In surgery, in addition to addressing EPAs, residents were also observed performing focused activities that involved specific tasks; these were termed OPAs. The EPAs and OPAs were mapped to the milestones and were used in the intern boot camp. Assessments in simulated settings were used to evaluate the skills of the interns, and the results informed decisions relating to the level of supervision needed. A spectrum of clinical, technical, and nontechnical skills was addressed in this fashion.

The varied experiences with the use of simulation across these 3 specialties yielded valuable information. The next step needs to involve development of a longitudinal, simulation-based curriculum across the 5 years of training based on EPAs and specific milestones. This development will be a major step toward ensuring that all residents demonstrate specific levels of competence and performance through the various stages of their training. This information should be valuable to residency program directors and to the Institutional Clinical Competency Committee. The development of such a national, simulation-based curriculum will require considerable effort and the deployment of considerable resources, but the benefits accrued should have a positive impact on the training of surgery residents and on patient care.

REFERENCES

1. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next accreditation system – rationale and benefits. N Engl J Med 2012;366:1051-6.
2. Accreditation Council for Graduate Medical Education (ACGME). ACGME milestones. Available from: www.acgme.org/acgmeweb/tabid/430/ProgramandInstitutionalAccreditation/NextAccreditationSystem/Milestones.aspx. Accessed February 17, 2015.
3. Accreditation Council for Graduate Medical Education (ACGME). Common program requirements. Available from: www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/CPRs2013.pdf. Accessed February 17, 2015.
4. French JC, Dannefer EF, Colbert CY. A systematic approach toward building a fully operational clinical competency committee. J Surg Educ 2014;6:e22-7.
5. Gardner AK, Scott DJ, Choti MA, Mansour JC. Developing a comprehensive resident education evaluation system in the era of milestone assessment. J Surg Educ 2015. http://dx.doi.org/10.1016/j.jsurg.2014.12.007. Accessed May 15, 2015.
6. Stefanidis D, Grewal H, Paige JT, Korndorffer JR, Scott DJ, Nepomnayshy D, et al. Establishing technical performance norms for general surgery residents. Surg Endosc 2014;28:3179-85.
7. Fried GM, Feldman LS, Vassiliou MC, Fraser SA, Stanbridge D, Ghitulescu G, et al. Proving the value of simulation in laparoscopic surgery. Ann Surg 2004;240:518-25.
8. Sachdeva AK, Buyske J, Dunnington GL, Sanfey HA, Mellinger JD, Scott DJ, et al. A new paradigm for surgical procedural training. Curr Probl Surg 2011;48:854-968.
9. Kairys JC, McGuire K, Crawford AG, Yeo CJ. Cumulative operative experience is decreasing during general surgery residency: a worrisome trend for surgical trainees? J Am Coll Surg 2008;206:804-11.
10. Mattar SG, Alseidi AA, Jones DB, Jeyarajah DR, Swanstrom LL, Aye RW, et al. General surgery residency inadequately prepares trainees for fellowship: results of a survey of fellowship program directors. Ann Surg 2013;258:440-9.
11. Friedell ML, Vandermeer TJ, Cheatham ML, Fuhrman GM, Schenarts PJ, Mellinger JD, et al. Perceptions of graduating general surgery chief residents: are they confident in their training? J Am Coll Surg 2014;218:695-703.
12. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med 2007;82:542-7.
13. Mulder H, ten Cate O, Daalder R, Berkvens J. Building a competency-based workplace curriculum around entrustable professional activities: the case of physician assistant training. Med Teach 2010;32:e453-9.
14. Chisholm CD, Whenmouth LF, Daly EA, Cordell WH, Giles BK, Brizendine EJ. An evaluation of emergency medicine resident interaction time with faculty in different teaching venues. Acad Emerg Med 2004;11:149-55.
15. Verhulst SJ, Colliver JA, Paiva RE, Williams RG. A factor analysis study of performance of first-year residents. J Med Educ 1986;61:132-4.
16. Williams RG, Sanfey H, Chen X, Dunnington GL. A controlled study to determine measurement conditions necessary for a reliable and valid operative performance assessment: a controlled prospective observational study. Ann Surg 2012;256:177-87.
17. Williams RG, Verhulst S, Colliver JA, Sanfey H, Chen X, Dunnington GL. A template for reliable assessment of resident operative performance: assessment intervals, number of cases, and raters. Surgery 2012;152:524-7.
18. Williams RG, Roberts NK, Schwind CJ, Dunnington GL. The nature of general surgery resident performance problems. Surgery 2009;145:651-8.
19. Beeson MS, Carter WA, Christopher TA, Heidt JW, Jones JH, Meyer LE, et al. The development of the emergency medicine milestones. Acad Emerg Med 2013;20:724-9.
20. Korte RC, Beeson MS, Russ CM, Carter WA. The emergency medicine milestones: a validation study. Acad Emerg Med 2013;20:730-6.
21. Accreditation Council for Graduate Medical Education (ACGME), American Board of Emergency Medicine. The emergency medicine milestone project. Available from: www.acgme.org/acgmeweb/Portals/0/PDFs/Milestones/EmergencyMedicineMilestones.pdf. Accessed February 17, 2015.
22. Hauer KE, Soni K, Cornett P, Kohlwes J, Hollander H, Ranji SR, et al. Developing entrustable professional activities as the basis for assessment of competence in an internal medicine residency: a feasibility study. J Gen Intern Med 2013;28:1110-4.
23. Warm EJ, Mathis BR, Held JD, Pai S, Tolentino J, Ashbrook L, et al. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med 2014;29:1177-82.
24. Krajewski A, Filippa D, Staff I, Singh R, Kirton OC. Implementation of an intern boot camp curriculum to address clinical competencies under the new Accreditation Council for Graduate Medical Education supervision requirements and duty hour restrictions. JAMA Surg 2013;148:727-32.
