ORIGINAL ARTICLE

Use of Simulation to Test Systems and Prepare Staff for a New Hospital Transition

Mark D. Adler, MD,*† Bonnie L. Mobley, BSN,†‡ Walter J. Eppich, MD, MEd,*† Molly Lappe, BSN,‡ Michaeleen Green, BA,§ and Karen Mangold, MD, MEd*

Objective: To describe the development and key outcomes arising from the use of simulation as a method to test systems and prepare staff for a transition to a new hospital.

Methods: We describe a simulation program developed by key parties with the goals of reducing latent safety threats present at the opening of a new hospital and of training staff in new workflows. Issues identified were collected and reported to leadership. Outcomes included the number of learners reached, issues identified (grouped by theme), and results of a postmove survey of hospital-based staff.

Results: Approximately 258 hours of simulation were conducted, impacting 514 participants. We conducted 64 hours of system testing and 196 hours of training during the main orientation process. Approximately 641 unique issues were identified (175 equipment, 136 code alarm, 174 barriers to care, and 156 incorrect signage). In a hospital-wide survey, 38% of respondents reported simulation as part of their training (39% of nurses and 23% of physicians). Forty-three percent of survey respondents reported multidisciplinary simulations, and 55% of simulation attendees felt that the simulation was helpful and eased their transition to the new hospital.

Conclusions: Systems testing and education using simulation can play a meaningful role in new facility training. Key lessons included early planning, allocation of resources to the effort, flexibility to adapt to changes, and planned integration with other training activities. A formal a priori plan to address issues identified during the process is necessary.

Key Words: simulation, latent safety threats, system change

System change introduces opportunities for error.1 In health care, as in other fields, we recognize that error can be active or latent.2 The goal of managing system change is to mitigate harm, in part by identifying risks before they reach a patient.3 As part of our hospital's move from one location to a larger facility, a number of activities were conducted to reduce risk, including an extensive design process by firms with experience in hospital development, input from clinical providers early in the process, and system testing. Included in this process was simulation-based testing.

Simulation is an educational methodology with a decade-long track record in high-reliability fields in general4 and in health care5 specifically. Many reports support its use for training and assessment, as does expert opinion on simulation best practices.6–8 Although the use of simulation has been reported in preparation for new departments and units, we identified no house-wide simulation implementation at the time of this undertaking. Simulation-based testing allows for evaluating clinical environments and workflows by engaging teams in care processes in their work setting and obtaining feedback on their experiences. The use of simulation to support system change and to test health-care systems to uncover latent safety issues has been described in recent works,9–13 although our understanding is in its infancy.

Our institution has experience with using simulation on a smaller scale for system testing. Following this concept, our hospital program used simulation in 2010 to test new processes and procedures in a new 10-bed medical observation unit. This previous effort informed the present work, in which we describe the use of simulation to prepare for the move of an entire hospital from one locale and physical layout to a new location and structure. We hypothesized that simulation-based systems testing would permit the identification and remediation of issues related to a move of this scale and improve staff perception of readiness for the move. Our primary outcome was the number of events identified through this process, with secondary outcomes being survey data from participants and a description of the challenges encountered during the process.

(J Patient Saf 2015;00: 00–00)

From the *Departments of Pediatrics and †Medical Education, Northwestern University Feinberg School of Medicine; and ‡Departments of Nursing and §Medical Management, Ann & Robert H. Lurie Children's Hospital of Chicago, Chicago, IL.
Correspondence: Mark Adler, MD, kidSTAR Medical Education Program, Ann & Robert H. Lurie Children's Hospital of Chicago, 225 E. Chicago Ave, Chicago, IL 60611 (e-mail: [email protected]).
Conflicts of Interest and Source of Funding: Ms. Mobley, Green, and Lappe and Drs. Adler and Mangold have no conflicts of interest to declare. Dr. Eppich receives salary support paid to his institution to teach at the Center for Medical Simulation, Boston, USA, receives per diem honoraria to teach courses with PAEDSIM e.V., Germany, and serves on the Society for Simulation in Healthcare Board (through 12/31/2014).
Supplemental digital contents are available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's Web site (www.journalpatientsafety.com).
Copyright © 2015 Wolters Kluwer Health, Inc. All rights reserved.

METHODS

Setting

Children's Memorial Hospital was located in a large, urban community and functioned as a tertiary care pediatric facility, serving more than 500,000 outpatients and 13,000 inpatients per year. The hospital was relocating from a 9-story building to a 23-story building located 3 miles away.

Planning

Hospital planning for the move began in 2006. The simulation program was invited to help with this process by the key leadership groups (patient safety, hospital leadership, and nursing education) who owned the transition process. The active buy-in from key leaders, in our case the chief nursing and medical officers, was built on our previous well-received simulation work at the hospital, going back to 2002. We were able to further engage these leaders based on the impact of the observation unit project, in which a number of actionable issues were uncovered. We partnered on a day-to-day basis with nursing education as they led the educational effort. The new hospital education process involved both on-site and Web-based training, much of the planning and budgeting for which had taken place before our engagement. The nursing education and patient safety teams agreed to start with simulations, with a plan to address financial support as we moved forward. In retrospect, this effort would not have started had we waited for financial support.

Copyright © 2015 Lippincott Williams & Wilkins. Unauthorized reproduction of this article is prohibited.


Our first goal was to work with the nursing education team to develop an effective and cost-efficient approach for implementing simulation scenarios. Because of logistical constraints, simulations needed to occur when other training was scheduled, because no additional time was allotted. There were 2 rounds of on-site training, and we chose to integrate into the longer second phase, which had more time, was unit based, and occurred closer to opening. The potential target audience was large: hundreds of clinical staff members working across 12 clinical floors. To remain cost-neutral, we chose to focus on the clinical areas with substantive workflow changes or higher-risk/low-resource clinical settings. Using this approach to prioritization, the planning group chose to focus on high-yield areas: (a) the emergency department (ED), (b) the new cardiac care unit, (c) the pediatric intensive care unit, (d) the operating room (OR) floors, (e) the transport team (new ambulance bay and heliport), and (f) radiology. Meeting with the educators and clinical leaders for each area, we developed simulation exercises that matched their greatest concerns, both for testing systems and for testing workflows. Table 1 illustrates the project timeline.

There were several external constraints on the training schedule. Construction of the hospital would be completed very close to the opening date. There was a fiscal incentive to open as soon as the facility was ready and not to allot extra time for training after this point. This fact made it difficult to test systems and workflows, as these systems would not be completely operational during our training window. Because of the cost associated with operating 2 facilities, the information technology, including the clinical software and patient monitors, would not be available until a few weeks before opening. Stocking of patient supplies would similarly occur just before the hospital move.

To address these constraints, we created simulations that focused on process and workflow: patient transports, call systems, care handoffs, code responses, and orientation to new room layouts. In addition to the foreseen barriers to testing systems and processes completely, there were 2 unscheduled blocks of time removed from the orientation calendar because of unexpected events. The team addressed this loss of time, as well as feedback from staff who wished to review systems that were not available during the first cycle, by conducting additional system tests (code, OR workflows) in the week leading up to opening and during the first few days of operation.

TABLE 1. Timeline. [Original chart not reproducible in text: it spans 2004–2012 and shows Board Move Approval, Site Selection, Design Plan, Move Planning, and Hospital Opens, along with Simulation Initial Planning, Simulation Planning, Implementation, and Data Collection.]

Implementation

In January 2012, six months before the hospital opening, we began exercises in the operating room and its adjacent care areas, the pediatric intensive care unit, the cardiac care unit, the emergency department, and the heliport. We conducted elevator time testing at the same time. Simulations were run as a collaborative effort of the simulation, patient safety, and nursing education staff. We collected systems issues and provided this information to hospital administration. Issues that could be addressed (e.g., sign corrections) were corrected. Other issues required workflow changes because the underlying concern could not be fixed or represented a new facet of the work environment after the move (e.g., longer transport time to and from imaging in the new facility). The length of training varied to reflect the needs of the specific units, typically involving about 2 of the total 8 hours of unit-specific orientation that each staff member received at the new site. As the simulations focused on process and not on clinical care, we used low-fidelity approaches for most simulations, with higher-fidelity use limited to certain critical care unit events. This choice increased the flexibility and reduced the time and cost involved in implementation without a substantive impact on our goals.

Faculty with simulation debriefing experience facilitated all debriefings at the bedside immediately after the simulations. This allowed for direct reference to the clinical environment (e.g., "the suction canister is in a place I cannot reach"). The discussions focused on "What worked well?" and "What were barriers to care?" as this was the content of interest. We continued until all issues were identified. Medical issues were addressed as time permitted. Each simulation lasted about 30 minutes but varied by area.

In the 2 weeks before opening, we conducted additional system tests, which benefited from the proximity to opening day, as more staff and equipment were on site and functional:
(a) Code tests — The purpose of this late system testing was a specific focus on code response across the hospital, including public, employee, and patient care areas. We chose this time frame because members of the code team were on site. Their presence allowed us to test both the mechanics of the code system (paging, elevator swipe pads, and signage) and team function in the new space. In addition to looking at arrival time for each team member, we also recorded the barriers to arriving, accessing airway equipment, and locating the crash cart, as well as the ability to run the code as a team in the area.
(b) Patient flow tests to the OR from the ED and catheterization lab — These occurred on the weekend of opening, when the OR was lightly scheduled. We conducted a trauma simulation, focusing on transport from the ambulance bay to the ED and then to the OR. Finally, we simulated 2 cardiac surgical cases in the cardiac catheterization lab and the new cardiothoracic OR.
(c) ED triage — The emergency department tested patient flow through the triage area, which was previously unavailable because of construction limitations.
(d) Cardiac intensive care unit — We tested critical events with a fully stocked unit, and the operating floors tested patient flow from arrival check-in through the postoperative area.

Data Collection Plan

We collected data on 3 main outcomes: (a) the number of participants undergoing training, hours of training, and some information regarding costs; (b) the number of issues identified during the process, reported as totals and compared with the issues reported to a call center after the move; and (c) survey data from staff. This online, anonymous survey was sent to all hospital clinical employees 3 months after the move (see Appendix 1, Supplemental Digital Content, http://links.lww.com/JPS/A24). The survey collected demographic data as well as feedback on the characteristics of the training group and self-assessed readiness for the move. Three emails were sent to the group email address in an attempt to improve response rates. Survey responses are reported as proportions for demographic data and either proportions or dichotomized agree/disagree proportions for Likert questions. A description of barriers encountered is provided.

TABLE 2. Issues Identified

Equipment (175 issues)
Examples: Emergency airway box obstructed by equipment in many locations. Room layout and door functions in ICU rooms presented barriers to bringing equipment to the bedside (e.g., door opening inward, computer cart by the bedside). New beds that staff were unfamiliar with impeded care, transport, and BLS during code events.
Remediation: Many boxes were moved to facilitate access. Staff education and changes to certain equipment to facilitate use. Remedial just-in-time education focused on equipment where knowledge gaps existed.

Code Alarm (136 issues)
Examples: Code buttons in ICUs located in areas difficult to reach. Code alarm did not ring to both sides of the unit. Elevator call system for codes did not work as planned (e.g., did not designate an elevator for medical emergencies limiting the stops to the designated floor). Code buttons triggered incorrect information in pages (e.g., wrong floor, nondescript locations, architectural number rather than common number).
Remediation: Buttons moved or education provided to providers. Code system software corrected. Programming changes and focused code team education. Testing and correction of code buttons throughout the facility.

Patient Care Concerns (174 issues)
Examples: Code carts in public areas were in locked closets, hard to find and move to a room in a timely manner. Longer routes from ICUs to other hospital locations. Longer and farther road trips for patients to be off the unit for testing.
Remediation: Staff education and additional simulations before hospital opening. Workflow changes to ICU transport practices. Development of "go bags" for medications, treatments, and airway support; policy created for provider-accompanied road trips.

Signage/Wayfinding (156 issues)
Examples: Access to patient care areas limited; not all clinical care providers had the same security access. Lack of signage when exiting staff elevators, delaying code responders. Shift to architectural numbers versus common room numbers (a Children's Memorial Hospital to Lurie Children's cultural shift). Complex architectural numbering design hindering location of rooms; signage organization not intuitive. Name changes for preexisting departments created confusion.
Remediation: Recoding of badge card readers and reclassification of clinical care provider badges. Signage improvements (in stages, first for opening and then over the next 2 years). Revisiting the numbering of rooms and changing to common room numbers for patient care areas. Adding North/South/East/West and "Land Side"/"Lake Side" signage. Signs reorganized to put patient care rooms at the top. Remediation of wayfinding exercises.
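The dichotomized agree/disagree reporting described above can be sketched as follows. This is an illustrative example only; the response values are made up for demonstration and are not study data, and the 4-or-higher cutoff is an assumed convention for a 5-point Likert scale.

```python
# Hypothetical sketch: collapse 5-point Likert responses into a dichotomized
# agree/disagree proportion. Values 4-5 count as "agree"; 1-3 do not.
responses = [5, 4, 2, 5, 3, 4, 1, 4]  # 1 = strongly disagree ... 5 = strongly agree

agree = sum(1 for r in responses if r >= 4)  # count of agree/strongly agree
proportion_agree = agree / len(responses)    # reported as a proportion

print(f"{agree} of {len(responses)} respondents agree")
```

The same collapse applied per question yields figures like the "55% felt the orientation eased their transition" result reported below.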

RESULTS

A total of 258 hours of simulation were conducted, impacting 514 participants. Of this, 64 hours were system testing and the remainder, 196 hours, was part of the early orientation process. A mix of interprofessional and single-profession simulations was held. The cost of this project included simulation staff time for the hours stated above plus about 30 minutes of setup and breakdown time for 2 people (in our case, staff were physicians and/or nurses). As we scheduled events for learners during their allotted general training time, we did not incur further staff costs.

Table 2 details the clinical issues identified during our process. In total, 641 unique issues were identified before opening, including 175 equipment issues, 136 code alarm functionality problems, 174 unexpected barriers to care, and 156 signage/wayfinding issues. Postopening, the "Move Center" help desk systems collected 4489 problems, of which 1724 issues were related to the clinical care areas that were the subject of the simulations.

Seven hundred ninety-one staff members responded to the online survey, with respondents consisting of 56% nurses, 21% physicians, and 9% advanced practice nurses, from a broad variety of work locations and years of experience. Eighty-nine percent of respondents reported attending an early orientation, and 38% reported simulation as part of this orientation. A higher proportion of nurses (39%) than physicians (23%) reported participating in a simulation. Overall, 43% of respondents reported multidisciplinary simulations (at least one other participant from a field other than their own). Fifty-five percent of those who attended an early orientation with simulation felt that this orientation eased their transition to the new hospital. Forty-six percent felt the number of simulations was sufficient, and 48% felt the amount of time for simulations was sufficient.
We encountered specific issues during this process:
(a) The window for planning and implementation was short, and demand on time and resources was high. There was a strong fiscal incentive to adhere to the planned opening date, given that a delay would significantly hinder patient care operations and continuity of care. As a result, simulations occurred in a partly finished facility, which affected the training and the preparation required of the simulation team (bringing supplies, for example). In particular, we found that clinical information technology was not available for training in the new facility until after opening.
(b) Scheduling learners was challenging and required integration with other activities. A majority of hospital employees are hourly employees, and there were no additional resources for extra simulation testing. Furthermore, matching schedules across professions was difficult because of different administrative structures (MD vs. others) and the need to staff the old facility. We noted that the nursing and ancillary staff had more formal systems ensuring attendance at events than did physicians. Although we made an effort to promote interprofessional training, this goal was met more fully in some areas than in others.
(c) The feedback obtained from simulations needed to be managed in real time and routed to leadership in a manner that prevented the data from getting lost during this busy time. We developed a new interface between our program and the patient safety leadership group.
(d) Code response testing revealed important findings, including challenges in wayfinding because of signage and paging information, issues with elevator access (e.g., the pharmacy floor has one elevator for primary use), and prioritization of code swiping to call an elevator. This testing led to substantial changes in code response workflow, signage, and team education.

DISCUSSION

Our results demonstrate that a large number of issues (641) were identified and that they fell into the specific groupings outlined in Table 2. Although not directly comparable to the number of events identified at opening via the call center, the number of issues is substantial. We found that our staff perceived increased readiness for the new hospital transition. We also noted some specific barriers that we discovered during the process.

We believe that the early inclusion of simulation activities in move planning, the presence of existing simulation resources available locally, and the significant support from the education and patient safety leadership of our institution contributed to the success of this effort. To address the barriers we identified, we recommend the following: (a) begin planning as early as possible and seek resources to subsidize the time and effort of the simulation faculty conducting this work; (b) during planning, work toward dedicated simulation-based testing time, likely within the same time frame as other training; (c) plan for routing data from simulations to administration, including how it will be routed and prioritized and how information will flow back to care areas (ideally constituted as a dedicated administrative person's role); and (d) conduct code response testing early in the process. Code team testing requires participation of all code team members to provide feedback on the barriers and concerns they have responding to codes in various areas of the hospital.

Our study is limited to some degree by arising from a single site, although moves of this kind are sufficiently infrequent to limit the possibility of a multihospital effort. Furthermore, our survey was sent via an email alias to the entire hospital staff and faculty, which includes some individuals who would not be part of simulations (outpatient staff), as there was no inpatient-only list.
The survey included a step to opt out if the recipient was not an inpatient provider to mitigate this issue, but we do not have a denominator for this survey population to calculate a true response rate.

CONCLUSIONS

The incorporation of simulation-based systems testing into new facility training is feasible. Key success factors include early planning, allocation of resources to the effort, alternative implementation timelines that allow flexibility when workflow interruptions arise from unforeseen events, integration with other training activities, and a formal plan to address issues identified during the process.

REFERENCES

1. Manojlovich M, Lee S, Lauseng D. A systematic review of the unintended consequences of clinical interventions to reduce adverse outcomes. J Patient Saf. 2014;1.

2. Reason J. Human error: models and management. BMJ. 2000;320:768–770.

3. Fairbanks RJ, Wears RL, Woods DD, et al. Resilience and resilience engineering in health care. Jt Comm J Qual Patient Saf. 2014;40:376–383.

4. Henriksen K, Moss F. From the runway to the airway and beyond. Qual Saf Health Care. 2004;13(Suppl 1):i1.

5. Baker DP, Day R, Salas E. Teamwork as an essential component of high-reliability organizations. Health Serv Res. 2006;41:1576–1598.


6. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27:10–28.

7. McGaghie WC, Issenberg SB, Petrusa ER, et al. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63.

8. Dieckmann P, Phero JC, Issenberg SB, et al. The first Research Consensus Summit of the Society for Simulation in Healthcare: conduction and a synthesis of the results. Simul Healthc. 2011;6(Suppl):S1–S9.

9. Kobayashi L, Shapiro MJ, Sucov A, et al. Portable advanced medical simulation for new emergency department testing and orientation. Acad Emerg Med. 2006;13:691–695.

10. Villamaria FJ, Pliego JF, Wehbe-Janek H, et al. Using simulation to orient code blue teams to a new hospital facility. Simul Healthc. 2008;3:209–216.

11. Geis GL, Pio B, Pendergrass TL, et al. Simulation to assess the safety of new healthcare teams and new facilities. Simul Healthc. 2011;6:125–133.

12. Patterson MD, Geis GL, Falcone RA, et al. In situ simulation: detection of safety threats and teamwork training in a high risk emergency department. BMJ Qual Saf. 2013;22:468–477.

13. Wheeler DS, Geis G, Mack EH, et al. High-reliability emergency response teams in the hospital: improving quality and safety using in situ simulation training. BMJ Qual Saf. 2013;22:507–514.

