

Using Failure-to-Rescue Simulation to Assess the Performance of Advanced Practice Professionals

Lisa M. Blackburn, MS, RN, AOCNS®, Sherri Harkless, MSN, APRN/CNS, RN-BC, and Paula Garvey, MSN-ED, RN-BC

The use of advanced practice professionals (APPs) has been established in oncology care. APPs are frequently the most readily available ordering provider for care guidance when it becomes evident that a patient with cancer is failing. The purpose of the current preliminary descriptive project was to determine the best method for assessing APP performance in oncology-specific circumstances, particularly in the failing patient with cancer. A test group of 14 APPs completed a competency self-assessment and the Basic Knowledge Assessment Tool (BKAT)-8SR and attended a four-hour simulation and classroom experience. Competency checklists with 30 priority interventions for each scenario had been anticipated by an expert panel. The APP competency self-assessment measured knowledge base and critical thinking. All of the APPs scored at or above the level of a critical care nurse with one year of experience on the BKAT-8SR. Twenty-seven of the anticipated interventions were enacted by all APPs. Five additional interventions were ordered that had not been anticipated. The success of this educational strategy has stimulated new learning opportunities, including initiation of a full-team oncology failure-to-rescue simulation, course restructuring, and other innovative simulation experiences.

Lisa M. Blackburn, MS, RN, AOCNS®, is a leukemia clinical nurse specialist and Sherri Harkless, MSN, APRN/CNS, RN-BC, is an acute care and critical care nurse educator, both at the Arthur G. James Cancer Hospital and Richard J. Solove Research Institute in Columbus, OH; and Paula Garvey, MSN-ED, RN-BC, is a program manager for simulation and continuing nursing education at the Ohio State University Wexner Medical Center in Columbus. The authors take full responsibility for the content of the article. The authors did not receive honoraria for this work. The content of this article has been reviewed by independent peer reviewers to ensure that it is balanced, objective, and free from commercial bias. No financial relationships relevant to the content of this article have been disclosed by the authors, planners, independent peer reviewers, or editorial staff. Blackburn can be reached at [email protected], with copy to editor at [email protected]. (Submitted July 2013. Revision submitted September 2013. Accepted for publication September 15, 2013.)

Key words: advanced practice professionals; rescue simulation; continuing education

Digital Object Identifier: 10.1188/14.CJON.301-306

The use of advanced practice professionals (APPs), such as certified nurse practitioners (CNPs) and physician assistants (PAs), has expanded greatly in the past 10 years in inpatient and ambulatory settings. The advanced practice role has evolved in response to the numerous changes in health care (i.e., distribution of services, staffing, technology, and educational opportunities) (McCorkle et al., 2012; Murphy-Ende, 2002). APPs have become a mainstay of patient care and are valued for the knowledge and skills they add. Other members of the oncology healthcare team are aware of the contributions APPs make in the areas of decision making and problem solving (Kilpatrick, 2013; Klipfel et al., 2011). In the inpatient setting, APPs are frequently the most readily accessible ordering provider when clinical guidance is needed. Although the physician is consulted, the APP sometimes is the first provider notified when it becomes evident that a patient with cancer is failing. Early identification of patient deterioration and timely patient management have been demonstrated to improve patient outcomes (Cooper et al., 2011; Endacott et al., 2012). As the population ages and the number of oncologists declines, the demand for APPs to provide expert clinical care will intensify (McCorkle et al., 2012).

Background

High-fidelity human patient simulation, a largely risk-free approach to learning (Yuan, Williams, Fang, & Ye, 2012), has been widely used in colleges of nursing for many years; however, simulation is relatively new to hospital-based nursing education (Decker, Sportsman, Puetz, & Billings, 2008). Clinical simulation is a technique using guided practices that imitate substantial aspects of the real world in a fully interactive approach (Kaddoura, 2010). This type of simulation incorporates a computerized full-body mannequin that can be programmed to provide realistic physiologic responses to a practitioner's actions. The mannequins have realistic heart, lung, and bowel sounds, reactive pupils, and full-body pulses that respond to physiologic events. The mannequins are used to simulate hemodynamic monitoring, needle decompressions, and chest tube insertions, and can be made to exhibit realistic bleeding. They are able to demonstrate a wide variety of clinical conditions and respond to practitioners' corresponding interventions. Because of those capabilities, the mannequins are a useful tool for hospital educators to teach and to aid in the evaluation of competencies (Decker et al., 2008).

Hospital-based nursing educators are routinely faced with teaching new and experienced nurses in addition to determining clinical competency. Nursing competency involves the acquisition of relevant knowledge, the development of psychomotor skills, and the ability to apply the knowledge and skills in a given context (Decker et al., 2008). Nurse educators have discovered that the traditional methods of evaluating competency (e.g., pre- and post-tests, verbal question-and-answer sessions, return demonstrations) do not always meet their needs to accurately determine competencies. Those methods allow nurses to show every skill step on paper or in a controlled setting rather than demonstrate the critical responsibilities of practice and an understanding of nursing science in a situation that replicates reality more closely (Allen et al., 2008). Calls to abandon these traditional approaches have been influenced by the complex healthcare environment, changes in technology, and pressing patient safety issues (Byrne, 2005). Hospital-based nursing educators are turning to high-fidelity human patient simulation as a solution (Decker, Utterback, Thomas, Mitchell, & Sportsman, 2011). A major role of nursing educators is to facilitate learning, the evaluation of skills and competencies, and the development of clinical judgment (i.e., the marriage of knowledge and clinical practice) (Lasater, 2011).

The Ohio State University Wexner Medical Center (OSUWMC) in Columbus has been using simulation as an educational tool and a method to evaluate nursing competencies since 2006. To date, educators have completed 14 unit-based or division-based projects that have used simulation as one of the teaching and evaluation methods, and about 1,000 nurses have completed simulation-based education events or competency evaluations. Despite the use of simulation with RNs, it had never been used with APPs at OSUWMC.

The education needs of practitioners continue after graduation and throughout clinical practice (Benner, Sutphen, Leonard, & Day, 2009; del Bueno, 2005). At the authors' cancer hospital, nursing standards require a voluntary education needs assessment of all levels of staff every two years. This written self-assessment contains a combination of multiple-choice, oncology-focused topics, as well as some questions that require fill-in-the-blank answers. As a component of the full assessment, last completed in 2011, the APP Educational Needs Assessment was completed by 70 of 132 APPs, a 53% response rate at that time. The results clearly indicated a need for more education interventions designed to better prepare healthcare practitioners to respond in emergency or rapidly declining clinical situations in the oncology population. The top six most highly rated topics were (a) acute and critical oncology care, (b) caring for the unstable patient with cancer, (c) symptom management, (d) diagnostic tests and interpretation, (e) pain management, and (f) palliative care. Those topics were identified even though 52% (n = 69) of the respondents had been in their advanced practice role for more than five years and 51% (n = 68) had more than five years of experience in oncology. After a detailed review of the full education needs assessment results, the master's-prepared advanced practice educator determined that the needs called for an education assessment and intervention using simulation.

The purpose of the current evidence-based practice pilot project was to determine a best method for assessing APP performance in oncology-specific circumstances. The secondary objective was to determine whether simulation would provide a robust assessment and education intervention for this group.

FIGURE 1. Clinical Judgment Model (diagram). The model moves through four phases: Noticing (shaped by context, background, relationship, and expectations, leading to an initial grasp), Interpreting (analytic, intuitive, and narrative reasoning patterns), Responding (action and outcomes), and Reflecting (reflection in action, and reflection on action and clinical learning).
Note. From "Thinking Like a Nurse: A Research-Based Model of Clinical Judgment in Nursing," by C.A. Tanner, 2006, Journal of Nursing Education, 45, p. 208. Copyright 2006 by SLACK Inc. Adapted with permission.

FIGURE 2. Examples of Priority Interventions for Advanced Practice Professional Education Assessment for Sepsis and Respiratory Failure Simulation
• Immediate patient assessment after receiving report
• Vital signs with critical analysis of results
• Early recognition of respiratory distress
• Appropriate notification of RN/charge nurse with assessment of respiratory distress
• Appropriate notification of RN/charge nurse with assessment of sepsis
• Appropriate notification of attending physician/fellow with assessment of respiratory distress and sepsis
• Gives appropriate orders based on patient condition, with critical analysis of results
  – Respiratory distress: arterial blood gas, oxygen, bronchodilator
  – Sepsis: bolus IV fluids, urinalysis with culture and sensitivity, lactate level, broadened antibiotic coverage, central venous pressure
• Responds appropriately to nursing questions regarding guidance in the situation
• Requests or implements the sepsis resuscitation bundles
• Appropriately obtains and follows the febrile neutropenia protocol
• Proficient understanding and ordering of the various oxygen modalities

Theoretical Framework

The theoretical framework for this project was based on the Clinical Judgment Model (Tanner, 2006) (see Figure 1). This theory supports the use and development of clinical judgment through experiential learning and explores the ways in which care providers understand patient issues, attend to salient information, and respond with the deliberate, conscious interventions of the proficient or expert practitioner (Lasater, 2007, 2011; Tanner, 2006).

Methods and Analysis

A test group of 14 APPs, about 10% of the APP workforce at the time of implementation, completed the full oncology APP failure-to-rescue simulation initiative. The initiative, designed to correlate with the model shown previously, included completion of an online competency self-assessment, completion of the Basic Knowledge Assessment Tool (BKAT)-8SR, and attendance at a four-hour simulation and classroom experience. The BKAT-8SR has been used in research since 1979 and has been accepted as a standard for measuring basic knowledge in critical care. The program and content were developed by the OSUWMC simulation program manager and the Ohio State University Comprehensive Cancer Center—Arthur G. James Cancer Hospital and Richard J. Solove Research Institute advanced practice educator and acute care nurse educator. The program structure included a didactic portion that reviewed core elements of typical failure-to-rescue situations and case scenario "dissections" that showed trends in care, priorities of care, and responses to actions and orders. This portion of the experience focused on the APP's role in the care of the declining patient with cancer. The instructor reviewed common critical clinical events for patients with cancer and key intervention strategies. Emphasis was placed on the ability to assess for subtle changes in condition in order to intervene early and avoid escalating the patient crisis.

Clinical scenarios for the simulations were developed from a review of previous patients who had presented with what were later deemed to be difficult-to-manage crisis oncology clinical situations. The review included specific patient charts, early response team data, code-blue data, and event reports. The simulation included one scenario focused on a patient with sepsis and respiratory failure and one scenario focused on a postoperative patient who was experiencing pain following a pneumonectomy. The specific patient scenarios were chosen because those types of cases were seen more frequently in the data reviewed and because each case had a clear but divergent focus; one case was more medically focused, and the other was more surgically focused.

Competency checklists with about 30 specific priority interventions for each scenario were anticipated by an expert panel. The expert panel consisted of eight staff members, including educators, APPs, nursing directors, and physicians. To monitor for the anticipated interventions during simulation, APP competency checklists were developed for each scenario (see Figures 2 and 3). To assess and evaluate APP performance during these simulations, oncology clinical nurse specialists were incorporated into the sessions to act in other vital team member roles, including RNs, a charge nurse, patient care associates (i.e., nursing assistants), a respiratory therapist, an anesthesiologist, and family members. A basic script was created for each scenario and used as a guide for all participants to allow the clinical situation to evolve as planned. Figure 4 shows an excerpted example of this script for the patient with sepsis and respiratory failure. All APP participants completed both simulation scenarios (see Figure 5). Each scenario was followed by a debriefing session, which included comments by the participants evaluating their own responses and by the educators conducting the simulation. The responses that evolved from the simulator patient were based on the interventions that were enacted and, therefore, could not always be anticipated.

FIGURE 3. Examples of Priority Interventions for Advanced Practice Professional Education Assessment for Pneumonectomy and Pain Simulation
• Immediate patient assessment after receiving report
• Critical analysis of vital signs
• Early recognition of respiratory distress
• Appropriate notification of RN/charge nurse with assessment of respiratory distress
• Appropriate notification of attending physician/fellow with assessment of respiratory distress
• Proficient understanding and ordering of the various oxygen modalities
• Recognizes and treats decreased urine output
• Recognizes and treats fluid volume deficit
• Recognition of acute mental status change as an early sign of deterioration
• Performs critical analysis of the following: central venous pressure, arterial blood gases, chemistry panel, complete blood count, lactate level, chest x-ray
• Rapid recognition of lethargy and minimal respiratory effort
• Orders or uses bimanual valve mask to supplement respirations
• Anticipates and prepares for cardiac arrest
• Notification of physician regarding impending code status

FIGURE 4. Excerpted Script for APP Sepsis/Respiratory Failure Patient

Background: The patient is being admitted to the inpatient service from an outpatient clinic. The primary nurse and the charge nurse are already in the patient room. The APP will receive a report from a nurse practitioner in the clinic. The report should be hurried, but if the APP asks questions, they should be answered, if possible.

Report: "We have a 60-year-old male with newly diagnosed acute myelogenous leukemia. He was discharged last week after induction with daunorubicin, cytarabine, and etoposide. He visited the clinic today with complaints of an intermittent fever for two days. His fever was 100.5°F in the clinic. During induction, he experienced a severe exacerbation of COPD and started a prednisone 'burst and taper.' He is still on 10 mg per day. He continues to have some nausea, vomiting, and mucositis, but no diarrhea. Line and peripheral blood cultures have been drawn and sent. The patient still has a cough, and clinicians are not sure if it's COPD, so they did a multiplex swab and a chest x-ray, but they have not been read yet. The patient also has a history of hypertension. He has not taken any medications today because of nausea. He is on sliding-scale insulin with steroids, but he was never previously diabetic. Cefepime is hanging now. Looks stable."

If the APP asks:
• Keep-vein-open normal saline into a double-lumen catheter in the right internal jugular
• Heart rate, 108 beats per minute; respiratory rate, 28; blood pressure, 104/70
• Temperature of 101.5°F last night at home
• Pulse oximeter was 94% on room air
• Uses albuterol as needed, but none today
• Lung sounds are decreased at the bilateral bases
• White blood cell count, 500; absolute neutrophil count, 20%; hemoglobin, 8.2; platelets, 75,000
• Chemistries and liver function tests were clear, but creatinine was 1.2, up from a previous baseline of 0.8

Mannequin: Sinus tachycardia with ST-segment elevation at 108 beats per minute, blood pressure of 104/70, respiratory rate of 28, temperature of 100.5°F, and oxygen saturation of 93%. As the APP is assessing the patient, his rhythm becomes more unstable with the addition of premature ventricular contractions. The blood pressure should begin to decrease, with a slow drop to hang around 76/36, and the oxygen saturation should decrease to hang around 85%, with signs of cyanosis and mental confusion.

Mannequin voice: Mr. B [continuous complaining, anxious, mildly confused]: "My chest hurts. I can't catch my breath. It hurts to breathe. I just don't feel good." Mr. B should be complaining and confused constantly enough that it is obvious when he stops talking and becomes unresponsive.

APP—advanced practice professional; COPD—chronic obstructive pulmonary disease
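For educators who program their own simulator scenarios, the scripted deterioration in Figure 4 can also be represented as a simple timed table of target vital signs that a facilitator (or a driver program) steps through unless the participant intervenes earlier. The brief sketch below, written in Python, is purely illustrative: the field names, the trigger times, and the intermediate row of values are assumptions added for the example and do not reflect the authors' simulator or any vendor's programming interface.

# Illustrative sketch only: the Figure 4 deterioration as a timed state table.
# Field names, 5- and 10-minute trigger times, and the interpolated middle row
# are hypothetical; this is not a simulator vendor's interface.
from dataclasses import dataclass

@dataclass
class VitalState:
    minute: int          # minutes into the scenario (hypothetical timing)
    rhythm: str
    heart_rate: int      # beats per minute
    blood_pressure: str  # systolic/diastolic, mm Hg
    resp_rate: int       # breaths per minute
    spo2: int            # oxygen saturation, %
    cue: str             # what the mannequin voice or facilitator conveys

# Target states the facilitator steps through unless the APP intervenes.
SEPSIS_SCRIPT = [
    VitalState(0, "sinus tachycardia with ST elevation", 108, "104/70", 28, 93,
               "complaining, anxious, mildly confused"),
    VitalState(5, "sinus tachycardia with PVCs", 112, "90/56", 30, 89,
               "rhythm becomes more unstable during assessment (interpolated values)"),
    VitalState(10, "sinus tachycardia with PVCs", 118, "76/36", 32, 85,
               "cyanosis, worsening confusion; becomes unresponsive if untreated"),
]

if __name__ == "__main__":
    for state in SEPSIS_SCRIPT:
        print(f"t+{state.minute:>2} min | {state.rhythm} | HR {state.heart_rate} "
              f"| BP {state.blood_pressure} | RR {state.resp_rate} "
              f"| SpO2 {state.spo2}% | {state.cue}")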

Findings

The test group of APPs (N = 14) had an average of 7.8 years in their advanced practice roles. Eleven of the APPs were CNPs, and three were PAs. Of the CNPs in the test group, seven had their advanced certification as family nurse practitioners, and seven were certified as acute care nurse practitioners. None were certified in oncology, which is fairly common among APPs. All of the providers worked exclusively in the inpatient setting because that was the focus of the pilot trial. Eight participants had previous critical care experience.

All members of the test group were asked to complete a competency self-assessment prior to their simulation in the care of the failing patient with cancer by answering two questions on a five-point Likert-type scale, with 1 being "needs improvement," 3 being "competent," and 5 being "adept." One question asked participants to rate their current competency in terms of knowledge base when caring for a failing patient; the average rating for this question was 4. The other question asked participants to rate their current critical thinking skills when caring for a failing patient with cancer; the average score was 3.71. Those ratings show that the APPs felt more than competent in caring for a failing patient in terms of general knowledge base and critical thinking skills, with critical thinking ranking slightly lower.

The next step in this initiative was to complete the BKAT-8SR. The BKAT-8SR has been used for more than 30 years in the assessment of basic critical care knowledge of nurses and has become accepted as a standard for measuring basic knowledge in critical care nursing. Items on the BKAT-8SR include multiple-choice and fill-in-the-blank questions that measure the recall of basic critical care information and the application of basic knowledge in critical care situations. After an orientation or education to critical care, the expected average score is 84%–87%. All of the APP participants scored at or above the level of a critical care RN with one year of experience on this test, with an average score of 92%.

During the actual simulation activity, 27 of the 30 anticipated interventions were enacted by all practitioners. Interventions that were anticipated but not enacted by all participants included central venous pressure monitoring, ordering a lactate level when the first key signs of sepsis were identified, and notifying or calling the responsible physician to the room in a timely manner. Those missing core interventions were discussed and emphasized during the subsequent debriefing and education session. Five additional crucial interventions, which had not been anticipated, were ordered with some regularity by the practitioners: six participants ordered stat medication nebulizers, six ordered a troponin level, six placed the patient in the Trendelenburg position, seven ordered an anesthesia consultation for probable intubation, and eight ordered an ST-elevation myocardial infarction (STEMI) alert. In reviewing those unexpected interventions postsimulation, the educators felt that the troponin level was not anticipated because the clinical scenario progressed in a different manner than expected (i.e., chest pain with related assessment and intervention). In addition, the STEMI alert process had been initiated throughout the institution just prior to the simulation experience. No significant difference in performance was found, by any measure used prior to or during the simulation experience, between family nurse practitioners and acute care nurse practitioners or between CNPs and PAs.

In general, the participant evaluation of the simulation experience was unwaveringly positive.
One participant said, "This is an excellent way to provide real-world scenarios in a practice environment," and another said, "Great way to simulate my responses to patient issues that need to be caught and addressed quickly." Another participant praised the program by saying, "Informative. Educational. Should be required of all advanced practice professionals and house staff!"
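For readers who capture this kind of checklist data electronically, the counts reported above (27 of the 30 anticipated interventions enacted, plus five unanticipated interventions) can be reproduced by comparing each participant's observed orders against the expert panel's anticipated-intervention list. The short sketch below, written in Python, is illustrative only; the intervention names and the single observation record are hypothetical stand-ins, not the project's actual checklist items or data.

# Illustrative sketch only: tallying one participant's performance against an
# anticipated-intervention checklist. Item names below are hypothetical.
anticipated = {
    "immediate assessment after report",
    "critical analysis of vital signs",
    "early recognition of respiratory distress",
    "notify attending physician in a timely manner",
    "order lactate level at first signs of sepsis",
    "order bolus IV fluids",
}

observed = {
    "immediate assessment after report",
    "critical analysis of vital signs",
    "early recognition of respiratory distress",
    "order bolus IV fluids",
    "order troponin level",                    # not on the checklist
    "anesthesia consultation for intubation",  # not on the checklist
}

enacted = anticipated & observed          # anticipated interventions performed
missed = anticipated - observed           # anticipated interventions not performed
unanticipated = observed - anticipated    # extras the expert panel did not list

print(f"Enacted {len(enacted)} of {len(anticipated)} anticipated interventions")
print("Missed:", sorted(missed))
print("Unanticipated:", sorted(unanticipated))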

Discussion

Performance assessment by use of simulation can have positive and negative effects on the participants. The authors' education sessions proved to have positive outcomes related to early recognition and management of a deteriorating patient, team communication, and identification of additional education needs. The participants noted that, after the simulation activity, future education needs seemed much clearer to them than when they were queried on a written self-assessment alone.

Participants provided expected as well as unexpected interventions in response to the case scenarios. Team-based learning augmented with simulation provides the opportunity to expand on the expected outcomes to capture alternate pathways of care. In the sessions from the current study, the authors discovered unexpected interventions and responses as a result of the scenarios progressing down a different diagnostic pathway. Educators need to be prepared to recognize unanticipated variances, quickly decide whether they are appropriate for the expected outcomes of the session, and redirect as needed.

The success of the education strategy has stimulated the initiation of a full-team oncology failure-to-rescue simulation, as well as other innovative simulation experiences at the authors' hospital. This new type of simulation session broadened the participant base to include bedside nurses and advanced practice nurses, as well as other disciplines. The full team included two RNs, a charge nurse, a disease-specific CNP, a patient care associate, a respiratory therapist, and an attending physician, all of whom usually work together in the care of a specific patient population. The case scenario for this simulation was revised to be typical of the care usually provided by the group. Using findings from the previous sessions for the APPs, the key elements for the simulation activity focused on assessment and recognition, interventions, and communication. The addition of the attending physician as a participant in the sessions stimulated a different level of performance by the CNP, who sought approval for intervention orders prior to initiating them independently. A limitation of the current study is the lack of measurement of confidence pre- and postsimulation, which could have been helpful in measuring impact.

Participant performance and feedback from the APP failure-to-rescue simulation sessions influenced structure and content changes to other oncology courses offered by the nursing education department, including the expert symptom management course and the oncologic emergencies course. A course on the fundamentals of acute and critical care oncology is under development at the authors' hospital. Patients with cancer are unique in that they are receiving care for cancer in addition to dealing with preexisting health issues. Healthcare providers are challenged daily with determining whether a patient is presenting in a typical fashion for his or her oncology diagnosis and treatment or is heading down a path to something more unexpected.

FIGURE 5. Advanced Practice Professional Oncology Failure-to-Rescue Simulation Note. Image courtesy of Ohio State University Wexner Medical Center. Used with permission.

Implications for Nursing

Simulation was shown to be an effective modality in the training of oncology APPs. A target goal of any teaching modality is to measure its effectiveness on participant learning and its impact at the patient level. Current practice demands have changed the model of care in oncology, and preparation of APPs should include individual and team-based learning programs that use simulation. The use of simulation to improve knowledge, skill acquisition, and teamwork and to standardize care is at the forefront of improved patient outcomes and safety initiatives, and it will continue to be integrated into education programs targeted to oncology healthcare providers at all levels. A multidimensional education program that incorporates team-based learning can provide specific experiences for the learner, improving retention of knowledge, competence, skill acquisition, and patient outcomes.

An important question that educators should ask is, "Does the addition of simulation to this program enhance participant outcomes?" Despite the realism of the mannequins, patients with cancer often present atypically for common emergencies, such as sepsis and respiratory failure, when compared with patients who do not have cancer. Respiratory distress may be difficult to simulate because of mechanical limitations in reproducing the work of breathing (i.e., use of accessory muscles, nasal flaring, and facial expressions). Therefore, clinical case presentation review still needs to be incorporated into learning for all levels of oncology healthcare providers.

Conclusions

The use of simulation in hospital-based education programs is limited. Although modern mannequins are easy to use, significant expertise and cost are required to implement simulation effectively in education programs. Other challenges for educators include balancing participant interest in simulation with potential learning outcomes, as well as the time required for the educator to develop, implement, and evaluate outcomes of this type of training.


Implications for Practice
• Invest in simulation as one option for clinical education of staff.
• Provide advanced practice professionals with education interventions designed to enhance skills for managing failing patients.
• Allow teams of oncology healthcare providers in all roles to use simulation for realistic clinical learning and evaluation.

The future of oncology education and use of simulation as a teaching modality at the authors’ hospital includes several innovative initiatives, such as the purchase of a midlevel simulator, the development of a year-long oncology advanced practice nurse fellowship, the revision of previous oncology courses to include simulation, and the continued use of the high-fidelity simulation center for multidisciplinary team training and performance assessment. Continued support from the medical and nursing leadership teams is vital for team training to include practitioners of all levels. Evaluating the effectiveness of such learning is key to the success of any program.

References

Allen, P., Lauchner, K., Bridges, R.A., Francis-Johnson, P., McBride, S.G., & Olivarez, A. (2008). Evaluating continuing competency: A challenge for nursing. Journal of Continuing Education in Nursing, 39, 81–85. doi:10.3928/00220124-20080201-02

Benner, P., Sutphen, M., Leonard, V., & Day, L. (2009). Educating nurses: A call for radical transformation. San Francisco, CA: Jossey-Bass.

Byrne, M. (2005). CCI Think Tank: The future of learning: Building a bridge between competency and patient safety. Retrieved from http://www.cc-institute.org/docs/default-document-library/2011/10/19/thinkTank_buildingABridge.pdf

Cooper, S., McConnell-Henry, T., Cant, R., Porter, J., Missen, K., Kinsman, L., . . . Scholes, J. (2011). Managing deteriorating patients: Registered nurses' performance in a simulated setting. Open Nursing Journal, 5, 120–126. doi:10.2174/1874434601105010120

Decker, S., Sportsman, S., Puetz, L., & Billings, L. (2008). The evolution of simulation and its contribution to competency. Journal of Continuing Education in Nursing, 39, 74–80. doi:10.3928/00220124-20080201-06

Decker, S., Utterback, V.A., Thomas, M.B., Mitchell, M., & Sportsman, S. (2011). Assessing continued competency through simulation: A call for stringent action. Nursing Education Perspectives, 32, 120–125. doi:10.5480/1536-5026-32.2.120

del Bueno, D. (2005). A crisis in critical thinking. Nursing Education Perspectives, 26, 278–282.

Endacott, R., Scholes, J., Cooper, S., McConnell-Henry, T., Porter, J., Missen, K., . . . Champion, R. (2012). Identifying patient deterioration: Using simulation and reflective interviewing to examine decision making skills in a rural hospital. International Journal of Nursing Studies, 49, 710–717.

Kaddoura, M.A. (2010). New graduate nurses' perceptions of the effects of clinical simulation on their critical thinking, learning, and confidence. Journal of Continuing Education in Nursing, 41, 506–516. doi:10.3928/00220124-20100701-02

Kilpatrick, K. (2013). How do nurse practitioners in acute care affect perceptions of team effectiveness? Journal of Clinical Nursing, 22, 2636–2647. doi:10.1111/jocn.12198

Klipfel, J.M., Gettman, M.T., Johnson, K.M., Olson, M.E., Derscheid, D.J., Maxson, P.M., . . . Vierstraete, H.T. (2011). Using high-fidelity simulation to develop nurse-physician teams. Journal of Continuing Education in Nursing, 42, 347–357. doi:10.3928/00220124-20110201-02

Lasater, K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46, 496–503.

Lasater, K. (2011). Clinical judgment: The last frontier for evaluation. Nurse Education in Practice, 11, 86–92.

McCorkle, R., Engelking, C., Lazenby, M., Davies, M.J., Ercolano, E., & Lyons, C.A. (2012). Perceptions of roles, practice patterns, and professional growth opportunities: Broadening the scope of advanced practice in oncology. Clinical Journal of Oncology Nursing, 16, 382–387. doi:10.1188/12.CJON.382-387

Murphy-Ende, K. (2002). Advanced practice nursing: Reflections on the past, issues for the future. Oncology Nursing Forum, 29, 106–112. doi:10.1188/02.ONF.106-112

Tanner, C.A. (2006). Thinking like a nurse: A research-based model of clinical judgment in nursing. Journal of Nursing Education, 45, 204–211.

Yuan, H.B., Williams, B.A., Fang, J.B., & Ye, Q.H. (2012). A systematic review of selected evidence on improving knowledge and skills through high-fidelity simulation. Nurse Education Today, 32, 294–298. doi:10.1016/j.nedt.2011.07.010


