Developmental Neurorehabilitation

ISSN: 1751-8423 (Print) 1751-8431 (Online) Journal homepage: http://www.tandfonline.com/loi/ipdr20

Developing authentic clinical simulations for effective listening and communication in pediatric rehabilitation service delivery

Gillian King, Tracy A. Shepherd, Michelle Servais, Colleen Willoughby, Linda Bolack, Deborah Strachan, Sheila Moodie, Patricia Baldwin, Kerry Knickle, Kathryn Parker, Diane Savage & Nancy McNaughton

Developmental Neurorehabilitation (2014). DOI: 10.3109/17518423.2014.989461

Published online: 30 Dec 2014.



Dev Neurorehabil, Early Online: 1–11. © 2014 Informa UK Ltd.

Developing authentic clinical simulations for effective listening and communication in pediatric rehabilitation service delivery

Gillian King1,2, Tracy A. Shepherd3,4, Michelle Servais3, Colleen Willoughby3, Linda Bolack3, Deborah Strachan5, Sheila Moodie6, Patricia Baldwin3, Kerry Knickle7, Kathryn Parker4, Diane Savage4, & Nancy McNaughton7

1Bloorview Research Institute, Toronto, Ontario, Canada; 2Department of Occupational Science and Occupational Therapy, University of Toronto, Toronto, Ontario, Canada; 3Thames Valley Children's Centre, London, Ontario, Canada; 4Holland Bloorview Kids Rehabilitation Hospital, Toronto, Ontario, Canada; 5Independent Consultant, Canada; 6School of Communication Sciences and Disorders, Western University, London, Ontario, Canada; 7Standardized Patient Program, University of Toronto, Toronto, Ontario, Canada


Abstract

Purpose: To describe the creation and validation of six simulations concerned with effective listening and interpersonal communication in pediatric rehabilitation. Methods and findings: The simulations involved clinicians from various disciplines, were based on clinical scenarios related to client issues, and reflected core aspects of listening/communication. Each simulation had a key learning objective, thus focusing clinicians on specific listening skills. The article outlines the process used to turn written scenarios into digital video simulations, including steps taken to establish content validity and authenticity, and to establish a series of videos based on the complexity of their learning objectives, given contextual factors and associated macrocognitive processes that influence the ability to listen. A complexity rating scale was developed and used to establish a gradient of easy/simple, intermediate, and hard/complex simulations. Conclusions: The development process exemplifies an evidence-based, integrated knowledge translation approach to the teaching and learning of listening and communication skills.

Keywords

Communication skills, digital videos, fidelity, listening, pediatric rehabilitation, scenario, simulation

History: Received 17 August 2014; Revised 12 November 2014; Accepted 15 November 2014; Published online 30 December 2014

Correspondence: Gillian King, Bloorview Research Institute, 150 Kilgour Road, Toronto, Ontario, Canada M4G 1R8. E-mail: gking27@uwo.ca

Introduction

There are a number of best practices for the use of simulations in interprofessional education (IPE), including those concerning the development and evaluation of simulations [1, 2]. Few articles, however, have addressed the process by which authentic, high-fidelity simulations are developed. In pediatric rehabilitation in particular, we are not aware of any articles that provide practical guidance on simulation development. The purpose of this article was therefore to share the experiences of an interdisciplinary team who created a series of digital video simulations as part of a research program addressing the use of IPE in developing pediatric rehabilitation clinicians' listening/communication skills. The article describes an integrated knowledge translation (KT) activity in which end users of the product (i.e. the simulations) were involved in the project from the very beginning as integral team members [3]. Our intent was to create an 'end knowledge' product that would assist in continuing professional development in the workplace, as well as in the education of students in professional programs in pediatric rehabilitation. In the following sections, we consider the importance of effective listening and interpersonal communication in the delivery of pediatric rehabilitation services and the nature of best practices in IPE simulation development. We then discuss the role of the simulations in our research program on clinical listening, before moving to a consideration of the multiple factors that guided the development process.

Clinical listening in pediatric rehabilitation

Listening and communication are widely considered to be essential to the success of interventions for children with disabilities and their families [4, 5]. Furthermore, they are a core competency for both medical and collaborative practice [6–8]. Practitioners' listening skills are essential to client satisfaction with services [9–11], integral to the development of positive client–practitioner relationships [12, 13], and most likely necessary for client engagement [14]; however, few simulation interventions have targeted the training of listening skills in healthcare practice. In medical education, articles addressing clinical skills have been published much less frequently than those examining practical procedures [2], and training in listening and communication skills is considered to be fragmentary and often accidental [9]. This work on simulation development builds on the research of an interdisciplinary team with a long-standing interest in the listening skills of pediatric rehabilitation clinicians. In the initial stages of our research, we developed and validated the Effective Listening and Interactive Communication Scale (ELICS), which assesses four core aspects of listening/communication identified in the literature: receptive, exploratory, consensus-oriented, and action-oriented listening
[15]. This scale is responsive to the effects of educational interventions, as it captured significant pre–post changes in listening skills associated with therapists' participation in an occupational therapy mentorship program [16]. We have also reported on the benefits of engaging healthcare professionals in interprofessional groups to discuss clinical scenarios focused on listening and communication [17].

There has been little published work on the use of simulations, particularly digital video simulations, to enhance the 'soft skills' of practicing pediatric clinicians. One study examined the use of simulated parents in developing pediatric medical residents' communication skills in giving bad news [18]. Downar et al. [19] used a workshop format, involving didactic teaching and live simulations with standardized family members, to teach communication skills to subspecialty ICU trainees. Another study relevant to children's services evaluated the effectiveness of teaching active listening skills to pre-service early childhood educators (i.e. students in a teacher preparation program) [20]. More recently, Hill et al. [21] assessed the foundational clinical skills of speech-language students, using a simulated clinical placement. In addition to speech-language competencies, this study assessed four professional competencies, including communication. Each of these studies addressed the development of the skills of trainees and graduate students rather than the skills of practicing clinicians. Furthermore, none of the studies addressed a complex skill by developing a series of simulations reflecting a gradient of complexity. In a recent review of the evidence for using simulation to teach communication skills to medical providers, Karkowsky and Chazotte [22] indicated that formal training in communication is rare for those in practice, and discussed the utility of simulation methodology. They pointed to the need to use standardized communication scales in order to move the field forward. In summary, to our knowledge, there has been no work on the use or development of listening/communication skill simulations for pediatric rehabilitation clinicians and/or students.

Effective listening and interpersonal communication (hereafter referred to as 'listening') is an 'uber' or 'umbrella' competency requiring the use of various skills, as well as the ability to know which skill to use under which circumstances. For example, there are times when receptive listening is no longer clinically productive and clinicians need to move a session along by engaging in exploratory listening. The training of listening skills therefore requires a series of simulations adequately covering the content domain of clinical listening in a particular intervention context. Accordingly, we set out to develop a series of easy/simple to hard/complex digital video simulations that reflected the interdisciplinary nature of pediatric rehabilitation service delivery and involved service providers from various disciplines, along with parents, youth with disabilities, and/or other stakeholders. We planned to use these simulations in a continuing education intervention for clinicians providing pediatric rehabilitation services.

Best practices in the development of IPE simulations

The medical simulation literature discusses best practices in developing and structuring simulations to enhance
student learning [1, 2, 23]. In developing our digital video simulations, we paid close attention to three features of these guidelines that deal with the design/development of simulations: (a) capturing clinical variation by developing simulations reflecting a range of situations and complexity levels [1, 2]; (b) ensuring simulation fidelity by obtaining expert opinion about the face validity (realism) of the simulations [23, 24]; and (c) promoting cognitive interactivity (i.e. cognitive engagement) through the use of simulations varying in degree of complexity, multiple repetitions, and intentional simulation sequencing [1]. Simulation complexity is important, as an appropriate level of difficulty is needed to challenge and engage learners [24]. Evidence supports including a range of difficulty as a best practice for simulation-based education [1, 2].

The theoretical basis of our work is noteworthy, as researchers have criticized the atheoretical nature of simulation efforts and have called for theoretically guided approaches to simulation use and development [25, 26]. The literature on best simulation practices is only beginning to consider the importance of adopting an explicit theoretical basis for the skill being addressed. In the case of a complex skill such as listening, it is important to establish content validity (addressed by appropriate coverage of a domain), along with face validity (addressed by capturing clinical variation, ensuring authenticity, and promoting clinical engagement).
The novelty of our approach concerns the depth to which we addressed the overall learning objective of enhancing clinical listening skills, including: the theoretical basis of our work; the establishment of learning objectives linked to items in a validated measure of clinical listening; the effort taken to establish complexity ratings for the simulations (based on the development of a complexity rating scale and ratings by multiple individuals); and the ultimate development of a graduated series of digital video simulations based on complexity levels.

Development of clinical listening simulations

Evidence-based communication skills training is necessary in pediatrics [18]. We developed the clinical listening simulations as the first step in a combined development and feasibility project preceding a planned randomized controlled trial of an evidence-based simulation intervention. The overall project objectives were: (a) to develop a series of high-fidelity digital videos of simulated clinician–client interactions exemplifying problematic listening-related situations in the pediatric intervention context, using standardized patients/clients, clinicians, and family members (SPs); and (b) to conduct a pilot study of a comprehensive intervention, which includes repeated exposure to a series of listening skill simulations, accompanied by guided debriefing in interprofessional discussion groups, and individualized solution-focused coaching designed to move skills into practice [27]. We felt that the use of simulations exemplifying the listening skills in the ELICS' conceptual framework would not only enhance content validity, but also promote competency by providing clinicians with a framework by which to understand listening skills and retrieve knowledge for application in practice [28].


Our work integrates five important academic functions of learning-based service delivery organizations: collaborative applied research, IPE, faculty development, collaborative practice, and KT [29]. The project is an example of a fully integrated work-related activity that involves a collaborative approach to research, seeks to develop tools to enhance IPE and support faculty development, incorporates interprofessional collaboration, and has a well-developed KT component. Furthermore, the project involves multiple collaborative processes, including co-producing knowledge, co-learning (among interprofessional team members), co-constructing meaning (in the generation of the clinical scenarios to be used in the simulations), co-using knowledge (throughout the entire research process), and sharing this knowledge [29].

This article focuses on the KT aspect of our work. The project exemplifies an integrated KT approach in which knowledge users are integral members of the endeavor. This ensured that the core aspects of effective listening were reflected authentically in scenario and simulation development. The experienced clinicians on the team (occupational, speech-language, and behavioral therapists) had the tacit knowledge needed to tailor the scenarios and resulting simulations to meet the learning needs of novice practitioners as well as the experiential needs of expert practitioners [30, 31]. The resulting digital videos will provide the means for clinicians to utilize evidence regarding the nature of effective listening skills in practice.

Article objective

The objective was to provide a full description of the process taken by an interdisciplinary team to turn a series of clinically- and theoretically-based listening scenarios into digital video simulations, including the steps taken to establish content validity and authenticity (i.e. how engaging and 'real-life' the scenarios are to practicing clinicians and clients), and to establish a graduated series of videos based on the complexity of their learning objectives. The intent was to provide practical guidance to those seeking information about how to develop quality digital video simulations.

Methods and findings

Ethics approval was obtained from Holland Bloorview Kids Rehabilitation Hospital. The research and development team (n = 12) included four clinician-researchers with educational and training roles, three researchers and professors, one pediatric rehabilitation manager and two senior directors, and two facilitator educators affiliated with the University of Toronto's Standardized Patient Program. Team members had backgrounds in education, psychology, speech-language pathology, occupational therapy, audiology, and social work, and experience in the conceptualization and measurement of listening skills, alternative dispute resolution, simulation methodology, and experiential learning. They also had experience in the development of innovative measures and frameworks to enhance the delivery of pediatric rehabilitation services, coaching in pediatric rehabilitation, KT, and the evaluation of educational interventions for healthcare professionals.


The experience of the clinical team members ranged from 20 to 34 years (M = 27 years). The primary goal was to turn a series of clinical scenarios into digital video simulations of 3–5 min in length. An additional goal was to ensure that the scenarios and resulting simulations represented a range of easy/simple, intermediate, and hard/complex levels of difficulty. Scenario development was an iterative process, as changes to one element of a scenario often necessitated other changes. As described below, a focus group was held to obtain information on how authenticity might be improved, and also on the complexity of the learning objectives of the scenarios.

Development of scenarios and learning objectives

Six scenarios were developed by four clinical team members with expertise in pediatric rehabilitation service delivery: a speech-language pathologist, two occupational therapists (one a senior manager directly managing clinicians), and a behavior therapist. One of these clinicians had specific training in simulation development, briefing, and debriefing. The scenarios were based on previous work [17]. Each scenario had a key learning opportunity and objective related to different critical listening skills contained in the ELICS [15]. The revision of the scenarios to create usable outlines for the digital videos entailed approximately 20 h of meetings over a one-year period, in which the learning objectives and SP roles were refined iteratively. Figure 1 shows the overall development process.

Figure 1. Process of developing the clinical listening simulations:
- Drafting scenarios based on theoretically-based constructs underlying the clinical skill, and ensuring clinical variation (scenarios were originally based on actual clinical examples)
- Preparing learning objectives for each scenario based on their key intents
- Preparing briefings for each scenario, based on clinical knowledge
- Preparing debriefing guidelines for each scenario
- Redrafting scenarios and briefing notes to reflect tightened objectives, address authenticity issues, and to be clearer for the standardized patients
- Examining scenario authenticity and rating simulation complexity

The original written scenarios were extensively revised to make them more specific, include clear learning objectives related to the ELICS, ensure clinical variation, create appropriate briefing and debriefing guidelines, establish authenticity, and tailor SP roles across the scenarios to contain costs. The SP roles described in the written scenarios provided behavioral examples to support the learning objectives, which were linked directly to the ELICS scales. Input was solicited from other expert clinicians and from methodology experts in educational simulation on the research team. SPs were then trained to perform the roles of clinicians, family members, and other stakeholders (e.g. a school principal, a family support worker) in the simulation videos. To ensure clinical variation, scenarios included clinicians from a broad range of disciplines, various pediatric clinical settings, and a variety of time frames (e.g. at the beginning or middle of the therapeutic relationship). Scenarios were of various levels of difficulty for the learners, based on the complexity of their learning objectives for an entry-level clinician (discussed below). Appendix 1 contains two scenario examples (one easy/simple, one hard/complex) to illustrate the gradation in complexity of learning objectives.

Development of briefing and debriefing guidelines

Clear briefing and debriefing guidelines were written, based on the learning objectives for each scenario, with consultation provided by facilitator educators from the SP program. The briefing guidelines were developed to ground the learner in the concept of listening and the learning objectives, and to focus the learner on the specific listening behaviors targeted in each scenario. The briefing guidelines also provided background information for the scenarios. The debriefing guidelines followed standard simulation debriefing practice by using key questions [32–34]. Examples of scenario background information and debriefing questions are provided in Appendix 1.

Establishing scenario authenticity

Development of the scenarios by a team of four clinical experts established a good degree of authenticity, since the scenarios were based on real-life clinical situations these therapists had encountered in practice. Further work was also conducted to ensure that the scenarios were realistic. This entailed a focus group with individuals not involved in developing the scenarios (discussed in detail in the following section), and feedback from the two simulation methodology experts on our team, who provided input into authenticity during the training sessions for the SPs (see section titled Production of digital videos). Through feedback from focus group participants and the training process, we obtained useful suggestions for verbal and nonverbal behaviors to be displayed in the scenarios, and for background information to be included in the briefs, such as why the issue was being raised at that point in time and the appropriate age for diagnosis or prescription of a device. Focus group members and simulation experts also suggested real-life dialogue for the parents and/or clinicians in the scenarios. All suggestions were discussed by the research team, with decisions for changes made by consensus during team meetings.

Establishing scenario authenticity through focus groups with parents and clinicians

We held a focus group to share the scenarios with parents and clinicians, and to get input on their authenticity and complexity. To ensure varied backgrounds and levels of expertise, we deliberately selected eight individuals to invite. Seven of these individuals attended a 2½ h focus group: two family support workers (parent mentors who were themselves parents of children with special needs), three practicing clinicians considered to be experts (a speech-language pathologist, social worker, and physical therapist), and two clinical novices (a newly graduated occupational therapist and an audiologist in graduate school). As desired, there was a good mix of perspectives with respect to disciplines, roles, and levels of expertise. The focus group was run by two study team members, with one team member facilitating the discussion and the other taking notes and contributing to the discussion as needed. Participants were welcomed to the focus group and provided with an agenda and guidelines (ground rules) for how the focus group would be run. Each of the six scenarios was then considered in turn. In each case, participants individually read the scenario and reflected on it, then engaged in discussion using standard questions. The standard questions asked for each scenario were: How well does this scenario reflect what would happen in an interaction (authenticity)? How would you rate this scenario in terms of the complexity of the situation as a whole for the clinician? The focus group participants were very cognitively engaged. They unanimously agreed that each scenario made
sense and could actually happen, and recommended using more general terms to make scenarios applicable to a broader audience (e.g. using the term 'family support worker' rather than 'parent mentor'). After the discussion, participants were asked to individually rank the complexity of the scenarios, from least to most complex. They remarked that this was challenging, and indicated that a number of factors would affect the complexity of the learning objectives, including situational factors such as the number of stakeholders with different agendas, the importance of the meeting outcome for the child, the emotional content of the interaction, how much the clinician had to adapt his/her approach or behavior, and the clinician's inferred comfort level. Based on this feedback, as well as feedback from the scenario development group, the research team decided that it was important to consider the learning objectives from the viewpoint of the novice learner considering the situation as a whole.

Production of digital videos

SPs were carefully selected and recruited by the SP program educators to portray each role. Two-hour training sessions for each scenario were held with the SPs, two clinical team members who had developed the scenarios, and the SP program facilitators. Training involved discussion of the briefing guidelines and role descriptions, as well as rehearsal to share and practice the simulations and roles. The scenarios were unscripted, leading to natural, spontaneous dialogue targeting the key behaviors. SPs creatively crafted their characters by asking questions of the developers to clarify their motivations and improve their understanding of the issues presented, and they were provided with feedback on their performance by the developers. SPs who took on clinician roles were provided with detailed information about the health and development conditions of clients in the scenarios, and were coached in how they should approach the situation.
In addition, between the training and the taping, SPs did further research on their roles and characters to ensure they had a full understanding. Trained SPs needed very little feedback about their emotional reactions and interactions in the situations, given their background and training with the SP program. The result was a co-constructed product, ensuring that the SPs could portray the roles and capture the specific learning objectives with observable statements and behavior. On a separate occasion, 2-h taping sessions were scheduled for each scenario. Sets were created to capture the specific nature of the rooms in which the interactions were to take place, and wardrobes were selected to match the SP roles. A two-camera setup was used to digitally record the simulations. Specific taping instructions were provided to ensure that affect and facial expressions were captured from SPs in key roles at specific times, using close ups and wider pan angles. Simulations were recorded in five or six takes. The development team then decided on the best versions of the simulations. These versions were edited by a media specialist and reviewed by the team until the product was complete. The digital videos ranged from 3 to 5 min in length, as intended. At this point, due to the input of the SPs in the simulations and the changes that evolved during the SP training process,
we realized that it would be more important to rate the complexity of the digital video simulations rather than the written scenarios: new complexities came to light, or were emphasized, during the training sessions and subsequent filming.

Evaluation of simulation complexity

Since our intent was to create a graduated series of simulations for an IPE intervention, we aimed to develop simulations ranging from more straightforward clinical situations to increasingly complex ones in which multiple interdependent and dynamic interactions and events were occurring, thus requiring a number of macrocognitive processes on the part of the learner. Although it is considered important to develop a range of training simulations to reflect the situational complexity of practice reality, and to engage learners at different levels of expertise [35], the literature provides no guidelines on how to do this. Best practice articles on the use of simulation in healthcare education focus on the development of simulations for a single medical technique or skill, such as radiograph interpretation or suturing, and simulation difficulty is considered in terms of aspects such as task pace [36]. In simulations of medical techniques, distraught family members are considered to obscure the learning objectives. In contrast, our focus was precisely on such situational factors (e.g. heightened emotions and disagreement) that make clinical listening more or less difficult. Accordingly, we developed an evolving list of complexity indicators suggested by the development team, the full research team, and focus group members. We then developed a complexity rating scale that could be used by others to establish simulation complexity. Based on complexity theory [37], we defined simulation complexity in terms of the extent to which the simulation situations contained elements, unpredictably associated with one another, that affected the ability to listen effectively.
We focused on the complexity of the situated learning objectives from the perspective of the novice learner, recognizing that clinical reasoning and decision-making skills differ between novice and more experienced professionals [38]. Several macrocognitive processes are considered to affect expert practice [35]. As applied to clinical listening in pediatric rehabilitation, these higher-order cognitive processes include (a) maintaining common ground (i.e. ensuring a foundation of common comprehension in the interaction); (b) making sense of various aspects of the interaction (e.g. dealing with the complexity of the child's behavioral/medical problems); (c) managing uncertainty, risk, and time pressures (e.g. the safety of the child, and the importance of the clinical situation for the child's outcomes); and (d) managing attention (which is influenced by the number of people in the interaction, the emotions of these individuals, and the dynamics between them). We considered these to be characteristics of the simulation situations that would place lesser to greater demands on the novice learner.

Development of a complexity rating scale

Ten indicators of scenario complexity were generated to reflect these macrocognitive processes, guided by information
from the focus group and our clinical team members. These indicators reflected management of the complexities and inherent uncertainty of real-world listening, and their presence was considered to make the application of listening skills more difficult. We developed a Complexity Rating Scale (see Appendix 2) to measure these indicators for each scenario. Creating the scale involved a series of team discussions; ten versions were drafted before we were satisfied with the end product. As described in the instructions (see Appendix 2), the person viewing the digital video and completing the scale needs to take the stance or mindset of a novice learner, making a gestalt judgment on each item. Total scores on the scale range from 10 to 60 (a 50-point scale). We created cutoffs based on a tertile split, with 10–25 points reflecting an easy/simple simulation, 26–44 points reflecting an intermediate simulation, and 45–60 points reflecting a hard/complex simulation. These cutoffs capture the upper and lower 16 points on the scale and the middle 18-point spread. Mean scores can be calculated for each of the four subscales: Maintaining Common Ground, Making Sense, Managing Uncertainty, and Managing Attention.

Determining the complexity ratings of the listening simulations

To determine the complexity of our simulations, eight team members watched the six digital videos and completed the Complexity Rating Scale independently after viewing each video. No discussion occurred among the team members during the rating process. The complexity ratings of each rater were then examined, and the ratings of two individuals were removed (these individuals consistently gave higher ratings than the other team members).
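For readers implementing a similar scheme, the tertile-based cutoffs described above can be sketched as a small helper function. This is an illustrative sketch only: the function name and error handling are ours, not part of the published Complexity Rating Scale materials.

```python
def classify_complexity(total_score: int) -> str:
    """Map a Complexity Rating Scale total (range 10-60) onto the
    tertile-based categories described in the text.

    Hypothetical helper: only the cutoff values come from the article.
    """
    if not 10 <= total_score <= 60:
        raise ValueError("total score must lie between 10 and 60")
    if total_score <= 25:
        return "easy/simple"      # 10-25 points
    if total_score <= 44:
        return "intermediate"     # 26-44 points
    return "hard/complex"         # 45-60 points
```

For example, a total of 30 would be classified as intermediate, while a total of 50 would be hard/complex.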
Table I provides the final complexity rating for each simulation (easy/simple, intermediate, or hard/complex), the mean total complexity score for the simulations, and the mean rating (and standard deviations) for the four subscales in the measure (Maintaining Common Ground, Making Sense, Managing Uncertainty, and Managing Attention).

Based on our cutoffs, the final complexity ratings indicated that one simulation was easy/simple (#1), four were intermediate (#2 to #5), and one was hard/complex (#6). These ratings closely matched our original intention of creating simulations spanning a range of complexity, with #1 intended to be the simplest and #6 the most complex. Four simulations were rated as originally intended (#1, #3, #5, and #6); two (#2 and #4) were originally intended to be easy/simple but were rated as intermediate. It should be noted that the development group had difficulty agreeing on the complexity level of the moderately challenging scenarios, as seen in the larger standard deviations of the mean total complexity scores for simulations #4 and #5 (third column from the left in Table I). The mean total complexity scores indicate that simulation #5 was less complex than intended, relative to the other simulations.

We visually compared the total complexity scores of the six simulations for two groups of raters: three clinical team members (two occupational therapists and a behavior therapist) who had been intensively involved in developing the scenarios/simulations, and three others who had not been involved (two researchers and one clinician). There was good agreement between the two groups (i.e. mean complexity scores within 5 points of one another on the 50-point scale), with one exception: simulation #5 was rated as appreciably less complex by the clinicians (M = 27.33) than by the researchers (M = 41.67). This may reflect bias on the part of the developers, or a lack of clinical expertise on the part of the comparison group.

In addition to overall complexity ratings and total complexity scores, the scale provides information about four aspects of complexity. The mean scores on these subscales all followed the overall gradient of the total complexity scores.
The mean scores across all scenarios indicate that Managing Attention received the highest rating (M = 3.81) whereas Managing Uncertainty received the lowest (M = 3.08). This suggests that Managing Attention might be the most difficult aspect of the simulations for learners.
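The aggregation procedure behind Table I (mean of summed scores across raters for each simulation, plus per-subscale mean ratings) can be sketched as follows. The subscale-to-item groupings follow the four sections of the scale in Appendix 2 (items 1–2, 3–4, 5–7, and 8–10); the function and data layout are our own illustration, not code from the study:

```python
from statistics import mean, stdev

# Item index ranges for the four subscales (0-based), per Appendix 2:
# A: items 1-2, B: items 3-4, C: items 5-7, D: items 8-10.
SUBSCALES = {
    "Maintaining Common Ground": range(0, 2),
    "Making Sense":              range(2, 4),
    "Managing Uncertainty":      range(4, 7),
    "Managing Attention":        range(7, 10),
}

def summarize(ratings_by_rater):
    """ratings_by_rater: one 10-item list of 1-6 ratings per rater.
    Returns (mean total score, SD of totals, per-subscale mean ratings)."""
    totals = [sum(r) for r in ratings_by_rater]
    subscale_means = {
        name: mean(mean(r[i] for i in idx) for r in ratings_by_rater)
        for name, idx in SUBSCALES.items()
    }
    return mean(totals), stdev(totals), subscale_means
```

Dropping the two consistently high raters before calling such a function would reproduce the filtering step described above.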

Table I. Complexity ratings for the six digital video simulations.

| Simulation number | Final complexity rating* | Mean total complexity score (out of 60)^a | Maintaining Common Ground^b | Making Sense^b | Managing Uncertainty^b | Managing Attention^b |
| --- | --- | --- | --- | --- | --- | --- |
| #1 | 1 | 13.00 (2.53) | 1.42 (0.58) | 1.33 (0.41) | 1.22 (0.27) | 1.28 (0.39) |
| #2 | 2 | 32.67 (4.23) | 3.33 (1.17) | 3.50 (0.45) | 2.56 (0.93) | 3.78 (0.72) |
| #3 | 2 | 37.50 (3.21) | 3.50 (1.26) | 3.25 (0.94) | 3.44 (0.75) | 4.56 (0.27) |
| #4 | 2 | 37.83 (9.11) | 3.25 (1.54) | 4.50 (1.00) | 3.28 (1.18) | 4.17 (0.69) |
| #5 | 2 | 34.50 (9.09) | 3.00 (1.00) | 4.00 (1.05) | 3.22 (1.07) | 3.61 (1.25) |
| #6 | 3 | 52.67 (2.34) | 5.42 (0.58) | 5.58 (0.20) | 4.78 (0.50) | 5.44 (0.40) |
| Mean overall | N/A | 34.69 (13.00) | 3.32 (1.55) | 3.69 (1.49) | 3.08 (1.33) | 3.81 (1.45) |

*1 = easy/simple; 2 = intermediate; 3 = hard/complex.
^a Calculated as the mean of summed scores over 10 items for each rater.
^b Mean (standard deviation).


Overall, the rating tool distinguishes between different levels of simulation complexity in the ways intended. It provides important information regarding the situational factors influencing the complexity of the learning objectives for the novice learner.


Discussion

This article described the process used to develop six IPE digital video simulations of clinical listening, designed to enhance learning by providing a gradient of complexity or difficulty for pediatric rehabilitation clinicians. In the development process, attention was paid to the concepts to be portrayed in the simulations, their authenticity in the eyes of multiple stakeholder groups, and their complexity. Thus, the development of the simulations followed best practices by capturing clinical variation across a range of complex situations, ensuring simulation fidelity, and promoting cognitive engagement [1, 2, 23]. This article contributes an in-depth understanding of procedures that could be used by others to establish authenticity, rate complexity, and design scenarios that illuminate the key elements required by SPs in the simulations. We recommend that others: use focus group methods to obtain broad perspectives; adopt the viewpoint of the novice clinician when establishing complexity; use a rating scale consisting of indicators of complexity; and, above all, work with experts in simulation development.

According to principles of active learning, it is essential to tailor educational methods to specific settings or circumstances [39]. In the present case, authenticity was ensured by a two-step process: scenario development by a group of practicing clinicians, followed by focus group discussion involving family support workers, expert practicing clinicians, and more novice clinicians, who provided important new perspectives. To increase the utility of focus group feedback, we suggest that focus groups be held while scenarios are at a draft stage. To increase shared understanding, we recommend cross-representation of team members in the scenario development and focus group phases.
We speculate that the use of focus groups to establish authenticity is particularly important when scenarios are developed by a group of less experienced clinicians representing a smaller range of disciplines.

Establishing complexity was a more difficult process, as there is little guidance in the literature on how to ensure a graduated series of simulations. We identified four types of situational factors affecting the complexity of clinical interactions, based on clinical experience and the theoretical notion of macrocognitive processes. We then established complexity ratings for the six simulations, based on the ratings of the full research team and focusing on the novice viewpoint. It is important to consider the complexity of the clinical interaction as a whole and to assess simulation complexity from the point of view of a novice clinician as the learner. For example, the emotional content of the interaction can greatly influence the clinician's ability to mindfully attend. In fact, a defining characteristic of clinical expertise is the ability to monitor and regulate attention, enabling distracting and nonessential information to be ignored [40]. Thus, a simulation that would appear quite straightforward to an expert might appear overwhelming to a novice practitioner. Since the goal of the digital video simulations was to enhance learning, the appropriate stance is that of the novice practitioner.

The indicators of situational complexity were: maintaining a common foundation of understanding; making sense of aspects of the interaction; managing uncertainty, risk, and time pressures; and managing attention under trying circumstances. These indicators reflect macrocognitive processes that affect the ability to listen effectively. Furthermore, all are relevant to any clinical encounter in pediatric rehabilitation service delivery. Accordingly, we recommend use of the Complexity Rating Scale (Appendix 2) to establish a gradient of simulation complexity, based on a theoretically and clinically grounded process.

In reflecting on the learning of the development team, it would have been helpful at the outset to have had a better understanding of the type of information needed by the SPs portraying the roles in the simulations, and of the need to be explicit about the characteristics required of these actors (e.g. gender, age, appearance). We learned that trained SPs are part of the creative process, bringing the scenarios to life and shaping the final product. Thus, scripting out a scenario is not necessary. All that is required is a brief background explanation of the roles of each individual in the scenario, an explanation of the specific disabilities being represented, and any clinical terms required for authenticity. For fidelity purposes, SPs need some coaching in the nonverbal behaviors or verbal statements required to be clinically accurate and to ensure that specific learning objectives are met. We began our development of the scenarios without the benefit of consultation with individuals with expertise in simulation methodology (i.e. those from an SP program). In the future, we would work with these individuals from the outset.
In co-constructing the simulations with SP experts, the development team learned many things that helped make the simulations successful, including the need to carefully select SPs to either avoid or conform to stereotypes, depending on the learning objectives. We also learned the importance of training alternate SPs so that taping could be completed in a cost-effective manner. Expertise is required in filming, camera angles, lighting, wardrobe, makeup, set design, and post-production of the videos. An SP program can provide all of these resources.

Conclusions

Although simulation development can be a lengthy process, we found it worthwhile to take the time to do this thoroughly and well, ensuring that development conformed to best practices and theory. By addressing development systematically, we discovered the necessity of establishing a theoretically meaningful gradient of complexity, in which macrocognitive processes reflecting the wider scenario circumstances were taken into account when establishing the complexity of the learning objectives. We learned not to focus narrowly on the skills to be learned, but rather on their application, which requires considering the complexity of the application context. We also learned the value of involving multiple groups and stakeholders, which ensured the clinical simulations were authentic and clearly tied to concrete learning objectives. Involving direct care providers and managers from different disciplines and organizations, with input from expert researchers and educators, provided a broader view when considering the novice learner's perspective.

A key conclusion is the importance of an interdisciplinary team effort in developing a series of authentic digital video simulations graduated in terms of their complexity. Our next step is to use these simulations in an educational intervention involving group discussions of the simulations combined with individual solution-focused coaching [27]. In that study, we will evaluate the utility of this intervention in enhancing learners' listening skills, as measured by the ELICS and role plays. We plan to conduct this work with novices (i.e. students and new graduates), as well as more experienced service providers. Others could use a similar process to develop effective, evidence-based interventions.

In summary, this work exemplifies an evidence-based approach to the teaching and learning of clinical listening skills. It also reflects an integrated KT perspective in the development of a clinically relevant knowledge product. Such authentic, high-fidelity simulations hold great promise for use in IPE, for continuing professional development in the workplace, and for the education of students in professional programs in pediatric rehabilitation.


Acknowledgements

We thank Madhu Pinto for her assistance as the project research coordinator, Greg Vanden Kroonenberg for taping and editing the digital videos, and Darlene Hubley for her advice on project design.

Declaration of interest

The authors report no declaration of interest. This study was supported by funding from a SIM-one Simulation Research and Innovation Grant and by Holland Bloorview Kids Rehabilitation Hospital. We thank Thames Valley Children's Centre, Western University, Holland Bloorview Kids Rehabilitation Hospital, and the University of Toronto's Standardized Patient Program for their in-kind support of the project.

References

1. Cook DA, Brydges R, Hamstra SJ, Zendejas B, Szostek JH, Wang AT, Erwin PJ, Hatala R. Comparative effectiveness of technology-enhanced simulation versus other instructional methods: A systematic review and meta-analysis. Simulation in Healthcare 2012;7(5):308–320.
2. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Medical Teacher 2005;27(1):10–28.
3. Canadian Institutes of Health Research. Guide to knowledge translation planning at CIHR: Integrated and end-of-grant approaches. Ottawa, ON: Government of Canada; 2012.
4. Dunst CJ, Trivette CM, Deal AG. Enabling and empowering families. Cambridge, MA: Brookline Books; 1988.
5. MacKean GL, Thurston WE, Scott CM. Bridging the divide between families and health professionals' perspectives on family-centred care. Health Expectations 2005;8:74–85.
6. Suter E, Arndt J, Arthur N, Parboosingh J, Taylor E, Deutschlander S. Role understanding and effective communication as core competencies for collaborative practice. Journal of Interprofessional Care 2009;23(1):41–51.
7. Accreditation Council for Graduate Medical Education website [Internet]. Available from: www.acgme.org; 2000 [last accessed 2 Aug 2003].
8. Office of Interprofessional Education. Advancing the interprofessional education curriculum 2009; 2008. Available from: http://www.ipe.utoronto.ca/std/docs/IPE%20Curriculum%20Overview%20FINAL%20oct%2028.pdf [last accessed 10 Aug 2014].
9. Anolli L, Vescovo A, Agliati A, Mantovani F, Zurloni V. Simulation-based training of communication and emotional competence for the improvement of physician-patient relationship. Annual Review of CyberTherapy and Telemedicine 2006;4:79–86.
10. Boudreau JD, Cassell E, Fuks A. Preparing medical students to become attentive listeners. Medical Teacher 2009;31(1):22–29.
11. Duffy FD, Gordon GH, Whelan G, Cole-Kelly K, Frankel R. Assessing competence in communication and interpersonal skills: The Kalamazoo II report. Academic Medicine 2004;79(6):495–507.
12. Moore TG, Larkin H. "More than my child's disability": A comprehensive review of family-centred practice and family experiences of early childhood intervention services. Melbourne, Victoria: Scope (Vic) Inc.; 2006.
13. King G, Batorowicz B, Shepherd TA. Expertise in research-informed clinical decision making: Working effectively with families of children with little or no functional speech. Evidence-Based Communication Assessment and Intervention 2008;2(2):106–116.
14. Ziviani J, Poulsen A, King G, Johnson D. Motivation and paediatric interventions: A predisposition, mechanism for change or outcome? [Letter to the editor]. Developmental Medicine and Child Neurology 2013;55(10):965–966.
15. King GA, Servais M, Bolack L, Shepherd TA, Willoughby C. Development of a measure to assess effective listening and interactive communication skills in the delivery of children's rehabilitation services. Disability and Rehabilitation 2011;34(6):459–469.
16. King G, Tam C, Fay L, Pilkington M, Servais M, Petrosian H. Evaluation of an occupational therapy mentoring program: Effects on therapists' skills and family-centered behavior. Physical & Occupational Therapy in Pediatrics 2011;31(3):245–262.
17. Shepherd T, King G, Servais M, Bolack L, Willoughby C. Clinical scenario discussions of listening in interprofessional health care teams. International Journal of Listening 2014;28(1):47–63.
18. Gough JK, Frydenberg AR, Donath SK, Marks MM. Simulated parents: Developing paediatric trainees' skills in giving bad news. Journal of Paediatrics and Child Health 2009;45(3):133–138.
19. Downar J, Knickle K, Granton JT, Hawryluck L. Using standardized family members to teach communication skills and ethical principles to critical care trainees. Critical Care Medicine 2012;40(6):1814–1819.
20. McNaughton D, Hamlin D, McCarthy J, Head-Reeves D, Schreiner M. Learning to listen: Teaching an active listening strategy to preservice education professionals. Topics in Early Childhood Special Education 2008;27(4):223–231.
21. Hill AE, Davidson BJ, McAllister S, Wright J, Theodoros DG. Assessment of student competency in a simulated speech-language pathology clinical placement. International Journal of Speech-Language Pathology 2014;16(5):464–475.
22. Karkowsky CE, Chazotte C. Simulation: Improving communication with patients. Seminars in Perinatology 2013;37:157–160.
23. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Medical Education 2009;44(1):50–63.
24. Brown FS. Nursing faculty beliefs and practices regarding debriefing human patient simulation experiences. Davis, CA: University of California; 2011.
25. Harris KR, Eccles DW, Ward P, Whyte J. A theoretical framework for simulation in nursing: Answering Schiavenato's call. The Journal of Nursing Education 2013;52(1):6–16.


26. Schiavenato M. Reevaluating simulation in nursing education: Beyond the human patient simulator. Journal of Nursing Education 2009;48:388–394.
27. Baldwin P, King G, Evans J, McDougall S, Tucker MA, Servais M. Solution-focused coaching in pediatric rehabilitation: An integrated model for practice. Physical & Occupational Therapy in Pediatrics 2013;33(4):467–483.
28. Bransford JD, Brown AL, Cocking RR. How people learn. Washington, DC: National Academy Press; 2000.
29. King G, Kingsnorth S, Parker K, Adamson K, Thomson N, Rothstein MG. An integrated model of functions to advance collaborative workplace learning in academic health sciences centers. Manuscript submitted for publication; 2014.
30. Salisbury M. A framework for collaborative knowledge creation. Knowledge Management Research and Practice 2008;6:214–224.
31. Stahl G. A model of collaborative knowledge-building. In: Fishman B, O'Connor-Divelbiss S, editors. Fourth international conference of the learning sciences. Mahwah, NJ: Lawrence Erlbaum; 2000. pp 70–77.
32. Fritzsche DJ, Leonard NH, Boscia MW, Anderson PH. Simulation debriefing procedures. Developments in Business Simulation and Experiential Learning 2004;31:337–338.
33. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There's no such thing as "nonjudgmental" debriefing: A theory and method for debriefing with good judgment. Simulation in Healthcare 2006;1(1):49–55.


34. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simulation in Healthcare 2007;2(2):115–125.
35. Schubert CC, Denmark TK, Crandall B, Grome A, Pappas J. Characterizing novice-expert differences in macrocognition: An exploratory study of cognitive work in the emergency department. Annals of Emergency Medicine 2013;61(1):96–109.
36. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: A best evidence practical guide. AMEE Guide No. 82. Medical Teacher 2013;35(10):e1511–e1530.
37. Stacey RD. The science of complexity: An alternative perspective for strategic change processes. Strategic Management Journal 1995;16(6):477–495.
38. Benner P, Hughes RG, Sutphen M. Chapter 6. Clinical reasoning, decision-making, and action: Thinking critically and clinically. In: Hughes RG, editor. Patient safety and quality: An evidence-based handbook for nurses. Rockville, MD: Agency for Healthcare Research and Quality; 2008.
39. Schuwirth LWT, Van Der Vleuten CPM. Changing education, changing assessment, changing research? Medical Education 2004;38(8):805–812.
40. Higgs J, Jones M. Clinical reasoning in the health professions. In: Higgs J, Jones M, editors. Clinical reasoning in the health professions. 2nd ed. Oxford: Butterworth Heinemann; 2000. pp 3–14.

Appendix 1

Table A1. Scenario examples.

Scenario #1 (easy/simple)
Background information: This appointment takes place in a clinic room with a preschool speech-language pathologist (SLP) and the child's mother. The SLP wants to present the results of a complete assessment and provide recommendations to work on receptive language development. Ashley is a 3-year-old girl with a speech and language disorder. She has receptive language delays and some articulation errors. Ashley's mother is most concerned that she does not pronounce her words properly.
Learning objectives (item numbers map onto items in the ELICS*):
Receptive listening — E10: try to be open to what people are saying to you; E16: try to be present in the moment with the person.
Exploratory listening — E9: provide information, education, and instruction.

Scenario #6 (hard/complex)
Background information: This appointment is the 6-month checkpoint at the neuromuscular clinic. The physiotherapist wants to talk about getting a wheelchair, which was briefly mentioned to the mother at the last appointment. The mother usually comes to appointments alone, but the father came to this one because the wheelchair was to be discussed. Toby is an 8-year-old boy with Duchenne muscular dystrophy. He is finding it difficult to walk long distances, and the physician and therapists at the neuromuscular clinic want to talk about him getting a wheelchair for long distances to reduce fatigue.
Learning objectives (item numbers map onto items in the ELICS*):
Receptive listening — E12: listen to what is not being said.
Exploratory listening — E9: provide information, education, and instruction; E21: feel you are able to identify a person's greatest worry or concern about an issue, and the reason why.

Debriefing questions: What did you notice? What did you observe? What about that? What ideas do you have? What is "real" about the situation? Were there opportunities to do something different? What stood out? What effect did that have? What do others think? Is there anything else that we have not touched on?

*Effective Listening and Interpersonal Communication Scale.


Appendix 2

Table A2. Complexity Rating Scale for clinical simulation situations.

Purpose: This scale is used to determine the level of complexity of a digital video simulation. The complexity of a clinical simulation for a learner is considered to be based on managing various macrocognitive processes that affect practice: (a) Maintaining Common Ground (i.e. making sure there is a common foundation of comprehension for all involved), (b) Sense Making (i.e. ensuring that aspects of the interaction make sense), (c) Managing Uncertainty, Risk, and Time Pressures, and (d) Managing Attention. Thus, a highly complex situation would involve differing perspectives, discordant information, a high degree of uncertainty, and difficulty managing attention due to high emotion and many stakeholders. This scale therefore measures the complexity of simulation situations with respect to these factors.

Instructions: Adopt the stance of an observer of the situation, with no particular role to play. Put yourself into the mindset of a novice learner and consider the situation as a whole, not the skills of the clinicians in the situation. Circle the number that most accurately reflects your agreement with each item. For example, for item 1, consider the extent to which the simulation provides clear and unambiguous information for the novice learner: if it clearly provides relevant information, circle 1 (strong agreement with the left-hand statement); if relevant information is difficult to see or is obtained only with difficulty, circle 6 (strong agreement with the right-hand statement). It is best to base your rating on your first impressions.

From the viewpoint of a novice learner, rate each item from 1 (strongly agree with the left-hand statement) to 6 (strongly agree with the right-hand statement).

A. With respect to demands for maintaining common ground, the simulation situation . . .
1. 1 = Provides relevant information (i.e. child/family/others provide information clearly); 6 = Does not provide relevant information (e.g. information is not clear; information is difficult to obtain)
2. 1 = Unfolds as one would expect (i.e. a straightforward, orderly situation); 6 = Does not unfold as one would expect (i.e. the situation is not straightforward)

B. With respect to demands for making sense of the situation, the simulation situation . . .
3. 1 = Involves simple client issues (i.e. simple health/behavior, functioning, and/or participation issues or problems); 6 = Involves difficult client issues (i.e. complex health/behavior, functioning, and/or participation issues or problems)
4. 1 = Does not require adjustments to plans based on others' desires (i.e. the clinician, family, and/or others share the same viewpoint); 6 = Requires adjustments to plans based on others' desires (i.e. family/others have a totally different viewpoint and/or priorities, and the clinician must dynamically adjust plans)

C. With respect to demands for managing uncertainty, risk, and time pressures, the simulation situation . . .
5. 1 = Involves few safety risks for the child/youth (i.e. it is very apparent that there are few or no safety risks); 6 = Involves significant or ambiguous safety risks for the child/youth (i.e. a high-risk situation, or it is difficult to determine whether a safety risk is present)
6. 1 = Involves child- and family-centered outcomes (e.g. it is easy to consider the physical and emotional needs of the child/family); 6 = Does not involve child- and family-centered outcomes (e.g. child/family "needs" have a negative impact on desired clinical outcomes)
7. 1 = Involves providing service in a manner appropriate to the child's/family's readiness (e.g. few or no time pressures); 6 = Does not involve providing service in a manner appropriate to the child's/family's readiness (e.g. time-sensitive pressures drive services)

D. With respect to demands for managing attention, the simulation situation . . .
8. 1 = Involves easily managed viewpoints/perspectives (i.e. positive collaboration is easy); 6 = Involves difficulties in managing viewpoints/perspectives (e.g. there are differing viewpoints; people are unwilling to cooperate; people are not open to other perspectives)
9. 1 = Involves few problems and is a low-emotion situation (i.e. there are few stakeholders, few problems, and low levels of emotion); 6 = Involves difficult problems involving multiple stakeholders and is a highly emotional situation (i.e. very problematic and/or highly emotional; multiple stakeholders)
10. 1 = Involves a highly engaged client (i.e. the child/family/other is highly motivated and engaged in the therapy process); 6 = Involves a disengaged or resistant client (i.e. the child/family/other has low motivation, is disengaged from the therapy process, or has different priorities)


Scoring the complexity

Add up the total score (questions 1 through 10): _________.


Interpreting the score

Find where the total score lies to determine the complexity of the simulation situation:
10–25 points = Level 1: Easy/Simple (10 is the minimum score)
26–44 points = Level 2: Intermediate
45–60 points = Level 3: Hard/Complex (60 is the maximum score)
