Developmental Neurorehabilitation

ISSN: 1751-8423 (Print) 1751-8431 (Online) Journal homepage: http://www.tandfonline.com/loi/ipdr20

A listening skill educational intervention for pediatric rehabilitation clinicians: A mixed-methods pilot study

Gillian King, Michelle Servais, Tracy A. Shepherd, Colleen Willoughby, Linda Bolack, Sheila Moodie, Patricia Baldwin, Deborah Strachan, Kerry Knickle, Madhu Pinto, Kathryn Parker & Nancy McNaughton

Developmental Neurorehabilitation (2015). To link to this article: http://dx.doi.org/10.3109/17518423.2015.1063731

Published online: 25 Aug 2015.


Dev Neurorehabil, Early Online: 1–13. © 2015 Informa UK Ltd. DOI: 10.3109/17518423.2015.1063731

ORIGINAL ARTICLE

A listening skill educational intervention for pediatric rehabilitation clinicians: A mixed-methods pilot study

Gillian King1, Michelle Servais2, Tracy A. Shepherd2,3, Colleen Willoughby2, Linda Bolack2, Sheila Moodie4, Patricia Baldwin2, Deborah Strachan5, Kerry Knickle6, Madhu Pinto1, Kathryn Parker3, & Nancy McNaughton6

1Bloorview Research Institute, University of Toronto, Toronto, Ontario, Canada, 2Thames Valley Children’s Centre, London, Ontario, Canada, 3Holland Bloorview Kids Rehabilitation Hospital, Toronto, Ontario, Canada, 4School of Communication Sciences and Disorders, Western University, London, Ontario, Canada, 5Independent Consultant, London, Ontario, Canada, and 6Standardized Patient Program, University of Toronto, Toronto, Ontario, Canada

Abstract

Objective: To prepare for an RCT by examining the effects of an educational intervention on the listening skills of pediatric rehabilitation clinicians, piloting study procedures, and investigating participants’ learning experiences. Methods: Six experienced clinicians received the intervention, consisting of video simulations and solution-focused coaching regarding personal listening goals. Self- and observer-rated measures of listening skill were completed and qualitative information was gathered in interviews and a member checking session. Results: Significant change on self-reported listening skills was found from pre- to post-test and/or follow-up. The pilot provided useful information to improve the study protocol, including the addition of an initial orientation to listening skills. Participants found the intervention to be a highly valuable and intense learning experience, and reported immediate changes to their clinical and interprofessional practice. Conclusion: The educational intervention has the potential to be an effective means to enhance the listening skills of practicing pediatric rehabilitation clinicians.

Keywords

Coaching, communication skills, intervention, pediatric rehabilitation, listening, professional development, simulation

Introduction

This pilot study examined the impact and procedures of an innovative interprofessional educational intervention designed to enhance the listening skills of pediatric rehabilitation clinicians who deliver services to children with disabilities and their families. Interprofessional education is a growing and important field, yet little work has targeted the training of clinical listening skills in a systematic and evidence-based manner. The pilot study was conducted to prepare for a randomized controlled trial (RCT) of a comprehensive educational intervention employing interprofessional group discussions of video simulations and individual solution-focused coaching concerning listening goals. To set the stage for the pilot, we consider the importance of listening skills in pediatric rehabilitation, and the role of interprofessional workplace opportunities in ensuring the development of these crucial interpersonal skills. We then describe the nature of the educational intervention, which was designed by an interprofessional team of clinicians, researchers, and educators with a long-standing interest in enhancing clinical competencies related to relational, goal-oriented practice [1].

Correspondence: Gillian King. E-mail: [email protected]

History: Received 25 February 2015; Revised 15 June 2015; Accepted 15 June 2015; Published online 29 July 2015.

Importance of listening skills in pediatric rehabilitation

Listening and communication skills (hereafter referred to as ‘listening’) are fundamental to best clinical practice. They are integral to developing relationships with clients, and thereby essential to successful interventions for children with disabilities, as well as to client and family satisfaction and engagement in the intervention process [2–4]. In addition, listening and communication skills are a core competency for collaborative practice [5, 6]. We view effective clinical listening as purposeful, goal-oriented, and relational in nature [1, 7]. Good clinical listening is non-judgmental, active, and intended to bring about various outcomes, including gathering information about the client situation and priorities, creating relationships, ensuring clients feel listened to and understood, and co-creating goals and action plans. Reflecting a functional viewpoint, good clinical listening provides clarity, a sense of mutual meaning, affirmation, and a sense of direction for intervention [8–10].

The role of interprofessional workplace learning opportunities

Despite the unquestioned importance of listening skills and their day-to-day clinical relevance, they are often taken
for granted. Clinical service organizations typically provide professional development opportunities addressing technical knowledge [11] and ignore the development of generic, transdisciplinary competencies needed to work in an increasingly demanding and complex service environment [12]. There is, however, increasing awareness of the importance of interprofessional education in assisting clinicians with the development of relational skills such as effective listening. There is recognition that health care organizations should ideally provide an array of learning opportunities, including those for deliberate practice, interprofessional discussion and dialog, authentic and immediate feedback on performance, and self- and guided reflection [12]. In this context, simulations are seen as a highly useful educational tool [13]. There have been few evaluations of interprofessional educational interventions designed to enhance the listening skills of practicing clinicians, particularly in pediatric rehabilitation. Listening skill interventions are typically discipline-specific and targeted towards medical or nursing residents or trainees. Discipline-specific interventions for trainees include the use of simulated parents to develop pediatric medical residents’ communication skills in giving bad news [14], use of teaching and live simulations with standardized family members to teach communication skills to sub-specialty ICU trainees [15], and use of a simulated clinical placement to assess the professional competencies, including communication skills, of speech-language pathology students [16]. With respect to practicing clinicians, a recent RCT examined the effects on nurses’ communication competence of a scenario-based simulation training course (including instruction and group discussion of video simulations involving a standardized stroke patient and family) [17]. 
Although this RCT found no significant difference in communication performance compared to traditional course training, nurses reported the scenario-based simulation training to be more effective. A study by Watters et al. [13] found that simulation training in communication (along with managing emergency situations, teamwork, and leadership) enhanced the self-efficacy of doctors and nurses. In contrast to these previous studies, we used video simulations as one part of a comprehensive educational strategy, and we targeted practicing clinicians from multiple disciplines, thus ensuring broad applicability to pediatric rehabilitation. Our collaborative partnership began with the development of a self-report measure, the Effective Listening and Interactive Communication Scale (ELICS), which assesses receptive, exploratory, consensus-oriented, and action-oriented listening [7]. These subscales can be considered as listening stances (intents) that clinicians use to achieve various aims in clinical encounters. We then used the ELICS to evaluate the effects of an occupational therapy mentorship program [18], and demonstrated the utility of interprofessional group discussions of listening scenarios [19]. To prepare for the pilot study reported here, we transformed six clinical listening scenarios into 3- to 5-minute video simulations involving interactions among ‘standardized patients’ in the roles of family members, clients (parents, parents and youth), and other stakeholders such as teachers and parent mentors (hereafter referred to as ‘standardized
clients’). We followed best practices for simulation development in interprofessional education by establishing the authenticity of these video simulations using feedback from parents, clinicians, students, and standardized clients, and ensuring that the simulations represented a range of difficulty levels [20].

Nature of the listening skill educational intervention

We adopted an explicit theoretical basis to the intervention, with learning objectives based on concepts in a published measure of clinical listening skills [7]. The intervention was a structured formal opportunity [12] consisting of two main educational strategies: interprofessional group discussion of a series of authentic video simulations concerning listening scenarios varying in complexity levels, combined with individualized solution-focused coaching designed to promote reflection and facilitate knowledge translation into practice. This composite intervention integrates the features of high-fidelity simulations [21, 22] with the benefits of interprofessional group discussions and individualized coaching. The intervention was evidence-based in that we followed best practices in simulation development in creating our video simulations [20]. We also based the coaching component on a conceptual framework of solution-focused coaching [23]. The intervention constitutes a multifaceted learning opportunity involving experiential, instructional, and observational learning through interprofessional dialog [24] and coaching [25]. Interprofessional dialog is considered to promote wider perspectives on issues and enable clinicians to see blind spots created by disciplinary perspectives [24]. Solution-focused coaching is an extremely promising intervention that is being increasingly applied in health care, particularly as a way to effectively engage clients in the change process [23, 26].
Solution-focused coaching ‘uses positive reframing and strategic questions to assist clients in envisioning a preferred future and developing practical solutions to move toward their vision’ (p. 468) [23]. The intervention also provides multiple embedded opportunities for self- and guided reflection on performance [27], deliberate practice [28], and feedback on performance [29]. Reflection has long been considered a fundamental process by which clinicians develop expertise [30]; the knowledge and awareness that arise through reflection are considered to be necessary for the development of expertise [31].

Pilot study objectives

Pilot studies are small-scale preliminary studies with varying objectives, which can include estimating effect size, determining recruitment rates, and evaluating feasibility [32, 33]. Determining feasibility of intervention and measurement procedures is done to enhance the likelihood of success of a future main study, and is particularly important when complex interventions are employed [34]. The overall aim was to conduct a pilot study in preparation for an RCT of the listening skill educational intervention. There were three specific objectives. The first was to take a preliminary look at the effects of the intervention on participants’ listening skills, as measured by self-report at three points in time and an observer-rated measure of
listening skill behaviors. The second objective was to investigate feasibility, using end-of-study interviews to ensure methods were sound and work out kinks in the protocol. We also trialed an observational rubric assessing participants’ end-of-study display of listening skills. The third objective was to investigate participants’ learning experiences and perceptions of learning benefits, using end-of-study interviews. The criteria for judging the pilot to be a success were: evidence of positive impacts of the intervention on listening skills, evidence that the protocol was feasible with modifications [32], and evidence that participants saw learning benefits.

Methods


Overview of pilot study design

The study utilized a mixed-methods approach, with learners followed over the course of the educational intervention. We adopted an adaptive pilot design [32], in which concurrent changes are made to study procedures. We used a ‘concurrent triangulation’ approach, in which quantitative and qualitative data are collected and equal weight is given to each [35, 36]. We conceptually integrated the qualitative and quantitative findings post data collection and analysis – at the interpretation stage [35]. As shown in Figure 1, there were three major components to the educational intervention. First, participants took part in interprofessional group discussions of aspects of listening exhibited in simulation videos, followed by debriefing to
heighten awareness of listening skills. They then took part in individual coaching sessions utilizing a solution-focused approach, to encourage personal reflection and the development of personal plans for changes to practice. Participants then took part in another round of video discussion and coaching. The final component was a live simulation day, in which participants were videotaped taking part in simulations with standardized clients to capture displayed listening skills. Figure 1 indicates the four measurement time points: before the start of the intervention (Time 1); on the morning before the start of the live simulation day (Time 2); two weeks later (Time 3); and at a member checking session four months later (Time 4). In accordance with our adaptive design, the member checking session was added based on feedback from participants that they would appreciate a final session to obtain a sense of closure on the study, and also on our desire to obtain participants’ perceptions of the emerging study themes concerning learning benefits and changes to the study protocol.

Participants and recruitment

Study approval was obtained from the Research Ethics Board of Holland Bloorview Kids Rehabilitation Hospital. The inclusion criteria for participants were: (a) currently practicing as a physical/occupational/behavior therapist or speech-language pathologist, (b) engaged in a facilitator role with families (i.e., directly involved in client care and decision making), and (c) working at one of two centers providing pediatric rehabilitation services. Clinicians received an email invitation from the study research assistant (forwarded to them by their managers), with the following attachments: a study information flyer and letter, consent form, Background Information Form, and Self-Rating Scale of Listening and Interpersonal Communication Skills (described below).
Interested clinicians were also asked to forward an information letter and Peer-Nomination Scale of Listening and Interpersonal Communication Skills to three peers. Completed forms were sent to the research assistant.

Procedure


The intervention consisted of four sessions: two sessions of group video observation and discussion (one month apart), each followed (one week later) by individual solution-focused coaching (Figure 1).

Video discussion sessions


Figure 1. Intervention Design. Note: Aspects of the intervention are bolded.

These involved six 3- to 5-minute listening-related videos within a pediatric intervention context [see 20]. The situations were authentic, as determined by clinical team members and members of a focus group [20]. High-quality videos were created with standardized clients, using multiple takes from different camera angles to capture facial expressions and affect. Each situation was based on key learning objectives related to listening, and rated as easy, intermediate, or hard/complex, based on a 50-point rating scale capturing 10 indicators of scenario complexity [20]. As described in detail in a previous article [20], the complexity scale assessed the degree to which various situational factors (e.g., heightened
emotions and disagreement, time pressures, the complexity of the child’s problems) affected the ability to listen effectively. Participants were provided with a brief before watching each video. Afterwards, they engaged in a facilitated group discussion with three debriefers from the research team. Open-ended questions were used to focus the discussion on the learning objectives; discussion lasted 20–25 minutes for each video.
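The complexity banding described above can be sketched in code. This is a minimal illustration only: the article reports a 50-point scale over 10 indicators [20], but the per-indicator range (assumed here to be 1–5) and the easy/intermediate/hard cut-offs are hypothetical, not values from the published scale.

```python
# Hypothetical sketch of the 50-point scenario-complexity rating:
# 10 situational indicators, each ASSUMED to be scored 1-5 (so totals
# run 10-50). The band cut-offs below are illustrative assumptions.

def complexity_band(indicator_scores):
    """Classify a scenario from its 10 complexity-indicator ratings."""
    if len(indicator_scores) != 10:
        raise ValueError("expected ratings for 10 complexity indicators")
    if any(not 1 <= s <= 5 for s in indicator_scores):
        raise ValueError("each indicator is assumed to be rated 1-5")
    total = sum(indicator_scores)   # 10 (simplest) .. 50 (most complex)
    if total <= 23:                 # hypothetical cut-off
        return "easy"
    if total <= 36:                 # hypothetical cut-off
        return "intermediate"
    return "hard/complex"
```

Under this sketch, a scenario combining heightened emotions, time pressure, and a complex presenting problem would score high on several indicators and fall in the hard/complex band.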


Solution-focused coaching sessions

Coaching was provided by a member of the research team, a clinical practice coach and certified solution-focused practitioner with extensive experience in training and coaching pediatric rehabilitation clinicians. Solution-focused strategies included the use of compliments, outcome questions, scaling questions, exception questions, looking for strengths and solutions, and assignment of a ‘homework’ task [37]. Two individual coaching sessions were provided (described below), each lasting 45–60 minutes. Each session was audio recorded to verify coaching fidelity. Coaching questions were formulated using the participant’s language (metaphors, analogies, use of words), hopes, priorities, and readiness regarding listening skill goals and plans. The first session focused on participants’ goal setting and plan of action related to their listening skills. Following a solution-focused approach [23], participants were asked about their hopes for participating in the research project, with micro questions used to probe for details (e.g., What will be different when you can do this? What will other people (clients/colleagues/supervisor) say they notice is different when you are doing this?). In the second session, the coach facilitated a conversation that supported participants to reflect upon their experiences and goals/plans from the first session and to develop plans for goal progression. Questions used in this session included: What’s better since our last session together? (How did that happen? What did you do/not do?); What surprised or perhaps challenged you? (How did you manage given what happened?); What are you noticing about your listening skills as a result of these experiences?; Given your experiences, what might be some next steps as you move forward? (What/who might be helpful for you in that? How might you do that?).
Live simulation day

The live simulations were held at a rehabilitation treatment center, utilizing actors from the University of Toronto’s Standardized Patient Program. The day consisted of the following sessions: a brief group introduction; three 40-minute simulations, each followed by 40 minutes for individual reflection; a lunch break; and a final debriefing session with all participants, debriefers, and standardized clients. During the introduction session, participants completed the ELICS and learned about the plans for the day. The simulation sessions consisted of a live simulation (10–15 minutes) and a debriefing with two debriefers, including feedback from the standardized clients (15–20 minutes). Debriefers and standardized clients then completed the ELICS-Assessment Rubric to rate participants’ listening skills.


The live simulations were designed to be challenging, reflecting intermediate to hard levels of complexity. The live simulations were similar to the video simulations in their focus on problematic listening scenarios; the difference was the change in participant role from observer to actor. Before each session, participants received a brief summary to contextualize the rehabilitation scenario. They were asked to participate in the session as they would in their regular clinical role/discipline. The simulations had varying objectives concerning the four listening skills and involved two to four standardized clients. The intervention was originally designed so that participants would move from less complex to more complex simulations. However, for feasibility reasons (to keep the intervention to one day) and consistency reasons (to use only one simulation team per scenario), participants were rotated through simulations, with some starting on the more challenging scenarios. After each live simulation, participants moved to separate private rooms for a period of reflection on what they had just experienced. They were given a sheet of paper with the following instructions: ‘The following questions are meant to help you reflect on your learning experience with the live simulation that just occurred. Take a few minutes to jot down your thoughts (point form is fine)’. The sheets of paper included the following ‘critical reflection on practice’ questions: What did you feel went well with your listening in this simulation? What did you learn from participating in this simulation? Can you describe any new understandings or insights that you may have had about yourself or your practice? Considering future similar situations in your practice: What will you continue doing? What, if anything, will you do differently? Information from the written reflections is not included in this article, as we did not originally have ethics approval to use these data. 
When we examined the reflections, we found them to be very rich and subsequently have obtained ethical approval to prepare a separate article on the findings. Finally, the six participants and nine standardized clients met as a group with the six debriefers for 40 minutes at the end of the day to discuss their experiences and perceptions of the simulations. Two research team members were also present in this face-to-face session. The debriefers led the session, asking the participants and standardized clients for their comments, observations, and thoughts about the day. The study participants asked questions about particular scenarios and commented about how real the simulations felt and about how difficult some of the simulations were. The debriefers thanked the participants and acknowledged people’s contributions to the general discussion. The study participants expressed appreciation about the way the standardized clients provided feedback.

End-of-study interviews

Semi-structured interviews were conducted by the research assistant to obtain feedback about the study procedures and participant experience. The majority (n = 5) occurred two weeks after the live simulation day; one was conducted three weeks afterwards. Three interviews were face-to-face and three were conducted by phone. The interviews lasted about
1 hour on average (range = 44–64 minutes) and were audio recorded. The recordings were professionally transcribed with identifying information removed. Participants were asked questions about the study as a whole, such as: Can you tell me your thoughts and reactions to the whole study process? What worked well? What did not work? What could be improved? They were also asked about specific parts of the intervention, for example: What worked or did not work with regard to the videos as a learning tool? Tell me your thoughts and impressions of the individualized coaching sessions.


Member checking session

A member checking session was held approximately four months after the end-of-study interviews. All participants attended. Two researchers guided the meeting with a third taking written notes for later analysis. The objectives were to provide participants with preliminary study results to ensure accuracy of data interpretation by the research team; obtain feedback on the study protocol and processes (value of interprofessional video discussions and debriefing, the coaching sessions, and the live simulation day); and obtain participants’ recommendations for study design modifications.

Analysis

To meet Objectives 2 and 3 (participants’ perceptions of procedures and their learning experiences, respectively), we conducted a thematic analysis [38] based on information from the interviews and group member checking session. Thematic analysis is a widely used qualitative analytic method for identifying, analyzing, and reporting patterns (themes) within data. This approach allowed us to focus on the particular data we needed to address our objectives [39]. Initially, one team member pulled preliminary themes from the qualitative interviews for presentation at the member checking session. These themes concerned the overall experience, overall study procedures, and perceptions of each component of the educational intervention (i.e., videos and debrief sessions, coaching sessions, and the live simulation day). In the member checking session, participants confirmed the emerging ideas and elaborated on some aspects, captured in notes taken at the meeting. Next, the interviews were read in their entirety by four team members (three researchers and one clinician), with each person independently generating ideas for discussion. These team members then met to extract interview themes, using a process of consensus.
The group agreed on key themes concerning participants’ experiences, overall study procedures, and the videos and discussion, coaching, and live simulation day. The lead author then combined the interview themes with the member checking notes, creating a table of overall study themes. These were discussed at a full team meeting, where further endorsement was obtained, including agreement from the research assistant who conducted the interviews.

Measures

Background Information Form

This form, used in previous studies [7, 18], provided information about clinicians’ age, education, discipline,
years of employment at the center, years in clinical practice, and caseload experience.

Self-Rating Scale of Listening and Interactive Communication Skills

This self-categorization scale provides a rating (developing listener, good listener, highly skilled listener), based on a definition referring to skills and outcomes [7]. The definition is as follows: ‘Highly skilled listeners/communicators listen to clients and other team members mindfully, sensitively, and with intent. These listeners convey their understanding back to the client to ensure its accuracy. They use this co-created understanding to create a shared vision, which is used to facilitate the change process. Highly skilled listeners ask the right questions at the right time in order to gain a complete understanding as to what is needed when working toward optimal outcomes. They establish a relationship in order to develop trust. They are alert for new information, which indicates when a change in plan is necessary’. Clinicians are asked to select the category that best describes them, with a ‘developing listener’ defined as a clinician just beginning to develop these skills, a ‘good listener’ defined as one who has some, but not all of the described skills, and a ‘highly skilled listener’ defined as a clinician who practices listening skills consistently.

Peer-Nomination Scale of Listening and Interactive Communication Skills

Similar to the self-rating scale, this peer-nomination scale provides a rating of developing listener, good listener, or highly skilled listener, based on the same definition [7]. In this study, we determined a peer rating based on the consensus or majority of the peer nominations.

Effective Listening and Interactive Communication Scale (ELICS)

This is a valid self-assessment of listening and communication skills in the context of pediatric rehabilitation practice.
This 24-item scale measures: Receptive Listening (mindful attention to understand the client’s situation), Exploratory Listening (dialog to elicit information and establish clarity about issues), Consensus-oriented Listening (brainstorming and explanation of rationales to establish shared understanding and jointly determined goals), and Action-oriented Listening (supporting and enabling clients to establish actions toward desired outcomes). Items are rated on a 7-point scale with all points labeled (7 = to a very great extent; 4 = to a moderate extent; 1 = not at all). Internal consistency reliabilities range from very good to excellent (0.78 to 0.90) [7]. The clinical responsiveness of the ELICS to change over time has been demonstrated in an 11-month facilitated, collaborative group mentorship intervention [18].

Effective Listening and Interactive Communication Scale-Assessment Rubric (ELICS-AR)

A rubric is a tool used to rate the performance of an individual on a number of criteria or dimensions, including the competency-based performance of professionals (see, for example, the Interprofessional Collaborator Assessment
Rubric [40]). The ELICS-AR (see Appendix 1) was developed as a tool for simulation observers to rate participants’ displayed listening skills. The 24 items in the ELICS-AR are arranged by the four ELICS listening scales. Respondents use a 5-point Likert scale (0 = not at all, 1 = rarely, 2 = occasionally, 3 = frequently, and 4 = consistently) to rate the listener’s performance on each item. The research team trialed the ELICS-AR using the six listening videos, and found the rubric to be a useful tool for rating clinicians’ listening skills.
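To make the rubric arithmetic concrete, here is a minimal scoring sketch. The even six-item split per scale and the use of per-scale means are our illustrative assumptions; the article specifies only that the 24 items are arranged by the four ELICS scales and rated 0–4.

```python
from statistics import mean

# Illustrative ELICS-AR scoring: 24 items rated 0-4 (0 = not at all,
# 4 = consistently), grouped by the four ELICS listening scales.
# The 6-items-per-scale split and per-scale means are ASSUMPTIONS,
# not the published scoring procedure.
SCALES = ("receptive", "exploratory", "consensus_oriented", "action_oriented")

def score_elics_ar(ratings):
    """Return a per-scale mean rating from a list of 24 item ratings."""
    if len(ratings) != 24:
        raise ValueError("the ELICS-AR has 24 items")
    if any(not 0 <= r <= 4 for r in ratings):
        raise ValueError("items are rated on a 0-4 scale")
    return {scale: mean(ratings[i * 6:(i + 1) * 6])
            for i, scale in enumerate(SCALES)}
```

Averaging a participant’s per-scale scores across raters would then yield the kind of overall display-of-listening summary reported in Table III.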

Results


Participants

The study involved three physical therapists, two occupational therapists, and one behavior therapist (n = 6). As shown in Table I, the clinicians ranged in age from 35 to 55+, and the majority (n = 4) had a Bachelor’s degree as their highest level of education. They were highly experienced pediatric therapists (M = 21.58 years in clinical practice and 16.58 years at their current center). Approximately 85% of their clients had high service needs (three or more), indicating that they had more complex caseloads. All participants were good (intermediate) to highly skilled (expert) listeners. Five of the six considered themselves to be good listeners; peers rated five of the six as highly skilled listeners (all three peers were in consensus); and four were seen as expert listeners by the coach. The one individual who was rated as a good listener by all three peers and the coach also rated herself as a good listener.

Changes in self-reported listening skills over time (Objective 1)

There were clear changes in self-reported listening skills over time (Table II). Time 3 scores were the highest on each ELICS scale. The highest scores at each time point were for receptive listening, and the lowest were for exploratory listening. We explored the latter with participants in the final member checking session, where they indicated that the words 'issue' and 'challenge' (in the three exploratory listening items that did not show positive improvement over time) led them to give consistently lower responses to these items. As clinicians who use solution-focused coaching in their practice, they were attuned to the meaning of words and trained to see strengths rather than issues. As a result, they interpreted these items as negative and as non-applicable to the way they practiced. Thus, the lower exploratory listening scores reflect not a problem with the ELICS scale, but rather that the respondents were a select group of practitioners who practiced in accordance with solution-focused coaching and strengths-based principles.

As shown in Table II, there were significant differences on all four ELICS scales over time, with large effect sizes. In all cases, Time 3 scores (follow-up) were significantly higher than Time 1 scores (pre-test). Time 2 scores were significantly higher than Time 1 scores for receptive listening and exploratory listening, but not for the other scales. For action-oriented listening, there was also a significant increase between Time 2 and Time 3 scores. Although the sample was small, there were no outliers, and skewness, to which ANOVAs are robust, was minimal. There were, however, departures from normality in kurtosis for some of the ELICS data, so we stress that these findings are preliminary.

Observer-rated listening skills in end-of-study live simulations (Objective 1)

Standardized clients and debriefers rated participants' displayed listening skills in the live simulations, using the ELICS-AR. As shown in Table III, ELICS-AR scores ranged from rarely (1) to frequently (3) on the 5-point scale. The overall average scores (bottom of Table III) indicate that participants frequently displayed listening skills in live simulation #1, occasionally to frequently displayed listening skills in live simulation #2, and occasionally displayed listening skills in live simulation #3 (which was designed to be the hardest/most complex simulation). Thus, the three live simulations differed in complexity, as desired, with the first being the easiest and the third being the hardest. Both sets of raters saw live simulation #3 as 'harder' than #1. For the standardized clients, there was a significant difference in overall ELICS-AR scores across the three live simulations, with live simulation #3 rated as producing significantly lower display of listening skills than both live simulation #1 [t(10) = 4.64, p < 0.01] and live simulation #2 [t(10) = 2.84, p < 0.01]. For the debriefers, there was also a significant difference in overall ELICS-AR scores, with live simulation #1 rated significantly higher in displayed listening skills than live simulation #2 [t(10) = 3.47, p < 0.01] and live simulation #3 [t(10) = 5.24, p < 0.001]. Table IV presents correlations between the ratings of debriefers and standardized clients for the ELICS-AR scales for each live simulation, as well as overall correlations (all of the latter are statistically significant). Focusing on overall scores, there was very high agreement between the different sets of

Table I. Participant background information.

Variable                                              n = 6
Age
  35 to 44                                            2
  45 to 54                                            3
  55+                                                 1
Mean Years in Clinical Practice                       21.58
Mean Years Employed at Center                         16.58
Highest Level of Education
  Bachelor's degree                                   4
  Master's degree                                     2
Professional Discipline
  Physical therapist                                  3
  Occupational therapist                              2
  Behavior therapist                                  1
Percent of Clients on Current and Past Caseloads
  with Three or More Service Needs                    84.1%
Self-Nomination of Expertise
  Developing listener                                 0
  Good listener                                       5
  Highly skilled listener                             1
Peer-Nomination of Expertise
  Developing listener                                 0
  Good listener                                       1
  Highly skilled listener                             5
Coach Rating of Expertise
  Novice                                              0
  Intermediate                                        2
  Expert                                              4


Table II. Mean ELICS scale scores (standard deviations) at three study time points.

ELICS Scale                    Time 1 (Pre-test)   Time 2 (before live simulations)   Time 3 (two weeks later)   F-test and Effect Sizes
Receptive Listening            5.64ab (0.76)       5.92a (0.79)                       6.31b (0.66)               F(2,10) = 5.74, p < 0.02, ηp² = 0.53
Exploratory Listening          4.33cd (0.61)       5.12c (0.93)                       5.29d (1.05)               F(2,10) = 12.70, p < 0.002, ηp² = 0.72
Consensus-oriented Listening   5.17e (0.60)        5.50 (0.59)                        6.02e (0.64)               F(2,10) = 5.23, p < 0.03, ηp² = 0.51
Action-oriented Listening      5.29f (0.95)        5.63g (0.80)                       6.17fg (0.74)              F(2,10) = 7.23, p < 0.01, ηp² = 0.59

ηp² = partial eta squared (<0.05 = small; 0.05 to 0.12 = medium; >0.12 = large) [49].
Note: 7 = to a very great extent; 6 = to a great extent; 5 = to a fairly great extent; 4 = to a moderate extent; 3 = to a small extent; 2 = to a very small extent; 1 = not at all.
Means with the same superscripts are significantly different from one another, using paired sample t-tests.

standardized clients and debriefers in live simulations #2 and #3, but lower agreement between the standardized clients and debriefers in live simulation #1. There was particularly high agreement across the live simulations for consensus-oriented listening.

Intervention procedures and participants' learning experiences (Objectives 2 and 3)

In the interviews and member checking session, participants reported that the intervention was a powerful experience with multiple benefits arising from the different components of the study (video discussion and debriefing, coaching, live simulation). These benefits included greater awareness of listening, accelerated learning, enhanced reflection, and actual application to everyday interactions with clients and colleagues. One participant remarked: "I don't remember ever having been part of something that was so impactful so quickly for learning" (#1). Others said: "When I reflect on the overall experience, I'm really, really glad that I got to do it" (#5); "It was a great experience...a really powerful experience" (#3); "The learning has been tremendous" (#1); "It's up there in terms of my favorite clinical education events" (#5). The intervention provided an intense learning experience that was challenging and at times stressful: "It was very hard...parts of it were very, very, very, hard...but, I'm still glad that I took part in it" (#4); "To actually be in that scenario, and see the impact that it had on me, for a few days after" (#1); "I think it's very powerful. I think it had a huge impact...it's worth the stress to have the learning, because it stays with you much longer because...it's part of your experience" (#5). Like the participants, we were surprised at the amount of reflection the intervention triggered ("the surprising part about it to me was the amount of self-reflection that I was going to be doing. And...how that would impact me" [#5]), as well as at how quickly the learning was moved into practice: "I know that I was different after every step of the way" (#1); "I was surprised how it...carried on after the moment...I think there was a heightened awareness that I hadn't expected coming out of the study" (#3). All participants indicated applying what they had learned, and gave examples, including better appreciation of the family perspective, listening to what

Table III. Mean ELICS-AR scale scores (standard deviations) for the live simulations (displayed skills).

ELICS-AR Scale                 Live Sim 1     Live Sim 2     Live Sim 3
Receptive Listening
  Standardized clients         3.20 (0.25)    3.10 (0.66)    3.17 (0.36)
  Debriefers                   3.28 (0.31)    2.76 (1.13)    2.44 (0.63)
Exploratory Listening
  Standardized clients         2.97 (0.29)    2.30 (0.59)    1.63 (0.57)
  Debriefers                   2.99 (0.43)    1.38 (0.50)    1.53 (0.44)
Consensus-oriented Listening
  Standardized clients         2.17 (0.60)    2.85 (0.76)    1.06 (0.77)
  Debriefers                   2.82 (0.70)    1.54 (0.69)    1.51 (0.68)
Action-oriented Listening
  Standardized clients         3.08 (0.37)    2.60 (0.70)    0.93 (0.92)
  Debriefers                   3.21 (0.50)    1.83 (0.78)    1.17 (0.68)
Overall Average Scores
  Standardized clients         2.86 (0.22)    2.71 (0.66)    1.70 (0.57)
  Debriefers                   3.07 (0.42)    1.88 (0.73)    1.66 (0.51)

0 = Not At All; 1 = Rarely; 2 = Occasionally; 3 = Frequently; 4 = Consistently.

is meaningful to the family, and how to be ‘present’ in interactions with clients. Most surprisingly, some participants reported applying what they had learned to their interactions with colleagues and teachers (e.g., engaging in more listening, finding out others’ agendas, not making assumptions about what others are saying or hoping for from interactions). Participants provided many suggestions to improve the participant experience, all of which involved greater interaction to enhance learning and reflection, and provide greater understanding of their experiences. The main suggestions were to include: (a) an initial orientation to the study as a whole, including discussion of the nature, theoretical basis, and importance of listening skills, along with clinical examples; (b) fuller orientation to the live simulation day, where aims and procedures are clearly outlined so that participants know what to expect; (c) greater consistency in the way in which the debriefers provided feedback; and (d) a final study session to provide a sense of closure. Based on the interview feedback, we took the opportunity to add a final study session in the pilot, which also served as a member checking session for the research team, with respect to emerging ideas.


Table IV. Correlations among ELICS-AR scale ratings for standardized clients and debriefers.

ELICS-AR scale                 Live Sim 1   Live Sim 2   Live Sim 3
Receptive listening            0.71         0.93*        0.60
Exploratory listening          0.27         0.91*        0.72
Consensus-oriented listening   0.90**       0.95*        0.81*
Action-oriented listening      0.12         0.91*        0.88*
Overall average scores(a)      0.57**       0.85**       0.85**

*p < 0.05, two-tailed. **p < 0.01, two-tailed.
(a) Correlations calculated out of 24 paired standardized client-debriefer sets of observations (df = 23): 4 ELICS scores for each of 6 participants.
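The rater-agreement values in Table IV are Pearson correlations between paired standardized client and debriefer scores. A minimal sketch of that computation follows; the paired ratings are hypothetical illustrations, not the study's data.

```python
import math

# Sketch: Pearson correlation between paired standardized-client and
# debriefer ELICS-AR scores (0-4 scale). The ratings are hypothetical
# illustrations, not the study's data.
clients    = [3.2, 2.9, 2.1, 3.0, 2.5, 1.8, 3.3, 2.7]
debriefers = [3.3, 3.0, 2.8, 3.2, 2.2, 1.5, 3.1, 2.6]

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(clients, debriefers)   # values near 1 indicate high agreement
```

With 24 paired observations per overall correlation, significance can then be assessed against the t distribution for the reported degrees of freedom.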


Video and debrief sessions

These were seen as a good starting point for the study, as they "gave us all a common perspective to talk about" (#6). There was strong consensus that the scenarios displayed in the videos were authentic, real, and believable, although somewhat exaggerated. One participant stated: "It was quite powerful to see a video...[and] be part of a discussion...It reinforced...what was going well and what wasn't" (#3). The group discussions affirmed participants' own ideas, and they learned from others' perspectives, including gathering new ideas to inform practice. Participants thought it would be helpful to alert future study participants to the emotional impact of the videos and the subsequent impact on self-reflection and heightened awareness about listening. One participant reported not sleeping due to the intense reflection triggered by the videos. Another participant said: "There was one of the video scenarios that was really hard, and was really kind of haunting" (#4).

Coaching sessions

The participants appreciated having designated time to work on personal listening goals with the guidance of an experienced coach. The coaching supported reflection: "She made me dig deep and think...the conversation took me to a place that I would not have got to with my own self-reflection" (#1). In the member checking session, a clinician remarked: "It brought to my attention the importance of listening and really made a difference in my clinical practice". Participants valued articulating what they were thinking and experiencing, and having someone to help them work things through: "She really encouraged me to think about, for me in that situation, what would have worked, or what could work in the future" (#6); "It was so helpful...to have some designated time...to be able to try and think, sort out thoughts and reflect, and have someone who was listening and actually helping to support me to...make sense of things" (#3). Coaching provided an opportunity to integrate thoughts into action plans: "I think having someone just to reflect back to you and to...help you create a direction or a path with your thoughts is valuable" (#2); "It helps facilitate you integrating that knowledge and putting what you think and what you learned into an action plan" (#5).

Live simulation day

Participants appreciated receiving immediate, unfiltered feedback from the standardized clients about their listening skills: "It was really helpful to have feedback from...the actors" (#4); "And then you have people being able to give you honest feedback, whereas...in a real live client situation...you don't have the recipient there giving you really clear concrete feedback about how it felt for them" (#2). The skills of the standardized clients led to a very valuable and 'real life' learning experience: "They were so good...And they were so real...it was very, very easy to just get in your mind that this was a real scenario...it allowed me to just totally be there to be practicing" (#1).

Many suggestions were made about things that could be changed to make the live simulation day a better experience for future participants. The majority of the feedback had to do with making the simulation environment a 'safer' (less anxiety-provoking, less tiring) experience, including (a) providing a better orientation to the day, (b) reducing the cognitive load on participants, (c) giving the simulations in the order originally intended (starting with the least complex and getting harder), and (d) ensuring the debriefers led the simulations in a consistent manner. We discuss each of these in turn.

First, the research team was made aware of the importance of including an introductory session that would better describe what people should expect during the day. Participants recommended alerting others to the emotional aspects of the simulations. We realized that the orientation should indicate that it is not necessary to attempt to resolve the clinical scenarios, as the goal is to display learned listening skills. Nonetheless, we will give future participants two or three minutes to wrap up the simulation sessions if they wish, in order to experience a sense of personal resolution.

Second, participants indicated experiencing a substantial cognitive load, as they had to learn about the live simulation process for the first time and engage in three challenging simulations. Participants were videotaped, which added to their stress. One individual felt so unsettled by the most complex simulation that she found it hard to process information: "I was unsettled enough that when I came out, I couldn't even find the next room number, I was staring at...and it was numbers and a letter and I thought wow. And I can't find the next room because I cannot read and process what I'm reading. And...afterwards, I was just thinking so much about some of the families that come to the centre for the first time" (#3). Another participant said: "I would have preferred to have been able to sleep well after. [both laugh] For a couple of days afterward, [it] would have been better if I could have slept well, and not, you know, been kicking myself about scenario three" (#4).

Third, for logistical reasons, the majority of the participants did not experience the live simulations in the intended order (i.e., in order of complexity). This likely contributed to the stress of participants, as did the long day itself. In the member checking session, participants indicated that they were fatigued after the second live simulation, and suggested that two live simulations would be enough. In the future, we


plan to reduce the number of live simulations given on a single day. Fourth, the interviews indicated that participants experienced mixed messages from the manner in which the three sets of debriefing teams conducted the live simulations. Some stressed an educational intent to the day (and referred to the different types of listening), whereas others treated it as the end-of-study evaluation of listening skills. Some gave more encouraging feedback than others. In the future we will ensure adequate orientation around the aims of the simulations for both the participants and debriefers.

Two other valuable pieces of knowledge came from observations of research team members. First, from a research perspective, we felt that it would be important for future participants to briefly leave the room at the end of the simulation, while the standardized clients and debriefers complete the ELICS-AR independently. Completing the ELICS-AR after the debriefing may have created bias in the ratings. However, since this might increase participants' stress and disrupt the integrity of the debriefing process, we intend to have a panel of raters view and rate videotaped simulations, rather than the debriefers. Second, participants indicated that completing written reflection journals was helpful: "I also think what worked well was...where we went from scenario, reflection, scenario, reflection, scenario, reflection. I think that having the reflection after the...session was...really helpful" (#1); "Because there were some things that I realized in that reflection that I didn't realize right after the scenario" (#5). Consequently, we plan to use written reflection journals as a future source of data.

Discussion

The aims of this pilot study were to take a preliminary look at the effects of an educational intervention on self-reported and observer-rated listening skills, to try out the study procedures, and to investigate participants' learning experiences. First, even with only six participants, statistically significant self-reported changes in listening skills were found over time, particularly between Times 1 and 3. While receptive and exploratory listening changed from Time 1 to Time 2 (before the start of the live simulation day), consensus-oriented and action-oriented listening showed significant changes only at Time 3 (two weeks later). The delays for these two types of listening skills make sense, as they appear to be more complex and require more opportunity for deliberate practice. There was continual skill acquisition over the intervention, including as a result of the live simulation day, which was initially considered to be our end-of-study evaluation of listening skills. The qualitative interviews picked up this learning from the live simulations, as did the ELICS, which showed a significant increase from Time 2 to Time 3 for action-oriented listening. With respect to observer ratings of listening skills, there was substantial agreement between standardized clients and debriefers in their ratings of participant performance in each of the live simulations (acknowledging the bias created by our procedure). There was particularly high agreement across the live simulations for consensus-oriented listening, which may be easiest for observers to identify. Thus, our listening skill


measures were feasible to implement and showed expected change (the ELICS) and validity (the ELICS-AR).

With respect to the second objective (feasibility), while the video discussions and solution-focused coaching sessions worked well and ran smoothly, the live simulation day, which was the most complex to implement, provided a number of learnings. For example, we realized on this day that participants were uncertain about the types of listening targeted by the intervention. We plan to provide an initial teaching component to the intervention, either via an introductory workshop or an online tutorial. We had assumed that participants would internalize the nature of the core listening skills through experiential learning, but realized that formal instructional learning is required. By adding this component to the intervention, we will cover all aspects of Miller's [41] pyramid of clinical competence: knows, knows how, shows how, and does. Overall, the pilot provided useful information to improve the study protocol, including the addition of an initial orientation to listening skills and study procedures, greater transparency and consistency in the procedures to lessen the stress of the live simulations, better orientation for debriefers, inclusion of an end-of-study member checking session, and the use of reflection journals as a source of data.

The final objective was to understand the learners' experience. Participants unanimously agreed that the intervention was a challenging, valuable, and highly impactful learning experience. Appropriate challenge and support are fundamental aspects of learning and development [42]. Each part of the intervention was seen to provide benefit. The interprofessional video discussions broadened perspectives and created heightened awareness of listening. The solution-focused coaching provided dedicated time for reflection and action planning around individual goals, which may have served to solidify behavioral intentions [43]. The live simulations provided an opportunity to apply the skills and receive immediate, authentic feedback from standardized clients. Throughout the process, participants remarked on heightened awareness, accelerated learning, enhanced reflection, and seamless application to their everyday clinical and interprofessional interactions. We hypothesize that the immersive nature of the intervention played an important role: it was a highly engaging intervention in which participants invested appreciable time, energy, effort, and reflection.

Study strengths and limitations

The listening skill intervention is innovative and has the potential, due to its interprofessional and comprehensive nature, to enhance the listening skills of practicing pediatric rehabilitation clinicians. Strengths of the study include its ecological validity, use of a previously validated self-report communication assessment tool, and objective assessment by debriefers and standardized clients. A limitation is that the participants were all highly experienced clinicians, motivated towards ongoing learning, and humble with respect to their listening skills. These characteristics are hallmarks of expertise [44, 45]. We were unsuccessful in recruiting novice or intermediate learners. In the member checking session, participants indicated that the prospect of being videotaped in live simulations created some


anxiety, which may have contributed to difficulties in recruiting more novice clinicians. In future recruitment, we will highlight the supportive aspect of the intervention, the value of having an opportunity to enhance listening skills, and the benefits that will likely result for clients and clinicians from skilled listening. Different aspects of the intervention may be more or less important or valuable to more novice listeners. For example, novices may have different reactions to what is shown in the videos (e.g., oversimplifying and not seeing them as complex), may be more or less likely to reflect on their listening skills and/or the coaching opportunity, and may have different reactions to and learnings from the live simulation day.

Another limitation is the self-report nature of the dependent measure of listening skill. Since we view listening skills as involving mindful stances, a self-report measure is well suited to capture these, as well as nuanced changes in self-perceptions over time. Self-assessment is a fundamental aspect of continuing education and self-directed learning in the workplace, and the average individual is a reasonably competent assessor of his or her strengths and weaknesses [46]. Nonetheless, we developed an observer-rated measure of displayed behaviors (ELICS-AR) for use on the live simulation day. Our sample was too small to allow us to examine the extent of correspondence between self- and observer-ratings, and we speculate that the correspondence may be low. Self-ratings are based on a holistic understanding of one's listening skills over multiple situations, whereas observer ratings are situation-specific. There is a place for both self-report and observer-rated measures of listening skill, as they tell us different things [7]. Accordingly, we plan to include two live simulation days in our future research, which will allow us to examine changes in observer-rated skills between two time points, and will also provide participants with live simulation experience in advance of the post-test.

Research implications

This pilot study provided preliminary evidence of the utility of a listening skills intervention that is theory- and evidence-based in conceptualization and design. The study indicated that the measures worked as desired, helped us to fine-tune the procedures and intervention design, and indicated that participants reported valuable benefits. The findings will inform the development of an RCT to ascertain the effectiveness of the intervention; it will include expert, intermediate, and novice practitioners. Based on the findings, we speculate that the intervention's effects are due to targeting multiple types of learning (fitting the different learning styles of participants). The intervention provided authentic video examples for discussion; repeated opportunities for reflection and deliberate practice in both group discussion and individual coaching formats; and increasing levels of simulation challenge, with support from peers, debriefers, and a coach. Further investigation is warranted regarding the role of challenge as a change process implicated in educational interventions.

A second research implication concerns the need for a broader view of learning as an ongoing process that is artificially (but necessarily) truncated by educational


research objectives. In this regard, we learned the value of having qualitative interviews and a final member checking session following live simulation ratings of learned skills. There is growing evidence that, under the right conditions, the skills learned in simulation are transferred to clinical practice. However, studies seldom include a longitudinal follow-up [13, 47], and the literature primarily addresses medical procedural skills rather than complex skills such as clinical listening [47]. The clinicians in the present study indicated that immediate application was fostered by the intervention, and they discussed this in follow-up interviews and a member checking session held four months later. Thus, by following clinicians during a sustained intervention process and capturing information about the perceived clinical impact of the intervention over time, this study addressed a major gap in much of the simulation research.

A third implication concerns the nature of the live simulations themselves. The pilot indicated that the most complex simulation resulted in the least frequent use or display of listening skills of all types. Simulation researchers would benefit from using a series of live simulations varying in complexity and using a formal observational assessment tool such as the ELICS-AR, in addition to ascertaining learners' perceptions of change in skills using validated assessments. A limitation of using only observers' behavioral ratings as measures of skill acquisition is that the complexity of the simulations (e.g., their emotional content and the number of people with different agendas) can play an important role in skill display.

Clinical, educational, and organizational implications

Generally, clinicians are focused on their client during sessions and may not be conscious of how they are using their listening skills. This study draws attention to the importance of being mindful about listening, particularly in situations where the emotional stakes are high. It is important for clinicians to feel it is alright to follow the lead of the family and client in the moment, rather than feeling they need to solve the problem. Families indicate that just being heard is an important part of the process of moving forward [4]. Self-reflection on one's listening skills can enhance family-centered practice and, in turn, enhance client outcomes [2]. Although interprofessional communication is considered a core competency to be acquired during graduate training programs in health and rehabilitation sciences, relatively little time is spent on communication skills in postgraduate education [48]. There is likely a great deal of variability in how (or if) listening skills are taught and evaluated within university settings. This work provides definitive methods for the measurement of listening skills; can be used to develop educational initiatives to teach listening skills in rehabilitation programs; and provides evidence of the need for multiple ways of intervening (self-evaluation, reflection, coaching) to facilitate sustainability of change in listening behavior. With respect to organizational implications, our intervention may be primarily suitable for larger organizations willing to invest resources in the ongoing learning of practitioners. However, even smaller organizations are likely to spend professional development dollars on an evidence-based


professional development opportunity that fits their vision and mission. All organizations, regardless of their size, may see the importance of this type of intervention for clinicians if it matches their learning-oriented and relationship-based philosophy [12].

In conclusion, this work addresses a crucial aspect of health care delivery worldwide: listening and communication in the health care context. Since listening skills are central to quality health care and to interprofessional practice, this work may have substantial importance for educational practice, clinical practice, and client outcomes.


Acknowledgments

We thank Thames Valley Children's Centre, Western University, Holland Bloorview Kids Rehabilitation Hospital, and the University of Toronto's Standardized Patient Program for their kind support. We also thank the standardized patients for their contributions to the project, Greg Vanden Kroonenberg for taping and editing the videos, and Darlene Hubley and Diane Savage for their assistance. Gillian King holds the Canada Research Chair in Optimal Care for Children with Disabilities.

Declaration of interest

The authors report no declarations of interest. This study was supported by a SIM-one Simulation Research and Innovation Grant and Holland Bloorview Kids Rehabilitation Hospital.

References

1. King G. A relational goal-oriented model of optimal service delivery to children and families. Physical & Occupational Therapy in Pediatrics 2009;29:384-408.
2. Dunst CJ, Trivette CM, Deal AG. Enabling and empowering families. Cambridge, MA: Brookline Books; 1988.
3. MacKean GL, Thurston WE, Scott CM. Bridging the divide between families and health professionals' perspectives on family-centred care. Health Expectations 2005;8:74-85.
4. King G, Desmarais C, Lindsay S, Piérart G, Tétreault S. The roles of effective communication and client engagement in delivering culturally sensitive care to immigrant parents of children with disabilities. Disability and Rehabilitation 2015;37:1372-1381.
5. Suter E, Arndt J, Arthur N, Parboosingh J, Taylor E, Deutschlander S. Role understanding and effective communication as core competencies for collaborative practice. Journal of Interprofessional Care 2009;23:41-51.
6. Office of Interprofessional Education. Advancing the interprofessional education curriculum 2009. 2008. Available from: http://www.ipe.utoronto.ca/std/docs/IPE%20Curriculum%20Overview%20FINAL%20oct%2028.pdf (accessed June 14, 2015).
7. King GA, Servais M, Bolack L, Shepherd TA, Willoughby C. Development of a measure to assess effective listening and interactive communication skills in the delivery of children's rehabilitation services. Disability and Rehabilitation 2012;34:459-469.
8. King G, Currie M, Bartlett D, Gilpin M, Willoughby C, Tucker MA, et al. The development of expertise in pediatric rehabilitation therapists: Changes in approach, self-knowledge, and use of enabling and customizing strategies. Developmental Neurorehabilitation 2007;10:223-240.
9. Walsh F. Family resilience: A framework for clinical practice. Family Process 2003;42:1-18.
10. Walsh F. A family resilience framework: Innovative practice applications. Family Relations 2002;51:130-137.

11. Slater DY, Cohn ES. Staff development through analysis of practice. American Journal of Occupational Therapy 1991;45:1038–1044.
12. King G. A framework of personal and environmental learning-based strategies to foster therapist expertise. Learning in Health and Social Care 2009;8:185–199.
13. Watters C, Reedy G, Ross A, Morgan NJ, Handslip R, Jaye P. Does interprofessional simulation increase self-efficacy: A comparative study. BMJ Open 2015;5:e005472.
14. Gough JK, Frydenberg AR, Donath SK, Marks MM. Simulated parents: Developing paediatric trainees' skills in giving bad news. Journal of Paediatrics and Child Health 2009;45:133–138.
15. Downar J, Knickle K, Granton JT, Hawryluck L. Using standardized family members to teach communication skills and ethical principles to critical care trainees. Critical Care Medicine 2012;40:1814–1819.
16. Hill AE, Davidson BJ, McAllister S, Wright J, Theodoros DG. Assessment of student competency in a simulated speech-language pathology clinical placement. International Journal of Speech-Language Pathology 2014;16:464–475.
17. Hsu L, Chang W, Hsieh S. The effects of scenario-based simulation course training on nurses' communication competence and self-efficacy: A randomized controlled trial. Journal of Professional Nursing 2014;95:356–364.
18. King G, Tam C, Fay L, Pilkington M, Servais M, Petrosian H. Evaluation of an occupational therapy mentoring program: Effects on therapists' skills and family-centered behavior. Physical & Occupational Therapy in Pediatrics 2011;31:245–262.
19. Shepherd T, King G, Servais M, Bolack L, Willoughby C. Clinical scenario discussions of listening in interprofessional health care teams. International Journal of Listening 2014;28:47–63.
20. King G, Shepherd TA, Servais M, Willoughby C, Bolack L, Strachan D, et al. Developing authentic clinical simulations for effective listening and communication in pediatric rehabilitation service delivery. Developmental Neurorehabilitation 2014; Early Online.
21. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Medical Teacher 2005;27:10–28.
22. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Medical Education 2010;44:50–63.
23. Baldwin P, King G, Evans J, McDougall S, Tucker MA, Servais M. Solution-focused coaching in pediatric rehabilitation: An integrated model for practice. Physical & Occupational Therapy in Pediatrics 2013;33:467–483.
24. Titchen A, Ersser SJ. Explicating, creating and validating professional craft knowledge. In: Higgs J, Titchen A, editors. Practice knowledge and expertise in the health professions. Oxford: Butterworth Heinemann; 2001. pp 48–56.
25. Rees PG, Hays BJ. Fostering expertise in occupational health nursing: Levels of skill development. Journal of the American Association of Occupational Health Nursing 1996;44:67–72.
26. Schwellnus H, King G, Thompson L. Client-centred coaching in the paediatric health professions: A critical scoping review. Disability & Rehabilitation 2015;37:1305–1315.
27. Guest CB, Regehr G, Tiberius RG. The life long challenge of expertise. Medical Education 2001;35:78–81.
28. Jensen GM, Gwyer J, Hack LM, Shepard KF. Expertise in physical therapy practice. Boston, MA: Butterworth Heinemann; 1999.
29. Jennings L, Skovholt TM. The cognitive, emotional, and relational characteristics of master therapists. Journal of Counseling Psychology 1999;46:3–11.
30. Schön DA. The reflective practitioner: How professionals think in action. New York: Basic Books; 1983.
31. APA Presidential Task Force on Evidence-Based Practice. Evidence-based practice in psychology. American Psychologist 2006;61:271–285.
32. Thabane L, Ma J, Chu R, Cheng J, Ismaila A, Rios LP, et al. A tutorial on pilot studies: The what, why and how. BMC Medical Research Methodology 2010;10:1.
33. Arain M, Campbell MJ, Cooper CL, Lancaster GA. What is a pilot or feasibility study? A review of current practice and editorial policy. BMC Medical Research Methodology 2010;10:67.



34. Murray E, Treweek S, Pope C, MacFarlane A, Ballini L, Dowrick C, et al. Normalisation process theory: A framework for developing, evaluating and implementing complex interventions. BMC Medicine 2010;8:63.
35. Creswell J, Plano Clark V, Gutmann M, Hanson WE. Advanced mixed methods research designs. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage; 2003. pp 209–240.
36. Hanson WE, Creswell JW, Clark VLP, Petska KS, Creswell JD. Mixed methods research designs in counseling psychology. Journal of Counseling Psychology 2005;52:224–235.
37. Franklin C, Trepper TS, Gingerich WJ, McCollum EE. Solution-focused brief therapy: A handbook of evidence-based practice. New York: Oxford University Press; 2012.
38. Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Research in Psychology 2006;3:77–101.
39. Vaismoradi M, Turunen H, Bondas T. Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nursing & Health Sciences 2013;15:398–405.
40. Curran V, Hollett A, Casimiro LM, Mccarthy P, Banfield V, Hall P, et al. Development and validation of the interprofessional collaborator assessment rubric (ICAR). Journal of Interprofessional Care 2011;25:339–344.
41. Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine 1990;65:S63–S67.
42. Csikszentmihalyi M, Rathunde K. The development of the person: An experiential perspective on the ontogenesis of psychological complexity. In: Lerner RM, editor. Handbook of child psychology: Theoretical models of human development. 5th ed. New York: John Wiley & Sons; 1998. pp 635–684.
43. Gollwitzer PM, Sheeran P. Implementation intentions and goal achievement: A meta-analysis of effects and processes. Advances in Experimental Social Psychology 2006;38:69–119.
44. King G, Currie M, Bartlett DJ, Strachan D, Tucker MA, Willoughby C. The development of expertise in paediatric rehabilitation therapists: The roles of motivation, openness to experience, and types of caseload experience. Australian Occupational Therapy Journal 2008;55:108–122.
45. Skovholt TM, Jennings L, Mullenbach M. Portrait of the master therapist: Developmental model of the highly functioning self. In: Skovholt TM, Jennings L, editors. Master therapists: Exploring expertise in therapy and counseling. Boston: Allyn & Bacon; 2004. pp 125–146.
46. Regehr G, Hodges B, Tiberius R, Lofchy J. Measuring self-assessment skills: An innovative relative ranking model. Academic Medicine 1996;71:S52–S54.
47. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simulation in Healthcare 2011;6:S42–S47.
48. Gillis AE, Morris MC, Ridgway PF. Communication skills assessment in the final postgraduate years to established practice: A systematic review. Postgraduate Medical Journal 2015;91:13–21.
49. Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum; 1988.


Appendix 1: Effective Listening and Interactive Communication Scale Assessment Rubric (ELICS-AR)
