
Assessing Interpersonal and Communication Skills in Radiation Oncology Residents: A Pilot Standardized Patient Program

Melody Ju, BA,* Abigail T. Berman, MD,† Wei-Ting Hwang, PhD,‡ Denise LaMarra, MS,* Cordelia Baffic, BA,† Gita Suneja, MD,† and Neha Vapiwala, MD†

*Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; †Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania; and ‡Department of Biostatistics and Epidemiology, University of Pennsylvania, Philadelphia, Pennsylvania

Received Nov 14, 2013, and in revised form Dec 12, 2013. Accepted for publication Jan 8, 2014.

Summary

This is the first described standardized patient (SP) evaluation program for radiation oncology residents. Statistically higher scores from faculty substantiate the concern that resident evaluations are generally positive but nondiscriminating. Thus, faculty should be encouraged to provide honest and critical feedback to hone residents' interpersonal skills. Results also demonstrate poor interrater agreement among faculty, residents, and SPs, suggesting that residents and faculty need to better calibrate their evaluations to true patient perceptions.

Purpose: There is a lack of data for the structured development and evaluation of communication skills in radiation oncology residency training programs. Effective communication skills are increasingly emphasized by the Accreditation Council for Graduate Medical Education and are critical for a successful clinical practice. We present the design of a novel, pilot standardized patient (SP) program and the evaluation of communication skills among radiation oncology residents.

Methods and Materials: Two case scenarios were developed to challenge residents in the delivery of "bad news" to patients: one scenario regarding treatment failure and the other regarding a change in treatment plan. Eleven radiation oncology residents paired with 6 faculty participated in this pilot program. Each encounter was scored by the SPs, observing faculty, and residents themselves based on the Kalamazoo guidelines.

Results: Overall resident performance ratings were "good" to "excellent," with faculty assigning statistically significantly higher scores and residents assigning lower scores. We found inconsistent interrater agreement among faculty, residents, and SPs. SP feedback was also valuable in identifying areas of improvement, including more collaborative decision making and less use of medical jargon.

Conclusions: The program was well received by residents and faculty and regarded as a valuable educational experience that could be used as an annual feedback tool. Poor interrater agreement suggests a need for residents and faculty physicians to better calibrate their evaluations to true patient perceptions. High scores from faculty members substantiate the concern that resident evaluations are generally positive and nondiscriminating. Faculty should be encouraged to provide honest and critical feedback to hone residents' interpersonal skills.

Reprint requests to: Neha Vapiwala, MD, Department of Radiation Oncology, University of Pennsylvania, 3400 Civic Center Blvd, TRC 2 West, Philadelphia, PA 19104. Tel: (215) 662-2337; E-mail: Neha. [email protected]

We did not use any copyrighted information or patient photographs. The data presented in this manuscript were acquired in accordance with the policies of the Institutional Review Board at the University of Pennsylvania.

Conflict of interest: none.

Introduction

Research in clinical medicine, radiation biology, and medical physics is constantly advancing the field of radiation oncology, but the therapeutic relationship between physician and patient remains at the core of a successful practice. Strong physician communication skills are associated with increased patient satisfaction, better health outcomes, and greater adherence to treatment (1, 2). Additionally, interpersonal skills are now expected competencies for new physicians, as reflected in the Accreditation Council for Graduate Medical Education's (ACGME) commitment, since 1999, to testing for process-oriented general competencies in patient care, interpersonal and communication skills, and professionalism (3, 4).

The medical community has responded to this call for excellence in interpersonal skills by developing standardized patient (SP) programs for trainees in a variety of specialties. SPs are lay people trained to portray the scripted symptoms, personality, and beliefs of a simulated patient case (5). Many research studies demonstrate that SP portrayals are both credible representations of patient cases and reliable across encounters and SPs (1, 6, 7). Residency training programs using SPs have been described in specialties ranging from internal medicine and general surgery to radiology (8-12). To date, there has been no formal study of communication training and the use of SPs in radiation oncology residency programs.

Herein we report the results of, to our knowledge, the first SP program in radiation oncology training. We describe the overall execution and reception of the pilot program and its utility in radiation oncology residency communication training, and we compare resident performance evaluations by the residents themselves, SPs, and faculty.

Methods and Materials

Case development

Two cases were developed by the study team at the University of Pennsylvania Radiation Oncology department (Fig. 1), in collaboration with the Standardized Patient Program at the Perelman School of Medicine.

Standardized patients

The SP Program recruited 6 SPs to attend one of two 4-hour training sessions, depending on the case they would be portraying. SPs were required to demonstrate consistent, realistic portrayals, as outlined in the case materials, and reliable checklist scoring. Each SP practiced delivering feedback during role plays and received input from the trainer and their peers to ensure that their feedback was specific and substantive (13).

Participants

All radiation oncology residents in postgraduate years 2 to 5 were invited to participate in the study. All residents had previously completed a training workshop in patient communication and delivering bad news. Eleven residents completed the SP program and gave informed consent for their evaluations to be used in this study.

Faculty

Clinical faculty mentors were chosen at random from the Radiation Oncology department. The same faculty member observed both case 1 and case 2 for 2 residents.

Evaluation metric

The Kalamazoo Essential Elements Communication Checklist-Adapted (KEECC-A) was used by the SPs, residents, and faculty to evaluate residents' interpersonal skills for each case. The KEECC-A is an adapted version of the Kalamazoo guideline, which was developed by expert consensus in 2001 and identifies essential elements of physician-patient communication (Table 1) (14). The KEECC-A was developed to grade each Kalamazoo category on a 5-point Likert scale (1 = poor, 2 = fair, 3 = good, 4 = very good, 5 = excellent) for greater accessibility and ease of evaluation. Both the Kalamazoo and KEECC-A metrics have been validated as reliable instruments for SP training programs and examinations that evaluate resident communication skills (14-16).
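For readers who want a concrete picture of the instrument, the sketch below shows one possible in-code representation of a KEECC-A evaluation: the seven Kalamazoo elements (as listed in Table 1), each rated on the 5-point Likert scale, with a simple rounded-mean summary. It is illustrative only; the overall_rating helper and the example scores are hypothetical and do not reproduce the scoring rule used in the study.

```python
# Minimal sketch (not study code): representing a KEECC-A evaluation as the
# seven Kalamazoo elements rated on a 5-point Likert scale, plus a simple
# overall summary. Element names follow Table 1; everything else is illustrative.
from statistics import mean

LIKERT_LABELS = {1: "poor", 2: "fair", 3: "good", 4: "very good", 5: "excellent"}

KEECC_ELEMENTS = [
    "Establishes rapport",
    "Opens discussion",
    "Gathers information",
    "Understands patient's perspective of illness",
    "Shares information",
    "Reaches agreement on problems and plans",
    "Provides closure",
]

def overall_rating(scores):
    """Summarize one encounter as the rounded mean of the element scores (1-5)."""
    assert set(scores) == set(KEECC_ELEMENTS), "score every KEECC-A element"
    return round(mean(scores.values()))

# Hypothetical example: one resident encounter scored by an SP.
example = {element: 4 for element in KEECC_ELEMENTS}
example["Shares information"] = 3

score = overall_rating(example)
print(score, LIKERT_LABELS[score])  # -> 4 very good
```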

Patient encounter sessions

A total of 11 residents participated in this institutional review board-approved study. Each resident was paired with 1 of 6 faculty members for a 1-hour 45-minute evaluation session (Fig. 1). Residents were randomly selected to start with case 1 or case 2 in a one-on-one session with the SP while a faculty member observed from a separate room using live cameras. After the self-evaluation and the SP and faculty evaluations were completed, the resident presented the case to the faculty member with the SP present and had an opportunity to receive and respond to feedback from both the faculty member and the SP. This was repeated for a second encounter with a second round of evaluations. Finally, residents reconvened with faculty members and SPs for a debriefing session to conclude the event. Because the program recruited 6 SPs and 6 faculty members, the full 1-hour 45-minute evaluation session was repeated twice to accommodate all 11 radiation oncology residents.

Fig. 1. Timetable of the SP program with case descriptions.

Statistical analysis

The Likert scale was dichotomized into high (scores 4 and 5) and low (scores 1, 2, and 3) scores. Mixed-effects logistic regression models were used to compare scores between cases for each evaluator, and kappa statistics were used to assess interrater agreement. Marginal homogeneity testing was used to determine whether there were differences among the resident, SP, and faculty scores.
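As a concrete illustration of this analysis, the sketch below dichotomizes 5-point ratings and applies an exact McNemar test of marginal homogeneity to paired faculty and resident ratings. The data frame and its values are invented purely for demonstration and are not the study data; the mixed-effects logistic regression models would require a dedicated mixed-model routine and are not reproduced here.

```python
# Minimal sketch with hypothetical data (not the study dataset): dichotomize
# 5-point Likert ratings into high (4-5) vs low (1-3) and test marginal
# homogeneity of paired faculty vs resident ratings with an exact McNemar test.
import pandas as pd
from statsmodels.stats.contingency_tables import mcnemar

# One row per resident encounter; all values invented for illustration.
df = pd.DataFrame({
    "faculty":  [5, 4, 5, 3, 4, 5, 4, 5, 5, 4, 5],
    "resident": [3, 4, 3, 4, 3, 5, 3, 4, 3, 3, 4],
})

high = (df >= 4).astype(int)  # 1 = "high" (Likert 4-5), 0 = "low" (Likert 1-3)

# 2x2 table of paired calls: rows = faculty (low/high), columns = resident.
table = pd.crosstab(high["faculty"], high["resident"])
result = mcnemar(table, exact=True)  # exact binomial test, suited to small samples
print(table)
print(f"McNemar exact P value: {result.pvalue:.3f}")
```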

Results

Comparing case 1 to case 2

To determine whether significant differences existed between the 2 case scenarios, a mixed logistic regression was performed on the overall ratings for the 2 cases. There were no statistically significant differences between overall ratings for the 2 cases in the resident self-assessment (odds ratio [OR], 0.48; 95% confidence interval [CI], 0.09-2.63; P=.40), the SP scores (OR, 0.45; 95% CI, 0.03-5.84; P=.54), or both scores combined (OR, 0.64; 95% CI, 0.17-2.39; P=.51). A mixed logistic regression analysis of faculty scores could not be performed because almost all residents received a high rating of Likert 4 to 5.

Overall performance

An overall rating score was given for each resident encounter, summarizing resident performance on each element of the KEECC-A. Given no significant differences in scores between case 1 and case 2, the overall ratings for both cases were combined for each resident and analyzed together (Fig. 2). Faculty and SPs rarely deviated from "very good" or "excellent" ratings for residents. In fact, only 1 resident received a score of less than "very good" or "excellent" from a faculty member. Comparatively, the most common overall self-assessment rating given by residents was 3, or "good." When data were compared as low versus high scores, marginal homogeneity tests showed that faculty gave residents statistically higher scores than residents gave themselves in case 1 (P=.01), with a similar trend in case 2 (P=.08). There were no significant differences in the low versus high distribution when comparing resident and SP scores or faculty and SP scores.

Interrater agreement

To further evaluate the educational significance of the observed score distribution, resident and faculty scores were compared with SP evaluations by interrater agreement analysis. Using SP evaluation as a surrogate for patient evaluation, residents tended to grade themselves as less adept at communicating, whereas faculty tended to overestimate resident communication skills (Fig. 3). Table 2 demonstrates that neither residents nor faculty consistently interpreted the patient's perception of their interpersonal skills. There was significant interrater agreement between faculty and SPs in case 1 and between residents and SPs in case 2.
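For readers who wish to reproduce this kind of pairwise agreement analysis on their own data, a minimal sketch using Cohen's kappa on dichotomized ratings is shown below; the rating vectors are hypothetical placeholders, not the values behind Table 2.

```python
# Minimal sketch with hypothetical ratings (not the study data): pairwise
# Cohen's kappa between raters after dichotomizing 5-point Likert scores into
# high (4-5) vs low (1-3), mirroring the pairwise comparisons in Table 2.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

ratings = {  # one overall rating (1-5) per encounter, per rater; values invented
    "resident": [3, 4, 3, 5, 4, 3, 4, 3, 4, 5, 3],
    "faculty":  [5, 5, 4, 5, 5, 3, 5, 4, 5, 5, 4],
    "sp":       [4, 5, 4, 5, 4, 4, 5, 3, 4, 5, 4],
}

# Dichotomize as in the analysis: 1 = "high" (4-5), 0 = "low" (1-3).
high = {rater: [int(score >= 4) for score in scores] for rater, scores in ratings.items()}

for rater_a, rater_b in combinations(high, 2):
    kappa = cohen_kappa_score(high[rater_a], high[rater_b])
    print(f"{rater_a} vs {rater_b}: kappa = {kappa:.2f}")
```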

Participant experience

Common themes of SP and resident responses are described in Table 3.

Discussion

This is the first SP program in radiation oncology, to our knowledge.


Table 1 Kalamazoo consensus statement on essential elements of physician-patient communication

Establishes rapport
- Encourages a partnership between physician and patient
- Respects patient's active participation in decision making

Opens discussion
- Allows patient to complete his/her opening statement
- Elicits patient's full set of concerns
- Establishes/maintains a personal connection

Gathers information
- Uses open-ended and closed-ended questions appropriately
- Structures, clarifies, and summarizes information
- Actively listens using nonverbal (eg, eye contact, body position) and verbal (words of encouragement) techniques

Understands patient's perspective of illness
- Explores contextual factors (eg, family, culture, gender, age, socioeconomic status, spirituality)
- Explores beliefs, concerns, and expectations about health and illness
- Acknowledges and responds to patient's ideas, feelings, and values

Shares information
- Uses language patient can understand
- Checks for understanding
- Encourages questions

Reaches agreement on problems and plans
- Encourages patient to participate in decision to the extent he/she desires
- Checks patient's willingness and ability to follow the plan
- Identifies and enlists resources and supports

Provides closure
- Asks whether patient has other issues or concerns
- Summarizes and affirms agreement with the plan of action
- Discusses follow-up (eg, next visit, plan for unexpected outcomes)

The results of this novel program shed light on the challenges of training radiation oncology residents in effective communication skills.

An overall rating score, summarizing resident performance on each element of the KEECC-A, was given for each resident encounter by the resident, the SP, and faculty. The overall ratings for both cases were combined to examine the total distribution of scores. The overall ratings tended to fall only in the "very good" and "excellent" categories under faculty and SP scoring, whereas resident self-evaluation encompassed a broader range of scores. Residents tended to have a more critical self-assessment and perceived greater room for improvement, whereas faculty and SPs were overall satisfied with resident performances. SPs rated 86% of encounters as "very good" or "excellent," despite comments that residents occasionally lacked empathy and collaborative intent or provided too much information too quickly (Table 3). These data suggest that residents are very critical of their communication skills or perhaps that SPs are overly generous with their evaluations.

The faculty involved in this study also tended to view their residents as having excellent communication skills. We found a statistically higher proportion of Likert 4 to 5 scores given by faculty than by residents in self-assessment. The high Likert scale scoring from faculty members is consistent with the generally positive faculty evaluations of residents seen in clinical work throughout residency. This underscores the fact that grade inflation in resident evaluations by faculty, in both the SP and real-world clinical settings, is a substantiated concern. Regardless, faculty should be encouraged to provide honest and critical feedback early and often, as it is critical for residents to improve their interpersonal skills.

Interrater agreement analysis was done to evaluate the educational significance of these differences in score distribution. Interrater agreement appeared to be inconsistent among residents, faculty, and SPs. Using SP grading as the closest approximation to patient perception, residents perceived their performance more accurately in case 2, whereas faculty perceived resident performance more accurately in case 1. This suggests that residents may be more adept at perceiving patient or parental response to disappointing scenarios in the medical system, whereas faculty may be better at interpreting patient response to grief and hopelessness after being informed of a treatment failure. This may be because residents have more frequent exposure to the frustrations involved in navigating the healthcare system, whereas faculty have more practical experience relaying bad news of treatment failure. Notably, faculty were able to observe the resident-SP encounter only through the live camera feed, which might have hindered their ability to accurately assess body language and tone from both parties.

One possible explanation for poor interrater agreement is that faculty and residents lack the specialized training SPs received to consider the appropriate body language, tone, word choice, and pace of conversation necessary for effective communication. Without an understanding of what patients consider to be important elements of physician-patient communication, it is difficult for residents and faculty alike to accurately evaluate and improve their performance.


Fig. 2. Distribution of overall Likert scale rating by resident, SP, and faculty after combining scores for case 1 and case 2.

The results of this analysis provide a rationale for continued communications training during residency, but they also raise the question of whether faculty may need to be trained to better evaluate communication skills as well. Striving toward interrater agreement would benefit both residents and faculty members by improving their ability to effectively communicate with patients and accurately perceive their patients' perspectives. This is an important mission of both individual residency programs and the ACGME, and interrater agreement may be an interesting metric for future studies of the effectiveness of communications training.

SP feedback revealed many areas for improvement in communication style and content. SPs frequently expressed a desire to be collaborators in their overall treatment plan and complained when they felt their physician was giving too much information too quickly or made a treatment decision without regard to the patient's opinions. The ability of a resident to empathize with the patient's emotions of frustration or grief was highlighted throughout the feedback for both encounters. Surprisingly, the phrase "standard of care" was perceived as indicating subexcellent treatment when in fact it represents the best and most evidence-based treatment. SP training programs using different scenarios may help to illuminate other important areas of misinterpreted medical jargon pertaining to the radiation oncology setting.

Residents and faculty gave positive reviews of the training program. The 2 cases were similar in difficulty level, as demonstrated by a lack of significant differences in overall scores (case 2 vs case 1: OR, 0.64; P=.51). Overall, residents felt that the cases were properly calibrated to their level of training, that the cases were realistic representations of clinical scenarios, and that they would benefit from more SP programs.

Fig. 3. Overall Likert scale rating for each case by resident, SP, and faculty.


Table 2 Pair-wise comparison of scores by resident, SP, and faculty

                          Case 1                      Case 2
Pair-wise comparison      k statistic   P value       k statistic   P value
Resident vs faculty       0.15          .90           0.30          .05
Resident vs SP            0.00          .50           0.31          .02
Faculty vs SP             0.41          .02           0.15          .75

Through quantitative scores and qualitative feedback, annual evaluations of each resident in an SP program could be used to highlight improvements over the previous year and reveal new areas of focus for the upcoming year. Future programs could be strengthened by pre- and postprogram evaluations to identify components of the program that could be improved and to assess changes in participant perceptions of the SP program. Additional workshops allowing residents to repeat attempts at a single encounter or practice with multiple scenarios may also be considered.

Our study has limitations. A greater number of performance samples and raters would have given greater power to our statistical analyses (17). This could be accomplished by aggregating data from annual evaluations or by expanding the SP program to multiple institutions. Critics of SP programs question whether the SP model of a single encounter is realistic, given that physicians get many opportunities to build the physician-patient relationship in clinical practice. The modern radiation oncology practice, however, often serves as a consult service that is frequently required to deliver bad news and convey a large volume of technical information in the initial visit. The SP training model serves to improve the critical element of building quick rapport, ultimately helping patients to trust their physicians and make informed treatment decisions.

Table 3 Themes presented in the SP and resident feedback

Reflections from the patient experience

Delivery
- "Casualness of the doctor" made the SP feel insignificant.
- "Information came at me very fast and non-stop with no pauses or silences. It all happened so quickly ... bad news came out of the blue." SPs often felt the breaking of bad news occurred too quickly and did not allow time for emotional processing.
- SPs wanted a stake in the decision-making process and did not appreciate when the plan was presented but not discussed. On the other hand, some SPs also felt pressured to make a decision without the physician's guiding opinions.

Word choice
- The phrase "standard treatment" was assumed by SPs to represent average care and not the best care.
- Encouraging sentiments such as "this isn't over" and "we're going to fight this" helped the SP overcome grief and dread, whereas "unfortunately" and "I'm afraid" increased feelings of hopelessness.
- Validation of perspectives, feelings, and emotions expressed through body language made the SP feel better cared for; "You did everything right" helped reassure SPs that they had made the right choice for their child.

Reflections from the resident experience
- Case scenarios were realistic and accurately portrayed.
- Residents appreciated opportunities to receive patient feedback. Specific feedback regarding word choice, body language, and voice inflection was particularly helpful, for example: "I found the words I say can be unhelpful and too dramatic."
- The ability to practice delivering bad news was valuable because residents rarely get this opportunity.
- The experience could be improved if residents could privately review their recordings and repeat encounters to practice different wordings.

Conclusions

This paper describes the first pilot SP training program for residents in radiation oncology, to our knowledge. The results demonstrate higher faculty scores, inconsistent interrater agreement, and a need for residents and faculty physicians to better calibrate their evaluations to true patient perceptions. In addition, feedback from SPs revealed a perceived lack of empathy, an absence of shared decision making, and excessive medical jargon as major barriers to developing strong interpersonal relationships. Future SP programs can further explore the issues illuminated by this study, including confirming the findings regarding high and nondiscriminating faculty scores, identifying reasons for poor interrater agreement, and developing methods of improving faculty and resident understanding of the patient perception.

References

1. Reddy S, Vijayakumar S. Evaluating clinical skills of radiation oncology residents: Parts I and II. Int J Cancer 2000;90:1-12.
2. Stewart MA. Effective physician-patient communication and health outcomes: A review. CMAJ 1995;152:1423-1433.
3. Swing SR. The ACGME outcome project: Retrospective and prospective. Med Teach 2007;29:648-654.
4. Rider EA, Keefer CH. Communication skills competencies: Definitions and a teaching toolbox. Med Educ 2006;40:624-629.
5. Barrows HS. An overview of the uses of standardized patients for teaching and evaluating clinical skills. AAMC. Acad Med 1993;68:443-451; discussion 451-453.
6. Erby LA, Roter DL, Biesecker BB. Examination of standardized patient performance: Accuracy and consistency of six standardized patients over time. Patient Educ Couns 2011;85:194-200.
7. van Zanten M, Boulet JR, McKinley D. Using standardized patients to assess the interpersonal skills of physicians: Six years' experience with a high-stakes certification examination. Health Commun 2007;22:195-205.
8. Berger JS, Blatt B, McGrath B, et al. Relationship express: A pilot program to teach anesthesiology residents communication skills. J Grad Med Educ 2010;2:600-603.
9. Jagadeesan R, Kalyan DN, Lee P, et al. Use of a standardized patient satisfaction questionnaire to assess the quality of care provided by ophthalmology residents. Ophthalmology 2008;115:738-743.
10. Lown BA, Sasson JP, Hinrichs P. Patients as partners in radiology education: An innovative approach to teaching and assessing patient-centered communication. Acad Radiol 2008;15:425-432.
11. Chun MB, Young KG, Honda AF, et al. The development of a cultural standardized patient examination for a general surgery residency program. J Surg Educ 2012;69:650-658.
12. Yudkowsky R, Alseidi A, Cintron J. Beyond fulfilling the core competencies: An objective structured clinical examination to assess communication and interpersonal skills in a surgical residency. Curr Surg 2004;61:499-503.
13. Ende J. Feedback in clinical medical education. JAMA 1983;250:777-781.
14. Makoul G. Essential elements of communication in medical encounters: The Kalamazoo consensus statement. Acad Med 2001;76:390-393.
15. Joyce BL, Steenbergh T, Scher E. Use of the Kalamazoo Essential Elements Communication Checklist (Adapted) in an institutional interpersonal and communication skills curriculum. J Grad Med Educ 2010;2:165-169.
16. Schirmer JM, Mauksch L, Lang F, et al. Assessing communication competence: A review of current tools. Fam Med 2005;37:184-192.
17. Boulet JR, McKinley DW, Whelan GP, et al. Quality assurance methods for performance-based assessments. Adv Health Sci Educ Theory Pract 2003;8:27-47.
