The American Journal of Surgery (2015) 210, 396-400

Surgical Education

Influence of clerkship sites on National Board of Medical Examiners surgery subject examination performance

Rebecca M. Rentea, M.D.a, Brian D. Lewis, M.D.a,b,*, Amy J. Leisten, M.S.a, Robert Treat, Ph.D.c,d, Philip N. Redlich, M.D., Ph.D.a,b,c

aDepartment of Surgery, Office of Educational Services, Medical College of Wisconsin, 9200 W. Wisconsin Avenue, Milwaukee, WI 53226, USA; bClement J. Zablocki VA Medical Center, 5000 W. National Avenue, Milwaukee, WI 53295, USA; cOffice of Academic Affairs, Medical College of Wisconsin, 9200 W. Wisconsin Avenue, Milwaukee, WI 53226, USA; dOffice of Educational Services, Medical College of Wisconsin, 9200 W. Wisconsin Avenue, Milwaukee, WI 53226, USA

KEYWORDS: Surgery clerkship; National Board of Medical Examiners; Site comparability

Abstract

BACKGROUND: As one measure of comparability of student experiences on a 2-month surgery clerkship, scores on the National Board of Medical Examiners Surgery Subject Examination (NSSE) were evaluated against a number of variables.

METHODS: NSSE scores for 701 students completing the surgery clerkship over 3.5 years were analyzed. Students rotated at academic, VA, and community hospitals, with 1 month of general surgery paired with 1 month of a surgical subspecialty. The effect of 15 rotation site pairings on NSSE performance was analyzed by analysis of variance. The relationship of site-specific student evaluation variables and NSSE scores was examined by stepwise multivariate linear regression.

RESULTS: No statistical differences were demonstrated between NSSE scores and site-specific parameters of duty hours, resident participation, or type of hospital, nor between NSSE scores and paired sites constituting the overall clerkship experience.

CONCLUSION: Performance on the NSSE was not impacted by any assigned paired sites, supporting comparability of overall clerkship experiences.

© 2015 Elsevier Inc. All rights reserved.

The authors have no conflicts of interest to disclose.
Presented as a poster presentation at the Association for Surgical Education meeting, April 23-25, Orlando, Florida.
* Corresponding author. Tel.: +1-414-805-8620; fax: +1-414-805-9170. E-mail address: [email protected]
Manuscript received September 30, 2014; revised manuscript December 18, 2014
0002-9610/$ - see front matter © 2015 Elsevier Inc. All rights reserved. http://dx.doi.org/10.1016/j.amjsurg.2014.12.035

Student learning on our third-year surgical clerkship occurs in paired general or subspecialty surgery sites in academic, VA, or community hospitals as previously described.1 Teaching activities beyond learning from clinical exposure include core didactic sessions, conferences, and skills laboratories attended by all students regardless of site, as well as site-specific teaching rounds and conferences. Until recently, objective assessment of knowledge gained on our clerkship was determined, in part, by an in-house examination developed by our faculty at various sites and administered at the end of the 2-month rotation. The content of this examination was guided by the core clerkship learning objectives, directed readings, and other activities experienced by all students regardless of site. We transitioned the examination to the National Board of Medical Examiners Surgery Subject Examination (NSSE) to allow comparison of student performance with national norms and to expose students to the types and styles of questions expected on the United States Medical Licensing Examination Step 2 Clinical Knowledge. The NSSE has been widely adopted in surgical clerkships in the United States, likely for "its sound psychometric properties, national norms, and reasonable price," as stated by Lind et al.2 Although a review of multisite surgical training programs revealed few differences in examination performance between students rotating at tertiary or affiliated sites,3 significant differences in some programs continue to be reported between sites using either a faculty-developed examination or the NSSE,4–6 possibly reflecting the impact of learning environments at different sites.

We have previously reported that monitoring the educational quality of surgical clerkship sites based on student perception data is critical to achieving the clerkship's education goals.1 However, given the popularity of some sites and recognizing the continued accreditation requirement to ensure comparable educational experiences and equivalent methods of assessment across all instructional sites, we were interested in determining if students' assigned paired sites impact performance on the end-of-clerkship NSSE.

Methods

As previously described,1 students were assigned to 2 consecutive 1-month rotations at sites that were consistently paired. In general, students spend 1 month on elective general surgery sites at either a university-affiliated adult hospital, a Veterans Affairs (VA) hospital, or one of 4 community hospitals, and 1 month on subspecialty sites (vascular, pediatric, trauma, cardiothoracic, and transplant surgery as well as surgical oncology) at either the university adult hospital, the university-affiliated children's hospital, or the VA hospital. Overall, students rotate at 8 university-affiliated adult and children's hospital (academic) sites, 4 community hospital sites, and 3 VA hospital sites in 15 paired options. All but 2 sites had consistent surgery resident participation, with one site having intermittent resident participation. The structure of the clerkship has core learning activities for all students including general and operating room orientation sessions, a suture workshop, a weekly didactic series, grand rounds, morbidity and mortality conferences, and professor rounds where students present and discuss cases. Exposure to cases is monitored via an electronic log to ensure appropriate patient experiences over the duration of the clerkship. Site directors provide site-specific teaching rounds and conferences; ward, clinic, and operating room exposure; as well as the opportunity for procedures. A summary of core and site-specific activities is listed in Table 1.

Table 1  Listing of core and site-specific learning activities and requirements

Core activities/requirements for all students regardless of site:
- General orientation
- Operating room orientation
- Suture workshop at the start of the clerkship
- Weekly didactic lecture series
- Weekly professor rounds with case presentations
- Grand rounds
- M&M conference
- Case log documentation from required list of diagnoses
- Completion of required number of H&Ps

Site-specific activities:
- Teaching rounds
- Teaching in the operating room
- Exposure to site-specific cases
- Opportunity for procedures
- Attendance at clinic

H&P = history and physical; M&M = morbidity and mortality.

As oversight, our Student Education Committee, chaired by the clerkship director, reviews student evaluation data from sites every 2 to 4 months to ensure an optimal educational experience over the 2-month clerkship and the appropriateness of the site pairings. The committee provides feedback to the site directors with their data compared with those of other sites after every rotation. All students complete an objective structured clinical examination (OSCE) near the end of the clerkship. The NSSE is administered to students on the final day of the clerkship. The NSSE contributes 35% to the final grade, with other components contributing as follows: 50% clinical performance, 5% formal presentation, 5% history and physical documentation, and 5% OSCE scores. Students are required to complete a site evaluation survey after each month and an overall evaluation at the end of the 2-month clerkship rotation. In addition to the questions previously described,1 students were asked to track and report their duty hours. After institutional review board approval, student site evaluation surveys and scores on their NSSEs were analyzed from all third-year medical students from 2008 to 2012. Students assigned to infrequently occurring pairs represent less than 5% of the data and were not included in this analysis. Students are required to complete all postclerkship surveys in order for their grades to be released.

Table 2  Difference between the highest and the lowest NSSE mean scores of first rotation site

First rotation site        n    Rank    Mean (SD)    Δ      P value
Academic hospital site*    42   1       74.0 (8.2)   3.4    .056
Community hospital†        45   15      70.6 (8.2)

NBME = National Board of Medical Examiners; NSSE = NBME Surgery Subject Examination; SD = standard deviation.
*One of 8 sites. †One of 4 sites.

Table 3  Difference between the highest and the lowest NSSE mean scores of second rotation site

Second rotation site       n    Rank    Mean (SD)     Δ      P value
Academic hospital site*    30   1       75.0 (10.9)   4.8    .017
Community hospital         52   15      70.2 (7.0)

NBME = National Board of Medical Examiners; NSSE = NBME Surgery Subject Examination; SD = standard deviation.
*Site is different than in Table 2.

The relationship between NSSE scores and clerkship survey variables across all sites was examined by bivariate Pearson correlations and stepwise multivariate linear regression analysis, using NSSE scores or overall educational value as outcome variables. NSSE scores were analyzed over 4 academic years (2008 to 2012) by univariate analysis of variance to determine the effect of clerkship rotational sites (individual and paired) on student performance. Statistical analyses were performed with SPSS 15.0 (SPSS, Inc, Chicago, IL). A P value less than .050 was considered statistically significant.
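To illustrate the analytic approach described above, the following minimal sketch (in Python, not the SPSS workflow actually used in the study) computes a bivariate Pearson correlation and a univariate one-way analysis of variance across paired sites; the file name and column names (nsse_score, edu_value, site_pair) are hypothetical placeholders, not variables from the study dataset.

```python
# Illustrative sketch only, not the authors' SPSS analysis.
# Assumes a hypothetical per-student table with columns:
#   nsse_score  - NSSE score
#   edu_value   - overall educational value rating from the site survey
#   site_pair   - one of the 15 paired-site assignments
import pandas as pd
from scipy import stats

df = pd.read_csv("clerkship_scores.csv")  # hypothetical file

# Bivariate Pearson correlation between NSSE score and a survey variable.
r, p_corr = stats.pearsonr(df["nsse_score"], df["edu_value"])
print(f"Pearson r = {r:.3f}, P = {p_corr:.3f}")

# Univariate (one-way) ANOVA of NSSE scores across the paired sites.
groups = [g["nsse_score"].to_numpy() for _, g in df.groupby("site_pair")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA across paired sites: F = {f_stat:.2f}, P = {p_anova:.3f}")
```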

Results

During the 3.5-year study period, 701 students were enrolled in the required third-year clerkship. We initiated our study by analyzing site-specific data for each month, followed by combined site data representing the entire 2-month clerkship experience. Each rotation's mean NSSE scores were compared separately for those sites that occurred first and those that occurred second in order to identify any differences between high- and low-performing sites and to provide insight into subsequent analyses of paired sites. A difference in mean NSSE scores of 3.4 was found between the first sites with the highest and lowest scores (Table 2). This difference did not reach statistical significance (P = .056). However, for the second site on which students rotated (Table 3), a difference of 4.8 was found between the highest and the lowest mean NSSE scores, which reached statistical significance (P = .017).

Table 4  Difference between the highest and the lowest NSSE mean scores of first site aggregated by type

First rotation site    n     Mean (SD)     Δmax    P value
Community              233   72.8 (8.9)    1.6     .146
Academic               340   72.8 (8.2)
VA                     128   71.2 (7.7)

NBME = National Board of Medical Examiners; NSSE = NBME Surgery Subject Examination; SD = standard deviation; VA = Veterans Affairs.

Table 5  Difference between the highest and the lowest NSSE mean scores of second site aggregated by type

Second rotation site   n     Mean (SD)     Δmax    P value
Community              232   72.5 (7.7)    .4      .935
Academic               339   72.4 (8.8)
VA                     130   72.8 (8.6)

NBME = National Board of Medical Examiners; NSSE = NBME Surgery Subject Examination; SD = standard deviation; VA = Veterans Affairs.

When sites were aggregated as representing academic, community, or VA sites for the first and second rotations, no statistical significance was noted in the NSSE scores (Tables 4 and 5). In addition, no statistically significant difference in NSSE scores was noted between sites that did or did not have assigned surgical residents. Also, there was no statistically significant association found between students' site evaluations and NSSE scores via Pearson correlations, either across all sites or aggregated by academic, community, or VA sites.

Our previous work by Redlich et al1 demonstrated that there were a number of predictors of the overall educational value of a clerkship site. We reanalyzed our current data for predictors of overall educational value of a site, including the new variables of NSSE scores and student self-reported duty hours. Linear regression analysis revealed a statistically significant relationship between overall site educational value and 5 variables (R2 = .90, P = .001). In Table 6, the strength of the predictors is reported as beta coefficients in descending order. Although there was no statistically significant bivariate association between educational value and NSSE score (r = .115, P = .385), NSSE score emerged as a weak but statistically significant predictor of overall educational value in the multivariate model. There was no statistically significant relationship between duty hours and overall educational value across all sites.

Table 6  Multivariate linear regression of overall educational value for all sites on clerkship variables

                                                              Regression coefficients*
Item                                                          Beta    P value
Opportunities for direct patient care                         .476    .001
Educational value of instruction with attending on rounds     .329    .001
Hours/week direct instructional contact with attending        .167    .013
Quality of teaching by house staff                            .164    .045
NBME Surgery Subject Examination score                        .120    .011

NBME = National Board of Medical Examiners.
*Overall regression model goodness-of-fit: R2 = .90 (P = .001).
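The standardized beta coefficients in Table 6 are of the kind produced by fitting an ordinary least-squares model on z-scored variables. The hedged sketch below shows one way such coefficients could be computed; it reuses the hypothetical data layout from the earlier sketch, invents placeholder predictor names for the survey items, and fits only the 5 retained predictors rather than reproducing the stepwise selection performed in SPSS.

```python
# Illustrative sketch only; column names are hypothetical placeholders,
# not the study's actual survey item names, and no stepwise selection is done.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("clerkship_scores.csv")  # hypothetical file

predictors = [
    "direct_patient_care",      # opportunities for direct patient care
    "attending_rounds_value",   # educational value of attending instruction on rounds
    "attending_contact_hours",  # hours/week of direct instructional contact with attending
    "housestaff_teaching",      # quality of teaching by house staff
    "nsse_score",               # NBME Surgery Subject Examination score
]
cols = predictors + ["edu_value"]

# z-score the outcome and predictors so the fitted coefficients are
# standardized betas, comparable in form to those reported in Table 6.
z = (df[cols] - df[cols].mean()) / df[cols].std()

model = sm.OLS(z["edu_value"], sm.add_constant(z[predictors])).fit()
print(model.params)    # standardized beta for each predictor
print(model.pvalues)   # P value for each predictor
print(model.rsquared)  # overall model R^2
```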


Figure 1 Plot of NSSE mean scores across site pairs. Only those pairings with 10 or more students are shown. The number beside each site indicates a unique site among similar types of sites. No differences reached statistical significance.

Further analysis by aggregate sites revealed additional variables predicting educational value: instruction from attendings in the operating room and constructive feedback on histories and physicals at the adult academic site, opportunities for procedures at the community sites, and duty hours at the VA sites.

Following our regression model of site-specific data, we analyzed paired site data and NSSE scores using analysis of variance. Mean scores from all paired sites over the study period trended slightly higher than 70 and are shown in Fig. 1 for those pairings with 10 or more students. However, no statistically significant differences in NSSE mean scores were found across the 15 paired sites.

Comments

Multiple sites for surgical clerkships are often used given increasing class sizes; however, adherence to Liaison Committee on Medical Education standards of comparable experiences and equivalent assessment methods must still be maintained. An important tool for assessing students' knowledge gained from clerkship experiences is the NSSE. Because rotations vary in popularity based on feedback and hands-on experience, differences in performance on the NSSE could reflect site-specific rotation experiences. Although we found some differences in NSSE scores and/or student educational value for individual sites, no differences were noted in NSSE scores for the paired sites.

Analysis of our data confirms the continued contribution of multiple variables to the students' perceived overall educational value of the surgery clerkship.1 Opportunities for patient care, instruction by attendings, and quality of house staff teaching maintained their importance. In the era of duty hour compliance, data collected allowed for an additional analysis of the impact of duty hours on the educational value of those sites.

There is controversy regarding the impact of time spent on self-study vs clinical rotation activities on NSSE scores. Recent data reported from an academic institution with university and community clerkship sites showed improved NSSE scores on services with heavier clinical loads and less study time before a reorganization of the clerkship.5 However, the number of students in that study was much smaller than in this study, and their workload data were determined by extrapolation from student reports of working greater than 80 hours/week. Following reorganization, NSSE scores were similar across sites, showing the importance of clerkship oversight to ensure comparable experiences for students. Interestingly, an earlier report from the same institution showed no correlation of patient load and clinical task load with final clerkship grades, leading the authors to suggest that study and lecture time may better prepare students for the NSSE than performing clinical tasks.7 In our study, we found no relationship of duty hours with NSSE scores across all sites.

One of the aspects of our clerkship is the pairing of sites that include university-affiliated adult and children's hospital sites, VA hospital sites, and multiple community hospital sites. Evaluation of NSSE performance by type of site has been studied by others. Williams et al8 reported no difference in OSCE or NSSE performance among academic, community, or combined sites at a Midwestern medical school. In a literature review of multisite surgical teaching programs, few differences were noted in NSSE performance when comparing students rotating at tertiary care or affiliated community hospitals.3 Similarly, we found no correlation of NSSE performance with any specific type of site, supporting the findings of prior studies.

There are several limitations to our study. Although the number of students is large at 701 over multiple years, our study is limited to a single institution, with a unique structure of clerkship educational activities and consistent pairings of sites. Additionally, all sites are located in the same metropolitan area, allowing for a core set of activities that can be attended by all students. Such a structure may not be feasible at other schools. We did not examine other factors that could affect student performance such as timing of clerkships during the academic year, differences in case log documentation by students at various sites, or attendance at core lectures and conferences. Also, our study did not compare outcomes of students on general vs subspecialty services because our students rotate on general and subspecialty services that are paired together to constitute the clerkship experience.

In summary, no differences were found in overall NSSE performance for multiple paired clinical sites, supporting the importance of comparable experiences for our students. A committee dedicated to oversight of the medical student clinical experiences at various sites is important to ensure that site pairings remain appropriate over time. Furthermore, as curricular reform is undertaken in medical schools, changes to the surgical clerkship are inevitable, and efforts to maintain ongoing quality oversight of the clinical experience, including outcomes, cannot be overemphasized. With active oversight of the clerkship, site pairings can be designed to mitigate any site-specific differences to ensure an optimal overall experience that complies with Liaison Committee on Medical Education accreditation standards.


Acknowledgments

We thank Dawn S. Bragg, PhD, for her critical review of the manuscript.

References

1. Redlich PN, Milkowski T, Bragg D, et al. Multiple variables influence the educational value of surgical clerkship sites. Am J Surg 2006;191:178–82.
2. Lind DS, Deladisma AM, Cue JI, et al. Survey of student education in surgery. J Am Coll Surg 2007;204:969–76.
3. Ng VK, McKay A. Challenges of multisite surgical teaching programs: a review of surgery clerkship. J Surg Educ 2010;67:1–8.
4. Bradley III EL, Littles AB, Romrell LJ. The surgical clerkship: a contemporary paradigm. J Surg Res 2012;177:14–20.
5. Myers JA, Vigneswaran Y, Gabryszak B, et al. NBME subject examination in surgery scores correlate with surgery clerkship clinical experience. J Surg Educ 2014;71:205–10.
6. Tatum RP, Jensen A, Langdale LA. Expanding surgical clerkships to remote community sites: the success of the Washington, Wyoming, Alaska, Montana, and Idaho experience. Am J Surg 2009;198:436–41.
7. Libbin JB, Hauge LS, Myers JA, et al. Evaluation of student experience and performance in a surgical clerkship. Am Surg 2003;69:280–6.
8. Williams M, Ambrose M, Carlin AM, et al. Evaluation of academic and community surgery clerkships at a Midwestern medical school. J Surg Res 2004;116:11–3.
