Published Ahead of Print on May 1, 2015 as 10.1212/WNL.0000000000001641

CONTEMPORARY ISSUES: INNOVATIONS IN EDUCATION

Milestone-compatible neurology resident assessments: A role for observable practice activities

Lyell K. Jones Jr., MD; Elliot L. Dimberg, MD; Christopher J. Boes, MD; Scott D.Z. Eggers, MD; David W. Dodick, MD; Jeremy K. Cutsforth-Gregory, MD; Andrea N. Leep Hunderfund, MD; David J. Capobianco, MD

Correspondence to Dr. Jones: [email protected]

From the Department of Neurology (L.K.J., C.J.B., S.D.Z.E., J.K.C.-G., A.N.L.H.), Mayo Clinic, Rochester, MN; the Department of Neurology (E.L.D., D.J.C.), Mayo Clinic, Jacksonville, FL; and the Department of Neurology (D.W.D.), Mayo Clinic, Scottsdale, AZ.

Supplemental data at Neurology.org. Go to Neurology.org for full disclosures. Funding information and disclosures deemed relevant by the authors, if any, are provided at the end of the article.

ABSTRACT

Objective: Beginning in 2014, US neurology residency programs were required to report each trainee’s educational progression within 29 neurology Milestone competency domains. Trainee assessment systems will need to be adapted to inform these requirements. The primary aims of this study were to validate neurology resident assessment content using observable practice activities (OPAs) and to develop assessment formats easily translated to the Neurology Milestones.

Methods: A modified Delphi technique was used to establish consensus perceptions of importance of 73 neurology OPAs among neurology educators and trainees at 3 neurology residency programs. A content validity score (CVS) was derived for each neurology OPA, with scores ≥4.0 determined in advance to indicate sufficient content validity.

Results: The mean CVS for all OPAs was 4.4 (range 3.5–5.0). Fifty-seven (78%) OPAs had a CVS ≥4.0, leaving 16 (22%) below the pre-established threshold for content validity. Trainees assigned a higher importance to individual OPAs (mean CVS 4.6) compared to faculty (mean 4.4, p = 0.016), but the effect size was small (η² = 0.10). There was no demonstrated effect of length of education experience on perceived importance of neurology OPAs (p = 0.23). Two sample resident assessment formats were developed, one using neurology OPAs alone and another using a combination of neurology OPAs and the Neurology Milestones.

Conclusions: This study provides neurology training programs with content validity evidence for items to include in resident assessments, and sample assessment formats that directly translate to the Neurology Milestones. Length of education experience has little effect on perceptions of neurology OPA importance.

Neurology® 2015;84:1–5

GLOSSARY
ACGME = Accreditation Council for Graduate Medical Education; CVS = content validity scores; GME = graduate medical education; OPA = observable practice activity.


In 1999, the Accreditation Council for Graduate Medical Education (ACGME) and American Board of Medical Specialties introduced the 6 core domains of competence of clinical medicine: patient care, medical knowledge, practice-based learning and improvement, interpersonal skills and communication, professionalism, and systems-based practice.1 To address the challenges associated with translating these broad competencies into more specific clinical measures, the ACGME launched the Milestone Project in 2009, charging specialties to develop specific educational accomplishments required to establish clinical competency in each of the 6 domains.2 Beginning in 2014, ACGME-accredited neurology residency programs were required to evaluate each trainee semiannually according to the Neurology Milestones. The Neurology Milestones identify multiple subcompetencies within each of the 6 core competencies of medicine (for example, Neurologic Exam is a subcompetency of the patient care domain).3 Each of the 29 subcompetencies includes a series of Levels (1–5), representing the continuum of knowledge, skills, and attitudes to be acquired during neurology residency training. Each Level includes one or more developmental accomplishments, or Milestone anchor statements (for example, Level 3 of the Neurologic Exam Milestone requires trainees to be able to visualize papilledema). Although it is not currently a requirement for graduation, neurology trainees are in general expected to achieve Level 4 in each subcompetency to be considered prepared for independent clinical practice.3

While the Neurology Milestones provide specific neurology trainee educational goals, they are not well informed by the Likert-style, peer-normed performance assessments historically used in most graduate medical education (GME) programs. Furthermore, the Milestones were not designed to be direct assessment tools themselves. As a result, neurology residency programs need to ensure their assessment systems can accurately inform the Milestones for each trainee's performance.

To adapt assessment systems to a Milestone-compatible format, GME programs may opt to use assessment tools that translate observed trainee performance to the Milestone subcompetencies, levels, and individual anchor statements.4 One such tool is the observable practice activity (OPA). OPAs are defined as those individual, observable practices with which trainees are progressively entrusted during training.5 For example, a possible neurology OPA would be "Diagnoses and manages common headache syndromes." OPAs are flexible and versatile assessment constructs, can assess progress toward mastery of one or more competencies or subcompetencies, and have been used in other specialties' training programs.5 There are no published or validated neurology OPAs or other Milestone-compatible assessment tools.

The primary objectives of this study were to (1) validate neurology resident assessment content in the form of neurology OPAs and (2) develop an assessment format more easily translated to the Neurology Milestones. Secondary aims were to compare perceptions of importance of individual neurology OPAs between faculty and trainees and to assess any effect of teaching longevity on perceived OPA importance.

METHODS Development of neurology OPAs. Three authors (L.K.J., E.L.D., and D.W.D.) developed a list of neurology OPAs after review of the following sources: (1) trainee evaluation requirements outlined by the ACGME for neurology residency programs6 and all accredited programs,7 (2) ACGME-defined neurology educational milestones,3 (3) available literature on OPAs and similar assessment tools for other specialties,5,8–10 and (4) personal experience in neurology residency program curriculum development. OPAs were chosen as the assessment instrument because (1) they are versatile enough to provide adequate content diversity to trainee assessments in all areas important for independent clinical practice5 and (2) Milestones were not developed to be used as direct assessment tools and were not intended to cover "the entirety of the dimensions of the 6 domains of physician competency."3 A list of 72 neurology OPAs was developed. This study was reviewed and approved by the Mayo Clinic Institutional Review Board.
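It may help to picture the Milestone hierarchy summarized in the introduction as a small data model: subcompetencies grouped under the 6 core domains, each with Levels 1–5 built from anchor statements. The sketch below is a hypothetical illustration only; the class names and all anchor wording except the quoted papilledema example are our assumptions, not part of the ACGME documents.

```python
# Hedged sketch of the Milestone structure described in the introduction:
# 29 subcompetencies, each with Levels 1-5 built from anchor statements.
from dataclasses import dataclass

@dataclass
class Subcompetency:
    name: str
    domain: str                   # one of the 6 ACGME core competencies
    levels: dict[int, list[str]]  # Level -> anchor statements

neuro_exam = Subcompetency(
    name="Neurologic Exam",
    domain="Patient care",
    levels={
        1: ["Performs a basic screening neurologic exam"],  # illustrative
        3: ["Able to visualize papilledema"],               # quoted from text
        4: ["Performs a complete exam independently"],      # illustrative
    },
)

# Level 4 in each subcompetency is the general target for independent
# practice; semiannual reporting assigns each trainee a current Level.
READY_FOR_PRACTICE_LEVEL = 4
current_level = 3
print(current_level >= READY_FOR_PRACTICE_LEVEL)  # False: not yet at target
```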

Establishing neurology OPA content validity. A modified Delphi technique11,12 was used to establish consensus on the importance of each neurology OPA. The Delphi technique is a structured communication and validation tool that uses quantitated, anonymous, expert opinion assessed in rounds. The consensus opinion is made available to participants between rounds so they may revise their responses, with inter-round convergence reflecting success of the process. This Delphi method was applied via Web-based survey in 2 rounds.

Survey recipients included neurology faculty and trainees from 3 ACGME-accredited adult neurology residency programs, reflecting diversity of department size and geographic location: Mayo Clinic Rochester (faculty size >100, 9 neurology residents per year, small Midwestern city), Mayo Clinic Florida (faculty size 25–30, 4 neurology residents per year, large Southeastern city), and Mayo Clinic Arizona (faculty size 25–30, 3 neurology residents per year, large Southwestern city). Faculty surveyed were all active neurology resident educators (defined as working with trainees ranging from weekly to daily on average), were selected from a variety of subspecialty interests and backgrounds including general neurology, and ranged from junior to senior in education experience. The trainees who took the survey were neurology postgraduate year 4 residents or clinical neurology fellows who had recently graduated from one of the 3 residency programs.

The Delphi method was applied over 2 rounds using identical survey questions. The survey instrument instructed all recipients to rate each neurology OPA on a continuous importance scale: 1, completely unimportant; 2, not very important; 3, neutral; 4, important; 5, must be achieved. Survey recipients were given opportunities in both rounds to suggest additional assessment items not included in the survey. After round 1, mean importance scores for each item were calculated and included adjacent to each survey item in round 2. All survey responses were anonymous during the deliberation period. Round 2 was completed approximately 3 weeks after round 1. Content validity scores (CVS) were derived directly from the mean round 2 score for each item, with a score ≥4.0 determined in advance to indicate sufficient content validity.

Designing neurology trainee assessments. Following analysis of the assessment content validation, sample neurology residency assessment forms were designed to reflect practical implementation of the resulting content items. Considerations in the design of the sample assessment formats included (1) each item's CVS, (2) the corresponding subcompetency to which each item can be mapped, and (3) specific ACGME reporting requirements.

Analysis. Descriptive statistics were applied to the survey data. Individual and mean variance for survey items was calculated after round 1 and round 2 to determine whether variance decreased, as is anticipated in the Delphi technique. A power estimate was completed prior to the study: for an anticipated difference in CVS of 0.2, a sample size of 30 survey respondents, setting α at 0.05 and β at 0.80, would be adequate to reject or accept the null hypothesis that career stage does not affect perceived OPA importance (power estimate = 99%). Effect sizes were calculated with partial η². Data were visually inspected for normality, and the Wilcoxon test or linear regression was used for group comparisons as appropriate. All tests were 2-sided with p < 0.05 determined in advance to represent significance.
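As a concrete illustration of the scoring arithmetic just described (mean round-2 ratings as the CVS, a ≥4.0 cutoff, and falling inter-round variance as the convergence check), here is a minimal sketch. The OPA names and ratings are hypothetical; this is not the study's analysis code.

```python
# Minimal sketch of the Delphi scoring arithmetic described above.
# Hypothetical data: each OPA maps to its list of 1-5 importance ratings.
from statistics import mean, pvariance

CVS_THRESHOLD = 4.0  # pre-specified cutoff for sufficient content validity

round1 = {
    "Diagnoses and manages common headache syndromes": [5, 4, 4, 3, 5],
    "Recognizes normal variants on EEG": [3, 4, 2, 4, 3],
}
round2 = {
    "Diagnoses and manages common headache syndromes": [5, 4, 4, 4, 5],
    "Recognizes normal variants on EEG": [4, 4, 3, 4, 3],
}

# CVS is derived directly from the mean round-2 rating for each item.
cvs = {opa: mean(ratings) for opa, ratings in round2.items()}

# Convergence check: mean per-item variance should fall between rounds.
var1 = mean(pvariance(r) for r in round1.values())
var2 = mean(pvariance(r) for r in round2.values())

for opa, score in cvs.items():
    flag = "meets" if score >= CVS_THRESHOLD else "below"
    print(f"{score:.1f} ({flag} threshold): {opa}")
print(f"Mean variance round 1 -> round 2: {var1:.2f} -> {var2:.2f}")
```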

RESULTS Neurology OPAs. A total of 72 neurology OPAs were developed and included in the Delphi validation process (table e-1 on the Neurology® Web site at Neurology.org). Most items were derived from the Neurology Milestones. One additional item was suggested during round 1 of the survey ("Effectively manage transitions of care, including use of a structured communication tool") and was presented to all subsequent survey recipients in rounds 1 and 2, for a total of 73 neurology OPAs in the final analysis.

Neurology OPA content validity. A total of 44 faculty and trainees (36 faculty and 8 trainees) received survey requests in rounds 1 and 2. Participant characteristics and summary CVS are listed in table 1. Forty recipients (91%) responded to at least one survey round and 25 (57%) participated in both rounds. Mean variance decreased between rounds 1 and 2 (0.39 to 0.15, p < 0.0001), signifying that consensus had been established by the Delphi technique. Individual CVS for each neurology OPA are listed in table e-1. Fifty-seven (78%) OPAs had a CVS ≥4.0, leaving 16 (22%) below the pre-established threshold for content validity.

Effects of trainee status or educational longevity on OPA importance. Trainees assigned a higher importance to individual neurology OPAs (mean CVS 4.6) compared to faculty (mean 4.4, p = 0.016), but the effect size was small (η² = 0.10). Among the faculty participants, there was no effect of length of education experience on perceived importance of neurology OPAs (p = 0.23).

Table 1. Participant characteristics and neurology OPA content validity scores

Total Delphi process participants                    Values
Round 1, n (%)                                       34/44 (77)
  Faculty                                            27/34 (79)
  Trainees                                           7/34 (21)
Round 2, n (%)                                       31/44 (70)
  Faculty                                            25/31 (81)
  Trainees                                           6/31 (19)
Mean faculty education experience, y (range)         11.7 (1–33)
Mean CVS variance
  Round 1                                            0.39
  Round 2                                            0.15
  Variance change                                    −0.24 (p < 0.0001)
Mean round 2 CVS (range)                             4.4 (3.5–5.0)
Neurology OPAs with round 2 CVS ≥4.0, n (%)          57/73 (78)

Abbreviations: CVS = content validity score; OPA = observable practice activity.
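For readers who want to reproduce the style of group comparison reported above (a 2-sided Wilcoxon test with p < 0.05 and an effect size), the sketch below uses simulated scores, not the study data. The rank-based η²-style effect size shown is a common choice but is our assumption; the paper reports partial η² from its own models.

```python
# Sketch of a faculty-versus-trainee comparison; hypothetical data only.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
faculty_scores = rng.normal(4.4, 0.3, size=73).clip(1, 5)  # mean CVS ~4.4
trainee_scores = rng.normal(4.6, 0.3, size=73).clip(1, 5)  # mean CVS ~4.6

# Two-sided Wilcoxon rank-sum (Mann-Whitney U) test for the two groups.
stat, p = mannwhitneyu(faculty_scores, trainee_scores, alternative="two-sided")

# Rank-based eta-squared-style effect size from the normal approximation
# of U (an assumption; the study computed partial eta-squared).
n1, n2 = len(faculty_scores), len(trainee_scores)
z = (stat - n1 * n2 / 2) / np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
eta_squared = z**2 / (n1 + n2)

print(f"p = {p:.3f}, eta^2 ~= {eta_squared:.2f}")
```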

Using neurology OPAs to develop trainee assessment tools. After reviewing the results of the analysis, 2 strategies were developed to design neurology assessment templates. The first involved using OPAs alone as the core of the assessment template (listed in table e-1), with trainee performance on each OPA scored on an entrustment scale (figure).5 This approach then requires linking (or mapping) each OPA to the appropriate competency, subcompetency, and individual Milestones for reporting to the ACGME. The second strategy consisted of using OPAs to complement questions derived from individual Milestone subcompetencies (table e-2). This approach resulted in a hybrid format with 2 question types: (1) questions based on the Milestone subcompetencies, rated on a scale that translates directly to the appropriate subcompetency Level, and (2) questions based on the developed list of neurology OPAs, graded on the same entrustment scale used in the first approach. Items were then similarly mapped to the appropriate neurology competency, subcompetency, and Milestone. The validity of the assessment content was considered in the development of both types of assessment forms, with lower-scoring items included only when required for reporting to the ACGME.

Figure. Assessment scale for observable practice activities. Sample neurology trainee assessment form question based on an observable practice activity scored on an entrustment scale.
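One way to picture the mapping step described above is as a small data structure linking each OPA, its entrustment ratings, and the Milestone subcompetencies it informs. The sketch below is a hypothetical illustration, not the authors' instrument; the entrustment anchors and the subcompetency labels (other than Neurologic Exam, which the article names) are assumptions.

```python
# Illustrative sketch: representing OPA assessment items and their
# mapping to Milestone subcompetencies for ACGME reporting.
from dataclasses import dataclass, field

ENTRUSTMENT_SCALE = {  # hypothetical 5-level entrustment anchors
    1: "Observe only",
    2: "Perform with direct supervision",
    3: "Perform with indirect supervision",
    4: "Perform independently",
    5: "Supervise others",
}

@dataclass
class OPAItem:
    text: str
    subcompetencies: list[str]  # Milestone subcompetencies it informs
    ratings: list[int] = field(default_factory=list)

    def mean_entrustment(self) -> float:
        return sum(self.ratings) / len(self.ratings)

items = [
    OPAItem("Diagnoses and manages common headache syndromes",
            ["PC: Diagnostic Evaluation", "MK: Clinical Knowledge"]),
    OPAItem("Accurately perform a neurologic exam on the comatose patient",
            ["PC: Neurologic Exam"]),
]

items[0].ratings += [4, 5, 4]  # entrustment scores from completed forms

# Roll OPA-level scores up to each subcompetency for semiannual reporting.
report: dict[str, list[float]] = {}
for item in items:
    if item.ratings:
        for sub in item.subcompetencies:
            report.setdefault(sub, []).append(item.mean_entrustment())
print(report)
```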

June 2, 2015

ª 2015 American Academy of Neurology. Unauthorized reproduction of this article is prohibited.

3

Figure

Assessment scale for observable practice activities

Sample neurology trainee assessment form question based on an observable practice activity scored on an entrustment scale.

While the significant number of clinical neurophysiology OPAs with low CVS could indicate a general perception that these skills are less important, it is possible that their lower importance scores reflected the comparatively limited number of settings in which the skills are applicable. It is important to note that none of the OPAs studied here received scores indicating they are perceived as unimportant (i.e., scores of 1 or 2 on the survey instrument). In fact, there are many clinical contexts or practice settings in which items that received lower importance scores in this study could be very important for a trainee to learn. We do not intend to suggest that lower-scoring items should not be taught or assessed during neurology training, but rather that programs may decide to assess higher-scoring items more frequently.

A program choosing to use an OPA-based assessment format would not likely assign all 73 items to every educational experience (which would lead to unnecessarily long or irrelevant assessment forms). For example, a program might assess the "Accurately perform a neurologic exam on the comatose patient" OPA during a neurocritical care rotation, but not a neuromuscular clinic rotation. This tailoring approach allows programs to select how frequently they assess different skills, and we anticipate that programs may wish to assess items with higher importance scores more frequently or thoroughly than items with lower scores.

Little is known about differences between trainees and practicing physicians in perceived importance of clinical competencies. While trainees in this study rated neurology OPAs with higher importance scores than did neurology faculty, the difference was small. Furthermore, faculty tended to score items similarly despite differences in longevity of teaching experience. These findings suggest that perceived importance of clinical competencies (presented here in the form of OPAs) remains largely unchanged over the continuum of clinical neurologic experience.

The neurology assessment formats presented here are versatile enough to be applied across a variety of neurology programs and readily allow incorporation of individual program features or internal requirements. It is important to emphasize that the list of OPAs presented here, while complete in terms of the Neurology Milestones, is not a comprehensive list of all characteristics desired in a graduate of neurology residency. While neurology programs may use these OPAs and suggested assessment formats to comply with ACGME reporting requirements, program directors should not neglect assessment of other experiences important in neurology residency training but not covered by the Milestones (for example, individual program research requirements or neuropathology rotations).

Many of the OPAs included in this study could themselves be composed of subordinate observable practice activities, which we refer to as sub-OPAs. For example, the "Perform and interpret an electrodiagnostic evaluation (NCS/EMG) and write a report" OPA could be broken down into several smaller sub-OPAs (some of which, such as "Describe common pitfalls of NCS/EMG," were included in this study). The flexibility of the OPA format allows individual programs to design and tailor their own sub-OPAs, also scored on an entrustment scale, for separate neurology rotations or other educational experiences. A similar approach could also be used for neurology fellowship trainee assessments, many of which will also be required to report progression along their own Milestones.
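To illustrate the sub-OPA decomposition just described, the sketch below breaks the electrodiagnostic OPA into rotation-level sub-OPAs scored on the same entrustment scale. Except for the quoted pitfalls item, the sub-OPA wording, rotation name, and readiness rule are hypothetical assumptions.

```python
# Hedged sketch: decomposing one OPA into rotation-specific sub-OPAs,
# each scored on the same entrustment scale.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SubOPA:
    text: str
    rotation: str                      # experience where it is assessed
    entrustment: Optional[int] = None  # 1-5, None until observed

# Sub-OPAs under the parent OPA "Perform and interpret an
# electrodiagnostic evaluation (NCS/EMG) and write a report".
emg_sub_opas = [
    SubOPA("Describe common pitfalls of NCS/EMG", "EMG laboratory"),
    SubOPA("Plan an NCS/EMG study for a focal neuropathy", "EMG laboratory"),
    SubOPA("Write a structured NCS/EMG report", "EMG laboratory"),
]

emg_sub_opas[0].entrustment = 4  # scored during the rotation

# Illustrative rule: parent OPA entrustable once every sub-OPA is
# observed at level 4 or higher.
ready = all(s.entrustment is not None and s.entrustment >= 4
            for s in emg_sub_opas)
print("Parent OPA entrustable:", ready)
```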

Limitations of this study include its relatively small sample size. Future attempts to replicate or expand on these findings would benefit from surveys of a larger number of respondents across a larger number of institutions and programs. There are also limitations inherent to the technique we used in this study to collect content validity evidence. The Delphi method, while commonly used to determine content validity in educational settings,8,10 fundamentally reflects expert consensus opinion rather than analysis of empirical educational outcomes. While participating faculty included a diverse array of subspecialists and generalists, the possibility of bias related to participant background cannot be excluded.

While accurate and meaningful trainee feedback is important in GME, it is worthwhile to recall that the fundamental mission and public trust of neurology training programs is to prepare residents for independent neurologic practice.13 In an era of GME funding shortfalls,14 it is our hope that the assessment content and formats provided here will be helpful to neurology programs and will allow increasingly scarce GME resources to be directed to other aspects of trainee education.

AUTHOR CONTRIBUTIONS Lyell K. Jones Jr: design and conceptualization of the study, analysis and interpretation of data, drafting and revising the manuscript. Elliot L. Dimberg: design and conceptualization of the study, analysis and interpretation of data, drafting and revising the manuscript. Christopher J. Boes: design and conceptualization of the study, drafting and revising the manuscript. Scott D.Z. Eggers: design and conceptualization of the study, drafting and revising the manuscript. David W. Dodick: design and conceptualization of the study, drafting and revising the manuscript. Jeremy K. Cutsforth-Gregory: drafting and revising the manuscript. Andrea N. Leep Hunderfund: drafting and revising the manuscript. David J. Capobianco: design and conceptualization of the study, drafting and revising the manuscript.

STUDY FUNDING No targeted funding reported.

DISCLOSURE L. Jones Jr, E. Dimberg, C. Boes, S. Eggers, D. Dodick, and J. Cutsforth-Gregory report no disclosures relevant to the manuscript. A. Leep Hunderfund has contractual rights to receive royalties from the sale of "Real EMG" online modules. D. Capobianco reports no disclosures relevant to the manuscript. Go to Neurology.org for full disclosures.

Received October 16, 2014. Accepted in final form December 29, 2014.

REFERENCES
1. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach 2007;29:648–654.
2. Caverzagie KJ, Iobst WF, Aagaard EM, et al. The internal medicine reporting milestones and the next accreditation system. Ann Intern Med 2013;158:557–559.
3. The Neurology Milestone Project [online]. Available at: http://acgme.org/acgmeweb/Portals/0/PDFs/Milestones/NeurologyMilestones.pdf. Accessed September 3, 2014.
4. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med 2007;82:542–547.
5. Warm EJ, Mathis BR, Held JD, et al. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med 2014;29:1177–1182.
6. ACGME Program Requirements for Graduate Medical Education in Neurology [online]. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/180_neurology_07012014.pdf. Accessed September 3, 2014.
7. ACGME Common Program Requirements [online]. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/CPRs2013.pdf. Accessed September 3, 2014.
8. Hauer KE, Kohlwes J, Cornett P, et al. Identifying entrustable professional activities in internal medicine training. J Grad Med Educ 2013;5:54–59.
9. Hicks PJ, Schumacher DJ, Benson BJ, et al. The pediatrics milestones: conceptual framework, guiding principles, and approach to development. J Grad Med Educ 2010;2:410–418.
10. Shaughnessy AF, Sparks J, Cohen-Osher M, Goodell KH, Sawin GL, Gravel J Jr. Entrustable professional activities in family medicine. J Grad Med Educ 2013;5:112–118.
11. Colaric S, Hatcher T. The Web-based Delphi Research Technique as a Method for Content Validation in HRD and Adult Education Research [online]. Available at: http://files.eric.ed.gov/fulltext/ED492146.pdf. Accessed September 3, 2014.
12. Landeta J. Current validity of the Delphi method in social sciences. Technol Forecast Soc Change 2006;73:467–482.
13. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system: rationale and benefits. N Engl J Med 2012;366:1051–1056.
14. Wilensky GR, Berwick DM. Reforming the financing and governance of GME. N Engl J Med 2014;371:792–793.

