Commentary

Medical Specialty Boards Can Help Measure Graduate Medical Education Outcomes

Lars E. Peterson, MD, PhD, Peter Carek, MD, Eric S. Holmboe, MD, James C. Puffer, MD, Eric J. Warm, MD, and Robert L. Phillips, MD, MSPH

Abstract

U.S. graduate medical education (GME) training institutions are under increasing scrutiny to measure program outcomes as a demonstration of accountability for the sizeable funding they receive from the federal government. The Accreditation Council for Graduate Medical Education (ACGME) is a potential agent of measuring GME accountability but has no interaction with physicians after residency training is completed. American Board of Medical Specialties (ABMS) member boards interact with physicians throughout their careers through maintenance of certification (MOC) and are a potential source of valuable data on physician competency and quality of care, both of which could be used to measure GME accountability. The authors propose that ABMS boards and the ACGME deepen their existing relationship to better assess residency training outcomes. ABMS boards have a wealth of data on physicians collected as a by-product of MOC and business operations. Further, many ABMS boards collect practice demographics and scope-of-practice information through MOC enrollment surveys or recertification examination questionnaires. These data are potentially valuable in helping residencies know what their graduates are doing in practice. Part 4 of MOC generally involves assessment of the quality of care delivered in practice, and ABMS boards could share these deidentified data with the ACGME and residency programs to provide direct feedback on the practice outcomes of graduates. ABMS member boards and the ACGME should broaden their long-standing relationship to further develop shared roles and data-sharing mechanisms to better inform residencies and the public about GME training outcomes.

Please see the end of this article for information about the authors. Correspondence should be addressed to Dr. Peterson, American Board of Family Medicine, 1648 McGrathiana Pkwy., Suite 550, Lexington, KY 40511-1247; telephone: (859) 269-5626; fax: (859) 335-7509; e-mail: [email protected].

Acad Med. 2014;89:840–842. First published online April 18, 2014. doi: 10.1097/ACM.0000000000000250

Because of the sizeable investment by the government in funding U.S. graduate medical education (GME), multiple stakeholders have advocated for measuring the accountability of GME institutions based on such criteria as workforce specialty mix and distribution, services delivered to the community, and educational quality.1 The focus on educational quality as well as competency brings the Accreditation Council for Graduate Medical Education (ACGME) into the fray as a potential accountability agent. At a recent public hearing by the Institute of Medicine on this topic, ACGME CEO Dr. Thomas Nasca2 said:

The actions of the ACGME must fulfill the social contract, and must cause sponsors to maintain an educational environment that assures: the safety and quality of care of the patients under the care of residents today; the safety and quality of care of the patients under the care of our graduates in their future practice; the provision of a humanistic educational environment where residents are taught to manifest professionalism and effacement of self-interest to meet the needs of their patients.

During the hearing, Dr. Nasca also spoke to the role of certifying boards in this process. The American Board of Medical Specialties (ABMS) member boards pick up where the ACGME leaves off, certifying physicians as they enter practice and regularly assessing their knowledge and quality of care through maintenance of certification (MOC). The purpose of this Commentary is to describe potential direct and contributing roles that ABMS member boards could play in assessing GME outcomes in collaboration with the ACGME.

Certifying Boards’ Data

Most residency graduates will attain board certification and participate in MOC throughout their careers. The MOC process is designed to assure the public that certified physicians are engaged in self-directed assessment and continuous quality improvement. During the MOC process, ABMS boards collect a wealth of data on their diplomates to monitor examination reliability across physician personal and practice demographics and to better understand the scope of practice of physicians in each specialty, enabling better evaluation of MOC programs. Data collection begins in residency with information such as sites of training and time to complete training, because boards must determine a resident’s eligibility to sit for the certification examination based on completion of residency requirements. Additionally, many boards offer an in-training examination, which many residencies use to track a resident’s development of medical knowledge. Some boards also collect residency director evaluations of residents, which the American Board of Internal Medicine has shown predict future practice performance. Boards also possess board certification examination scores. As physicians transition from training into MOC, most boards collect members’ practice information, such as location and scope of practice, through MOC enrollment surveys or recertification examination surveys. Boards gather and assess the data provided for Part 4 of MOC, which requires assessment of the quality of care a physician provides. This array of data may be useful for creating accountability measures for the GME system focused on what residency graduates do, the quality of care they provide, and where they practice.

Some boards have already used data collected through MOC to characterize workforce trends.3,4 For example, the American Board of Family Medicine has examined the narrowing scope of practice of family physicians using self-reported data on practice patterns collected during MOC examination registration.4 The American Board of Internal Medicine linked residency evaluations and board scores with data from a nationally representative sample of practicing physicians, the Center for Studying Health System Change’s Community Tracking Study, and found that lower evaluations were associated with lower likelihood of board certification and lower income.3 These studies indicate a capacity for certifying boards to use existing data to play a direct role in measuring training outcomes in conjunction with the ACGME’s efforts to longitudinally assess training programs.

Graduate Survey Requirements

Currently, the ACGME requires many training programs to survey their graduates to track preparation for practice, scope of practice, and procedural skills; however, no standard mechanism exists to carry out these surveys, and the surveys vary in content both within and across specialties. Graduate surveys could be standardized within each specialty, but the burden of reporting data that boards already gather would continue to drive down response rates, and assessing the quality of care graduates provide on the basis of clinical data would be beyond the capacity and scope of these surveys. Instead, boards should work collaboratively with both the ACGME and residency programs to provide deidentified data sufficient to give programs useful assessment information. The boards and the ACGME could also collaborate to craft graduate surveys that complement existing MOC data, permitting more comprehensive program assessment while reducing physician reporting burden.

Collaboration Between ACGME and Medical Specialty Boards

The ABMS member boards already work with the ACGME to establish standards and requirements for residency training. These relationships will only deepen as the Next Accreditation System and the Milestones are implemented over the next few years.5 The Milestones are a collaborative effort among the ACGME, ABMS member boards, and the GME community, grounded in the understanding that the development of clinical competency is a lifelong process that does not end at the completion of residency training. Through the Next Accreditation System, ABMS boards and the ACGME are now defining physician competencies within their disciplines more robustly using a developmental framework. The competencies and Milestones in essence help to define the holistic blueprint for a specialty.

Through MOC, boards will be able to measure a physician’s development in the “early” years immediately after residency, and this information could be shared with programs, and with the ACGME, to create a feedback loop that improves training programs as well as certification and accreditation processes. Exchange of these data may also create a pathway to better determine which assessment methods and educational outcomes best measure physician competency and practice outcomes. A deepened relationship between ABMS member boards and the ACGME also gives the certifying boards a direct role in providing public transparency and accountability for GME outcomes, much as they currently do for certification of individual physicians.

Examples of Evaluations From the Proposed Collaboration

ABMS boards’ workforce distribution data are one example of how boards can contribute to measuring GME accountability. Many residency programs are created to increase the number of physicians in a particular specialty in the surrounding region or state. Existing public and proprietary data sets can support accountability measures for training programs and teaching hospitals focused on specialty and geographic distribution, but these data are not easily accessible to programs that wish to track particular trends or goals. Because many programs lose track of their graduates, ABMS boards could provide programs with aggregate percentages of graduates practicing in the same state, within a certain radius of the training site, in particular specialties, or in underserved geographies, for example.

Graduates’ scope of practice is a major concern of training programs and ABMS boards and is another area where boards have the potential to inform training outcomes. For example, family physicians are trained broadly to take care of patients of all ages, with all disease entities, and across multiple settings. Despite such broad training, many family physicians are limiting their scope of practice. Whether this narrowing is due to personal choice, restrictions from employers or credentialing facilities, population needs, or level of preparation from residency is unknown. The American Board of Family Medicine has used its demographic data to document the narrowing scope of practice, but these data could also be linked to residency programs and combined with area-level health care market data (e.g., physician-to-population ratios, socioeconomics, rural/urban status) to explore whether the quality of residency education or factors external to training are associated with a reduced scope of practice. To further illustrate: if a family medicine residency program were evaluated on the percentage of its graduates who provide obstetrical deliveries, any reports would need to consider contextual factors that may influence the likelihood of an individual physician providing this service, namely the area-level availability of obstetrician–gynecologists and certified nurse midwives, degree of rurality, availability of facilities where deliveries occur, and the malpractice environment. Such research is an opportunity to move beyond measuring program outputs and outcomes to studying the educational models and contextual factors associated with those outputs and outcomes. In this regard, ABMS boards have an opportunity to be partners in the research underpinning training outcomes.

Through Part 4 of MOC, most boards collect quality-of-care data associated with a quality improvement project. Many boards offer multiple options to satisfy this requirement; as a consequence, internists may perform a quality improvement activity on hypertension, diabetes, or a quality measure of their choosing. This variability limits the ability to report a standard quality measure for every physician in a specialty. However, measures of improvement and sustainability of change may be created from different disease entities to provide an assessment of overall quality of care. The ACGME and programs could then receive aggregate data on their graduates’ performance, which they could use to monitor the effectiveness of training and to improve areas of weakness without sacrificing the practicing physician’s confidentiality.

Ensuring the Physician Workforce Meets the Needs of the Public

We propose that ABMS member boards and the ACGME expand their current relationship to explore robust assessment of residency training outcomes, with an emphasis on the quality of care provided by graduates and on clinical competencies. A secondary, but potentially important, role for boards is as contributors of data that inform understanding of the specialty and geographic distribution of residency graduates for the benefit of the public. The primary role would complement previous work on the development of Milestone measures and longitudinal competency assessment methods. ABMS boards are well positioned for the secondary role given the depth and validity of the data they already collect about physicians in the United States. Both functions could support research on the factors associated with particular outcomes. We believe that through these roles, boards could provide meaningful measures and feedback to training sites, with real potential to create a physician workforce better able to meet the needs of the public. Such collaboration could also enhance the boards’ role in transparently assuring the public, and those who fund GME, that the outcomes of training meet public expectations.

Funding/Support: No external sources of funding.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Disclaimers: The views expressed represent the personal opinions of the authors and are not necessarily the views or policies of their respective institutions.

Dr. Peterson is research director, American Board of Family Medicine, and assistant professor of family and community medicine, University of Kentucky, Lexington, Kentucky.

Dr. Carek is professor and chair, Department of Family Medicine, University of Florida, Gainesville, Florida.

Dr. Holmboe is senior vice president for Milestones development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois. At the time this Commentary was written, he was chief medical officer, American Board of Internal Medicine, Philadelphia, Pennsylvania.

Dr. Puffer is president and chief executive officer, American Board of Family Medicine, Lexington, Kentucky.

Dr. Warm is professor, Department of Internal Medicine, University of Cincinnati, Cincinnati, Ohio.

Dr. Phillips is vice president of research and policy, American Board of Family Medicine, Lexington, Kentucky.

References

1 Reddy A, Lazreg S, Phillips R, Bazemore A, Lucan S. Towards defining and measuring social accountability in graduate medical education: A stakeholder study. J Grad Med Educ. 2013;5:439–445.

2 Nasca TJ. The Next Accreditation System, and the Clinical Learning Environment Review (CLER): Where Are We in Implementation? 2012. http://www.iom.edu/~/media/Files/Activity%20Files/Workforce/GMEGovFinance/2012-DEC-19/Nasca.pdf. Accessed February 19, 2014.

3 Gray B, Reschovsky J, Holmboe E, Lipner R. Do early career indicators of clinical skill predict subsequent career outcomes and practice characteristics for general internists? Health Serv Res. 2013;48:1096–1115.

4 Tong ST, Makaroff LA, Xierali IM, et al. Proportion of family physicians providing maternity care continues to decline. J Am Board Fam Med. 2012;25:270–271.

5 Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366:1051–1056.

