Original Research

Do Family Physicians Choose Self-Assessment Activities Based on What They Know or Don’t Know?

LARS E. PETERSON, MD, PHD; BRENNA BLACKBURN, MPH; ANDREW BAZEMORE, MD, MPH; THOMAS O’NEILL, PHD; ROBERT L. PHILLIPS JR., MD, MSPH

Introduction: Maintenance of Certification (MOC) for Family Physicians (MC-FP) includes clinical Self-Assessment Modules (SAMs). Whether family physicians choose SAMs that reflect their aptitudes or their knowledge gaps has not been studied.

Method: Secondary analysis of demographic data, 2009 certification examination scores, and 2009–2012 SAM participation data. We computed disease-specific scores for asthma, diabetes, and hypertension from the examination. We ran unadjusted logistic and adjusted conditional logistic regression models of score quintiles, matched on the number of SAMs completed and controlling for physician demographics and area-level social deprivation.

Results: In 2009, 9 610 physicians passed the examination. Mean disease-specific scores were 591.4 (SD = 308.5) for asthma, 558.6 (SD = 216.1) for diabetes, and 533.3 (SD = 226.7) for hypertension. Average scores on hypertension and diabetes were higher for physicians who subsequently completed the related SAMs but not for those who completed the asthma SAM. The percentage of physicians who completed each SAM increased from the lowest to the highest scaled-score quintile for diabetes (32.3% to 40.9%) and hypertension (33.0% to 36.9%). For asthma, logistic regression analyses found no statistically significant associations. For diabetes, both models showed a consistent association between higher score quintile and likelihood of taking the SAM. For hypertension, an association between higher score and higher likelihood of taking the SAM was significant only in the third quintile (OR = 1.20; 95% CI, 1.03–1.39).

Discussion: We found inconsistent relationships between physician knowledge and SAM selection. For MOC to have a greater impact on quality, boards should consider directing physicians toward MOC activities that fill knowledge gaps rather than areas of strength.

Key Words: Maintenance of Certification/licensure, evaluation-educational intervention, self-assessment

Disclosure: Dr. Lars E. Peterson is a full-time employee of the American Board of Family Medicine.

Dr. Peterson: Research Director, American Board of Family Medicine; Ms. Blackburn: Research Assistant, American Board of Family Medicine; Dr. Bazemore: Director, Robert Graham Center; Dr. O’Neill: Vice President of Psychometric Service, American Board of Family Medicine; Dr. Phillips: Vice President of Research and Policy, American Board of Family Medicine.

Correspondence: Lars E. Peterson, American Board of Family Medicine, 1648 McGrathiana Parkway, Suite 550, Lexington, KY 40511-1247; e-mail: [email protected].

© 2014 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education. Published online in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/chp.21247

Introduction

After completion of residency training, physicians are expected to maintain and update their medical knowledge through a largely self-directed program of continuing

professional development that includes continuing medical education (CME) activities. The CME model has been questioned as ineffective, with physicians often choosing activities on topics in which they are already competent and/or comfortable.1,2 Physicians who have identified their knowledge gaps or who perceive their limits in certain clinical areas are more likely to show improvement from participating in CME.3 However, a systematic review found that physicians are poor at self-assessing their own competence compared with formal external assessment.4 Physicians’ poor ability to self-assess their clinical competency raises concerns, as physician learning theories suggest that physicians generally select self-directed activities either in response to immediate, specific clinical problems or in more general knowledge areas where they feel uneasy.5 If physicians are not able to adequately self-assess, they may be spending valuable time in learning activities that are not relevant to their needs. Physicians are not alone: most health professionals are poor at identifying knowledge gaps and critiquing their own performance.6

Maintenance of Certification (MOC) has emerged as another paradigm, built on the foundation of continuing professional development, through which physician competence may be assessed and through which physicians may continuously increase their clinical knowledge.7 All American Board of Medical Specialties member boards now use MOC to certify their physicians. MOC consists of four parts: Part 1, “Licensure and Professional Standing”; Part 2, “Lifelong Learning and Self-Assessment”; Part 3, “Cognitive Expertise”; and Part 4, “Practice Performance Assessment.” Part 2 of the American Board of Family Medicine (ABFM) Maintenance of Certification for Family Physicians (MC-FP) includes an array of online, clinically based activities called Self-Assessment Modules (SAMs). SAMs consist of 60 multiple-choice questions on a clinical topic followed by a simulation of the care of a patient with the disease of interest. If the physician does not achieve 80% correct within each subsection of the SAM, he or she is provided with critiques and supporting references to foster learning about the topic and determine the current state of the literature. Physicians may then answer the questions again.

Others have suggested that board certification examinations provide “excellent opportunities for physicians to assess their knowledge against facts and principles that inform essential clinical decisions,”8 and this observation easily extends to other parts of MOC. Additionally, state licensing boards are investigating the creation of a Maintenance of Licensure process that, like MOC, rests on the assumption that physicians will select self-learning and quality improvement activities appropriate for their practice and learning needs.

Although the ABFM has required its diplomates to engage in MC-FP for nearly 10 years,9 it remains unknown why diplomates choose the SAMs they do. Two competing hypotheses are that (1) physicians use SAMs to fill knowledge gaps in areas of perceived weakness, and (2) physicians select SAMs in areas of aptitude or perceived strength, possibly to expedite the process or ensure ease of completion. Therefore, the objective of this study was to determine whether physician knowledge of a specific disease, as measured on the certification examination, is associated with subsequent choice of SAM.

Method

Study Sample

We used data from diplomates who passed the certification or recertification examination in July 2009 and resided in the United States. ABFM diplomates are required to complete an MC-FP stage every 3 years.9 We included only those who passed the examination in order to ensure that all participants had the same motivation and time to complete a full MC-FP stage. For each MC-FP stage, our cohort was required to complete 3 activities, which could include 2 or 3 SAMs, depending on activities completed in subsequent stages.

Physicians may complete additional SAMs beyond the number required if they choose to do so. Additionally, physicians may pass their examination and subsequently not engage in MC-FP (for example, if they are planning to retire) and thus would not complete any SAMs.10 For the cohort in this study, the first MC-FP stage after the examination was from January 2010 to January 2013, but we included any SAM completed after the examination, as some physicians start completing their requirements immediately after the examination.

Study Variables

Twelve SAMs were available throughout the entire study period: asthma, care of the vulnerable elderly, cerebrovascular disease, childhood illness, coronary artery disease, depression, diabetes, health behavior, heart failure, hypertension, maternity care, and well-child care. Three additional SAMs were made available during the study period: preventive care in 2011, and hospital medicine and mental health in 2012. The 3 most commonly completed SAMs from 2010 to 2012 were diabetes (n = 22 221), hypertension (n = 21 269), and asthma (n = 15 705). These diseases are commonly seen in primary care and, hence, have multiple questions on their diagnosis, management, and treatment on the examination.11 Using these particular SAMs better enabled us to evaluate our competing hypotheses: was each SAM completed so frequently because physicians recognized knowledge gaps in conditions commonly seen in primary care, or because they believed they were competent in these areas and could “get through” the activity quickly? Completion of any of these 3 SAMs during the MC-FP stage was the dependent variable.

For the SAMs studied, we identified questions relating to each disease entity on the July 2009 certification examination. We then used these questions to compute subtest scores specific to those disease entities, which were then converted to a scaled score. This was done using the dichotomous Rasch model, a measurement model within the family of item response theory models.12,13 Because the same item difficulty calibrations that were used on the examination were used to compute the disease entity subtests, these scores are on the same scale as the examination. The minimum passing score for the examination was 390, and the scores reported to diplomates are limited to a range of 200 to 800. For our analyses, we used the entire range of scores to better differentiate levels of disease-specific knowledge; under these conditions, scores can theoretically range from negative to positive infinity. Because there were 2 forms of the 2009 examination and each form could have one of 28 different combinations of content-specific test modules, there were effectively 56 different forms of the examination. For this reason, there was some variation across physicians in the number of questions answered on each disease entity subtest.
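For readers unfamiliar with the scoring model, the sketch below states the standard dichotomous Rasch model referenced above. The notation and the linear rescaling constants A and B are illustrative assumptions; the article does not report the specific transformation used to place subtest estimates on the reported score scale.

```latex
% Standard dichotomous Rasch model (notation assumed, not taken from the article):
% probability that physician n answers item i correctly, given ability \theta_n
% and calibrated item difficulty \delta_i.
P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}

% Subtest ability estimates \hat{\theta}_n are then mapped to the reported score
% scale by a linear transformation, shown here only schematically:
\text{scaled score}_n = A\,\hat{\theta}_n + B
```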

The number of questions possible ranged from 4 to 10 for asthma, 10 to 16 for diabetes, and 4 to 11 for hypertension, depending on the specific form of the examination.

Physician demographic information was obtained from the ABFM administrative database and included gender, age, degree type (MD or DO), and international medical graduate status. Previous work found that MC-FP participation by family physicians varies by the rate of poverty among the population where a physician resides.14 We controlled for this phenomenon by including the social deprivation index of the primary care service area (PCSA) in which the physician practices.15

Physicians are not permitted to repeat a SAM within 5 years. To account for differences in the time a physician was eligible to take a SAM, we calculated a variable representing the time each physician was eligible to take a specific SAM during the MC-FP stage we studied. If a physician was not eligible to take a SAM at all during this stage, he or she was removed from the analysis for that SAM.

Statistical Analysis

We conducted an observational analysis of a cohort of family physicians who passed the ABFM examination in 2009 through their first MC-FP stage ending December 2012. We used descriptive statistics to characterize our sample of diplomates. We then used t-tests to examine the difference in scores between those who took the respective SAM during their MC-FP stage and those who did not.

For each SAM, we assigned physicians to score quintiles to assess differences in behavior across levels of disease-specific medical knowledge. We then determined the percentage of physicians in each quintile who completed the corresponding SAM. Cochran-Armitage tests for trend were used to investigate whether the likelihood of completing the SAM increased with increasing score quintile.

For each disease entity, we then ran unadjusted and adjusted regressions to determine associations between higher score quintile and completion of the corresponding SAM. The unadjusted regressions were logistic regressions that assessed associations between scores and whether or not diplomates completed a specific SAM. For the regression analyses, we used the first quintile as the reference group. We chose this strategy because we were interested in whether physicians with higher scores were more or less likely to complete a SAM, which we could not test directly using score as a continuous variable. Logistic regression also allows adjustment for physician characteristics known to be associated with examination performance.16 As previously mentioned, for the 3-year period studied, a physician could complete 0 to 15 SAMs. We controlled for differential SAM participation by using conditional logistic regression. This adjusted regression model also controlled for physician demographics and social deprivation index at the PCSA level.
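The analyses described above were run in SAS 9.2; the sketch below is a rough Python approximation of the unadjusted portion of the workflow (quintile assignment, the Cochran-Armitage trend test, and logistic regression with the lowest quintile as the reference group). The input file name and the column names scaled_score and completed_sam are hypothetical, and the conditional (adjusted) model matched on the number of SAMs completed is not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import norm

# Hypothetical analytic file: one row per physician for a single disease entity,
# with the disease-specific scaled score and an indicator for completing that SAM.
df = pd.read_csv("diabetes_cohort.csv")  # columns assumed: scaled_score, completed_sam

# Assign score quintiles (Q1 = lowest disease-specific knowledge).
df["quintile"] = pd.qcut(df["scaled_score"], q=5, labels=["Q1", "Q2", "Q3", "Q4", "Q5"])

# Percentage of physicians in each quintile who completed the SAM (as in TABLE 3).
print(df.groupby("quintile")["completed_sam"].mean().mul(100).round(1))

def cochran_armitage_trend(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in proportions
    across ordered groups (here, SAM completion across score quintiles)."""
    r = np.asarray(successes, dtype=float)
    n = np.asarray(totals, dtype=float)
    t = np.arange(1, len(r) + 1, dtype=float) if scores is None else np.asarray(scores, dtype=float)
    p_bar = r.sum() / n.sum()
    stat = np.sum(t * (r - n * p_bar))
    var = p_bar * (1 - p_bar) * (np.sum(n * t**2) - np.sum(n * t) ** 2 / n.sum())
    z = stat / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

by_q = df.groupby("quintile")["completed_sam"]
z, p = cochran_armitage_trend(by_q.sum().values, by_q.count().values)
print(f"Cochran-Armitage trend test: z = {z:.2f}, p = {p:.3f}")

# Unadjusted logistic regression of SAM completion on score quintile;
# Q1 is the reference category (the default for the first categorical level).
unadjusted = smf.logit("completed_sam ~ C(quintile)", data=df).fit()
odds_ratios = pd.concat([np.exp(unadjusted.params), np.exp(unadjusted.conf_int())], axis=1)
odds_ratios.columns = ["OR", "2.5%", "97.5%"]
print(odds_ratios)
```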

TABLE 1. Characteristics of Family Physicians Who Passed the American Board of Family Medicine Certification Examination in July 2009

Variable (n = 9 610)                 Mean (SD) or percent
Age in years                         49.8 (1.0)
Gender (% Male)                      61.1
Degree (% MD)                        91.6
Number of SAMs completed
  None                               18.1
  One                                3.1
  Two                                16.1
  Three                              61.5
  Four or more                       1.3
International medical graduate       18.3
Social deprivation index score       42.7 (27.0)
Certification exam score             522.6 (91.8)
Asthma scaled score                  591.4 (308.5)
Diabetes scaled score                558.6 (216.1)
Hypertension scaled score            533.3 (226.7)

SAM = Self-Assessment Module.

We also performed a sensitivity analysis by running a logistic regression that excluded diplomates who did not complete any SAMs, controlling for the same variables as in the conditional regression as well as the number of SAMs completed (as a continuous measure). All analyses were conducted in SAS Version 9.2 (SAS Institute, Cary, NC). This study was deemed exempt by the American Academy of Family Physicians Institutional Review Board.

Results

The passing rate for diplomates who took the July 2009 examination (n = 10 794) was 85.9%.17 After excluding diplomates not residing in the United States, our sample included 9 610 diplomates with a mean age of 49.8 years; 61% were male, over 90% were MDs, and 18% were international medical graduates (TABLE 1). As expected, SAM participation varied: 1 736 (18.0%) diplomates did not complete any SAM, 77.6% completed 2 or 3, and fewer than 2% completed 4 or more. Of the 9 610 diplomates, 2 211 (23.0%) completed the asthma SAM, 3 668 (38.2%) completed the diabetes SAM, and 3 438 (35.8%) completed the hypertension SAM.

The mean overall examination score was 522.6 (standard deviation [SD] = 91.8), and disease-specific scores were highest for asthma (591.4, SD = 308.5) and lowest for hypertension (533.3, SD = 226.7).

TABLE 2. Mean Certification Examination Scores Between Those Who Did or Did Not Complete a Self-Assessment Module (SAM) by Disease Topic

                    Completed SAM       Did not complete SAM
Asthma              588.6 (303.2)       591.8 (310.1)
Diabetes            576.1 (210.0)       547.7 (219.0)*
Hypertension        542.1 (222.0)       528.5 (229.4)*

*p-value < .05.

Mean scores were significantly higher among physicians who completed the hypertension (542.1 versus 528.5) and diabetes (576.1 versus 547.7) SAMs than among those who did not (TABLE 2). Correlations between disease-specific scores ranged from 0.12 to 0.21, and correlations of the disease-specific scores with the total examination score ranged from 0.31 to 0.47 (data not shown).

Neither the unadjusted nor the adjusted analysis indicated a consistent relationship between scores and odds of completing the respective SAM (TABLE 3 and FIGURE 1). For asthma, the proportion of physicians who completed the SAM was lowest in the top quintile, with no significant trend observed; in regression analyses, both adjusted and unadjusted odds ratios were nonsignificant. For hypertension, there was a small but statistically significant increase in the proportion of physicians who completed the SAM from the bottom to the top quintile (33.0% to 36.9%, p for trend < .05), and in adjusted models only, physicians in the third quintile of scores were more likely to complete the hypertension SAM (odds ratio [OR] = 1.20; 95% confidence interval [CI], 1.03–1.39). The clearest relationship between score and odds of completing a SAM was seen for diabetes, where the proportion of physicians who completed the SAM increased from 32.3% to 40.9% (p < .05 for trend) from the bottom to the top score quintile, and the adjusted ORs increased from the second (OR = 1.22; 95% CI, 1.05–1.41) to the fifth quintile (OR = 1.37; 95% CI, 1.19–1.59). A sensitivity analysis excluding diplomates who did not participate in any SAM gave nearly identical results to the conditional regression (results not shown).

Discussion

Among nearly 10 000 family physicians, we found inconsistent relationships between a physician’s medical knowledge and the selection of self-assessment modules completed as part of MOC. The lack of consistency in the relationship between medical knowledge and SAM selection does not support a single hypothesis. It is possible that physicians are poor at self-identifying areas of clinical weakness4 and lack the ability to use SAMs in an effective manner to increase their medical knowledge.

TABLE 3. Adjusted and Unadjusted Odds of Completing a Self-Assessment Module (SAM) by Disease Topic

Scaled score                 Percentage who       Unadjusted odds ratio     Adjusted* odds ratio
                             completed the SAM    (95% CI)                  (95% CI)

ASTHMA (n = 9 594)
Quintile 1 [–691, 376]       23.1                 Reference                 Reference
Quintile 2 [377, 487]        23.1                 1.00 (0.86, 1.16)         0.99 (0.84, 1.16)
Quintile 3 [488, 639]        22.3                 0.96 (0.84, 1.09)         0.89 (0.77, 1.02)
Quintile 4 [640, 728]        26.2                 1.18 (1.00, 1.39)         1.13 (0.95, 1.34)
Quintile 5 [729, 1466]       22.1                 0.94 (0.82, 1.09)         0.86 (0.73, 1.00)

DIABETES (n = 9 601)
Quintile 1 [–248, 384]       32.3                 Reference                 Reference
Quintile 2 [385, 482]        38.1                 1.29 (1.13, 1.47)         1.22 (1.05, 1.41)
Quintile 3 [483, 597]        39.3                 1.36 (1.19, 1.55)         1.33 (1.15, 1.54)
Quintile 4 [598, 770]        41.0                 1.46 (1.28, 1.66)         1.37 (1.19, 1.58)
Quintile 5 [771, 1208]       40.9**               1.45 (1.27, 1.65)         1.37 (1.19, 1.59)

HYPERTENSION (n = 9 595)
Quintile 1 [–396, 349]       33.0                 Reference                 Reference
Quintile 2 [350, 470]        35.3                 1.11 (0.98, 1.27)         1.09 (0.94, 1.26)
Quintile 3 [471, 573]        37.0                 1.19 (1.04, 1.37)         1.20 (1.03, 1.39)
Quintile 4 [574, 720]        37.1                 1.20 (1.06, 1.36)         1.11 (0.96, 1.27)
Quintile 5 [721, 1133]       36.9**               1.19 (1.03, 1.36)         1.13 (0.97, 1.31)

*Conditional on number of SAMs completed and adjusted for age, gender, MD versus DO, international medical graduation, and social deprivation index; exam score Quintile 1 is the reference group for each regression.
**Cochran-Armitage trend test significant at p < .05.

Alternately, as suggested by our findings for diabetes, expediency may motivate physicians to select SAMs in areas in which they are already knowledgeable. Others have found evidence for this behavior in physicians’ selection of CME activities.1,2

Physicians may not be able to self-assess their clinical ability to treat, diagnose, and manage certain diseases for a number of reasons. First, as mentioned previously, most health care professionals are poor at self-assessment6 and at identifying knowledge gaps. While others have suggested that board certification examinations are excellent ways for physicians to assess their knowledge,8 any knowledge gaps would need to be clearly reported back to the physician. In 2009, the ABFM examination score report provided the percentage of questions correctly answered in multiple organ-system categories but not a scaled score that could be compared with overall performance.

FIGURE 1. Adjusted and Unadjusted Odds of Completing a Self-Assessment Module (SAM) by Disease Topic

Second, if physicians are not measuring the quality of care they provide, or lack the tools to do so, they may be unaware of deficits in their care. This may be the case for asthma, where there are fewer immediate indicators of quality at the point of care than there are for hypertension (a blood pressure reading at almost every clinical encounter) and diabetes (blood sugar and hemoglobin A1c). This lack of immediate clinical feedback might limit self-assessment of knowledge of asthma care and attenuate the relationship between higher scores and greater likelihood of completing a SAM that we observed for diabetes and hypertension. Routine involvement in quality improvement, or the ease of assessing care through electronic medical records, may increase physicians’ awareness of the quality of care they provide. Such feedback is critical in identifying knowledge gaps, which serve as the foundation for improvement.18

An underlying assumption of our study is that physicians should view SAMs primarily as learning and practice improvement activities rather than something to simply “get through” for certification purposes. We believe MC-FP activities, including SAMs, fit into Slotnick’s Theory of Physician Self-Directed Learning Episodes.5 Under this theory, SAMs would be used by physicians undertaking independent learning in response to general clinical problems by reading comprehensively and taking available and appropriate courses.5,19 If this theory prevailed, then we would expect physicians with lower disease-specific scores to complete the corresponding SAM. Alternately, other work by Slotnick places learning needs into a hierarchy,20 and if physicians view SAMs as a requirement for certification, and by extension a means of maintaining professional identity, rather than as a solution to clinical problems, we would expect the opposite to occur. As previously discussed, we found support for the latter hypothesis with regard to diabetes.

Two unexplored hypotheses may help explain our findings. First, physicians may base their selection of CME or SAM activities on the composition of their patient panel. The 3 disease entities we studied are common in primary care, but each provider’s panel will vary. Providers who treat more patients with diabetes may therefore be more likely to take the diabetes SAM to enrich their knowledge in order to better care for this population. Further, the frequency of diseases seen in practice is likely due in part to the prevalence of those diseases in the surrounding area. Future studies might address these possibilities. Second, quality measures and payment incentives developed rapidly for diabetes care during this period but not for asthma or hypertension, and physicians may have been more likely to take the diabetes SAM because of this focus on diabetes.

We set out to explore relationships between physician knowledge and choice of SAM. Only physicians who passed the examination were included in our sample, owing to the structure of MC-FP. While this approach yields a clean sample, as each physician had the same opportunity to complete a SAM, the exclusion of physicians with low overall knowledge may hinder our ability to detect differences in SAM selection by disease knowledge. We found little evidence of this bias, as the total score had low correlations with the disease-specific scores (0.31 to 0.47) and the correlations between disease-specific scores were even lower (0.12 to 0.21). Thus, physicians with low diabetes knowledge may have a high overall score and high scores on asthma. Additionally, physicians in the first quintile of each disease category all had disease-specific scores below the passing score for the examination.

Other limitations may affect our study. First, our sample represented a single cohort of ABFM diplomates, and examinees from other years may exhibit different behavior; this is less likely, however, given the regular cohorts of diplomates who return to take the examination. Second, we studied only 3 disease entities and, as the differential results suggest, studying more diseases might reveal clearer relationships between SAM choice and clinical knowledge. Third, SAM selection may be influenced by other professional development or CME activities completed outside the MC-FP process, which we were unable to account for with the data available. Finally, the scores by disease category were possibly imprecise due to the low number of items on which they were based; however, the number of physicians was quite large, so the mean of the imprecise subtest scores was unlikely to be systematically biased.

In conclusion, we found an inconsistent relationship between physician knowledge, as measured by disease-specific scores derived from the certification examination, and choice of self-assessment activity completed through MOC among family physicians. This study suggests that physicians may not be effectively leveraging MOC requirements to increase their clinical knowledge. Medical specialty boards could

assist their diplomates by providing more detailed score reports or feedback on actual clinical performance to help guide MOC activity selection. Such help may elevate MOC activities from those undertaken merely to meet certification requirements to those that are instrumental in providing better patient care.

Lessons for Practice

● Medical specialty boards have moved to a new certification paradigm called Maintenance of Certification (MOC). How physicians choose MOC activities is unknown.

● In this study of family physicians, we found no consistent relationship between clinical knowledge and choice of MOC activities.

● Medical specialty boards need to increase feedback to physicians about their clinical knowledge and performance in practice to better identify gaps in knowledge and clinical care, to effectively leverage MOC to improve not only physician knowledge but also the health of the public.

References

1. Sibley JC, Sackett DL, Neufeld V, Gerrard B, Rudnick KV, Fraser W. A randomized trial of continuing medical education. N Engl J Med. 1982;306:511–515.
2. Cantillon P, Jones R. Does continuing medical education in general practice make a difference? BMJ. 1999;318:1276–1279.
3. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700–705.
4. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296:1094–1102.
5. Slotnick HB. How doctors learn: physicians’ self-directed learning episodes. Acad Med. 1999;74:1106–1117.
6. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80:S46–S54.
7. Brennan TA, Horwitz RI, Duffy FD, Cassel CK, Goode LD, Lipner RS. The role of physician specialty board certification status in the quality movement. JAMA. 2004;292:1038–1043.
8. Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: guide to the evidence. JAMA. 2002;288:1057–1060.
9. Puffer JC. ABFM announces further enhancements to MC-FP. Ann Fam Med. 2010;8:467–468.
10. Puffer JC, Bazemore AW, Jaen CR, Xierali IM, Phillips RL, Jones SM. Engagement of family physicians in maintenance of certification remains high. J Am Board Fam Med. 2012;25:761–762.

11. Norris TE, Rovinelli RJ, Puffer JC, Rinaldo J, Price DW. From specialty-based to practice-based: a new blueprint for the American Board of Family Medicine cognitive examination. J Am Board Fam Pract. 2005;18:546–554.
12. Rasch G. Probabilistic Models for Some Intelligence and Attainment Tests. Expanded edition (1980) with foreword and afterword by Wright BD. Chicago, IL: University of Chicago Press; 1960/1980.
13. Wright B, Stone M. Best Test Design: Rasch Measurement. Chicago, IL: MESA Press; 1979.
14. Bazemore AW, Xierali IM, Petterson SM, et al. American Board of Family Medicine (ABFM) maintenance of certification: variations in self-assessment modules uptake within the 2006 cohort. J Am Board Fam Med. 2010;23:49–58.
15. Butler DC, Petterson S, Phillips RL, Bazemore AW. Measures of social deprivation that predict health care access and need within a rational area of primary care service delivery. Health Serv Res. 2013;48(2 Pt 1):539–559.
16. O’Neill TR, Royal KD, Schulte B, Leigh TM. Comparing the Performance of Allopathically and Osteopathically Trained Physicians on the American Board of Family Medicine’s Certification Examination. ERIC; 2009.
17. 2009 Examination Results. American Board of Family Medicine (ABFM); 2009. https://www.theabfm.org/moc/passrate.aspx. Accessed February 13, 2014.
18. Sapyta J, Riemer M, Bickman L. Feedback to clinicians: theory, research, and practice. J Clin Psychol. 2005;61:145–153.
19. Slotnick HB. How doctors learn: education and learning across the medical-school-to-practice trajectory. Acad Med. 2001;76:1013–1026.
20. Slotnick HB. How doctors learn: the role of clinical problems across the medical school-to-practice continuum. Acad Med. 1996;71:28–34.
