Research Report

The Relationship Between Physician Participation in Continuing Professional Development Programs and Physician In-Practice Peer Assessments

Elizabeth F. Wenghofer, PhD, Bernard Marlow, MD, Craig Campbell, MD, Lorraine Carter, PhD, Sophia Kam, MA, William McCauley, MD, and Lori Hill, MEd

Abstract

Purpose
To investigate the relationship between physicians' performance, as evaluated through in-practice peer assessments, and their participation in continuing professional development (CPD).

Method
The authors examined the predictive effects of participating in the CPD programs of the Royal College of Physicians and Surgeons of Canada and the College of Family Physicians of Canada one year before in-practice peer assessments conducted by the medical regulatory authority in Ontario, Canada, in 2008–2009. Two multivariate logistic regression models were used to determine whether physicians who reported participating in any CPD and group-based, assessment-based, and/or self-directed CPD activities were more or less likely to receive satisfactory assessments than physicians who had not. All models were adjusted for the effects of sex, age, specialty certification, practice location, number of patient visits per week, hours worked per week, and international medical graduate status.

Results
A total of 617 physicians were included in the study. Analysis revealed that physicians who reported participating in any CPD activities were significantly more likely (odds ratio [OR] = 2.5; P = .021) to have satisfactory assessments than those who had not. In addition, physicians participating in group-based CPD activities were more likely to have satisfactory assessments than those who did not (OR = 2.4; P = .016).

Conclusions
There is encouraging evidence supporting a positive predictive association between participating in CPD and performance on in-practice peer assessments. The findings have potential implications for policies which require physicians to participate in programs of lifelong learning.

Please see the end of this article for information about the authors. Correspondence should be addressed to Dr. Wenghofer, School of Rural and Northern Health, Laurentian University, Ramsey Lake Road, Sudbury, Ontario, Canada, P3E 2C6; telephone: (705) 675-1151 ext. 3925; e-mail: ewenghofer@laurentian.ca.

Acad Med. 2014;89:920–927. First published online April 18, 2014. doi: 10.1097/ACM.0000000000000243

Continuing professional development (CPD) encompasses a broad range of lifelong learning, skill development, and/or educational activities aimed at maintaining and improving physicians' practice, professional behaviors, clinical skills and knowledge, and ultimately, care provided to patients. Physician participation in CPD has long been considered a professional obligation in Canada and the United States to sustain competence, enhance performance, and demonstrate accountability to the public for the privilege of self-regulation.1–10 The value placed on CPD in relation to physician performance is evident in policies of the Canadian specialty certification bodies. The College of Family Physicians of Canada (CFPC) and the Royal College of Physicians and Surgeons of Canada (Royal College) have mandated participation in CPD as a requirement for maintenance of membership and fellowship. In addition, the medical regulatory authorities of many provinces, including the College of Physicians and Surgeons of Ontario (CPSO), have recently made participation in recognized programs of CPD mandatory for all licensed physicians. Such participation is regarded to be a mechanism to ensure an ongoing high level of physician performance for the public.11 On the national level in Canada, the official position of the Federation of Medical Regulatory Authorities of Canada is that participation in CPD is critical to ensuring that the medical profession continues to act in the public's best interest.12 Specialty certification bodies in the United States rely on CPD for maintenance of physicians' competence and stress the importance of lifelong learning. The recertification programs of the American Board of Medical Specialties, in part, evaluate maintenance of competence through CPD.10 Similarly, the Federation of State Medical Boards promotes the importance of CPD as part of the assessment of competence for maintenance of licensure.13 The positions of these numerous American and Canadian bodies make clear the importance placed on CPD for sustaining effective performance throughout a physician's career.

At the same time, some scholars question the effectiveness of the CPD approach for ensuring continuing competence of physicians in practice.14 Studies have demonstrated that certain methods of CPD are more effective (e.g., practice-based small-group learning) than others in generating a positive impact on physician performance and care.15–21 Researchers have found, though, that the CPD approaches shown to have a lesser effect on practice behavior (e.g., didactic large-group sessions) tend to dominate educational offerings and, as such, are selected more frequently by physicians than other forms of learning.22–25 Throughout the literature, there is a call for further research on the influence of various types of CPD on practice.

This need is increasingly urgent given the current emphasis on lifelong learning as a requirement for maintenance of competence by certification and licensing bodies.1,6,9,26–32 Our study contributes to addressing this gap by investigating the overall effect of participation in CPD on physician performance. It uses CPD participation data provided by the CFPC and the Royal College and performance data from in-practice peer assessments conducted by the CPSO. In analyzing these data, we explored the relationship between physicians' performance, as evaluated through in-practice peer assessments, and their participation in CPD overall and in different types of CPD. The conceptual framework we used in this study defines physician performance as contextual, insofar as performance reflects not only the physician's education, credentials, and participation in CPD but also his or her ability to meet the unique requirements of the broader practice environment.33 Within this framework, the performance requirements for one practice setting may vary from those in another, because the scope of practice may differ between locations.

Previous research has indicated that physician clinical performance is multidimensional and can best be conceptualized and measured in terms of different types of physician–patient encounters.34,35 In addition, previous studies have demonstrated that physician demographics, organizational features, and system factors affect physician performance.33,36–38 In our study, we recognize that physician performance is influenced by the types of physician–patient encounters within a practice (i.e., who they see), physicians' personal characteristics (i.e., their demographics and education), and their practice environments (i.e., where they practice and the resources available to them). Taken together, these components define a physician's scope of practice.

The practices of two family physicians are described to illustrate this point. Physician 1 is engaged in an urban urgent care/walk-in office practice and also does some long-term care facility duties. He does not have hospital privileges and does not do minor in-office surgical procedures or any surgical assisting. Physician 2 has a full office practice in a small isolated community with very few professional supports (i.e., no other physicians, nurse practitioners, or dietitians). Her practice includes minor in-office surgical procedures and obstetrical work, and she has full hospital privileges and works in the hospital's outpatient clinic. Although both physician 1 and physician 2 are certified family physicians, it is obvious that their scope of practice is not the same because both their patient needs and practice environments are distinctly different, resulting in different competency and performance requirements. We considered this conceptual model when structuring our analyses and in the selection of our covariates described in the statistical models below.

Background

In Canada, the CFPC and Royal College are the certifying bodies for, respectively, family physicians and all other specialties. Once certified, physicians, in order to maintain their membership and fellowship, must comply with the CPD requirements of these organizations. The Royal College's maintenance of certification (MOC) program is a mandatory CPD system for all specialists through which they maintain membership and use of the designation Fellow of the Royal College of Physicians or Fellow of the Royal College of Surgeons. The purpose of the Royal College MOC program is to encourage, support, and enable a learning culture based on reflection, inquiry, peer review, and assessment of knowledge, competence, and performance across the entire spectrum of professional practice roles and competencies outlined in the CanMEDS framework. Each specialist is required to develop a needs-based, practice-specific CPD plan; to document learning activities; and to identify outcomes that have affected their practice. All specialists are required to complete at least 40 credits per year and 400 credits over each five-year cycle.39

Although the CFPC also has a CPD requirement, through its MAINPRO (maintenance of proficiency) program, that is founded on concepts similar to those of the Royal College MOC program, there are differences in how the programs are structured. The MAINPRO program began in 1995 and is undergoing major revisions. There are currently three categories of credit: M1 activities, which are primarily group and self-learning activities that have been peer reviewed and accredited; M2 activities, which are primarily composed of unaccredited group and self-learning activities; and MAINPRO-C activities, which are based on individual needs and include reflective activities or exercises. As of January 1, 2013, members must complete at least 25 credits per year and a minimum of 250 credits every five years to maintain membership and/or certification (earning the CCFP designation). Fellows of the CFPC (designated FCFP) must also complete a minimum of 25 MAINPRO-C credits in every five-year cycle to maintain this special designation.40

In Ontario, the CPSO's peer assessment program started in 1980. Any physician less than 70 years of age who has been practicing for 5 years or more is included in the random selection pool. Physicians in their 70th year of age are automatically assessed and then assessed again every 5 years thereafter, for as long as they stay in active practice.

Method

Peer assessment process and physician performance data

All physicians licensed for independent practice with primary practice addresses in Ontario, Canada, who were randomly selected to participate in a peer assessment by the CPSO in 2008 and 2009 constituted the study sample. Each CPSO peer assessment is performed by a single, trained, practicing physician who has been previously assessed and achieved an excellent rating. The assessment consists of an in-practice review of 20 to 30 medical records, guided by a standardized series of protocol forms, which enables the assessor to develop a picture of the physician's practice and understand the physician's approach to patient care. The assessor collects information regarding various dimensions of care, including management of patients with acute conditions, patients with chronic conditions, preventive care, and psychosocial care. The assessor also reviews care activities such as taking adequate histories, conducting appropriate examinations, ordering necessary diagnostic tests, identifying the appropriate course of action, conducting necessary interventions, and monitoring patients. The record review is followed by a discussion with the physician. The assessor then completes the protocol forms, using the information from the records and the discussion. In addition to making recommendations for improvement, the assessor makes note of areas in which the physician demonstrates exemplary care (Figure 1). The adjudication of the report is conducted by the Quality Assurance Committee of the CPSO, which reviews the information provided by the assessor and then assigns one of three outcomes: satisfactory, requiring no further action; reassessment required, to follow up on minor concerns; and serious concerns, which require the physician to attend an interview with the committee to establish a remediation plan. For the purposes of this study, we dichotomized assessment outcomes into satisfactory (no further action) and unsatisfactory (requiring reassessment or interview). Additional details regarding the CPSO peer assessment can be found at the CPSO Web site41 and throughout the literature.33,34,42–46

CPD data

CPD data included all CPD activities reported to the CFPC and Royal College by the assessment physician cohort in the year before their assessment (i.e., 2007 or 2008). Table 1 presents an overview of the various CPD activities for which physicians receive credit from the CFPC and Royal College toward fulfillment of their CPD requirements. We categorized the CPD activities into three broad CPD types: group-based activities, self-directed learning, and assessment-based activities. The categorization activity was completed collaboratively with CFPC and Royal College program experts so that similar and comparable activities could be identified and grouped appropriately. These categories enabled us to compare CPD groupings across the Royal College and CFPC program data. We focused our analysis of CPD on reported participation, noted as a dichotomous variable (yes/no). Individuals who reported participating in any CPD activity in the year before assessment were coded as "yes" for having participated in CPD overall. Additionally, physicians who reported participating in any group-based, self-directed learning, or assessment-based CPD activities were coded as "yes" for each of the CPD categories that applied.

Covariates

To be consistent with our conceptual framework, and in order to remove potential confounders that may influence the relationship between CPD and assessment performance, we targeted covariates that have been identified in the literature as affecting physician performance. More specifically, we included age,33,42–44,47–52 sex,33,43,45,53–55 international medical graduate (IMG) status,37,44,54,55 certification,33,43,51,52,54,55 hours worked and number of patients seen in a typical week,33,56–58 and practice location for those physicians working in Northern Ontario.33,59,60 These data were derived from the CPSO register and annual membership renewal survey, which provides personal information and practice environment details.

Data linkages and ethical approval

Third-party data linkage procedures were used to link each physician's overall performance data from the CPSO peer assessment program with the CPD data submitted to the CFPC or Royal College (i.e., data sets were linked by researchers at Laurentian University). All personal identifiers were removed from the data, and encrypted study identification numbers were assigned to study participants. Ethical approval for the study was provided by the Laurentian University research ethics board in Sudbury, Ontario.

Analysis

We calculated frequency distributions of all variables to provide an overview of the study sample and to examine the data for quality, completeness, and potential outliers. Chi-square analysis was used to determine the univariate relationship between reported participation in CPD and types of CPD and peer assessment outcome. We used multivariate logistic regression to determine whether reported participation in CPD was predictive of those physicians assigned a "satisfactory" assessment in contrast with those assigned an "unsatisfactory" outcome. Two logistic regression models were completed.

Model 1 examined the effects of overall reported participation in CPD (yes/no) on satisfactory performance. Model 2 examined the effects on satisfactory performance of reported participation in specific categories of CPD, including group-based activities (yes/no), self-directed learning (yes/no), and assessment-based activities (yes/no). Both models considered for inclusion all covariates outlined above. In addition to the listed covariates, we included an interaction term between reported CPD participation and certification (Royal College versus CFPC) in the variable selection process. Although the CFPC's and Royal College's mandatory CPD programs have many comparable features, there may be nuanced differences, which an interaction term could reveal. We used a backward stepwise conditional entry for both models, where P ≤ .05 was required for variable entry into the model and P ≥ .10 was the threshold for removal. Odds ratios (ORs) and 95% confidence intervals were calculated for all significant predictors.
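The modeling strategy described above can be illustrated with a brief sketch. This is not the authors' code: the article does not report the software used, the sketch substitutes a simplified backward-elimination loop based on Wald P values for the conditional stepwise criteria described in the text (entry P ≤ .05, removal P ≥ .10), the CPD-by-certification interaction term is omitted for brevity, and the column names and synthetic data frame are hypothetical stand-ins for the linked CPSO/CFPC/Royal College data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def backward_eliminate(data, outcome, candidates, p_remove=0.10):
    """Refit a logistic model, dropping the weakest term until all P values fall below p_remove."""
    terms = list(candidates)
    while True:
        formula = f"{outcome} ~ " + (" + ".join(terms) if terms else "1")
        model = smf.logit(formula, data=data).fit(disp=0)
        pvals = model.pvalues.drop("Intercept", errors="ignore")
        if pvals.empty or pvals.max() < p_remove:
            return model, terms
        terms.remove(pvals.idxmax())  # drop the least significant predictor and refit

# Synthetic stand-in for the analysis file: one row per physician, binary outcome
# (1 = satisfactory assessment) and the predictors described in the Method section.
rng = np.random.default_rng(0)
n = 617
df = pd.DataFrame({
    "any_cpd": rng.integers(0, 2, n),            # reported any CPD in the prior year
    "group_cpd": rng.integers(0, 2, n),
    "self_directed_cpd": rng.integers(0, 2, n),
    "assessment_cpd": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "age_lt_55": rng.integers(0, 2, n),
    "img": rng.integers(0, 2, n),
    "royal_college": rng.integers(0, 2, n),
    "northern_ontario": rng.integers(0, 2, n),
    "hours_per_week": rng.integers(1, 80, n),
    "visits_per_week": rng.integers(1, 250, n),
})
df["satisfactory"] = rng.binomial(1, 0.94, n)

covariates = ["male", "age_lt_55", "img", "royal_college", "northern_ontario",
              "hours_per_week", "visits_per_week"]

# Model 1: overall CPD participation; Model 2: the three CPD categories.
model1, kept1 = backward_eliminate(df, "satisfactory", ["any_cpd"] + covariates)
model2, kept2 = backward_eliminate(df, "satisfactory", ["group_cpd", "self_directed_cpd",
                                                        "assessment_cpd"] + covariates)

# Odds ratios and 95% confidence intervals for whichever terms survive elimination.
print(np.exp(model1.params))
print(np.exp(model1.conf_int()))
```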

Figure 1. Overview of the College of Physicians and Surgeons of Ontario peer assessment process, from a study investigating the relationship between physicians' performance, as evaluated through in-practice peer assessments, and their participation in continuing professional development, 2008–2009.

Table 1
College of Family Physicians of Canada and Royal College of Physicians and Surgeons of Canada Continuing Professional Development Program Activities and Categorizations

Group based
  Royal College MOC program: Rounds and journal clubs; Conferences (in Canada); Conferences (outside Canada); Small-group learning; Web-based continuing medical education
  CFPC MAINPRO program: Group learning; Practice-based small-group learning; Conferences; Advanced life support programs

Self-directed learning
  Royal College MOC program: Personal learning projects (i.e., self-directed, evidence-based practice reflection exercises that facilitate the integration of new knowledge and skills into practice); Traineeships; Courses, fellowships, etc.; Point-of-care learning; Critically appraised topics; Educational development (e.g., teaching, research, standards development)
  CFPC MAINPRO program: Linking learning to practice activities (e.g., conducting a research project relevant to family medicine, developing educational materials and/or clinical practice guidelines, committee work that involves the review of family medicine content, academic activities); Pearls (i.e., an evidence-based practice reflection exercise that facilitates the integration of new knowledge and skills into practice); Organized or self-directed fellowship; University program; Contributions to medical or academic community

Assessment based
  Royal College MOC program: Self-assessment programs; Practice review
  CFPC MAINPRO program: Family emergency room exam; Provincial review; Quality assurance audit

Abbreviations: MOC indicates maintenance of certification; MAINPRO, maintenance of proficiency.

Table 2
Distribution of Physician Personal and Practice Covariates of 617 Peer-Assessed Physicians, Ontario, Canada, From a Study of Continuous Professional Development and Peer Assessment, 2008–2009
No. (%) is shown for each category.

Specialty certification status: CFPC certification, 243 (39.4); Royal College certification, 374 (60.6)
Sex: Male, 398 (64.5); Female, 219 (35.5)
Age at assessment: Under 55, 439 (71.2); 55 and over, 178 (28.8)
IMG status: Canadian or U.S. graduate, 513 (83.6); IMG, 101 (16.4); Missing, 3a
Northern Ontario practice location: Primary practice in Northern Ontario, 36 (5.9); Primary practice not in Northern Ontario, 578 (94.1); Missing, 3a
Hours worked per week: 1–20, 80 (13.4); 21–30, 104 (17.4); 31–40, 132 (22.1); 41+, 280 (47.0); Missing, 21a
Patient visits per week: 1–50, 157 (26.3); 51–100, 203 (34.1); 101–150, 142 (23.8); 151+, 94 (15.8); Missing, 21a

Abbreviations: CFPC indicates College of Family Physicians of Canada; IMG, international medical graduate.
a Percentages were not calculated for missing data because the missing values were removed from the denominator for the other percentages calculated for this variable.

Results

The final sample size included 617 physicians. Of this group, 580 (94.0%) received a satisfactory peer assessment. Of the 617 physicians, 243 (39.4%) were family physicians, 219 (35.5%) were female, 439 (71.2%) were under the age of 55, and 101 (16.4%) were IMGs. Regarding hours worked and patient visits per week, 280 (47.0%) reported working more than 40 hours a week, and 360 (60.4%) saw fewer than 100 patients per week (Table 2). Participation in CPD in the year before their assessment was reported by 465 (77.5%) physicians; 441 (73.5%) reported group-based CPD, and 398 (66.3%) reported participation in more than one type of CPD activity.

Although self-directed and assessment-based activities were reported, the numbers of physicians who participated in these categories were considerably lower at 292 (48.7%) and 108 (18.0%), respectively (Table 3). Chi-square analyses indicated significantly lower proportions of unsatisfactory assessments in physicians who participated in CPD overall (χ2 = 5.32; P = .021), participated in a variety of CPD activities as opposed to no reported CPD or only one type of CPD reported (χ2 = 9.414; P = .009), and participated in group-based CPD (χ2 = 7.655; P = .006).

Table 3
Continuing Professional Development Activities in the One Year Prior to Peer Assessment, for 617 Physicians, Ontario, Canada, From a Study of Continuous Professional Development and Peer Assessment, 2008–2009
For each category: No. (%) of physicians, followed by the No. (%) within the unsatisfactory peer assessment category; χ2 (P value) is given for each variable.

Participation, overall (χ2 = 5.32; P = .021)
  No credits reported: 135 (22.5); unsatisfactory, 14 (10.4)
  Credits reported: 465 (77.5); unsatisfactory, 23 (4.9)
  Missing: 17a
Participated in a variety of CPD activities (χ2 = 9.414; P = .009)
  No CPD reported: 135 (22.5); unsatisfactory, 14 (10.4)
  One type of CPD reported: 67 (11.2); unsatisfactory, 7 (10.4)
  More than one type of CPD reported: 398 (66.3); unsatisfactory, 16 (4.0)
  Missing: 17a
Group-based CPD (χ2 = 7.655; P = .006)
  Yes: 441 (73.5); unsatisfactory, 20 (4.5)
  No: 159 (26.5); unsatisfactory, 17 (10.7)
  Missing: 17a
Self-directed CPD (χ2 = 2.890; P = .089)
  Yes: 292 (48.7); unsatisfactory, 13 (4.5)
  No: 308 (51.3); unsatisfactory, 24 (7.8)
  Missing: 17a
Assessment-based CPD (χ2 = 0.538; P = .463)
  Yes: 108 (18.0); unsatisfactory, 5 (4.6)
  No: 492 (82.0); unsatisfactory, 32 (6.5)
  Missing: 17a

Abbreviations: CPD indicates continuing professional development.
a Percentages were not calculated for missing data because the missing values were removed from the denominator for the other percentages calculated for this variable.
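As a reader's check, the univariate results can be re-derived from the counts published in Table 3. The sketch below is not the authors' code; it assumes the reported statistics are Pearson chi-square values without a continuity correction, which is consistent with the values shown, and it adds a crude (unadjusted) odds ratio for comparison with the adjusted model results that follow.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Group-based CPD (Table 3): 441 participants, 20 with unsatisfactory assessments;
# 159 non-participants, 17 with unsatisfactory assessments.
observed = np.array([[20, 441 - 20],
                     [17, 159 - 17]])
chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(round(chi2, 3), round(p, 3))  # about 7.66 and .006, matching the reported values

# Crude odds ratio for a satisfactory assessment with any reported CPD:
# 465 physicians with credits (23 unsatisfactory) versus 135 without (14 unsatisfactory).
odds_with_cpd = (465 - 23) / 23
odds_without = (135 - 14) / 14
print(round(odds_with_cpd / odds_without, 2))  # about 2.2, versus the adjusted OR of 2.5 below
```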

We used logistic regression to analyze the data according to the two models previously described. The regression for model 1 revealed that physicians who reported participating in CPD activities in the year before assessment were more likely (OR = 2.5; P = .021) to have a satisfactory assessment than those physicians who did not report participation in CPD (Table 4). The regression for model 2 indicated that physicians who reported participation in group-based CPD were more likely to have satisfactory assessments than those who did not report participation in group-based activities (OR = 2.4; P = .016) (Table 4). Reported participation in self-directed learning activities, assessment-based activities, and the interaction terms were not significant predictors of satisfactory performance in either model. In both models, males were significantly less likely to do well on assessment than females, and Royal College–certified specialists were more likely to do well on assessment than CFPC-certified family practitioners. Both findings are consistent with previous studies on the CPSO peer assessment program.33,42–44,46
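To make the reported odds ratios concrete, the short calculation below (not from the article) translates the model 1 OR of 2.5 onto the probability scale. It uses the crude baseline from Table 3 (121 of 135 physicians with no reported CPD received satisfactory assessments) purely as an illustrative reference point; the adjusted baseline from the regression itself is not reported, so the result is approximate.

```python
# Illustrative only: combine the adjusted OR with an unadjusted baseline from Table 3.
p0 = 121 / 135                     # satisfactory rate among physicians with no reported CPD
odds0 = p0 / (1 - p0)
odds1 = 2.5 * odds0                # apply the reported adjusted odds ratio
p1 = odds1 / (1 + odds1)
print(round(p0, 3), round(p1, 3))  # roughly 0.896 -> 0.956
```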

Discussion

In this study, we examined the predictive relationship between reported participation in CPD, different types of CPD, and practice performance as determined through in-practice peer assessment. We found a positive predictive effect of reported participation in CPD on physician performance for assessments completed in the year following the reported CPD activities. In addition, we found that group-based activities had a positive predictive effect on performance, whereas self-directed learning and assessment-based activities were not predictive of performance as determined by in-practice peer assessments.


Our findings suggest a positive relationship between CPD participation and physician performance in practice, providing further evidence for the policies of medical regulators, certifying bodies, and CPD provider organizations which require or promote participation in CPD activities. However, there are some caveats to consider when interpreting our findings. First, the CPD data are based on self-reported activities. As mentioned earlier, physicians report their CPD activities to the CFPC and Royal College over a five-year cycle. Some physicians will retroactively report participation in learning activities throughout the five years and not necessarily during the year the activity was completed. Therefore, the data likely represent only those CPD activities that physicians have completed and reported; they may not capture all completed learning activities. For this reason, we chose to examine participation in any CPD rather than to focus on amounts of CPD. Future studies should focus on total CPD activities (e.g., total credits) reported across a complete cycle and examine whether or not the timing of reporting in the cycle (e.g., reporting of all activities in the weeks before the end of the cycle versus regular reporting of activities throughout the cycle) is associated with satisfactory performance. The relationship between CPD and performance also needs to be examined for a threshold effect, to determine whether there is a minimum level of CPD required to support satisfactory performance. Second, our research team worked cooperatively to construct the categories of CPD types used in this study. Although we concluded that these broad categories were appropriate categorizations of the range of accredited learning activities, the question may be raised as to whether our allocation of individual CPD activities to each category could have influenced the results. However, our choice of allocation for a specific CPD activity would not likely have affected our overall findings, because our focus was on whether reported participation in any CPD activity affected performance.


Table 4
Significant Predictors in Multivariate Logistic Regression for Satisfactory Peer Assessment of 617 Physicians, Ontario, Canada, From a Study of Continuous Professional Development and Peer Assessment, 2008–2009a

Model 1. Variables included at the initial step of the backward stepwise regression: reported overall participation in CPD (yes/no); male (versus female); age less than 55 years (versus 55+ years); IMG (versus U.S./Canadian graduates); Royal College specialty certification (versus CFPC); Northern Ontario practice location (versus other practice location in Ontario); hours worked per week; patient visits per week. Variables remaining significant at the last iteration: reported participation in CPD (yes), OR 2.5 (95% CI, 1.14–5.26; P = .021); male, OR 0.4 (95% CI, 0.15–0.92; P = .033); Royal College certification, OR 2.9 (95% CI, 1.38–5.99; P = .005).

Model 2. Variables included at the initial step: reported participation in group-based CPD (yes/no); reported participation in assessment-based CPD (yes/no); reported participation in self-directed learning (yes/no); plus the same covariates as in model 1. Variables remaining significant at the last iteration: participation in group-based CPD (yes), OR 2.4 (95% CI, 1.18–5.05; P = .016); male, OR 0.4 (95% CI, 0.16–0.98; P = .046); Royal College certification, OR 2.6 (95% CI, 1.26–5.36; P = .010).

Abbreviations: CPD indicates continuous professional development; CFPC, College of Family Physicians of Canada; IMG, international medical graduate.
a Both models significant at P < .001 (3 df).

In future work, we hope to recruit a sufficient sample size to examine in greater detail the impact of individual types of CPD activities (e.g., practice-based small-group learning) on practice performance. Our third caveat, related to the point above, is that our findings indicate a significant positive predictive relationship with group-based activities but not with self-directed or assessment-based activities. This finding was somewhat unexpected and in contradiction to the literature. The lack of a significant predictive effect of self-directed and assessment-based CPD may be due to these activities' representing a far smaller proportion of the total CPD activities compared with group-based activities.


It would be well worth revisiting our analysis with CPD program data reported from a full five-year cycle, which could provide a more robust sample of self-directed and assessment-based activities. Further, although we focused on covariates that have been shown to influence physician practice performance, our list is not exhaustive. In future studies, other important performance factors (e.g., remuneration) should be explored because these may significantly influence the results presented here. Finally, some readers might note the very high percentage (94%) of physicians who did well on assessment and question the representativeness of this sample.

Previous publications based on the CPSO peer assessment program have shown that randomly selected family physicians typically do exceptionally well on assessment, with approximately 85% achieving satisfactory assessments.61 Given those findings, a rate of 94% for satisfactory assessments might appear high. However, this finding might be partially explained by the fact that Royal College–certified physicians tend to perform better on assessment than CFPC-certified physicians,61 and 60% of the study's sample was composed of specialists certified by the Royal College. Nonetheless, even in this high-performing population, we still find a considerable effect, an increase in the probability of a positive outcome with CPD participation, lending further confidence to our findings.


Notwithstanding these caveats, our findings contribute to the recently growing body of evidence62 supporting the premise that physicians who participate in CPD demonstrate satisfactory competence and performance in their practice. However, our data do not allow us to draw causal links between CPD participation and performance. As well, this study cannot allow us to conclude that the actual CPD activities provided target the skills, knowledge, and attitudes that physicians require to maintain ongoing levels of competence and performance in practice, or whether there are other factors contributing to our results. For example, perhaps physicians who regularly participate in CPD activities hold particular commitments to lifelong learning or attitudes toward medical practice that are the actual causal factors driving good performance. Similarly, it may be that physicians who participate in CPD activities structure their practices in particular ways that support competent performance in practice. Future studies are needed to investigate these and other possibilities. In the next phases of our research, our team will examine the predictive effects of engaging in specific types of CPD activities and performance within specific dimensions of care (e.g., chronic condition management, preventive care and health maintenance, acute condition management). We will also explore the relationships between participation in CPD activities and concerns from the public, including overall and specific complaints (e.g., communication, professionalism, quality of care). We hope this work will further add to understanding the impact of CPD on important dimensions of clinical care. In the meantime, individual physicians who participate in lifelong learning should be encouraged to know that their participation in CPD is a positive predictor of performance in practice reviews. CPD planners and providers can likewise be assured that their work holds potential for ensuring the satisfactory performance of physicians. Lastly, our study provides important evidence to support the current policy directions of various Canadian and American certification and regulatory bodies that mandate CPD and lifelong learning. In an age when the public demands increasing transparency and accountability in exchange for extending the responsibility of self-regulation to the medical profession, all evidence, in support of or in opposition to mandatory CPD, is important.


Funding/Support: National Association of Faculties of Medicine of Canada (AFMC) Standing Committee on Continuing Medical Education Continuing Medical Education/Continuing Professional Development Research Fund.

Other disclosures: None reported.

Ethical approval: Ethical approval for the study was provided by the Laurentian University research ethics board.

Disclaimer: The AFMC had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review, or approval of the manuscript. The College of Physicians and Surgeons of Ontario, the College of Family Physicians of Canada, and the Royal College of Physicians and Surgeons of Canada, as organizations, were not involved in the approval of this article. Coauthors from the College of Physicians and Surgeons of Ontario (Dr. McCauley), the College of Family Physicians of Canada (Dr. Marlow, formerly, and Ms. Hill), and the Royal College of Physicians and Surgeons of Canada (Dr. Campbell) provided oversight for the retrieval, linkage, and anonymization of data from their respective institutions, as well as input on the design and conduct of the study; collection, management, and interpretation of the data; and preparation, review, and approval of the manuscript.

Previous presentations: A summary of the preliminary findings of this study was presented at the Annual Spring Meeting of the Coalition for Physician Enhancement, June 7–8, 2012, Denver, Colorado; and CME Congress, May 31 to June 2, 2012, Toronto, Ontario.

Dr. Wenghofer is associate professor, School of Rural and Northern Health, and Northern Ontario School of Medicine, Laurentian University, Sudbury, Ontario, Canada.

Dr. Marlow is assistant professor, Department of Family and Community Medicine, University of Toronto, Toronto, Ontario, Canada.

Dr. Campbell is director of professional affairs, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada, and associate professor, Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada.

Dr. Carter is director, Centre for Flexible Learning, Nipissing University, North Bay, Ontario, Canada, and professor, Northern Ontario School of Medicine, Sudbury, Ontario, Canada.

Ms. Kam is a PhD student, School of Rural and Northern Health, Laurentian University, Sudbury, Ontario, Canada.

Dr. McCauley is medical advisor, Quality Management Division, College of Physicians and Surgeons of Ontario, Toronto, Ontario, Canada, and associate professor, Schulich School of Medicine & Dentistry, University of Western Ontario, London, Ontario, Canada.

Ms. Hill is former manager, Continuing Professional Development, College of Family Physicians of Canada, Mississauga, Ontario, Canada.

References 1 Lowe MM, Aparicio A, Galbraith R, Dorman T, Dellert E; American College of Chest Physicians Health and Science Policy Committee. The future of continuing medical education: Effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009;135(3 suppl):69S–75S. 2 Frank JR, ed. The CanMEDS 2005 Physician Competency Framework. Ottawa, Ontario, Canada: Royal College of Physicians and Surgeons of Canada; 2005. 3 Canadian Medical Association. CMA Policy Statement: CMA Code of Ethics. Ottawa, Ontario, Canada: Canadian Medical Association; 2004. ­http://policybase.cma.ca/ dbtw-wpd/PolicyPDF/PD04-06.pdf. Accessed February 15, 2014. 4 du Boulay C. From CME to CPD: Getting better at getting better? BMJ. 2000;320:393–394. 5 Shaw K, Cassel CK, Black C, Levinson W. Shared medical regulation in a time of increasing calls for accountability and transparency: Comparison of recertification in the United States, Canada, and the United Kingdom. JAMA. 2009;302:2008–2014. 6 Chaudhry HJ, Talmage LA, Alguire PC, Cain FE, Waters S, Rhyne JA. Maintenance of licensure: Supporting a physician’s commitment to lifelong learning. Ann Intern Med. 2012;157:287–289. 7 Chaudhry HJ, Rhyne J, Cain FE, Young A, Crane M, Bush F. Maintenance of licensure: Protecting the public, promoting quality health care. J Med Regul. 2010;96:13–20. http://www.fsmb.org/pdf/mol-bg.pdf. Accessed February 15, 2014. 8 Levinson W, Holmboe E. Maintenance of certification: 20 years later. Am J Med. 2011;124:180–185. 9 Miller SH, Thompson JN, Mazmanian PE, et al. Continuing medical education, professional development, and requirements for medical licensure: A white paper of the Conjoint Committee on Continuing Medical Education. J Contin Educ Health Prof. 2008;28:95–98. 10 American Board of Medical Specialties. ABMS maintenance of certification. http://www.abms.org/Maintenance_of_ Certification/ABMS_MOC.aspx. Accessed February 15, 2014. 11 College of Physicians and Surgeons of Ontario. Consultation on proposed amendments to the QA regulations. Dialogue. September 2009. http://www. cpso.on.ca/uploadedfiles/policies/ consultations/QAamendments_Sept09. pdf?terms=CPD+require. Accessed February 15, 2014. 12 Federation of Medical Regulatory Authorities of Canada Revalidation Working Group. Physician Revalidation: Maintaining Competence and Performance [position paper]. Federation of Medical Regulatory Authorities of Canada; July 4, 2007. http://www. fmrac.ca/committees/documents/final_reval_ position_eng.pdf. Accessed February 15, 2014. 13 Federation of State Medical Boards. Report of the Board of Directors: Maintenance of Licensure (BRD RPT 10-3). ­­http://www. fsmb.org/pdf/BD_RPT_10-3_MOL.pdf. Accessed February 15, 2014.


14 Levinson W. Revalidation of physicians in Canada: Are we passing the test? CMAJ. 2008;179:979–980. 15 Norman GR, Shannon SI, Marrin ML. The need for needs assessment in continuing medical education. BMJ. 2004;328:999–1001. 16 Borgiel AE, Williams JI, Davis DA, et al. Evaluating the effectiveness of 2 educational interventions in family practice. CMAJ. 1999;161:965–970. 17 Grol R. Improving the quality of medical care: Building bridges among professional pride, payer profit, and patient satisfaction. JAMA. 2001;286:2578–2585. 18 Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700–705. 19 Bloom BS. Effects of continuing medical education on improving physician clinical care and patient health: A review of systematic reviews. Int J Technol Assess Health Care. 2005;21:380–385. 20 Davis D, Galbraith R; American College of Chest Physicians Health and Science Policy Committee. Continuing medical education effect on practice performance: Effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009;135(3 suppl):42S–48S. 21 Scott IA, Phelps G, Brand C. Assessing individual clinical performance: A primer for physicians. Intern Med J. 2011;41:144–155. 22 Kühne-Eversmann L, Eversmann T, Fischer MR. Team- and case-based learning to activate participants and enhance knowledge: An evaluation of seminars in Germany. J Contin Educ Health Prof. 2008;28:165–171. 23 Gray J. Changing physician prescribing behaviour. Can J Clin Pharmacol. 2006;13:e81–e84. 24 Duffy FD, Holmboe ES. Self-assessment in lifelong learning and improving performance in practice: Physician know thyself. JAMA. 2006;296:1137–1139. 25 Byrick RJ, Naik VN, Wynands JE. Simulation-based education in Canada: Will anesthesia lead in the future? Can J Anaesth. 2009;56:273–275. 26 American Board of Medical Specialties. The value of ABMS MOC. http://www.abms.org/Maintenance_of_Certification/value_of_MOC.aspx. Accessed February 15, 2014. 27 Jamtvedt G, Young JM, Kristoffersen DT, O'Brien MA, Oxman AD. Audit and feedback: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2006;(2):000259. 28 O'Brien MA, Rogers S, Jamtvedt G, et al. Educational outreach visits: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2007;(4):CD000409. 29 Mansouri M, Lockyer J. A meta-analysis of continuing medical education effectiveness. J Contin Educ Health Prof. 2007;27:6–15. 30 Bennett NL, Davis DA, Easterling WE Jr, et al. Continuing medical education: A new vision of the professional development of physicians. Acad Med. 2000;75:1167–1172.


31 Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226–235. 32 Naik VN, Wong AK, Hamstra SJ. Review article: Leading the future: Guiding two predominant paradigm shifts in medical education through scholarship. Can J Anaesth. 2012;59:213–223. 33 Wenghofer EF, Williams AP, Klass DJ. Factors affecting physician performance: Implications for performance improvement and governance. Healthc Policy. 2009;5:e141–e160. 34 Wenghofer EF, Williams AP, Klass DJ, Faulkner D. Physician–patient encounters: The structure of performance in family and general office practice. J Contin Educ Health Prof. 2006;26:285–293. 35 Klass D. A performance-based conception of competence is changing the regulation of physicians’ professional behavior. Acad Med. 2007;82:529–535. 36 Blasier RB. The problem of the aging surgeon: When surgeon age becomes a surgical risk factor. Clin Orthop Relat Res. 2009;467:402–411. 37 Lipner R, Song H, Biester T, Rhodes R. Factors that influence general internists’ and surgeons’ performance on maintenance of certification exams. Acad Med. 2011;86:53–58. 38 Dahrouge S, Hogg WE, Russell G, et al. Impact of remuneration and organizational factors on completing preventive manoeuvres in primary care practices. CMAJ. 2012;184:E135–E143. 39 Royal College of Physicians and Surgeons of Canada. Maintenance of certification. www.royalcollege.ca/portal/page/portal/rc/ members/moc. Accessed February 15, 2014. 40 College of Family Physicians of Canada. Introduction to MAINPRO (maintenance of proficiency). http://www.cfpc.ca/MAINPRO/. Accessed February 15, 2014. 41 College of Physicians and Surgeons of Ontario. www.cpso.on.ca. Accessed February 22, 2014. 42 Norton PG, Faulkner D. A longitudinal study of performance of physicians’ office practices: Data from the peer assessment program in Ontario, Canada. Jt Comm J Qual Improv. 1999;25:252–258. 43 Norton PG, Dunn EV, Soberman L. What factors affect quality of care? Using the peer assessment program in Ontario family practices. Can Fam Physician. 1997;43:1739–1744. 44 Wenghofer E, Klass D, Abrahamowicz M, et al. Doctor scores on national qualifying examinations predict quality of care in future practice. Med Educ. 2009;43:1166–1173. 45 Norton PG, Dunn EV, Soberman L. Family practice in Ontario. How physician demographics affect practice patterns. Can Fam Physician. 1994;40:249–256. 46 McAuley RG, Paul WM, Morrison GH, Beckett RF, Goldsmith CH. Five-year results of the peer assessment program of the College of Physicians and Surgeons of Ontario. CMAJ. 1990;143:1193–1199. 47 Grace ES, Korinek EJ, Weitzel LB, Wentz DK. Physicians reentering clinical practice: Characteristics and clinical abilities. J Contin Educ Health Prof. 2011;31:49–55.

48 Grace ES, Korinek EJ, Tran ZV. Characteristics of physicians referred for a competence assessment: A comparison of state medical board and hospital referred physicians. J Med Regul. 2011;96:8–15. 49 Turnbull J, Carbotte R, Hanna E, et al. Cognitive difficulty in physicians. Acad Med. 2000;75:177–181. 50 Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: The relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260–273. 51 Caulford PG, Lamb SB, Kaigas TB, Hanna E, Norman GR, Davis DA. Physician incompetence: Specific problems and predictors. Acad Med. 1994;69(10 suppl): S16–S18. 52 Norman GR, Davis DA, Lamb S, Hanna E, Caulford P, Kaigas T. Competency assessment of primary care physicians as part of a peer review program. JAMA. 1993;270:1046–1051. 53 Slade S, Busing N. Weekly work hours and clinical activities of Canadian family physicians: Results of the 1997/98 National Family Physician Survey of the College of Family Physicians of Canada. CMAJ. 2002;166:1407–1411. 54 Reid RO, Friedberg MW, Adams JL, McGlynn EA, Mehrotra A. Associations between physician characteristics and quality of care. Arch Intern Med. 2010;170:1442–1449. 55 Tamblyn R, Abrahamowicz M, Dauphinee D, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA. 2007;298:993–1001. 56 Goedhuys J, Rethans JJ. On the relationship between the efficiency and the quality of the consultation. A validity study. Fam Pract. 2001;18:592–596. 57 Freeman GK, Horder JP, Howie JG, et al. Evolving general practice consultation in Britain: Issues of length and context. BMJ. 2002;324:880–882. 58 Howie JG, Heaney DJ, Maxwell M. Measuring quality in general practice: Pilot study of a needs, process and outcome measure. Occas Pap R Coll Gen Pract. 1997;75:1–32, i–xii. 59 Chan BTB, Schultz SE. Supply and Utilization of General Practitioner and Family Physician Services in Ontario: ICES Investigative Report. Toronto, Ontario, Canada: Institute for Clinical Evaluative Sciences; 2005. 60 Tepper J, Schultz SE, Rothwell DM, Chan BTB. Physician Services in Rural and Northern Ontario: ICES Investigative Report. Toronto, Ontario, Canada: Institute for Clinical Evaluative Sciences; 2006. 61 College of Physicians and Surgeons of Ontario. Annual Report 2011. Toronto, Ontario, Canada: College of Physicians and Surgeons of Ontario. http://www.cpso.on.ca/ uploadedFiles/policies/publications/AR11. pdf. Accessed February 15, 2014. 62 Goulet F, Hudon E, Gagnon R, Gauvin E, Lemire F, Arsenault I. Effects of continuing professional development on clinical performance: Results of a study involving family practitioners in Quebec. Can Fam Physician. 2013;59:518–525.
