Editorial

Integrating Research and Practice: Proceedings of the Research Track of the 2015 Alliance for Continuing Education in the Health Professions Annual Meeting

CURTIS A. OLSON, PHD

This supplemental issue of JCEHP carries forward an initiative begun in 2014 by the Alliance for Continuing Education in the Health Professions. The goals of this initiative, as articulated by Kues and Doyle-Scharff in an article that introduces this special issue, ". . . have been to provide tiered educational opportunities for attendees to learn skills and tools to help them understand, and apply, research, and to provide a forum for researchers to present completed studies that impact CEHP practice."1(pS4)

In its ongoing pursuit of these goals, the Alliance again included a research track in its 40th annual meeting, held in Dallas, Texas, January 14–19, 2015. JCEHP's editors were invited to lead the abstract review process and secured a commitment from Pfizer Inc for funds to underwrite publication of this supplement. As required by JCEHP's supplement policy, the journal was actively involved in reviewing and selecting abstracts for presentation during the conference. These activities included drafting the call for research abstracts, overseeing the screening and peer review of research abstracts, recruiting peer reviewers and managing the peer review process, recommending abstracts for acceptance to the conference, and supporting the researchers and discussants at the meeting in Dallas. JCEHP's editors solely determined which abstracts would appear in the supplement, extended invitations to selected authors to submit full manuscripts, and implemented a second round of peer review for manuscripts.

Changes made to the process based on last year's experience included increasing the abstract word limit from 250 to 500 words and specifying that only completed research projects were eligible. The editor-in-chief also facilitated a workshop during the annual meeting on how to avoid common errors when submitting abstracts for the ACEhp Research Track. As TABLE 1 shows, 34 abstracts were submitted for the meeting and peer reviewed, double the previous year's total. Of these, 12 were recommended for acceptance in the research track.

TABLE 1. Abstracts and Manuscripts Submitted to the 2015 Annual Meeting of the Alliance for Continuing Education in the Health Professions (ACEhp) and Volume 35, Issue S1 of the Journal of Continuing Education in the Health Professions (JCEHP)

                                                                    Abstracts   Manuscripts
Abstracts considered for the ACEhp research track                       34          —
Accepted for inclusion in conference program                            12          —
Invited to submit an extended abstract or full manuscript for the
  JCEHP supplement following presentation at the meeting                 6           6
Submissions received                                                    10           2*
Accepted for publication in JCEHP                                       10           2

*All manuscripts were peer reviewed a second time following JCEHP's standard protocol for submissions.

Disclosure: The author reports none. Dr. Olson: Editor-in-Chief, Journal of Continuing Education in the Health Professions.

Correspondence: Curtis A. Olson, 455 Presidential Lane, Madison, WI 53711; e-mail: [email protected].

© 2015 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education. Published online in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/chp.21274


For the supplement, authors were invited to submit either extended abstracts (n = 6) or full manuscripts (n = 6). Two manuscripts were received and accepted following a second round of peer review, and 10 abstracts were ultimately accepted for publication. As was the case last year, this supplement does not simply re-present the 500-word abstracts that were accepted for the meeting. Instead, the reader will find longer versions (approximately 750 words for abstracts, 3000 words for articles), and these have benefited from the feedback the authors received from the audience following their presentations at the meeting, the additional round of peer review for articles, and close editing by JCEHP's staff.

The articles and abstracts in this issue span a broad range of research topics, encompassing theory building, program evaluation, comparative effectiveness, developing a shared terminology for the field, managing conflicts of interest, and evaluation methodology.

The article by Tara Herrmann and colleagues2 discusses the impact of a personalized online CME curriculum for oncologists. Participants first took a test to assess their needs and then, based on their answers, were provided with a customized learning plan. Although the study was uncontrolled, the results suggest that this approach can yield improvements at the level of knowledge and competence.

The article by Williams, Kessler, and Williams3 describes a study examining the relationships among self-efficacy, motivation to change, and acquisition of knowledge in the context of a CME activity. Their results suggest that CME planners might consider assessing and explicitly targeting self-efficacy as a means of enhancing the impact of educational interventions aimed at changing clinical practice.

Next is an abstract by Menzies, Duz, and Kinch,4 which reports on a preliminary study to assess the impact of a point-of-care tool designed to give family physicians and cardiologists immediate feedback on how accurately they scored patients' risk of stroke when assessing the need to initiate anticoagulation. The results suggest that point-of-care feedback may be an effective component of interventions aimed at changing practice.

In their study of the comparative effectiveness of two educational strategies, Mehta and colleagues5 sought to determine whether educational activities that guided participants to personalized learning paths were more or less effective than those in which the learning path was determined by the learner. Their findings suggest that educational activities that help physicians select topics tailored to their individual needs may be more effective at enhancing learning and, ultimately, clinical practice.

Van Hoof and colleagues6 provide an overview of a major study aimed at developing a common terminology around four types of educational activities (performance measurement and feedback, practice facilitation, educational meetings, and interprofessional education) to address persistent problems in how educational interventions are described and to enhance sensitivity to contextual factors that influence the impact of those interventions.

Another educational case study is described by Leong and colleagues.7 The intervention combined strategies from continuing education and quality improvement in an effort to reduce the incidence of venous thromboembolism (VTE) among patients in a comprehensive clinical cancer center. The project resulted in measurable improvements at multiple levels, but not in the incidence of VTE. The authors provide an analysis of possible reasons for these results.

In their study on evaluation methodology, Heintz and Fagerlie8 address a practical question of concern to educational program evaluators: whether unpaired comparisons of pre- and post-outcomes data can be used in lieu of paired comparisons. Their results suggest that, for many purposes, unpaired pre-post comparisons can yield an acceptable estimate of changes in the outcome measures used in the continuing education of health professionals. (An illustrative sketch of the paired-versus-unpaired comparison appears after the concluding paragraph below.)

Schwarz et al.9 report on a case study of an intervention aimed at reducing participants' perceptions of bias in continuing education activities. They found that a combination of systematic disclosure of potential conflicts, prescreening of presentations for bias, and enhanced communication of expectations to presenters and reviewers did result in a significant decrease in reports of bias.

Meadows and Weiss10 report on an innovative approach to needs assessment on the clinical topic of obesity. In their study, a large number of patients were surveyed and, based on the data, segmented into four groups, each requiring a somewhat different approach from the clinician.

Taking advantage of a unique dataset, Salinas11 reports on a study of CME effectiveness. Using 9 years of data from a commercial CME provider, he examined the impact of various categories of CME activities on Level 5 (clinical practice) outcomes as measured using a case vignette survey. He found that the average effect size for different therapeutic areas ranged from 0.50 (infectious disease) to 0.83 (psychiatry) and found some important differences in effectiveness based on the type of educational activity.

The case study by Greene et al.12 reports on the impact of an educational intervention focused on rheumatoid arthritis. Chart audits before and after the intervention showed that, with a combination of print-based and video educational material, individual audit and feedback sessions with a pharmacist, and small-group webinars led by an expert rheumatologist, there were significant increases in the rate of adherence on 4 of 6 PQRS quality measures.

The experience of 2015 suggests that a growing number of CEHP professionals are looking to the ACEhp Annual Meeting as a means of disseminating the results of their investigations and reaching a larger audience. Although the quality of the outcome measures used in evaluation studies has shown improvement, there is an overreliance on single-case designs for assessing the impact of educational interventions. Naturalistic studies are heavily represented in this supplemental issue, and it is widely recognized that such studies often preclude the use of rigorous controls. Nevertheless, investigators should make greater use of designs that strengthen the validity of their work. These might include, for example, realist evaluation13 or more sophisticated case study designs such as comparative case study14 and process tracing.15,16
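To make the paired-versus-unpaired question concrete, the sketch below simulates pre- and post-activity test scores for a hypothetical cohort and contrasts the two comparisons, along with Cohen's d, a standardized effect-size measure of the kind reported by Salinas.11 It is illustrative only: the sample size, score scale, and variance parameters are assumptions, not values drawn from any study in this supplement.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical pre/post scores for n learners on a 0-100 scale; the
# simulated true gain is 8 points, and baselines vary across learners
# more than individual gains do (all parameters are assumptions).
n = 120
baseline = rng.normal(60, 12, n)           # each learner's underlying ability
pre = baseline + rng.normal(0, 5, n)       # pre-activity test score
post = baseline + 8 + rng.normal(0, 5, n)  # post-activity test score

# Paired comparison: uses each learner's own change score.
t_paired, p_paired = stats.ttest_rel(post, pre)

# Unpaired comparison: treats pre and post as independent groups,
# as if responses could not be linked back to individuals.
t_unpaired, p_unpaired = stats.ttest_ind(post, pre)

# Cohen's d on the pooled standard deviation -- the kind of
# standardized effect size behind figures such as 0.50-0.83.
pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
d = (post.mean() - pre.mean()) / pooled_sd

print(f"paired:   t = {t_paired:5.2f}, p = {p_paired:.2g}")
print(f"unpaired: t = {t_unpaired:5.2f}, p = {p_unpaired:.2g}")
print(f"Cohen's d = {d:.2f}")
```

Both comparisons recover essentially the same mean gain; the paired test is simply more powerful because it removes between-learner variance. When responses cannot be linked to individuals, as is common with anonymous evaluations, the unpaired estimate may still be serviceable, which is the practical point Heintz and Fagerlie8 examine.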

References

1. Kues JR, Doyle-Scharff M. Overview of the Alliance 2015 Research Track. J Contin Educ Health Prof. 2015;35(S1):S4.
2. Herrmann T, Peters P, Williamson C, Rhodes E. Educational outcomes in the era of the Affordable Care Act: impact of personalized education about non-small cell lung cancer. J Contin Educ Health Prof. 2015;35(S1):S5–S12.
3. Williams B, Kessler H, Williams M. Relationship among knowledge acquisition, motivation to change, and self-efficacy in CME participants. J Contin Educ Health Prof. 2015;35(S1):S13–S21.
4. Menzies S, Duz J, Kinch R. Knowledge transfer at point-of-care: investigating new strategies for implementing guideline recommendations. J Contin Educ Health Prof. 2015;35(S1):S22–S23.
5. Mehta N, Geissel K, Rhodes E, Salinas G. Comparative effectiveness in CME: evaluation of personalized and self-directed learning models. J Contin Educ Health Prof. 2015;35(S1):S24–S26.
6. Van Hoof T, Sajdlowska J, Grant R, Kitto S. Context and terminology in continuing education: improving the use of interventions in quality improvement and research. J Contin Educ Health Prof. 2015;35(S1):S27–S28.
7. Leong L, Mendelsohn M, Saavedra C, Morgan R. Quality improvement education for venous thromboembolism (VTE) prevention in cancer. J Contin Educ Health Prof. 2015;35(S1):S29–S30.
8. Heintz A, Fagerlie S. Competence assessments: to pair or not to pair, that is the question. J Contin Educ Health Prof. 2015;35(S1):S31–S32.
9. Schwarz E, Stain S, Shadduck P, et al. A comprehensive process for identifying and managing conflicts of interest reduced perceived bias at a Specialty Society Annual Meeting. J Contin Educ Health Prof. 2015;35(S1):S33–S35.
10. Meadows S, Weiss K. Pre-assessment of weight management practices by providers and patients from 2 community primary care clinic networks. J Contin Educ Health Prof. 2015;35(S1):S36–S37.
11. Salinas G. CME effectiveness: utilizing outcomes assessments of 600+ CME programs to evaluate the association between format and effectiveness. J Contin Educ Health Prof. 2015;35(S1):S38–S39.
12. Greene L, Sapir T, Rusie E, Carter J, Moreo K. Impact of quality improvement education on adherence to quality measures for rheumatoid arthritis. J Contin Educ Health Prof. 2015;35(S1):S40–S41.
13. Pawson R, Tilley N. Realistic Evaluation. London, England: Sage; 1997.
14. Stake RE. Multiple Case Study Analysis. New York, NY: Guilford Press; 2006.
15. George AL, Bennett A. Case Studies and Theory Development in the Social Sciences. Cambridge, MA: MIT Press; 2005.
16. Bennett A, Checkel JT. Process Tracing: From Metaphor to Analytic Tool. New York, NY: Cambridge University Press; 2015.
