Opinion

Implementation science and comparative effectiveness research: a partnership capable of improving population health



Keywords:  comparative effectiveness • complexity • external validity • implementation

Although the fields of comparative effectiveness research (CER) [1] and implementation science (IS) [2,3] both have historical precursors, they are relatively new and emerging fields of health research [4]. CER and IS share similar goals, such as studying interventions under more typical or 'real-world' conditions and improving health outcomes by translating research and evidence-based findings into practice. The intersection of IS and CER with areas such as pragmatic trials, quality improvement and evaluation research further strengthens this connection. Furthermore, functional connections created between IS and CER within clinical translational science awards, publications and training courses suggest an increasing acknowledgement of the role IS can play in informing CER [4–6]. Despite this overlap and acknowledged connection, many strategies, methods and findings from IS are not routinely used in the design and conduct of CER studies. More consistent incorporation of key IS principles and lessons could enhance the public health impact of CER. The purpose of this article is to summarize key findings from IS in the form of recommendations for CER knowledge generators and users that could enhance the consistency and speed of translation of CER findings into practice. Elsewhere [4], we have termed such strategies CER-T: CER that will translate. We enumerate and briefly describe key IS approaches in the form of ten interrelated recommendations, beginning with the more commonly discussed issues.

10.2217/CER.14.9 © 2014 Future Medicine Ltd

Incorporate multiple stakeholder perspectives
Others, and especially the Patient-Centered Outcomes Research Institute [7], have emphasized patient or community engagement, but IS emphasizes that there are almost always multiple stakeholders, whose perspectives may not align, and that it is important to understand these multiple perspectives [8]. This involvement should continue throughout a program, not just during the initial formative stage, although it need not involve the same community members at each stage [8,9]. In CER, this might mean that stakeholders (practitioners, patients and community members, among others) identify the research questions, provide input on settings and target populations, and collaborate on study design, identification of outcomes and creation of products for dissemination.

Russell E Glasgow, Author for correspondence: Department of Family Medicine & Colorado Health Outcomes Program, School of Medicine, University of Colorado Denver, 13199 E Montview Boulevard, Suite 300, MS F443, Room 323, Aurora, CO 80045, USA; russell.glasgow@ucdenver.edu

Borsika A Rabin, Department of Family Medicine & Colorado Health Outcomes Program, School of Medicine, University of Colorado Denver, 13199 E Montview Boulevard, Suite 300, MS F443, Room 323, Aurora, CO 80045, USA

Design for dissemination & sustainability
Planning for widespread use and continuation of an intervention beyond a funded evaluation period needs to begin during the earliest planning stages, and should inform choices about program content, modalities, who delivers the intervention, program intensity, resources needed for implementation and other features. IS emphasizes that dissemination and sustainability will not simply occur naturally; they need to be designed into programs and policies at an early stage rather than as an afterthought [2].

J. Compar. Effect. Res. (2014) 3(3), 237–240


ISSN 2042-6305


Consider multilevel context
IS generally adopts a multilevel socioecological systems perspective [10]. This perspective emphasizes that even a seemingly straightforward patient-focused intervention usually has related staff- and setting-level factors that influence its successful implementation. More complex interventions and programs, especially those designed to address complex, multidetermined problems such as substance abuse, cancer, heart disease or multiple chronic conditions, almost always require multilevel solutions [11,12]. This often means constructing and evaluating an intervention strategy that can act at more than one level, or designing multicomponent programs with strategies and evaluation components addressing various levels (e.g., patient, provider and system).

Address external as well as internal validity
Most research evaluation criteria and design recommendations focus predominantly on internal validity [13]. IS emphasizes that external validity is an equally important consideration for any program intended for eventual dissemination [14]. In addition to traditional outcomes, CER should include and integrate IS process and outcome measures related to external validity. Questions central to adoption and public health impact include the reach and generalization of results across patient, staff, setting and policy contexts [15,16]. Exclusive use of traditional designs that privilege internal validity over external validity by controlling out, rather than taking into account, contextual variation can limit our ability to understand the real-world applicability of an intervention [14].

Resource & cost issues are central
Although economic analyses are considered controversial in some quarters and are not allowed by some CER funding groups, the costs and resources required by different interventions are always central to decision-making.
CER is about added or comparative value, and knowing the personnel, equipment, overhead, recruitment, implementation and replication costs is essential to determining adoption, successful implementation and sustainability [17,18]. Such costs should be reported routinely and in consistent ways.

Variations & adaptations of evidence-based practices & guidelines happen & are informative
Few programs or policies are implemented in real-world settings in the way they were in research protocols, or even in the same way at different locations in multisite trials. Instead of assuming that all adaptations or



variations are bad, however (and some are), research should transparently report and seek to understand adaptations and their effects [19].

Don't just 'look under the lamp post': focus on the denominator
Since the advent of the CONSORT criteria, researchers have done a good job of reporting on the persons who take part in research and who are present at assessments (i.e., the numerator). Such reporting is not required, and is done much less often, at the setting or staff level than at the patient level, but the principle generalizes to the multiple levels discussed above. IS pushes us to understand the broader target populations (i.e., the denominator), and who declines to participate, and why, at each of these levels. This can not only inform the generalizability of our findings, but also help answer the broader realist or contextual question of with whom, and under what circumstances, a given intervention works [3,20,21].

CER research is complex & should be approached as such
Important, challenging implementation problems are seldom simple, and seldom have simple, straightforward solutions [11]. Interventions often have multiple effects, some of them unintended or delayed, and these complex systems effects should be anticipated and reported [8]. At present, understanding such complex and contextual effects is best approached via qualitative or mixed methods [21].

When investigating complex programs, consider the 'cascade' of steps that needs to occur for successful outcomes
Programs often involve a complex sequence of intervention components, many contingent on earlier steps – for example, recruiting key practitioners who influence or train others, recruiting and retaining a high percentage of the target audience, consistent delivery by diverse staff, and sustaining program components. Using IS models and approaches [15,22] to identify and address 'weak links' in this chain can facilitate planning, quality control and program success.
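The denominator and 'cascade' points above can be made concrete with a small numerical sketch. The numbers and step labels below are entirely hypothetical; the point is simply that when each step is reported as a numerator over its eligible denominator, in the spirit of the RE-AIM framework [15], the compounding effect of 'weak links' becomes visible.

```python
# Hypothetical illustration: report each level as numerator/denominator,
# then multiply the proportions to see how losses at each step compound.
steps = [
    ("clinics adopting", 9, 30),       # settings agreeing / settings approached
    ("staff delivering", 24, 40),      # staff delivering / staff in adopting clinics
    ("patients reached", 412, 2575),   # participants / eligible patients
]

overall = 1.0
for label, numerator, denominator in steps:
    proportion = numerator / denominator
    overall *= proportion
    print(f"{label}: {numerator}/{denominator} = {proportion:.0%}")

# Even moderately 'leaky' steps multiply to a small share of the target population.
print(f"Cumulative share of the full target population: {overall:.2%}")
```

In this sketch, step-level proportions of 30%, 60% and 16% multiply to under 3% of the full target population, which is why reporting denominators at every level, not just the patient level, helps identify which link in the chain most needs strengthening.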
CER needs more practical models, measures & methods
CER needs more practical models, measures and methods [23]. IS teaches us that overly complicated, intensive, lengthy theories, measurement approaches and research methods are much less likely to be adopted in real-world settings and, if adopted, are less likely to be successfully implemented or sustained [18,24].



The above IS lessons and recommendations are interrelated and need to be employed in an integrated fashion to substantially enhance the value of CER; no one activity alone will dramatically increase information value. We acknowledge that it is not possible to employ all of these recommendations in every CER study. Taken together, however, these recommendations can help to substantially enhance the relevance and value of CER for decision-making. A recent NIH/VA conference of IS experts concluded that these and similar recommendations are needed to increase the value of research and to focus attention and reporting across the various planning, implementation, reporting and long-term outcomes stages of research [Neta G, Brownson RC, Carpenter C et al. A framework for enhancing the value of research for dissemination and implementation (2013), Submitted].

To achieve high levels of internal validity and large effect sizes, there is a tendency to narrow the focus of research to very small segments of settings, staff and patients who are highly motivated and have substantial resources to participate in intensive interventions and very comprehensive assessments – all focused on a single or very small set of outcomes. While such approaches do simplify experimental design and reporting, they do so at the expense of contextual knowledge, generalization and some of the most essential learning if one is concerned with identifying the range and limits of comparative approaches [3]. Applying the IS lessons above could help CER become of greater relevance and value to stakeholders, including patients and families, practitioners and policy makers, who ultimately have to make decisions about research [4,23].

Acknowledgements
The authors thank David West for his review and constructive feedback on an earlier version of this paper.

Financial & competing interests disclosure
The authors have no relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript. This includes employment, consultancies, honoraria, stock ownership or options, expert testimony, grants or patents received or pending, or royalties. No writing assistance was utilized in the production of this manuscript.

References
1. Sox HC, Goodman SN. The methods of comparative effectiveness research. Annu. Rev. Public Health 33, 425–445 (2012).
2. Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press, NY, USA (2012).
3. Glasgow RE, Chambers D. Developing robust, sustainable, implementation systems using rigorous, rapid and relevant science. Clin. Transl. Sci. 5, 48–55 (2012).
4. Glasgow RE, Steiner JF. Comparative effectiveness research to accelerate translation: recommendations for an emerging field of science. In: Dissemination and Implementation Research in Health: Translating Science and Practice. Brownson RC, Colditz G, Proctor E (Eds). Oxford University Press, NY, USA, 72–93 (2012).
5. Bonham AC, Solomon MZ. Moving comparative effectiveness research into practice: implementation science and the role of academic medicine. Health Aff. (Millwood) 29, 1901–1905 (2010).
6. Morrato EH, Concannon TW, Meissner P, Shah ND, Turner BJ. Dissemination and implementation of comparative effectiveness evidence: key informant interviews with Clinical and Translational Science Award institutions. J. Comp. Eff. Res. 2, 185–194 (2013).
7. Selby JV, Beal AC, Frank L. The Patient-Centered Outcomes Research Institute (PCORI) national priorities for research and initial research agenda. JAMA 307, 1583–1584 (2012).
8. Hovmand PS. Community-Based System Dynamics. Springer, NY, USA (2014).
9. Mullins CD, Abdulhalim AM, Lavallee DC. Continuous patient engagement in comparative effectiveness research. JAMA 307, 1587–1588 (2012).
10. Stange KC, Breslau ES, Dietrich AJ, Glasgow RE. State-of-the-art and future directions in multilevel interventions across the cancer control continuum. J. Natl Cancer Inst. Monogr. 2012, 20–31 (2012).
11. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M; Medical Research Council Guidance. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ 337, a1655 (2008).
12. Stange KC, Glasgow RE. Considering and Reporting Important Contextual Factors in Research on the Patient-Centered Medical Home. Agency for Healthcare Research and Quality, AHRQ Publication No. 13-0045-EF, MD, USA (2013).
13. Cochrane reviews. www.cochrane.org/cochrane-reviews
14. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validity and translation methodology. Eval. Health Prof. 29, 126–153 (2006).
15. Gaglio B, Shoup JA, Glasgow RE. The RE-AIM framework: a systematic review of use over time. Am. J. Public Health 103, e38–e46 (2013).
16. Klesges LM, Williams NA, Davis KS, Buscemi J, Kitzmann KM. External validity reporting in behavioral treatment of childhood obesity: a systematic review. Am. J. Prev. Med. 42, 185–192 (2012).
17. Ritzwoller DP, Sukhanova A, Gaglio B, Glasgow RE. Costing behavioral interventions: a practical guide to enhance translation. Ann. Behav. Med. 37, 218–227 (2009).
18. Rogers EM. Diffusion of Innovations. Free Press, NY, USA (2003).
19. Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement. Sci. 8, 65 (2013).
20. Pawson R, Tilley N. Realistic Evaluation. Sage Publications, CA, USA (1997).
21. McDonald KM. Considering context in quality improvement interventions and implementation: concepts, frameworks, and application. Acad. Pediatr. 13, S45–S53 (2013).
22. Stringer EM, Ekouevi DK, Coetzee D et al. Coverage of nevirapine-based services to prevent mother-to-child HIV transmission in 4 African countries. JAMA 304, 293–302 (2010).
23. Glasgow RE. What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Educ. Behav. 40, 257–265 (2013).
24. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC. Dissemination and implementation research on community-based cancer prevention: a systematic review. Am. J. Prev. Med. 38, 443–456 (2010).
