Adm Policy Ment Health DOI 10.1007/s10488-015-0714-1

POINT OF VIEW

Democratizing Implementation and Innovation in Mental Health Care

Glenn Saxe1 · Mary Acri1,2

© Springer Science+Business Media New York 2016

Abstract Improvements in the quality of mental health care in the United States depend on the successful implementation of evidence-based treatments (EBT's) in typical settings of care. Unfortunately, there is little evidence that EBT's are used in ways that approximate their established fidelity standards in such settings. This article describes an approach to more successful implementation of EBT's via a collaborative process between intervention developers and intervention users (e.g. providers, administrators, consumers) called lead-user innovation. Lead-user innovation democratizes the implementation process by integrating the expertise of lead-users in the delivery, adaptation, innovation, and evaluation of EBT's.

Keywords Lead-user innovation · Democratizing innovation · Treatment innovation

Corresponding author: Mary Acri, [email protected]

1 Department of Child and Adolescent Psychiatry, New York University, New York, USA
2 The McSilver Institute for Poverty, Policy, and Research, The Silver School of Social Work, New York, USA

Introduction

Most strategies offered to improve the quality of mental health care in the United States involve the implementation of evidence-based treatments (EBT's) in typical settings of care (Drake et al. 2001; Garland et al. 2008; Hoagwood et al. 2001; Kazdin 2008). Yet, despite two decades of effort to accomplish this task, there is little evidence that EBT's are implemented, in ways that would approximate their respective fidelity standards, at any meaningful scale (Beidas et al. 2015; Hoagwood et al. 2001; Hoagwood and Olin 2002; Garland et al. 2010; Kendall and Beidas 2007; Torrey et al. 2014). Great caution is expressed within our field about implementing an EBT in ways that deviate from the fidelity standard used in the clinical trial from which it gained its evidence base (Moore et al. 2013). In some cases, deviations from fidelity have been shown to lead to smaller effect sizes than those observed in the clinical trials (Breitenstein et al. 2010). Although we are sympathetic to this concern, we also believe it represents the primary barrier to our field's capacity to improve the quality of mental health care in the United States (Garland et al. 2008). Our field needs to extract itself from a terrible paradox: in the pursuit of effect sizes commensurate with those observed in the clinical trial that established the evidence base by which an intervention became an EBT, we insist on a fidelity standard that makes the achievement of that effect size nearly impossible. This pursuit undermines the capacity of that EBT to affect the mental health problem it was designed to address at a scale that can have more than marginal impact on the population that suffers from that problem.1 In this article, we describe the nature of this paradox and offer an approach that may help find a way out of it.

1 EBT's delivered in usual care settings, with the intervention supported by grant funds, are not considered in this article. We do not believe such an approach is scalable.



Implementation with Incomplete Information

What sort of information does an intervention's fidelity standard contain? A fidelity standard contains information about procedures that should be delivered by a mental health provider to a person with a mental health problem, with the intention of leading to some defined improvement in that problem. Such procedures are usually written in a standardized format so that they can be reliably delivered.

What sort of information does an intervention's fidelity standard not contain? An intervention's fidelity standard will usually contain little information about how the procedures it specifies can be applied to the specific setting in which it would be implemented. This missing information includes the complexity and diversity of the setting's clinical population and its providers, supervisors, and administrators. It also includes the complexity of the setting's organizational processes (including finances) and of the specific service system in which that organization must operate. Successful implementation would involve accounting for the missing information via the development of new or adjusted procedures that would 'accompany' the delivery of the given intervention; but many of these would now be considered fidelity violations (e.g. delivering the EBT with less frequent sessions, or delivering it to broader clinical populations).

In essence, there are innumerable conditional variables that are unaccounted for, via an intervention's fidelity standard, in any implementation of that intervention. These conditional variables may include patient characteristics (e.g. clinical risk, comorbidity), provider characteristics (e.g. level of training, openness to EBT's), organizational characteristics (e.g. support of leadership, openness to change, morale), and service system characteristics (e.g. degree of collaboration, reimbursement, resources).
There is an emerging implementation science literature documenting the importance of these conditional variables (Beidas et al. 2015). Clinical trials typically do not account for these conditional variables, as each variable would require an additional interaction term in data analysis. Accordingly, most EBT’s are silent about almost all conditional variables that must be understood for successful implementation. How, then, to implement with more complete information?
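The combinatorial burden these conditional variables impose on a trial's analysis can be made concrete. The sketch below is our own illustration, not drawn from any cited study; the function name and the moderator counts are hypothetical. It counts the interaction terms a regression model would need in order to test whether treatment effects vary across k conditional variables:

```python
from math import comb

def moderator_terms(k: int, order: int = 2) -> int:
    """Count the interaction terms needed to model effect modification
    among k conditional variables, summing all combinations of size 2
    up to `order` (e.g. moderator-by-moderator product terms)."""
    return sum(comb(k, r) for r in range(2, order + 1))

# Even a modest list of moderators (clinical risk, comorbidity,
# provider training, leadership support, reimbursement, ...) makes
# the model grow quickly:
print(moderator_terms(5))      # → 10 pairwise terms among 5 variables
print(moderator_terms(10))     # → 45 pairwise terms among 10 variables
print(moderator_terms(10, 3))  # → 165 once three-way terms are added
```

No realistically sized clinical trial has the statistical power to estimate this many terms, which is one way to see why EBT's remain silent about most conditional variables.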

Completing the Information by Democratizing the Implementation Process

If the intervention's fidelity standard does not contain the information that would guide successful implementation in a given setting, where is this information located, and who has access to it? This information is readily available: it is contained in the expertise, experience, and ideas of the individuals at that specific setting, particularly the providers, administrators, and consumers of the intervention. How to leverage that information for successful implementation? By democratizing the implementation process. But the specific EBT must be open to that democratic process.

In 2005, Eric von Hippel, the Director of the Center for Innovation at MIT's Sloan School of Management, published a book called Democratizing Innovation that detailed a model of innovation leveraging the knowledge and expertise of what he called 'lead-users'. Lead-users are expert practitioners of specific technologies (e.g. surgeons' use of operative equipment, mountain bike users, users of open source software). Such lead-users have a great deal of knowledge of what they need for their own purposes and are motivated to collaborate on the innovation of the technology so that it may better serve their needs. Von Hippel's group created processes by which lead-users could create such innovations with the technologies they cared about. Most powerfully, lead-user innovation involves processes for communities of users to share their innovations and improve them together. It is beyond the scope of this article to detail the process of lead-user innovation or review the diversity of its applications; details can be found elsewhere (Harhoff et al. 2003; Morrison et al. 2000; Von Hippel 1986; Von Hippel and Katz 2002).

Our group has adapted lead-user innovation to implement trauma-informed care in usual service settings in order to address the informational problems detailed in this article (Saxe et al. 2016; Saxe and Acri 2016; Saxe et al. 2012). Based on our experience, we offer an approach to the implementation of EBT's that we believe may offer opportunities for a much greater scale of impact on the quality of care for individuals with mental illness than has previously been achieved.

Democratizing Implementation and Innovation in Mental Health Care

An EBT is usually implemented via the training, consultation, and technical assistance provided by an intervention developer (or someone certified by the developer) so that an organization can deliver the intervention with sufficient fidelity. Adjustments to the fidelity criterion are rarely considered formally and, in fact, fidelity is rarely measured in the process of implementation. To this point, Schoenwald et al. (2011) note that the main type of fidelity measurement conducted in psychotherapy research is treatment adherence, and even this type of fidelity measurement is not done regularly.


Accordingly, there is likely large variation in the fidelity that is actually delivered within most implementations of EBT's. If variation in fidelity were thoughtfully considered within a formalized process when planning an implementation of an EBT, it would enable tremendous opportunities for innovation and impact. Such variation would be considered with knowledge of the many factors at play for successful implementation within a given setting. Adjusting the processes and procedures within an intervention to improve fit, if standardized, can serve as an innovation to the intervention that can benefit other settings with similar needs. These innovations can then, themselves, be evaluated. This process of deciding how to adjust the fidelity standard for an intervention would require collaboration between intervention developers (or their proxies) and intervention users (e.g. providers, administrators). Such a collaborative process would leverage the knowledge and expertise 'on the ground' so that the intervention would have a greater likelihood of meeting the needs of those who would benefit from it.

To anchor this collaborative decision-making process, we have created a tool called the Intervention Implementation and Innovation Grid, shown in Table 1. This two-dimensional grid considers both the flexibility of the chosen intervention and that of the organization in which it will be implemented, and guides decision-making for innovation and implementation based on the limits of this defined flexibility or inflexibility. The tool guides developers, providers, and organizations through four sequential steps (Steps 1-4).

As can be seen, the process begins with basic decisions on the bottom-line limits of flexibility on intervention fidelity (Step 1), continues through decisions on the bottom-line limits of flexibility of the organization (Step 2) and through decisions to innovate above the minimal fidelity standard so that the intervention can be more successfully implemented (Step 3), and ends with plans for how the intervention, with its minimal fidelity standard and any innovation above that standard, will be successfully delivered within the organization (Step 4). Importantly, this process facilitates a critical early appraisal that may conclude the intervention is not a good fit for the organization (i.e. if the organization's bottom-line requirements cannot accommodate the intervention's minimal fidelity standard). We describe each of these four steps next.

Step 1: Minimal intervention fidelity decisions. These are the bottom-line decisions about the flexibility of the intervention: what is absolutely not changeable, such that without it there would be little confidence the intervention will yield the intended results? This domain of decision-making is primarily the responsibility of the intervention developer (or proxy). There are very tough tradeoffs to consider. The more that is required to reach fidelity, the less likely the intervention will be able to be adapted for the needs of the organization, which ultimately undermines its capacity to achieve results. The target to achieve here is minimal fidelity: what is the minimum that should be required of an organization and its providers to achieve sufficient fidelity? A given organization may be able to deliver more than this minimal fidelity standard, but it is very important to establish this bottom-line limit of the flexibility of the intervention. It is important to acknowledge that arriving at a decision about minimal fidelity is not a trivial process and is, optimally, guided by empirical information. Future research should help clarify the optimal way to make decisions about minimal fidelity.

Step 2: Organizational requirement decisions. These are the bottom-line decisions about the flexibility of the organization to adapt and implement the intervention: what is absolutely not changeable within the organization, based on factors such as mission, policy, regulations, or financial constraints? This domain of decision-making is primarily left to administrators and practitioners within the organization. There are also tough tradeoffs here. The more that cannot be changed within organizational practice, the less likely the intervention can be delivered with even minimal fidelity. On the other hand, organizations that initially express a willingness to be more flexible than they are able to sustain will ultimately build a program that will not last. Examples of decisions on

Table 1 Intervention implementation and innovation grid

Not changeable
  Intervention factors. Step 1: Minimal intervention fidelity decisions. Factors within the intervention that cannot be changed.
  Organizational factors. Step 2: Organizational requirement decisions. Factors within the organization that cannot be changed.

Changeable
  Intervention factors. Step 3: Intervention innovation decisions. Factors within the intervention that can be changed while still generating confidence that the intervention can yield the desired results.
  Organizational factors. Step 4: Organizational flexibility decisions. Factors within the organization that can be changed without resulting in practices that prohibitively deviate from the organization's policy, regulatory, or financial constraints.



bottom-line organizational requirements, and their possible impact on intervention implementation, include a policy of once-weekly therapy sessions of 45 min each when an intervention requires twice-weekly sessions of at least 60 min, or a policy of once-monthly supervision when an intervention requires weekly supervision.

Step 3: Intervention innovation decisions. With knowledge of the bottom-line limits to both intervention and organizational flexibility, intervention developers, in collaboration with their lead-user partners, adjust the standards of the intervention to best fit the needs of the organization. Any adjustment to the intervention above the platform of the established minimal fidelity standard is considered an innovation to the intervention and should, itself, be standardized with a view to its potential replication in other, similar organizations. This standardization of innovations is necessary for their evaluation. Ultimately, an intervention that can be offered with a standardized minimal fidelity and a growing set of standardized and evaluated innovations, meant to facilitate the intervention's successful adaptation to a wide variety of settings, can strongly determine the impact and utility of the intervention.

Step 4: Organizational flexibility decisions. The intervention has now been adapted for the needs of the organization, and consideration must be given to how, exactly, it will be implemented to improve quality within the organization. Flexibility of organizational practice should follow the decisions already made about organizational requirements. If, in the process of considering changes to organizational practice, the lead-user team encounters inflexibility, this is reason to recheck whether all organizational requirements have been properly considered. A plan to evaluate the intervention and its innovations for meeting defined organizational needs is extremely important. This evaluation will be critical for enabling the organization to improve the quality of care and to share information about the intervention, its standardized innovations, the implementation process, and the results with other organizations with similar needs.

It is important to acknowledge that organizations may need to change flexibility decisions over time based on a host of factors (e.g. leadership change, new financing models). Any change in these decisions following implementation must be carefully considered, as it may put the implementation in jeopardy. We recommend that the Intervention Implementation and Innovation Grid process be repeated as early as possible when such changes are considered. Repeating the process makes it possible to thoughtfully and proactively adjust decisions to support the implementation, based on the organizational changes that will occur.
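As a rough illustration of the logic the grid encodes, the four steps can be sketched as a small decision procedure. Everything below (the Factor record, the function names, and the supervision scenario) is our own hypothetical rendering for exposition, not part of the published tool:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Factor:
    """One declared flexibility limit, from either party."""
    name: str
    domain: str        # "intervention" or "organization"
    changeable: bool   # the bottom-line judgment from Steps 1-2

def grid_steps(factors):
    """Sort declared factors into the four cells of the grid:
    Steps 1/2 hold the non-changeable intervention/organizational
    factors; Steps 3/4 hold the changeable ones open to innovation."""
    steps = {1: [], 2: [], 3: [], 4: []}
    for f in factors:
        if f.domain == "intervention":
            steps[1 if not f.changeable else 3].append(f)
        else:
            steps[2 if not f.changeable else 4].append(f)
    return steps

def early_appraisal(steps, conflicts):
    """The critical early fit check: implementation is not viable
    when a Step 1 requirement collides with a Step 2 requirement,
    because neither side can move."""
    fixed = {f.name for f in steps[1]} | {f.name for f in steps[2]}
    return [pair for pair in conflicts if set(pair) <= fixed]

# The supervision example from the text: the intervention requires
# weekly supervision, but organizational policy offers it monthly.
factors = [
    Factor("weekly_supervision", "intervention", changeable=False),
    Factor("session_length_45min", "intervention", changeable=True),
    Factor("monthly_supervision_policy", "organization", changeable=False),
    Factor("group_delivery_format", "organization", changeable=True),
]
conflicts = [("weekly_supervision", "monthly_supervision_policy")]

steps = grid_steps(factors)
print(early_appraisal(steps, conflicts))  # the conflict survives: not a good fit
```

Here the appraisal returns the unresolved conflict, signalling, as described above, that the intervention's minimal fidelity standard and the organization's bottom line cannot be reconciled; had either factor been declared changeable (landing in Step 3 or 4), the pair would drop out and innovation could proceed.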

Conclusions

Strategies that require EBT's to be delivered at close to their established fidelity standard will continue to result in the limited impact of EBT's on quality of care in typical settings of care. Such strategies are largely closed to the most important information that would enable interventions to be successfully implemented, sustained, and scaled within these settings, and important opportunities for innovation are missed. We have proposed a collaborative process to enable EBT's to achieve impact on the quality of mental health care at the scale that is desperately needed. By democratizing the intervention implementation and innovation process, we envision a future in which our field offers increasingly flexible and useful interventions, shared between users and organizations with similar needs and continually improved based on the meeting of those needs.

Funding This article was funded by the Substance Abuse and Mental Health Services Administration, grant number U79SM061280. Dr. Saxe has received funds from authorship of the book entitled Trauma Systems Therapy for Children and Teens, 2nd edition.

Compliance with Ethical Standards

Conflict of interest Dr. Acri declares she has no conflict of interest.

Ethical approval This article does not contain any studies with human participants performed by any of the authors.

References

Beidas, R. S., Marcus, S., Aarons, G. A., Hoagwood, K. E., Schoenwald, S., Evans, A. C., et al. (2015). Predictors of community therapists' use of therapy techniques in a large public mental health system. JAMA Pediatrics, 169(4), 374–382.
Breitenstein, S. M., Gross, D., Garvey, C. A., Hill, C., Fogg, L., & Resnick, B. (2010). Implementation fidelity in community-based interventions. Research in Nursing & Health, 33(2), 164–173.
Drake, R. E., Goldman, H. H., Leff, H. S., Lehman, A. F., Dixon, L., Mueser, K. T., & Torrey, W. C. (2001). Implementing evidence-based practices in routine mental health service settings. Psychiatric Services, 52, 179–182.
Garland, A. F., Hawley, K. M., Brookman-Frazee, L., & Hurlburt, M. S. (2008). Identifying common elements of evidence-based psychosocial treatments for children's disruptive behavior problems. Journal of the American Academy of Child and Adolescent Psychiatry, 47, 505–514.
Garland, A. F., Brookman-Frazee, L., Hurlburt, M. S., Accurso, E. C., Zoffness, R., Haine, R. A., & Ganger, W. (2010). Mental health care for children with disruptive behavior problems: A view inside therapists' offices. Psychiatric Services, 61, 788–795.
Harhoff, D., Henkel, J., & Von Hippel, E. (2003). Profiting from voluntary information spillovers: How users benefit by freely revealing their innovations. Research Policy, 32(10), 1753–1769.
Hoagwood, K., & Olin, S. S. (2002). The NIMH blueprint for change report: Research priorities in child and adolescent mental health. Journal of the American Academy of Child and Adolescent Psychiatry, 41(7), 760–767.
Hoagwood, K., Burns, B. J., Kiser, L., Ringeisen, H., & Schoenwald, S. K. (2001). Evidence-based practice in child and adolescent mental health services. Psychiatric Services, 52, 1179–1189.
Kazdin, A. E. (2008). Evidence-based treatment and practice: New opportunities to bridge clinical research and practice, enhance the knowledge base, and improve patient care. American Psychologist, 63(3), 146–159.
Kendall, P. C., & Beidas, R. S. (2007). Smoothing the trail for dissemination of evidence-based practices for youth: Flexibility within fidelity. Professional Psychology: Research and Practice, 38(1), 13–20.
Moore, J. E., Bumbarger, B. K., & Cooper, B. R. (2013). Examining adaptations of evidence-based programs in natural contexts. The Journal of Primary Prevention, 34(3), 147–161.
Morrison, P. D., Roberts, J. H., & Von Hippel, E. (2000). Determinants of user innovation and innovation sharing in a local market. Management Science, 46(12), 1513–1527.
Saxe, G., & Acri, M. (2016). Democratizing mental health interventions. In K. Strandburg, B. Frischmann, & M. Madison (Eds.), Medical knowledge commons and user innovation (Vol. 1). Oxford University Press.
Saxe, G. N., Ellis, B. H., Fogler, J., & Navalta, C. P. (2012). Innovations in practice: Preliminary evidence for effective family engagement in treatment for child traumatic stress: A trauma systems therapy approach to preventing dropout. Child and Adolescent Mental Health, 17(1), 58–61.
Saxe, G. N., Ellis, B. H., & Brown, A. B. (2016). Trauma systems therapy for children and teens (2nd ed.). New York: Guilford Press.
Schoenwald, S. K., Garland, A. F., Chapman, J. E., Frazier, S. L., Sheidow, A. J., & Southam-Gerow, M. A. (2011). Toward the effective and efficient measurement of implementation fidelity. Administration and Policy in Mental Health and Mental Health Services Research, 38, 32–38.
Torrey, W. C., Drake, R. E., Dixon, L., Burns, B. J., Flynn, L., Rush, A. J., & Klatzker, D. (2014). Implementing evidence-based practices for persons with severe mental illnesses. Psychiatric Services, 52, 45–50.
Von Hippel, E. (1986). Lead users: A source of novel product concepts. Management Science, 32(7), 791–805.
Von Hippel, E., & Katz, R. (2002). Shifting innovation to users via toolkits. Management Science, 48(7), 821–833.

