ORIGINAL ARTICLE

Urban and Rural Utilization of Evidence-Based Practices for Substance Use and Mental Health Disorders

Jo Ann Walsh Dotson, RN, PhD;1,2,3 John M. Roll, PhD;1,2,3,4 Robert R. Packer, PhD;1,2,3 Jennifer M. Lewis, PhD;1,2,3,4 Sterling McPherson, PhD;1,2,3,4 & Donelle Howell, PhD1,2,3,4

1 College of Nursing, Washington State University, Spokane, Washington
2 Program of Excellence in the Addictions, Washington State University, Spokane, Washington
3 Program of Excellence in Rural Mental Health and Substance Abuse Treatment, Washington State University, Spokane, Washington
4 Translational Addiction Research Center, Washington State University, Pullman, Washington

Funding: This research was supported by a grant from the Life Science Discovery Fund to the Program of Excellence in Rural Mental Health and Substance Abuse Treatment (John Roll, PI).

For further information, contact: Jo Ann Walsh Dotson, RN, PhD, Assistant Professor, College of Nursing, PO Box 1495, Spokane, WA 99210-1495; e-mail: [email protected].

doi: 10.1111/jrh.12068. The Journal of Rural Health 30 (2014) 292-299. © 2014 National Rural Health Association.

Abstract

Purpose: The purpose of the investigation was to examine variations in evidence-based practice (EBP) utilization between rural and urban mental health and substance abuse prevention provider agencies in Washington State.

Methods: We conducted a secondary analysis of the 2007 EBP Survey, which was administered to 250 of Washington State Department of Social and Health Services' contracted mental health and substance abuse treatment agencies. The survey solicited input from solo and group practices across the state on EBP implementation, successes, and challenges.

Findings: Most mental health and substance abuse treatment agencies used more than 1 EBP, although rural substance abuse agencies were less likely to do so than urban agencies. Rural substance abuse agencies were more likely to be solo than group practices. Urban agencies reported significantly more collaboration with universities for EBP training, although training by internal staff was the most commonly reported training mechanism regardless of agency focus or location. Over half of agencies reported conducting no systematic assessment of EBPs, and of those that did report systematic assessment, most used outcome monitoring rather than program evaluation or benchmarking. Urban and rural mental health and substance abuse prevention providers reported shortages of an appropriately trained workforce and financing issues in paying for EBPs as the greatest barriers to utilization.

Conclusions: Challenges to EBP utilization and fidelity should be monitored as EBPs contribute to the delivery of high-quality care. Collaborations between universities and rural agencies may support an agency's ability to adopt EBPs, train staff, and systematically assess impact.

Key words access to care, evidence-based practices, mental health, substance abuse, utilization of health services.

Evidence-Based Practice

Evidence-based practice (EBP) is variably defined, but Sackett and associates' definition as the " . . . conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients,"1 captures the intent. EBP incorporates the best available research evidence, clinical expertise, and patient values and preferences into decisions regarding patient care, and it is intended to " . . . increase the quality and effectiveness of health care."2 EBPs are research-based, and they are often developed in urban areas that house research universities and facilities. The adaptability of EBPs to rural settings has been identified as problematic, attributable to limited access to information and training on EBPs by rural practitioners,3 financial and technological barriers to accessing databases and research staff able to assist and support practitioners' efforts to search the literature,4 and the lack of EBPs that are applicable to general practice.2 In the interest of decreasing disparities in health outcomes, it is important that EBPs be both accessible and acceptable to rural practitioners and their patients.

Approximately 59 million people, or 20% of the US population, live in rural areas.5 Bennet and associates report that rural dwellers are generally older, more likely to have chronic diseases and to be obese, have higher rates of poverty, and are more likely to be uninsured.6 Rural dwellers also have less access to medical and behavioral health care providers and face more "natural barriers" (eg, travel distance to services, extreme weather) that further limit health care access.7,8

The purpose of this investigation was to examine variation in EBP utilization between rural and urban mental health and substance abuse prevention provider agencies in Washington State. This investigation was a secondary analysis of data collected for the June 2007 EBP Survey, a component of Washington State's Mental Health Transformation effort.9 The 19-item survey was developed and administered as part of the statewide effort to inform transformation partners about mental health and substance abuse prevention practices in Washington. Questions solicited information on the types of services provided (scope and ability to provide chemical dependency and mental health services), type of practice (solo or not), population service area, and utilization, training opportunities, implementation success, challenges, and future plans for EBPs. The survey is available as Appendix C in the McBride and colleagues' report.9

Table 1 Evidence-Based Practices Included in EBP Survey

1. Aggression Replacement Training
2. Assertive Community Treatment (ACT/PACT)
3. Behavioral Treatment for Substance Abuse in Schizophrenia
4. Brief Strategic Family Therapy
5. Cognitive Behavior Therapies
6. Contingency Management (co-occurring)
7. Dare to Be You
8. Dialectical Behavioral Therapy
9. Eye Movement Desensitization & Reprocessing (EMDR)
10. Family Integrated Transitions
11. Family Psychoeducation
12. Functional Family Therapy
13. Gatekeeper Program
14. Illness Self-Management/Illness Management & Recovery
15. Incredible Years
16. Integrated Dual Disorders Treatment
17. Interpersonal Therapy
18. Medications for Specific Conditions
19. Motivational Enhancement Therapy (MET)
20. Motivational Interviewing (co-occurring)
21. Multifamily Group Treatment (MFG)
22. Multidimensional Family Therapy
23. Multidimensional Treatment Foster Care
24. Multisystemic Therapy
25. Nurse-Family Partnership
26. Parent-Child Interaction Therapy
27. Peer Support
28. Promoting Alternative Thinking Strategies (PATH)
29. Second Step
30. Seeking Safety: A Psychotherapy for Trauma/PTSD & Substance Abuse
31. Strengthening Families Program (SFP)
32. Supported Employment
33. Supported Housing
34. Therapeutic Foster Care

Source: Inventory of Evidence Based Practices (I-EBP) Survey, Appendix C of the McBride and Associates9 report.

Method

Participants

The 250 provider agencies evaluated in this study were contract providers for the Washington State Department of Social and Health Services' (DSHS) Mental Health Division (MHD; N = 96) and Division of Alcohol and Substance Abuse (DASA; N = 154). Those 2 divisions merged into the Division of Behavioral Health and Recovery in 2009.10 Two additional DSHS provider agencies (Children's Administration and Juvenile Rehabilitation Administration) were included in the 2007 EBP Survey; however, data from those agencies were not included in this analysis due to the centralized mechanism of survey completion and the limited rural agency data collected.

Procedure

The EBP Survey solicited data regarding agency characteristics and target population, and utilization, training, successes, and barriers to implementation of 34 EBPs. The list of EBPs was developed by transformation stakeholders using an iterative process based on endorsements from national and state expert agencies, research evidence, and utilization of the EBPs in Washington State at the time of data collection. A detailed description of the EBP selection process may be found in the 2007 EBP Survey report9; a list of the 34 EBPs is included as Table 1.


Data Analysis

Provider agencies were classified as urban or rural using the Rural-Urban Commuting Area (RUCA) codes developed by the Rural Health Research Center at the University of Washington.11 RUCA codes are useful for classifying rural and urban areas because they incorporate not only population density but also work commuting patterns, taking into account proximity to metropolitan centers and health services. The center described 3 categorizations: Categorization A with 4 distinction levels, Categorization B with 3, and Categorization C with 2.12 This study used the simplest, 2-level Categorization C of rural or urban. After categorizing our responding agencies, data were analyzed to assess for disparities in EBP utilization between rural and urban provider agencies. This included 96 MHD agencies (82 urban, 14 rural) and 154 DASA agencies (116 urban, 38 rural). Chi-square analyses and t tests were used to compare categorical and continuous variables (respectively) between urban and rural settings for both MHD and DASA agencies.
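To make the classification and comparison steps concrete, the following sketch shows how agencies could be dichotomized by RUCA code and compared across settings. It is illustrative only: the records, the ruca_code field, and the rule treating RUCA codes of 4.0 and above as rural are assumptions (a commonly used simplification), not the exact Categorization C definition applied in the study.

# Illustrative sketch (not the study's code): classify agencies as urban or
# rural from RUCA codes and compare groups with chi-square and t tests.
import pandas as pd
from scipy import stats

# Hypothetical agency records; ruca_code and all values are placeholders.
agencies = pd.DataFrame({
    "agency_id": [1, 2, 3, 4],
    "ruca_code": [1.0, 2.0, 7.0, 10.0],
    "uses_multiple_ebps": [True, True, False, True],
    "n_ebps_implemented": [4, 3, 1, 2],
})

# Simplified 2-level split: treat RUCA codes of 4.0 or higher as rural.
# This cut point is an assumption standing in for RHRC Categorization C.
agencies["setting"] = agencies["ruca_code"].apply(
    lambda code: "rural" if code >= 4.0 else "urban"
)

# Categorical variable: chi-square test on the setting-by-outcome table.
table = pd.crosstab(agencies["setting"], agencies["uses_multiple_ebps"])
chi2, p_chi, dof, expected = stats.chi2_contingency(table, correction=False)

# Continuous variable: independent-samples t test on the number of EBPs.
urban = agencies.loc[agencies["setting"] == "urban", "n_ebps_implemented"]
rural = agencies.loc[agencies["setting"] == "rural", "n_ebps_implemented"]
t_stat, p_t = stats.ttest_ind(urban, rural)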

Results

Treatment Services Provided

The treatment services provided by DASA and MHD provider agencies included: (1) intake, assessment, or referral; (2) chemical dependency treatment; (3) mental health treatment; (4) co-occurring disorders treatment; and (5) other services. These services were provided at equivalent frequencies across rural and urban agencies with 1 exception: rural DASA facilities provided intake, assessment, or referral services at a significantly higher number of locations than did urban DASA facilities (χ2 = 4.26, P < .05). None of the MHD agencies operated as a solo practice. Among the DASA agencies, significantly more rural than urban agencies operated as a solo practice (6 of 38, or 16%, of rural agencies vs 1 of 116, or 1%, of urban agencies; χ2 = 14.70, P < .01). Table 2 describes characteristics of responding agencies, including services provided, solo practice agencies, service area population, and use of EBPs by provider type.
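As a check on the categorical comparisons, the solo-practice contrast can be re-derived from the counts reported above (1 of 116 urban and 6 of 38 rural DASA agencies operating as solo practices). A Pearson chi-square without continuity correction on that 2 x 2 table returns approximately the reported value of 14.70; this is a reconstruction from the published counts, not the authors' original analysis code.

# Re-deriving the solo-practice comparison from the reported counts:
# 1 of 116 urban and 6 of 38 rural DASA agencies operated as solo practices.
from scipy.stats import chi2_contingency

observed = [
    [1, 115],  # urban DASA agencies: solo, multiprovider
    [6, 32],   # rural DASA agencies: solo, multiprovider
]

# correction=False gives the uncorrected Pearson chi-square, which matches
# the chi-square of 14.70 (P < .01) reported in the text.
chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(round(chi2, 2), p)  # approximately 14.70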

Table 2 Characteristics of Responding Agencies

                                          DASA               MHD
                                          Urban    Rural     Urban    Rural
Number of survey responders               116      38        82       14
Services provided
  Intake, assessment, or referral         90%      100%a     90%      86%
  Chemical dependency treatment           98%      97%       34%      43%
  Mental health treatment                 41%      42%       93%      86%
  Co-occurring disorders treatment        55%      44%       49%      64%
  Other                                   31%      21%       44%      50%
Solo (vs multiprovider) practice          1%       16%b      0%       0%
Population of service area
  Less than 5,000                         9%       37%       6%       7%
  Between 5,001 and 50,000                16%      45%       7%       79%
  Between 50,001 and 500,000              46%      18%       54%      14%
  Greater than 500,000                    29%      0%        33%      0%
Using more than 1 EBP                     90%      74%a      95%      93%

Note. Bold values represent significant differences. aP < .05. bP < .01.

EBP Training

Agencies were asked about the mechanisms used to provide training about EBPs. All agencies reported training by internal staff as the most commonly used mechanism to disseminate knowledge about EBP implementation, with 75.6% of urban and 60.6% of rural MHD agencies, and 80% of urban and 66% of rural DASA agencies, reporting depending upon internal staff. The difference between urban and rural agency use of internal training was significant for DASA agencies (χ2 = 9.47, P < .01). Significantly more urban than rural agencies had collaborations with universities for training purposes, with 19.8% of urban MHD compared to 5.3% of rural MHD (χ2 = 11.50, P < .01) and 22% of urban DASA compared to 10.4% of rural DASA (χ2 = 7.35, P < .01) reporting such collaborations. Urban MHD agencies used provider-to-provider training more frequently than rural MHD agencies, and rural MHD and DASA agencies were more likely to use outside accreditation than urban MHD and DASA agencies; however, these differences in training mechanisms were not significant.

EBP Systematic Assessment

Agencies were asked whether they were conducting systematic assessment of the effects of EBPs and, if so, whether they were using benchmarking, outcome monitoring, program evaluation, or some other method. Over half of all agencies reported conducting no systematic assessment. Outcome monitoring was the method most frequently used by agencies that reported conducting systematic assessment, with 78.4% of urban and 100% of rural MHD agencies, and 91.7% of urban and 100% of rural DASA agencies, reporting its use. Program evaluation was the second most commonly reported assessment mechanism for both urban and rural practices, with 73% of urban and 50% of rural MHD agencies and 83.3% of urban and 72.7% of rural DASA agencies using this method. Benchmarking was the least commonly used mechanism; significantly more urban than rural MHD and DASA agencies reported using it, with 32.4% of urban and no (0%) rural MHD agencies and 31.3% of urban and 18.2% of rural DASA agencies reporting the method.

EBP Implementation and Effectiveness

Urban and rural MHD and DASA agencies reported the number of EBPs they currently implement, mean implementation success, and mean EBP effectiveness. Across urban and rural providers, there were no statistical differences in the number of EBPs implemented for either MHD (M = 3.91 and M = 5.32, respectively; t = −1.22, P > .05) or DASA (M = 2.71 and M = 2.03, respectively; t = 1.40, P > .05) agencies. Urban DASA agencies were significantly more likely than rural DASA agencies to report using more than 1 EBP (90% of urban vs 74% of rural agencies; χ2 = 6.89, P < .05). Urban and rural MHD agencies reported nearly equivalent use of more than 1 EBP (95% of urban vs 93% of rural agencies). On a scale from 1 (not at all) to 5 (extremely), there was also almost no difference in mean implementation success between urban and rural MHD (M = 3.76 and M = 3.59, respectively; t = 0.99, P > .05) or DASA agencies (M = 3.71 and M = 3.70, respectively; t = −0.29, P > .05). While there was no statistical difference in mean effectiveness between urban and rural DASA agencies (M = 3.71 and M = 3.47, respectively; t = 1.61, P > .05), urban MHD agencies reported a significantly higher mean effectiveness score than rural MHD agencies (M = 3.83 and M = 3.59, respectively; t = 2.14, P < .05). However, this difference appears relatively small.
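Only group means and sample sizes are reported for these comparisons; the standard deviations are not. The sketch below shows how such a mean comparison could be re-run from summary statistics alone, using the reported effectiveness means for urban (M = 3.83, n = 82) and rural (M = 3.59, n = 14) MHD agencies; the standard deviations of 0.4 are hypothetical placeholders, so the resulting t value is illustrative rather than a reproduction of the reported t = 2.14.

# Two-sample t test from summary statistics. Means and group sizes come from
# the text; the standard deviations (0.4) are hypothetical placeholders,
# since the article does not report them.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=3.83, std1=0.4, nobs1=82,  # urban MHD agencies
    mean2=3.59, std2=0.4, nobs2=14,  # rural MHD agencies
    equal_var=True,                  # pooled-variance (Student's) t test
)
print(round(t_stat, 2), round(p_value, 3))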

Barriers to Provision of EBPs

Practices reported on barriers to providing EBPs; the findings are included in Figure 1. A shortage of appropriately trained workforce was reported by 45.8% of urban and 37.1% of rural MHD agencies and by 44.8% and 57.6% of urban and rural DASA agencies, respectively. Financing issues in paying for EBPs were reported more often by urban MHD and DASA agencies, at 50.7% and 43.0%, respectively, than by rural MHD and DASA agencies, at 38.6% and 28.3%, respectively. Attaining or maintaining fidelity to EBP model standards was identified by a similar proportion of urban and rural agencies, at 23.1% and 21.2% for urban and rural MHD agencies, respectively, and 20.2% and 22.3% of urban and rural DASA agencies, respectively. Modification of EBPs to fit local needs was cited by 16.9% of urban MHD agencies compared to 28.0% of rural, and by 12.8% of urban DASA agencies compared to 20.1% of rural. Resistance by practitioners or others to implementing EBPs was not significantly different between urban and rural MHD agencies (7.2% and 3.0%, respectively), but significantly more rural than urban DASA agencies reported implementation resistance as a barrier (18.5% compared to 7.7%). Challenges in complying with rules and regulations were also reported significantly more often by rural than urban DASA agencies (10.3% compared to 3.5%).

EBP Service Disparities

Urban and rural agencies indicated similar levels of agreement that their agencies serve populations for which there are no known or available EBPs (44.3% of urban MHD agencies vs 50.0% of rural MHD agencies; 24.1% of urban DASA agencies vs 18.4% of rural DASA agencies). Interest in EBP implementation was consistent across urban and rural MHD and DASA agencies, with the majority reporting that they were at least somewhat interested in EBP implementation (20.7% of urban MHD agencies vs 21.4% of rural MHD agencies; 19.0% of urban DASA agencies vs 34.2% of rural DASA agencies), if not very interested (41.5% of urban MHD agencies vs 50.0% of rural MHD agencies; 40.5% of urban DASA agencies vs 28.9% of rural DASA agencies) or extremely interested (35.4% of urban MHD agencies vs 21.4% of rural MHD agencies; 33.6% of urban DASA agencies vs 21.1% of rural DASA agencies).

Initiatives Used to Promote EBP Adoption

Agencies were asked about the mechanisms used to promote the adoption of EBPs. The mechanisms most frequently reported by all agencies, regardless of substance abuse or mental health focus or location, were increasing awareness about available EBPs and training about EBPs; between 60% and 80% of agencies reported using these mechanisms. Other mechanisms included incorporation of EBPs in contracts, monitoring of fidelity, modification of information systems/data reports, modification of paperwork/documentation to fit EBPs, and financial incentives; findings are presented in Figure 2. Among MHD agencies, urban treatment settings were more likely to offer modification of paperwork/documentation (43.9% of urban vs 28.6% of rural agencies), financial incentives (9.8% of urban vs 0% of rural agencies), or no initiative, while rural settings appeared to report a higher rate of modification of information systems/data reports (42.9% of rural vs 30.5% of urban agencies); only the differences in modification of paperwork and financial incentives were significant. Among DASA agencies, significant differences were noted between urban and rural settings, with more urban than rural practices monitoring program fidelity and modifying information systems/data reports.

Figure 1 Barriers to Utilizing EBPs.

MHD agencies (urban, rural):
  Shortage of appropriately trained workforce: 45.8%, 37.1%
  Financing issues in paying for EBPs: 50.7%, 38.6%
  EBP needs modification to fit local needs: 16.9%, 28.0%
  Attaining or maintaining fidelity to EBP model standards: 23.1%, 21.2%
  Resistance to implementing EBPs from practitioners or others: 7.2%, 3.0%
  Rules and regulations: 10.5%, 5.3%
  None: 20.5%, 25.0%
  Other: 6.5%, 9.1%

DASA agencies (urban, rural):
  Shortage of appropriately trained workforce: 44.8%, 57.6%
  Financing issues in paying for EBPs: 43.0%, 28.3%
  EBP needs modification to fit local needs: 12.8%, 20.1%
  Attaining or maintaining fidelity to EBP model standards: 20.2%, 22.3%
  Resistance to implementing EBPs from practitioners or others: 7.7%, 18.5%
  Rules and regulations: 3.5%, 10.3%
  None: 28.5%, 22.8%
  Other: 0.4%, 1.1%

Figure 2 Initiatives Used to Promote EBP Adoption.

MHD agencies (urban, rural):
  Increase awareness about EBPs: 80.5%, 78.6%
  Training: 78.0%, 71.4%
  Incorporation of EBPs in contracts: 32.9%, 28.6%
  Monitoring of fidelity: 45.1%, 42.9%
  Modification of information systems/data reports: 30.5%, 42.9%
  Modification of paperwork/documentation: 43.9%, 28.6%a
  Financial incentives: 9.8%, 0%a
  None: 7.3%, 7.1%
  Other: 7.3%, 0%

DASA agencies (urban, rural):
  Increase awareness about EBPs: 69.8%, 63.2%
  Training: 76.7%, 71.1%
  Incorporation of EBPs in contracts: 18.1%, 21.1%
  Monitoring of fidelity: 43.1%, 21.1%a
  Modification of information systems/data reports: 25.9%, 15.8%a
  Modification of paperwork/documentation: 36.2%, 31.6%
  Financial incentives: 12.1%, 15.8%
  None: 6.9%, 2.6%
  Other: 7.8%, 15.8%

a = P < .05.

Discussion

Dissemination of EBPs to clinical settings is impacted by site readiness and resources, organizational capacity, agency knowledge about EBPs, and availability of training for site clinicians and support staff. Our data revealed that urban agencies were more likely to provide training regarding EBPs than rural agencies. Training opportunities provided by coworkers, area trainers, and university partners are more limited for rural agencies, which typically have fewer staff, as suggested by the greater percentage of solo agencies in rural settings. These smaller staffs are less likely to be able to take time to attend training and to have the opportunity to, in turn, present to coworkers what they learned. Rural communities, by nature of their sparse populations, are also less likely to have access to trainers and universities that can provide or facilitate EBP training.

Ouimette and associates13 identified co-occurring diagnoses of mental health and substance abuse conditions as a common issue, with providers in rural settings more frequently assuming responsibility for management of both conditions. MHD agencies in rural settings were more likely than those in urban settings to offer treatment for co-occurring disorders, and more likely to offer co-occurring disorders treatment than chemical dependency treatment, likely due to the limited access to targeted substance abuse treatment programs in rural settings. The likelihood of rural DASA agencies being solo practices also likely affected their ability to serve clients with co-occurring disorders. Ouimette and associates recommend cross-training, which enables practitioners to implement EBPs, including pharmaceutical and behavioral management of these co-occurring conditions.13

This investigation revealed challenges to adoption and implementation of EBPs in urban and rural settings, with a shortage of appropriately trained workforce identified more frequently by substance abuse prevention agencies in rural settings. Curry and associates propose a strategy titled "academic detailing" (AD) to address the shortage of trained providers.3 AD uses trained educators to transfer knowledge and methods to local clinicians, and it is an approach that has been used successfully in Washington State and in other settings with rural communities. Provider resistance to EBPs has been identified as a concern for rural DASA agencies; continued efforts to assure that providers and support staff in rural communities have access to training on how to search and critically appraise the research literature are an important strategy, and one that would benefit from active partnerships between practice and education.14


Resistance to implementing EBPs by providers or others was also identified as an impediment; it is encouraging to note that AD has been documented as helping to address this type of provider reticence. Lack of support from leadership or clinic management has been recognized as a hindrance to EBP implementation.2 Staff acceptance, organizational capacity, and community interest in and acceptance of the proposed EBPs are keys to effective implementation.15 Hillburn and associates suggest that shared governance among health professionals in rural practices can positively impact staff acceptance and promote organizational capacity.16 In addition to clinician training, consumer knowledge of the proposed EBPs and congruence with existing practice can affect implementation and success.17

This study has some notable limitations. First, this investigation is based on data collected in 1 state in the Pacific Northwest. While these data may well be generalizable to other states in the region, the analyses reported here should be replicated in other regions throughout the United States to investigate this phenomenon more thoroughly. Second, the original survey provided limited direction or definition to respondents regarding survey terms, allowing broad interpretation within the context of statewide standards. Third, this investigation was a descriptive analysis; an important future investigation would be to identify predictors of outcomes such as EBP effectiveness and to determine whether those predictors vary across rural and urban treatment settings. Such analyses are beyond the scope of this study, but future research should consider them.

The need to modify EBPs was also identified as much more important in rural agencies; this finding is consistent with the literature, including a paper by Goodkind and associates,18 which recommends that EBP should actually be "practice-based evidence," incorporating practices that work in various communities and settings into EBP. Modifications to EBPs should, however, be monitored to assure that practice fidelity is retained and health outcomes are effectively addressed; this is notable considering that over half of the survey respondents reported no systematic assessment of EBPs. Lundgren and associates reported that site-based modifications of EBPs were common and that the distinction between minor and major changes, which could impact model fidelity, was blurred.19 Attention to model fidelity is important to client outcomes.

Future research regarding the impact of site and community preparation and knowledge of EBPs, the impact of targeted training on EBP effectiveness in rural communities, and the assessment of model "shift" in rural versus urban settings is warranted. Rural and urban dwellers deserve high-quality mental health and substance abuse prevention services; EBPs provide the structure for these critical programs.

References

1. Sackett D, Rosenberg W. Evidence based medicine: what it is and what it isn't. Brit Med J (Intl Ed). 1996;312(7023):71-72.
2. Taylor J, Wilkinson D, Blue I. Towards evidence-based general practice in rural and remote Australia: an overview of key issues and a model for practice. Rural and Remote Health [serial online]. 2001;no. 106. Available at: http://www.rrh.org.au/publishedarticles/article print 106.pdf/.
3. Curry W, Lengerich E, Kluhsman B, et al. Academic detailing to increase colorectal cancer screening by primary care practices in Appalachian Pennsylvania. BMC Health Serv Res. 2011;11(1):112-120.
4. O'Lynn C, Luparell S, Winters C, Shreffler-Grant J, Lee H, Hendrickx L. Rural nurses' research use. Online J Rural Nurs Health Care. 2009;9(1):34-45.
5. Bennet K, Olatosi B, Probst J. Health Disparities: A Rural-Urban Chartbook. Columbia, SC: South Carolina Rural Health Research Center; 2008.
6. 2010 Census Urban and Rural Classification and Urban Area Criteria. Available at: http://www.census.gov/geo/reference/ua/urban-rural-2010.html. Accessed April 24, 2012.
7. Stamm B. Rural Behavioral Health Care: An Interdisciplinary Guide. Washington, DC: American Psychological Association; 2003:3-9.
8. Strode AD, Roll JM. Disparities of Mental Health Services Between Urban and Rural Communities in Washington State. Spokane, WA: Washington Institute for Mental Illness Research and Training, Washington State University; 2007.
9. McBride D, Voss W, Mertz H, Villanueva T, Smith G. Mental Health Evidence Based Practices (EBPs) in Washington State. Seattle, WA: The Washington Institute for Mental Illness Research and Training - Western Branch, University of Washington School of Medicine Department of Psychiatry and Behavioral Sciences; 2007.
10. DSHS. About Division of Behavioral Health and Recovery (DBHR). Available at: http://www.dshs.wa.gov/dbhr/aboutdbhr.shtml. Accessed October 16, 2012.
11. RHRC. Rural-Urban Commuting Area Codes (RUCAs). Available at: http://depts.washington.edu/uwruca/index.php. Accessed July 13, 2011.
12. RHRC. RUCA Data. Available at: http://depts.washington.edu/uwruca/ruca-uses.php. Accessed July 13, 2011.
13. Ouimette P, Jemelka R, Hall J, Brimner K, Krupski A, Stark K. Services to patients with dual diagnoses: findings from Washington's Mental Health Service System. Subst Use Misuse. 2007;42(1):113-127.


14. Salbach N, Jaglal S, Korner-Bitensky N, Rappolt S, Davis D, Duncan P. Practitioner and organizational barriers to evidence-based practice of physical therapists for people with stroke . . . includes commentary by Duncan PW, and author response by Salbach NM and Korner-Bitensky N. Phys Ther. 2007;87(10):1284-1305.
15. Amodeo M, Lundgren L, Cohen A, et al. Barriers to implementing evidence-based practices in addiction treatment programs: comparing staff reports on motivational interviewing, adolescent community reinforcement approach, assertive community treatment, and cognitive-behavioral therapy. Eval Program Plann. 2011;34(4):382-389.
16. Hillburn K, McNulty J, Jewett L, Wainwright K. Build upon strengths and leadership practices using EBP. Nurs Manage. 2006;37(11):15-16.


17. McGovern M, Fox T, Xie H, Drake R. A survey of clinical practices and readiness to adopt evidence-based practices: dissemination research in an addiction treatment system. J Subst Abuse Treat. 2004;26(4):305-312.
18. Goodkind J, Ross-Toledo K, John S, et al. Promoting healing and restoring trust: policy recommendations for improving behavioral health care for American Indian/Alaska Native adolescents. Am J Commun Psychol. 2010;46(3/4):386-394.
19. Lundgren L, Amodeo M, Cohen A, Chassler D, Horowitz A. Modifications of evidence-based practices in community-based addiction treatment organizations: a qualitative research study. Addict Behav. 2011;36(6):630-635.

