
Journal of Evidence-Based Social Work, 11:222–235, 2014 Copyright © Taylor & Francis Group, LLC ISSN: 1543-3714 print/1543-3722 online DOI: 10.1080/15433714.2013.850327

Using Principles of Practice-Based Research to Teach Evidence-Based Practice in Social Work


Shane Jaynes Social Work, Bloomsburg University, Bloomsburg, Pennsylvania, USA

Social work educators are in a good position to encourage the wider uptake of evidence-based practice throughout the profession. Despite increasing attention in the professional literature, evidence-based practice appears to be making inroads into practice only slowly. This article interprets that slow uptake as a function of confusion about the definition and scope of evidence-based practice, and also as an expression of the distance between the practice and research communities within the profession. Practice-based research is introduced as a framework that responds to both of these concerns. Finally, the importance of social work education as a catalyst of evidence-based practice is articulated, and the five-step evidence-based practice process is explicated with considerations from practice-based research incorporated along with pedagogical implications.

Keywords: Practice-based research, education, EBP implementation process

Social work, along with many allied professions, is currently in the midst of working out the meaning of the term evidence-based practice (EBP). Some call it a culture (Baker, Stephens, & Hitchcock, 2010); some call it a decision improvement process (Schoech, Basham, & Fluke, 2006); and some worry that it is becoming a catchphrase "for anything that is done with clients that can somehow be linked to an empirical study, regardless of the study's quality, competing evidence, or consideration of the client's needs" (Shlonsky & Gibbs, 2004, p. 137). Sackett, Rosenberg, Gray, Haynes, and Richardson (1996), considered by many to be the originators of the idea, defined EBP as "the conscientious, explicit and judicious use of current best evidence in making decisions about the care of the individual patient. It means integrating individual clinical expertise with the best available external clinical evidence from systematic research" (p. 71). This article is an attempt to explore the enterprise of social work education as a venue for advancing the uptake of EBP throughout the profession. This is certainly not a new issue: Calls to reform education have been part of the EBP agenda since social work began to borrow the model from medicine in the late 1990s (Bilsker & Goldner, 2000). What is not available in the literature, and what this article focuses on particularly, is consideration of how professional education emphasizing EBP can become more impactful by incorporating the principles of practice-based research into its pedagogy. Some context needs to be developed so that new ideas can be brought into relief. Therefore, the first section of the paper focuses on the contested definition of EBP. The second section elaborates principles of practice-based research and contrasts them with the traditional path of knowledge creation and diffusion. In the third section of the paper, specific adaptations to a particular stage-wise approach to teaching EBP are offered, in keeping with a focus on practice-based research.

Address correspondence to Shane Jaynes, Social Work, Bloomsburg University, Bloomsburg, PA 17815, USA. E-mail: [email protected]


The Contested Definition of EBP


One way to frame the history of EBP in social work is to suggest that it recapitulates an older and deeper issue: the divide between practice and research. The author assumes that most readers are familiar with that history, so it will not be elaborated here. However, its underlying themes of differential power, insularity, and ways of knowing are often reflected in explorations of both the diffusion and adoption of EBP. Schoech et al. (2006) referred to a divided profession, in terms of who produces, and who is expected to consume, scholarship about EBP:

Few publications that develop and articulate the conceptual underpinnings of EBP are emerging from the experiences or immediate observations of front-line workers and supervisors. Front-line workers are frequently the consumers of academic literature on EBP, but have limited resources to develop and articulate developmental advances based on their observations and research. (p. 58)

The same authors went on to suggest that if this academic discourse continues to ignore the reality of life in agency practice, then EBP might "join the ranks of other nice sounding but difficult to implement social work service delivery constructs such as the strengths perspective and strategic planning" (p. 59). Kirk and Reid (2002) likewise noted that not a single practitioner has contributed to the professional literature on practitioners' utilization of research; it is solely an academic discourse. They summarized extant investigations of research utilization by practitioners into three themes: science, together with the researchers who follow the scientific process, is the engine of progress; science struggles against organizational banality within practice settings; and individual practitioners are defensive saboteurs of science. Expanding on the third theme, they wrote (summarizing the prevailing point of view they found in the literature), "Practitioners are self-serving and self-protective and lack the intellectual power to be brave about unsupportive research findings. So they subvert research projects ... they are covert enemies of science" (p. 189). There is not only a gap between research and practice but a polarization as well, which must be recognized and addressed in the project of advancing EBP.

For instance, one catalyst of resistance from the practice community toward EBP is the open question of how it should be defined, because markedly different implications for professional practice follow from different definitions. One of the formative questions at a 2006 conference on improving the delivery of EBP in social work education was "Can we and should we come to an agreement on what we mean by EBP?" (Zlotnik, 2007, p. 625). The axis on which this question turns is whether EBP is a single intervention or a more thoroughgoing process. If it is a single intervention, then one should properly speak of EBPs in the plural. If, on the other hand, it is a broader way of practicing, then it is an endeavor of many component parts. For reasons elaborated below, the author believes that the process definition is the more pragmatic of the two and offers the best potential for increasing the quality of services to clients.

Regarding an EBP as a single intervention, Patterson and McKiernan (2010) described knowledge diffusion thus: "Having adhered to all the rigors of behavioral research methodologies the logical flow would be to implement findings within community healthcare agencies" (p. 334). What they meant by rigors of behavioral research is the establishment of internal validity: confidence that a single variable is responsible for fluctuations in outcome because all confounding variables have been accounted for. They continued, "Unfortunately, clinical practices and service system innovations that are validated by all the rigors of research are not being fully adopted in community treatment settings" (p. 334).


In other words, research done in exacting detail is not being transferred to community-based organizations with careful replication of the initial research conditions. Finally, they warned, "staff must adhere to the documented protocols. While there is a tendency to adjust the protocols in order to fit them to the agency, doing so often undermines the calculated outcome" (p. 338). This approach might be called the fidelity-to-protocol description of (an) EBP. The operative task is for the social worker to enact this "ready-made" protocol in his or her practice. Several sources refer to EBPs in similar fashion (Cohen, 2011; Luongo, 2007; Rieckmann, Bergmann, & Rasplica, 2011).

Before proceeding to the alternative description of EBP as a process, it is important to consider the obstacles that have been encountered when attempts have been made to implement EBPs in practice situations. Dissimilarity between the original research situation and the practice context is one noteworthy obstacle to the implementation of EBPs. Typically, research takes place in the circumscribed environment of a university or a university-affiliated program where careful delimitations and controls can be maintained. The priority of this knowledge creation is the highest internal validity possible. To that end, researchers often exclude from participation people who have co-morbidities (McMillen, Lenze, Hawley, & Osborne, 2009; Zlotnik, 2007). One study investigated the extent of similarity between the conditions of community-based treatment and the conditions under which two popular interventions, for the treatment of schizophrenia and bipolar disorder respectively, were developed. Findings indicated that 38% of patients receiving care from community-based organizations for schizophrenia, and 55% being treated for bipolar I disorder, would have been ineligible to participate in the research that culminated in the protocols for their treatment (Zarin, Young, & West, 2005). This is because they had co-morbid disorders and other confounding circumstances that did not meet the narrow preconditions of the researchers. Of course, real-world agencies cannot operate under similarly circumscribed conditions, so practitioners often perceive very little real-world relevance in such protocols (Horsfall, Cleary, & Hunt, 2011; Stanhope, Tuchman, & Sinclair, 2011).

Another problem, closely related to the dissimilarity between research and practice settings, concerns the unit of analysis. Knowledge gained from randomized controlled trials (RCTs) is group-level information about average tendencies among a group of people. Practitioners engage with individuals who, while they may share a diagnosis or some other common attribute with study participants, are individuated in practice. Given that there is often more heterogeneity between randomly selected members of the same group than between members of different groups randomly selected for comparison (Hutchison, 1999), the task of applying an EBP based on group tendencies to a unique individual in the name of optimal treatment can be a source of frustration.

A third obstacle to the implementation of EBPs as a priori protocols is their impracticality. It is often necessary to tailor procedures to meet the needs of people of diverse cultural experience and, therefore, impractical and unwise to implement them "as is" in a rigid sense (Domenech Rodríguez, Baumann, & Schwartz, 2011). EBPs are often impractical in a temporal way as well.
Assigning practitioners the task of carrying out these interventions means that they must often engage EBPs in addition to the body of work they are already carrying out. For instance, Baker et al. (2010) described EBPs as "add-on services" beyond the scope of normal practice, and Amodeo, Ellis, and Samet (2006) suggested that "clinicians would not need to abandon their current treatment approaches but that EBPs could be helpful in addressing chronic or nagging problems in client care; EBPs could make the treatment easier and more interesting for the clinician" (p. 558). On the contrary, many clinicians resist EBPs specifically because they view them as impractical (McBeath, Briggs, & Aisenberg, 2010).

An alternative definition of EBP, and one that has gained increasing support, is that EBP is a process as opposed to a discrete protocol. In addition to describing EBP as the integration of clinical expertise, the best available evidence, and patient values, Sackett (2000) identified the EBP process as comprising the following steps: (a) converting information needs relative to practice


decisions into well-structured, answerable questions; (b) seeking out the best evidence with which to answer them; (c) critically appraising that evidence for its validity, impact, and applicability; (d) integrating the critical appraisal with clinical judgment and client values and circumstances; and (e) evaluating the effectiveness and efficiency of carrying out steps one through four and seeking ways to improve future practice. The process definition of EBP has a great deal of support in the literature and is considered by many to be a better fit with practice situations (Gambrill, 2007; Gibbs, 2007; Grady, 2010; Nevo & Slonim-Nevo, 2011; Shlonsky & Gibbs, 2004). The process model seems to fit with provider preferences because it takes seriously the idea that "in addition to research, many other factors must be considered when determining the most effective course of treatment, including the clinicians' expertise and the client's wishes" (Grady, 2010, p. 401). One study of the fidelity with which 53 sites in 8 states implemented 5 EBPs (protocols as described above) concluded that practitioner resistance was a formidable obstacle to the process (McHugo et al., 2007).

In the process definition, objective research-derived guidelines are still essential, but they are weighed in the context of the practitioner's clinical judgment, the client's values, and the agency setting as delimiters. These concerns qualify the simple linearity of manualized protocols being carried out in practice. In fact, a new term, empirically supported intervention (ESI), is increasingly used in place of EBP to denote a ready-made protocol (Woody, D'Souza, & Dartman, 2006). What counts as evidence is also at issue. Can data sources beyond RCT results be useful in informing practice decisions? Rubin (2011) is among the social work scholars who support the idea of a hierarchy of evidence to be used in EBP. He suggested that the information needs of practice can be met by qualitative methodologies and surveys, among other means, and that while RCTs have an important place in the range of sources of evidence, they are inappropriate to certain practice situations. To summarize, the process description makes two modifications to the EBP-as-protocol description: it confines research-derived practice indicators to a single factor in a multi-factorial decision process, and it expands the range of information sources that can give rise to such indicators. In the text that follows, EBP will refer to the stepwise process of making and evaluating practice decisions.

The goal of increasing the uptake of EBP among practicing social workers requires change within the profession that is "profound" (Stanhope et al., 2011) and "radical" (Gambrill, 2007). It will require a shift from business as usual within many target systems: funders and regulators, agency operations, individual orientation to practice, and professional education. It is educational innovation, as part of the overall advance of EBP, that is the focus of this article. The contribution that schools of social work make to the profession, and by extension to society, is to equip the community with individuals who are prepared to work in the midst of the sources of displacement that accompany the current social and technical situation. That situation, and the sources of displacement attendant on it, are subject to change across time.
In that vein, Shlonsky and Stern (2007) wrote that social work programs must encourage curiosity, a skeptical approach to information, and a passion for seeking out the best knowledge possible in the service of helping. It has been argued that teaching the process of EBP as a way to mitigate a variety of problems, as opposed to teaching content indexed to a range of specific problems, will have more durable value to students (Springer, 2007). It has further been argued that the social and technical reality in which social work and social problems are both embedded is changing, and therefore the way social workers are prepared for practice must also change. For example, Parton (2008) stipulated that narrative ways of understanding people and events have been eclipsed by "database ways of thinking and operating" and that social work education has not kept pace with these changes (p. 253). There is also the suggestion that too much teaching is focused on theory and not enough on technical proficiencies (Thyer, 2007). Advocating pedagogical ideas relative to the teaching of EBP in social work programs addresses both of these criticisms: it focuses on the architecture of decision making rather than on a singular emphasis on theory to guide practice; it


also emphasizes the role of uncertainty at the beginning of any decision process (Gambrill, 2007) and the need to ascertain the objective data that follow from that uncertainty. Despite these advantages, and despite the growing advocacy for adoption of EBP throughout the profession, social work educational programs have not adopted the model, or even the principles, of EBP to any great extent. In fact, Woody et al. (2006) found an inverse relationship between the increase of EBP articles in the literature and social work faculty members' interest in teaching EBP material. Social work programs are heterogeneous in many ways, and these authors found that programs specializing in a clinical as opposed to a generalist focus had more EBP content in their curricula. Similarly, others have noted that the difference between programs housed in large research-oriented universities and those within smaller liberal arts campuses may be an important variable to consider when measuring the uptake of EBP in social work education (Springer, 2007; Stanhope et al., 2011). The point has also been raised that, whatever their hesitations, social work programs universally will be under growing pressure to adopt EBP principles into their teaching because of demand from the wider environment (Soydan, 2007). As stated above, the literature on practice-based research has much to offer the project of EBP uptake, and in the third section of the paper pedagogical ideas associated with practice-based research will be developed. Prior to that development, the following section identifies several principles of practice-based research.

Practice-Based Research

From its beginning, social work has attempted to use science to improve practice. This is in accordance with the evolution of the profession beyond the religious and volunteer-based provision of charity that existed prior to its establishment. The specific utility of science has fallen into two categories. Sometimes practitioners are encouraged to use scientific methodology in the conduct of their practice; that is, to operate as empiricists themselves vis-à-vis their clients. Otherwise, science is framed as useful insofar as the results of scientific inquiry conducted elsewhere have bearing on the current practice situation. This means that the practitioner has responsibility to read and interpret research, but not to engage in knowledge production himself (Kirk & Reid, 2002). As frustration grew with the process of traditional research, where the individual professional is the downstream recipient of scientific knowledge, many practitioners sought a more local and immediate source of information to help make important decisions. This frustration coalesced into the development of an approach to practice called practice-based research, which is defined as "the conduct of research and the generation of knowledge within natural practice settings" (McMillen et al., 2009, p. 308). Developing the definition further, Dirkx (2006) noted that practitioner research "can involve a focus on technical issues and use traditional quantitative methodologies ... but such [procedures] are understood within a broader narrative that seeks to give voice to practice as perceived, understood and struggled with from the inside" (p. 276).
The knowledge development function of practice-based research can take the shape of large-scale, multi-agency joint ventures, or it can be as simple as pilot testing an ESI in a particular organization and documenting the lessons learned in the "living" process of implementation (Martinez-Brawley, 1995). McMillen et al. (2009) identified four characteristics of practice-based research. First, it produces data that reflect the conditions of providing services in community-based programs, not academic settings. Second, colleagues are connected in an interdependent research network, as are multiple organizations in some iterations; the longevity of the network transcends any individual time-limited investigation. Third, practitioners often partner with academic research institutions and negotiate the division of labor. Fourth, within their partnerships, practitioners realign traditional power imbalances such that they keep discretion over generating research ideas.


It is useful to compare these characteristics of practice-based research to the standard model of knowledge development and diffusion. In the standard model, experts based in labs or academic research centers use RCTs to identify optimal treatments for isolated problems; they then develop techniques and protocols that are incrementally taken up by the practice community through training and technical support (Schoech et al., 2006). Aside from the issues of relevance, differing units of analysis, and practicality, which have already been addressed as problematic features of traditional knowledge development, the model can also be critiqued for espousing an industrial and managerial assumption about product control, wherein the articulation of credible knowledge is the purview of experts and vetted knowledge, and the policies associated with it, flow down to practitioners to be absorbed (Lunt & Fouche, 2009).

Practice-based research takes a different approach. The impetus for knowledge development starts with the information needs of practitioners, not the interests of the researcher (Horsfall et al., 2011). Because it is impractical and potentially unethical to use random assignment and inflexible protocols in practice, practice-based research utilizes non-experimental or quasi-experimental techniques. It relies on instrumentation that is tailored to practice needs, not external research protocols, and the pragmatic needs of the program always outweigh the principles of basic or "pure" research (Epstein, 2001).

Academic researchers and professionals in practice encounter one another across a wide gulf within the standard model. Researchers are sometimes unfamiliar with the reality of practice situations, and practitioners are likewise reluctant to be involved in research because of suspicion and/or distrust of researchers. The theme of lack of familiarity is echoed in the following recounting of a training episode, facilitated by the Columbia University School of Social Work for three agencies in New York City, focused on teaching the basics of EBP:

We were struck by how difficult it is to provide such training in the real world of social agency practice because of barriers such as limited time, agency culture, and infrastructure, including access to Internet and research databases, high staff turnover, and limited resources to support the implementation of empirically supported practices once they are identified. (Mullen, Bellamy, Bledsoe, & Francois, 2007, p. 575)

Likewise, Amodeo, Storti, and Larson (2010) expressed the difficulty and surprise they encountered in implementing cognitive behavioral therapy training among addiction providers as follows:

We had expected to recruit clinical supervisors, but found that they were a rare commodity in the programs or rarely available to participate in this type of training venture; instead, clinical directors or program managers were often enrolled to fulfill this function due to the absence of clinical staff with supervisory experience and/or funds earmarked for supervisory staff. (p. 974)

These statements reflect the surprise of researchers as they encountered the real world of practice and attempted to train agency personnel according to standards they had developed. For their part, practitioners often distrust the research community. They report “hit and run” or “smash and grab” exchanges with researchers who appear to want to extract data for a short time or impose a new program from the outside. As Begun, Berger, Otto-Salaj, and Rose (2010) expressed: These experiences leave agency partners feeling violated, used, and robbed. These sentiments most likely arise when the community partner’s primary role in research is unidirectional; for example, if community-based partners are only giving access to study participants or providing data to a university researcher, they believe that the university researcher leaves the agency and staff no wiser or better as a result of everyone’s efforts. (p. 57)


All of these statements contain insinuations about power and who exercises it in relation to whom. The power differential is manifested in who develops the questions of interest; who selects the methods of research, including defining appropriate sources of data; who gathers and who analyzes data; who develops findings and conclusions; and who decides on the manner in which results will be disseminated (Hopper & Lincoln, 2009). Practice-based research realigns the traditional research hierarchy by relocating power over these decisions to the practice community rather than deferring to experts outside of practice. The role of academic research partners is to be "on tap rather than on top" (Lunt & Fouche, 2009). Practitioners often feel disempowered by heavy bureaucratic requirements placed on them to follow protocols, gather data, and produce required documentation (Burton & van den Broek, 2009; Parton, 2008). Both EBP and practice-based research are born out of the same set of practitioner frustrations and, as such, they should be more closely aligned. Stanhope et al. (2011) see a link between advancing EBP uptake among practitioners and the need to develop research in the context of practice settings. With their potential synergy noted, it is timely to consider how the process model of EBP might be qualified and updated by principles arising from practice-based research and, furthermore, what implications for teaching EBP in social work education programs accompany these modifications.

Teaching the EBP Process in Social Work Education

In the first sentence of this article, EBP was introduced as a nascent and still-forming idea. The debate about what it is and how it should be operationalized continues. One corollary that follows, in terms of education, is that it is difficult to teach about something that is still very new and unresolved. The indication for schools of social work to incorporate EBP into their teaching comes from several sources. The changing nature of information, especially its accessibility, has already been referred to, as has the preference among some educators to focus more on the process of decision making than on imbuing students with a complement of theories and facts. Some say that social work education is not even principally about facts at all, but is an "opinion-based pedagogy" that needs to become an "evidence-based pedagogy" (Soydan, 2007, p. 616). Several educators have suggested that social work programs employ teaching methods that are outdated. For instance, Thyer (2007) saw a focus on theory as less practical than a focus on how to assess the effectiveness of practice in terms of client outcomes. Many believe that the way research methods classes are structured is arcane: most research courses in Bachelor of Social Work and Master of Social Work programs teach students how to conduct research, while the vast majority of social work graduates will never conduct formal research in their careers. Instead, they have a manifest need to be proficient in the consumption of research, and this requires a different approach to teaching (Gibbs, 2007; Howard, Allen-Meares, & Ruffolo, 2007; Jenson, 2007). Similarly, content on statistical analysis often gets bogged down explaining how tests of significance are calculated when computer software performs these functions instantaneously (Rubin, 2011).
The five-step process identified earlier is used as a framework by several faculty members to teach the process of EBP to social work students. Each of those steps is explicated below, incorporating principles from the literature on practice-based research along with their pedagogical implications.

Step 1: Convert Information Needs Relative to Practice Decisions into Well-Structured, Answerable Questions

The process begins with the identification of a problem that needs to be addressed relative to the client's situation and the development of an answerable question related to that problem. This


is a more detailed process than it might appear. The practitioner has to specify the precise problem with which she and the client are concerned, identify the contributing factors that influence the problem, and decide which aspects of the problem will be prioritized (Monette, Sullivan, & DeJong, 2010). This means moving from nominal definitions of particular circumstances, like "well-being" or "difficulty," to operational definitions, which stipulate the procedures for how a concept is to be measured. Furthermore, not every question is an answerable question. Monette et al. suggested that questions pertaining to values, and existential questions (e.g., "What is my purpose on Earth?"), are not answerable in an objective sense open to broad consensus; answerable questions in EBP are empirical questions. They are shaped so that they can be explored through searching research databases. Gibbs (2007) listed five possible types of questions that can be formulated in this phase: questions about the effectiveness of interventions on specified outcomes, questions about prevention, questions about description, assessment questions, and risk or prognosis questions.

The literature on practice-based research informs educators that there are important constraints on the boundaries within which practitioners are at liberty to develop these initial questions. McMillen et al. (2009) suggested that practice-based research be used to investigate how organizational issues like funding, regulation, and accreditation affect clinical decision making. They maintain that a bottom-up process of inquiry is better suited to the complexity of that topic than a traditional research design would be. Descriptions of EBP refer to the intersection of best available evidence, clinical judgment, and client values and preferences (Gambrill, 2007; Howard, McMillen, & Pollio, 2003; Sackett et al., 1996). The agency or organizational setting employing the professional is left out of the description. To that end, Morago (2010) noted that while EBP is typically referred to as an endeavor of a competent individual practitioner, organizational dynamics typically weigh heavily in the process of professional decision making. He studied attitudes about EBP and the extent of its implementation across 155 social work agencies. His findings indicate that among those agencies, organizational policies and procedures were rated as more instrumental in practice decisions than professional judgment, and also more than client values or preferences. Adding organizational influence to the recognized importance of best available evidence, professional judgment, and client values is an important step toward framing EBP overall, and also toward asking a question that is feasible to answer given the way organizations operate. Social work educators should modify EBP content and principles accordingly. Organizational risk aversion may be a delimiter to the way client problems are framed, meaning that an open field of problem selection and question development can be unsettling in many practice settings. Students should be taught to frame their questions within the existing narrative structure, aligned with the prevailing culture, and operationalized in line with what is already in effect at the particular practice setting, as the sketch below illustrates.
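To make the anatomy of a well-built question concrete, the sketch below encodes an effectiveness question as a small data structure, in the spirit of Gibbs's (2003) client-oriented questions. It is a minimal illustration only: the field names, the hypothetical question, and the choice of the AUDIT scale as the operationalized outcome are assumptions for teaching purposes, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class AnswerableQuestion:
    """An effectiveness question in the spirit of Gibbs (2003).

    Field names and the example below are illustrative assumptions;
    the point is moving from a nominal concern ("well-being") to an
    operational, searchable definition.
    """
    client_type: str        # who the question concerns
    course_of_action: str   # intervention under consideration
    alternative: str        # comparison condition
    intended_result: str    # operationalized outcome measure

# A hypothetical question, operationalized with an instrument the
# agency is assumed to collect already (here, the AUDIT scale).
q = AnswerableQuestion(
    client_type="adults with co-occurring depression and alcohol misuse",
    course_of_action="motivational interviewing added to usual care",
    alternative="usual care alone",
    intended_result="reduction in AUDIT score at 90-day follow-up",
)

# The operationalized parts become candidate search terms for Step 2.
print(" AND ".join(f'"{part}"' for part in
                   (q.course_of_action, q.client_type, q.intended_result)))
```

Framing the question in the agency's own instruments, rather than a researcher-chosen scale, is precisely the kind of alignment with the practice setting argued for above.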
While this may seem to limit the creative and professional license of the individual, pragmatism must also be considered: agencies are more likely to move ahead with EBP if the operationalization of variables, and the nature of the questions it addresses, fit with the agency's modus operandi.

Step 2: Seek Out the Best Evidence with Which to Answer the Question

In the second step of the process, the carefully operationalized question leads to a search of many data sources for information that has bearing on the particular variables at hand. As above, while the wording of this step might appear simplistic, done well it is an elegant process. It involves locating and accessing useful databases or sources of information, using search terms effectively in searchable databases, and understanding which components of the question will be most useful in the search process (Grady, 2010). Developing these information-gathering skills is part of becoming what Howard et al. (2007) called an "information scientist." They and others (Gibbs, 2007) suggested that social work education is doing students a disservice by offering little


to no guidance in using information technology well in order to gather information to inform practice. This shortcoming has obvious implications for innovation in teaching; educators need to outfit students with the database skills they will need in the information age.

To reiterate the first principle of practice-based research listed above, it is concerned with producing data that reflect the conditions of providing services in community-based programs, not academic settings. Given this emphasis on local knowledge, there is support in the practice-based research literature for the use of practitioner- or agency-generated clinical data as a source of evidence under certain circumstances and after it has been subjected to recognized data analysis techniques (Epstein, 2001; Fitch & Grogan-Kaylor, 2012; McMillen et al., 2009). Information technology permeates virtually everything social workers do. In one survey of 2,200 social workers, over half of the sample reported spending more than 60% of their time on administrative work as opposed to client contact, and 95% of respondents believed that social work had become more bureaucratic over the past five years (Parton, 2008). In most agencies, computers facilitate that administrative work. Given these types of compliance burdens, it is not difficult to understand how practitioners feel overtaxed, alienated from the motives that attracted them to the field, and unavailable to the prospect of incorporating EBP into their work routines (Berger, Otto-Salaj, Stoffel, Hernandez-Meier, & Gromoske, 2009; Cawood, 2010; Morago, 2010). In point of fact, copious amounts of data are being gathered according to standardized, objective criteria as bits (or bytes) of information about practitioners' caseloads. It may well be that practitioners aggregate more standardized information than researchers do about the vulnerable populations who are their service users, but since agency-based information has not been publicly shared the way that research information has, that claim would be difficult to substantiate. Agencies' clinical and administrative databases are a source of potentially useful information that those who advocate EBP should recognize and that those who teach the EBP process should put forward as a source to be pursued; the sketch below suggests how simple such a pursuit can be.
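The following sketch illustrates, under stated assumptions, what mining such a database might look like at its simplest. The schema, column names, and sample rows are hypothetical stand-ins for de-identified agency records, not a standard agency data model, and any real use presupposes appropriate ethical safeguards.

```python
import sqlite3

# In-memory stand-in for a hypothetical agency clinical database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE case_records (service_type TEXT, outcome_score REAL)"
)
conn.executemany(
    "INSERT INTO case_records VALUES (?, ?)",
    [
        ("case management", 62.0),
        ("case management", 55.5),
        ("group therapy", 71.0),
        ("group therapy", 64.0),
        ("group therapy", 68.5),
    ],
)

# Summarize an outcome measure the agency already records for
# compliance purposes, grouped by service type.
query = """
    SELECT service_type, COUNT(*), AVG(outcome_score)
    FROM case_records
    GROUP BY service_type
    ORDER BY AVG(outcome_score) DESC
"""
for service, n, mean in conn.execute(query):
    print(f"{service}: n={n}, mean outcome score={mean:.1f}")
```

The technical bar here is deliberately low: the data already exist as a byproduct of documentation requirements; what is new is treating them as evidence.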
Step 3: Critically Appraise Evidence for Its Validity, Impact, and Applicability

After several sources of information have been gathered, the next step in the EBP process is to assess that information for its relative quality and for its fit with the circumstances of the case at hand. This is particularly difficult for practitioners because published research findings are typically written not for them but for a target audience of other academic researchers (Grady, 2010). In one study of the impact of EBP training and technical assistance on practitioner attitudes across three agencies, the authors found that even with substantial training and support from university partners, there was a striking "persistence from pre- to post-intervention of concerns about the lack of necessary skills to understand and judge the quality of research and the need for ongoing supervision and monitoring" (Manuel, Mullen, Fang, Bellamy, & Bledsoe, 2009). The ability to discern the earmarks of validity in published research and to appraise its overall quality relative to other research is the second skill (along with proficiency in information retrieval) exhibited by the type of person Howard et al. (2007) labeled an "information scientist." A variety of tools have been developed to help students appraise the value of research. Gibbs (2003) created a series of worksheets to guide this process. For example, the Quality of Study Rating Form (QSRF) examines the following indicators in each discrete study collected: statistical significance, absolute risk reduction, number needed to treat, and number needed to harm. These tools require a modest amount of proficiency in understanding research concepts and terms, as the short worked example below indicates.
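For readers unfamiliar with those indicators, the computation below derives absolute risk reduction, number needed to treat, and number needed to harm from hypothetical trial counts; the numbers are invented purely for illustration.

```python
# Hypothetical counts from a single appraised study; invented numbers.
treated_n, treated_relapsed = 100, 30   # 30% relapse with intervention
control_n, control_relapsed = 100, 45   # 45% relapse with usual care

eer = treated_relapsed / treated_n      # experimental event rate = 0.30
cer = control_relapsed / control_n      # control event rate = 0.45

arr = cer - eer                         # absolute risk reduction = 0.15
nnt = 1 / arr                           # number needed to treat ~ 6.7

# Number needed to harm applies the same logic to an adverse event.
treated_harmed, control_harmed = 8, 4
ari = treated_harmed / treated_n - control_harmed / control_n  # 0.04
nnh = 1 / ari                           # number needed to harm = 25

print(f"ARR = {arr:.2f}, NNT = {nnt:.1f}, NNH = {nnh:.0f}")
```

Read in practice terms: roughly seven clients would need to receive the intervention for one additional client to avoid relapse, while one additional adverse event would be expected for every twenty-five clients treated.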


Many practitioners struggle with this phase of the process because they lack the capacity to make sense of research results (note the Manuel et al., 2009, study above). Not only does this reflect poorly on the way social work programs have (or have not) been equipping their graduates for practice, it represents an occasion for practitioners to develop resistance to EBP as too technical and/or overwhelming.

In large part, practice-based research is an initiative to break the hegemony of RCTs as the single source of valid information. While considerable attention has been paid to parsing the quality of experimental and quasi-experimental research designs, less attention has been paid to the feasibility of RCT-based knowledge in complex community organizations and to considering alternative forms of knowledge and knowledge-creating processes. For instance, practice-based researchers might want to ascertain what factors (predictor variables) among their caseload of clients are associated with higher rates of arrest or hospitalization (criterion variables), so, upon finding the results of RCTs inhospitable to extrapolation, they might choose to perform a regression analysis with their own data. The external validity of this analysis becomes greater if there is a network of practice-based researchers standardizing their agency data in the same fashion and inputting it into a common database; note here that standards for de-identifying personal health information exist, and the presumption of such a shared database is that appropriate ethical guidelines have been followed in its establishment. As Rubin (2011) noted, many practice situations involve forming questions for which experimental design studies will not provide usable information due to feasibility issues in practice. The implication for teaching the EBP process that emerges from these concerns is that feasibility to practice should be highlighted more often than it has been. There has been a singular focus on the rigor of the methodological designs of studies gathered from database searches, to the exclusion of questions about relevance to practice. Educators should introduce this rigor-relevance dialectic and help students navigate its imperatives; the sketch below illustrates the kind of caseload analysis a practice-based researcher might run.
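A minimal sketch of that kind of analysis follows, assuming a synthetic, de-identified caseload and using scikit-learn as one of several libraries that could fit such a model; the predictor names and the generated data are illustrative assumptions, not real client records.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500  # synthetic stand-in for a pooled, de-identified caseload

# Hypothetical predictor variables drawn from routine agency records.
age = rng.integers(18, 65, n)
prior_admissions = rng.poisson(1.0, n)
unstable_housing = rng.integers(0, 2, n)

# Synthetic criterion variable: hospitalization during follow-up,
# generated so that the two risk factors genuinely matter.
logit = -2.0 + 0.6 * prior_admissions + 0.8 * unstable_housing
hospitalized = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([age, prior_admissions, unstable_housing])
model = LogisticRegression(max_iter=1000).fit(X, hospitalized)

for name, coef in zip(
    ["age", "prior_admissions", "unstable_housing"], model.coef_[0]
):
    print(f"{name}: odds ratio ~ {np.exp(coef):.2f}")
```

Nothing here depends on random assignment; the point is that routinely collected caseload data can yield practice-relevant, if tentative, signals about risk.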
Step 4: Apply the Results of This Appraisal to Practice and Policy Decisions

The process above culminates in apprehending the best available evidence to be used in developing a particular treatment approach for the case at hand. The next step requires consultation of the multiple concerns that have an interest in what the specific plan of treatment becomes. As stated above, those concerns have been limited to three categories by most EBP advocates: best available evidence, clinical judgment, and client values. The work involved for the practitioner in this step includes considering his own skillfulness at implementing the intervention recommended by the best available evidence, considering client characteristics that may affect how the intervention will be perceived, and considering what adaptations may be needed to the intervention in order to make it maximally effective for the client (Grady, 2010). For all the specificity and use of algorithm inherent or implied within the earlier steps of the EBP process, there is little help for social workers in practice trying to figure out how to go about actually forging a plan in the midst of these multiple concerns; this is often where allusions to art come into play. Shlonsky and Stern (2007) reflected on this weakness in the following passage:

It takes a great deal of clinical skill to successfully integrate current best evidence with client preferences/actions, clinical state/circumstances, and the practice context. Indeed, this coming together is the hardest part of the endeavor and is also the one we know the least about. We must be honest about our current limitations. EBP is an emerging approach, and it will take considerable time and effort to make it work. (pp. 607–608)

It is heartening that these authors are unconventional in that they include the practice context (i.e., organizational constraints) as a compelling interest in practice decisions. Other authors depict this balance as a binary contest between clinical judgment and evidence:

Practice can be tyrannized by external evidence when it is applied without practice expertise, because even rigorous and compelling evidence may be inapplicable to an individual client or inconsistent


with their goals. However, without current best evidence, practice rapidly becomes outdated, much to the detriment of our clients. (Howard et al., 2003, p. 255)

Defining conflicting interests, or sources of knowledge in this case, is not the same thing as helping practitioners steer a course of action through them. Recognizing the value of practice-based research, some have called for faculty members to engage themselves in the role of practitioner, taking up the work of providing services to clients. Grady (2010), for instance, wrote that in large research-oriented universities social work faculty often lose touch with the "messiness" of practice situations. She advocated strengthening the practice-based research model, but not in the way it is commonly understood: instead of encouraging practitioners to expand their research activity, she advocated that faculty should engage consistently in practice and service delivery. Along with this, she stated that tenure and promotion decisions should recognize the importance of faculty maintaining practice within the community. Thyer (2007), similarly, has called for a requirement that clinical social work faculty be licensed at the highest level possible within the states where they teach. If this were to happen more often in social work education, a better way forward in terms of advising students might be found by a new iteration of academics-as-practitioners-as-researchers.

Step 5: Evaluate the Effectiveness and Efficiency of Carrying Out Steps 1 to 4 and Seek Ways to Improve Future Practice

Practice evaluation has long been an important and under-utilized part of the practice process. Practice-based research encourages social workers to become much more involved with the analysis and interpretation of data than they normally are. At the same time, it recognizes barriers that must be overcome. Front-line workers typically have little ongoing exposure to the information they input into information systems; it is aggregated and analyzed by supervisors, but does not inform practice (Sapey, 1997). They are responsible for documentation but do not benefit from its potential. Practitioner-generated research addresses the alienation of practitioners by uncovering the power they have as observers and recorders of clinical data; it encourages a posture of critical reflection based on demonstrated outcomes. But in order to facilitate a new attitude toward data, practitioners will need more access to it; it must become less sequestered by agency leaders. Bradt, Roose, Bouverne-De Bie, and De Schryver (2011) noted that social work happens in an increasingly informational context and that, while the repercussions of the rising premium on data gathering are often experienced as alienation and frustration by social workers, there is a point of view available to practitioners that is strategically valuable: capitalize on the power and efficiency possible through databases and computer technology. Gambrill (2007) advocated for more transparency, including agency leaders making objective information about relative success in working with clients publicly available through the Internet. In terms of the role of social work education, the issue is whether it will do what is easier (placating students in their "math aversion" and the accompanying distaste for research and operational variables) or what is initially more difficult: demanding, for the sake of the clients that await them, that students be better equipped with skills, not opinions. Even a modest single-case evaluation, like the sketch below, is within reach of a graduate so equipped.
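One modest form such evaluation could take is single-case tracking of one client's scores against a baseline. The sketch below applies a common two-standard-deviation heuristic; the scores, the instrument, and the four-week baseline are invented for illustration.

```python
import statistics

# Hypothetical weekly symptom scores for one client (lower is better);
# the first four weeks are a pre-intervention baseline.
baseline = [24, 26, 23, 25]
intervention = [22, 20, 18, 17, 15]

mean_b = statistics.mean(baseline)
sd_b = statistics.stdev(baseline)
threshold = mean_b - 2 * sd_b  # two-standard-deviation band

below = [s for s in intervention if s < threshold]
print(f"baseline mean = {mean_b:.1f}, improvement threshold = {threshold:.1f}")
print(f"{len(below)} of {len(intervention)} intervention weeks fall below "
      "the band, suggesting change beyond baseline fluctuation")
```

The arithmetic is elementary; what matters pedagogically is the habit of asking, week by week, whether the work is demonstrably helping.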
Rubin (2011) confronted everyone involved in social work education when he asked, "What is the role of social work education with regard to inadequacies in the practice community? Should we just teach those things that fit the status quo? Or should we strive to make it better?" (p. 69).

CONCLUSION

The steps of the EBP process represent a very useful architecture in which social work professionals can do what has always been important to them: empowering clients and challenging


injustice. It is useful because it fits with the information age, with the demand for accountability, and with the concomitant awareness that to work with clients is to introduce the possibility that one is harming them, not helping. How does the concerned professional know whether her work with clients leaves them better, in any demonstrable way, than they would have been without her? EBP is a way of feeling deeply about that question and acting accordingly. Concretizing the steps of EBP is a way to keep taking it seriously, when the alternative is, upon seeing the nuanced definitional debates, to walk away unchallenged thinking, "I guess everyone has their own opinion." Advancing EBP in the social work profession is potentiated by the way students are educated about it and the skills with which they are equipped when they graduate. This article is written in strong support of teaching the stepwise approach to EBP, with important modifications introduced from the related and co-evolving field of practice-based research.

REFERENCES

Amodeo, M., Ellis, M. A., & Samet, J. H. (2006). Introducing evidence-based practices into substance abuse treatment using organization development methods. American Journal of Drug & Alcohol Abuse, 32(4), 555–560. doi:10.1080/00952990600920250
Amodeo, M., Storti, S. A., & Larson, M. J. (2010). Moving empirically supported practices to addiction treatment programs: Recruiting supervisors to help in technology transfer. Substance Use & Misuse, 45(6), 968–982. doi:10.3109/10826080903534467
Baker, L. R., Stephens, F., & Hitchcock, L. (2010). Social work practitioners and practice evaluation: How are we doing? Journal of Human Behavior in the Social Environment, 20(8), 963–973. doi:10.1080/15433714.2010.498669
Begun, A. L., Berger, L. K., Otto-Salaj, L., & Rose, S. J. (2010). Developing effective social work university-community research collaborations. Social Work, 55(1), 54–62.
Berger, L. K., Otto-Salaj, L., Stoffel, V. C., Hernandez-Meier, J., & Gromoske, A. N. (2009). Barriers and facilitators of transferring research to practice: An exploratory case study of motivational interviewing. Journal of Social Work Practice in the Addictions, 9, 145–162. doi:10.1080/15332560902806199
Bilsker, D., & Goldner, E. M. (2000). Teaching evidence-based practice in mental health. Research on Social Work Practice, 10(5), 664–669.
Bradt, L., Roose, R., Bouverne-De Bie, M., & De Schryver, M. (2011). Data recording and social work: From the relational to the social. The British Journal of Social Work, 41, 1372–1382.
Burton, J., & van den Broek, D. (2009). Accountable and countable: Information management systems and the bureaucratization of social work. British Journal of Social Work, 39(7), 1326–1342. doi:10.1093/bjsw/bcn027
Cawood, N. D. (2010). Barriers to the use of evidence-supported programs to address school violence. Children & Schools, 32(3), 143–149.
Cohen, B. J. (2011). Design-based practice: A new perspective for social work. Social Work, 56(4), 337–346.
Dirkx, J. M. (2006). Studying the complicated matter of what works: Evidence-based research and the problem of practice. Adult Education Quarterly, 56(4), 273–290. doi:10.1177/0741713606289358
Domenech Rodríguez, M. M., Baumann, A. A., & Schwartz, A. L. (2011). Cultural adaptation of an evidence based intervention: From theory to practice in a Latino/a community context. American Journal of Community Psychology, 47(1), 170–186. doi:10.1007/s10464-010-9371-4
Epstein, I. (2001). Using available clinical information in practice-based research: Mining for silver while dreaming of gold. Social Work in Health Care, 33(3), 15–32.
Fitch, D., & Grogan-Kaylor, A. (2012). Using agency data for evidence-based programming: A university-agency collaboration. Evaluation and Program Planning, 35(1), 105–112.
Gambrill, E. (2007). Transparency as the route to evidence-informed professional education. Research on Social Work Practice, 17(5), 553–560. doi:10.1177/1049731507300149
Gibbs, L. E. (2003). Evidence-based practice for the helping professions: A practical guide with integrated multimedia. Pacific Grove, CA: Thomson Brooks/Cole.
Gibbs, L. (2007). Applying research to making life-affecting judgments and decisions. Research on Social Work Practice, 17(1), 143–150. doi:10.1177/1049731506294802
Grady, M. D. (2010). The missing link: The role of social work schools and evidence-based practice. Journal of Evidence-Based Social Work, 7(5), 400–411. doi:10.1080/15433711003591101


Hopper, K., & Lincoln, A. (2009). Capacity-building. In Handbook of service user involvement in mental health research (pp. 73–86). Chichester, UK: John Wiley & Sons. doi:10.1002/9780470743157.ch6
Horsfall, J., Cleary, M., & Hunt, G. E. (2011). Developing partnerships in mental health to bridge the research-practitioner gap. Perspectives in Psychiatric Care, 47(1), 6–12. doi:10.1111/j.1744-6163.2010.00265.x
Howard, M. O., Allen-Meares, P., & Ruffolo, M. C. (2007). Teaching evidence-based practice: Strategic and pedagogical recommendations for schools of social work. Research on Social Work Practice, 17(5), 561–568. doi:10.1177/1049731507300191
Howard, M. O., McMillen, C. J., & Pollio, D. E. (2003). Teaching evidence-based practice: Toward a new paradigm for social work education. Research on Social Work Practice, 13(2), 234–259. doi:10.1177/1049731502250404
Hutchison, E. (1999). Dimensions of human behavior: Person and environment (Vol. 1). Thousand Oaks, CA: Pine Forge.
Jenson, J. M. (2007). Evidence-based practice and the reform of social work education: A response to Gambrill and Howard and Allen-Meares. Research on Social Work Practice, 17(5), 569–573. doi:10.1177/1049731507300236
Kirk, S. A., & Reid, W. J. (2002). Science and social work: A critical appraisal. New York, NY: Columbia University Press.
Lunt, N., & Fouche, C. (2009). Action research for developing social workers' research capacity. Educational Action Research, 17(2), 225–237. doi:10.1080/09650790902914209
Luongo, G. (2007). Re-thinking child welfare training models to achieve evidence-based practices. Administration in Social Work, 31(2), 87–96. doi:10.1300/J147v31n02_06
Manuel, J. I., Mullen, E. J., Fang, L., Bellamy, J. L., & Bledsoe, S. E. (2009). Preparing social work practitioners to use evidence-based practice: A comparison of experiences from an implementation project. Research on Social Work Practice, 19(5), 613–627.
Martinez-Brawley, E. (1995). Knowledge diffusion and transfer of technology: Conceptual premises and concrete steps for human services innovators. Social Work, 40(5), 670–682.
McBeath, B., Briggs, H. E., & Aisenberg, E. (2010). Examining the premises supporting the empirically supported intervention approach to social work practice. Social Work, 55(4), 347–357.
McHugo, G. J., Drake, R. E., Whitley, R., Bond, G. R., Campbell, K., Rapp, C. A., … Finnerty, M. T. (2007). Fidelity outcomes in the national implementing evidence-based practices project. Psychiatric Services, 58(10), 1279–1284.
McMillen, J. C., Lenze, S. L., Hawley, K. M., & Osborne, V. A. (2009). Revisiting practice-based research networks as a platform for mental health services research. Administration & Policy in Mental Health & Mental Health Services Research, 36(5), 308–321. doi:10.1007/s10488-009-0222-2
Monette, D., Sullivan, T., & DeJong, C. (2010). Applied social research: A tool for the human services (8th ed.). Belmont, CA: Brooks/Cole.
Morago, P. (2010). Dissemination and implementation of evidence-based practice in the social services: A UK survey. Journal of Evidence-Based Social Work, 7(5), 452–465. doi:10.1080/15433714.2010.494973
Mullen, E. J., Bellamy, J. L., Bledsoe, S. E., & Francois, J. J. (2007). Teaching evidence-based practice. Research on Social Work Practice, 17(5), 574–582. doi:10.1177/1049731507303234
Nevo, I., & Slonim-Nevo, V. (2011). The myth of evidence-based practice: Towards evidence-informed practice. British Journal of Social Work, 41(6), 1176–1197.
Parton, N. (2008). Changes in the form of knowledge in social work: From the 'social' to the 'informational'? British Journal of Social Work, 38(2), 253–269. doi:10.1093/bjsw/bcl337
Patterson, D. A., & McKiernan, P. M. (2010). Organizational and clinical implications of integrating an alcohol screening and brief intervention within non-substance abuse serving agencies. Journal of Evidence-Based Social Work, 7(4), 332–347. doi:10.1080/15433710903256880
Rieckmann, T., Bergmann, L., & Rasplica, C. (2011). Legislating clinical practice: Counselor responses to an evidence-based practice mandate. Journal of Psychoactive Drugs, 27–39. doi:10.1080/02791072.2011.601988
Rubin, A. (2011). Teaching EBP in social work: Retrospective and prospective. Journal of Social Work, 11(1), 64–79. doi:10.1177/1468017310381311
Sackett, D. L. (2000). Evidence-based medicine: How to practice and teach EBM (2nd ed.). Edinburgh, UK: Churchill Livingstone.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn't. British Medical Journal, 312(7023), 71.
Sapey, B. (1997). Social work tomorrow: Towards a critical understanding of technology in social work. British Journal of Social Work, 27(6), 803–814.
Schoech, D., Basham, R., & Fluke, J. (2006). A technology enhanced EBP model. Journal of Evidence-Based Social Work, 3(3), 55. doi:10.1300/J394v03n03_05
Shlonsky, A., & Gibbs, L. (2004). Will the real evidence-based practice please stand up? Teaching the process of evidence-based practice to the helping professions. Brief Treatment and Crisis Intervention, 4(2), 137–153.


Shlonsky, A., & Stern, S. B. (2007). Reflections on the teaching of evidence-based practice. Research on Social Work Practice, 17(5), 603–611. doi:10.1177/1049731507301527
Soydan, H. (2007). Improving the teaching of evidence-based practice: Challenges and priorities. Research on Social Work Practice, 17(5), 612–618. doi:10.1177/1049731507300144
Springer, D. W. (2007). The teaching of evidence-based practice in social work higher education—Living by the Charlie Parker dictum: A response to papers by Shlonsky and Stern, and Soydan. Research on Social Work Practice, 17(5), 619–624. doi:10.1177/1049731506297762
Stanhope, V., Tuchman, E., & Sinclair, W. (2011). The implementation of mental health evidence based practices from the educator, clinician and researcher perspective. Clinical Social Work Journal. doi:10.1007/s10615-010-0309-y
Thyer, B. (2007). Social work education and clinical learning: Towards evidence-based practice? Clinical Social Work Journal, 35(1), 25–32. doi:10.1007/s10615-006-0064-2
Woody, J. D., D'Souza, H. J., & Dartman, R. (2006). Do Master's in social work programs teach empirically supported interventions? A survey of deans and directors. Research on Social Work Practice, 16(5), 469–479. doi:10.1177/1049731505285453
Zarin, D. A., Young, J. L., & West, J. C. (2005). Challenges to evidence-based medicine: A comparison of patients and treatments in randomized controlled trials with patients and treatments in a practice research network. Social Psychiatry and Psychiatric Epidemiology, 40(1), 27–35. doi:10.1007/s00127-005-0838-9
Zlotnik, J. L. (2007). Evidence-based practice and social work education: A view from Washington. Research on Social Work Practice, 17(5), 625–629. doi:10.1177/1049731507300168
