Evaluation and Program Planning 43 (2014) 16–26


Early implementation evaluation of a multi-site housing first intervention for homeless people with mental illness: A mixed methods approach

Geoffrey Nelson a, Ana Stefancic b, Jennifer Rae c, Greg Townley d, Sam Tsemberis b, Eric Macnaughton e, Tim Aubry c, Jino Distasio f, Roch Hurtubise g, Michelle Patterson h, Vicky Stergiopoulos i, Myra Piat j, Paula Goering k

a Wilfrid Laurier University, Canada
b Pathways to Housing, United States
c University of Ottawa, Canada
d Portland State University, United States
e Mental Health Commission of Canada, Canada
f University of Winnipeg, Canada
g Université de Sherbrooke, Canada
h Simon Fraser University, Canada
i Centre for Inner City Health, University of Toronto, Canada
j Douglas Hospital, McGill University, Canada
k Centre for Addiction and Mental Health, University of Toronto, Canada

Article history: Received 8 January 2013; received in revised form 8 October 2013; accepted 10 October 2013

Keywords: Implementation; Fidelity; Mixed methods; Mental health; Homelessness; Housing First

Abstract

This research sought to determine whether the implementation of Housing First in a large-scale, multi-site Canadian project for homeless participants with mental illness shows high fidelity to the Pathways Housing First model, and what factors help or hinder implementation. Fidelity ratings for 10 Housing First programs in five cities were made by an external quality assurance team along five key dimensions of Housing First, based on 84 key informant interviews, 10 consumer focus groups, and 100 chart reviews. An additional 72 key informant interviews and 35 focus groups yielded qualitative data on factors that helped or hindered implementation. Overall, the findings show a high degree of fidelity to the model, with more than 71% of the fidelity items being scored higher than 3 on a 4-point scale. The qualitative research found that both delivery system factors, including community and organizational capacity, and support system factors, including training and technical assistance, facilitated implementation. Fidelity challenges include the availability of housing, consumer representation in program operations, and limitations to the array of services offered. Factors that accounted for these challenges include low vacancy rates, challenges of involving recently homeless people in program operations, and a lack of services in some of the communities. The study demonstrates how the combined use of fidelity assessment and qualitative methods can be used in implementation evaluation to develop and improve a program.

© 2013 Elsevier Ltd. All rights reserved.

1. Introduction

Homelessness and mental illness have emerged as a pressing and costly social problem in Canada (Frankish, Hwang, & Quantz, 2009) and other western industrial countries (e.g., Minnery & Greenhalgh, 2007). For example, in a study of 300 shelter users in Toronto, 71% had either a mental illness or addiction or both (Goering, Tolomiczenko, Sheldon, Boydell, & Wasylenki, 2002). The Pathways to Housing program model of Housing First (HF) in New York City (Tsemberis, Gulcur, & Nakae, 2004) is a revolutionary approach to addressing homelessness among people with severe and persistent mental illness. In contrast to "treatment first" approaches, the Pathways HF approach provides housing to homeless people with mental illness shortly after intake without any requirements, rather than offering housing as a reward for having made progress in treatment. The Pathways HF model, also known as "supported housing" (Carling, 1995), is a consumer-driven approach that includes choice over housing, separation of housing and clinical treatment, and delivery of recovery-oriented services that focus on facilitating community integration. Pathways HF tenants receive rent supplements that enable them to
secure typical housing in the community (e.g., scattered-site apartments), with the tenant paying no more than 30% of her/his income toward rent. Findings from nine randomized controlled trials (RCTs) in the U.S. demonstrate that the HF supported housing approach is more effective at reducing homelessness, hospitalization, and incarceration and at increasing housing stability and housing choice, as compared to treatment as usual (TAU), the residential continuum of care approach, or clinical treatment alone (Aubry et al., in press).

Because of its progressive philosophy and its success in promoting positive outcomes demonstrated through rigorous research, the Pathways HF approach has been widely endorsed and disseminated as an evidence-based practice. In the U.S., for example, the Substance Abuse and Mental Health Services Administration (SAMHSA), the Department of Housing and Urban Development (HUD), Veterans Affairs (VA), and the Interagency Council on Homelessness developed the Collaborative Initiative to Help End Chronic Homelessness (CICH), which implemented many HF projects across the U.S. (Mares & Rosenheck, 2011; McGraw et al., 2010). Also, many U.S. states and cities have developed 10-year plans to end chronic homelessness using the Pathways HF approach (Tsemberis, 2010). Currently, HF supported housing is moving beyond the U.S. to Australia (Johnson, Parkinson, & Parsell, 2013), Canada (Calgary Homeless Foundation, 2012; Goering et al., 2011), and Europe (Pleace & Bretherton, 2012).

There are, however, challenges to scaling up evidence-based practices such as HF. Pleace and Bretherton (2012) note that HF can be defined in many different ways, and Johnson et al. (2013) question whether scaling up HF might lead to a paradigm shift or program drift away from the original Pathways HF model. Core principles, such as consumer choice over housing and separation of housing and treatment, can be ignored or only partially implemented, resulting in status quo housing programs rather than real innovation. Research on the implementation of HF has in fact shown considerable variability in the implementation of core principles (McHugo et al., 2004; Rog & Randolph, 2002; Wong, Filoromo, & Tennille, 2007). Based on their review of implementation studies in the area of prevention programs for children, Durlak and DuPre (2008) stated that "results from over 500 quantitative studies offered strong empirical support to the conclusion that the level of implementation affects the outcomes obtained" (p. 327).

The extant research on HF has yet to incorporate fidelity assessments as part of program evaluations. As the HF supported housing model is scaled up, questions arise about whether interventions under study have been adequately specified and what model components account for effectiveness. As Mowbray, Holter, Teague, and Bybee (2003) noted, "the development and use of valid fidelity criteria is now an expected component of quality evaluation practice" (p. 316). Until recently, however, the absence of a fidelity scale contributed to a lack of clarity in describing HF in standardized ways. Tabol, Drebing, and Rosenheck (2010) conducted a comprehensive review of the literature on HF supported housing programs. The review examined in a post hoc manner the degree of fidelity to the Pathways HF model in the descriptions of programs in published articles. A total of 15 key elements, identified as central to supported housing, clustered into five broader, overarching categories: (a) normal housing, (b) flexible supports, (c) separation of housing and services, (d) choice, and (e) immediate placement. Tabol et al.'s analysis found that less than half of the supported housing programs adhered to most of the 15 elements. Based on these findings, they concluded that the lack of fidelity to the critical ingredients of the supported housing model in many programs has hindered the broad dissemination, implementation, and evaluation of this approach.


One important implementation issue is the tension between fidelity to the original Pathways model and adaptation to the local context (Blakely et al., 1987). There is a balance to be achieved in adhering to the core program ingredients and adapting programs so that they are relevant to particular community and cultural contexts, as both dimensions are related to positive outcomes (Durlak & DuPre, 2008). Although there is some debate in this area (Blakely et al., 1987), adaptations to local context are possible and desirable and can occur without compromising the essential principles or functions of the intervention (Hawe, Shiell, & Riley, 2004). For example, to better serve consumers from different ethno-racial backgrounds in a large, culturally diverse city, Stergiopoulos et al. (2012) combined HF principles with an anti-oppression/anti-racism framework.

The purpose of this article is to present findings on the evaluation of the early implementation of HF in a large-scale, multi-site Canadian project, known as At Home/Chez Soi. The early implementation evaluation entailed a fidelity assessment and a qualitative evaluation of factors facilitating or hindering implementation of the HF supported housing model.

2. The Canadian At Home/Chez Soi project

The At Home/Chez Soi project, funded for $110 million over four years (2009–2013) by Health Canada through the Mental Health Commission of Canada (MHCC), has implemented HF for homeless people with mental illness in five Canadian cities: Moncton, Montréal, Toronto, Winnipeg, and Vancouver (Keller et al., 2012). More than 2200 participants have been enrolled in this project. The project hired and trained staff in the Pathways HF model before participants were recruited into the study. This research demonstration project is a randomized controlled trial (RCT) that compares the effectiveness of HF to treatment as usual (TAU) at baseline, 6, 12, 18, and 24 months on a variety of outcome measures. Nested within each of these two conditions are two groups: those with high needs, who are served with Assertive Community Treatment (ACT) in the HF condition, and those with moderate needs, who are served with Intensive Case Management (ICM) in the HF condition (Goering et al., 2011). In addition to quantitative analyses of outcomes and costs (Goering et al., 2012), the project has employed qualitative methods to understand the conception (Macnaughton, Nelson, & Goering, 2013), planning (Nelson et al., 2013), implementation, and narratives of participants.

The ACT programs have a recovery orientation, with services provided by a team rather than a single staff member; ACT has a staff-to-participant ratio of 1:10 and includes a psychiatrist, a nurse, and a peer specialist; staff members are closely involved with hospital admissions and discharges; the ACT team meets daily; and staff are available seven days per week with crisis coverage around the clock (Goering et al., 2011). In ICM programs, services are provided by a single case manager; the staff-to-participant ratio was initially 1:20 but was later changed to 1:16 because the needs of the moderate needs group were greater than expected; case managers work closely with other services and accompany participants to appointments; there are monthly case conferences; and services are provided seven days a week, 12 h per day (Goering et al., 2011).

3. Mixed methods approach to implementation evaluation

A mixed methods approach to fidelity and implementation evaluation of the At Home/Chez Soi project was used (Macnaughton, Goering, & Nelson, 2012; Palinkas et al., 2011). We used a design that addresses both triangulation (different methods for assessing implementation) and complementarity (different methods for providing a fuller understanding of implementation) (Creswell & Plano Clark, 2011). Moreover, both the quantitative
fidelity data and qualitative data on factors that help or hinder implementation were collected at roughly the same time. The value of this design is that a more complete understanding of the phenomenon of interest can be obtained through multiple methods and from multiple sources.

Program fidelity refers to a quantitative assessment of the degree to which implementation adheres to the core principles and ingredients of the HF model (Tsemberis & Asmussen, 1999). Fidelity assessment has become increasingly commonplace in community mental health, with studies incorporating scales to measure other evidence-based practices such as ACT (DACTS, TMACT; Monroe-DeVita, Teague, & Moser, 2011; Teague, Bond, & Drake, 1998), Supported Employment (Bond, McHugo, Becker, Rapp, & Whitley, 2008), and Integrated Dual Diagnosis Treatment (IDDT; Chandler, 2011). More recently, the SAMHSA (2010) Permanent Supportive Housing (PSH) Toolkit was developed to guide implementation and evaluation of supportive housing programs, and it heavily informed the development of the HF Fidelity Scale. Proctor et al. (2011) have conceptualized fidelity as one important implementation outcome.

We also used qualitative methods to understand the dynamics of program implementation in different contexts and "how" and "why" numerical fidelity ratings show more or less adherence to the HF model (Patton, 2002, 2011). To understand implementation, researchers have argued for the use of an ecological framework that draws attention to two broad sets of influences on implementation: service delivery system factors and support system factors (Durlak & DuPre, 2008; Raghavan, Bright, & Shadoin, 2008). Service delivery factors include community and organizational capacity factors, service-provider qualities, and characteristics of the innovation, while support system factors include training and technical assistance. While Durlak and DuPre's (2008) review focused on prevention programs for children, many of the factors that they uncovered have been found to be barriers or facilitators to the implementation of evidence-based mental health programs for adults in several recent qualitative studies (Mancini et al., 2009; McGraw et al., 2010; Rapp et al., 2010; Seffrin, Panzano, & Roth, 2008; Torrey, Bond, McHugo, & Swain, 2012).

4. Research questions

There are no systematic studies of the early implementation of HF in the literature. We sought to fill this gap in knowledge by addressing the following questions:

1. What were the fidelity strengths and challenges in the implementation of HF?
2. What factors helped and what factors hindered the implementation of HF?

5. Methodology

5.1. Fidelity evaluation

The Pathways HF Fidelity Scale was used to assess program fidelity (Tsemberis, 2010). Some items for the HF Fidelity Scale were adopted from the DACTS, TMACT, and PSH fidelity scales (Monroe-DeVita et al., 2011; SAMHSA, 2010; Teague et al., 1998; Williams, Banks, Robbins, Oakley, & Dean, 2001), while others were created. Two versions of the Pathways HF Fidelity Scale were developed – one for ACT and one for ICM (see Appendix 1 for a list of all items for the two versions of the scale). The main difference between the two scales was that ACT teams were assessed on the degree to which they directly provided an array of services, whereas ICM teams were assessed on the degree to which they were able to broker these same services by establishing an effective network of existing community-based providers. Aside from this difference, the two versions were identical.

A total of 38 items were grouped within five overarching domains. The first three domains were conceptualized as homogeneous constructs reflecting the HF philosophy: Housing Choice and Structure, Separation of Housing and Services, and Service Philosophy. The fourth domain, Service Array, refers to the different types of services that are available in the community to people with serious mental illness (e.g., substance abuse treatment, employment and educational services). The fifth domain, Program Structure, was not conceptualized as a homogeneous construct that reflects HF, but rather as a number of diverse items that constitute good programming (e.g., low participant/staff ratio, frequent meetings, participant representation in the program).

In the original development of the scale, 20 housing programs serving people with mental illness in California were rated (Stefancic, Tsemberis, Messeri, & Drake, 2013), and Cronbach's alpha coefficients showed good internal consistency for the first four domains: Housing Choice and Structure (.80), Separation of Housing and Services (.83), Service Philosophy (.92), and Service Array (.71). In terms of validity, the 10 At Home/Chez Soi programs, which are based on the Pathways HF model, scored significantly higher than the 20 California programs, which did not explicitly follow the HF model, on the three defining domains of Housing First: Housing Choice and Structure, t(29) = 7.88, p < .01, Separation of Housing and Services, t(29) = 5.75, p < .01, and Service Philosophy, t(29) = 2.21, p < .05. They did not differ significantly on the domains of Service Array or Program Structure, which are more generic to different types of community mental health programs.

Fidelity assessments were conducted with five ACT teams and five ICM teams at the five sites by an eight-member Quality Assurance (QA) team consisting of clinicians, researchers, housing experts, and a consumer representative, who were external to the sites and experts in the Pathways HF model. Members of the QA team underwent a one-day training on the use of the scale, and they observed or participated in two practice ratings of programs in the U.S. Each of the 38 items was rated by the QA team on a 4-point scale, with a high score indicating a high level of fidelity. For example, to obtain a maximum fidelity score of 4 on the item of housing availability, 85% of program participants must have moved into housing of their choosing within six weeks. Ratings could include half-point increments (e.g., 3.5). Our benchmark for a high level of fidelity on an item was a score of 3.5 or 4, as these scores indicate either maximum or near maximum fidelity.

Data used to score the dimensions of the scale were obtained from multiple sources. The fidelity assessment consisted of a full-day site visit to each program by four to six QA team members and included observing program meetings, conducting interviews with program staff, chart reviews, and focus groups with consumers. Two QA team members reviewed each data source. Site visits were conducted between August 2010 and November 2010, when teams had been operating for 9–13 months, so the focus was on early implementation and fidelity.
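To make the scoring arithmetic concrete, the sketch below shows how item ratings on the 1–4 scale could be rolled up into a domain score (assuming, consistent with Tables 1 and 2, that a domain score is the mean of its item ratings) and how two groups of programs could be compared with an independent-samples t-test, as in the validity analysis above. The item ratings, program labels, and group scores are hypothetical placeholders, not data from the study.

```python
from statistics import mean
from scipy import stats

# Hypothetical item ratings (1-4, half-point increments allowed) for two illustrative programs.
# The actual scale has 38 items in five domains; only one six-item domain is sketched here.
housing_choice_items = {
    "program_A": [3.5, 2.0, 4.0, 4.0, 4.0, 3.5],
    "program_B": [3.0, 1.5, 4.0, 4.0, 3.5, 4.0],
}

def domain_score(item_ratings):
    """Domain score taken as the mean of its item ratings (assumed aggregation rule)."""
    return mean(item_ratings)

for program, items in housing_choice_items.items():
    score = domain_score(items)
    # Benchmark used in the paper: 3.5 or higher indicates maximum or near maximum fidelity.
    print(program, round(score, 2), "high fidelity" if score >= 3.5 else "below benchmark")

# Validity-style comparison: domain scores for HF programs vs. comparison programs,
# tested with an independent-samples t-test (hypothetical scores, not study data).
hf_programs = [3.8, 3.6, 3.9, 3.5, 3.7, 3.6, 3.8, 3.9, 3.4, 3.7]
comparison_programs = [2.1, 2.8, 3.0, 2.5, 1.9, 2.7, 2.2, 3.1, 2.4, 2.6,
                       2.9, 2.0, 2.3, 2.8, 2.5, 2.2, 3.0, 2.6, 2.4, 2.7]
t_stat, p_value = stats.ttest_ind(hf_programs, comparison_programs)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```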
Interviews were conducted with specialized front-line staff (e.g., psychiatrists), clinicians (e.g., case managers), management (e.g., program directors), and members of the local housing team. Interviews were semi-structured and lasted approximately 45 min, with interviewers taking notes. The consumer focus groups lasted approximately one and a half hours with 8–12 participants. For the chart review, the QA team reviewed a random sample of 10 charts, including progress notes for the past month, as well as the most recent treatment plan and assessments. The total sample included 84 staff interviews, 10 consumer focus groups, and 100 chart reviews. In advance of the visit, the fidelity team also collected and reviewed information from each program related to housing, clinical service policies and procedures, populations served, and program operations.

The final scores for each item on the Pathways HF Fidelity Scale were reached through QA team discussion and consensus, which is very similar to the approach used by Torrey et al. (2012) in their study of fidelity of evidence-based practices in community mental health. The QA team met to review all the data obtained in the site visit. For each item, the team members shared their independent ratings. A discussion followed until consensus was achieved on the rating for each item. Equal weighting was given to all the data sources. After this process, the QA team conducted a debriefing session with each program to discuss preliminary findings. Debriefing sessions began with brief presentations of the QA team's findings and were followed by a discussion with program staff. In this format, the QA team could report initial observations and recommendations, and the program had the opportunity to comment, clarify, offer additional information, and provide feedback. After each visit, the QA team prepared a report that included a summary of program operation and implementation as well as detailed feedback on each item that described program strengths and challenges and made recommendations for improvement. Reports were first sent out as drafts to staff, soliciting their input and feedback, and were then edited based on program input and sent back to the programs as final versions.

5.2. Qualitative evaluation of implementation

Following the QA team assessment of fidelity, local site researchers gathered qualitative data on barriers and facilitators of implementation. Sampling of the key informants was purposeful: individuals who played a key role in program implementation were selected and interviewed individually (Patton, 2002). These informants, many of whom were also interviewed for the fidelity visits, included the Site Coordinators, Principal Investigators, Team Leads for clinical and housing programs, psychiatrists, housing staff representatives, and landlords. Front-line project staff and consumers, most of whom participated directly in the programs, were interviewed in focus groups. The total sample across all five sites consisted of 64 key informant interviews and 35 focus groups with 211 participants. Additionally, the eight members of the QA team were interviewed by members of the National Qualitative Research Team. All interviews were conducted between May 2010 and April 2011 to examine early implementation, and all participants provided informed consent to participate in the research.

Common key informant and focus group interview guides were used across the sites that addressed the general issues of what helped and what hindered implementation (e.g., relationships among project stakeholders, organizational structures). Key informant interviews and focus groups were conducted, in either English or French, at the participants' workplaces or at the site offices. A small number of key informant interviews and landlord
interviews were conducted over the phone. All interviews were audio recorded and transcribed verbatim.

The approach to data analysis at each of the sites involved thematic analysis. Site researchers sought and identified "common threads" throughout the data, drawing out significant concepts that emerged from individual interviews along with concepts that linked interviews together. They also used the constant comparative method during each stage of the analysis to further develop codes and themes (Charmaz, 2007). Each site went through a process of member-checking with people who were interviewed for the site reports to establish the trustworthiness of the data (Lincoln & Guba, 1986). Qualitative researchers produced site reports on the implementation process and asked stakeholders to review and provide input on the reports to ensure that the reports reflected the constructions of the participants and not those of the researchers (Lincoln & Guba, 1986).

For the cross-site analysis, members of the National Qualitative Research Team and the QA team read the five qualitative implementation site reports and the five ACT and five ICM fidelity reports. A teleconference was held in which the members of these two teams shared their impressions of the reports. From this discussion emerged two main topics: (a) what worked well in implementation and (b) challenges to implementation. For the qualitative data, matrix displays were constructed with sites as one dimension and, consistent with the ecological framework, helping and hindering factors at multiple levels of analysis as the other dimension (Durlak & DuPre, 2008). Relevant codes from each of the site reports were included in the cells of the matrix (Miles & Huberman, 1994). A draft cross-site report was written using a multiple case study approach (Stake, 2005). Member-checking to ensure that the analyses reflected the realities of site participants was undertaken by having the site researchers (including the Principal Investigators and qualitative researchers) and other key site stakeholders (housing, ACT, and ICM team members, peer specialists, and members of the site operations teams) review and provide input on the report's findings. Feedback from the sites was incorporated into the final cross-site report.
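As an illustration of the matrix-display step described above, the sketch below builds a simple sites-by-factors matrix in which each cell collects the codes drawn from one site's report; the site labels and codes used here are hypothetical placeholders rather than the project's actual codes.

```python
# Minimal sketch of a qualitative matrix display (in the spirit of Miles & Huberman):
# rows are sites, columns are ecological levels of helping/hindering factors,
# and each cell holds the codes drawn from that site's report.
# Site labels and codes below are hypothetical placeholders, not study data.

factor_levels = ["community", "organization/team", "innovation", "support system"]
sites = ["Site 1", "Site 2", "Site 3", "Site 4", "Site 5"]

# Empty matrix: {site: {level: [codes]}}
matrix = {site: {level: [] for level in factor_levels} for site in sites}

# Codes would be added as analysts work through each site report, e.g.:
matrix["Site 1"]["community"].append("partnerships with landlords (helping)")
matrix["Site 1"]["community"].append("low vacancy rate (hindering)")
matrix["Site 3"]["support system"].append("HF training valued (helping)")

# Print the display so cross-site patterns can be scanned row by row.
for site in sites:
    print(site)
    for level in factor_levels:
        cell = "; ".join(matrix[site][level]) or "-"
        print(f"  {level}: {cell}")
```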

6. Findings

6.1. What worked well

6.1.1. Strengths: fidelity evaluation

Overall, 71% of the fidelity scale items were rated higher than 3 on a 4-point scale, indicating a high level of fidelity to the Pathways HF model (see Appendix 2). For all of the domains, scores were skewed, with most scores falling at the positive end of the scale (see Table 1). Fidelity was strongest for the following domains: Separation of Housing and Services (3.90), Service Philosophy (3.60), Housing Choice and Structure (3.59), and Program Structure (3.44). Only the Service Array domain had an average score below 3 (2.84).

Table 1
Scores on the fidelity scale domains by site, averaged across ACT and ICM programs.

Fidelity domain – number of items                 Site 1   Site 2   Site 3   Site 4   Site 5   Average across sites
Housing Choice and Structure – 6 items            3.75     3.33     3.62     3.73     3.52     3.59
Separation of Housing and Services – 7 items      3.86     3.93     3.89     3.83     3.97     3.90
Service Philosophy – 10 items                     3.50     3.62     3.56     3.53     3.78     3.60
Service Array – 8 items                           2.50     2.75     3.34     2.85     2.75     2.84
Program Structure – 6 items a                     3.50     3.38     3.08     3.58     3.67     3.44

a Two items that were not applicable to ICM programs, Team Approach and Peer Specialist, were not included. Scores are averaged across ACT and ICM programs. Each site has one ACT team and one ICM team, except for Site 1, which has only an ACT team, and Site 2, which has one ACT team and two ICM teams. Scores are out of 4, since each item is rated from 1 to 4.
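To show how the site scores in Table 1 could be assembled, the sketch below averages program-level domain scores into site scores and then into the cross-site average, assuming an unweighted mean across each site's programs as described in the table note. The program-level numbers are hypothetical placeholders, not study data.

```python
from statistics import mean

# Sketch of the averaging described in the Table 1 note: a site's domain score is taken
# here as the unweighted mean of its programs' scores (one ACT and one ICM team at most
# sites, ACT only at one site, one ACT and two ICM teams at another).
# The program-level scores below are hypothetical placeholders, not study data.
service_array_scores = {
    "Site A": {"ACT": 2.50},                                # ACT team only
    "Site B": {"ACT": 3.00, "ICM-1": 2.60, "ICM-2": 2.65},  # one ACT, two ICM teams
    "Site C": {"ACT": 3.50, "ICM": 3.20},
}

site_scores = {site: mean(programs.values()) for site, programs in service_array_scores.items()}
for site, score in site_scores.items():
    print(site, round(score, 2))

# The "Average across Sites" column is then the mean of the site-level scores.
print("Average across sites:", round(mean(site_scores.values()), 2))
```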


6.2. Factors that facilitated implementation: qualitative evaluation

6.2.1. Community factors

The strength of existing services in the community facilitated implementation. For example, Winnipeg was described by participants as having many useful resources, including vocational training, food and drop-in programs, all of which benefit the project's participants. A Winnipeg stakeholder commented that "we've just scratched the surface" of the services available within the community.

Participants stated that local partners have been critical in contributing expertise and experience and have increased the project's capacity to integrate within service networks. A long list of different partners has been involved at the various sites. In Toronto, for example, the City of Toronto is the service lead for housing, which facilitated partnerships with a host of city-funded programs. Moreover, participants conceptualized the At Home/Chez Soi project as a bridge, bringing together the host of community services that had previously been operating in a more fragmented manner.

Partnerships with government agencies and departments have enhanced the project's ability to secure access to housing units, mental health and homelessness services, and government income supports. In Vancouver, participants described their collaboration with the Ministry of Social Development, which has led to increased access to services and reduced wait-times.

"I know people who've been on housing lists for over two years, waiting for a unit. . . Then we can get a referral and have them housed within three weeks or a month. It's a dream. I mean, it's the way it should be for people." (Vancouver site report)

At the Toronto site, participants emphasized the importance of forming a partnership with representatives from Ontario Works (OW) and the Ontario Disability Support Program (ODSP) to facilitate timely access to income support. "I figure they have a really good relationship with ODSP, seems you get things done quite efficiently with the ODSP workers" (Toronto site report). Moncton site members spoke of the importance of partnerships with government.

"The first day that this project started it brought in all the . . . partners and so for me that's probably what is the crucial part. . . Right from the beginning of this project, the public servants, the Deputy Ministers, the Assistant Deputy Ministers, were the first ones that partnered. Once you had their approval and a partnership with them, the other doors are opened and still opening." (Moncton site report)

Partnerships with landlords and landlord associations also facilitated implementation. If At Home/Chez Soi had not been able to successfully engage a large number of landlords for the more than 1000 apartments needed across the country, implementation would not have been possible. Participants described the importance of fostering positive relationships between the project staff and landlords. In Montréal, the housing team developed a strong network that included clinicians, consumers, and superintendents. It was noted that among the more than 40 landlords, personal knowledge of mental health and homelessness sometimes played a part in the landlord's decision to participate in the project. In some of the sites, landlord appreciation and education events have been organized.
Participants across sites expressed the value of collaboration with landlords, which can make landlords more likely to consult with service team members to resolve issues, rather than notifying the police or taking steps toward eviction.

". . . it [an eviction] was handled extremely well and everything went very smooth after he moved out, the clean up and everything. Everything was looked after 100%. . . the taste I got in my mouth when it was all said and done was 'hey, this was just great.' Put it this way, thank God he was in the program. . . everything we deal with and the rules and regulations, sometimes it can be a real pain in the butt or whatever and so this was a big advantage." (Moncton site report)

6.2.2. Organizational/team/service factors

Leadership was seen as helping implementation. Site Coordinators, team leaders, and others were cited by participants as possessing skills that facilitated implementation. Effective leaders were those considered to have strong decision-making skills, to provide clear direction, to foster an environment of shared learning and respect amongst staff, to understand the HF model, and to have extensive experience working with various consumer populations. Participants reported the importance of having skilled Coordinators who understand the "big picture."

Having a strong staff team was also described as benefiting implementation. Participants identified the importance of staff with the right combination of technical and interpersonal skills, and noted the value of staff knowledge of mental health and addiction issues, as well as assessment skills, commitment to the project and participants, and openness, respect, and adaptability. Quality Assurance (QA) team key informants confirmed that having staff with a philosophy, values, and skills congruent with the HF model is critical to project implementation. Team diversity was described as fostering cross-team learning and sharing, breaking down hierarchical relationships within teams, and providing participants with expertise and information in a wide variety of areas. Participants also noted team cohesion as a facilitator of project implementation. Activities like structured meetings, formal training, all-team events, and sharing project office space were all considered beneficial to team work. A positive team environment was described as supportive, open, flexible, cooperative, and characterized by trust, mutual understanding, and a shared commitment to HF values.

Another organizational factor that stakeholders identified as facilitating implementation is the project's organizational structure and governance. Participants at the Toronto site described this factor in most detail, stating that project governance structures are critical in defining roles and responsibilities and allowing for collaboration, partnership building, effective communication, and conflict resolution. Toronto stakeholders specifically mentioned the important effect on implementation that the Site Operations Team, Local Advisory Committee, and various work groups had. Montréal stakeholders similarly highlighted governance successes. QA team key informants affirmed how valuable it is to have a clear organizational structure so that staff can understand their roles.

The Pathways HF program philosophy and practice encourages consumer input both individually and collectively. Indeed, participants described the value of partnerships with consumers for program implementation. Participants at several sites described the emergence of peer-driven initiatives that have made important contributions to the implementation of the project. Social get-togethers, peer-support programs for individuals with substance abuse issues, participant-produced newsletters, and participant-led focus groups on eviction prevention are all examples of consumer involvement in project implementation. Participants reported that consumers help their fellow consumers by providing them with information about the services available in the community and, similarly, can act as a resource for staff by offering them insider information about community resources. A Winnipeg key informant made the following comment about consumer involvement: "The experts are the constituents. We learn from them" (Winnipeg site report).


The Toronto site formed a Consumer Caucus, composed of 22 people with lived experience, which is represented on the Site Operations Team, the Local Advisory Committee, and other governance structures (van Draanen et al., 2013). The value of the Caucus is described by participants as significant, having provided the project with "in-house experts" and grounded the project in the consumer perspective. A final observation made by stakeholders at a number of sites was that implementation has been facilitated by having a program team composed of some individuals with lived experience. Service-providers in Montréal, for example, reported that the peer support worker is well-integrated into the ACT team. The involvement of these staff members is seen to benefit the implementation of the project by facilitating the engagement of consumers and the establishment of positive consumer-staff relationships.

6.2.3. Characteristics of the innovation

Site-specific programs that address the needs of racialized groups were believed to help implementation. Participants at sites with programs specifically designed to account for the needs of racialized participant groups noted how important this type of flexible adaptation of the HF model is to the success of the project. In Toronto, there is an Ethnoracial ICM model in place, which makes use of an Anti-Racist/Anti-Oppression framework, and in Winnipeg, the Aboriginal Lens Committee provides an Aboriginal perspective to the services offered. The ICM program in Winnipeg is We Chi Win, which means "walk with me."

"Our philosophy is, when they come through our doors, we have to start walking with them, wherever they're going, and start where they are. And walk with them in their new neighborhoods, their new homes. Even if they go to jail, we walk with them." (Winnipeg site report)

6.2.4. Training and technical assistance provided by the MHCC

Participants considered the MHCC to be responsive, fair, generous, and supportive. Instances of communication between sites and the MHCC were characterized as positive and productive. Resources provided by the MHCC that were thought to facilitate implementation included funding, practical guidance, and training opportunities, which were described as being ongoing, in-depth, and relevant to the treatment of a wide variety of mental health issues. QA team key informants commented on the usefulness of having international experts who were able to contribute their expertise to training program staff both at national meetings and by spending time at each site:

". . .We got lots of training, lots of money, nobody has that kind of [money and resources], the opportunities that they've had around ability to go to conferences, ability to learn together, ability to attend courses, they had a whole month of orientation nobody has had." (Toronto site report)

6.3. Challenges to implementation

6.3.1. Challenges: fidelity evaluation

Several challenges to fidelity were noted across the five sites (see Appendix 2). Housing availability was a problem in all five sites (average score of 2.2 out of 4), and housing choice was a challenge in two of the sites (scores of 3 for these two sites). In the domain of Service Philosophy, person-centered planning (2.7) and motivational interviewing (2.9) were challenges for several sites, and assertive engagement was a challenge for two of the sites (scores of 3 for these two sites). For the Program Structure domain, consumer representation in program operations and policy was a challenge at all of the sites (1.9).


The majority of challenges to fidelity were observed in the Service Array domain (average score of 2.8 out of 4). However, this was not uniform across sites. Four of the sites had an average score below 3 for the Service Array domain, but one site, which is well known for having a relatively long-standing and rich service environment for people with mental illness, had an average score of 3.3. The items of substance abuse treatment (2.8), employment and educational services (2.5), nursing/medical care (2.9), social integration (2.9), 24-h coverage (3.0), and staff involvement regarding discharge from inpatient treatment (2.5) were challenges for the majority of sites (see Appendix 2). Moreover, as is shown in Table 2, these Service Array items were more of a problem for the ICM teams (2.9) than for the ACT teams (3.2). This is likely because ICM teams have higher caseloads and have to rely on accessing an array of services in the community that they can broker for consumers.

Table 2
Scores on the fidelity scale domains and items by ACT and ICM programs.

Fidelity domains and items                                                 ACT (n = 5)    ICM (n = 5)
Housing Choice and Structure                                               3.6 (1–4)      3.5 (1–4)
  Housing Choice                                                           3.4 (3–4)      3.1 (3–3.5)
  Housing Availability                                                     2.0 (1–3)      1.8 (1–3)
  Permanent Housing Tenure                                                 4.0 (4–4)      4.0 (4–4)
  Affordable Housing                                                       4.0 (4–4)      4.0 (4–4)
  Integrated Housing                                                       4.0 (4–4)      4.0 (4–4)
  Privacy                                                                  3.9 (3.5–4)    4.0 (4–4)
Separation of Housing and Services                                         3.9 (3–4)      3.9 (3–4)
  No Housing Readiness                                                     3.9 (3.5–4)    4.0 (4–4)
  No Program Contingencies of Tenancy                                      4.0 (4–4)      4.0 (4–4)
  Standard Tenant Agreement                                                3.9 (3.5–4)    3.8 (3–4)
  Commitment to Re-house                                                   3.9 (3.5–4)    3.8 (3–4)
  Services Continue Through Housing Loss                                   4.0 (4–4)      4.0 (4–4)
  Off-site Services                                                        4.0 (4–4)      4.0 (4–4)
  Mobile Services                                                          3.6 (3–4)      3.7 (3–4)
Service Philosophy                                                         3.7 (2–4)      3.5 (1–4)
  Service Choice                                                           4.0 (4–4)      3.8 (3–4)
  No Requirements for Participation in Psychiatric Treatment               4.0 (4–4)      4.0 (4–4)
  No Requirements for Participation in Substance Use Treatment             4.0 (4–4)      4.0 (4–4)
  Harm Reduction Approach                                                  3.9 (3.5–4)    3.5 (3–4)
  Motivational Interviewing                                                3.3 (2–4)      2.8 (2–4)
  Assertive Engagement                                                     3.4 (2–4)      3.2 (2.5–4)
  Absence of Coercion                                                      3.9 (3.5–4)    3.8 (3–4)
  Person-centered Planning                                                 3.0 (2–4)      2.6 (1–4)
  Interventions Target a Broad Range of Life Goals                         3.6 (3–4)      3.5 (2.5–4)
  Participant Self-Determination and Independence                          3.6 (3–4)      3.6 (3–4)
Service Array                                                              3.2 (1–4)      2.9 (1–4)
  Housing Support                                                          4.0 (4–4)      4.0 (4–4)
  Psychiatric Services                                                     4.0 (4–4)      2.2 (1–3)
  Substance Abuse Treatment                                                2.8 (2–4)      3.0 (2–4)
  Employment and Educational Services                                      2.2 (1–3)      2.6 (2–4)
  Nursing/Medical Services                                                 3.4 (3–4)      2.4 (1–3)
  Social Integration                                                       3.0 (2–4)      2.8 (2–3)
  24-h Coverage                                                            3.0 (2–4)      3.0 (3–3)
  Involved in In-patient Treatment                                         3.3 (2–4)      3.3 (3–4)
Program Structure                                                          3.5 (1–4)      3.3 (1–4)
  Priority Enrollment for Individuals with Obstacles to Housing Stability  4.0 (4–4)      4.0 (4–4)
  Contact with Participants                                                3.6 (2–4)      2.4 (1–4)
  Low Participant/Staff Ratio                                              4.0 (4–4)      4.0 (4–4)
  Team Approach a                                                          3.8 (3–4)      –
  Frequent Meetings                                                        4.0 (4–4)      4.0 (4–4)
  Weekly Meeting/Case Review                                               3.2 (2–4)      3.6 (2–4)
  Peer Specialist on Staff a                                               3.8 (3–4)      –
  Participant Representation in Program                                    1.8 (1–2)      2.0 (2–2)

a ICM programs do not use a team approach, nor do they have peer specialists on staff. These two items were not included in the total Program Structure score. Scores are averaged across sites, so that five ACT and five ICM programs are included. Ranges for each item are included in parentheses.


6.4. Factors that hindered implementation: qualitative evaluation

6.4.1. Community factors

A universal barrier to program implementation across the sites was the lack of affordable and available housing. For example, Winnipeg reported difficulties housing participants, with some individuals waiting up to five months for housing. Toronto program staff cited the scarcity of affordable housing units, particularly in downtown Toronto, as hindering program implementation. At the same time, this barrier could sometimes be overcome by various strategies developed by site housing procurement teams.

A lack of public transportation was also viewed as a significant barrier for Moncton consumers to get to appointments to receive health and social services, or to visit the food bank. Transportation challenges made it difficult for consumers to maintain relationships with family and friends, leading to social isolation. In Montréal, many housed participants reported not having telephones. This resulted in communication difficulties, missed visits from service-providers, and increased isolation.

Challenges related to partnerships with landlords were also noted. While landlord relationships were reported as being generally positive across sites, there were issues that hindered program implementation. First, stigma and racism from some landlords were reported as an issue at the sites. For example, the Winnipeg report noted that with low vacancy rates, landlords can select renters, avoiding Aboriginal people with mental illness and addictions. Similar instances of racism from landlords were noted in Toronto. In Moncton, some landlords perceived tenants from the program as being different because of their histories of homelessness.

"What I see coming in, it's people with needs, like the housing, people coming up from the street, they're homeless, so they are brought into the apartment and you can [see] how they dress, their appearance. . ." (Moncton site report)

Second, program staff have had to work hard to sustain and repair relationships with landlords, particularly after they have had to evict some tenants. While Winnipeg staff have tried to keep landlords satisfied and engaged in the program, it has not been possible to keep all of the "guarantees" which initially attracted landlords to the program, such as consistent visits with participants or prompt repairs to damaged units. This, coupled with the problems they have had with some participants, has caused several landlords to leave the program in recent months, "taking their units with them."

6.4.2. Organizational/team/service factors

Stakeholders at the Winnipeg, Toronto, and Montréal sites noted a lack of contact and cohesion between teams (i.e., ACT, ICM, and housing teams) as a factor that hindered implementation. For example, Toronto staff expressed frustration with the fact that the housing and support services teams occupy different sites: ". . . I think the real challenge is that the housing folks are not embedded with us, and I think that that's a real concern. We missed an important opportunity that we didn't realize we were missing" (Toronto site report). Similarly, Winnipeg staff noted significant challenges as a result of landlords, service teams, and research partners being separated and compartmentalized.
Toronto service-providers also described a lack of collaboration between ACT and ICM teams, with some informants reporting early tensions between clinical teams due to competition for housing and the perception that teams were being compared to one another on project outcomes.

Staff workload issues and caseload size were expressed as a major concern at each program site. For example, the required travel time between staff offices and participants' homes was deemed a significant burden on service teams in all of the sites. With housing spread out across the cities, providing follow-up service visits involved a great deal of travel time (upwards of two hours per hour-long house call). "You can spend hours finding one participant" (Winnipeg site report). Participants in Moncton and Toronto emphasized the importance of self-care and well-being of staff, suggesting that measures need to be taken to ensure that service-providers work regular hours and have opportunities to discuss their concerns. A general consensus among Winnipeg informants was that all teams found the workload light at the beginning of implementation, but that it then became heavy, and at times excessive, as the number of participants increased. Montréal ACT and ICM teams struggled with how to reconcile a recovery approach with heavy workload demands. Confronted with numerous tasks, service-providers had to build trust with participants and offer them intensive accompaniment at the same time as they were dealing with team reorganization and considerable staff turnover. Vancouver and Toronto staff spoke of the immense pressures placed on them to handle the rate of intake of new consumers while simultaneously finding housing for participants, supporting existing consumers, and helping them to maintain their housing. ACT and ICM team members in Vancouver described the difficulty of not having established program protocols prior to implementation. In Winnipeg, the large number of participants who abused solvents required adaptation on the part of the service team, which was initially not equipped to deal with this specific challenge.

In general, the first year of operations is the most difficult phase of HF program implementation. This was further compounded in this initiative by the fact that implementation was embedded within a research demonstration project, which placed enormous pressure on teams to enroll many participants rapidly so that they could participate for the full duration of the study. In this sense, some of the implementation issues may not be representative of regular clinical admissions, and some of the issues may be attributable to the design of the large-scale research project.

Participants across sites noted that the diversity of consumer needs and functioning has made program implementation difficult. For example, service-providers in Vancouver noted that there has not been enough discussion of how to assist people who are not doing well in the program.

"What is it like to come inside after being outside for so long? Because we're recognizing that that's a huge issue for people, the change from homelessness to housed. It's an identity change. It's a lifestyle change. It's not just. . . somebody wants housing and you get them housing and their problems are solved. It's half the battle, but there's also the head stuff that has to happen too." (Vancouver site report)

Participants also expressed concerns that some important service programs are currently lacking (e.g., addictions treatment, vocational and educational support). In Winnipeg, consumers were concerned about the lack of after-hours staff support available to assist them with crisis situations and landlord issues. While partnerships with consumers were viewed as something that helped implementation, consumer involvement remains an issue at some of the sites. In Montréal, participants were unaware of the activities of the Conseil ex-pairs, a consumer group involved in the project, and regretted the general lack of communication about the role of peers in the project.

Organizational structure and governance was also noted as a challenge. In some cases, it was not clear who does what and "who works for whom." The lack of a clear governance model sometimes led to delays in obtaining housing for consumers and a "lack of cohesion and contact" among clinical and housing teams.


6.4.3. Characteristics of the innovation

Although programs designed to account for the needs of racialized participants were cited as a major resource for participants, there were also some unique implementation challenges related to addressing the needs of racialized groups. In Winnipeg, social isolation was cited as a major challenge for consumers of Aboriginal backgrounds, for whom "home" usually meant living with or very close to extended family. Some Winnipeg staff suggested that the project should offer more congregate-style housing units to address this cultural issue, so that the choices of some consumers could be respected and met. Toronto cited unique challenges of hiring and training culturally competent staff to accommodate the needs of Aboriginal participants in the program. Participants in Toronto also expressed difficulties meeting the cultural and linguistic requirements of their diverse population.

6.4.4. Training and technical assistance provided by the MHCC

There were mixed views on MHCC training. Service-providers in Winnipeg, for example, believed that more sensitivity training was needed (e.g., on Aboriginal communication styles and body language), as well as training on how to deal with persons at risk for suicidal behavior. Housing staff pointed out that the site was not prepared for the high volume of consumers moving out and needing new housing arrangements. Staff suggested that there could have been more preparation or training regarding this possibility. QA team key informants addressed this issue, stating that a large challenge to implementation was the unevenness in expertise of the teams at different sites, with some sites requiring more technical assistance than others. These key informants also described challenges in designing training that is relevant to all sites when the groups are in very different places in terms of "local/social/systemic capacity realities."

7. Discussion

We wanted to know if the Pathways HF approach could be implemented with fidelity in a large-scale RCT in five different cities across Canada. After one year of operation, we found a high level of fidelity to the HF model, with more than 71% of the items across programs and sites demonstrating high fidelity. Fidelity was particularly strong for the domains of Housing Choice and Structure, Separation of Housing and Services, and Service Philosophy. Thus, in spite of some significant adaptations of the model to local context (e.g., incorporation of Aboriginal perspectives and traditions), these data confirm that the At Home/Chez Soi sites were successful in implementing the core principles and functions of the Pathways HF model as defined in the fidelity scale.

What accounts for this high level of fidelity? As posited by Durlak and DuPre (2008), both delivery system factors and support system factors facilitated implementation. Delivery system factors included community capacities (i.e., the strength of existing community services; partnerships with agencies, departments, and landlords), organizational and team capacities (i.e., leadership, staff strengths, staff cohesion, organizational structures, partnerships with consumers), and characteristics of the Pathways HF model (its adaptability to the needs of racialized groups), while support system factors included the training and technical assistance provided by the MHCC. Bond (2009), Durlak and DuPre (2008), and Raghavan et al. (2008) have all noted the importance of systems-level support for the implementation of evidence-based practices. Given that HF is a complex community intervention (Nelson et al., 2013), successful implementation depends heavily on support from a variety of formal and informal community stakeholder organizations (i.e., government, local services, landlords) (Seffrin et al., 2008).


Organizational capacities are also important for implementation. The findings from this study confirm the observations of Bond (2009) that project leadership is particularly critical for successful implementation. Participants noted the importance of Site Coordinators having the ‘‘big picture’’ and being able to bridge stakeholder groups to develop a shared vision. As has been found in research on implementation of other evidence-based programs (McGraw et al., 2010; Rapp et al., 2010; Seffrin et al., 2008), staff competence and cohesion were found to help implementation. The various staff qualities mentioned as important have been referred to by Foster-Fishman, Berkowitz, Lounsbury, Jacobson, and Allen (2001) as member capacity (e.g., knowledge, practice skills) and relational capacity (e.g., the ability to work with others to develop a positive work climate). The nature of the Pathways HF intervention was also thought be important for implementation, particularly the flexibility of being able to adapt the model to the local cultural context (Stergiopoulos et al., 2012). The training and technical assistance provided by the MHCC was an important factor in promoting implementation of the HF model from the outset, as has been found in previous research (Seffrin et al., 2008). Participants noted the value of having experienced staff from the New York Pathways program and the Toronto Streets to Home program to teach them HF practice skills and assist them with program implementation, as well as the establishment of internal communities of practice. We also wanted to know about the challenges that were experienced in implementation. One challenge that was experienced across the sites was housing availability, which was benchmarked by the percentage of participants who moved into housing within six weeks. The most frequently cited reason for this problem was low vacancy rates and insufficient supply of affordable housing. Gaetz (2010) noted this problem and posed the question ‘‘How effective can Housing First be if there is no affordable housing to move people into?’’ (p. 25). This problem was compounded by challenges working with landlords who may prefer to rent to people who do not have mental health or addictions issues or who are mainstream Canadians. Stigma and racism were noted as problems that were experienced with some landlords, particularly in sites with high percentages of Aboriginal and ethno-racial populations. Although these challenges created delays in housing, housing outcomes at one year are generally very good across the sites (Goering et al., 2012), and many landlords became strong supporters of the At Home/Chez Soi project. Another challenge across sites was the extent to which participants are represented in program operations and have input into policy. While ratings on this item were low, this does not mean that consumers did not participate in the HF program. At each of the sites and at the national level, consumers were involved in serviceprovider roles, research roles, specific consumer-led initiatives (e.g., a photovoice project), and other groups (e.g., the Toronto Consumer Caucus). However, there were fewer opportunities for involvement or governance within each of the programs themselves. Ratings of items in the Service Array domain also indicated challenges. Furthermore, we found that these challenges were more apparent for ICM teams than ACT teams, where staff must broker services from other agencies. 
While ACT was the initial service modality in HF (Tsemberis et al., 2004), clinical supports have since been adapted to include ICM services in order to serve individuals with more moderate needs. Using ICM as the clinical model may pose challenges in achieving fidelity to the original model, or it may indicate that the ICM scale ratings are too stringent. Future results examining the degree to which ICM fidelity predicts outcomes may provide some insight into this issue. Related to this is the fact that ICM staff found that many of the "moderate needs" participants in the ICM program actually had a high level of needs that demanded both considerable staff time and
a diverse array of staff skills and services. In response to such feedback, the MHCC modified the staff-to-consumer ratio for ICM from 1:20 to 1:16, while the ACT ratio remained at 1:10 throughout the project.

Another factor that may account for challenges in the Service Array domain is that the cities varied widely in the other community mental health services upon which they could draw. Some cities had a service-rich environment, while others were new to these types of community mental health programs. Staff workload issues, which were exacerbated by a continuously expanding number of participants during the recruitment phase and by the need to rehouse participants who were experiencing housing challenges, hampered the programs' ability to provide a full array of services to participants. A lack of communication between service teams at some sites and the need for training on interventions for addictions, vocational planning, solvent use, suicide, and culturally different populations were other challenges experienced during early implementation. While Rapp et al. (2010) found considerable staff resistance to innovation in their study of implementation, this study found more problems with staff being overwhelmed by the many demands of working with high-needs participants in a complex system with varying resources to assist them.

8. Limitations

We did not obtain independent ratings of each of the fidelity items from QA team members for each of the site visits. In future research, it would be valuable to gather independent ratings and compute inter-rater reliability for the items. Another limitation is that the fidelity data and qualitative data reported here were gathered at just one point in time. We are in the process of gathering and analyzing follow-up fidelity and qualitative data on the implementation of At Home/Chez Soi during the second year of program operation. It will be useful to compare these data to understand fidelity and implementation strengths and challenges earlier and later in the programs' operation, as such strengths and challenges may change over time. One final limitation is that we did not focus the qualitative evaluation specifically on the fidelity strengths and challenges. Future research would benefit from using a sequential mixed methods approach in which the qualitative data are meshed more intentionally with the fidelity data (Creswell & Plano Clark, 2011).
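As an illustration of the kind of inter-rater analysis suggested above, the sketch below computes a quadratic-weighted Cohen's kappa for two raters scoring the same fidelity items on a 4-point scale. The ratings, the number of items, and the choice of a quadratic weighting scheme are illustrative assumptions for this example, not data or procedures from the study.

```python
import numpy as np

def quadratic_weighted_kappa(rater_a, rater_b, n_categories=4):
    """Quadratic-weighted Cohen's kappa for two raters scoring the same items
    on an ordinal 1..n_categories scale (here, a 4-point fidelity scale)."""
    a = np.asarray(rater_a) - 1                         # shift ratings to 0-based categories
    b = np.asarray(rater_b) - 1
    observed = np.zeros((n_categories, n_categories))   # joint distribution of rating pairs
    for i, j in zip(a, b):
        observed[i, j] += 1
    observed /= observed.sum()
    # Expected joint distribution if the two raters were statistically independent
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    # Quadratic disagreement weights: zero on the diagonal, larger for ratings far apart
    idx = np.arange(n_categories)
    weights = (idx[:, None] - idx[None, :]) ** 2
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical ratings from two QA team members on ten fidelity items (1-4 scale).
rater_1 = [4, 3, 4, 2, 4, 3, 4, 4, 3, 2]
rater_2 = [4, 3, 3, 2, 4, 4, 4, 3, 3, 2]
print(f"Quadratic-weighted kappa: {quadratic_weighted_kappa(rater_1, rater_2):.2f}")
```

A weighted statistic is used here because the fidelity items are ordinal, so a disagreement of one scale point should count less heavily than a disagreement of three; an unweighted kappa or an intraclass correlation would be reasonable alternatives.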

9. Lessons learned

Through the mixed methods implementation evaluation, we gleaned a number of lessons, relating both to the findings and to the process itself. First, the quantitative fidelity assessment demonstrated that it is feasible to achieve a high level of implementation of the Pathways HF model within a Canadian healthcare delivery context. The initiative successfully implemented most of the critical ingredients of the model in all sites and teams. The assessment was also valuable for pinpointing areas needing improvement, such as the timeliness of housing procurement, adjusting caseloads for ICM, and expanding the array of services. Given the demonstrated link between implementation and outcome (Durlak & DuPre, 2008), the lesson learned is that a fidelity assessment is clearly important for assuring confidence that the model is being implemented as planned and for identifying trouble spots where implementation can be improved.

Moreover, we believe that the creation and use of a fidelity assessment tool in this study is an important contribution to the conceptualization and dissemination of the supported housing HF model, an increasingly popular approach to addressing homelessness among people with mental illness throughout North America and Europe. Tabol et al. (2010) noted significant variability in the implementation of the critical elements of supported housing programs and recommended a fidelity tool to address this notable program development problem. We consider this study an important step toward resolving this issue.

Another lesson learned was how a parallel qualitative evaluation can be of value in uncovering the reasons underlying both implementation strengths and weaknesses. In our case, it did so by helping us identify key implementation drivers related to the innovation itself, the surrounding delivery system, and the implementation support system. By conceptualizing HF as a complex community intervention (Hawe et al., 2004), we were able to emphasize the value of adapting HF to local circumstances, which was acknowledged as a key factor affecting its successful uptake. While the extensive array of supports called for could be a challenge to implement, especially for the ICM teams, which rely on referrals to meet the often complex needs of their participants, the emphasis on adaptation has helped the project conceptualize ways of "capturing" additional key resources (e.g., psychiatric or addictions consultations).

In relation to the delivery system, the qualitative evaluation emphasized the importance of hiring staff whose technical and interpersonal skills and personal values are congruent with the HF model. The evaluation also underscored the importance of leadership for implementing a coherent approach to the model. It showed that where a lack of clarity existed, implementation could suffer. For example, ambiguity in how the housing and clinical teams were governed meant that the teams did not take a coherent approach to procuring housing for participants in a timely manner.

With respect to the technical support system, the evaluation indicated that the training and technical assistance strategy, as well as the Site Coordinators put in place at the outset of the project, were crucial to successful implementation. While the fidelity visits themselves could be perceived as anxiety-producing "tests," their value as a tool for quality improvement as implementation proceeds was acknowledged. With clear quantitative information about where strengths and weaknesses lie, and rich qualitative findings indicating how these can be addressed, our mixed methods implementation evaluation has helped the project move toward a rigorous and contextually relevant implementation of the HF model.
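To make the "pinpointing" use of the fidelity ratings concrete, the short sketch below summarizes hypothetical 4-point item ratings by domain and flags domains in which the share of items rated above 3 falls below a chosen benchmark. The domain names, the ratings, and the 75% benchmark are illustrative assumptions, not the project's actual scoring rules or data.

```python
from statistics import mean

# Hypothetical fidelity item ratings (1-4 scale) grouped by domain;
# both the domain names and the numbers are invented for illustration.
ratings_by_domain = {
    "Housing Process and Structure": [4, 4, 3, 4, 2],
    "Service Philosophy":            [4, 3, 4, 4, 4],
    "Service Array":                 [3, 2, 3, 2, 3],
}

BENCHMARK = 0.75  # illustrative threshold: share of items expected to score above 3

for domain, items in ratings_by_domain.items():
    share_high = sum(rating > 3 for rating in items) / len(items)
    status = "needs improvement" if share_high < BENCHMARK else "on track"
    print(f"{domain}: mean = {mean(items):.2f}, "
          f"items rated above 3 = {share_high:.0%} -> {status}")
```

A summary of this kind can be produced after each fidelity visit so that domains falling short of the benchmark become explicit targets for training and technical assistance.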

Acknowledgments

We thank Jayne Barker, Ph.D. (2008–2011), and Cameron Keller (2011–present), Mental Health Commission of Canada At Home/Chez Soi national project leads, the national qualitative research team, the five qualitative site research teams, the Site Coordinators, and the numerous service and housing providers, as well as persons with lived experience, who have contributed to this project and the research. This research has been made possible through a financial contribution from Health Canada. The views expressed herein are solely those of the authors.

Appendix A. Supplementary data

Supplementary data associated with this article can be found, in the online version, at http://dx.doi.org/10.1016/j.evalprogplan.2013.10.004.

References

Aubry, T., Ecker, J., & Jetté, J. (2013). Supported housing as a promising Housing First approach for people with severe and persistent mental illness. In M. Guirguis, R. McNeil, & S. Hwang (Eds.), Homelessness and health (in press).

Blakely, C. H., Mayer, J. P., Gottschalk, R. G., Schmitt, N., Davidson, W. S., Roitman, D. B., et al. (1987). The fidelity-adaptation debate: Implications for the implementation of public sector social programs. American Journal of Community Psychology, 15, 253–268.

Bond, G. R. (2009). Deciding versus implementing: A comment on "What gets noticed: How barrier and facilitator perceptions relate to the adoption and implementation of innovative mental health practices." Community Mental Health Journal, 45, 270–271. http://dx.doi.org/10.1007/s10597-009-9190-y

Bond, G. R., McHugo, G. J., Becker, D. R., Rapp, C. A., & Whitley, R. (2008). Fidelity of supported employment: Lessons learned from the National Evidence-based Practice Project. Psychiatric Rehabilitation Journal, 31, 300–305. http://dx.doi.org/10.2975/31.4.2008.300.305

Calgary Homeless Foundation. (2012). The state of homelessness in Calgary in 2012: Preliminary report. Calgary: Author. Available from the Homeless Hub http://www.homelesshub.ca/.

Carling, P. J. (1995). Return to community: Building support systems for people with psychiatric disabilities. New York: The Guilford Press.

Chandler, D. W. (2011). Fidelity and outcomes in six integrated dual diagnosis treatment programs. Community Mental Health Journal, 47, 82–89. http://dx.doi.org/10.1007/s10597-009-9245-0

Charmaz, K. (2007). Constructing grounded theory. Thousand Oaks, CA: Sage.

Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA: Sage.

Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors that influence implementation. American Journal of Community Psychology, 41, 327–350. http://dx.doi.org/10.1007/s10464-008-9165-0

Foster-Fishman, P. G., Berkowitz, S. L., Lounsbury, D. W., Jacobson, S., & Allen, N. A. (2001). Building collaborative capacity in community coalitions: A review and integrative framework. American Journal of Community Psychology, 29, 241–261.

Frankish, C. J., Hwang, S. W., & Quantz, D. (2009). The relationship between homelessness and health: An overview of research in Canada. In J. D. Hulchanski, P. Campsie, S. Chau, S. W. Hwang, & E. Paradis (Eds.), Finding home: Policy options for addressing homelessness in Canada (e-book), Chapter 2.1. Toronto: Cities Centre, University of Toronto. Available at www.homelesshub.ca/FindingHome.

Gaetz, S. (2010). The struggle to end homelessness in Canada: How we created the crisis, and how we can end it. Open Health Services and Policy Journal, 3, 21–26.

Goering, P. N., Streiner, D. L., Adair, C., Aubry, T., Barker, J., Distasio, J., et al. (2011). The At Home/Chez Soi trial protocol: A pragmatic, multi-site, randomized controlled trial of Housing First in five Canadian cities. BMJ Open, 1–18. Retrieved from http://bmjopen.bmj.com/content/1/2/e000323.full

Goering, P., Tolomiczenko, G., Sheldon, T., Boydell, K., & Wasylenki, D. (2002). Characteristics of persons who are homeless for the first time. Psychiatric Services, 53, 1472–1474.

Goering, P., Veldhuizen, S., Watson, A., Adair, C., Kopp, B., Latimer, E., et al. (2012, September). At Home/Chez Soi interim report. Calgary: Mental Health Commission of Canada. Available from www.mentalhealthcomission.ca.

Hawe, P., Shiell, A., & Riley, T. (2004). Complex interventions: How "out of control" can a randomised controlled trial be? British Medical Journal, 328, 1561–1563.

Johnson, G., Parkinson, S., & Parsell, C. (2013). Policy shift or program drift? Implementing Housing First in Australia (AHURI Final Report No. 184). Melbourne: Australian Housing and Urban Research Institute.

Keller, C., Goering, P., Hume, C., Macnaughton, E., O'Campo, P., Sarang, A., et al. (2013). Initial implementation of Housing First in five Canadian cities: How do you make the shoe fit, when one size does not fit all? American Journal of Psychiatric Rehabilitation (in press).

Lincoln, Y. S., & Guba, E. (1986). But is it rigorous? Trustworthiness and authenticity in naturalistic evaluation. New Directions for Program Evaluation, 30, 73–84. San Francisco: Jossey-Bass.

Macnaughton, E., Goering, P., & Nelson, G. (2012). Exploring the value of mixed methods within the At Home/Chez Soi Housing First Project: A strategy to evaluate the implementation of a complex population health intervention for people with mental illness who have been homeless. Canadian Journal of Public Health, 103(Suppl. 1), S57–S62.

Macnaughton, E., Nelson, G., & Goering, P. (2013). Bringing politics and evidence together: Policy entrepreneurship and the conception of the At Home/Chez Soi Housing First initiative for addressing homelessness and mental illness in Canada. Social Science and Medicine, 82, 100–107.

Mancini, A. D., Moser, L. L., Whitley, R., McHugo, G. J., Bond, G. R., Finnerty, M. T., et al. (2009). Assertive community treatment: Facilitators and barriers to implementation in routine mental health settings. Psychiatric Services, 60, 189–195. http://dx.doi.org/10.1176/appi.ps.60.2.189

Mares, A. S., & Rosenheck, R. A. (2011). A comparison of treatment outcomes among chronically homeless adults receiving comprehensive housing and health care services versus usual local care. Administration and Policy in Mental Health, 38, 459–478. http://dx.doi.org/10.1007/s10488-011-0333-4

McHugo, G. J., Bebout, R. R., Harris, M., Cleghorn, S., Herring, G., Xie, H., et al. (2004). A randomized controlled trial of integrated versus parallel housing services for homeless adults with severe mental illness. Schizophrenia Bulletin, 30, 969–982.

McGraw, S. A., Larson, M. J., Foster, S. E., Kresky-Wolff, M., Botelho, E. M., Elstad, E. A., et al. (2010). Adopting best practices: Lessons learned from the Collaborative Initiative to Help End Chronic Homelessness (CICH). Journal of Behavioral Health Services and Research, 37, 197–212.

Miles, M. B., & Huberman, M. A. (1994). Qualitative data analysis: A sourcebook for new methods (2nd ed.). Thousand Oaks, CA: Sage.


Minnery, J., & Greenhalgh, E. (2007). Approaches to homelessness policy in Europe, the United States, and Australia. Journal of Social Issues, 63, 641–655.

Monroe-DeVita, M., Teague, G. B., & Moser, L. L. (2011). The TMACT: A new tool for measuring fidelity to Assertive Community Treatment. Journal of the American Psychiatric Nurses Association, 17, 17–29.

Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation, 24, 315–340.

Nelson, G., Macnaughton, E., Goering, P., Piat, M., Dudley, M., O'Campo, P., et al. (2013). Planning a multi-site complex intervention for people with lived experience of mental illness and homelessness: The relationships between the national team and local sites in Canada's At Home/Chez Soi project. American Journal of Community Psychology, 51, 347–358.

Palinkas, L. A., Aarons, G. A., Horwitz, S., Chamberlain, P., Hurlburt, M., & Landsverk, J. (2011). Mixed method designs in implementation research. Administration and Policy in Mental Health, 38, 44–53. http://dx.doi.org/10.1007/s10488-010-0314-z

Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.

Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York: Guilford Press.

Pleace, N., & Bretherton, J. (2012). What do we mean by Housing First? Categorising and critically assessing the Housing First movement from a European perspective. Paper presented at the European Network for Housing Research.

Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., et al. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38, 65–76. http://dx.doi.org/10.1007/s10488-010-0319-7

Raghavan, R., Bright, C. L., & Shadoin, A. L. (2008). Towards a policy ecology of implementation of evidence-based practices in mental health settings. Implementation Science, 3, 26. http://dx.doi.org/10.1186/1748-5908-3-26

Rapp, C. A., Etzel-Wise, D., Marty, D., Coffman, M., Carlson, L., Asher, D., et al. (2010). Barriers to evidence-based practice implementation: Results of a qualitative study. Community Mental Health Journal, 46, 112–118. http://dx.doi.org/10.1007/s10597-009-9238-z

Rog, D. J., & Randolph, F. L. (2002). A multisite evaluation of supported housing: Lessons learned from cross-site collaboration. New Directions for Evaluation, 94, 61–72.

Seffrin, B., Panzano, P. C., & Roth, D. (2008). What gets noticed: How barrier and facilitator perceptions relate to the adoption and implementation of innovative mental health practices. Community Mental Health Journal, 44, 475–484. http://dx.doi.org/10.1007/s10597-008-9151-0

Stake, R. E. (2005). Qualitative case studies. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd ed., pp. 443–466). Thousand Oaks, CA: Sage.

Stefancic, A., Tsemberis, S., Messeri, P., & Drake, R. E. (2013). The Pathways Housing First Fidelity Scale for individuals with psychiatric disabilities. American Journal of Psychiatric Rehabilitation (in press).

Stergiopoulos, V., O'Campo, P., Gozdzik, A., Jeyaratnam, J., Corneau, S., Sarang, A., et al. (2012). Moving from rhetoric to reality: Housing First for homeless individuals with mental illness from ethno-racial groups. BMC Health Services Research, 12, 345. Available at http://www.biomedcentral.com/1472-6963/12/345.

Substance Abuse and Mental Health Services Administration (SAMHSA). (2010). Permanent supportive housing: Evaluating your program (DHHS Publication No. SMA10-4509). Rockville, MD: Center for Mental Health Services, SAMHSA, U.S. Department of Health and Human Services.

Tabol, C., Drebing, C., & Rosenheck, R. A. (2010). Studies of "supported" and "supportive" housing: A comprehensive review of model descriptions and measurement. Evaluation and Program Planning, 33, 446–456.

Teague, G. B., Bond, G. R., & Drake, R. E. (1998). Program fidelity in assertive community treatment: Development and use of a measure. American Journal of Orthopsychiatry, 68, 216–232.

Torrey, W. C., Bond, G. R., McHugo, G. J., & Swain, K. (2012). Evidence-based practice implementation in community mental health settings: The relative importance of key domains of implementation activity. Administration and Policy in Mental Health and Mental Health Services Research, 39, 353–364. http://dx.doi.org/10.1007/s10488-011-0357-9

Tsemberis, S. (2010). Housing First: The Pathways model to end homelessness for people with mental illness and addiction. Center City, MN: Hazelden.

Tsemberis, S., & Asmussen, S. (1999). From streets to homes: The Pathways to Housing consumer preference supported housing model. Alcoholism Treatment Quarterly, 17, 113–131.

Tsemberis, S., Gulcur, L., & Nakae, M. (2004). Housing First, consumer choice, and harm reduction for homeless individuals with a dual diagnosis. American Journal of Public Health, 94(4), 651–656.

van Draanen, J., Jeyaratnam, J., O'Campo, P., Hwang, S., Harriott, D., Koo, M., & Stergiopoulos, V. (2013). Meaningful inclusion of consumers in research and service delivery. Psychiatric Rehabilitation Journal, 36, 180–186. http://dx.doi.org/10.1037/prj0000014

Williams, V. F., Banks, S. M., Robbins, P. C., Oakley, D., & Dean, J. (2001). Final report on the cross-site evaluation of the collaborative program to prevent homelessness. Delmar, NY: PRA.

Wong, Y.-L. I., Filoromo, M., & Tennille, J. (2007). From principles to practice: A study of implementation of supported housing for psychiatric consumers. Administration and Policy in Mental Health and Mental Health Services Research, 34, 13–28. http://dx.doi.org/10.1007/s10488-006-0058-y


Geoffrey Nelson is Professor of Psychology at Wilfrid Laurier University and Co-lead of Qualitative Research for the At Home/Chez Soi project.

Ana Stefancic is Director of Research at Pathways to Housing and a Ph.D. candidate at the Mailman School of Public Health at Columbia University.

Jennifer Rae is a Ph.D. candidate at the University of Ottawa who served as a research assistant on the Qualitative Research Team for the At Home/Chez Soi project.

Jino Distasio is Director of the Institute of Urban Studies, University of Winnipeg, and is Co-PI for the Winnipeg At Home/Chez Soi site.

Roch Hurtubise is Director of the School of Social Work, Université de Sherbrooke, and is a member of the Qualitative Research Team for the At Home/Chez Soi project.

Greg Townley is an Assistant Professor of Community Psychology at Portland State University and a member of the Qualitative Research Team for the At Home/Chez Soi project.

Michelle Patterson is a Scientist and Adjunct Professor in the Faculty of Health Sciences, Simon Fraser University and a Co-Investigator at the Vancouver site.

Sam Tsemberis is the Founder and CEO of Pathways to Housing, Inc. and is on the faculty of the Department of Psychiatry at Columbia University Medical Center.

Vicky Stergiopoulos is a Scientist at St. Michael’s Hospital, an Associate Professor at the University of Toronto, and a Co-PI for the Toronto At Home/Chez Soi site.

Eric Macnaughton is a post-doctoral fellow with the At Home/Chez Soi project and adjunct faculty with the Adler School of Psychology in Vancouver, BC.

Tim Aubry is Co-Director and Senior Researcher at the Centre for Research on Educational and Community Services at the University of Ottawa and Co-PI for the Moncton At Home/Chez Soi site.

Myra Piat is an Assistant Professor, McGill University, Department of Psychiatry, and Researcher at Douglas Mental Health University Institute.

Paula Goering is Professor, University of Toronto, Affiliate Scientist, Centre for Addiction and Mental Health, and Research Lead for At Home/Chez Soi.
