LSHSS

Tutorial

Data-Based Decision Making in Professional Learning for School Speech-Language Pathologists

Kimberly A. Murzaᵃ and Barbara J. Ehrenᵇ

Purpose: School-based speech-language pathologists (SLPs) are often asked to adopt new policies or practices at the school, district, or state level. Professional learning (PL) opportunities usually accompany these changes but are often delivered in a one-size-fits-all workshop format. The authors challenge school-based SLPs and district leadership to consider PL in a new light, guided by data-based decision making to ultimately improve student outcomes.

Method: This article discusses the research supporting the assessment and delivery of high-quality PL for school professionals, including SLPs, and a specific model for measuring change: the concerns-based adoption model (CBAM; Hall & Hord, 2015). An example of how CBAM was used to examine the adoption process with school-based SLPs in a large school district is provided.

Conclusion: Based on the review of the literature, the current approach to PL experienced by most SLPs is problematic. High-quality PL should target improvement in student outcomes and should be focused, ongoing, supported, and evaluated.

ᵃUniversity of Northern Colorado, Greeley
ᵇUniversity of Central Florida, Orlando
Correspondence to Kimberly A. Murza: [email protected]

Editor: Marilyn Nippold
Associate Editor: LaVae Hoffman
Received November 3, 2014
Revision received February 11, 2015
Accepted April 21, 2015
DOI: 10.1044/2015_LSHSS-14-0101

Disclosure: The authors have declared that no competing interests existed at the time of publication.

Language, Speech, and Hearing Services in Schools • Vol. 46 • 181–193 • July 2015 • Copyright © 2015 American Speech-Language-Hearing Association

School reform and accountability initiatives in American education prompt all educators in K–12 education, including school-based speech-language pathologists (SLPs), to examine their assessment and instructional practices to optimize student success. SLPs, like other educators in schools, are called upon to implement scientifically based practices to support learning for students who struggle, including those with language impairments (e.g., Individuals with Disabilities Education Improvement Act [IDEA], 2004). Further, SLPs who practice in schools have been challenged by their professional association to examine and perhaps redefine their roles and responsibilities in order to make significant contributions to student achievement (American Speech-Language-Hearing Association, 2010). Therefore, change is a constant in the school setting: examples are initiatives such as response to intervention (RTI; also known as multi-tiered systems of support in many places) and the Common Core State Standards (CCSS; National Governors Association Center for Best Practices & Council of Chief State School Officers, 2010) that require a new way of doing business for SLPs in schools.

A major component in facilitating change is professional development, now more commonly called professional learning (PL; Mizell, Hord, Killion, & Hirsh, 2011). SLPs, along with their colleagues, are asked to engage in a variety of activities to learn new practices and procedures. A significant issue is the cost benefit of these activities—that is, whether the SLPs’ personal investment of time and effort and the school system’s investment of financial resources yield sufficient results. Historically, the effectiveness of PL activities in schools has not been measured in terms of implementation by participants and the subsequent impact on students (Guskey, 2000). The definition of success for these activities has most often related to participant satisfaction: Did the participants like the activity? Did they perceive that they learned useful information? Only recently, consistent with new standards for PL advanced by the national professional association dedicated to this work (Learning Forward, 2011), have some school districts begun to look at the outcomes of PL. They are now asking questions such as the following: Do participants actually implement what they learned? Does the implementation of new learning result in a sustained change in practice? Does the change in practice positively affect student outcomes? Given the amount of time SLPs are invited or required to participate in PL and the amount of money that states, districts, service centers, and professional associations spend in offering PL activities, it is important to examine carefully the changes in practices that occur. Because desired change in practice is at the heart of PL, stakeholders need to understand the change process and the role of effective PL within it. Stakeholders include SLPs in schools; school-, district-, and state-level leaders who support SLPs in schools; and the university faculty who provide preservice preparation.

Therefore, the purpose of this article is to discuss an evidence-based method for evaluating PL in schools to inform both designers and consumers in an effort to use human and financial resources wisely. Specifically, we will (a) define important terms used to discuss PL and the change process, (b) explain the nature of high-quality PL in education, (c) discuss the concerns-based adoption model (CBAM; Hall & Hord, 2015) as a data-based system for managing the change process and tailoring PL experiences, and (d) provide an example of CBAM used to evaluate and refine PL in a large school district and the implications of that trial. We hope that this article will promote dialogue about using data to design and deliver student outcome–driven PL for SLPs in schools.

Definitions

Several terms used throughout this article require definition. The following terms are related to the CBAM (Hall & Hord, 2015) that will be described in another section.

• Innovation is the change in practice that educators are asked to adopt (e.g., collaboration with classroom teachers).
• Adoption is the process of implementing a new practice in a consistent manner over time.
• Adopter is an individual who is in the process of implementing a new practice.
• Change facilitator is an individual who leads the adoption process by supporting individuals in the various stages of adopting an innovation.
• Professional learning facilitator refers to an individual who promotes and supports comprehensive, sustained, and intensive in-service education.
• Intervention in this context refers to experiences that facilitate learning of an innovation by professionals. It does not refer to activities with students.

High-Quality PL

Given that PL opportunities are a central means for implementing school reform and changing practices, the gap between what educators, including SLPs, know about effective PL and current practice is strikingly large (Elmore, 2004). So, what do we know? We know that standards for PL based on empirical evidence exist. As mentioned previously, Learning Forward, formerly named the National Staff Development Council (NSDC), has developed the standards listed in Table 1 for high-quality PL that are evidence based (Bergquist, 2006; Jaquith, Mindich, Wei, & Darling-Hammond, 2010; Slabine, 2011; Wei, Darling-Hammond, & Adamson, 2010). The bottom line of these standards is that high-quality PL should be focused, ongoing, supported, and evaluated. PL should not be the oft-occurring single session or “training” without follow-up or evaluation (Graner, Ault, Mellard, & Gingerich, 2012), although this is the typical configuration seen in school districts. In fact, the word training is no longer used by PL specialists to describe the transformational learning activities at the heart of PL.

With the current focus on accountability in education, it can be argued that it is now more important than ever that school leaders demonstrate evidence of the impact of PL on student learning. Just as educators are using student outcome data to make decisions about the effectiveness of teaching methods, these data should also be used to determine the effectiveness of PL initiatives. With specific regard to SLPs, a focus on outcomes is certainly consistent with the practice policy on SLPs’ implementation of evidence-based practice (ASHA, 2005). Learning Forward states its purpose as “ensuring that every educator engages in effective PL every day so every student achieves” (Learning Forward, 2015, para. 2). Ultimately, the aim of any PL experience should be to improve student outcomes. However, the only way to ensure that PL is affecting student outcomes is to evaluate first whether SLPs change their thinking and behavior, because change in practice precedes improvement of student outcomes.

Evaluation of PL

Although the intent of PL in schools is ultimately to improve student outcomes, research suggests that participant satisfaction is largely the only form of evaluation being conducted. For example, Muijs and Lindsay (2008) conducted a survey of 223 PL facilitators and 416 teachers from a randomly selected sample of 1,000 schools. More than 75% of PL coordinators reported that participant satisfaction was evaluated “usually” or “always,” whereas participants’ use of the innovation and student outcomes were consistently evaluated (“usually” or “always”) less than 40% of the time.

A leading contributor in the arena of evaluating PL, Guskey (2005), argued that there are five critical stages of PL that are arranged hierarchically and build on one another. His levels, based on Kirkpatrick’s (1959a, 1959b, 1960a, 1960b) model and listed by complexity beginning with the least complex level, are as follows:

1. Participants’ reactions
2. Participants’ learning
3. Organizational support and change
4. Participants’ use of new knowledge and skills
5. Student learning outcomes

Guskey’s (2005) model was constructed to evaluate each level in sequence, wherein a level must be evaluated and achieved before moving on to the next level. The rationale for this approach is that each level builds on the prior level; therefore, student outcomes cannot be expected to change until each of the four previous levels has succeeded.

Abell et al. (2007) developed a multisite PL evaluation model based on Guskey’s (2005) model. In Abell’s model, data are gathered across sites so that common attributes can be analyzed together for a potentially greater outcome effect. In addition to evaluating the five levels of Guskey’s model through the use of observational data, multiple surveys, interviews, and student data, the authors argued that the content and context of the PL should also be evaluated. To do this, they used individual project profiles to thoroughly describe and analyze the PL activities at each site while also considering overall outcomes of the PL initiatives across sites.

Table 1. Learning Forward standards for professional learning (PL; Learning Forward, 2011).

PL that increases educator effectiveness and results for all students . . .
Learning communities: . . . occurs within learning communities committed to continuous improvement, collective responsibility, and goal alignment.
Resources: . . . requires prioritizing, monitoring, and coordinating resources for educator learning.
Learning designs: . . . integrates theories, research, and models of human learning to achieve its intended outcomes.
Outcomes: . . . aligns its outcomes with educator performance and student curriculum.
Leadership: . . . requires skillful leaders who develop capacity, advocate, and create support systems for PL.
Data: . . . uses a variety of sources and types of student, educator, and system data to plan, assess, and evaluate PL.
Implementation: . . . applies research on change and sustains support for implementation of PL for long-term change.

Note. Reprinted with permission from Learning Forward, © 2015.

Evaluation Tools and Methods

Desimone, Porter, Garet, Suk Yoon, and Birman (2002) completed a 3-year longitudinal survey to evaluate Guskey’s (2005) fourth level of evaluation: participants’ use of new knowledge and skills. They surveyed mathematics and science teachers’ changes in teaching practices before, during, and after PL. They chose states, districts, and schools that used diverse PL approaches. Only those teachers who responded to all three surveys were included in the analysis. A total of 207 teachers met those conditions. The results of the study suggested that there is a link between (a) providing PL focused on content and specific teaching practices and (b) teachers’ adoption and use of those practices. The analyses indicated a number of qualities of PL that appeared to have a positive effect on teachers’ use in the classroom, including (a) teachers from the same school, department, or grade all collectively participating; (b) active learning opportunities; (c) linking to other activities; and (d) reform-type PL. Because this study looked at only one aspect of PL—that is, the teachers’ use of new knowledge and skills—it is impossible to know how the participants thought and felt about the innovation or how their use affected student learning.

Scherz, Bialer, and Eylon (2008) developed an assessment protocol to evaluate teachers’ use of PL in their practice through portfolios. Fourteen teachers, who had at least 16 hours of PL instruction in a particular science program, participated in the study. The teachers developed portfolios to demonstrate evidence of instructional change as a result of implementing the science program. The authors’ assessment protocol consisted of a rubric that rated the teachers’ portfolios in areas similar to those identified by Guskey (2005). The authors of the study concluded that portfolio learning by teachers and evaluation of the portfolios could be an effective way of evaluating PL, although insufficient for completely understanding the process of adopting new practices.

As Guskey (2005) argued, PL is a complex process that requires professionals to change their thoughts, beliefs, and practices. Hall and Hord (2015) suggest that these thoughts, beliefs, and practices of adopters change in a predictable way when they participate in high-quality PL. Thus, change facilitators are wise to assess adoption of an innovation in a way that is aligned with this pattern of adopter transformation. Hanley, Maringe, and Ratcliffe (2008) and Hall and Hord (2015) have developed methods to evaluate PL in a way that considers patterns of adopter transformation. Hanley et al. (2008) divided adoption patterns into four levels: trigger, vision, conversion, and maintenance. All four levels deal with the participants’ reactions and feelings toward the adoption of an innovation at increasing levels of adoption, but the process does not look at participants’ use of the PL in an objective way.

The CBAM

Hall and Hord (2015) also address the idea of PL as a complex process of adopter change but have developed a system with specific tools to evaluate various aspects of the adoption process, called the concerns-based adoption model (CBAM). CBAM has been used widely by educators and change facilitators since the model was first published in 1987 by Hall and Hord.

CBAM includes three key components: Innovation Configurations (IC), Stages of Concern (SoC), and Levels of Use (LoU). The fundamental component of CBAM is the IC, which allows PL facilitators to define clearly the innovation they want educators to adopt. The process of constructing the configuration is called mapping, and the resulting tool is called an IC Map. The IC Map defines critical components of the innovation (see the Appendix for an example of an IC Map). This tool helps developers of innovations, PL facilitators, and implementers to be on the same page about the exact nature of the innovation. Without being clear about the “it” they are being asked to implement, SLPs cannot be sure that their implementation is on target, and evaluators will not know what they are evaluating. For example, suppose a school district wants SLPs to collaborate more with teachers, and it conducts PL activities on “collaboration with teachers.” Unless the district defines exactly what it means by this innovation, not only can confusion result, but the district also will not have an accurate foundation on which to build an evaluation of the PL. Some SLPs may interpret collaboration as talking with teachers about students struggling with literacy, some may think they are collaborating if they help teachers prepare materials, and others may define “collaboration with teachers” as providing services in the classroom. Each group of SLPs may think they have adopted collaboration with teachers because all of the activities described above were discussed in PL activities. However, without an IC Map, the district will be at a loss in evaluating the effectiveness of the PL because the SLPs are not defining collaboration in the same way.

Another component of CBAM is the SoC. This tool allows thoughts, feelings, and perceptions to be measured throughout the process of change. The SoC described by Hall and Hord (2015) could be related to Guskey’s (2005) Levels 1 and 2 of the evaluation process. However, Guskey’s Level 1 relates only to participants’ reactions to the PL activity, whereas Hall and Hord look at the types of concerns that potential adopters have at various stages of the adoption process. It should be noted that the meaning of concerns in this context should not be interpreted as elements that cause potential adopters to fret.
Hall and Hord’s (2015) definition of concern is “the composite representation of the feelings, preoccupation, thought, and consideration given to a particular issue or task” (p. 85). SoC are expressed in seven stages and deal with a variety of concerns: unrelated concerns; self-concerns, including concerns about the information participants learned about the innovation (Guskey’s Level 2); logistical concerns about implementation; concerns about the impact of the innovation on students (Guskey’s Level 5, although not the actual impact); and concerns about collaboration. Table 2 provides the definitions of each SoC as presented in Hall and Hord (2015).

To measure adopters’ SoC, Hall and Loucks (1979) developed a rigorous tool called the SoC Questionnaire, a research-validated instrument (Bailey & Palsha, 1992) with 35 items, as well as open-ended statements and a one-legged interview (a brief interview that can occur anywhere) to analyze adopter concerns. Their premise was that there are predictable stages a person goes through when asked to change. By determining the specific SoC reflected in an adopter’s descriptions, appropriate supports and interventions can be provided to help the adopter progress through the adoption process. For example, returning to the collaboration example used in describing ICs, there might be an SLP who has been overwhelmed with family matters and unable to embrace any professional change initiative. That individual’s response after a year of attending PL on collaboration may be categorized as SoC Stage 2 (Personal). Another SLP in the same district may be trying to collaborate as defined by the district IC Map and is focused on attending to the logistics of scheduling collaboration opportunities, placing that individual at SoC Stage 3 (Management). The PL facilitators in the district would do well to plan activities to differentiate PL for those at Stage 2 and Stage 3 because their learning needs differ.

Table 2. Stages of concern (SoC) about the innovation: Paragraph definitions.

Impact
  6 Refocusing: The focus is on the exploration of more universal benefits from the innovation, including the possibility of major changes or replacement with a more powerful alternative. Individual has definite ideas about alternatives to the proposed or existing form of the innovation.
  5 Collaboration: The focus is on coordination and cooperation with others regarding use of the innovation.
  4 Consequence: Attention focuses on impact of the innovation on “clients” in the immediate sphere of influence.
Task
  3 Management: Attention is focused on the processes and tasks of using the innovation and the best use of information and resources. Issues related to efficiency, organizing, managing, scheduling, and time demands are utmost.
Self
  2 Personal: Individual is uncertain about the demands of the innovation, his or her inadequacy to meet those demands, and his or her role with the innovation. This includes analysis of his or her role in relation to the reward structure of the organization, decision making, and consideration of potential conflicts with existing structures or personal commitment. Financial or status implications of the program for self and colleagues may also be reflected.
  1 Information: A general awareness of the innovation and interest in learning more detail about it is indicated. The person seems to be unworried about himself or herself in relation to the innovation. She or he is interested in substantive aspects of the innovation in a selfless manner, such as general characteristics, effects, and requirements for use.
Unrelated
  0 Unconcerned: Little concern about or involvement with the innovation is indicated. Concern about other thing(s) is more intense.

Note. From Implementing Change: Patterns, Principles, and Potholes (4th ed., pp. 86, 108), by G. E. Hall and S. M. Hord. Copyright © 2015 by Pearson Education, Inc., Upper Saddle River, NJ. Reprinted with permission.

The last component of CBAM evaluates participants’ LoU of the innovation. This component looks at behavior and measures actual implementation, corresponding to Guskey’s (2005) Level 4, evaluation of participants’ use of new knowledge and skills. Based on their research, Loucks, Newlove, and Hall (1975) identified eight LoU. These eight LoU can be subdivided into two categories: nonusers and users. As previously discussed, adopters typically move along a predictable process in their adoption of an innovation. As the LoU increase, so do the adopter’s independence and sophistication of use. Table 3 provides an explanation of the LoU.

Loucks et al. (1975) developed frameworks for interviews to determine adopters’ use, called the LoU Branching Interview and the LoU Focused Interview. Both of these interview procedures allow PL facilitators to gather information about what implementation looks like. It is important to note, however, that ascertaining LoU should be done within the context of the IC. An educator may seem to be implementing an innovation based on the LoU Branching Interview, but it is important to determine whether the innovation is being used as learned or whether the individual is using a significant alteration of the original. In the collaboration-with-teachers example we have been using, an SLP who has been finding more efficient ways to collaborate with teachers, while retaining the essential nature of collaboration as defined by the district, would be rated at LoU IVB (Refinement). That SLP would need very different PL support from that needed by a colleague who is still struggling to find sufficient time to collaborate consistently and who is rated at LoU III (Mechanical Use).
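The branching logic described above can be sketched as a small decision tree. This is an illustrative sketch only: the exact question wording and branch points of the published LoU Branching Interview (Loucks et al., 1975) differ, and the probe names used here (e.g., `using_innovation`, `use_is_stable`) are hypothetical.

```python
# Simplified, hypothetical sketch of decision-tree classification in the
# spirit of the LoU Branching Interview. Not the published protocol.

def classify_lou(a):
    """a: dict mapping interview probes to True/False answers."""
    if not a["using_innovation"]:
        # Nonuser branch: II (Preparation), I (Orientation), or 0 (Nonuse).
        if a["decided_to_use_and_set_date"]:
            return "II (Preparation)"
        return "I (Orientation)" if a["seeking_information"] else "0 (Nonuse)"
    # User branch, probed from most to least sophisticated use.
    if a["exploring_major_alternatives"]:
        return "VI (Renewal)"
    if a["coordinating_with_colleagues"]:
        return "V (Integration)"
    if a["adapting_to_increase_student_impact"]:
        return "IVB (Refinement)"
    return "IVA (Routine)" if a["use_is_stable"] else "III (Mechanical use)"

# An SLP whose use has stabilized, with no current changes planned:
answers = {
    "using_innovation": True,
    "exploring_major_alternatives": False,
    "coordinating_with_colleagues": False,
    "adapting_to_increase_student_impact": False,
    "use_is_stable": True,
}
print(classify_lou(answers))  # -> IVA (Routine)
```

As in the real protocol, every respondent gets the same initial probe, and later questions depend on earlier answers, which is what keeps the interview brief.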

Using CBAM in Data-Based Decision Making for PL

This section provides an example of how CBAM was used to evaluate a PL initiative in a large school district. CBAM was used to examine the adoption process with school-based SLPs in the Southeast. Significant resources (e.g., personnel and material costs, time) had been allocated by the district to provide materials and PL for a specific narrative text structure intervention over the course of several years. However, district SLPs’ thoughts and feelings toward the intervention and their implementation of it were unknown to administrators. To assist district administrators in planning future PL, CBAM tools were used to evaluate the adoption of the innovation (i.e., a narrative text structure intervention). The following questions were posed: (a) What patterns of concerns regarding the innovation are common throughout the district? (b) What levels of implementation of the innovation are evident throughout the district? (c) How should data regarding concerns and use be utilized to make decisions about further PL on the innovation?

Process

The authors worked closely with SLP district administrators who had expressed a desire to understand the needs of school-based SLPs in order to move their implementation of the innovation to the next phase. Upon Institutional Review Board and school district approval, the first author attended meetings of geographical clusters of the district to recruit participants. Participants had the option of completing the SoC Questionnaire only or completing the questionnaire and the LoU Branching Interview.

Participants

Participants were recruited from approximately 200 school-based SLPs in this district. Of those SLPs, 63 completed the SoC Questionnaire and 28 completed the LoU Branching Interview over the phone with the first author.

Table 3. Levels of use of the innovation.

Users
  VI Renewal: State in which the user reevaluates the quality of use of the innovation, seeks major modifications of or alternatives to present innovation to achieve increased impact on clients, examines new developments in the field, and explores new goals for self and the system.
  V Integration: State in which the user is combining his or her own efforts to use the innovation with related activities of colleagues to achieve a collective impact on clients within their common sphere of influence.
  IVB Refinement: State in which the user varies the use of the innovation to increase impact on clients within immediate sphere of influence. Variations are based on knowledge of both short- and long-term consequences for clients.
  IVA Routine: Use of the innovation is stabilized. Few if any changes are being made in ongoing use. Little preparation or thought is being given to improving innovation use or its consequences.
  III Mechanical use: State in which the user focuses most effort on the short-term, day-to-day use of the innovation, with little time for reflection. Changes in use are made more to meet user needs than client needs. The user is primarily engaged in a stepwise attempt to master the tasks required for the innovation, often resulting in disjointed and superficial use.
Nonusers
  II Preparation: State in which the user is preparing for first use of the innovation.
  I Orientation: State in which the user has recently acquired or is acquiring information about the innovation and/or has recently explored or is exploring its value orientation and its demands upon user and user system.
  0 Nonuse: State in which the user has little or no knowledge of the innovation, has no involvement with the innovation, and is doing nothing toward becoming involved.

Note. From Implementing Change: Patterns, Principles, and Potholes (4th ed., pp. 86, 108), by G. E. Hall and S. M. Hord. Copyright © 2015 by Pearson Education, Inc., Upper Saddle River, NJ. Reprinted with permission.


It was important to district administrators and the school-based SLPs to keep the data completely anonymous; thus, very limited data were collected outside of the SoC Questionnaire and LoU Branching Interview. Of the 63 participants who completed the SoC Questionnaire, most SLPs had been involved for 1 (n = 12), 2 (n = 27), or 3 (n = 12) years. The remaining SLPs reported either no previous involvement (n = 9) or involvement for 4 years (n = 3).

Assessment Tools

The SoC Questionnaire (Hall & Loucks, 1979), as described previously, is designed to measure the thoughts and attitudes of PL participants. This tool consists of 35 items designed to identify concerns about the innovation. Respondents rate each item on a scale ranging from 0 (irrelevant) to 7 (very true of me now), which allows evaluators to classify their current SoC within four different categories (see Table 2). This tool has strong internal reliability (α = .64 to .83) and test–retest correlations (r = .65 to .86; Hall, George, & Rutherford, 1979). High construct intercorrelation (72%) suggests that the SoC Questionnaire measures the constructs it was intended to measure (Hall et al., 1979). Researchers have since found the SoC Questionnaire to be a valid tool (Cicchelli & Baecher, 1989; Dennison, 1993; Kirby & Smith, 1998; Long, 1994).

The LoU Branching Interview (Loucks et al., 1975) allows evaluators to quickly gather information about a participant’s implementation of an innovation. The interview protocol uses a decision-tree format: the interviewer asks all participants the same initial question, with follow-up questions determined by the participant’s responses. Participants are assigned an LoU rating (see Table 3) based on their responses. The LoU Branching Interview demonstrates strong interrater reliability (r = .87 to .96; Hall & Loucks, 1977). Further, it was shown to be a valid measure through a comparison of ethnographers’ LoU ratings based on teacher observation and consensus LoU Branching Interview ratings (r = .98; Hall & Loucks, 1977).
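To make the rating-scale idea concrete, the following hypothetical sketch profiles a respondent's concerns from 0–7 item ratings. The real SoC Questionnaire assigns five specific items to each stage and converts raw scores to published percentile norms; the item groupings, the example ratings, and the simple "highest mean stage" shortcut below are assumptions for illustration only.

```python
# Hypothetical sketch of concerns profiling in the spirit of the SoC
# Questionnaire (Hall & Loucks, 1979). The actual instrument scores 35
# items (5 per stage) against percentile norms; this is a simplification.

STAGES = {
    0: "Unconcerned", 1: "Informational", 2: "Personal",
    3: "Management", 4: "Consequence", 5: "Collaboration", 6: "Refocusing",
}

def peak_stage(ratings_by_stage):
    """ratings_by_stage: {stage: [item ratings on the 0-7 scale]}.
    Returns (stage number, stage label) with the highest mean rating."""
    means = {s: sum(r) / len(r) for s, r in ratings_by_stage.items()}
    top = max(means, key=means.get)
    return top, STAGES[top]

# Invented ratings for an SLP preoccupied with scheduling and logistics:
responses = {
    0: [1, 0, 2, 1, 0],
    1: [3, 2, 3, 2, 3],
    2: [4, 3, 4, 3, 3],
    3: [7, 6, 7, 6, 7],   # management items rated "very true of me now"
    4: [2, 2, 1, 2, 2],
    5: [3, 2, 3, 3, 2],
    6: [1, 1, 0, 1, 1],
}
print(peak_stage(responses))  # -> (3, 'Management')
```

A profile like this is what lets a change facilitator differentiate follow-up PL, as in the Stage 2 versus Stage 3 example discussed earlier.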

Outcomes

Stages of Concern in School-Based SLP PL Participants

Table 4 presents the results of the SoC Questionnaire. The concerns of most participants were primarily in the areas of awareness, management, and collaboration. In fact, many SLPs indicated high concerns in at least two of these three areas. Participants expressed the fewest concerns in the areas of consequence and refocusing, the two highest stages. The SoC results suggest that SLPs in the district were at very different points in the adoption of the innovation. Although PL for the innovation had been provided for 4 years at the time of the data gathering, a majority of the SLPs sampled (76%) had been involved with the innovation for 2 years or less. However, all district SLPs received the same PL experiences.

LoU in School-Based SLP PL Participants

Each participant who agreed to participate in the LoU Branching Interview was interviewed over the phone by the first author. Each interview was recorded and independently scored by trained and blinded assistants. Any discrepancies in the scoring were resolved through consultation with the first author. As described in Table 5, the SLP participants spanned the gamut from nonuse to the highest level of use: Level VI, Renewal. However, the majority of participating SLPs (79%) were at either the Routine (Level IVA) or Refinement (Level IVB) LoU. SLPs at the Routine level had stabilized their use of the innovation and were not making any changes to it. SLPs at the Refinement level had moved to varying the innovation to increase its impact. However, it was unclear from the interviews whether the changes the SLPs made were acceptable changes to the innovation. In other words, there was a question as to whether the SLPs were still using the innovation as intended by the district or whether they had changed it to the point of creating a new innovation.
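As an illustrative arithmetic check (added here, not part of the original study), the key proportions in this section follow directly from the raw counts reported above: 48 of the 63 questionnaire respondents had been involved with the innovation for 2 years or less, and 22 of the 28 interviewees were rated Routine (IVA) or Refinement (IVB).

```python
# Reproducing the reported percentages from the raw counts in the text.

def pct(n, total):
    return round(100 * n / total)

soc_total = 63                   # SoC Questionnaire respondents
two_years_or_less = 12 + 27 + 9  # 1 year + 2 years + no previous involvement
print(pct(two_years_or_less, soc_total))  # -> 76

lou_total = 28                   # LoU Branching Interview participants
routine_or_refinement = 11 + 11  # Level IVA + Level IVB (Table 5)
print(pct(routine_or_refinement, lou_total))  # -> 79
```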

Discussion

Although this example is limited in number and scope of participants, the sample provided important information to district leaders. Both the SoC and LoU data supported the notion that district SLPs needed individualized PL experiences and interventions because they were at different phases of adoption. However, a standard approach to PL was in place. This PL practice is problematic considering the SoC data, which suggested that SLPs had different concerns about adopting the innovation. For example, SLPs

Table 4. Participant stages of concern (SoC).

                               Most concerned about      Least concerned about
SoC                            n     % of sample         n     % of sample
6—Refocusing                   2     3                   15    24
5—Collaboration                20    32                  5     8
4—Consequence                  0     0                   26    41
3—Management                   15    24                  1     2
2—Personal                     8     13                  3     5
1—Informational/awareness      2     3                   9     14
0—Unconcerned                  16    25                  4     6

Language, Speech, and Hearing Services in Schools • Vol. 46 • 181–193 • July 2015


Table 5. Participant levels of use (LoU).

LoU                    n     % of sample
VI—Renewal             1     4
V—Integration          1     4
IVB—Refinement         11    39
IVA—Routine            11    39
III—Mechanical use     2     7
II—Preparation         0     0
I—Orientation          0     0
0—Nonuse               2     7
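As a quick check on the descriptive statistics, the "% of sample" columns in Tables 4 and 5 can be reproduced from the raw counts. The sketch below simply divides each count by the column total (63 questionnaire respondents; 28 interviewees) and rounds to the nearest whole number; the counts themselves are taken directly from the tables.

```python
# Reproduce the "% of sample" columns in Tables 4 and 5 from the raw counts.
# Counts come from the article's tables; percentages round to whole numbers.

def percentages(counts):
    total = sum(counts)
    return [round(n / total * 100) for n in counts]

# Table 4 (N = 63), stages listed from 6 (Refocusing) down to 0 (Unconcerned)
most_concerned = [2, 20, 0, 15, 8, 2, 16]
least_concerned = [15, 5, 26, 1, 3, 9, 4]
print(percentages(most_concerned))   # matches the reported 3, 32, 0, 24, 13, 3, 25
print(percentages(least_concerned))  # matches the reported 24, 8, 41, 2, 5, 14, 6

# Table 5 (n = 28 interviewees), levels from VI (Renewal) down to 0 (Nonuse)
lou_counts = [1, 1, 11, 11, 2, 0, 0, 2]
print(percentages(lou_counts))       # matches the reported 4, 4, 39, 39, 7, 0, 0, 7
```

The counts in each Table 4 column sum to the full sample of 63, and the Table 5 counts sum to the 28 SLPs who completed the LoU Branching Interview.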

with awareness concerns first needed more information about the exact nature of the innovation. SLPs with management concerns were focused on issues related to efficiency, organizing, managing, and scheduling around the innovation, whereas SLPs with collaboration concerns were focused on the coordination and cooperation of others around the use of the innovation. Clearly, these groups of SLPs would benefit from different follow-up activities, requiring a more individualized approach to PL by the district.

The LoU Branching Interviews indicated eight levels of use within the interview sample. It is important to note that only 28 of the 63 participants who completed the SoC Questionnaire participated in the LoU Branching Interview. It appears that those SLPs who felt more comfortable with the innovation were the individuals who participated in the LoU Branching Interview; the fact that a majority fell within the higher levels of Routine or Refinement supports that interpretation. Those SLPs at the Routine level of use would benefit from interventions focused on making acceptable adaptations to the innovation to better serve their students (Hall & Hord, 2015). For SLPs at the Refinement level, interventions might target assessment of the innovation in relation to student outcomes. The following section discusses recommendations made to the school district about future PL activities, based on the data.

Specific Recommendations to the School District

The most significant issue that surfaced from gathering these data was that the school district had not created an IC Map prior to asking the SLPs to adopt the narrative text structure innovation. Results from both the SoC Questionnaire and the LoU Branching Interview suggested that many SLPs in the district needed a better understanding of the exact nature of the innovation. Hall and Hord (2015) suggested that the reason widespread change in practice occurs only modestly across a school district is that many adopters, facilitators, and leaders do not truly understand what a targeted innovation is and what it should look like when it is implemented. In this case, there seemed to be some confusion in the district about the innovation, and many SLPs appeared to be creating their own versions of the innovation when they did not completely understand it. For example, some SLPs explained that they focused on only one piece of the innovation while ignoring other key elements. Others added supplemental visual supports or

simplified the innovation significantly for their students with complex communication needs. Thus, the first suggestion the authors made to the SLP district leadership was to create an IC Map to clearly define the nonnegotiable features of the narrative text structure intervention that SLPs were supposed to be implementing. Not everyone will be at ideal implementation when they first encounter an innovation, but it is easier to move in that direction when the adopter knows what ideal implementation looks like. The Appendix presents the IC Map that was created by the first author in collaboration with the district leaders.

Several other recommendations for future PL were made to district leadership based on the results of the data, including video modeling, aligning to standards, appropriate adaptations, and collaboration among the SLPs. For SLPs with a variety of concerns and implementation levels, the opportunity to watch videos of successful SLPs in the district using the innovation with a variety of grade levels in the therapy setting could be beneficial. Also, with the increasing emphasis on curriculum-relevant therapy (Ehren, 2006) aligned with the academic standards of the state, SLPs could benefit from receiving support in tying the innovation to the curriculum. Further, SLPs should learn how to modify the innovation in a way that addresses the unique needs of students while retaining the nonnegotiable features depicted in the IC Map. Several SLPs who were interviewed also described challenges they encountered when they tried to use the innovation with their students who had complex communication needs. While maintaining the integrity of the innovation, district leaders could help SLPs develop adaptations to address the unique needs of those students—for example, pairing the narrative text structure intervention with augmentative and alternative communication devices.
Finally, for those SLPs who are interested in and ready to try collaborating with teachers around the use of the innovation, a focused PL program would be needed. SLPs might discuss the opportunities and challenges of moving toward an in-classroom service delivery model. Experienced district SLPs and their teacher collaborators could provide a model to less experienced but interested SLPs.

Implications for PL for SLPs

It is clear from the body of literature on PL that PL practices in the schools have to change if SLPs want to affect student outcomes when they seek to implement new approaches, techniques, or program models. The use of CBAM with the district described in this article is an example of an approach that employs data-based decision making to inform PL goals and ongoing activities. Further, the results of the CBAM-generated data affirm the need for individualized and ongoing PL. Implications exist for a variety of participants in PL. These include professionals who design and deliver the experiences, such as school and district leaders, as well as local, state, and national professional organizations. Implications also exist for school SLPs themselves. Considerations for SLPs and leaders in smaller school districts are also important.

Implications for Leaders

It behooves PL facilitators and change facilitators working with school-based SLPs to reflect on their current practices prior to targeting an innovation in PL. Several key components should be considered, including the definition of the innovation, creative use of resources, and the assessment of PL. As the district data reported here demonstrate, it is difficult to implement change when the change means different things to different people. Developing an IC Map will not only help clarify the innovation but also provide a resource for future assessment of implementation. When SLPs understand exactly what they are being asked to do in their practice, resources can be devoted to PL activities aimed at affecting student learning.

Ample resources (e.g., time, money, personnel) are necessary to provide high-quality PL experiences that go beyond the one-size-fits-all workshop to which SLPs are accustomed. However, when budget constraints exist, as they often do, leadership will have to be creative in offering individualized, ongoing, and focused opportunities for SLPs to learn about and enhance their use of an innovation. If they do not, it is unlikely that individual SLPs and the students they serve will reap substantial benefits from PL. One idea is for sponsoring entities (e.g., local education agencies, service education agencies, university programs, community partners) to arrange professional learning communities (PLCs; DuFour, 2004) based on SLP SoC or LoU with a targeted innovation. The term professional learning community is often used by school professionals to describe a model of collaboration with a focus on student learning and measurable results. Currently, many technology options exist for virtual PLCs or other interactions among professionals that can be implemented within limited budgets.

SLPs with a management SoC could benefit from similar types of focused experiences. For example, some SLPs might benefit from PL experiences around issues such as scheduling, paperwork management, or creative service delivery. On the other hand, SLPs who are interested in collaborating with teachers around the innovation need something very different from SLPs who are still working on mastering the daily use of the innovation. For instance, SLPs interested in collaborating with teachers might engage in PL experiences around topics such as how to approach a potential collaborative partner, models of collaboration, and how to plan for and execute efficient planning meetings. Other ideas include providing virtual meetings, webinars, and periodic focused workshops involving successful SLP adopters as a follow-up to initial workshops that launch an innovation.

A key to individualized and focused PL experiences for SLPs is understanding where they are in the adoption process. This can be accomplished only through assessment. The SoC Questionnaire and the LoU Branching Interview are two of the key tools Hall and Hord (2015) suggest that change facilitators use to measure the adoption process. Further, as Guskey (2005) would argue, the bottom line in assessing PL is measuring the impact on students.

Implications for School SLPs

As consumers of PL, school SLPs need to advocate for and be willing to engage in the high-quality PL that can affect student outcomes. First, they need to reflect on the needs of the students they serve to select PL activities that target innovations to address those needs. They also have to be willing to think beyond attending workshops and webinars and seriously consider how they will implement what they have learned. They need to be informed consumers of PL and support those PL providers who acknowledge and provide high-quality PL opportunities that include follow-up. In districts that require individual PL plans, SLPs should take the opportunity to seek out PL in which they will be guided through implementation rather than participating only in entry-level workshops.

Considerations for Smaller School Districts

The school district illustrated in the example is very large and has the advantage of more resources than might be available to smaller districts—for example, SLP leadership personnel, PL planning time and money, and PL materials. However, SLPs in smaller districts usually still engage in PL and would benefit from taking a critical look at the adoption process. One issue that SLPs in smaller districts often face is a feeling of isolation. The development of PLCs (DuFour, 2004) across the district or even the state can help SLPs feel connected and does not require significant resources, especially if implemented in a virtual format. Developing PLCs around district initiatives in which SLPs must already participate could be an efficient way to begin. For example, many SLPs across the country are engaging in RTI/multi-tiered systems of support and the CCSS, which require new ways of thinking about intervention for students. SLPs in a smaller school district may want to develop a PLC around one of these initiatives.

The first step in the process would be to collaborate around the development of an IC Map. What does the SLP's role in RTI/multi-tiered systems of support or the CCSS look like? This map could have the added benefit of assisting SLPs in explaining their role to principals and other evaluators who may not understand it. Next, SLP leaders or district leaders should consider using the SoC Questionnaire to gauge the level of concern of SLPs in the district around the adoption of the given innovation. This tool can be easily disseminated and scored and provides extremely valuable information about feelings related to the adoption of an innovation. Based on the results of this questionnaire, SLP leaders could target PL opportunities (whether official workshops or PL gatherings in person or via the web) to best suit the needs of the SLPs in the district. SLP leaders may also want to consider the LoU Branching Interview if they have the means to support SLPs in adopting an innovation. The LoU Branching Interview quickly provides information about where the SLP is in the process of adopting the innovation. Providing focused and individualized PL opportunities even in a smaller school district may also help increase retention of SLPs who feel supported in their efforts to be effective with students.

Conclusion

The work of researchers such as Guskey (2005) and Hall and Hord (1987, 2015) suggests that the current approach to PL experienced by most school SLPs is problematic. With student achievement as the ultimate goal of PL experiences, it behooves SLP leadership to consider what is known about adult learning and the adoption process in designing and delivering PL. A change in mindset surrounding PL in the schools is first necessary to build capacity for a different and improved approach to PL. This mindset shift is the first step to more judiciously allocating the resources already available to districts in order to promote improved student outcomes. To move to a more differentiated approach to PL, SLP PL facilitators first must understand what is happening in the adoption of an innovation, in the same way that all SLPs must understand the strengths and weaknesses of their students to plan appropriate speech-language therapy. SLP leadership should consider evaluating the effects of the PL opportunities they offer to their SLPs using the levels of evaluation advocated by Guskey. The use of the CBAM (Hall & Hord, 2015) can assist in this effort. SLP practitioners should advocate for PL opportunities that are focused, ongoing, supported, and evaluated. Future research should expand efforts to evaluate PL with SLPs across settings and across the adoption of a variety of innovations.

References

Abell, S. K., Lannin, J. K., Marra, R. M., Ehlert, M. W., Cole, J. S., Lee, M. H., . . . Wang, C.-Y. (2007). Multi-site evaluation of science and mathematics teacher professional development programs: The project profile approach. Studies in Educational Evaluation, 33, 135–158.
American Speech-Language-Hearing Association. (2005). Evidence-based practice in communication disorders [Position statement]. Retrieved from www.asha.org/policy
American Speech-Language-Hearing Association. (2010). Roles and responsibilities of speech-language pathologists in schools [Professional issues statement]. Retrieved from www.asha.org/policy
Bailey, D. B., & Palsha, S. A. (1992). Qualities of the Stages of Concern Questionnaire and implications for educational innovations. Journal of Educational Research, 85, 226–232.
Bergquist, C. C. (2006). Encouraging systemic changes in professional development: A short summary of Florida's evaluation protocol system. Retrieved from www.fldoe.org/profdev/pdf/firstcycle-short.pdf
Cicchelli, T., & Baecher, R. (1989). Microcomputers in the classroom: Focusing on teacher concerns. Educational Research Quarterly, 13, 37–46.
Dennison, B. C. (1993). The Stages of Concern of technical preparation education among secondary and postsecondary vocational and academic educators, guidance counselors, and administrators (Unpublished doctoral dissertation). University of Missouri–Columbia.
Desimone, L. M., Porter, A. C., Garet, M. S., Suk Yoon, K., & Birman, B. F. (2002). Effects of professional development on teachers' instruction: Results from a three-year longitudinal study. Education Evaluation and Policy Analysis, 24, 81–112.
DuFour, R. (2004). What is a professional learning community? Educational Leadership, 61(8), 6–11.
Ehren, B. J. (2006). Partnerships to support reading comprehension for students with language impairment. Topics in Language Disorders, 26, 41–53.
Elmore, R. F. (2004). School reform from the inside out: Policy, practice, and performance. Cambridge, MA: Harvard Education Press.
Graner, P. S., Ault, M. M., Mellard, D. F., & Gingerich, R. A. (2012). Effective professional development for adult learners. Lawrence: University of Kansas Center for Research on Learning.
Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.
Guskey, T. R. (2005). Taking a second look: Strong evidence reflecting the benefits of professional development is more important than ever before. Journal of Staff Development, 26, 10–18.
Hall, G. E., George, A. A., & Rutherford, W. L. (1979). Measuring Stages of Concern about the innovation: A manual for the use of the SoC Questionnaire (Report 3032). Austin: Research and Development Center for Teacher Education, The University of Texas.
Hall, G. E., & Hord, S. M. (1987). Change in schools: Facilitating the process. Albany: State University of New York Press.
Hall, G. E., & Hord, S. M. (2015). Implementing change: Patterns, principles, and potholes (4th ed.). Upper Saddle River, NJ: Pearson Education.
Hall, G. E., & Loucks, S. F. (1977). A developmental model for determining whether the treatment is actually implemented. American Educational Research Journal, 14, 263–276.
Hall, G. E., & Loucks, S. F. (1979). Implementing innovations in schools: A concerns-based approach. Austin: Research and Development Center for Teacher Education, The University of Texas.
Hanley, P., Maringe, F., & Ratcliffe, M. (2008). Evaluation of professional development: Deploying a process-focused model. International Journal of Science Education, 30, 711–725.
Individuals with Disabilities Education Improvement Act of 2004, Pub. L. No. 108–446, §§ 1400 et seq.
Jaquith, A., Mindich, D., Wei, R. C., & Darling-Hammond, L. (2010). Teacher professional learning in the United States: Case studies of state policies and strategies. Oxford, OH: National Staff Development Council. Retrieved from www.learningforward.org/news/2010Phase3TechnicalReport.pdf
Kirby, B. M., & Smith, W. (1998). Stages of Concern of administrators and teachers in the implementation of the school-to-work transition initiative in North Carolina. Proceedings of the North Carolina Council of Vocational Teacher Educators Research Conference, 13, 35–40.
Kirkpatrick, D. L. (1959a). Techniques for evaluating training programs. Journal for the American Society of Training Directors, 13(11), 3–9.
Kirkpatrick, D. L. (1959b). Techniques for evaluating training programs: Part 2—Learning. Journal for the American Society of Training Directors, 13(12), 21–26.
Kirkpatrick, D. L. (1960a). Techniques for evaluating training programs: Part 3—Behavior. Journal for the American Society of Training Directors, 14(1), 13–18.
Kirkpatrick, D. L. (1960b). Techniques for evaluating training programs: Part 4—Results. Journal for the American Society of Training Directors, 14(2), 28–32.
Learning Forward. (2011). Standards for professional learning. Oxford, OH: Author.
Learning Forward. (2015). NSDC introduces bold new purpose. Retrieved from http://learningforward.org/who-we-are/what-we-stand-for/bold-new-purpose#.VSQw2fmjOM4
Long, B. D. (1994). Stages of Concern in the implementation of tech prep programs in Virginia (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses (Order No. 9520746).
Loucks, S. F., Newlove, B. W., & Hall, G. E. (1975). Measuring Levels of Use of the innovation: A manual for trainers, interviewers, and raters. Austin: Research and Development Center for Teacher Education, The University of Texas.
Mizell, H., Hord, S., Killion, J., & Hirsh, S. (2011). New standards put the spotlight on PL. Journal of Staff Development, 32(4), 10–14.
Muijs, D., & Lindsay, G. (2008). Where are we at? An empirical study of levels and methods of evaluating continuing professional development. British Educational Research Journal, 34, 195–211.
National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010). Common Core State Standards. Washington, DC: Author.
Scherz, Z., Bialer, L., & Eylon, B.-S. (2008). Learning about teachers' accomplishment in 'learning skills for science' practice: The use of portfolios in an evidence-based continuous professional development programme. International Journal of Science Education, 30, 643–667.
Slabine, N. A. (2011). Evidence of effectiveness. Oxford, OH: Learning Forward. Available at http://learningforward.org/docs/pdf/evidenceofeffectiveness.pdf
Wei, R. C., Darling-Hammond, L., & Adamson, F. (2010). Professional development in the United States: Trends and challenges. Oxford, OH: National Staff Development Council. Retrieved from www.learningforward.org/news/NSDCstudytechnicalreport2010.pdf


Appendix
Narrative Text Structure Intervention Innovation Configuration Map

Key element: Initial assessment
Ideal implementation (4): The speech-language pathologist (SLP) audio-records and transcribes a student retell of a novel story that depicts a complete episode. The SLP probes the student's story comprehension using specific questions to check all eight areas of story grammar (i.e., character, initiating event, etc.). The SLP analyzes the student retell using the Developmental Checklist and then categorizes the student according to the appropriate developmental stage.
In process (3): The SLP audio-records and transcribes a student retell of a novel story that depicts a complete episode. The SLP probes the student's story comprehension using specific questions to check six to seven areas of story grammar. The SLP analyzes the student retell using the Developmental Checklist and then categorizes the student according to the appropriate developmental stage.
In process (2): The SLP audio-records and transcribes a student retell of a novel story not depicting a complete episode. After the student retells the story without cues, the SLP probes the student's story comprehension using narrative text structure questions to check at least six areas of story grammar. The SLP analyzes the student retell and arbitrarily assigns the student to a developmental stage without using the Developmental Checklist.
In process (1): The SLP does not audio-record or transcribe a student retell. After the student retells the story without cues, the SLP probes the student's story comprehension by asking questions unrelated to the specific areas of story grammar. The SLP analyzes the student retell without assigning the student to a developmental stage.
No implementation (0): The SLP does not complete a baseline measure.

Key element: Student goal development
Ideal implementation (4): The SLP uses the student's narrative developmental stage to set a measurable goal at the next level of narrative development.
In process (3): The SLP uses the student's narrative developmental stage to set a measurable goal that is not at the next level of narrative development.
In process (1): The SLP sets goals that are unrelated to the student's developmental stage.
No implementation (0): The SLP does not develop student goals.

Key element: Introduction of narrative text structure intervention
Ideal implementation (4): The SLP describes the critical features of the narrative text structure intervention. The SLP tells the students why they will be learning to use it and what it can help them do. The SLP explains and shows when it should be used.
In process (3): The SLP describes the critical features of the narrative text structure intervention and tells the students why they will be learning how to use it and its advantages but does not explain or show when it should be used.
In process (2): The SLP describes only the critical features of the narrative text structure intervention.
In process (1): The SLP describes only some of the critical features of the narrative text structure intervention.
No implementation (0): The narrative text structure intervention is not introduced.

Key element: Narrative text structure–specific vocabulary
Ideal implementation (4): The SLP ensures that the students comprehend and can use the specific narrative text structure vocabulary necessary for their narrative developmental stage with at least 80% accuracy.
In process (3): The SLP ensures that the students can comprehend the specific narrative text structure vocabulary necessary for their narrative developmental stage with at least 80% accuracy but does not ensure the students can use the vocabulary accurately.
In process (2): The SLP teaches and evaluates the student's comprehension and use of narrative text structure vocabulary not at their narrative developmental stage.
In process (1): The SLP teaches narrative text structure vocabulary that is not at the student's level without ensuring the student's comprehension.
No implementation (0): The SLP does not teach narrative text structure vocabulary.

Key element: Modeling of graphic organizer
Ideal implementation (4): The narrative text structure components are reviewed, and a story appropriate to the narrative level of the student(s) is read first for enjoyment. The SLP then completes the graphic organizer and uses it to retell the story in summary.
In process (3): The narrative text structure components are reviewed, and a story appropriate to the narrative level of the student(s) is read first for enjoyment. The SLP then completes the graphic organizer but does not use it explicitly to retell the story.
In process (2): The narrative text structure components are reviewed, and a story appropriate to the narrative level of the student(s) is read for a first time while the SLP is completing the graphic organizer. A summary of the story might or might not be given.
In process (1): The narrative text structure components are not reviewed before the SLP reads a story while completing the graphic organizer.
No implementation (0): The SLP does not model the graphic organizer strategy.

Key element: Scaffolded use of graphic organizer
Ideal implementation (4): Students use the graphic organizer with the appropriate scaffolding from the SLP for the students' ability level.
In process (3): Students use the graphic organizer with intermittent scaffolding from the SLP for the students' ability level.
In process (2): Students use the graphic organizer with intermittent scaffolding from the SLP without regard to the students' ability level.
In process (1): The SLP does not use scaffolding to aid students in using the graphic organizer.
No implementation (0): Students do not use graphic organizers.

Key element: Story retell with graphic organizer
Ideal implementation (4): The SLP fills in the missing parts so student(s) can order their scripts and asks questions to probe for more information.
In process (3): The SLP fills in the missing parts so student(s) can order their scripts or asks questions to probe for more information.
In process (2): The SLP uses incorrect or ineffective scaffolding during the student(s)' retell.
In process (1): The student orally retells stories without input from the SLP although scaffolding is needed.
No implementation (0): The SLP does not have students orally retell stories.

Key element: Progress monitoring
Ideal implementation (4): The SLP monitors progress at 3- to 6-month intervals using the same baseline story and analysis process. Progress is also continually assessed by data kept on the student's ability to work toward the goal by demonstrating knowledge through answering questions and graphic organizer completion. Students are given continuous feedback on their progress.
In process (3): The SLP monitors progress at 3- to 6-month intervals using the same baseline story and analysis process. Progress is also continually assessed by data kept on the student's ability to work toward the goal by demonstrating knowledge through answering questions and graphic organizer completion. Students are given feedback infrequently (less than 50% of the time) on their progress.
In process (2): The SLP monitors progress at 3- to 6-month intervals using the same baseline story and analysis process, or progress is assessed by data kept on the student's ability to work toward the goal by demonstrating knowledge through answering questions and graphic organizer completion. Students are rarely (less than 25% of the time) given feedback on their progress.
In process (1): The SLP monitors progress intermittently through data keeping.
No implementation (0): The SLP does not monitor progress.

Key element: Collaboration
Ideal implementation (4): The SLP collaborates with classroom teachers to work on generalization of the graphic organizer tool to the classroom by co-teaching a series of lessons to the whole class and developing lessons or adapting lessons for its use when the SLP is absent.
In process (3): The SLP collaborates with classroom teachers to work on generalization of the graphic organizer tool to the classroom by co-teaching a series of lessons to the whole class.
In process (2): The SLP collaborates with classroom teachers to work on generalization of the graphic organizer tool to the classroom by helping the teacher develop lessons or adapt lessons to incorporate narrative text structure without having first demonstrated a lesson for the teacher.
In process (1): The SLP discusses the graphic organizer tool with classroom teachers without making any plans for collaboration.
No implementation (0): The SLP does not collaborate.

Key element: Story selection
Ideal implementation (4): The SLP analyzes stories to choose those that demonstrate specific learning objectives for individual student narrative developmental stages.
In process (3): The SLP analyzes stories but is not able to align them with all learning objectives.
In process (2): The SLP attempts to align selected stories with all learning objectives although a story analysis is not conducted.
In process (1): The SLP chooses stories without analysis to address one learning objective.
No implementation (0): The SLP selects stories without consideration of learning objectives.
