RESEARCH AND PRACTICE

Declining Trends in Local Health Department Preparedness Capacities

Mary V. Davis, DrPH, MSPH, Christine A. Bevc, PhD, MA, and Anna P. Schenck, PhD, MSPH

Objectives. We examined local health department (LHD) preparedness capacities in the context of participation in accreditation and other performance improvement efforts.

Methods. We analyzed preparedness in 8 domains among LHDs responding to a preparedness capacity instrument from 2010 through 2012. Study groups included LHDs that (1) were exposed to a North Carolina state-based accreditation program, (2) participated in 1 or more performance improvement programs, and (3) had not participated in any performance improvement programs. We analyzed mean domain preparedness scores and applied a series of nonparametric Mann–Whitney–Wilcoxon tests to determine whether preparedness domain scores differed significantly between study groups from 2010 to 2012.

Results. Preparedness capacity scores fluctuated and decreased significantly for all study groups for 2 domains: surveillance and investigation and legal preparedness. Significant decreases also occurred among participants for plans and protocols, communication, and incident command. Declines in capacity scores were not as great and less likely to be significant among North Carolina LHDs.

Conclusions. Decreases in preparedness capacities over the 3 survey years may reflect multiple years of funding cuts and job losses, specifically for preparedness. An accreditation program may have a protective effect against such contextual factors. (Am J Public Health. 2014;104:2233–2238. doi:10.2105/AJPH.2014.302159)

November 2014, Vol 104, No. 11 | American Journal of Public Health

Federal, state, and local public health agencies have made substantial investments in improving state and local health department (LHD) preparedness capacities and capabilities to effectively prevent, detect, or respond to public health emergencies.1 A lack of valid and reliable data collection instruments, as well as evolving preparedness standards, has made it difficult to determine the impact of these investments.2,3 As recently as 2011, the Centers for Disease Control and Prevention released 15 public health preparedness capabilities designed to serve as national public health preparedness standards to assist state health departments and LHDs with strategic planning.4 In addition, few studies have examined the impact of LHD contextual factors and participation in improvement efforts on the performance of preparedness capacities.5 We examined LHD preparedness capacities in the context of participation in performance improvement efforts over a 3-year period using a validated survey instrument.6

LHDs are essential to emergency preparedness and response activities. They have statutory authority to perform key functions including community health assessments and epidemiologic investigations, enforcement of health laws and regulations, and coordination of the actions of the agencies in their jurisdictions that make up the local public health system.7 Preparedness also involves specialized functions such as incident command, countermeasures and mitigation, mass health care delivery, and management of essential health care supply chains.8 The Centers for Disease Control and Prevention organized these functions into capabilities, or standards, supported by foundational capacities, or resource elements, in the 15 public health preparedness capabilities.4

Despite the considerable investment in public health preparedness after the September 11, 2001, attacks on the United States and the subsequent anthrax attacks, funding for public health preparedness declined 38% between federal fiscal years 2005 and 2012.9 Although LHDs received funding supplements in 2009 and 2010 to address the H1N1 virus and through the American Recovery and Reinvestment Act,10 median per capita revenues for LHD preparedness activities in the most recently completed fiscal year, 2013, declined to $1.15 from $2.07 in 2010.11,12 In 2012, approximately half of LHDs reported reducing or eliminating services, with preparedness among the most commonly affected services.12 The specific impact of these and other funding reductions on preparedness capacities has yet to be formally studied.

After more than a decade of focused effort, gaps and variation in the performance of preparedness activities remain.6,12 Heterogeneity in the composition and structure of public health systems continues to be an important source of variation in preparedness, as in other aspects of public health practice.14,15 Other factors affecting LHD general performance and preparedness include LHD governance structure and community and organizational characteristics, such as funding, leadership characteristics, and partnerships.7,16,17

Over the past decade, efforts to improve public health infrastructure, and performance more generally, have gathered momentum. These efforts include the development and use of the National Public Health Performance Standards Program instruments, the implementation of state-based accreditation programs and the Public Health Accreditation Board, and initiatives to encourage the use of performance management and quality improvement tools.18–22 The Public Health Accreditation Board is charged with developing and managing national voluntary public health accreditation for tribal, state, local, and territorial health departments. The national accreditation final standards, released in 2011, include a specific emergency preparedness standard as well as additional standards that are linked to preparedness measures.23

Davis et al. | Peer Reviewed | Research and Practice | 2233


The National Public Health Performance Standards Program provides a framework to assess the capacity and performance of public health systems and public health governing bodies and to identify areas for system improvement. LHDs and their partners use tailored instruments to assess the performance of their public health system against model standards, including preparedness standards, which are based on the 10 essential services (National Public Health Performance Standards Program version 2.0; NPHPS Partners, Atlanta, GA). More than 400 public health systems and governing entities used the version 2 assessment instruments (Centers for Disease Control and Prevention, http://www.cdc.gov/nphpsp/archive.html).

Preparedness performance improvement programs have also been implemented to address variation. Project Public Health Ready is a standards-based recognition program; since 2004, 300 LHDs in 27 states have been recognized as meeting all Project Public Health Ready requirements, either individually or working collaboratively as a region.24 To achieve recognition, LHDs must meet nationally recognized standards in all-hazards preparedness planning, workforce capacity development, and demonstration of readiness through exercises or real events.
In addition, the Institute of Medicine has recommended that an accreditation program could be a performance monitoring and accountability system for agency preparedness.25,26 One previous study examined the effects of performance and accreditation programs on LHD performance of 8 preparedness domains on a validated instrument.5 Controlling for LHD characteristics, a significant positive effect on domain scores was found for LHDs that participated in the North Carolina state-based accreditation program and select performance improvement programs (National Public Health Performance Standards, the Public Health Accreditation Board beta test, Project Public Health Ready) when compared with a national matched comparison group that did not participate in any program. Findings, however, were limited to 1 year of survey data—2010. In this article, we explore trends in preparedness capacities in the present climate of declining resources for public health preparedness activities.

METHODS

Using a natural experiment design, we analyzed differences in preparedness domain scores among LHDs that participated in the Local Health Department Preparedness Capacities Survey (PCAS) study over 3 years. More specifically, we examined 3 groups of LHDs: (1) North Carolina LHDs exposed to the North Carolina Local Health Department Accreditation Program (NCLHDA participation), (2) national comparison LHDs that participated in 1 or more of the performance improvement programs described in the preceding section (program participation LHDs), and (3) national comparison LHDs that had not participated in any program (no participation).

The North Carolina Preparedness and Emergency Response Research Center invited 333 LHDs from 40 states to participate in the PCAS on an annual basis from 2010 to 2012. We sent survey invitations to both the LHD director or administrator and the preparedness coordinator. Each LHD was asked to respond once to the survey and encouraged to include multiple staff in the survey response, including the health director and preparedness coordinator. The survey sample included 85 North Carolina LHDs and 248 comparison LHDs, identified using a propensity score matching methodology.6 The matching sample selection criteria (public health agency staffing levels, scope of services delivered, annual agency expenditures per capita, population size served, socioeconomic characteristics of the community, and other health resources within the community) were based on a set of public health agency and system characteristics obtained from the National Association of County and City Health Officials 2010 Profile.27

Within the survey sample, a majority (61.6%) of LHDs were governed by a local board of health. The sample was evenly distributed between LHDs within metropolitan statistical areas (51.7%) and nonmetropolitan areas (48.3%). LHDs reported an average of 96 full-time equivalents (median = 54; range = 2–1025 full-time equivalents).
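The propensity score matching used to select comparison LHDs can be illustrated with a minimal sketch. This is not the authors' actual matching procedure (which is detailed in reference 6); the greedy 1:1 nearest-neighbor rule, the 0.05 caliper, and the unit IDs and scores below are all illustrative assumptions.

```python
def nearest_neighbor_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on precomputed propensity
    scores, without replacement. `treated` and `controls` map unit IDs to
    propensity scores (e.g., fitted from agency staffing, expenditures,
    and population served). Pairs farther apart than `caliper` are left
    unmatched."""
    matches, used = [], set()
    for t_id, t_ps in treated.items():
        candidates = [(abs(c_ps - t_ps), c_id)
                      for c_id, c_ps in controls.items() if c_id not in used]
        if not candidates:
            break
        dist, c_id = min(candidates)  # closest remaining control
        if dist <= caliper:
            matches.append((t_id, c_id))
            used.add(c_id)
    return matches

# Hypothetical scores: 2 NC LHDs matched against 3 candidate comparison LHDs.
pairs = nearest_neighbor_match({"NC-1": 0.30, "NC-2": 0.70},
                               {"C-1": 0.31, "C-2": 0.68, "C-3": 0.10})
```

In practice, matching was done on multiple agency and community characteristics at once via the propensity score, which collapses those covariates into a single comparable index.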
The population sizes ranged from 4000 to 1 484 645 residents, with a median population of 54 261 (mean = 109 803). On average, responding LHDs spent $68.86 per capita (adjusted expenditures; range = $0.68–$358.97; median = $53.12). Using Welch's 2-sample t test, we found no significant differences in characteristics between North Carolina LHDs and those in the comparison sample.

Supported by previous validity and reliability testing,6 the self-administered PCAS includes 58 questions and 211 subquestions related to specific preparedness or response capacities organized across 8 domains: surveillance and investigation, plans and protocols, workforce and volunteers, communication and information dissemination, incident command, legal preparedness, emergency events and exercises, and corrective action activities. Each domain consists of a set of preparedness capacities, ranging from 4 to 33 measures, that captures a subset of local preparedness (Table 1). To provide a summary measure for each domain, we calculated a preparedness capacity score. Each PCAS domain score represents an equally weighted proportion of aggregate reported capacities (accounting for parent–child relationships), whereby the proportion of capacities within each domain's subquestions is averaged across the domain. As a result, LHDs are not unduly penalized for nonapplicable capacities resulting from nested and dependent higher-level capacity measures.

We defined the participation groups as follows. We categorized all North Carolina LHDs (NCLHDA participation; n = 85) as participating in the NCLHDA program for the purposes of these analyses because of the preparatory effects of the NCLHDA; also, at the time of data collection, all North Carolina LHDs had been exposed to the program. Preliminary assessments found minimal (nonsignificant) differences in domain scores between North Carolina accredited and nonaccredited LHDs in 2010.
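The scoring rule described above — an equally weighted proportion of reported capacities per subquestion, averaged across the domain, with nonapplicable nested items excluded — can be sketched as follows. The function name and the True/False/None encoding of responses are assumptions for illustration, not the published PCAS scoring code.

```python
def domain_score(subquestions):
    """Score one PCAS-style domain. `subquestions` is a list of response
    lists, one per subquestion; entries are True (capacity reported),
    False (not reported), or None (not applicable because a parent
    capacity is absent). None entries are dropped so an LHD is not
    penalized for nested items that do not apply; each subquestion then
    contributes an equally weighted proportion to the domain average."""
    proportions = []
    for responses in subquestions:
        applicable = [r for r in responses if r is not None]
        if applicable:  # skip subquestions that are entirely nonapplicable
            proportions.append(sum(applicable) / len(applicable))
    return sum(proportions) / len(proportions) if proportions else 0.0

# Hypothetical domain with 2 subquestions: 2 of 3 applicable capacities
# reported in the first, 1 of 2 in the second -> (2/3 + 1/2) / 2.
score = domain_score([[True, True, False, None], [True, False]])
```

Dropping the `None` entries before averaging is what keeps a child capacity that is conditional on an absent parent from dragging the score down, matching the parent–child adjustment described in the text.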
Among national comparison LHDs outside North Carolina, we categorized 48 LHDs as having participated in 1 or more of the following programs before PCAS completion (program participation LHDs): Project Public Health Ready participation was designated through program recognition between 2004 and 2010; Public Health Accreditation Board participation was determined from the list of 19 LHDs that participated as beta test sites; and National Public Health Performance Standards Program participation was determined through review of the cumulative report of all local public


TABLE 1—Summary of Preparedness Domains and Capacity Measures: Local Health Department Preparedness Capacities Survey, 2010–2012

Domain (No. of Items): Description of Capacities Measured

Surveillance and investigation (20): Handling of urgent case reports; access to public health surveillance system; electronic storage of local case report data; specimen transportation system; formal case investigation components and protocol

Plans and protocols (25): All-hazards emergency preparedness and response plan; capability and components of surge capacity

Workforce and volunteers (17): Type and maintenance of volunteer registry; identification and training of emergency preparedness staff; assessment of emergency preparedness workforce; workforce training in emergency preparedness

Communication and information dissemination (33): Emergency communication plans and procedures; capacity and assessment of communication technologies; use of health alert network

Incident command (5): Use of emergency operations center; local incident command structure

Legal infrastructure and preparedness (8): Review of legal power and authority in emergency preparedness and response; access and use of legal counsel; extent of legal power and authority in emergency preparedness and response

Emergency events and exercises (4): Determination of emergency events and exercises

Corrective action activities (28): Debriefing activities; evaluation activities; reporting activities
health systems that completed the version 2 survey between October 21, 2007, and June 10, 2010. We grouped the remaining national comparison LHDs (no participation; n = 200) that had not participated in any performance improvement program to provide a control for program exposure.

For this analysis, we examined preparedness capacities across the 8 domains in these LHD groups over 3 years of survey data. The initial analysis summarizes the 8 domain values across the program groups to examine the varying levels of preparedness capacity for the 3 years. We analyzed the mean domain preparedness scores along with the 95% confidence intervals for these mean scores. To compare these scores over time, we applied a series of nonparametric Mann–Whitney–Wilcoxon tests to determine whether significant differences existed in the preparedness domain scores between the pooled cross-sectional data for each of the 3 participation groups in the 3-year study period. These nonparametric tests enabled us to explore the differences without assuming a normal distribution, given the skew and kurtosis of the scores.
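The Mann–Whitney–Wilcoxon comparison can be sketched with a minimal pure-Python implementation using the normal approximation. This sketch ignores tie correction (which the authors' analysis software would handle), and the sample scores below are invented for illustration only:

```python
from math import erf, sqrt

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation.
    Assumes no ties for simplicity; returns (U statistic for x, p value)."""
    n1, n2 = len(x), len(y)
    ranked = sorted(x + y)
    r1 = sum(ranked.index(v) + 1 for v in x)       # rank sum of group x
    u1 = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2                               # mean of U under H0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)     # SD of U under H0
    z = (u1 - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return u1, p

# Invented domain scores for one group in 2010 vs 2012.
scores_2010 = [0.71, 0.68, 0.74, 0.65, 0.70, 0.72]
scores_2012 = [0.55, 0.58, 0.52, 0.60, 0.57, 0.54]
u, p = mann_whitney_u(scores_2010, scores_2012)    # every 2010 score exceeds 2012
```

Because the test works on ranks rather than raw values, it is robust to the skew and kurtosis noted above, which is the reason the authors chose a nonparametric comparison over a t test here.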

RESULTS

We examined response rates across survey years by study group. In 2010, the overall response rate was 79.3% (n = 264), with response rates of 69% (n = 138) among no-participation LHDs, 89.5% (n = 43) among program participation LHDs, and 97.6% (n = 83) among NCLHDA program participation LHDs. In 2011, the response rate decreased slightly to an overall rate of 70.6% (n = 235), with response rates of 54.5% (n = 106) among no-participation LHDs, 79.2% (n = 38) among program participation LHDs, and 95.3% (n = 81) among NCLHDA program participation LHDs. In 2012, the overall response to the PCAS increased to 73% (n = 243), with response rates rising to 63.5% (n = 127) among no-participation LHDs and 97.6% (n = 83) among NCLHDA participation LHDs, whereas response within the program participation LHDs decreased to 68.7% (n = 33).

Preparedness capacity fluctuated between 2010 and 2011, as well as between 2011 and 2012, for all domains across all groups (Figure 1). We observed significant decreases in scores for the surveillance and investigation and legal preparedness domains in all 3 study groups, and significant declines in 3 other domains among all but the NCLHDA program participation LHDs. Table 2 presents the average domain scores and confidence limits for the 3 years of survey data across the 3 groups. From 2010 to 2012, preparedness capacity scores decreased significantly for all 3 groups in the surveillance and investigation and legal preparedness domains. We also observed significant decreases among program participation and no-participation LHDs within the domains of plans and protocols, communication, and incident command. Although there was a decline among NCLHDA program participation LHDs in these domains, the change was not statistically significant. In the remaining 3 domains (workforce and volunteers, exercises and emergency events, and corrective action), changes were not significant or consistent among the 3 groups.

The decline in the surveillance and investigation domain scores is most notable. The decreases in capacity scores among all groups reflect potential changes in surveillance systems, urgent case management, or other means of investigation support. We observed equal and significant decreases (–0.15) in legal preparedness for LHDs that had and had not participated in programs. This domain measures the extent of legal power and authority in emergency preparedness and response, as well as access and use of legal counsel.
Although we observed a decrease in NCLHDA program participant scores in this domain, this decrease was less than in other groups (–0.08).


[Figure 1 displays average domain scores (0.0–1.0) by year, 2010–2012, in 3 columns of panels: no program participation, program participation LHDs, and North Carolina LHDs.]

FIGURE 1—Average domain scores by year for (a) surveillance and investigation, (b) incident command, (c) plans and protocols, (d) legal preparedness, (e) workforce and volunteers, (f) emergency events and exercises, (g) communication and information dissemination, and (h) corrective action activities: Local Health Department Preparedness Capacities Survey, 2010–2012.

Note. LHD = national comparison local health department participating in a performance improvement program; NCLHDA = North Carolina Local Health Department Accreditation Program.

DISCUSSION

Using a validated instrument, we examined LHD performance of preparedness capacities and found declines in 5 of 8 domains representing preparedness capacities from 2010 to 2012. We observed significant decreases in LHD capacity scores in surveillance and investigation and legal preparedness among all study groups. Although capacity scores varied between years and study groups, scores declined for an additional 3 domains and remained unchanged for the remaining 3 domains. In no domain, for any group, did preparedness capacity significantly improve. Given that the domains cover a wide range of preparedness capacities, the results suggest that different domains of public health preparedness may be more (or less) responsive to contextual effects.

Decreases in preparedness capacities over the 3 survey years may reflect multiple years of funding cuts and job losses, specifically for preparedness.14 We observed the greatest decline in capacities in the surveillance and investigation domain, which is critical not only to preparedness responsibilities but also to the basic functions of a public health department. This domain measures surveillance systems, urgent case management, and other means of investigation support. These findings support the call for reliable federal funding and decision making to modernize the public health system,

American Journal of Public Health | November 2014, Vol 104, No. 11

RESEARCH AND PRACTICE

TABLE 2—Summary of Domain Values: Local Health Department Preparedness Capacities Survey, 2010–2012

Values are average (95% CI) for 2010, 2011, and 2012, followed by the 2010–2012 difference.

Surveillance and investigation
  No participation: 0.63 (0.60, 0.66); 0.24 (0.21, 0.28); 0.45 (0.42, 0.47); difference –0.18***
  Program participation LHDs: 0.75 (0.71, 0.78); 0.43 (0.36, 0.51); 0.49 (0.44, 0.54); difference –0.26***
  NCLHDA participation: 0.60 (0.56, 0.64); 0.45 (0.41, 0.48); 0.48 (0.45, 0.50); difference –0.12***

Plans and protocols
  No participation: 0.71 (0.69, 0.74); 0.42 (0.37, 0.47); 0.59 (0.56, 0.62); difference –0.12***
  Program participation LHDs: 0.76 (0.71, 0.82); 0.57 (0.48, 0.67); 0.63 (0.58, 0.68); difference –0.14**
  NCLHDA participation: 0.61 (0.56, 0.65); 0.68 (0.64, 0.72); 0.58 (0.54, 0.61); difference –0.03

Workforce and volunteers
  No participation: 0.49 (0.47, 0.51); 0.24 (0.21, 0.27); 0.51 (0.49, 0.53); difference 0.01
  Program participation LHDs: 0.53 (0.49, 0.57); 0.33 (0.27, 0.39); 0.53 (0.51, 0.56); difference 0.00
  NCLHDA participation: 0.47 (0.43, 0.50); 0.41 (0.38, 0.44); 0.50 (0.48, 0.52); difference 0.03

Communication
  No participation: 0.63 (0.61, 0.65); 0.23 (0.19, 0.26); 0.55 (0.52, 0.57); difference –0.09***
  Program participation LHDs: 0.67 (0.64, 0.70); 0.35 (0.29, 0.42); 0.58 (0.54, 0.62); difference –0.09**
  NCLHDA participation: 0.64 (0.61, 0.67); 0.45 (0.41, 0.48); 0.60 (0.57, 0.63); difference –0.04

Incident command
  No participation: 0.80 (0.76, 0.83); 0.62 (0.58, 0.66); 0.65 (0.60, 0.69); difference –0.15***
  Program participation LHDs: 0.85 (0.79, 0.91); 0.63 (0.55, 0.70); 0.70 (0.62, 0.78); difference –0.15*
  NCLHDA participation: 0.73 (0.67, 0.79); 0.54 (0.48, 0.60); 0.69 (0.63, 0.75); difference –0.04

Legal preparedness
  No participation: 0.71 (0.67, 0.74); 0.31 (0.26, 0.35); 0.55 (0.52, 0.58); difference –0.15***
  Program participation LHDs: 0.79 (0.74, 0.84); 0.47 (0.39, 0.56); 0.63 (0.58, 0.69); difference –0.15**
  NCLHDA participation: 0.73 (0.68, 0.78); 0.55 (0.51, 0.59); 0.65 (0.61, 0.69); difference –0.08**

Exercises and emergency events
  No participation: 0.84 (0.81, 0.88); 0.41 (0.35, 0.47); 0.81 (0.77, 0.86); difference –0.03
  Program participation LHDs: 0.85 (0.79, 0.91); 0.58 (0.47, 0.70); 0.87 (0.79, 0.96); difference 0.02
  NCLHDA participation: 0.92 (0.89, 0.96); 0.73 (0.67, 0.79); 0.94 (0.91, 0.97); difference 0.02

Corrective action
  No participation: 0.58 (0.54, 0.62); 0.37 (0.31, 0.42); 0.61 (0.56, 0.66); difference 0.03
  Program participation LHDs: 0.70 (0.63, 0.76); 0.51 (0.39, 0.62); 0.66 (0.57, 0.75); difference –0.04
  NCLHDA participation: 0.67 (0.62, 0.72); 0.63 (0.57, 0.70); 0.66 (0.60, 0.73); difference –0.01

Note. LHD = national comparison local health department participating in a performance improvement program; NCLHDA = North Carolina Local Health Department Accreditation Program.
*P < .05; **P < .01; ***P < .001.

including surveillance systems, to address ongoing and emerging infectious diseases.14

Although we observed decreases in preparedness capacities in all 3 study groups, the declines were not as great and were less likely to be significant among LHDs in North Carolina. These results reinforce previous findings that an accreditation program can have an effect on preparedness domain scores within the context of a single state.5 Participating in some phase of an accreditation process (prepreparation, preparation for a site visit, accreditation, or preparation for reaccreditation) may allow LHDs to better retain capacities in spite of contextual effects. We hypothesize that the requirement that North Carolina LHDs undergo accreditation every 4 years facilitates maintenance of organizational capacity in a variety of program areas, including preparedness, because the LHDs must demonstrate conformity with consistent standards over time. We did not observe this protective effect of participation in other improvement programs, most likely a result of participation decay, meaning that sufficient time had elapsed between participation in a performance improvement program and completion of the PCAS. The performance improvement programs included in this analysis did not include ongoing monitoring, and standards for these programs were revised during the survey timeframe. In contrast, North Carolina LHDs had to maintain capacities to meet accreditation requirements for reaccreditation every 4 years.

The LHDs included in our sample reflect the characteristics of North Carolina LHDs because we chose the original national comparison LHDs to match those characteristics. Thus, our findings may not be directly generalizable to all LHDs nationally. Attrition over the survey years reduced the ability and statistical power to examine PCAS domain scores among the same LHDs, primarily in the national comparison group, which resulted in a pooled cross-sectional analysis rather than matched panel analyses. The PCAS is a self-report instrument completed entirely by the LHD. Capacities are not observed by trained observers, as has been recommended by some researchers.28 Although observational measurement may be more objective, it has considerable resource implications and could not be implemented with as many LHDs. The PCAS has been subjected to validity (factor analysis) and reliability (intraclass correlation coefficient) testing with strong results, including Cronbach α coefficients of at least 0.6 for all domains.6

Results from this study demonstrate decreases in LHD preparedness in most preparedness domains.
Although we cannot predict from these results that preparedness capacities will continue to decline, the consistency of the trend suggests that unless new investments are made in public health preparedness, we should expect continued decreases in preparedness capacities. Participation in the NCLHDA program appeared to have a potential protective effect against the impact of funding cuts for North Carolina LHDs. Because of the variability in domain scores over time, these findings support measuring preparedness across the variety of preparedness domains rather than relying on single-index measures. The recently released National Health Security Preparedness Index, developed in collaboration with more than 25 organizations, provides a single index measure as well as results across 5 domains. Initial results have demonstrated considerable score variation across these domains.29 Continued improvement in public health preparedness will require measurement and feedback across the domains of practice as well as renewed funding for these activities.

About the Authors

Mary V. Davis and Anna P. Schenck are with the North Carolina Institute for Public Health, Gillings School of Global Public Health, University of North Carolina at Chapel Hill (UNC-CH). Anna P. Schenck is also with the Public Health Leadership Program, Gillings School of Global Public Health, UNC-CH. Christine A. Bevc is with the Research and Evaluation Unit, North Carolina Institute for Public Health, Gillings School of Global Public Health, UNC-CH.

Correspondence should be sent to Mary V. Davis, DrPH, MSPH, Campus Box 7469, Chapel Hill, NC 27599 (e-mail: [email protected]). Reprints can be ordered at http://www.ajph.org by clicking the "Reprints" link.

This article was accepted June 16, 2014.

Contributors

All authors contributed to the conceptualization and writing of the article. C. A. Bevc conducted the data analysis with input from the other authors.

Acknowledgments and Disclosures

The research was carried out by the North Carolina Preparedness and Emergency Response Research Center, which is part of the UNC Center for Public Health Preparedness at the Gillings School of Global Public Health, UNC-CH, and was supported by the Centers for Disease Control and Prevention (grant 1Po1TP000296). We thank Elizabeth Mahanna, MPH, Michael Zelek, MPH, and Nadya Belenky, MSPH, for their work on this project. In addition, we thank Glen Mays, PhD, James Bellamy, PhD, John Wayne, PhD, and Cammi Marti, PhD, who contributed to this study at the University of Arkansas for Medical Sciences.

Note. The contents are solely the responsibility of the authors and do not necessarily represent the official views of the Centers for Disease Control and Prevention. Additional information can be found at http://cphp.sph.unc.edu/ncperrc.

Human Participant Protection

The research study received approval from the UNC Public Health Nursing institutional review board.

References

1. Levi J, Segal L, Lieberman DA, St Laurent R. Ready or Not? Protecting the Public's Health From Diseases, Disasters, and Bioterrorism. Washington, DC: Trust for America's Health; 2011.
2. Duncan WJ, Ginter PM, Rucks AC, Wingate MS, McCormick LC. Organizing emergency preparedness within United States public health departments. Public Health. 2007;121(4):241–250.
3. Asch SM, Stoto M, Mendes M, et al. A review of instruments assessing public health preparedness. Public Health Rep. 2005;120(5):532–542.
4. Centers for Disease Control and Prevention. Public health preparedness capabilities: national standards for state and local planning. 2011. Available at: http://www.cdc.gov/phpr/capabilities. Accessed December 13, 2012.
5. Davis MV, Bevc CA, Schenck AP. Effects of performance improvement programs on preparedness capacities. Public Health Rep. In press.
6. Davis MV, Mays GP, Bellamy J, Bevc C, Marti C. Improving public health preparedness capacity measurement: development of the local health department preparedness capacities assessment survey. Disaster Med Public Health Prep. 2013;7(6):578–584.
7. Scutchfield FD, Knight EA, Kelly A, et al. Local public health agency capacity and its relationship to public health system performance. J Public Health Manag Pract. 2004;10(3):204–215.
8. Nelson C, Lurie N, Wasserman J, Zakowski S. Conceptualizing and defining public health emergency preparedness. Am J Public Health. 2007;97(suppl 1):S9–S11.
9. Levi J, Segal L, Lieberman DA, May K, Lang A, St Laurent R. Ready or Not? Protecting the Public's Health From Diseases, Disasters, and Bioterrorism. Washington, DC: Trust for America's Health; 2012.
10. Patient Protection and Affordable Care Act, 42 USC §18001 (2010).
11. National Association of County and City Health Officials. Local Health Department Job Losses and Program Cuts: Findings From the January/February 2010 Survey. Washington, DC: National Association of County and City Health Officials; 2010. Research brief.
12. National Association of County and City Health Officials. 2013 National Profile of Local Health Departments. 2014. Available at: http://nacchoprofilestudy.org/wp-content/uploads/2014/02/2013_National_Profile021014.pdf. Accessed March 3, 2014.
13. Levi J, Segal L, Lieberman DA, St Laurent R. Outbreaks: Protecting Americans From Infectious Diseases. Washington, DC: Trust for America's Health; 2013.
14. Bhandari MW, Scutchfield FD, Charnigo R, Riddell MC, Mays GP. New data, same story? Revisiting studies on the relationship of local public health systems characteristics to public health performance. J Public Health Manag Pract. 2010;16(2):110–117.
15. Mays GP, McHugh MC, Shim K, et al. Institutional and economic determinants of public health system performance. Am J Public Health. 2006;96(3):523–531.
16. Mays GP, Smith S, Ingram R, Racster LJ, Lamberth CD, Lovely ES. Public health delivery systems: evidence, uncertainty, and emerging research needs. Am J Prev Med. 2009;36(3):256–265.
17. Erwin PC. The performance of local health departments: a review of the literature. J Public Health Manag Pract. 2008;14(2):E9–E18.
18. Nicola RM. Turning Point's national excellence collaboratives: assessing a new model for policy and system capacity development. J Public Health Manag Pract. 2005;11(2):101–108.
19. Corso LC, Lenaway D, Beitsch LM, Landrum LB, Deutsch H. The national public health performance standards: driving quality improvement in public health systems. J Public Health Manag Pract. 2010;16(1):19–23.
20. Gillen SM, McKeever J, Edwards KF, Thielen L. Promoting quality improvement and achieving measurable change: the lead states initiative. J Public Health Manag Pract. 2010;16(1):55–60.
21. Riley WJ, Bender K, Lownik E. Public health department accreditation implementation: transforming public health department performance. Am J Public Health. 2012;102(2):237–242.
22. Davis MV, Cannon MM, Stone DO, Wood BW, Reed J, Baker EL. Informing the national public health accreditation movement: lessons from North Carolina's accredited local health departments. Am J Public Health. 2011;101(9):1543–1548.
23. Public Health Accreditation Board. Public Health Accreditation Board Standards and Measures, Version 1.0. Alexandria, VA: Public Health Accreditation Board; 2011. Available at: http://www.phaboard.org. Accessed August 11, 2014.
24. National Association of County and City Health Officials. Project Public Health Ready overview. 2011. Available at: http://www.naccho.org/topics/emergency/PPHR/pphr-overview.cfm. Accessed May 31, 2013.
25. Institute of Medicine. Research Priorities in Emergency Preparedness and Response for Public Health Systems: A Letter Report. Washington, DC: National Academies Press; 2008.
26. Singleton CM, Corso L, Koester D, Carlson V, Bevc CA, Davis MV. Accreditation and emergency preparedness: linkages and opportunities for leveraging the connections. J Public Health Manag Pract. 2014;20(1):119–124.
27. National Association of County and City Health Officials. 2010 National Profile of Local Health Departments. 2010. Available at: http://nacchoprofilestudy.org. Accessed March 3, 2014.
28. Stoto M. Measuring and assessing public health emergency preparedness. J Public Health Manag Pract. 2013;19(suppl 2):S16–S21.
29. National Health Security Preparedness Index. Available at: http://www.nhspi.org. Accessed August 11, 2014.
