RESEARCH AND PRACTICE

Minimum Package of Public Health Services: The Adoption of Core Services in Local Public Health Agencies in Colorado Sarah Lampe, MPH, Adam Atherly, PhD, Lisa VanRaemdonck, MPH, MSW, Kathleen Matthews, MPH, and Julie Marshall, PhD

Between 1948 and 2007, Colorado law supported a legal distinction between 2 common types of local public health agencies (LPHAs): the “organized health department”1 and the “nursing service.”2 Over time, the different agency types led to different governance, structure, administration, financing, authority, services, and activities. Typically, nursing services served rural and frontier counties, and organized health departments served more populated, urban areas. Organized health departments served 85% of the state’s population and were required by law to provide basic public health services and to have independent boards of health.1 Nursing services typically provided only public health services that fell within the scope of public health nursing, along with services deemed necessary by the local board of health, which comprised 3 elected county commissioners.2

In 2008, Colorado passed the Public Health Act of 2008 (hereinafter “the Act”),3 which incorporated key portions of the Turning Point Model State Public Health Act,4 including that Model Act’s Core Services provision. The Act standardized the legal structure for LPHAs in Colorado, removing the distinction between the organized health department1 and the nursing service,2 and required the promulgation of a rule setting out a list of core services to be provided to all Colorado residents and visitors. Under the new statutes, all LPHAs were required to provide or assure the provision of a set of comprehensive public health services (core services), which were promulgated into rule in October 2011.5 The Colorado core services were developed by a taskforce of state and LPHA leaders and included stakeholder input. The core services were mapped to the Ten Essential Public Health Services6 and were intended to be tangible, prioritized guides for

Objectives. We examined the effect of a state law in Colorado that required local public health agencies to deliver a minimum package of public health services.

Methods. We used a longitudinal, pre–post study design, with baseline data collected in 2011 and follow-up data collected in 2013. We conducted means testing to analyze the change in service delivery and activities. We conducted linear regression to test for system structure effects on the implementation of core services.

Results. We observed statistically significant increases in several service areas within communicable disease, prevention and population health promotion, and environmental health. In addition to service and program areas, specific activities had significant increases. The significant activity increases were all in population- and systems-based services.

Conclusions. This project provided insight into the likely effect of national adoption of a minimum package as recommended by the Institute of Medicine. The implementation of a minimum package showed significant changes in service delivery, with specific service delivery measurement over a short period of time. Our research sets up a research framework to further explore core service delivery measure development. (Am J Public Health. 2015;105:S252–S259. doi:10.2105/AJPH.2014.302173)

LPHAs to follow. The core services, listed broadly, included administration and governance, assessment and planning, vital records, communicable diseases, prevention and population health promotion, environmental health, and emergency preparedness and response. The core services rule contains more detail on the work to be done within each of the core services listed.5

In 2012, not long after the promulgation of core services in Colorado, the Institute of Medicine (IOM) released “For the Public’s Health: Investing in a Healthier Future,”7 the third in a series of reports aimed at improvement of the public health system.8 This report called for the creation of a minimum package of public health services (hereinafter “minimum package”) to parallel the essential benefits package established in the Affordable Care Act.7,8 The IOM report heightened the need both to define a minimum package and to create valid measurement of these services.

S252 | Research and Practice | Peer Reviewed | Lampe et al.

This report also highlighted the need for LPHAs to withdraw from the provision of direct health care services to focus on population health services.7 Soon after the IOM called for a minimum package, the National Association of County and City Health Officials (NACCHO) echoed the call and released a statement of policy defining LPHA foundational capabilities and basic programs that should be mandatory services within a minimum package.9,10 The NACCHO statement of policy was the first response from a national public health organization stating what a minimum package should entail and how it should be defined. Several of the basic programs listed in the NACCHO statement are similar to the Colorado core services, including communicable disease control, environmental health, public health preparedness and response, vital statistics, community health assessment (similar to assessment and planning in Colorado), and

American Journal of Public Health | Supplement 2, 2015, Vol 105, No. S2


chronic disease (listed specifically in prevention and population health promotion in Colorado).9 Many of the foundational capabilities listed in the NACCHO statement can be found in the Colorado Administration and Governance core service. The similarities between the Colorado core services and the NACCHO statement of policy for a minimum package position Colorado as a case study for how the implementation of core service requirements can change public health service delivery.

Determining how public health agency structure affects performance was the top-ranked research priority identified by a consensus process in the mid-2000s11 and has been called for in the National Research Agenda for Public Health Services and Systems Research (PHSSR)12; it has not yet been fully answered and will be informed by this study. In addition, the measurable service delivery changes that follow implementation of a minimum package have not been recorded or understood in the context of the IOM recommendations. To date, PHSSR has been primarily descriptive or cross-sectional in nature. Current literature indicates that LPHAs can adapt,13 are influenced by various structural capacity components,14–17 and show wide variability in services, performance, and the interaction between structural capacity and processes.18–21 Research on the impact of modernizing laws such as the Act in Colorado has been specifically called for by experts22 and by the National Research Agenda for PHSSR.12

One method to guide system change toward the provision of a minimum package is codifying a state law or regulation that requires the provision or assurance of the minimum package. The long-term ability to measure and track service delivery change requires foundational work to determine appropriate, reliable, and valid measures.
Because of the policy work to develop a new regulation on a minimum package, the resulting publicity, and potential system change momentum, we may see some initial service delivery changes, and can expect that incremental change will continue. Our research presented in this study established a research framework to further explore service delivery measure development, helped to

improve our understanding of how public health practice is evolving in the midst of a changing field, and prepared Colorado for future tracking of service delivery change. Our research had 2 specific research questions. First, what is the effect of a new state-level regulation of core services for LPHAs on the adoption or assurance of those core services? Second, how do LPHA-level structural capacity factors affect the change in the level of adoption or assurance of core services by LPHAs? We outlined this longitudinal, pre–post study of core service adoption and the relationship of core service changes with structural capacity components, and filled in some of the research gaps noted in the literature.

METHODS

Using a longitudinal design, we took advantage of a natural experiment, the core services ruling, to measure change in the provision of core services and how system structures and population context were associated with that change. The Colorado Department of Public Health and Environment’s Office of Planning and Partnerships (CDPHE-OPP) led the process of a baseline capacity assessment in the spring and summer of 2011, before the conceptualization of our research project. This data collection, done in partnership with the Colorado Association of Local Public Health Officials (CALPHO), used a comprehensive data collection tool and in-depth interviews, and the results served as the state-required LPHA Annual Report for that year. The Annual Report is a requirement for receipt of state funds distributed by CDPHE-OPP. We used the baseline data collection tool as a guide during agency interviews to gather practical information about the breadth and scope of core service delivery before the core services rule. The content was based on similar service data gathering tools, such as the NACCHO Profile of Local Health Departments (NACCHO Profile),23 and tools in states such as Minnesota24 and Washington (Washington Public Health Improvement Partnership Activities and Services Work Group, 2009 Local Health Jurisdictions Survey, B. Bekemeier, PhD, MPH, personal communication, March 2010), as well as the list of core services, which were in draft


form at the time of data collection. Data were gathered on service delivery as well as staffing, funding, perceived capacity, and areas for potential improvement. We distributed the data collection tool to LPHAs 3 to 4 weeks before a scheduled site visit. All LPHA directors were asked to complete the instrument and to engage knowledgeable staff members when it was appropriate. We reviewed and discussed the completed tool with the LPHA director and key staff members during a 3- to 5-hour site visit conducted by trained project team members. Before the site visit, project team members reviewed the submission for gaps, inconsistencies, and areas of targeted inquiry during the visit. All 54 LPHAs in Colorado participated in the site visits and completed the data collection tool. We collected follow-up data 21 to 24 months after the initial data collection, in the spring and summer of 2013, through a strategic partnership with the CDPHE-OPP using their Annual Reporting mechanism. The follow-up instrument was in the field for a total of 4 months. All 54 LPHAs submitted data; however, baseline and follow-up data were not complete from 2 LPHAs, and 2 additional LPHAs submitted follow-up data after the deadline for analysis, which resulted in a total of 50 LPHAs in the final analyses. Not all Colorado LPHAs perform environmental health services at the local level, which resulted in a sample of 46 LPHAs for that core service.

Follow-Up Measure Determination

To simplify the data collection effort and to include fewer measures at follow-up than at baseline, we used a modified Delphi process to determine the highest-priority measures within select core services. We used the skills and expertise of Project Advisory Committee (PAC) members, along with the baseline data, to determine which core services we would study and which measures within these core services were considered highest priority. The PAC comprised 11 practice and research professionals, including representatives from Colorado LPHAs, CDPHE, and the Colorado School of Public Health; national experts familiar with service measurement; and Colorado practice professionals with expertise in the various core service areas. The PAC was



asked to look at data about service delivery and to use their expertise in specific public health areas to determine which measures represented the work being done within LPHAs across the state, as well as which measures were most relevant to the field and its changing demands at the time.

Among other measures, follow-up data collection included various service delivery or availability measures. These were structured as multiple-choice questions regarding what services were available or provided in each core service area and who delivered those services. We modeled these measures after the NACCHO Profile,23 modified to gather more detailed data about service delivery. These details included the core service (e.g., prevention and population health promotion), the program area (e.g., chronic disease), the service area (e.g., nutrition promotion), and the activities of the service (ranging, e.g., from educational materials to policy development and implementation). The program areas, service areas, and activities were specific enough to reflect the actual work done within each of the core services.

Once we determined the follow-up measures, 15 LPHAs pilot tested the remaining core service measures and the whole instrument. We adjusted the measures according to the pilot testing results, and the instrument was released for annual reporting in late April 2013. We tested the measures for convergent and divergent validity, and we tested reliability using test–retest reliability. Several of the service delivery and availability measures were both reliable and valid, although this depended on the service area; the measures were not consistently reliable or valid across all service areas. The specific test results are reported separately (article in preparation).
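As a rough illustration of the test–retest check described above, a Pearson correlation between two administrations of the same measure can serve as a simple reliability coefficient. This is a minimal sketch only; the function and the agency scores below are hypothetical, not the study’s actual instrument data:

```python
import math

def pearson_r(first, second):
    """Pearson correlation between two administrations of the same measure.

    Values near 1 suggest good test-retest reliability; values near 0
    suggest the measure is unstable between administrations.
    """
    n = len(first)
    m1, m2 = sum(first) / n, sum(second) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(first, second))
    sd1 = math.sqrt(sum((a - m1) ** 2 for a in first))
    sd2 = math.sqrt(sum((b - m2) ** 2 for b in second))
    return cov / (sd1 * sd2)

# Hypothetical activity counts reported by 6 agencies at test and retest
test = [4, 7, 5, 6, 3, 8]
retest = [4, 6, 5, 6, 4, 8]
r = pearson_r(test, retest)
```

In practice a cutoff (e.g., r above roughly 0.7) would be chosen in advance to decide which measures count as reliable.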

Analysis

We categorized changes in the delivery of core services by the core service itself (e.g., prevention and population health promotion), by program areas (e.g., maternal and child health), by service area (e.g., tobacco), or by activities that cut across program areas within core services (e.g., inspections within environmental health could cover both food safety and

sanitation). We calculated the mean number of activities done by core service, within service and program areas, and by activity, both pre- and posttest. Within each core service, program area, and service area, we compared the mean number of activities at baseline and follow-up using t tests in Stata (version 11; StataCorp, College Station, TX). We used ordinary least-squares regression to examine LPHA-level structural capacity factors related to change in core service adoption. Changes in the number of activities across 3 broad core services (communicable disease, prevention and population health promotion, and environmental health) were the dependent variables; the independent variables were the following LPHA-level structural capacity variables: local board of health structure, structure of the agency before the Act, agency jurisdiction population size, and percentage of the population living at or below the poverty line. We summarized all covariates by the level of the dependent variables, using means and SDs for continuous variables and proportions for categorical variables.
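The two analyses described above can be sketched in miniature. The paired t statistic below follows the standard formula for pre–post change scores, and the slope function is ordinary least squares with a single predictor; the agency counts and population figures are hypothetical stand-ins, not the study data (the authors used Stata):

```python
import math

def paired_t(pre, post):
    """Mean change and paired t statistic (df = n - 1) for pre-post scores."""
    n = len(pre)
    diffs = [b - a for a, b in zip(pre, post)]
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d, mean_d / math.sqrt(var_d / n)

def ols_slope(x, y):
    """OLS slope of y on a single predictor x (normal-equation form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Hypothetical communicable disease activity counts for 5 agencies
pre = [24, 27, 20, 30, 26]
post = [26, 28, 23, 30, 27]
mean_change, t = paired_t(pre, post)

# Hypothetical regression of activity change on population (in thousands)
population = [5, 12, 40, 150, 600]
change = [p2 - p1 for p1, p2 in zip(pre, post)]
slope = ols_slope(population, change)
```

The full models in the article include several covariates at once, which requires a multivariable solver rather than the single-predictor formula shown here.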

RESULTS

Overall, we found increases in the implementation of 3 of the core services (communicable disease, prevention and population health promotion, and environmental health) over the 2 years of the project. However, the only core service to reach statistical significance was communicable disease (Table 1). When we examined the general core services in more detail, we found several program areas, service areas, and activities with a significant increase from baseline to follow-up (Table 2). Examples and definitions of program areas, service areas, and activities are provided in Table 3.

The PAC and research team determined that the following core services did not lend themselves to pre–post measurement: administration and governance, vital statistics, and assessment and planning. The potential changes observed in these areas would likely not be caused by the core services rule, but rather by other external forces. For example, the Act required that each LPHA complete a community health assessment and community health


improvement plan every 5 years. This requirement preceded the core services rule and the baseline data collection. Several LPHAs had already begun the community health assessment process at baseline, which was captured in the results. In addition, emergency preparedness and response did not lend itself to pre–post measurement because all LPHAs in Colorado had recently been required to become recognized as Project Public Health Ready25 as a contractual obligation to CDPHE.

Communicable Disease

We found that the only overall core service that had a significant change was communicable disease, with a mean increase of 1.62 activities per LPHA (P = .03). Within communicable disease, the program area of population prevention and education had a significant change, with a mean increase of 1.24 activities per LPHA (P = .02). Within the program area of population prevention and education, the influenza service area also had a significant change, with a mean increase of 0.86 activities per LPHA (P = .005). Within the program area of prevention of disease transmission, the service area with a significant change in activity was tuberculosis, with an increase of 0.14 activities per LPHA (P = .03). Within the activities of communicable disease, we found that system-based services (Table 2) had a significant change across the service areas, with a mean increase of 0.82 service areas per LPHA (P = .01).

Prevention and Population Health Promotion

Within prevention and population health promotion, the only program area with a significant change in activity was chronic disease, which had a mean increase of 2.28 activities per LPHA (P = .04). Specifically, within chronic disease, the only service area with a significant change in activity was nutrition promotion, which had a mean increase of 0.72 activities per LPHA (P = .01). We found that 2 of the activities had significant changes across all service areas in prevention and population health promotion. These activities were implementation of



TABLE 1—Change in Core Service Delivery by Core Service, Program Area, and Service Area: Colorado, 2011–2013

| Outcome | Potential Range of Activities^a | No. | Pretest Mean (SD) | Posttest Mean (SD) | Change | Test Statistic | P |
|---|---|---|---|---|---|---|---|
| Core service: communicable disease, total | 0–34 | 50 | 26.44 (4.45) | 28.06 (4.54) | 1.62 | 2.27 | .03 |
| Program area: immunizations | 0–10 | 50 | 8.58 (1.74) | 9.02 (1.42) | 0.44 | 1.95 | .06 |
| Service area: childhood immunizations | 0–5 | 50 | 4.54 (0.79) | 4.70 (0.61) | 0.16 | 1.38 | .17 |
| Service area: adult immunizations | 0–5 | 50 | 4.04 (1.16) | 4.32 (0.96) | 0.28 | 1.79 | .08 |
| Program area: prevention of disease transmission | 0–10 | 50 | 9.14 (1.53) | 9.08 (1.81) | –0.06 | 0.17 | .87 |
| Service area: tuberculosis | 0–5 | 50 | 4.86 (0.45) | 5.00 (0.0) | 0.14 | 2.19 | .03 |
| Service area: HIV | 0–5 | 50 | 4.28 (1.34) | 4.08 (1.81) | –0.20 | 0.59 | .56 |
| Program area: population prevention and education | 0–14 | 50 | 8.72 (3.16) | 9.96 (3.05) | 1.24 | 2.45 | .02 |
| Service area: influenza | 0–7 | 50 | 4.78 (1.59) | 5.64 (1.48) | 0.86 | 2.94 | .005 |
| Service area: sexually transmitted infection | 0–7 | 50 | 3.94 (1.95) | 4.32 (2.33) | 0.38 | 1.09 | .28 |
| Core service: prevention and health promotion, total | 0–56 | 50 | 34.72 (13.19) | 37.14 (11.66) | 2.42 | 1.33 | .19 |
| Program area: chronic disease | 0–28 | 50 | 17.98 (7.16) | 20.26 (6.24) | 2.28 | 2.16 | .04 |
| Service area: tobacco | 0–7 | 50 | 4.30 (2.37) | 4.66 (2.17) | 0.36 | 1.02 | .31 |
| Service area: physical activity | 0–7 | 50 | 4.20 (2.35) | 4.70 (2.33) | 0.50 | 1.23 | .23 |
| Service area: mental health | 0–7 | 50 | 4.58 (2.28) | 5.28 (2.29) | 0.70 | 1.73 | .09 |
| Service area: nutrition | 0–7 | 50 | 4.90 (2.01) | 5.62 (1.65) | 0.72 | 2.66 | .01 |
| Program area: maternal and child health | 0–21 | 50 | 12.76 (5.71) | 12.90 (5.61) | 0.14 | 0.14 | .89 |
| Service area: prenatal health | 0–7 | 50 | 4.38 (2.23) | 4.64 (2.24) | 0.26 | 0.63 | .53 |
| Service area: unintended pregnancy prevention | 0–7 | 50 | 3.82 (2.44) | 4.06 (2.51) | 0.24 | 0.57 | .57 |
| Service area: oral health | 0–7 | 50 | 4.56 (2.19) | 4.20 (2.39) | –0.36 | 0.88 | .38 |
| Program area: injury prevention (motor vehicle service area) | 0–7 | 50 | 3.30 (2.57) | 3.98 (2.68) | 0.68 | 1.32 | .19 |
| Core service: environmental health, total | 0–15 | 46 | 11.26 (3.57) | 12.09 (2.83) | 0.83 | 1.95 | .06 |
| Program area: sanitation | 0–9 | 46 | 5.98 (2.53) | 6.59 (2.28) | 0.61 | 1.76 | .08 |
| Service area: childcare facilities | 0–3 | 46 | 2.72 (0.66) | 2.72 (0.50) | 0.00 | 0.00 | >.99 |
| Service area: summer camps | 0–3 | 46 | 1.65 (1.35) | 2.13 (1.28) | 0.48 | 2.57 | .01 |
| Service area: spa, swimming, natural bathing | 0–3 | 46 | 1.61 (1.18) | 1.74 (1.25) | 0.13 | 0.68 | .5 |
| Program area: food safety | 0–6 | 46 | 5.28 (1.39) | 5.50 (1.01) | 0.22 | 1.46 | .15 |
| Service area: retail establishments | 0–3 | 46 | 2.87 (0.34) | 2.96 (0.21) | 0.09 | 1.43 | .16 |
| Service area: special events | 0–3 | 46 | 2.41 (1.20) | 2.54 (0.89) | 0.13 | 1.00 | .32 |

^a Potential range of activities refers to the possible number of activities that a local public health agency could implement within each core service, program area, or service area.

culturally or linguistically tailored educational programs, which had an increase of 1.08 service areas per LPHA (P = .02), and policy development and implementation initiatives, which had an increase of 1.40 service areas per LPHA (P = .02).

Environmental Health

Within the core service of environmental health, none of the program areas had a significant change. However, within the program area of sanitation, the service area of summer camps had a significant change, with a mean increase of 0.48 activities per LPHA (P = .01).

Within the activities of environmental health, we found 1 activity that had a significant change across the service areas. This activity was outreach or education, which had a mean increase of 0.72 service areas per LPHA (P < .001).

Structural Factors

To see whether changes in core services were related to LPHA structural factors, we ran linear regressions with the change in the number of activities across the 3 core service areas (prevention and population health promotion, environmental health, and communicable disease) as dependent variables (Table 4).

Whether the agency was formerly a nursing service was positively related to the number of activity changes in the prevention and population health promotion core service (B = 13.9; P = .11), although the change marginally lacked statistical significance. In the environmental health model, the factor closest to statistical significance was population, which was positive (B = 0.00148; P = .11). The coefficient on population was also positive in the other 2 models. For the communicable disease model,



TABLE 2—Change in Core Service Delivery by Activity: Colorado, 2011–2013

| Outcome/Activity | Potential Range of Service Areas^a | No. | Pretest Mean (SD) | Posttest Mean (SD) | Change | Test Statistic | P |
|---|---|---|---|---|---|---|---|
| Core service: communicable disease | 0–34 | 50 | 26.44 (4.45) | 28.06 (4.54) | 1.62 | 2.27 | .03 |
| System-based services^b | 0–10 | 50 | 6.96 (1.80) | 7.78 (1.73) | 0.82 | 2.66 | .01 |
| Population-based services^c | 0–8 | 50 | 7.10 (0.93) | 7.36 (0.92) | 0.26 | 1.57 | .12 |
| Targeted population services^d | 0–12 | 50 | 8.74 (2.35) | 9.34 (2.34) | 0.60 | 1.84 | .07 |
| Effected population services^e | 0–4 | 50 | 3.64 (0.83) | 3.58 (0.78) | –0.06 | 0.35 | .73 |
| Core service: prevention and health promotion | 0–56 | 50 | 34.72 (13.19) | 37.14 (11.66) | 2.42 | 1.33 | .19 |
| Development or dissemination of educational materials | 0–8 | 50 | 7.22 (1.07) | 7.10 (1.18) | –0.12 | 0.57 | .57 |
| Development or dissemination of educational media | 0–8 | 50 | 5.00 (2.67) | 4.82 (2.49) | –0.18 | 0.41 | .68 |
| Development or dissemination of culturally and linguistically tailored materials | 0–8 | 50 | 5.40 (2.60) | 5.82 (2.26) | 0.42 | 0.98 | .33 |
| Implementation of culturally or linguistically tailored educational programs | 0–8 | 50 | 3.62 (2.83) | 4.70 (2.39) | 1.08 | 2.46 | .02 |
| Implementation of educational or training programs or groups | 0–8 | 50 | 5.28 (1.98) | 5.56 (2.02) | 0.28 | 0.73 | .47 |
| Implementation of community development activities | 0–8 | 50 | 4.36 (2.60) | 4.58 (2.40) | 0.22 | 0.55 | .59 |
| Policy development and implementation initiatives | 0–8 | 50 | 3.16 (2.74) | 4.56 (2.30) | 1.40 | 3.42 | .001 |
| Core service: environmental health | 0–15 | 46 | 11.26 (3.57) | 12.09 (2.83) | 0.83 | 1.95 | .06 |
| Inspections | 0–5 | 46 | 3.83 (1.22) | 3.96 (1.05) | 0.13 | 0.75 | .46 |
| Complaint investigation or response | 0–5 | 46 | 4.04 (1.23) | 4.02 (1.09) | –0.02 | 0.14 | .89 |
| Outreach or education | 0–5 | 46 | 3.39 (1.47) | 4.11 (1.08) | 0.72 | 3.76 | <.001 |

TABLE 4—Linear Regression of Change in Core Service Activities on Structural Capacity Factors: Colorado, 2011–2013

| Variable | B | SE | t (95% CI) | P |
|---|---|---|---|---|
| Dependent variable: change in communicable disease services | | | | |
| Former nursing service^a | –4.145 | 3.0942 | –1.34 (–10.38, 2.09) | .187 |
| Elected board of health^b | –5.119 | 2.6482 | –1.93 (–10.46, 0.22) | .06 |
| Turnover^c | –3.821 | 1.4780 | –2.59 (–6.80, –0.84) | .013 |
| Population (in thousands)^d | 0.001 | 6.34E-06 | 1.28 (–4.66E-06, 2.09E-05) | .207 |
| Percent poverty^e | 0.078 | 0.1170 | 0.67 (–0.16, 0.31) | .507 |
| Constant | 5.946 | 3.4449 | 1.73 (–1.00, 12.89) | .091 |
| Dependent variable: change in population and health promotion services | | | | |
| Former nursing service^a | 13.905 | 8.5952 | 1.62 (–3.42, 31.23) | .113 |
| Elected board of health^b | 9.463 | 7.3562 | 1.29 (–5.36, 24.29) | .205 |
| Turnover^c | –2.065 | 4.1055 | –0.50 (–10.34, 6.21) | .617 |
| Population (in thousands)^d | 0.002 | 0.0176 | 0.08 (–3.4E-05, 3.7E-05) | .933 |
| Percent poverty^e | –0.094 | 0.3249 | –0.29 (–0.75, 0.56) | .774 |
| Constant | –9.067 | 9.5694 | –0.95 (–28.35, 10.21) | .349 |
| Dependent variable: change in environmental health services | | | | |
| Former nursing service^a | –0.039 | 1.8851 | –0.02 (–3.85, 3.77) | .983 |
| Elected board of health^b | –2.235 | 1.6162 | –1.38 (–5.50, 1.03) | .174 |
| Turnover^c | 0.131 | 0.9779 | 0.13 (–1.85, 2.11) | .894 |
| Population (in thousands)^d | 0.007 | 0.0039 | 1.82 (–7.90E-07, 1.49E-05) | .077 |
| Percent poverty^e | 0.054 | 0.0722 | 0.75 (–0.09, 0.20) | .457 |
| Constant | 0.249 | 2.1113 | 0.12 (–4.02, 4.52) | .907 |

Note. n = 50 for all 3 models.
^a Pre-Act local public health agency structure: former nursing service, n = 37; former organized health department, n = 13.
^b Board of health structure: independent local board of health, n = 17; elected officials only on local board of health, n = 33.
^c Executive director turnover: new local public health agency executive director from 2011 to 2013, n = 17; same executive director from 2011 to 2013, n = 33.
^d Population: continuous.
^e Percent poverty: continuous.

the previous law focused the work of these agencies on public health nursing and more individual, direct services. The IOM report called for LPHAs to move toward more population health activities,7 and our study demonstrated a significant increase in the number of LPHAs implementing such activities. This was observed in the significant increases, across all measured core services, in system-based services within communicable disease; policy development and implementation initiatives within prevention and population health promotion; and outreach and education within environmental health. Although there was a significant increase in many of the population health

activities, not all activities that could be described as population health saw significant increases; examples include population-based services within communicable disease and community development activities within prevention and population health promotion. The marginal, nonsignificant increases in these activities may indicate that although LPHAs are shifting toward population health activities, the shift in some work is gradual and will likely take more time and additional measurement over time. Although there was an increase in population health activities, no simultaneous, significant decrease in any activity occurred across the core services. This led to the conclusion that LPHAs were not ending


certain activities as quickly as they were adding other activities. This could be related to continuing community need for more direct services, still evolving roles in clinical care related to the Affordable Care Act,8 and the amount of time and effort needed to transition current work to other community partners, presuming there was still community need. Knowing that LPHA resources were not generally increasing at this time,26 it seemed untenable for LPHAs to continue adding population health services without ending other activities. Future research with follow-up data collection at later points in time would describe the longer term service delivery changes.

Limitations

As previously mentioned, the 2-year interval between baseline and follow-up data collection limited the time LPHAs had to make significant changes related to the core services. Although significant changes were seen, they were not uniform across the LPHAs, and only significant increases were demonstrated. With more time between baseline and follow-up measures, more uniform changes, in both increases and decreases, might be seen. Furthermore, the rulemaking for the core services rule did not assume that all LPHAs would instantaneously be in compliance. It will take time, resources, and targeted efforts for all LPHAs to come into compliance and for system-wide change in service delivery to appear. This compounded the timing limitation of the pre–post data collection; a third data collection point would be more instructive.

The data we collected at follow-up were limited by the baseline data collection. The baseline data collection tool was not meant to be a research instrument; rather, it was used for practical purposes during rulemaking preparations. Although this limited what longitudinal data we could collect from LPHAs, the follow-up data collection instrument was tested for validity and reliability. In addition, we were aware that some LPHAs added services or performed activities after the Act's codification and before baseline measurement, in anticipation of the development of core



services. Examples included performing community health assessments and adding environmental health services. These changes were not captured in our data collection. In addition, given the large number of statistical tests run, some of the significant results could be false positives arising by chance alone.

The data we collected in this project reflect only the availability of public health services in local public health jurisdictions. They do not speak to the quantity, quality, or equitable reach of services, or to the components of capacity that could influence their sustainability. Measurement of the quality of public health services, as called for by the IOM27 and the Public Health Quality Forum of the US Department of Health and Human Services,28 should be further explored in the context of a minimum package, ensuring that the services delivered are both available and of high enough quality to adequately serve populations.
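The multiple-testing concern noted above can be illustrated with the Benjamini–Hochberg procedure, one standard way to control the false discovery rate across a family of tests. This is an illustrative sketch only; the P values below are hypothetical, not the article's full set of results:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of hypotheses rejected under Benjamini-Hochberg FDR control.

    Sorts the P values, finds the largest rank k such that
    p_(k) <= alpha * k / m, and rejects the k smallest P values.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose P value clears the BH threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank
    return sorted(order[:k])

# Hypothetical family of 7 pre-post test P values
pvals = [0.001, 0.005, 0.01, 0.02, 0.03, 0.20, 0.50]
rejected = benjamini_hochberg(pvals)
```

With only 7 tests in this toy family, every P value at or below .03 survives the correction, but as the number of tests in the family grows, the thresholds tighten, which is the sense in which some individually significant increases could be chance findings.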

Conclusions

In concert with any state-level or national agreement on the content of the minimum package, it will be important to develop measures that can accurately and precisely describe whether, and to what extent, the services are being provided. Our work established a research framework on which to continue exploring service delivery measurement. Because the Colorado core services are connected to the Ten Essential Public Health Services, and the baseline service delivery measures are built on the NACCHO Profile activities, the approach is replicable in other state contexts. The findings in Colorado might inform other states interested in measuring service delivery change.

About the Authors

Sarah Lampe and Lisa VanRaemdonck are with the Colorado Public Health Practice Based Research Network, Colorado Association of Local Public Health Officials, Denver. Adam Atherly is with the Department of Health Systems, Management and Policy, Colorado School of Public Health, Aurora. Kathleen Matthews is with the Office of Planning and Partnerships, Colorado Department of Public Health and Environment, Denver. Julie Marshall is with the Department of Epidemiology, Colorado School of Public Health, Aurora.

Correspondence should be sent to Sarah Lampe, MPH, 1385 S. Colorado Blvd, Suite 622, Denver, CO 80222 (e-mail: [email protected]). Reprints can be ordered at http://www.ajph.org by clicking the "Reprints" link.

This article was accepted June 26, 2014.

Contributors

S. Lampe managed the research project, including instrument and measurement development, data collection, data cleaning, and analyses, and took the lead in writing the article. A. Atherly was the research principal investigator for the project, assisting in measurement and instrument development, managing and performing the analyses, and editing and writing the article. L. VanRaemdonck was the practice principal investigator for the project, assisting in measurement and instrument development, leading project communication and translation into public health practice, and editing and writing the article. K. Matthews was a practice consultant for the project, providing legal and practice interpretation of the data, assisting in instrument development and data collection, and editing the article. J. Marshall was a research consultant for the project, contributing measurement development, research methods expertise, interpretation of the analyses, and editing of the article.

Acknowledgments

Support for this project was provided by a grant from the Robert Wood Johnson Foundation.

We would like to acknowledge the time and efforts of the local public health agencies across the state of Colorado.

Human Participant Protection

This project was deemed exempt by the Colorado multiple institutional review board on February 24, 2012.

References

1. Colorado Revised Statutes. Part 5, County and district health departments. CRS 25-1-501 et seq (repealed and reenacted with amendments 2008).

2. Colorado Revised Statutes. Part 6, Local board of health. CRS 25-1-601 to 25-1-667 (repealed 2008).

3. General Assembly of the State of Colorado. SB 08-194, Colorado Public Health Act of 2008. Denver, CO: Colorado General Assembly; 2008.

4. Hodge JG, Gostin LO, Gebbie K, Erickson DL. The Turning Point Model State Public Health Act. J Law Med Ethics. 2006;34(1):77-84.

5. Colorado State Board of Health. 6 CCR 1014-7, Core Public Health Services. Denver, CO: Colorado Department of Public Health and Environment; 2011.

6. Public Health Functions Steering Committee. Public Health in America. Washington, DC: US Public Health Service; 1994.

7. National Research Council. For the Public's Health: Investing in a Healthier Future. Washington, DC: National Academies Press; 2012.

8. Patient Protection and Affordable Care Act (PPACA) §1301(a). Pub L No. 111-148, HR 3590, 111th Congress, March 23, 2010.

9. National Association of County and City Health Officials. Statement of policy: minimum package of public health services. Proposed December 18, 2012; approved December 19, 2012. Available at: http://www.naccho.org/advocacy/positions/upload/12-18-Minimum-Package-of-Benefits.pdf. Accessed March 10, 2014.

10. Teutsch SM, Baciu AB, Mays GP, Getzen TE, Hansen MM, Geller AB. Wiser investment for a healthier future. J Public Health Manag Pract. 2012;18(4):295-298.

11. Lenaway D, Halverson P, Sotnikov S, Tilson H, Corso L, Millington W. Public health systems research: setting a national agenda. Am J Public Health. 2006;96(3):410-413.

12. Consortium from Altarum Institute; Centers for Disease Control and Prevention; Robert Wood Johnson Foundation; National Coordinating Center for Public Health Services and Systems Research. A national agenda for public health services and systems. Am J Prev Med. 2012;42(5 suppl 1):S72-S78.

13. Mays GP, Scutchfield FD, Bhandari MW, Smith SA. Understanding the organization of public health delivery systems: an empirical typology. Milbank Q. 2010;88(1):81-111.

14. Mays GP, McHugh MC, Shim K, et al. Institutional and economic determinants of public health system performance. Am J Public Health. 2006;96(3):523-531.

15. Erwin PC. The performance of local health departments: a review of the literature. J Public Health Manag Pract. 2008;14(2):E9-E18.

16. Baum NM, DesRoches C, Campbell EG, Goold SD. Resource allocation in public health practice: a national survey of local public health officials. J Public Health Manag Pract. 2011;17(3):265-274.

17. Hyde J, Arsenault L, Waggett J, et al. Structural and organizational characteristics associated with performance of essential public health services in small jurisdictions: findings from a statewide study in Massachusetts. J Public Health Manag Pract. 2012;18(6):585-594.

18. Merrill J, Meier BM, Keeling J, Jia H, Gebbie KM. Examination of the relationship between public health statute modernization and local public health system performance. J Public Health Manag Pract. 2009;15(4):292-298.

19. Mays GP, Smith SA. Geographic variation in public health spending: correlates and consequences. Health Serv Res. 2009;44(5 pt 2):1796-1817.

20. Luo H, Sotnikov S, Shah G, Galuska DA, Zhang X. Variation in delivery of the 10 essential public health services by local health departments for obesity control in 2005 and 2008. J Public Health Manag Pract. 2013;19(1):53-61.

21. Mays GP. Untangling desirable and undesirable variation in public health practice: accreditation and research working together. J Public Health Manag Pract. 2014;20(1):149-151.

22. Mays GP, Smith SA, Ingram RC, Racster LJ, Lamberth CD, Lovely ES. Public health delivery systems: evidence, uncertainty, and emerging research needs. Am J Prev Med. 2009;36(3):256-265.

23. National Association of County and City Health Officials. 2010 National Profile of Local Health Departments. Available at: http://www.naccho.org/topics/infrastructure/profile/resources/2010report/index.cfm. Accessed March 10, 2014.

24. Minnesota Department of Health. Local Public Health Planning and Performance Measurement Reporting System (PPMRS). Available at: http://www.health.state.mn.us/ppmrs/index.html. Accessed March 10, 2014.

25. National Association of County and City Health Officials. Project Public Health Ready. Available at: http://www.naccho.org/topics/emergency/PPHR/pphr-overview.cfm. Accessed March 10, 2014.

26. National Association of County and City Health Officials. Local Health Department Job Losses and Program Cuts: Findings From the 2013 Profile Study. Published July 2013. Available at: http://www.naccho.org/topics/infrastructure/lhdbudget/upload/Survey-Findings-Brief-8-13-13-3.pdf. Accessed March 10, 2014.

27. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.

28. Office of the Assistant Secretary for Health, US Department of Health and Human Services. Priority Areas for the Improvement of Quality in Public Health: The Public Health Quality Forum. Washington, DC: US Department of Health and Human Services; 2010.
Supplement 2, 2015, Vol 105, No. S2 | American Journal of Public Health

