Current Practices in Clinical Analytics: A Hospital Survey Report

Dana M. Womack, MS, RN1; Rosemary Kennedy, PhD, MBA, RN, FAAN2; Bill Bria, MD3

1Inova Health System, Falls Church, VA, USA; 2National Quality Forum, Washington, DC, USA; 3Shriners Hospitals for Children, Tampa, FL, USA

Abstract
Clinical analytics must become a pervasive activity in healthcare settings to achieve the global vision for timely, effective, equitable, and excellent care. Global adoption of the Electronic Health Record (EHR) has increased the volume of data available for performance measurement and expanded organizational capacity for continuous quality improvement. However, EHR adoption does not automatically result in optimal use of clinical data for performance improvement. To understand organizational factors related to the use of data for clinical analytics, a survey of hospitals and hospital-based clinics was conducted. The survey revealed sub-optimal use of data captured as a byproduct of care delivery, a need for tools and methodologies to assist with data analytics, and a need for disciplined organizational structure and strategy. Informatics nurse professionals are well-positioned to lead analytical efforts and to serve as a catalyst in their facility's transformation into a data-driven organization.

Introduction
The ability to capture and analyze data about business practices, processes, and outcomes has transformed many industries. To this end, health policy in many countries emphasizes public dissemination of data on clinical performance as a method to improve healthcare quality globally1. Healthcare is undergoing a transition from an era in which patient outcome data were locked in manual forms and databases to an era in which aggregate clinical data are available for knowledge discovery, clinical decisions are supported by synthesized information, and healthcare processes and outcomes are more transparent. While a universal definition of clinical data analytics has not yet been established, within the industry the term refers to the capture and use of discrete clinical data to identify and measure quality, patient safety, or service line efficiencies and improvements2. The volume of clinical data captured and stored in electronic format is steadily increasing as a growing number of EHRs are deployed. While efforts to "make data electronic" are accelerating, advances in clinical data analytics for quality improvement are not keeping pace. Many healthcare facilities with EHRs remain at the "data" and "information" levels of clinical information processing when they have the potential to operate at the "knowledge" and "actionable insight" levels. A review of the literature reveals that little is known about current organizational structures and efforts related to data analytics1. While implementation of the EHR is a foundational component for quality improvement, the journey to high-quality, efficient systems involves more than technology. It requires a multi-faceted strategy that includes leadership, culture change, and inter-professional involvement3. Progress toward the six dimensions of quality identified by the Institute of Medicine (safety, effectiveness, patient-centeredness, timeliness, efficiency, and equity4) is slowly being made, but the task of improving individuals' care is far from complete5. Although the implementation of health information systems has been a significant focus of nursing informatics for the past decade, now is the time for informatics professionals to collaborate with their technology, quality, and evidence-based practice colleagues. This collaboration is essential to using the EHR to improve quality.
Purpose
A key characteristic of high-performing organizations is the ability of every employee to articulate the meaning of quality in their organization and the impact they each have on quality6. Recognizing that data analytics and the application of findings to care improvement are key to improving safety and quality of care, the authors conducted an informal survey of analytics infrastructure and processes in United States (US) hospitals in the summer of 2011 to provide a current snapshot of clinical data analysis practices in support of clinical care improvement. The purpose of the survey was to better understand current practices, capabilities, and challenges related to clinical data analysis in US acute care settings, in order to identify opportunities and actions that support pervasive clinical analytics.

Methods
Survey questions were developed by a small group of health information technology professionals. The survey solicited information regarding organizational safety and quality target areas, clinical data analysis team structure, perceived value of clinical data analysis, the role of leadership, and opportunities and barriers related to quality improvement cycles. Face validity was established through peer review, which resulted in refinement of the survey. The resulting web-based survey was distributed using a waterfall method through four prominent health informatics ListServs. Employment by a US hospital or hospital-based clinic was a criterion for participation in the survey. Analysis of survey findings was accomplished through a combination of descriptive statistics and content analysis. Descriptive statistics were used to summarize multiple-choice questions, and content analysis was used to identify themes within free-text responses.

Sample Demographics
A total of 87 complete responses were received from a sample of undetermined size during a three-week period in June-July 2010. While it offers convenient access to targeted professionals, a limitation of waterfall distribution via informatics ListServs is that the exact sample size cannot be determined, nor can participant selection be randomized. Survey participants represented a variety of organization types and sizes, including 35 Community Hospitals, 29 Academic Medical Centers, 5 Children's Hospitals, 5 Federal/VA Hospitals, 4 General Medical/Surgical Facilities, and 9 classified as "other" organizational type. Multiple hospital bed sizes were represented, including 0-99 beds (4%), 100-199 beds (13%), 200-299 beds (14%), 300-399 beds (11%), 400-499 beds (14%), and 500+ beds (44%). Sixty-two percent of facilities were teaching facilities, 55.2% were part of multi-hospital systems, and 26.4% were critical access hospitals. Participants represented a wide variety of informatics-related roles; nursing informatics (Figure 1) was the largest professional group represented among survey participants.
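To illustrate the analytic approach described in the Methods (descriptive statistics for multiple-choice items and content analysis of coded free-text themes), the following Python sketch shows one way such a summary could be computed. The field names, response values, and themes below are hypothetical and are not drawn from the survey data.

# Illustrative sketch (not the authors' analysis code): summarizing
# multiple-choice survey items with descriptive statistics and tallying
# manually coded free-text themes. Field names and values are hypothetical.
from collections import Counter

responses = [
    {"org_type": "Community Hospital", "has_cdw": True,
     "themes": ["culture change", "lack of standardized tools"]},
    {"org_type": "Academic Medical Center", "has_cdw": True,
     "themes": ["data interpretation skills"]},
    {"org_type": "Children's Hospital", "has_cdw": False,
     "themes": ["lack of standardized tools"]},
]
n = len(responses)

# Descriptive statistics for a multiple-choice item: frequency and percentage.
org_counts = Counter(r["org_type"] for r in responses)
for org, count in org_counts.items():
    print(f"{org}: {count} ({100 * count / n:.1f}%)")

# Proportion answering "yes" to a yes/no item (e.g., data warehouse in place).
pct_cdw = 100 * sum(r["has_cdw"] for r in responses) / n
print(f"Clinical data warehouse in place: {pct_cdw:.1f}%")

# Content analysis step: tally coded themes across free-text responses.
theme_counts = Counter(t for r in responses for t in r["themes"])
print(theme_counts.most_common())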

Figure 1. Survey Participant Roles (Counts, 87 Total)

Survey Results
Prioritization of focus areas for new improvement initiatives is not occurring consistently in US hospitals: less than 45% of respondents indicated that their organizations have a formal, repeatable process for prioritizing focus areas for safety or quality improvement initiatives. Indicative of a culture that focuses primarily on "hot spots," survey respondents indicated that Regulatory/Payor Compliance, Sentinel Events, and Opportunities to Reduce Cost are the primary drivers of safety and quality improvement initiatives. Traditional technology-focused roles are more prevalent than roles more deeply focused on clinical analytics. Over 75% of participants report having a Chief Information Officer (CIO) within their organization, while less than 44% have a Chief Quality Officer (CQO). Nearly 60% have a Chief Medical Information Officer (CMIO), and nearly 13% have a Chief Nursing Information Officer (CNIO).

While the performance of clinical analytics requires specialized skill sets, only 62% of facilities report the existence of a permanent clinical data analysis team in support of safety and quality initiatives (Figure 2). All participants felt that clinical data analysis was highly important to quality improvement efforts, but they were consistently only moderately satisfied with current analytical support. A "report production" mindset may also be present, as 69% of respondents stated that the same persons or teams who perform clinical data analysis in support of care improvement also produce the organization's mandatory quality reports.

Figure 2. Clinical Data Analysis Structures

While 72% of respondents report having had a clinical data warehouse in place for two or more years, improvement projects often struggle to complete the full traditional improvement lifecycle of problem identification, baseline measurement, improvement plan development, plan execution, and re-measurement and evaluation. Nearly half (44.7%) of respondents report that only 50-75% of their quality improvement projects go through the full improvement lifecycle, and an additional 30% report that less than half of their improvement projects do so. Re-measurement and evaluation, an analytically driven phase of the care improvement cycle, was identified as the most problematic (Figure 3).
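As a concrete illustration of tracking projects through this lifecycle, the sketch below shows how an organization might record which phases each project has completed and compute the share that go full cycle. The project names and phase assignments are hypothetical, not survey data.

# Illustrative sketch: tracking care improvement projects through the
# lifecycle phases named above and computing full-cycle completion.
from collections import Counter

LIFECYCLE = [
    "problem identification",
    "baseline measurement",
    "improvement plan development",
    "plan execution",
    "re-measurement and evaluation",
]

# Hypothetical projects and the phases each has completed.
projects = {
    "CAUTI reduction": set(LIFECYCLE),
    "Falls prevention": {"problem identification", "baseline measurement",
                         "improvement plan development", "plan execution"},
    "Readmission review": {"problem identification", "baseline measurement"},
}

full_cycle = [name for name, phases in projects.items()
              if all(p in phases for p in LIFECYCLE)]
print(f"{len(full_cycle)} of {len(projects)} projects completed the full cycle")

# Identify the first incomplete phase per project to show where projects stall
# (the survey found re-measurement and evaluation most problematic).
stall_points = Counter()
for phases in projects.values():
    for phase in LIFECYCLE:
        if phase not in phases:
            stall_points[phase] += 1
            break
print(stall_points.most_common())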

Figure 3. Problematic Phases of the Care Improvement Cycle

Hospitals may not be taking full advantage of their electronic health information systems. While nearly all (>97%) facilities report laboratory and pharmacy systems in place, higher-level functionality to support decision-making based on the data these systems generate lags notably. Only half reported a rules engine or clinical decision support capability in place, and 21% do not yet have a clinical data repository (CDR) or data warehouse.
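To make the gap concrete, the following minimal sketch shows the kind of rule a rules engine or clinical decision support capability might evaluate against data already flowing from laboratory and pharmacy systems into a clinical data repository. The drug name, threshold, and field names are entirely hypothetical and are not clinical guidance.

# Minimal, hypothetical clinical decision support rule evaluated against
# lab and medication data aggregated in a clinical data repository.
from dataclasses import dataclass

@dataclass
class LabResult:
    patient_id: str
    test: str
    value: float
    unit: str

@dataclass
class MedicationOrder:
    patient_id: str
    drug: str

def renal_dosing_alerts(labs, orders, creatinine_threshold=1.5):
    """Flag patients with an elevated serum creatinine who have an active
    order for a hypothetical renally cleared drug ("drug_x")."""
    elevated = {l.patient_id for l in labs
                if l.test == "serum_creatinine" and l.value > creatinine_threshold}
    return [o for o in orders if o.patient_id in elevated and o.drug == "drug_x"]

labs = [LabResult("p001", "serum_creatinine", 2.1, "mg/dL"),
        LabResult("p002", "serum_creatinine", 0.9, "mg/dL")]
orders = [MedicationOrder("p001", "drug_x"), MedicationOrder("p002", "drug_x")]

for alert in renal_dosing_alerts(labs, orders):
    print(f"Review dosing of {alert.drug} for patient {alert.patient_id}")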

Challenges to achieving the vision of advanced clinical analytics are numerous. Themes from content analysis of narrative responses fell into two categories: Cultural & Educational challenges, and Technical challenges.

Cultural & Educational challenges include:
• Lack of organizational alignment and/or strategy for data analysis
• Culture change (a different way of thinking is required)
• People do not know how to interpret or use the data

Technical challenges include:
• Lack of standardized methods and electronic tools
• Inconsistent database structures and lack of interoperability

Despite these challenges, clinical analytics are being employed in clinical settings. Participants described a variety of clinical targets (Figure 4) as the focus of past clinical care improvement initiatives.

Figure 4. Clinical Care Improvement Targets

Discussion
Although many clinicians believe that they are practicing up-to-date, evidence-based medicine, wide practice variation suggests that clinical practice often falls short of the best available evidence8. The healthcare information age, now experiencing its greatest growth phase in US history, has the potential to eliminate this "evidence gap" through dissemination of best known clinical practices to clinicians via clinical decision support tools delivered through the electronic health record and other information tools. More importantly, we are nearing a clinical informatics "tipping point" where clinical effectiveness research may be accomplished as a byproduct of daily clinical care. While data collected via an EHR are foundational to many improvement efforts, initiating a clinical outcomes and effectiveness analysis effort need not wait until all seven stages of the Healthcare Information and Management Systems Society (HIMSS) Electronic Medical Record (EMR) Adoption Model9 are in place. Survey respondents represented a broad cross-section of implementation stages, indicating that data analysis should begin from the outset of EHR implementation.

While the maturity of clinical analytics structures and culture varies across facilities, senior leadership teams are aligning behind the need for a data analysis and reporting environment around the EHR. Adoption of methodologies for prioritizing initiatives and tracking them through the full care improvement cycle is imperative. Survey results highlight that even facilities with mature data warehouse environments are challenged to serve the full quality cycle and continuous quality improvement needs of their organizations. The full quality cycle, and especially the re-measurement phase, provides the key link between simply answering a narrow question and identifying additional opportunities for safety and quality improvement. Challenges related to prioritizing improvement initiatives, taking projects full cycle, and dissatisfaction with current analytical support suggest a significant opportunity for improved organizational alignment within applied healthcare informatics in healthcare facilities today. Creation of new knowledge through analysis of EHR data, and use of the EHR as a quality improvement tool, requires seamless integration of historically disparate domains and organizational departments focused on information technology, clinical informatics, quality, and evidence-based medicine. Clear collaboration strategies are required to advance EHR utilization to a more sophisticated level of continual learning, in which care delivery data are used to increase knowledge about how to improve care. This new paradigm requires EHR implementation infrastructure to include clinical data analysis capabilities and a disciplined approach to quality improvement. A vision for knowledge creation through aggregate analysis of EHR data should influence every discussion and decision from the outset of an EHR implementation. The study indicates the importance of contextual factors, such as leadership and culture, that influence clinical data analytics. This is consistent with literature showing the multi-dimensional impact of cultural influences operating across multiple levels of an organization1. Changing the existing culture from measuring EHR success in terms of "go live" functionality to measuring it in terms of improvement in patient outcomes will require significant investment.

Conclusion
Implementation of health information systems has been a significant focus over the past decade, but now is the time for informatics professionals to collaborate with their technology, quality, and evidence-based practice colleagues to help their organizations move from the "data" and "information" levels of information processing to the "knowledge" and "actionable insight" levels. This collaboration is essential to the effective use of the EHR as a quality improvement tool and to the creation of a learning healthcare environment. Additional research regarding the organizational structures best suited to the execution of clinical effectiveness research as a byproduct of daily clinical care is needed to move facilities beyond EHR implementation and meaningful use compliance toward transformation into effective learning organizations that consistently produce improved patient outcomes.

References

1. Davies H. Public release of performance data and quality improvement: internal responses to external data by US health care providers. Quality in Health Care. 2001;10:104-110.
2. Healthcare Information and Management Systems Society. Clinical Analytics White Paper: Can Organizations Maximize Clinical Data? 2010 [cited 2011 Sept. 12]. Available from: http://www.himssanalytics.org/docs/clinical_analytics.pdf.
3. Institute of Medicine. Creating a Business Case for Quality Improvement Research: Expert Views, Workshop Summary. Washington: National Academies Press; 2008.
4. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the Twenty-first Century. Washington: National Academies Press; 2001.
5. Berwick D. The Triple Aim: Care, Health and Cost. Health Affairs. 2008;27:759-770.
6. Hammarstedt R, Bulger D. Performance Improvement: A "left brain meets right brain" approach. Healthcare Financial Management. 2006;12:100-106.
7. Berwick D. Preparing Nurses for Participation in and Leadership of Continual Improvement. Journal of Nursing Education. 2011;6:322-327.
8. Sackett DL, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn't. BMJ. 1996;312:71-72.
9. Davis MW. The seven stages of EMR adoption. Majority of hospitals are in stage 3 and rising. Healthcare Executive. 2010;3:18-19.
