COMMENTARY

What Big Data Can and Cannot Tell Us About Emergency Department Quality for Urolithiasis

In this issue of Academic Emergency Medicine, a paper by Scales et al.1 explores Healthcare Cost and Utilization Project (HCUP) data from California patients with urolithiasis treated in 2008 and 2009 and assesses the risk factors for repeat emergency department (ED) visits, including patient-level factors, ED processes of care, and area health resources. This study illustrates what has become increasingly common in health services research over the past decade: using “big data” to explore factors related to utilization and quality. The question for this and similar studies is: how can interesting associations in big data lead to actionable clinical solutions or guide decision-making for policymakers?

One advantage of big data is the ability to identify associations across time. California HCUP data can track the same patients across all EDs within the state over time. This is an advantage over the view available to most clinicians, who usually have access only to information from their own EDs or from a limited number of EDs that share information. Because patients frequently present to different EDs with similar complaints, it is difficult to gain a “full picture” of a patient’s emergency care through a single electronic record. Other advantages of big data are the ability to identify associations across a wide geographic area and to accurately estimate important care patterns. In this case, knowing that one in nine patients with an initial ED visit for urolithiasis returned to an ED within 30 days is helpful information. For clinicians, this underscores the need to provide effective communication at ED discharge. In addition, the analysis emphasizes the need for careful follow-up plans for patients who might encounter barriers to outpatient care, such as limited Medicaid insurance acceptance or a low supply of community urologists. Medicaid beneficiaries encounter more barriers to timely outpatient care and have higher associated ED utilization regardless of condition.2 In this study, the risk of an ED revisit was 50% higher in Medicaid beneficiaries.

The next issue that deserves close examination is the primary outcome of this study: ED revisits. While California HCUP can conveniently measure ED revisits, studies that have closely examined ED revisits as a marker of quality have found that they are not a reliable indicator of poor care.3 In one study of 72-hour ED return admissions, only one in 20 visits showed any deviation from the standard of care or any change in outcome as a result of poor-quality care during the first ED visit.4 ED revisits matter because, in a minority of revisit admissions, clinically important conditions such as appendicitis, acute myocardial infarction, fracture, or subarachnoid hemorrhage were initially missed. Therefore, ED revisits highlight patients whose index care may require close scrutiny through chart review and whose outcomes should be tracked. This illustrates a limitation of big data: it may be useful for case-finding, or as a signal that warrants further investigation, but not necessarily for making definitive determinations about care quality.

Another challenge of big data is that only certain interactions with the health care system are recorded. The California HCUP includes data only from ED visits and inpatient admissions. Other encounters required for a complete picture of care patterns, such as outpatient clinic visits, urgent care visits, or procedures performed outside of hospitals, are not tracked. For example, it is difficult to fully understand the association between the local density of urologists and ED revisits without more information about outpatient care. In this study, more community urologists were associated with fewer ED revisits, but also with more hospitalizations and urgent inpatient procedures. In regions in the highest quartile of urologist density, the likelihood of hospitalization and/or intervention was 77% higher than in the lowest quartile. Of note, this finding is similar to those of other studies demonstrating higher rates of cardiology and neurosurgical procedures in areas with a higher density of specialists.5 Therefore, along with the ED revisit itself, the need for hospitalization, particularly for an invasive procedure, should be carefully examined to assess indications.

Another issue with big data is the limited detail available in its variables. In big data we can analyze such factors as date of service, diagnoses, disposition, test use, and sometimes charges or costs of care. While big data can tell us that the patient returned to the ED, it cannot tell us why.

There are many reasons why urolithiasis patients may return to the ED, the most common probably being the natural course of the condition: pain tends to recur as the stone moves down the ureter. However, other conditions or more serious complications, such as renal failure or an infected obstructed stone, may also cause patients to return. In addition, many other factors influencing revisits are unavailable, such as the medications prescribed at discharge (type, dose, and duration) and the size of the stone. Nor can big data tell us the details or results of imaging or laboratory tests. These information gaps weaken the claim of a linkage between the “performance of a complete blood count (CBC)” during the initial visit and a decreased likelihood of a subsequent revisit. Although the finding is described as “exploratory,” the authors surmise that the lack of a CBC may indicate an inadequate assessment for infection and thus suggest that the increased rate of revisits reflects missed diagnoses of infected stones. Yet this linkage seems highly improbable given the myriad unmeasured elements of standard ED care that screen for an infected stone, including measurement of temperature and urinalysis.

This example of linking a process to an outcome is emblematic of the broad desire to conclude that associations found in big data reflect differences in care quality. Similar studies aiming to link processes to outcomes have proliferated in recent years, and some of those linkages have led to the development of successful quality metrics.6 In other cases, however, the association observed in big data can be misleading. In the case of antibiotic timing in pneumonia, for example, the association in big data led to the development of a quality metric that had several unintended consequences and was ultimately removed from public measurement programs.7

In the end, while big data can provide important clues about ED care for urolithiasis, this analysis does not supplant the need for robust study designs, such as randomized clinical trials, to compare care processes and measure clinically important outcomes. Through no fault of the authors, many of the important details regarding processes of care and clinical outcomes were simply lacking from the data set.

However, as technology improves and the information in electronic medical records expands, so will big data and what we can extract from it. As more variables become available, more important associations will be revealed, leading to actionable clinical solutions that improve patient care. But until then, understanding the limitations of big data is vitally important for researchers, clinicians, and policymakers.

Andrew C. Meltzer, MD, MS ([email protected])
Jesse M. Pines, MD, MBA, MSCE
Associate Editor
Academic Emergency Medicine
Department of Emergency Medicine
George Washington University
Washington, DC

Supervising Editor: David C. Cone, MD.

The authors have no relevant financial information or potential conflicts of interest to disclose. A related article appears on page 468.

References

1. Scales CD, Lin L, Saigal CS, et al. Emergency department revisits for patients with kidney stones in California. Acad Emerg Med 2015;22:468–74.
2. Cheung PT, Wiler JL, Lowe RA, Ginde AA. National study of barriers to timely primary care and emergency department utilization among Medicaid beneficiaries. Ann Emerg Med 2012;60:4–10.
3. Pham JC, Kirsch TD, Hill PM, DeRuggerio K, Hoffmann B. Seventy-two-hour returns may not be a good indicator of safety in the emergency department: a national study. Acad Emerg Med 2011;18:390–7.
4. Abualenain J, Frohna WJ, Smith M, et al. The prevalence of quality issues and adverse outcomes among 72-hour return admissions in the emergency department. J Emerg Med 2013;45:281–8.
5. Fisher ES, Welch HG. Avoiding the unintended consequences of growth in medical care: how might more be worse? JAMA 1999;281:446–53.
6. Schuur JD, Hsia RY, Burstin H, Schull MJ, Pines JM. Quality measurement in the emergency department: past and future. Health Aff (Millwood) 2013;32:2129–38.
7. Pines JM, Hollander JE, Lee H, Everett WW, Uscher-Pines L, Metlay JP. Emergency department operational changes in response to pay-for-performance and antibiotic timing in pneumonia. Acad Emerg Med 2007;14:545–8.
