Perspectives
Commentary on: "Craniotomy for Glioma Resection: A Predictive Model" by Missios et al. World Neurosurg 83:957-964, 2015

Neurosurgical Quality Metrics: Seeking the Right Question
James Richard Bean

Quality of care measurement was a late addition to managed care. Because neurosurgery is a small specialty, its quality of care metrics have a low priority for Medicare and other health care payers. The frequency of complications of glioma surgery derived from a large administrative database can be used as a quality measure, but the results may be inaccurate and inadequate. Prospective voluntary outcome registries offer a better alternative.

Key words: Craniotomy; Glioma; National Inpatient Sample; Risk prediction

Abbreviations and Acronyms: AHRQ, Agency for Healthcare Research and Quality; CMS, Centers for Medicare and Medicaid Services; HCUP, Healthcare Cost and Utilization Project; HMO, health maintenance organization; PQRI, Physician Quality Reporting Initiative; PQRS, Physician Quality Reporting System

Baptist Health Medical Group, Lexington, Kentucky, USA. To whom correspondence should be addressed: James Richard Bean, M.D. [E-mail: [email protected]]

In the era of searching for alternative medical payment mechanisms (i.e., anything but fee-for-service), the concept of paying for quality rather than for volume of service has become a widely proclaimed solution for curbing health care inflation. First considered decades ago, improving quality while reducing costs was the basis of the federal Health Maintenance Organization (HMO) Act of 1973 (PL 93-222), a concept championed by Paul Ellwood, M.D. (11). The original idea was to offer prepaid health plans provided by closely coordinated medical groups that managed patient health care for a per capita monthly premium ("capitation") and provided all needed medical services. The objective was to reverse the financial incentive inherent in fee-for-service payment, discourage expensive specialty services, and encourage preventive and minimalistic care, allowing the HMO to profit more by doing less, or at least by providing less high-tech, expensive care. Measuring outcomes and quality of care was a key ingredient in Ellwood's vision of the future of health care (5).

The HMO concept was intended to be an alternative to private indemnity health insurance. Predictably, however, it was co-opted by indemnity insurers, who transformed the prepaid medical group notion into "provider" networks, negotiated fee reductions, preauthorization of medical decisions, and treatment denials, perverting the concept of physicians making cost-effective care decisions while bearing the financial risk of those decisions. The result was a competing alphabet soup of HMOs, preferred provider organizations (PPOs), exclusive provider organizations (EPOs), point-of-service (POS) plans, independent practice associations (IPAs), physician-hospital organizations (PHOs), and other variants, all seeking to adapt the HMO model to fee-for-service compensation arrangements. Measuring quality among the physicians in their networks was a late addition, intended to balance the restricted enrollee choice of physicians.

In 1989, Enthoven and Kronick published in the New England Journal of Medicine a two-part critique of U.S. health care entitled "A Consumer Choice Health Plan for the 1990s," which stated: "The health care economy in the United States is a paradox of excess and deprivation" and "We have spent little on evaluating medical technology, and there is much uncertainty about its efficacy. Much care appears to be of unproved value" (6). Not only were technologies unproven, but care was deemed positively harmful in many cases, as chronicled in the 2001 Institute of Medicine report Crossing the Quality Chasm, which noted that "in America, there is no guarantee that any individual will receive high quality care for any particular health problem. The health care industry is plagued with overutilization of services, underutilization of services and errors in health care practice" (9). The report proposed changes in the organization and provision of health care that emphasized studying priority chronic conditions, improving coordination and communication among the members of a health care team, and building better information systems. Quality was defined, ideally, as a team effort, not an individual report card. Neurosurgery was not a priority.

Medicare initiated the Physician Quality Reporting Initiative (PQRI) in 2007, an attempt to document quality of care and tie it to financing by rewarding or penalizing individual physicians in the Medicare Part B payment system. However, as Berenson and Kaye observed in the New England Journal of Medicine in 2013, "... the practical reality is that the Centers for Medicare and Medicaid Services (CMS), despite heroic efforts, cannot accurately measure any physician's overall value, now or in the foreseeable future." The reason neurosurgical clinical performance cannot be accurately assessed to determine an individual Medicare "value-based payment modifier" is that "... something is fundamentally wrong—physicians simply do not respect the measures, and for good reason. Physician Quality Reporting System (PQRS) measures reflect a vanishingly small part of professional activities in most clinical specialties" (2).

Neurosurgery in the United States is a small specialty in the larger scope of things. Neurosurgeons represent less than 1% of Medicare physicians, and neurosurgical charges represent only 0.5% of Medicare Part B spending (in 2012, neurosurgical payments were $562 million of a total of $99 billion in Part B spending) (4). Because performance measures are developed on the basis of volume of services, measures specific to neurosurgeons simply do not exist in the Medicare PQRS program. The dilemma within the Medicare program reflects other payers' attempts to grade neurosurgical performance: it is too small an issue to warrant much attention. Neurosurgeons, however, are concerned not with being left out of quality measures but with being gauged by quality measures that are either inadequate or inaccurate.

Bekelis et al. have described in their study "Craniotomy for glioma resection" a predictive model of complications after craniotomy for glioma, derived from a large administrative database of hospital admissions. Their rationale is that "several regulatory organizations are instituting quality metrics, on which surgeons will be held accountable in the near future. Although most of these efforts are targeting general medical conditions with high prevalence, they will likely be used without modification in subspecialty areas." The authors based this observation on an article by Fisher et al., which states that "performance measures for accountability should be based on data sources that can be used to support care improvement and should be aligned, to the extent possible, across providers and payers to minimize the burden and optimize the effect" (7): a rather opaque generalization, offering little practical guidance on data source selection.

The database used in the study is the National Inpatient Sample (8), a randomized community hospital discharge sample comprising 20% of all admissions to hospitals participating in the Healthcare Cost and Utilization Project (HCUP) of the Agency for Healthcare Research and Quality (AHRQ). Not all states are represented, although the sample covers 95% of the U.S. population. The data are based on ICD-9 codes over a 6-year period from 2005 to 2011. Surgical complications retrievable from the database were used to build a complication frequency profile. The authors used a sophisticated statistical analysis to find that advanced age, coronary artery disease, congestive heart failure, stroke, chronic renal failure, and coagulopathy all increased the risk of complications, among other conclusions. These findings are not new.

The authors reason that the size of the database confers greater accuracy on the numbers. However, ICD-9 procedure code 01.51, "excision of lesion of cerebral meninges," is included in the analysis, so the data necessarily mix meningioma surgery with glioma surgery. The outcomes selected in this study are not those most relevant to neurosurgery; they are simply all that the database contains. The quality of the data is also questionable, because the accuracy of hospital coding is quite variable (3), suffering from both omissions and incorrect entries, for which complex and sophisticated statistical analysis cannot compensate.
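To make the shape of such an analysis concrete, the sketch below shows, in schematic form, the kind of exercise the study describes: select discharges by ICD-9 procedure code, tabulate a complication frequency profile, and fit a multivariable logistic model on coded comorbidities. It is a hypothetical illustration only, not the authors' actual extraction or model; the file name, code set, and column names (proc_code, any_complication, age, cad, chf, stroke, chronic_renal_failure, coagulopathy) are assumptions made for the example.

```python
# Hypothetical sketch only: the file, code set, and column names are assumptions
# for illustration, not the authors' actual NIS extraction or model.
import pandas as pd
import statsmodels.formula.api as smf

# One row per discharge, with an ICD-9 procedure code and coded comorbidities.
df = pd.read_csv("nis_style_discharges.csv")

# Select craniotomy-for-tumor discharges by ICD-9 procedure code. Including
# 01.51 ("excision of lesion of cerebral meninges") sweeps meningioma cases
# into a nominally glioma-focused cohort, as noted above.
CRANIOTOMY_CODES = {"01.51", "01.59"}  # assumed code set for illustration
cohort = df[df["proc_code"].isin(CRANIOTOMY_CODES)].copy()

# Complication frequency profile: proportion of discharges with any coded complication.
print("Complication frequency:", cohort["any_complication"].mean())

# A multivariable logistic model of the general kind such studies report, using
# the comorbidities cited as risk factors (advanced age, CAD, CHF, stroke,
# chronic renal failure, coagulopathy).
model = smf.logit(
    "any_complication ~ age + cad + chf + stroke + chronic_renal_failure + coagulopathy",
    data=cohort,
).fit()
print(model.summary())
```

Nothing in such a pipeline corrects for an omitted or miscoded diagnosis, which is precisely the limitation at issue.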

In view of the numerical and financial insignificance of neurosurgery in the grand economic scheme, it is unlikely that Medicare, or any other payer, will grade quality in neurosurgical practice on the basis of general medical complications related to craniotomy for glioma. About 20,000 new gliomas are found in the United States each year among the 70,000 new intracranial tumors diagnosed annually (1). If evenly distributed among the approximately 4500 neurosurgeons in the United States, that would amount to 4 to 5 cases per surgeon per year, far too few to support any accurate analysis of practice performance. Yet glioma surgery is not evenly distributed, which further complicates the problem of finding a quality measurement that is widely applicable within the specialty.

The value of this report lies in its attempt to broaden the database and improve the accuracy of the frequency estimates for typical craniotomy complications, as well as to demonstrate the statistical methods for doing so. But it will not set a higher standard for payer quality assessment of neurosurgical practice. The more likely and useful source for detailed analysis of the comparative quality of selected services in neurosurgical practice would be a voluntary registry, such as the National Neurosurgical Quality and Outcomes Database, which is building outcome benchmarks for selected neurosurgical procedures entered by more than 60 participating practices nationwide (10). But the very fact of limited participation can easily introduce confounding bias into the performance benchmarks and limit their generalization.

Practice quality indicators are evolving. The measurement of surgical complications derived from an administrative database is but a primitive stage in the evolution of quality metrics. More important for the future is the development of prospective, relevant data entry as a routine practice function, including patient-derived outcome data, so that continuous quality improvement graduates from an interesting industrial concept outside the walls of health care institutions to become an integral, routine health care process, as habitual as Current Procedural Terminology coding. Pay for quality should reward the true item, not a convenient but irrelevant proxy.
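As a purely illustrative sketch of what routine, prospective data entry might look like at the practice level, the record below combines procedure details, coded comorbidities, and a patient-reported outcome captured at follow-up. The field names and example values are assumptions for illustration, not the N2QOD data dictionary.

```python
# Hypothetical sketch of a prospective registry record; the field names and
# example values are illustrative assumptions, not the N2QOD data dictionary.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class RegistryEntry:
    procedure: str                                   # e.g., "craniotomy for glioma resection"
    surgery_date: date
    age: int
    comorbidities: List[str] = field(default_factory=list)
    complication_within_30d: Optional[bool] = None   # completed prospectively at follow-up
    patient_reported_outcome: Optional[float] = None # e.g., quality-of-life score at 90 days

# Entered at the point of care, then completed at follow-up rather than
# reconstructed later from billing codes.
entry = RegistryEntry(
    procedure="craniotomy for glioma resection",
    surgery_date=date(2015, 6, 1),
    age=62,
    comorbidities=["coronary artery disease"],
)
entry.complication_within_30d = False
entry.patient_reported_outcome = 0.85
```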

REFERENCES

1. American Brain Tumor Association: Brain tumor statistics. Available at: http://www.abta.org/aboutus/news/brain-tumor-statistics/. Accessed May 11, 2015.

2. Berenson RA, Kaye DR: Grading a physician's value—the misapplication of performance measurement. N Engl J Med 369:2079-2081, 2013.

3. Berthelsen CL: Evaluation of coding data quality of the HCUP National Inpatient Sample. Top Health Inf Manage 21:10-23, 2000.

4. CMS.gov: Medicare Utilization for Part B, CY 2012 Expenditures and Services by Specialty. Available at: http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/MedicareFeeforSvcPartsAB/MedicareUtilizationforPartB.html. Accessed May 11, 2015.

5. Ellwood PM: Outcomes management: the Shattuck lecture. N Engl J Med 318:1549-1556, 1988.

6. Enthoven A, Kronick R: A consumer choice health plan for the 1990s: universal health insurance in a system designed to promote quality and economy. N Engl J Med 320:29-37, 1989.

7. Fisher ES, McClellan MB, Safran DG: Building the path to accountable care. N Engl J Med 365:2445-2447, 2011.

8. Healthcare Cost and Utilization Project (HCUP), Agency for Healthcare Research and Quality (AHRQ): Overview of the National Inpatient Sample. Available at: https://www.hcup-us.ahrq.gov/nisoverview.jsp. Accessed May 11, 2015.

9. Institute of Medicine: Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, D.C.: Institute of Medicine; 2001.

10. Neuropoint Alliance: National Neurosurgery Quality and Outcomes Database. Available at: http://www.neuropoint.org/NPA%20N2QOD.html. Accessed May 16, 2015.

11. U.S. Office of Technology Assessment: Managed Care and Competitive Health Care Markets: The Twin Cities Experience. Washington, D.C.: U.S. Office of Technology Assessment; 1994.

Citation: World Neurosurg. (2015) 84, 4:891-893. http://dx.doi.org/10.1016/j.wneu.2015.06.004
Journal homepage: www.WORLDNEUROSURGERY.org
Available online: www.sciencedirect.com
1878-8750/$ - see front matter © 2015 Elsevier Inc. All rights reserved.


