Dollars and Sense. Bruce Reider. Am J Sports Med. 2015;43(6):1313. DOI: 10.1177/0363546515588309. Available at: http://ajs.sagepub.com/content/43/6/1313

Published by SAGE Publications on behalf of the American Orthopaedic Society for Sports Medicine. Version of Record: May 29, 2015.

Editorial

Dollars and Sense

Keywords: cost-effectiveness analysis; cost utility analysis; QALY (quality-adjusted life year); Quality of Health Economic Studies (QHES) Questionnaire; Panel on Cost-Effectiveness in Health and Medicine

The American Journal of Sports Medicine, Vol. 43, No. 6. DOI: 10.1177/0363546515588309. © 2015 American Orthopaedic Society for Sports Medicine.

The preferences and conditions of individual patients may point to decisions different from those supported by a reference case CEA.
—Russell et al21(p1175)

The orthopaedic sports medicine literature is in a state of continuous evolution. Since I assumed the helm of The American Journal of Sports Medicine in 2002, a number of new topics and study types have made their appearance in these pages. When something different arrives in my virtual inbox, it might represent the temporary dalliance of a small group of researchers or the first trickle of an impending torrent of submissions from the orthopaedic sports medicine community. Previously rare topics, including hip arthroscopy, concussion, and biologic healing enhancement, have become commonplace; study types such as systematic reviews, multicenter cohorts, and registry-based analyses, which were once uncommon in our field, now often appear in our table of contents.

In this issue of AJSM, Nwachukwu and colleagues17 share their systematic review of "Cost-Effectiveness Analyses in Orthopaedic Sports Medicine," a study type that, as they document, has not been a frequent visitor to the pages of our journal. Interest in economic studies may be growing in our community, so this seems like a good time to take stock of what is available so far. To a certain degree, economic reality varies with locality, so these New York–based authors decided to confine their survey to studies conducted in the United States. This may make their results less generalizable, but certainly not irrelevant, to other parts of the world.

Historically, Americans have been reluctant to allow cost-effectiveness analyses to exert a major influence on their collective health care decision making, perhaps owing to an inherent suspicion of the motivation behind such studies15,21 and a distaste for externally imposed limits.15 Because Nwachukwu et al wished to use guidelines issued in 1996 to grade the quality of the articles they reviewed, they chose to limit their timeline to studies published in 1998 or later. An extensive 3-tiered search yielded only 12 suitable articles. The topics covered roughly parallel the cumulative distribution of studies in the orthopaedic sports medicine literature: 5 were concerned with ACL reconstruction, 3 with rotator cuff surgery, 2 with autologous chondrocyte implantation, and 1 each with hip arthroscopy and shoulder instability. While the number of studies was limited, the authors found that they were generally high in quality and that the operative interventions studied were shown to be highly cost-effective. Because the precise comparisons made in the studies varied widely, I refer you to the complete text for their detailed findings.

Delving further into the reason that Nwachukwu and colleagues began their search with 1998 publications brings up a number of aspects of cost-effectiveness analysis (CEA) worth emphasizing. In 1993, the US Public Health Service convened a panel of experts in the fields of CEA, clinical medicine, ethics, and health outcomes measurement to create guidelines for the conduct and reporting of cost-effectiveness research. Their primary motivation was a perception that widespread disparity in the nature of existing cost-effectiveness analyses severely limited the utility of these studies for formulating public policy. After 2½ years of discussions, the Panel on Cost-Effectiveness in Health and Medicine published its recommendations in a series of 3 articles in 1996.21,24,26

The Panel began by defining a standard set of parameters for performing a cost-effectiveness analysis that it termed the "reference case." The Panel felt that including such a standardized example in every CEA would eventually allow comparison across different studies. It declared that the point of view, or "perspective," of each study should be that of society as a whole. In other words, not just the direct monetary costs of a treatment would be considered, but all costs, harms, and benefits to all relevant parties. "Only the society perspective," the Panel said, "never counts as a gain what is another party's loss."21(p1174) Such a perspective was deemed most appropriate for the diverse US system of health care; in other circumstances, a single private insurer or a government agency tasked with providing health care on a fixed budget might prefer a different perspective that takes into account only the costs for which the entity would be directly responsible.

Once the perspective of a CEA is selected, calculating the monetary costs of the interventions under examination is relatively straightforward, although computing secondary costs such as loss of personal productivity is a bit trickier, requiring assumptions and generalizations. Measuring the efficacy of an intervention, on the other hand, is arguably even more difficult, especially for orthopaedic conditions, where the goal is usually to improve the quality of life rather than forestall mortality. To do so, the Panel noted, one must first assign a relative degree of benefit to the intervention being considered. The Panel acknowledged that there is no gold standard of benefit against which all others are judged, since opinions and preferences vary among individuals. Nevertheless, one must select a standard of preference in order to accomplish a CEA. Although a group of physician experts might



establish a preference scale, it has been shown that physicians' perception of patient benefit correlates poorly with the patients' perspective.25 A more relevant approach is to ask "the people"—but which people? One must choose between patients who actually have the condition under consideration and a representative sample of the community at large: individuals who are currently healthy but might experience the condition at some future time. The opinions of these 2 groups might differ widely. A choice had to be made, so the Panel endorsed the opinions of the community at large, stating that this was more in line with the goal of considering the society perspective. Several established health preference scales are available to represent "the people," including the Health Utilities Index, the Quality of Well-Being Scale, the SF-6D, and the EuroQol.11,16,21,23 While the Panel recommended a community preference sample for its "reference case," it did not dismiss the importance of the patients' perspective and suggested that this might be the basis for an alternative calculation as part of a sensitivity analysis.

Costs are easily summed up in a standard currency, such as dollars or euros. In the most common type of CEA, the cost-utility analysis, the "currency" chosen to compute treatment benefit is the quality-adjusted life year, or QALY.2,16 The Panel recognized that monetizing the quality of life in this way is a mechanistic exercise that can never account for issues of fairness, feasibility, or societal values. Because a QALY is defined as 1 year of perfect health, it can theoretically be achieved by restoring a dying person to perfect health for 1 year or by obtaining a slight improvement in well-being equal to 1/100 of that for 100 people over the same period of time.
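The QALY arithmetic described above can be sketched in a few lines. This is a minimal illustration only; the utility values and cost below are hypothetical, not taken from any study discussed here.

```python
# Minimal QALY arithmetic (hypothetical utility values and cost, for illustration only).
def qalys_gained(utility_before: float, utility_after: float, years: float) -> float:
    """QALYs gained = change in health utility (0 = death, 1 = perfect health) x years."""
    return (utility_after - utility_before) * years

# Restoring one dying person to perfect health for 1 year yields 1.0 QALY...
one_patient = qalys_gained(0.0, 1.0, 1.0)
# ...which the metric treats as equivalent to 100 people each gaining
# 1/100 of that improvement over the same year.
hundred_patients = 100 * qalys_gained(0.70, 0.71, 1.0)

cost = 12_000.0  # hypothetical program cost, in dollars
print(one_patient, round(hundred_patients, 2), cost / one_patient)  # 1.0 1.0 12000.0
```

The equivalence between the two scenarios is exactly the property the Panel flagged as ethically fraught: the metric itself cannot distinguish them.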
Acknowledging the ethical dilemmas possible from an overly mathematical application of QALYs in promulgating public health decisions, the Panel concluded that it is seldom appropriate to apply CEA mechanically. Instead, CEAs should be used as an aid to decision makers, who must weigh the information they provide in the context of other social values. "By definition, there is no set of preferences that is correct for all people. QALYs . . . do not as currently defined, and perhaps never can, perfectly reflect everything about health that matters to people."21(p1175,1176)

The well-known case of the Oregon Health Plan illustrates the hazards of a strict CEA approach to the allocation of health care resources.18 The 1987 cause célèbre of a 7-year-old boy who died in the midst of a campaign to collect donations for his bone marrow transplant, a procedure not covered by the state's Medicaid insurance program, focused Oregon's attention on the need to establish a better approach for spending the available health care dollars.20 A decision was made to construct an official list that prioritized medical conditions and their associated treatments. The intention was that the available funds would determine where a line would be drawn on the list, separating the procedures that would be covered from those that would not. The line could be moved periodically, depending upon the state's health care budget, and all needy persons could receive the treatments that resided above the telltale line. The initial list, which followed a fairly literal interpretation of CEA, was widely ridiculed and quickly withdrawn. One of its more infamous anomalies was the ranking of caps for exposed dental pulp above surgery for ectopic pregnancy.6,12 Critics argued over whether the nonsensical order of priorities resulted from the CEA principle itself or merely from an extremely faulty application of the technique.5,6,12 In any case, a revised version was finally approved that more accurately reflected societal values and relied less on strict CEA.6,12,15 Ultimately, as need outstripped resources, Oregon was forced to establish a lottery for Medicaid eligibility.1,8,18

The Oregon experience reveals the perils of using CEA to establish treatment priorities among widely disparate conditions.13 It is less controversial and more fruitful to use this technique to compare alternatives, whether 2 different treatments for the same condition or 1 treatment with the expected outcome in the absence of specific treatment.9,10,22 For this to be possible, however, good data on the outcomes of each alternative must be available, covering an adequate length of follow-up and tracking complications and secondary procedures. This is probably one reason that relatively few CEAs are available in orthopaedic sports medicine. For example, when Brophy and colleagues3 analyzed the costs of converting from single-bundle to double-bundle ACL reconstruction in 2009, they declined to perform a formal CEA, citing the lack of adequate outcomes data. CEAs may obtain these data from a single randomized trial or from a systematic review of multiple studies, which will probably produce a more generalizable result. Clearly, the selection of evidence must be done impartially, or the results will easily be biased.7,14 Researchers who set out to "prove" that their favorite treatment is also the most cost-effective are indeed likely to achieve their goal by stacking the evidentiary deck.
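The two-alternative comparison described above is conventionally summarized as an incremental cost-effectiveness ratio (ICER): the extra cost of one option divided by the extra benefit it buys. A minimal sketch, with entirely hypothetical costs and QALY estimates:

```python
# Incremental cost-effectiveness ratio (ICER) for comparing 2 alternatives.
# All numbers are hypothetical placeholders, not data from any cited study.
def icer(cost_a: float, qalys_a: float, cost_b: float, qalys_b: float) -> float:
    """Extra dollars spent per extra QALY gained by choosing A over B."""
    return (cost_a - cost_b) / (qalys_a - qalys_b)

# Hypothetical: operative treatment vs nonoperative management of the same condition.
operative = {"cost": 15_000.0, "qalys": 6.0}
nonoperative = {"cost": 4_000.0, "qalys": 5.5}

ratio = icer(operative["cost"], operative["qalys"],
             nonoperative["cost"], nonoperative["qalys"])
print(f"${ratio:,.0f} per QALY gained")  # (15000 - 4000) / (6.0 - 5.5) -> $22,000 per QALY gained
```

Note that the ratio says nothing by itself about whether $22,000 per QALY is acceptable; that judgment belongs to the decision makers the Panel had in mind.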
Having established the rationale for CEA, its limitations, and the concept of the reference case in the first article,21 the Panel proceeded to enumerate its detailed recommendations for conducting such an analysis in the second.26 These included the items to be considered when calculating both costs and outcomes, the methods for quantifying both the costs and the health consequences of an intervention, discounting procedures to adjust for the effect of the passage of time on costs and health outcomes, data sources and methods for estimating the effectiveness of a treatment, and the techniques for handling the uncertainties that inevitably arise during such an exercise. This last feature is particularly important, since CEA depends upon numerous assumptions that might not be universally accepted and clinical evidence that may not be entirely generalizable. Thus, after authors clearly state the data and assumptions used in calculating the reference case, the Panel recommended that they conduct multiple sensitivity analyses to illustrate how the result would vary with changes in important parameters.

In the third and final installment of its opus, the Panel outlined its recommendations for reporting CEA.24 The detailed checklist includes 38 items, organized into 4 major categories: Framework, which might also have been labeled Introduction; Data and Methods; Results; and Discussion. This document will undoubtedly prove useful to authors contemplating their own CEA.
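Two of the mechanics the Panel recommends, discounting future outcomes to present value and one-way sensitivity analysis, can be sketched together. All inputs below are invented placeholders, not figures from the Panel's reports or from any study discussed here.

```python
# Sketch of 2 Panel recommendations: discounting a future stream to present value,
# and one-way sensitivity analyses over key parameters. Inputs are hypothetical.
def present_value(amount_per_year: float, years: int, rate: float) -> float:
    """Discount a constant annual stream (of dollars or QALYs) to present value."""
    return sum(amount_per_year / (1 + rate) ** t for t in range(1, years + 1))

upfront_cost = 12_000.0  # hypothetical one-time cost of an intervention

# Vary the assumed annual QALY gain and the discount rate; report cost per QALY.
for annual_qaly_gain in (0.05, 0.10, 0.20):
    for rate in (0.0, 0.03, 0.05):
        qalys = present_value(annual_qaly_gain, 10, rate)
        print(f"gain={annual_qaly_gain:.2f}/yr, rate={rate:.0%}: "
              f"${upfront_cost / qalys:,.0f} per QALY")
```

A full CEA would discount costs as well as benefits and would typically add probabilistic sensitivity analysis; this loop only shows why the reported result should be presented as a range rather than a single number.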


As with most reporting guidelines, careful attention to the checklist during the planning stages of the project will produce a better study that, when completed, practically writes itself. Individuals who merely wish to be critical readers of a CEA might find the Quality of Health Economic Studies (QHES) instrument less cumbersome to apply.4,19 Nwachukwu and colleagues17 used this instrument to grade the studies in their systematic review. The QHES contains a manageable 16 questions for a reader or reviewer to ponder when evaluating a CEA report. Each item is awarded a specified number of points on an all-or-nothing basis, allowing the computation of a maximum score of 100 points. This absolutist approach might seem a bit draconian, and many readers will be tempted to assign partial credit when a requirement is fulfilled incompletely. In the Nwachukwu et al systematic review, the 12 sports medicine CEAs scored an average of 81.8 points on this scale, prompting the authors' conclusion that overall quality was good.

The principal role of CEAs lies in the arena of public health policy, where they have the potential to enlighten the debate among various treatment choices. At the same time, they can also be misused to further a particular agenda. To be worthy of influencing opinion, CEAs must be well performed, transparently reported, based on high-quality outcomes data, and tempered by the rule of reason. Clinicians may find them interesting but less pertinent to the care of specific patients. When confronted with an injured or sick fellow human, we are ethically obligated to determine what will produce the best outcome for that individual, rather than for an idealized average Joe or Josephine. Although we must be mindful of costs, optimizing the well-being of our patients should be paramount in our minds. This will inevitably bring us back to the principles of evidence-based medicine: combining our own expertise with the best available evidence and applying that knowledge to the needs and preferences of each individual patient.

Bruce Reider, MD Chicago, Illinois

REFERENCES

1. Baicker K, Taubman SL, Allen HL, et al. The Oregon experiment—effects of Medicaid on clinical outcomes. N Engl J Med. 2013;368(18):1713-1722.
2. Brauer CA, Neumann PJ, Rosen AB. Trends in cost effectiveness analyses in orthopaedic surgery. Clin Orthop Relat Res. 2007;457:42-48.
3. Brophy RH, Wright RW, Matava MJ. Cost analysis of converting from single-bundle to double-bundle anterior cruciate ligament reconstruction. Am J Sports Med. 2009;37(4):683-687.
4. Chiou CF, Hay JW, Wallace JF, et al. Development and validation of a grading system for the quality of cost-effectiveness studies. Med Care. 2003;41(1):32-44.
5. Daniels N. Rationing fairly: programmatic considerations. Bioethics. 1993;7(2-3):224-233.
6. Eddy DM. Oregon's methods: did cost-effectiveness analysis fail? JAMA. 1991;266(15):2135-2141.
7. Evans RG. Manufacturing consensus, marketing truth: guidelines for economic evaluation. Ann Intern Med. 1995;123(1):59-60.
8. Fruits E, Hillard A, Lewis L. The Oregon Health Plan: a "bold experiment" that failed. Cascade Policy Institute. http://cascadepolicy.org/blog/2010/09/09/the-oregon-health-plan-a-%e2%80%9cboldexperiment%e2%80%9d-that-failed/. Accessed May 1, 2015.
9. Genuario JW, Donegan RP, Hamman D, Bell JE, et al. The cost-effectiveness of single-row compared with double-row arthroscopic rotator cuff repair. J Bone Joint Surg Am. 2012;94(15):1369-1377.
10. Genuario JW, Faucett SC, Boublik M, Schlegel TF. A cost-effectiveness analysis comparing 3 anterior cruciate ligament graft types: bone–patellar tendon–bone autograft, hamstring autograft, and allograft. Am J Sports Med. 2012;40(2):307-314.
11. Guyatt GH, Feeny DH, Patrick DL. Measuring health-related quality of life. Ann Intern Med. 1993;118(8):622-629.
12. Hadorn DC. Setting health care priorities in Oregon: cost-effectiveness meets the rule of rescue. JAMA. 1991;265(17):2218-2225.
13. Harris J. QALYfying the value of life. J Med Ethics. 1987;13(3):117-123.
14. Kassirer JP, Angell M. The Journal's policy on cost-effectiveness analyses. N Engl J Med. 1994;331(10):669-670.
15. Neumann PJ. Why don't Americans use cost-effectiveness analysis? Am J Manag Care. 2004;10(5):308-312.
16. Nwachukwu BU, Hamid KS, Bozic KJ. Measuring value in orthopaedic surgery. JBJS Rev. 2013;1(2):e2.
17. Nwachukwu BU, Schairer WW, Bernstein JL, Dodwell ER, Marx RG, Allen AA. Cost-effectiveness analyses in orthopaedic sports medicine: a systematic review. Am J Sports Med. 2015;43(6):1530-1537.
18. Oberlander J. Health reform interrupted: the unraveling of the Oregon Health Plan. Health Aff (Millwood). 2007;26(1):w96-w105.
19. Ofman JJ, Sullivan SD, Neumann PJ, et al. Examining the value and quality of health economic analyses: implications of utilizing the QHES. J Manag Care Pharm. 2003;9(1):53-61.
20. Perry PA, Hotze T. Oregon's experiment with prioritizing public health care services. Virtual Mentor. 2011;13(4):241-247.
21. Russell LB, Gold MR, Siegel J, Daniels N, Weinstein MC. The role of cost-effectiveness analysis in health and medicine. Panel on Cost-Effectiveness in Health and Medicine. JAMA. 1996;276(14):1172-1177.
22. Samuelson EM, Brown DE. Cost-effectiveness analysis of autologous chondrocyte implantation: a comparison of periosteal patch versus type I/III collagen membrane. Am J Sports Med. 2012;40(6):1252-1258.
23. Shaw JW, Johnson JA, Coons SJ. US valuation of the EQ-5D health states: development and testing of the D1 valuation model. Med Care. 2005;43(3):203-220.
24. Siegel JE, Weinstein MC, Russell LB, Gold MR. Recommendations for reporting cost-effectiveness analyses. Panel on Cost-Effectiveness in Health and Medicine. JAMA. 1996;276:1339-1341.
25. Slevin ML, Plant H, Lynch D, Drinkwater J, Gregory WM. Who should measure quality of life, the doctor or the patient? Br J Cancer. 1988;57(1):109-112.
26. Weinstein MC, Siegel JE, Gold MR, Kamlet MS, Russell LB. Recommendations of the Panel on Cost-effectiveness in Health and Medicine. JAMA. 1996;276(15):1253-1258.

