EDITORIAL

Handling Parameter Uncertainty in Cost-Effectiveness Models Simply and Responsibly

Sun-Young Kim, PhD, Louise B. Russell, PhD, Anushua Sinha, MD, MPH

Cost-effectiveness models inevitably involve uncertainty. Among the different sources of uncertainty, parameter uncertainty, also known as second-order uncertainty, is defined as uncertainty about the true values of the parameters used as model inputs.1 Because the decisions informed by the model are affected by parameter uncertainty,2 it is important to represent parameter uncertainty well and completely. Traditionally, parameter uncertainty has been represented through a deterministic approach, which varies parameter values one at a time or several at a time over plausible ranges to test how a model's outcomes vary.3 Deterministic sensitivity analysis may take the form of threshold analysis, identifying the parameter value above or below which the decision changes, or scenario analysis, exploring scenarios defined by combinations of parameters, including best and worst cases.4 These analyses are straightforward for analysts and easily understood by decision makers, but they do not incorporate the probability of different parameter values, nor do they account for correlation among parameters.2
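To make the mechanics concrete, here is a minimal sketch of a 1-way deterministic sensitivity analysis in Python. The toy model, parameter names, base-case values, and plausible ranges are all invented for illustration; they are not drawn from any of the studies cited here.

```python
# Hypothetical illustration of 1-way deterministic sensitivity analysis.
# The model structure, parameters, and ranges below are invented.

def net_monetary_benefit(p_cure, cost_treatment, qaly_gain, wtp=20_000):
    """Incremental net monetary benefit of a hypothetical treatment
    versus doing nothing: NMB = WTP * expected QALY gain - cost."""
    return wtp * (p_cure * qaly_gain) - cost_treatment

base = {"p_cure": 0.6, "cost_treatment": 5_000.0, "qaly_gain": 1.2}
ranges = {  # plausible low/high values, assumed for this example
    "p_cure": (0.4, 0.8),
    "cost_treatment": (3_000.0, 8_000.0),
    "qaly_gain": (0.8, 1.6),
}

# Vary one parameter at a time over its range, holding the others at
# base case; the resulting NMB intervals are what a tornado diagram plots.
for name, (low, high) in ranges.items():
    nmbs = []
    for value in (low, high):
        params = dict(base, **{name: value})
        nmbs.append(net_monetary_benefit(**params))
    print(f"{name:15s} NMB from {min(nmbs):>10,.0f} to {max(nmbs):>10,.0f}")
```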


These limitations are addressed by probabilistic sensitivity analysis (PSA).5 In PSA, each parameter is assigned a distribution and all parameters are varied simultaneously through Monte Carlo simulation. The results of a PSA are typically summarized by showing the distributions of the main outcomes on the cost-effectiveness plane or in cost-effectiveness acceptability curves.4 A PSA that samples from the joint posterior distribution of parameters allows analysts to reflect correlation and thus correctly calculate expected values of model outcomes in nonlinear models, even when parameters are correlated.2,5

In addition, the outputs of a PSA enable analysts to conduct further analyses of parameter uncertainty, including analysis of covariance (ANCOVA) and value of information (VOI) analysis. ANCOVA calculates the sensitivity of the incremental cost-effectiveness ratio or net monetary benefit to individual parameter values.6 VOI includes expected value of perfect information (EVPI), which provides an estimate of the upper limit on returns to future research on all model parameters, and expected value of partial perfect information (EVPPI), which identifies the value of future research for a specific parameter or set of parameters.7 The recent report on model parameter estimation and uncertainty analysis by the ISPOR-SMDM Modeling Good Research Practices Task Force recommends EVPI as the best measure of uncertainty surrounding a particular decision.3 EVPPI is considered the most advanced measure of parameter importance (influence), but estimation of EVPPI through 2-loop Monte Carlo simulation can be computationally burdensome, although there have been efforts to increase the efficiency of the process.8
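To illustrate how these quantities are computed, the sketch below runs a simple PSA on the same kind of invented two-strategy model and computes the per-decision EVPI as the mean of the per-simulation maximum net monetary benefit minus the maximum of the mean net monetary benefits. All distributions and their hyperparameters are assumptions made for the example, not values from the literature.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n_sims, wtp = 10_000, 20_000

# Hypothetical PSA: draw all parameters simultaneously from assumed
# distributions (the distribution choices and hyperparameters are invented).
p_cure = rng.beta(12, 8, n_sims)                 # probability of cure
cost_treatment = rng.gamma(25.0, 200.0, n_sims)  # treatment cost, mean 5,000
qaly_gain = rng.normal(1.2, 0.2, n_sims)         # QALYs gained if cured

# Per-simulation net monetary benefit of each strategy.
nmb_treat = wtp * p_cure * qaly_gain - cost_treatment
nmb_no_treat = np.zeros(n_sims)                  # comparator: do nothing
nmb = np.column_stack([nmb_no_treat, nmb_treat])

# Probability each strategy is optimal at this WTP (one point on a CEAC).
prob_optimal = np.bincount(nmb.argmax(axis=1), minlength=2) / n_sims

# EVPI per decision: expected NMB under perfect information minus the
# expected NMB of the best strategy chosen under current information.
evpi = nmb.max(axis=1).mean() - nmb.mean(axis=0).max()
print(f"P(treat optimal) = {prob_optimal[1]:.3f}, EVPI = {evpi:,.0f}")
```

In practice, the per-decision EVPI is usually scaled by the size and time horizon of the affected population before being compared with the cost of further research.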


There have been several guidelines for conducting sensitivity analysis,3,9,10 but there has not been clear guidance on how much more we learn from complex methods of modeling uncertainty (PSA and EVPI or EVPPI) than from simple 1-way sensitivity analyses. While PSA and VOI analysis examine parameter uncertainty in a more rigorous way, they are more costly than deterministic sensitivity analysis in terms of time and computational burden. Given these trade-offs, what uncertainty analyses should be done and reported by a responsible analyst constrained by time and budget?

In this issue of Medical Decision Making, Campbell and colleagues address this important subject.11 Their study compares 3 methods for evaluating individual parameters (1-way sensitivity analysis, ANCOVA, and EVPPI) in terms of each parameter's influence on the outcome (net monetary benefit in the study) to suggest what type of uncertainty analysis should be reported under what conditions. Since it is particularly important to explore the effect of parameter uncertainty around decision thresholds, the authors choose 2 willingness-to-pay values to explore a case of low decision uncertainty (£20,000) and high decision uncertainty (£8,000). The authors propose that for relatively linear models with uncorrelated parameters, 1-way sensitivity analysis may be sufficient and give the same answers as EVPPI. For less linear models and models with correlated parameters, they suggest that use of both deterministic and probabilistic sensitivity analyses will be "prudent and conservative."

As the authors note, the key to applying their study's findings is to determine the degree of linearity and parameter correlation in a model. They suggest that an ANCOVA can be conducted to understand the model's level of linearity. It is not clearly stated how parameter correlation can be determined. However, given that an analyst must conduct a PSA first to conduct the ANCOVA, little will be saved by not reporting the PSA itself. This leads to the question of whether it is possible to assess linearity in some other way.

In addition to linearity and parameter correlation, there may be other study characteristics that should be considered in determining the scope and reporting of a study's parameter uncertainty analysis. One example is the number of strategies compared. While all of the example models used by Campbell et al. compare only 2 strategies, real-world models often compare many more than 2. Even when a model is relatively linear and has no (or little) correlation among parameters, if more than 2 strategies are evaluated, 1-way sensitivity analysis may not save time and space for reporting. For example, a model with more than 2 strategies may require multiple tornado diagrams, one for each pair of strategies. Moreover, when doing only 1-way sensitivity analysis, analysts will need to be careful and creative to effectively and efficiently report the overall impact of 1 parameter on the relative rankings of different strategies, including potential strong and weak dominance. Such situations are more straightforwardly handled with cost-effectiveness acceptability curves under a PSA, as the sketch below illustrates.
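The following sketch shows why. Given simulated costs and effects from a PSA of a hypothetical three-strategy comparison (all distributions below are invented for illustration), the probability that each strategy is optimal at a given willingness-to-pay value, the quantity traced by cost-effectiveness acceptability curves, summarizes rankings and dominance across any number of strategies in a single display.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_sims = 10_000

# Hypothetical PSA output for three strategies: per-simulation costs and
# effects (QALYs). The distributions below are invented for illustration.
costs = np.column_stack([
    np.zeros(n_sims),                # strategy A: do nothing
    rng.gamma(25.0, 200.0, n_sims),  # strategy B
    rng.gamma(30.0, 300.0, n_sims),  # strategy C
])
effects = np.column_stack([
    np.zeros(n_sims),
    rng.normal(0.7, 0.15, n_sims),
    rng.normal(1.0, 0.25, n_sims),
])

# For each willingness-to-pay value, the share of simulations in which each
# strategy has the highest NMB: one point per strategy on its CEAC.
for wtp in (0, 5_000, 10_000, 20_000, 50_000):
    nmb = wtp * effects - costs
    shares = np.bincount(nmb.argmax(axis=1), minlength=3) / n_sims
    print(f"WTP {wtp:>6,}: " + "  ".join(
        f"P({s} optimal)={p:.2f}" for s, p in zip("ABC", shares)))
```

Plotting these shares over a grid of willingness-to-pay values yields one acceptability curve per strategy, which scales to any number of strategies without the pairwise bookkeeping that tornado diagrams require.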


Despite its limitations, this article has important messages, not just for modelers or analysts, but also for decision makers such as governmental agencies that are building modeling capacity internal to government or through affiliated institutions. Such groups usually have short timelines and an immediate audience for their model results: one that needs to understand decision uncertainty and risk in clear, communicable terms. The quote attributed to Einstein, "Everything should be made as simple as possible, but not simpler," is a valuable mantra for the communication of cost-effectiveness results as well as for the modeling process itself. The purpose of a study and the needs of decision makers should be the main drivers of the types of uncertainty analyses to be conducted and reported.

REFERENCES

1. Gold MR, Siegel JE, Russell LB, Weinstein MC. Cost-Effectiveness in Health and Medicine. New York: Oxford University Press; 1996.
2. Ades AE, Claxton K, Sculpher M. Evidence synthesis, parameter correlation, and probabilistic sensitivity analysis. Health Economics. 2006;15:373–81.
3. Briggs AH, Weinstein MC, Fenwick EA, et al. Model parameter estimation and uncertainty analysis: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force Working Group-6. Med Decis Making. 2012;32:722–32.
4. Andronis L, Barton P, Bryan S. Sensitivity analysis in economic evaluation: an audit of NICE current practice and a review of its use and value in decision-making. Health Technol Assess. 2009;13:iii, ix–xi, 1–61.
5. Claxton K, Sculpher M, McCabe C, et al. Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra. Health Economics. 2005;14:339–47.
6. Briggs AH, Claxton K, Sculpher MJ. Decision Modelling for Health Economic Evaluation. New York: Oxford University Press; 2006.
7. Steuten L, van de Wetering G, Groothuis-Oudshoorn K, Retel V. A systematic and critical review of the evolving methods and applications of value of information in academia and practice. Pharmacoeconomics. 2013;31:25–48.
8. Strong M, Oakley JE. An efficient method for computing single-parameter partial expected value of perfect information. Med Decis Making. 2013;33:755–66.
9. Briggs A. Handling uncertainty in economic evaluation and presenting the results. In: Drummond MF, McGuire A, eds. Economic Evaluation in Health Care: Merging Theory with Practice. New York: Oxford University Press; 2001.

MEDICAL DECISION MAKING/JULY 2015

Downloaded from mdm.sagepub.com by guest on June 5, 2016

HANDLING PARAMETER UNCERTAINTY

10. Bilcke J, Beutels P, Brisson M, Jit M. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide. Med Decis Making. 2011;31:675–92.
11. Campbell JD, McQueen RB, Libby AM, Spackman DE, Carlson JJ, Briggs A. Cost-effectiveness uncertainty analysis methods: a comparison of one-way sensitivity, analysis of covariance, and expected value of partial perfect information. Med Decis Making. 2015;35:596–607.
