Guidelines for economic evaluations insist that the sensitivity of model results to alternative parameter values be thoroughly explored. However, differences in model construction and analytical choices (such as between a cost-effectiveness and a cost-benefit framework) also introduce uncertainty into results, yet these are rarely subjected to thorough sensitivity analysis. In this article, the authors quantify the effects of model, methodological, and parameter uncertainty, taking varicella vaccination as an example. They used 3 different models (a static model, a dynamic model that considers only the effect of vaccination on varicella, and a dynamic model that also assesses the implications of vaccination for zoster epidemiology) and 2 forms of analysis (cost-benefit and cost-utility). They also varied the discount rate and the time frame of the analysis. Probabilistic sensitivity analyses were performed to estimate the impact of parameter uncertainty. In their example, model and methodological choices had a profound effect on estimated cost-effectiveness, whereas parameter uncertainty played a relatively minor role. Under cost-utility analysis, the probabilistic sensitivity analysis suggested a near certainty that vaccination dominates no vaccination, or vice versa, depending on model choice and perspective. Under cost-benefit analysis, vaccination always appeared attractive. Thus, the authors clearly show that model and methodological assumptions can have a greater impact on results than parameter estimates, although sensitivity analyses are rarely performed on these sources of uncertainty.
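To illustrate the kind of probabilistic sensitivity analysis (PSA) described above, the sketch below runs a Monte Carlo simulation over uncertain parameters and reports how often vaccination dominates no vaccination (cheaper and more effective) or is dominated (costlier and less effective). All distributions and values are hypothetical placeholders, not parameters from the article's varicella models.

```python
# Minimal Monte Carlo PSA sketch for a vaccination decision.
# All parameter distributions below are illustrative assumptions only.
import random

random.seed(0)

def run_psa(n_draws: int = 10_000) -> tuple[float, float]:
    """Sample parameters and compute incremental cost and QALY gain
    per draw; return (P(vaccination dominant), P(vaccination dominated))."""
    dominant = 0   # vaccination saves money AND gains QALYs
    dominated = 0  # vaccination costs money AND loses QALYs
    for _ in range(n_draws):
        # Hypothetical parameter distributions (per vaccinee):
        vacc_cost = random.gauss(60.0, 10.0)      # vaccination programme cost
        cases_averted = random.gauss(0.8, 0.1)    # varicella cases averted
        cost_per_case = random.gauss(90.0, 20.0)  # treatment cost per case
        qaly_per_case = random.gauss(0.01, 0.003) # QALY loss per case

        inc_cost = vacc_cost - cases_averted * cost_per_case
        inc_qaly = cases_averted * qaly_per_case
        if inc_cost < 0 and inc_qaly > 0:
            dominant += 1
        elif inc_cost > 0 and inc_qaly < 0:
            dominated += 1
    return dominant / n_draws, dominated / n_draws

p_dominant, p_dominated = run_psa()
print(f"P(dominant) = {p_dominant:.3f}, P(dominated) = {p_dominated:.3f}")
```

Changing the assumed distributions (or, as the article argues, the underlying model structure) can swing these probabilities between near 0 and near 1, which is exactly the sensitivity to model and methodological choice the authors highlight.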