There are numerous examples in the epidemiologic literature of analyses that relate the change in a risk factor, such as serum cholesterol, to the risk of an adverse outcome, such as heart disease. Many of these analyses fit some type of regression model (such as logistic regression or the Cox model for survival time data) that includes both the change in the risk factor and the baseline value as covariates. We show that this method of adjusting for the baseline level can produce misleading results. The problem occurs when the true value of the risk factor relates to the outcome, and the measured value differs from the true value due to measurement error. We may find the observed change in the risk factor significantly related to the outcome when there is in fact no relationship between the true change and the outcome. If the question of interest is whether a person who lowers his level of the risk factor by means of drugs or lifestyle changes will thereby reduce his risk of disease, then we should consider an association due solely to measurement error as spurious. We present a method that adjusts for the measurement error in a linear regression analysis and show that an analogous adjustment applies asymptotically to logistic regression. As in other errors-in-variables problems, this analysis depends on knowledge of the relative variances of the random variation, the true baseline value, and the true change. Since the magnitudes of these variances are usually unknown and sometimes unknowable (the distinction between true change and measurement error being ambiguous), we recommend a sensitivity analysis that examines how the analysis results depend on the assumptions concerning the variances. The commonly used analysis method corresponds to the extreme case in which there is no measurement error. We use data from the Framingham Study and simulations to illustrate these points.
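The mechanism described above — a spurious association between observed change and outcome arising purely from measurement error when the observed baseline is included as a covariate — can be illustrated with a small simulation. This is a sketch under assumed variance components (all parameter values below are illustrative, not taken from the Framingham data), and the moment-based correction shown is one errors-in-variables style adjustment in the spirit of the abstract, not necessarily the authors' exact estimator.

```python
import numpy as np

# Illustrative setup: the outcome depends ONLY on the TRUE baseline level
# of the risk factor; the TRUE change is unrelated to the outcome.
rng = np.random.default_rng(0)
n = 200_000
sd_true, sd_change, sd_err = 1.0, 1.0, 1.0   # assumed variance components
beta = 1.0                                    # effect of true baseline on outcome

true_base = rng.normal(0, sd_true, n)         # true baseline value
true_change = rng.normal(0, sd_change, n)     # true change (no effect on outcome)
obs_base = true_base + rng.normal(0, sd_err, n)                  # measured baseline
obs_follow = true_base + true_change + rng.normal(0, sd_err, n)  # measured follow-up
obs_change = obs_follow - obs_base            # observed change = true change + errors

outcome = beta * true_base + rng.normal(0, 1.0, n)

# Commonly used analysis: regress the outcome on observed change,
# "adjusting" for observed baseline (least squares with an intercept).
X = np.column_stack([np.ones(n), obs_base, obs_change])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
b_base, b_change = coef[1], coef[2]
# With these variances the change coefficient converges to
#   beta*var_T*var_e / [(var_T+var_e)(var_D+2*var_e) - var_e**2] = 0.2,
# i.e. clearly nonzero even though the true change has no effect.
print(f"naive baseline coef: {b_base:.3f}, naive change coef: {b_change:.3f}")

# Measurement-error correction (a sketch, assuming the error variance
# sd_err**2 is KNOWN): subtract the error contribution from the moment
# matrix of (observed baseline, observed change) before solving.
S = np.cov(np.column_stack([obs_base, obs_change]), rowvar=False)
err_part = np.array([[sd_err**2, -sd_err**2],
                     [-sd_err**2, 2 * sd_err**2]])
cross = np.array([np.cov(obs_base, outcome)[0, 1],
                  np.cov(obs_change, outcome)[0, 1]])
b_corr = np.linalg.solve(S - err_part, cross)
# b_corr is approximately (beta, 0): the spurious change effect vanishes.
print(f"corrected baseline coef: {b_corr[0]:.3f}, "
      f"corrected change coef: {b_corr[1]:.3f}")
```

Setting `sd_err = 0` reproduces the no-measurement-error extreme the abstract mentions (the naive change coefficient then converges to zero), and varying `sd_err` relative to `sd_true` and `sd_change` is exactly the kind of sensitivity analysis the abstract recommends.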