Imprecise measurement of risk factors causes misclassification of individuals, limits sensitivity to detect those with high true levels, and dilutes associations between risk factors and disease. The implications of these effects for two particular examples were explored using data from a large prospective study relating plasma cholesterol to coronary heart disease (CHD) mortality and diastolic blood pressure (DBP) to fatal stroke. The absolute and relative effectiveness of three "high-risk" strategies of screening and treatment and a "population-based" shift in the risk factor distribution were compared, assuming different degrees of measurement error. The absolute benefits of each strategy were greater than suggested by unadjusted estimates from survey data. For cholesterol and CHD (a linear relationship in this cohort), uncorrected estimates tended to exaggerate the effectiveness of "high-risk" strategies relative to the "population-based" approach. For DBP and stroke (an exponential relationship), the relative effectiveness of screening and treatment was underestimated if no allowance was made for measurement error. These findings are strictly applicable only to the middle-aged men from whom they were derived, but the effects of misclassification and regression dilution need to be considered in any assessment of preventive strategies.
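The attenuation described above (regression dilution) can be illustrated with a small simulation. This is a hedged sketch, not the study's actual method: the exposure, error variance, and outcome slope below are all invented for illustration. It shows that regressing an outcome on a noisily measured risk factor shrinks the fitted slope by the reliability ratio λ = σ²_true / (σ²_true + σ²_error), which is why unadjusted survey estimates understate the true association.

```python
import random

random.seed(0)

n = 100_000
sigma_true = 1.0    # SD of the true (long-term average) risk factor, assumed
sigma_err = 0.75    # SD of within-person measurement error, assumed

# True exposure and a single noisy measurement of it
true = [random.gauss(0.0, sigma_true) for _ in range(n)]
obs = [t + random.gauss(0.0, sigma_err) for t in true]

# Hypothetical linear outcome driven by the TRUE exposure
beta = 0.5
y = [beta * t + random.gauss(0.0, 0.2) for t in true]

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    var = sum((a - mx) ** 2 for a in x) / len(x)
    return cov / var

b_true = slope(true, y)   # recovers beta
b_obs = slope(obs, y)     # attenuated toward zero

# Reliability ratio: fraction of observed variance that is true variance
lam = sigma_true**2 / (sigma_true**2 + sigma_err**2)

print(f"slope on true exposure:     {b_true:.3f}")
print(f"slope on noisy measurement: {b_obs:.3f}")
print(f"predicted attenuated slope: {beta * lam:.3f}")
```

With these assumed variances λ = 0.64, so the slope estimated from single measurements is roughly 0.32 rather than the true 0.5; dividing the observed slope by λ is the simplest correction for regression dilution.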