Background: Age is heavily weighted in the calculation of total cardiovascular risk scores, but atherosclerotic disease burden often diverges from a patient's chronological age.
Methods: We used measures of coronary artery calcium to estimate the number of life years lost (calcium-adjusted age) in 10,377 asymptomatic individuals referred for electron beam tomography (EBT) screening and followed for 5 years for all-cause mortality. Linear regression was used to calculate predicted age, and time to death was estimated with a Cox proportional hazards model.
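The core idea of the Methods — regressing observed age on calcium burden and reading the fitted value as a "calcium-adjusted age" — can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual model; the log-transform of the calcium score and all coefficients here are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic cohort: chronological age and a log-transformed coronary
# artery calcium (CAC) score that tends to rise with age (illustrative only).
age = rng.uniform(40, 80, n)
log_cac = 0.1 * age + rng.normal(0.0, 1.0, n)

# Fit observed age as a linear function of log-transformed calcium score.
slope, intercept = np.polyfit(log_cac, age, 1)

def calcium_adjusted_age(log_score):
    """Age predicted from calcium burden alone ('biological age')."""
    return slope * log_score + intercept
```

Under this sketch, a subject whose calcium score is high for their chronological age receives a calcium-adjusted age above it, and a low score pulls the adjusted age below it, mirroring the abstract's reported shifts of up to 30 years added or 10 years subtracted.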
Results: There was a direct relationship between coronary artery calcium and observed age (r = 0.32, p < 0.0001). In linear prediction models, a calcium score < 10 reduced observed age by 10 years in subjects older than 70 years, while a calcium score > 400 added as much as 30 years of age to younger patients. Calcium-adjusted age was a better predictor of mortality (model chi-square = 373, p < 0.0001) than observed age (model chi-square = 355, p < 0.0001). Detectable calcium was noted in 16% of men and 12% of women with an unadjusted low-risk Framingham score (p < 0.0001). Among those with an intermediate Framingham risk score, calcium scores > 10 were noted in 31% of men and 43% of women (p < 0.0001). Using calcium adjustments to age, 55% of previously low-risk Framingham scores were re-classified as intermediate risk (p < 0.0001). Similarly, 45% of the unadjusted intermediate Framingham risk scores were re-classified as high risk based upon calcium-adjusted ages (p < 0.0001).
Conclusions: Measures of coronary artery calcium are related to survival and can be used to assess an individual's biological age. Risk that goes undetected by current Framingham calculations may be captured by re-adjusting a patient's age according to the extent of coronary calcification.