Objective: To examine whether secular trends in risk factor levels and improvements in treatment can account for the observed decline in coronary heart disease mortality in the United States from 1980 to 1990 and to analyze the proportional contribution of these changes.
Data sources: Literature review, US statistics, health surveys, and ongoing clinical trials.
Study selection: Data representative of the nationwide US situation and reported in adequate detail.
Data extraction: A computer-simulation state-transition model of the US population between the ages of 35 and 84 years was developed to forecast coronary mortality. Input variables were estimated so that, in combination, they yielded adequate agreement with reported coronary mortality figures. Secular trends were then modeled.
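To illustrate the general idea of a state-transition (Markov) cohort model, a minimal sketch follows. The states, transition probabilities, and cycle length here are hypothetical placeholders for illustration only; they are not the model, states, or parameter values used in the study.

```python
# Illustrative state-transition cohort model sketch (hypothetical numbers,
# not the study's model). Three states: free of coronary heart disease,
# living with coronary heart disease, and coronary death (absorbing).
STATES = ("well", "chd", "chd_death")

# Hypothetical annual transition probabilities; each row sums to 1.
P = [
    [0.97, 0.025, 0.005],  # from "well": stay well, develop CHD, coronary death
    [0.00, 0.900, 0.100],  # from "chd": remain with CHD, coronary death
    [0.00, 0.000, 1.000],  # "chd_death" is absorbing
]

def run_cohort(start, transition, years):
    """Advance a cohort distribution through `years` one-year cycles."""
    dist = list(start)
    for _ in range(years):
        dist = [
            sum(dist[i] * transition[i][j] for i in range(len(dist)))
            for j in range(len(dist))
        ]
    return dist

# Start with the entire cohort free of coronary disease.
final = run_cohort([1.0, 0.0, 0.0], P, years=10)
print(final)  # fractions well, with CHD, and dead of CHD after 10 cycles
```

Forecasting mortality under secular trends would amount to letting the transition probabilities vary by calendar year rather than holding them fixed, as this sketch does.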
Data synthesis: Actual coronary mortality in 1990 was 34% (127,000 deaths) lower than would have been predicted if risk factor levels, case-fatality rates, and event rates in those with and without coronary disease had remained the same as in 1980. When secular changes in these factors were included in the model, predicted coronary mortality in 1990 was within 3% (10,000 deaths) of observed mortality, and the model explained 92% of the decline. Only 25% of the decline was explained by primary prevention; 29% was explained by secondary reduction of risk factors in patients with coronary disease, and 43% by other improvements in the treatment of patients with coronary disease.
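The reported contributions can be cross-checked arithmetically against the groupings quoted in the conclusions; the percentages below are taken directly from the synthesis above.

```python
# Percentage contributions to the 1980-1990 US coronary mortality decline,
# as reported in the data synthesis.
primary_prevention = 25    # risk factor reduction in those without coronary disease
secondary_reduction = 29   # risk factor reduction in patients with coronary disease
other_treatment = 43       # other treatment improvements in patients with disease

# Primary plus secondary risk factor reduction: "about 50%" of the decline.
risk_factor_total = primary_prevention + secondary_reduction
print(risk_factor_total)   # 54

# Share of the decline occurring among patients with coronary disease:
# "more than 70%".
among_chd_patients = secondary_reduction + other_treatment
print(among_chd_patients)  # 72
```

These sums (54% and 72%) are consistent with the "about 50%" and "more than 70%" figures stated in the conclusions.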
Conclusions: These results suggest that primary and secondary risk factor reductions explain about 50% of the striking decline in coronary mortality in the United States between 1980 and 1990 but that more than 70% of the overall decline in mortality has occurred among patients with coronary disease.