Objectives: Quality assessment of coronary artery bypass grafting has traditionally been performed with data from clinical databases. Administrative databases, which rely primarily on information collected for billing purposes, have increasingly been used as tools for public reporting of outcomes quality. The correlation of administrative data with clinical data for clinical quality assessment has not been confirmed.
Methods: With data from a clinical database, we analyzed the outcomes of all patients who underwent coronary artery bypass grafting in 1 hospital between 1999 and 2001. This information was collected before, during, and after the surgery and hospitalization by designated clinical personnel involved with the patients' care and then entered into an audited clinical database (the Society of Thoracic Surgeons National Cardiac Database). These data were then compared with administrative data collected on the same cohort of patients for the number of procedures performed and the mortality rate as reported by the federal government (Medicare Provider Analysis and Review), state government (Texas Health Care Information Council), hospital system (HCA, Inc, Casemix Database), and an Internet Web site (healthgrades.com). Data were analyzed on the basis of the population reported, the definitions used, the risk-adjustment algorithms, and case volumes.
Results: Using the audited Society of Thoracic Surgeons database as the standard and aggregating the reported case volumes by the inclusion criteria of the various sources of administrative data, we found variances in the reported procedure volumes and mortality. Case volumes were overreported by as much as 21% in all patients and underreported by 16% or more in Medicare patients. Mortality in administrative data exceeded that reported in clinical data by 21%. Reasons for variances included the time period reported (calendar vs fiscal year), the population reported (all patients, Medicare patients, or Medicare patients aged ≥65 years), the date used to capture the patient record (date of surgery vs date of discharge), and the definition of mortality. The different proprietary risk-adjustment algorithms used magnified these variances, with risk-adjusted mortality exceeding that in the Society of Thoracic Surgeons data by as much as 61%.
Conclusions: Compared with an audited clinical database, administrative data sets show substantial variability in reported outcomes for the end points of procedure volume and mortality. This variability makes it challenging for the nonclinician unfamiliar with outcomes analysis to make an informed decision.