The annual mortality rate of patients on hemodialysis in the United States is 24.3%, substantially higher than the mortality rates of age-matched patients in Europe and Japan. Differences in the dose of dialysis received by US patients have been proposed as an important factor contributing to this high mortality rate. We undertook a prospective effort to increase the dose of dialysis delivered to 130 patients treated at an urban dialysis center affiliated with Vanderbilt University. From 1988 to 1991, the dose of dialysis, represented by the urea kinetic modeling parameter Kt/V (K = dialyzer clearance, t = dialysis time, V = volume of distribution of urea), was gradually increased from 0.82 +/- 0.32 to 1.33 +/- 0.23. Concurrent with this increase, the gross annual mortality rate fell from 22.8% in 1988 to 9.1% in 1991. To account for potential differences in patient characteristics during those years, we also calculated the number of expected deaths, based on data from the United States Renal Data System. The ratio of observed to expected deaths, termed the "standardized mortality rate," decreased from 1.03 in 1988 to 0.611 in 1991. In addition, the number of hospital days per patient per year decreased from 15.2 to 10.3 d/patient/yr. We conclude that increasing the dose of delivered dialysis decreases the hospitalization and mortality rates of hemodialysis-dependent patients.
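The two quantities the abstract defines, Kt/V and the observed-to-expected death ratio, can be sketched directly from their definitions. This is an illustrative computation only; the numeric inputs below (K = 250 mL/min, t = 240 min, V = 40 L, and the death counts) are hypothetical example values, not data from the study.

```python
def kt_v(clearance_ml_min: float, time_min: float, urea_volume_l: float) -> float:
    """Kt/V: dialyzer urea clearance K times session time t, divided by
    the urea distribution volume V. Clearance is converted from mL/min
    to L/min so the units cancel and the result is dimensionless."""
    return (clearance_ml_min / 1000.0) * time_min / urea_volume_l


def standardized_mortality_rate(observed_deaths: float, expected_deaths: float) -> float:
    """Ratio of observed to expected deaths, as defined in the abstract."""
    return observed_deaths / expected_deaths


# Hypothetical session: K = 250 mL/min, t = 240 min, V = 40 L
print(kt_v(250.0, 240.0, 40.0))                     # 1.5

# Hypothetical year: 9 observed deaths vs 14.7 expected
print(round(standardized_mortality_rate(9.0, 14.7), 2))
```

Note that this is the simplest reading of Kt/V as a plain ratio; clinical urea kinetic modeling typically estimates it from pre- and post-dialysis urea concentrations rather than from K, t, and V directly.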