Septic shock is characterized by hypoperfusion and tissue energy defects. We prospectively evaluated the therapeutic benefit of augmenting cardiac output, and therefore oxygen delivery (DO2), on mortality in patients with septic shock. Twenty-five patients were randomized to a normal treatment (NT) group and 26 to an optimal treatment (OT) group. All patients had a clinically evident site of infection, sepsis as defined by a systemic response to the infection, and shock indicated by systemic hypoperfusion. Patients were treated during the initial 72 h by an algorithm differing only in the end point of resuscitation: the cardiac index (CI) was increased to 3.0 L/min/m2 in the NT group and to 6.0 L/min/m2 in the OT group. On entry into the study, there were no significant differences in cardiorespiratory parameters between the NT and OT groups. During treatment, CI averaged 3.6 ± 0.2 L/min/m2 and DO2 averaged 8.6 ± 0.8 ml/min/kg in the NT group, vs a CI of 5.1 ± 0.2 L/min/m2 and a DO2 of 12.2 ± 0.7 ml/min/kg in the OT group (p < 0.01). A significant correlation between DO2 and survival was observed. Seventy-two percent of the NT patients died vs 50 percent of the OT patients (p = 0.14). Surviving NT patients stayed 13.7 ± 3 days in the ICU vs 7.4 ± 0.6 days for surviving OT patients (p < 0.05). Because some NT patients were spontaneously hyperdynamic and some OT patients did not achieve the desired end point, patients were arbitrarily subdivided at a midpoint CI of 4.5 L/min/m2. The NT < 4.5 subgroup had a CI of 3.1 ± 0.2 L/min/m2 and a DO2 of 10.9 ± 1.0 ml/min/kg, while the OT > 4.5 subgroup had a CI of 5.7 ± 0.2 L/min/m2 and a DO2 of 13.8 ± 0.7 ml/min/kg (p < 0.01). Mortality was 74 percent in the NT < 4.5 subgroup vs 40 percent in the OT > 4.5 subgroup (p < 0.05).