Survival curves of a cocktail of eight Salmonella serotypes in ground poultry of different fat levels (1-12%), heated rapidly to specified temperatures (58-65 degrees C), were examined. Because many of the survival curves were concave, two parameters, the asymptotic D-value and the "lag" time, were estimated and used to develop secondary models for predicting the time needed to obtain a 7 log10 relative reduction as a function of fat level and temperature. To compute the necessary time at a given temperature and fat level, the estimated lag time is added to the product of 7 and the estimated asymptotic D-value. A model was also developed for estimating the standard error of the estimated times, so that upper confidence bounds for the necessary times can be computed. Lag times were found to increase with higher fat levels. The effect of fat on D-values depended on the species: for a given increase in fat level, the estimated increase in D-value was greater for ground chicken than for ground turkey. In addition, there was a statistically significant species effect on D-values, with higher D-values for ground turkey than for ground chicken at the higher temperatures studied. The thermal death curves displayed a non-linear tendency; however, for estimation purposes, a linear relationship was assumed. There was no statistically significant interaction effect of fat level and temperature on D-values; thus, for modeling, z-values were assumed to be independent of fat level. The z-values for ground chicken and turkey were estimated to be 5.5 degrees C and 6.1 degrees C, respectively, and are statistically significantly different. These findings should have substantial practical importance to processors of cooked, ready-to-eat poultry products, allowing them to vary their thermal treatments in a safe manner.
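The calculation described above (required time = lag time + 7 x asymptotic D-value, with D-values scaled across temperatures via the z-value) can be sketched as follows. The z-values (5.5 degrees C for ground chicken, 6.1 degrees C for ground turkey) are from the study; the reference D-value, lag time, and reference temperature in the example are illustrative assumptions, not the paper's fitted parameters.

```python
# z-values (degrees C) reported in the study for each species.
Z_VALUES_C = {"chicken": 5.5, "turkey": 6.1}

def d_value(temp_c: float, d_ref_min: float, t_ref_c: float, z_c: float) -> float:
    """Asymptotic D-value (minutes) at temp_c, assuming the linear
    (log-linear) thermal-death model used for estimation in the study:
    log10 D changes by -1 for every z_c degrees of temperature increase."""
    return d_ref_min * 10 ** ((t_ref_c - temp_c) / z_c)

def time_for_7_log(temp_c: float, lag_min: float, d_ref_min: float,
                   t_ref_c: float, species: str) -> float:
    """Time (minutes) for a 7 log10 reduction: lag + 7 * D(T)."""
    z_c = Z_VALUES_C[species]
    return lag_min + 7 * d_value(temp_c, d_ref_min, t_ref_c, z_c)

# Hypothetical example values (NOT from the paper): D = 1.0 min at a
# 60 degrees C reference temperature, lag = 0.5 min, ground chicken.
t60 = time_for_7_log(60.0, lag_min=0.5, d_ref_min=1.0,
                     t_ref_c=60.0, species="chicken")  # 0.5 + 7*1.0 = 7.5 min
t62 = time_for_7_log(62.0, lag_min=0.5, d_ref_min=1.0,
                     t_ref_c=60.0, species="chicken")  # shorter: D shrinks with T
```

Note that this sketch omits the fat-level dependence of the lag time and D-value, as well as the standard-error model for upper confidence bounds, both of which the study's secondary models include.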