Expectation of signal duration influences signal detectability. This was demonstrated in two experiments in which percent correct was measured for both tonal and noise signals whose durations were either unexpected or uncertain. In both experiments, the signal level at each duration was set to yield a d' of about 1.5 when that duration was presented alone and expected. When the six subjects were led to expect a short- or a long-duration signal using the probe-signal method, detectability decreased to near chance as the signal duration deviated from the expected value (experiment 1). When the subjects were led to expect a range of durations, detectability was only slightly worse than when each signal was presented alone (experiment 2). These results suggest that listeners adjust their temporal-integration intervals to the demands of the specific task. Finally, the results obtained with the noise signal were analyzed using the multiple-look model and a modified energy-detector model. Assuming that the integration interval is matched to the expected signal duration, both models predict the detectability of signals of unexpected duration reasonably well. Both models, however, fail to predict the small effect of duration uncertainty.
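As a rough illustration of the matched-window prediction (a simplified sketch under stated assumptions, not the modified model itself): let $T_e$ denote the expected duration, $T_p$ the probe duration, and suppose the probe level at each duration is set so that a window matched to that duration yields $d'_m \approx 1.5$, as in the experiments. For an energy detector, the signal energy captured by the window grows with $\min(T_p, T_e)$, while the standard deviation of the integrated noise energy grows as $\sqrt{T_e}$, so

$$ d'_{\text{pred}} \;\propto\; \frac{P_s \, \min(T_p, T_e)}{\sqrt{T_e}}, $$

where $P_s$ is the signal power. Eliminating $P_s$ through the matched condition $d'_m \propto P_s \sqrt{T_p}$ gives

$$ d'_{\text{pred}} \;=\; d'_m \, \frac{\min(T_p, T_e)}{\sqrt{T_p T_e}} \;=\; d'_m \sqrt{\frac{\min(T_p, T_e)}{\max(T_p, T_e)}}. $$

Under the same assumptions, a multiple-look account in which independent short looks are summed uniformly over the attended window yields the same mismatch factor. For example, a probe four times shorter (or longer) than expected would be predicted to fall from $d' \approx 1.5$ to $d' \approx 0.75$, a decline of the kind observed in experiment 1.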