We report a novel multisensory decision task designed to encourage subjects to combine information across both time and sensory modalities. We presented subjects (humans and rats) with multisensory event streams consisting of a series of brief auditory and/or visual events. Subjects judged whether the event rate of these streams was high or low. We have three main findings. First, subjects can combine multisensory information over time to improve judgments about whether a fluctuating rate is high or low. Importantly, the improvement we observed was frequently close to, or better than, the statistically optimal prediction. Second, subjects showed a clear multisensory enhancement both when the inputs in each modality were redundant and when they provided independent evidence about the rate. This latter finding suggests a model in which event rates are estimated separately for each modality and fused at a later stage. Finally, because a similar multisensory enhancement was observed in both humans and rats, we conclude that the ability to optimally exploit sequentially presented multisensory information is not restricted to a particular species.
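To make the "statistically optimal prediction" concrete: in the standard maximum-likelihood account of cue combination, unimodal sensitivities (d') with independent Gaussian noise combine as the square root of the sum of their squares. The sketch below illustrates this benchmark; the numeric sensitivities are hypothetical values for illustration, not results from this study.

```python
import math

def optimal_dprime(d_aud, d_vis):
    """Maximum-likelihood prediction for combined sensitivity (d')
    assuming independent Gaussian noise in the two unimodal estimates."""
    return math.sqrt(d_aud ** 2 + d_vis ** 2)

# Hypothetical unimodal sensitivities (illustrative only)
d_aud, d_vis = 1.2, 0.9
print(optimal_dprime(d_aud, d_vis))  # sqrt(1.44 + 0.81) = 1.5
```

A subject whose multisensory d' reaches this predicted value is said to integrate optimally; falling short indicates sub-optimal fusion of the two streams.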