We investigated how listeners perceive the temporal relationship between a light flash and a complex acoustic signal. The stimulus mimics events that are ubiquitous in busy scenes and manifest as a change in the pattern of on-going fluctuation. Detecting pattern emergence inherently requires integration over time, so such events are detected later than they occur. How does this delayed detection affect the perception of these events relative to other events in the scene? To model such situations, we used rapid sequences of tone pips whose time-frequency pattern changes from random to regular ("RAND-REG") or vice versa ("REG-RAND"). REG-RAND transitions are detected rapidly, but RAND-REG transitions take longer to detect (∼880 ms post nominal transition). Using a temporal order judgment task, we instructed subjects to indicate whether the flash appeared before or after the acoustic transition. The point of subjective simultaneity between the flash and RAND-REG did not occur at the point of detection (∼880 ms post nominal transition) but ∼470 ms closer to the nominal acoustic transition. In a second experiment we halved the tone-pip duration. The resulting pattern of performance was qualitatively similar to that in Experiment 1, but scaled by half. Our results indicate that the brain possesses mechanisms that survey the proximal history of an on-going stimulus and automatically adjust perception to compensate for prolonged detection time, thus producing more accurate representations of scene dynamics. However, this readjustment is incomplete.
Keywords: audio-visual temporal order judgment; auditory scene analysis; change detection; time perception.