A common representation of time across visual and auditory modalities

Neuropsychologia. 2018 Oct;119:223-232. doi: 10.1016/j.neuropsychologia.2018.08.014. Epub 2018 Aug 22.


Humans' and non-human animals' ability to process time on the scale of milliseconds and seconds is essential for adaptive behaviour. A central question about how brains keep track of time is how specific temporal information is to each sensory modality. In the present study, we show that the encoding of temporal intervals in the auditory and visual modalities is qualitatively similar. Human participants were instructed to reproduce intervals in the range from 750 ms to 1500 ms marked by auditory or visual stimuli. Our behavioural results suggest that, although participants were more accurate in reproducing intervals marked by auditory stimuli, performance was strongly correlated between modalities. Using multivariate pattern analysis of scalp EEG, we show that activity during late periods of the intervals was similar within and between modalities. Critically, we show that a multivariate pattern classifier was able to accurately predict the elapsed interval even when trained on intervals marked by a stimulus of a different sensory modality. Taken together, our results suggest that, while there are differences in the processing of intervals marked by auditory and visual stimuli, the two modalities also share a common neural representation.
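The cross-modal decoding logic described above (train a classifier on trials from one modality, test it on trials from the other) can be illustrated with a minimal sketch. This is not the authors' analysis pipeline: it uses synthetic data, a simple nearest-centroid classifier as a stand-in for their MVPA classifier, and assumed parameters (feature count, trial count, noise levels). Above-chance accuracy on the held-out modality is what would indicate a shared representation.

```python
import random

random.seed(0)
N_FEATURES = 32   # hypothetical number of EEG-derived features per trial
N_TRIALS = 200    # hypothetical trials per modality per interval class

# Shared "temporal code": each interval class gets a pattern common to
# both modalities, standing in for the common representation reported.
shared = {c: [random.gauss(0, 1) for _ in range(N_FEATURES)]
          for c in ("short", "long")}

def simulate(modality_offset, cls):
    """One synthetic trial: shared class pattern + modality shift + noise."""
    return [shared[cls][i] + modality_offset + random.gauss(0, 1)
            for i in range(N_FEATURES)]

def make_trials(offset):
    return [(simulate(offset, c), c)
            for c in ("short", "long") for _ in range(N_TRIALS)]

auditory = make_trials(0.5)   # training set: auditory-marked intervals
visual = make_trials(-0.5)    # test set: visual-marked intervals

# Nearest-centroid classifier fit on auditory trials only.
def centroid(trials, cls):
    rows = [x for x, c in trials if c == cls]
    return [sum(col) / len(rows) for col in zip(*rows)]

cents = {c: centroid(auditory, c) for c in ("short", "long")}

def predict(x):
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, cents[c]))
    return min(("short", "long"), key=dist)

# Cross-modal test: classify visual trials with the auditory-trained model.
acc = sum(predict(x) == c for x, c in visual) / len(visual)
print(f"cross-modal decoding accuracy: {acc:.2f}")  # well above 0.5 chance
```

Because the modality offset shifts every feature equally for both class centroids, it cancels in the distance comparison, so the shared class patterns drive classification; a modality-specific (non-shared) code would instead leave cross-modal accuracy at chance.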

Keywords: Audition; EEG; Multivariate pattern analysis; Time perception; Vision.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adolescent
  • Adult
  • Auditory Perception / physiology*
  • Brain / physiology*
  • Electroencephalography
  • Female
  • Humans
  • Male
  • Multivariate Analysis
  • Signal Processing, Computer-Assisted
  • Time Factors
  • Time Perception / physiology*
  • Visual Perception / physiology*
  • Young Adult