Shared Representation of Visual and Auditory Motion Directions in the Human Middle-Temporal Cortex

Curr Biol. 2020 Jun 22;30(12):2289-2299.e8. doi: 10.1016/j.cub.2020.04.039. Epub 2020 May 21.

Abstract

The human occipito-temporal region hMT+/V5 is well known for processing visual motion direction. Here, we demonstrate that hMT+/V5 also represents the direction of auditory motion, in a format partially aligned with the one used to code visual motion. We show that auditory and visual motion directions can be reliably decoded in individually localized hMT+/V5 and that motion directions in one modality can be predicted from the activity patterns elicited by the other modality. Despite this shared motion-direction information across the senses, however, vision and audition produce opposite overall voxel-wise responses in hMT+/V5. Our results reveal a multifaceted representation of multisensory motion signals in hMT+/V5 and have broader implications for how we understand the division of sensory labor between brain regions dedicated to specific perceptual functions.

Keywords: MVPA; RSA; auditory; cross-modal; decoding; fMRI; hMT(+)/V5; motion; multimodal; visual.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Auditory Perception / physiology*
  • Female
  • Humans
  • Male
  • Motion Perception / physiology*
  • Temporal Lobe / physiology*
  • Visual Perception / physiology*
  • Young Adult