How actions shape perception: learning action-outcome relations and predicting sensory outcomes promote audio-visual temporal binding

Sci Rep. 2016 Dec 16;6:39086. doi: 10.1038/srep39086.

Abstract

To maintain a temporally unified representation of the audio and visual features of objects in our environment, the brain recalibrates audio-visual simultaneity. This process compensates for differences in both the transmission times and the processing times of audio and visual signals. In four experiments, we show that the cognitive processes controlling instrumental actions also have a strong influence on audio-visual recalibration. Participants learned that right- and left-hand button-presses each produced a specific audio-visual stimulus. Following one action, the audio preceded the visual stimulus, while for the other action the audio lagged the visual stimulus. In a subsequent test phase, left and right button-presses generated either the same audio-visual stimulus pair as learned initially or the pair associated with the other action. We observed recalibration of simultaneity only for previously learned audio-visual outcomes. Thus, learning an action-outcome relation promotes temporal grouping of the audio and visual events within the outcome pair, contributing to the creation of a temporally unified multisensory object. This suggests that learning action-outcome relations and predicting perceptual outcomes can provide an integrative temporal structure for our experiences of external events.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Auditory Perception / physiology*
  • Female
  • Humans
  • Learning / physiology*
  • Male
  • Photic Stimulation
  • Visual Perception / physiology*
  • Young Adult