Computational principles and models of multisensory integration

Curr Opin Neurobiol. 2017 Apr;43:25-34. doi: 10.1016/j.conb.2016.11.002. Epub 2016 Dec 2.


Combining information from multiple senses creates robust percepts, speeds up responses, enhances learning, and improves detection, discrimination, and recognition. In this review, I discuss computational models and principles that provide insight into how this process of multisensory integration occurs at the behavioral and neural levels. My initial focus is on drift-diffusion and Bayesian models that can predict behavior in multisensory contexts. I then highlight how recent neurophysiological and perturbation experiments provide evidence for a distributed, redundant network for multisensory integration. I also emphasize studies that show that task-relevant variables in multisensory contexts are distributed across heterogeneous neural populations. Finally, I describe dimensionality reduction methods and recurrent neural network models that may help decipher heterogeneous neural populations involved in multisensory integration.
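The two behavioral model classes named in the abstract can be illustrated with minimal toy simulations. The sketch below is not from the review itself: the `fuse` function implements standard maximum-likelihood (precision-weighted) Bayesian cue combination, and `ddm_rt` simulates a basic two-bound drift-diffusion trial; all numeric parameters (cue variances, drift rates, bound, noise) are illustrative assumptions.

```python
import numpy as np

def fuse(est_a, var_a, est_b, var_b):
    """Maximum-likelihood (Bayesian) cue combination: the optimal estimate
    is a precision-weighted average of two independent Gaussian cues."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    fused_est = w_a * est_a + (1.0 - w_a) * est_b
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_b)  # below either input variance
    return fused_est, fused_var

def ddm_rt(drift, bound=1.0, dt=1e-3, noise=1.0, max_t=10.0, rng=None):
    """Simulate one drift-diffusion trial; return the time at which the
    accumulated evidence first reaches either +/- bound."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t

# Cue combination: the more reliable cue (variance 1 vs. 4) pulls the fused
# estimate toward itself, and the fused variance falls below both inputs.
est, var = fuse(10.0, 1.0, 14.0, 4.0)  # -> (10.8, 0.8)

# Drift diffusion: modeling multisensory input as a stronger drift rate
# predicts faster mean decision times than a weaker unisensory drift.
rng = np.random.default_rng(0)
uni = np.mean([ddm_rt(0.5, rng=rng) for _ in range(200)])
multi = np.mean([ddm_rt(1.5, rng=rng) for _ in range(200)])
```

The variance reduction produced by `fuse` is the signature behavioral prediction of Bayesian integration, while the reaction-time speedup from a larger drift rate is one simple way drift-diffusion models capture faster multisensory responses.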

Publication types

  • Review

MeSH terms

  • Bayes Theorem
  • Humans
  • Learning / physiology
  • Models, Biological*
  • Nerve Net / physiology
  • Sensation / physiology*