Multimodal sensory integration and concurrent navigation strategies for spatial cognition in real and artificial organisms

J Integr Neurosci. 2007 Sep;6(3):327-66. doi: 10.1142/s0219635207001593.

Abstract

Flexible spatial behavior requires the ability to orchestrate the interaction of multiple parallel processes. At the sensory level, multimodal inputs must be combined to produce a robust description of the spatiotemporal properties of the environment. At the action-selection level, multiple concurrent navigation policies must be dynamically weighted in order to adopt the strategy best adapted to the complexity of the task. Different neural substrates mediate the processing of spatial information, and elucidating their anatomo-functional interrelations is fundamental to unraveling the overall spatial memory function. Here we first address the multisensory integration issue and review a series of experimental findings (both behavioral and electrophysiological) concerning the neural bases of spatial learning and the way the brain builds unambiguous spatial representations from incoming multisensory streams. Second, we move to the navigation-strategy level and present an overview of experimental data that begin to explain the cooperation and competition between the brain areas involved in spatial navigation. Third, we introduce the spatial cognition function from a computational neuroscience and neuro-robotics viewpoint, and we provide an example of a neuro-computational model that focuses on the importance of combining multisensory percepts to enable a robot to acquire coherent (spatial) memories of its interaction with the environment.
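The two mechanisms summarized above (combining multimodal inputs into a robust estimate, and dynamically weighting concurrent navigation strategies) can be illustrated with a minimal sketch. The snippet below is a hypothetical toy version, not the model described in the article: it assumes inverse-variance (maximum-likelihood) fusion of noisy unimodal position estimates and softmax gating over strategy values; all function names and numbers are invented for illustration.

```python
# Toy illustration only: inverse-variance cue fusion and softmax strategy
# weighting. This does NOT reproduce the neuro-computational model in the
# paper; all names and parameters here are hypothetical.
import numpy as np


def fuse_cues(estimates, variances):
    """Inverse-variance (maximum-likelihood) fusion of unimodal estimates.

    estimates -- per-modality position estimates (e.g., visual, vestibular)
    variances -- per-modality noise variances (lower = more reliable)
    """
    precisions = 1.0 / np.asarray(variances, dtype=float)
    weights = precisions / precisions.sum()      # reliability-based weights
    fused = np.dot(weights, np.asarray(estimates, dtype=float))
    fused_variance = 1.0 / precisions.sum()      # fused estimate is more precise
    return fused, fused_variance


def strategy_weights(expected_values, temperature=1.0):
    """Softmax gating over concurrent navigation strategies.

    expected_values -- estimated value of each strategy (e.g., place-based
                       vs. cue-guided) in the current task context.
    """
    z = np.asarray(expected_values, dtype=float) / temperature
    z -= z.max()                                 # numerical stability
    w = np.exp(z)
    return w / w.sum()


if __name__ == "__main__":
    # Two modalities report a 1-D position with different reliabilities.
    pos, var = fuse_cues(estimates=[2.0, 2.6], variances=[0.1, 0.4])
    print(f"fused position: {pos:.2f} (variance {var:.3f})")

    # Weight a place-based vs. a cue-guided strategy by estimated value.
    w = strategy_weights(expected_values=[0.8, 0.3], temperature=0.5)
    print("strategy weights (place-based, cue-guided):", np.round(w, 3))
```

Under this sketch, the more reliable modality dominates the fused estimate, and the gating softly shifts control toward the strategy currently judged more valuable rather than switching between strategies all-or-none.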

Publication types

  • Editorial

MeSH terms

  • Afferent Pathways / physiology
  • Animals
  • Cognition / physiology*
  • Humans
  • Models, Neurological
  • Neural Networks, Computer*
  • Neurons / physiology*
  • Orientation
  • Sensation*
  • Space Perception / physiology*
  • Spatial Behavior / physiology*