A neuron-inspired computational architecture for spatiotemporal visual processing: real-time visual sensory integration for humanoid robots

Biol Cybern. 2014 Jun;108(3):249-59. doi: 10.1007/s00422-014-0597-3. Epub 2014 Apr 1.

Abstract

In this article, we present a neurologically motivated computational architecture for visual information processing. The architecture rests on three strategies: hierarchical processing, parallel and concurrent processing, and modularity. It is modular and expandable in both hardware and software, so it can also handle multisensory integration, making it an ideal tool for validating and applying computational neuroscience models in real time under real-world conditions. We apply our architecture in real time to validate a long-standing biologically inspired visual object recognition model, HMAX. In this context, the overall aim is to supply a humanoid robot with the ability to perceive and understand its environment, with a focus on the active aspect of real-time spatiotemporal visual processing. We show that our approach can simulate information processing in the visual cortex in real time and that our entropy-adaptive modification of HMAX achieves higher efficiency and classification performance than the standard model (up to ∼+6%).
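For readers unfamiliar with HMAX, the model alternates "simple" (S) layers that apply template matching, typically Gabor filters modeling V1 simple cells, with "complex" (C) layers that max-pool over position to gain translation tolerance. The following is a minimal NumPy sketch of one S1/C1 stage; the filter size, wavelength, and pooling extent are illustrative choices, not the parameters used in the paper, and the entropy-adaptive modification described in the abstract is not shown.

```python
import numpy as np

def gabor_kernel(size=11, wavelength=5.0, theta=0.0, sigma=3.0):
    """Gabor filter approximating a V1 simple-cell receptive field."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = (np.exp(-(x**2 + y**2) / (2 * sigma**2))
         * np.cos(2 * np.pi * xr / wavelength))
    return g - g.mean()  # zero-mean: flat image regions give no response

def s1_layer(image, thetas):
    """S1: convolve the image with Gabor filters at several orientations."""
    maps = []
    for theta in thetas:
        k = gabor_kernel(theta=theta)
        kh, kw = k.shape
        h, w = image.shape
        out = np.zeros((h - kh + 1, w - kw + 1))
        for i in range(out.shape[0]):          # naive 'valid' convolution
            for j in range(out.shape[1]):
                out[i, j] = np.abs((image[i:i+kh, j:j+kw] * k).sum())
        maps.append(out)
    return maps

def c1_layer(s1_maps, pool=4):
    """C1: local max pooling over position for translation tolerance."""
    pooled = []
    for m in s1_maps:
        h, w = m.shape
        ph, pw = h // pool, w // pool
        trimmed = m[:ph * pool, :pw * pool]
        pooled.append(trimmed.reshape(ph, pool, pw, pool).max(axis=(1, 3)))
    return pooled
```

A full HMAX pipeline stacks further S/C stages (S2 prototype matching, C2 global max) before a classifier; this sketch only illustrates the characteristic alternation of filtering and pooling.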

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Computer Simulation
  • Humans
  • Models, Neurological*
  • Neurons
  • Robotics*
  • Software
  • Visual Cortex
  • Visual Perception*