FPGA-based multimodal embedded sensor system integrating low- and mid-level vision

Sensors (Basel). 2011;11(8):8164-79. doi: 10.3390/s110808164. Epub 2011 Aug 22.

Abstract

Motion estimation is a low-level vision task that is especially relevant due to its wide range of real-world applications. Many of the best motion estimation algorithms include some of the features found in mammalian visual systems, which demand huge computational resources and are therefore not usually attainable in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) in order to produce a mid-level vision abstraction layer. The results are described through experiments showing the validity of the proposed system, together with an analysis of the computational resources and performance of the applied algorithms.
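To illustrate the kind of combination the abstract describes, the following minimal Python sketch computes the two low-level primitives named there (optical flow and image moments) and fuses them into a simple per-frame descriptor. It is not the authors' FPGA implementation: it substitutes dense Farneback optical flow and raw geometric moments for the paper's hardware optical flow core and orthogonal variant moments, and the frame file names are hypothetical.

import cv2
import numpy as np

def low_level_primitives(prev_gray, curr_gray):
    # Dense optical flow between two consecutive grayscale frames
    # (Farneback method, used here only as an illustrative stand-in).
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Geometric moments of the current frame, an illustrative substitute
    # for the orthogonal variant moments used in the paper.
    m = cv2.moments(curr_gray)
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])

    # A toy "mid-level" descriptor: mean motion vector plus intensity centroid.
    mean_flow = flow.reshape(-1, 2).mean(axis=0)
    return np.array([mean_flow[0], mean_flow[1], centroid[0], centroid[1]])

if __name__ == "__main__":
    # Hypothetical frame files from a video sequence.
    prev_gray = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    curr_gray = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
    print(low_level_primitives(prev_gray, curr_gray))

In the paper itself these primitives are computed by dedicated hardware pipelines on the FPGA; the sketch above only conveys how low-level measurements can be aggregated into a mid-level description.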

Keywords: VLSI; bio-inspired systems; machine vision; optical flow; orthogonal variant moments.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Animals
  • Biosensing Techniques
  • Cerebral Cortex / physiology
  • Computers*
  • Equipment Design
  • Humans
  • Imaging, Three-Dimensional / methods*
  • Models, Statistical
  • Motion
  • Optics and Photonics
  • Pattern Recognition, Automated / methods*
  • Reproducibility of Results
  • Vision, Ocular