Peripersonal Space and Margin of Safety around the Body: Learning Visuo-Tactile Associations in a Humanoid Robot with Artificial Skin

PLoS One. 2016 Oct 6;11(10):e0163713. doi: 10.1371/journal.pone.0163713. eCollection 2016.


This paper investigates a biologically motivated model of peripersonal space through its implementation on a humanoid robot. Guided by the present understanding of the neurophysiology of the fronto-parietal system, we developed a computational model inspired by the receptive fields of polymodal neurons identified, for example, in brain areas F4 and VIP. The experiments on the iCub humanoid robot show that the peripersonal space representation i) can be learned efficiently and in real time via simple interaction with the robot, ii) can support the generation of behaviors such as avoidance and reaching, and iii) can contribute to understanding the biological principle of motor equivalence. More specifically, with respect to i) the present model contributes a hypothesis about the learning mechanisms underlying peripersonal space; in relation to ii) we show how a relatively simple controller can exploit the learned receptive fields to generate either avoidance or reaching of an incoming stimulus; and for iii) we show how the robot can select arbitrary body parts as the controlled end-point of an avoidance or reaching movement.
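The core of such a representation can be illustrated with a toy sketch of the visuo-tactile learning described above: each skin taxel keeps a record of how often visual stimuli observed at a given distance went on to make contact with it, yielding an estimated probability of contact as a function of distance. The class and parameter names below are hypothetical, and the discretized-histogram scheme is a simplification for illustration, not the paper's actual estimator.

```python
import numpy as np

class TaxelReceptiveField:
    """Toy receptive field for a single skin taxel (illustrative sketch).

    Distance from the taxel is discretized into bins; for each bin we
    count how many observed stimulus trajectories passing through it
    ended in contact with the taxel, giving P(contact | distance).
    """

    def __init__(self, max_dist=0.4, n_bins=8):
        # Bin edges from the taxel surface out to max_dist (meters).
        self.edges = np.linspace(0.0, max_dist, n_bins + 1)
        self.positive = np.zeros(n_bins)  # samples from trajectories ending in contact
        self.total = np.zeros(n_bins)     # all observed samples

    def _bin(self, d):
        # Map a distance to its histogram bin, clamped to the valid range.
        return int(np.clip(np.digitize(d, self.edges) - 1, 0, len(self.total) - 1))

    def update(self, distances, contact):
        """Record one stimulus trajectory.

        distances: sequence of stimulus-to-taxel distances along the trajectory.
        contact: True if the trajectory ended touching this taxel.
        """
        for d in distances:
            i = self._bin(d)
            self.total[i] += 1
            if contact:
                self.positive[i] += 1

    def p_contact(self, d):
        """Estimated probability that a stimulus at distance d will touch the taxel."""
        i = self._bin(d)
        return self.positive[i] / self.total[i] if self.total[i] else 0.0
```

In use, trajectories that approach and touch the skin raise the contact probability in the near bins, while stimuli that pass by at a distance lower it there, so the learned profile falls off with distance, mirroring the graded "margin of safety" that an avoidance or reaching controller can then act on.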

MeSH terms

  • Humans
  • Learning / physiology
  • Personal Space*
  • Probability
  • Robotics*
  • Safety*
  • Skin, Artificial*
  • Space Perception
  • Touch Perception*
  • Visual Perception*

Grants and funding

AR was supported by the 7th European Community Framework Programme project Xperience (FP7-ICT-270273). MH was supported by the Swiss National Science Foundation (Prospective Researcher Fellowship PBZHP2-147259) and by a Marie Curie Intra European Fellowship (iCub Body Schema 625727) within the 7th European Community Framework Programme. LF was supported by the 7th European Community Framework Programme project POETICON++ (FP7-ICT-2011-7) and by the MIUR-PRIN grant 2010MEFNF7_003. UP and GM were supported by the 7th European Community Framework Programme project WYSIWYD (FP7-ICT-612139).