A neurobehavioral model of flexible spatial language behaviors

J Exp Psychol Learn Mem Cogn. 2012 Nov;38(6):1490-511. doi: 10.1037/a0022643. Epub 2011 Apr 25.

Abstract

We propose a neural dynamic model that specifies how low-level visual processes can be integrated with higher-level cognition to achieve flexible spatial language behaviors. This model uses real-world visual input that is linked to relational spatial descriptions through a neural mechanism for reference frame transformations. We demonstrate that the system can extract spatial relations from visual scenes, select items based on relational spatial descriptions, and perform reference object selection in a single unified architecture. We further show that the performance of the system is consistent with behavioral data in humans by simulating results from 2 independent empirical studies: 1 spatial term rating task and 1 study of reference object selection behavior. The architecture we present thereby achieves a high degree of task flexibility under realistic stimulus conditions. At the same time, it also provides a detailed neural grounding for complex behavioral and cognitive processes.
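The core idea of linking a visual scene to relational spatial descriptions via a reference frame transformation can be illustrated with a toy sketch. The code below is not the authors' neural dynamic model: it is a minimal illustrative analogue that shifts a target's image coordinates into a frame centered on a reference object and then rates a few spatial terms ("above", "left of", ...) by how closely the relative direction matches each term's canonical axis. All function names, terms, and falloff choices here are illustrative assumptions.

```python
import math

# Canonical direction for each spatial term, in radians (illustrative).
# y grows upward in this toy image frame.
TERM_AXES = {
    "right of": 0.0,
    "above": math.pi / 2,
    "left of": math.pi,
    "below": -math.pi / 2,
}

def to_reference_frame(target, reference):
    """Reference frame transformation (toy version): shift coordinates
    so the reference object sits at the origin."""
    return (target[0] - reference[0], target[1] - reference[1])

def rate_terms(target, reference):
    """Rate each spatial term in [0, 1]: 1 when the target lies exactly
    on the term's canonical axis, decreasing linearly with angular
    deviation and reaching 0 at 90 degrees off-axis."""
    dx, dy = to_reference_frame(target, reference)
    angle = math.atan2(dy, dx)
    ratings = {}
    for term, axis in TERM_AXES.items():
        # Smallest signed angular difference, wrapped to [-pi, pi].
        diff = math.atan2(math.sin(angle - axis), math.cos(angle - axis))
        ratings[term] = max(0.0, 1.0 - abs(diff) / (math.pi / 2))
    return ratings

# A target directly above the reference object is rated highest for "above".
ratings = rate_terms(target=(5.0, 8.0), reference=(5.0, 3.0))
best_term = max(ratings, key=ratings.get)
```

In this sketch the graded ratings stand in, very loosely, for the graded acceptability judgments that the spatial term rating task probes; the same machinery run "in reverse" (scoring candidate items or candidate reference objects against a given term) corresponds to the item selection and reference object selection behaviors described above.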

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Cognition
  • Humans
  • Language*
  • Models, Psychological*
  • Photic Stimulation
  • Space Perception*
  • Speech
  • Verbal Behavior*
  • Visual Perception