Eye Movements Help Link Different Views in Scene-Selective Cortex

Cereb Cortex. 2011 Sep;21(9):2094-102. doi: 10.1093/cercor/bhq292. Epub 2011 Jan 31.

Abstract

To explore visual scenes in the everyday world, we constantly move our eyes, yet most neural studies of scene processing are conducted with the eyes held fixed. Such prior work in humans suggests that the parahippocampal place area (PPA) represents scenes in a highly specific manner that can differentiate between different but overlapping views of a panoramic scene. Using functional magnetic resonance imaging (fMRI) adaptation to measure sensitivity to change, we asked how this specificity is affected when active eye movements across a stable scene generate retinotopically different views. The PPA adapted to successive views when subjects made a series of saccades across a stationary spatiotopic scene but not when the eyes remained fixed and a scene translated in the background, suggesting that active vision may provide important cues for the PPA to integrate different views over time as the "same." Adaptation was also robust when retinotopic information was preserved across views, that is, when the scene moved in tandem with the eyes. These data suggest that retinotopic physical similarity is fundamental, but the visual system may also utilize oculomotor cues and/or global spatiotopic information to generate more ecologically relevant representations of scenes across different views.
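As an illustration only (not the authors' analysis pipeline), the sketch below shows the logic of the fMRI adaptation measure described above: if the PPA treats successive views as the "same" scene, its response to repeated views should be reduced relative to novel views, and that reduction can be summarized as an adaptation index. The variable names, condition labels, and simulated response values are all hypothetical.

```python
# Minimal sketch, assuming simulated per-trial PPA response estimates
# (e.g., GLM betas in arbitrary units) for two hypothetical conditions.
# Adaptation is inferred when the response to repeated views is lower
# than the response to novel views.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: repeated views of a stable scene vs. novel scenes.
repeated_views = rng.normal(loc=0.8, scale=0.3, size=40)  # adapted (reduced) response
novel_views = rng.normal(loc=1.2, scale=0.3, size=40)     # unadapted (full) response

# Adaptation index: proportional reduction of the response to repeated views.
adaptation_index = (novel_views.mean() - repeated_views.mean()) / novel_views.mean()

print(f"mean response, repeated views: {repeated_views.mean():.2f}")
print(f"mean response, novel views:    {novel_views.mean():.2f}")
print(f"adaptation index:              {adaptation_index:.2f}")
```

In this framing, a larger index in the saccade-across-a-stable-scene condition than in the background-translation condition would correspond to the pattern of results reported in the abstract.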

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Adaptation, Psychological / physiology
  • Adult
  • Cues
  • Eye Movements / physiology*
  • Female
  • Fixation, Ocular / physiology
  • Humans
  • Image Processing, Computer-Assisted
  • Magnetic Resonance Imaging
  • Male
  • Parahippocampal Gyrus / physiology*
  • Photic Stimulation
  • Saccades / physiology
  • Space Perception / physiology
  • User-Computer Interface
  • Visual Cortex / physiology*
  • Visual Perception / physiology*
  • Young Adult