The profile of attentional modulation to visual features

J Vis. 2019 Nov 1;19(13):13. doi: 10.1167/19.13.13.


Although it is well established that feature-based attention (FBA) can enhance an attended feature, how it modulates unattended features remains less clear. Previous studies have generally supported either a graded profile, as predicted by the feature-similarity gain model, or a nonmonotonic profile, as predicted by the surround suppression model. To reconcile these views, we systematically measured the attentional profile in three basic feature dimensions: orientation, motion direction, and spatial frequency. In three experiments, participants detected a coherent feature signal against noise under attentional or neutral conditions. Our results support a nonmonotonic hybrid model of attentional modulation, consisting of feature-similarity gain and surround suppression, for orientation and motion direction. For spatial frequency, we found a similar nonmonotonic profile for frequencies higher than the attended frequency, but no attentional modulation for frequencies lower than it. These findings reconcile the discrepancies in the literature and suggest the hybrid model as a new framework for attentional modulation in feature space. In addition, a computational model incorporating known properties of spatial frequency channels and attentional modulation at the neural level reproduced the asymmetric attentional modulation, revealing a connection between surround suppression and the basic neural architecture of the early visual system.
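The nonmonotonic hybrid profile described above (enhancement at the attended feature, suppression for nearby features, recovery toward baseline for distant features) can be illustrated with a difference-of-Gaussians parameterization. This is a minimal sketch, not the paper's fitted model: the function name, amplitudes, and tuning widths (in degrees of feature distance) are hypothetical values chosen only to show the qualitative shape.

```python
import math

def hybrid_attention_gain(delta, a_center=1.0, sigma_center=15.0,
                          a_surround=0.4, sigma_surround=40.0):
    """Illustrative attentional modulation (relative to a neutral baseline)
    for a feature `delta` degrees away from the attended feature value.

    A narrow excitatory Gaussian (feature-similarity gain) minus a broader
    inhibitory Gaussian (surround suppression) yields the hybrid profile:
    positive at delta = 0, negative at intermediate distances, and near
    zero far from the attended feature. All parameters are hypothetical.
    """
    center = a_center * math.exp(-delta**2 / (2 * sigma_center**2))
    surround = a_surround * math.exp(-delta**2 / (2 * sigma_surround**2))
    return center - surround

# Qualitative shape of the profile:
print(hybrid_attention_gain(0))    # enhancement at the attended feature (> 0)
print(hybrid_attention_gain(40))   # suppression in the surround (< 0)
print(hybrid_attention_gain(180))  # recovery toward baseline (~ 0)
```

The asymmetry reported for spatial frequency (suppression only above the attended frequency) would require a skewed rather than symmetric surround term, reflecting the unequal bandwidths of spatial frequency channels.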

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Adult
  • Attention / physiology*
  • Cues
  • Female
  • Humans
  • Male
  • Motion Perception / physiology*
  • Orientation / physiology*
  • Spatio-Temporal Analysis*