Activity in perirhinal and entorhinal cortex predicts perceived visual similarities among category exemplars with highest precision

Elife. 2022 Mar 21;11:e66884. doi: 10.7554/eLife.66884.

Abstract

Vision neuroscience has made great strides in understanding the hierarchical organization of object representations along the ventral visual stream (VVS). How VVS representations capture fine-grained visual similarities between objects that observers subjectively perceive has received limited examination so far. In the current study, we addressed this question by focussing on perceived visual similarities among subordinate exemplars of real-world categories. We hypothesized that these perceived similarities are reflected with highest fidelity in neural activity patterns downstream from inferotemporal regions, namely in perirhinal (PrC) and anterolateral entorhinal cortex (alErC) in the medial temporal lobe. To address this issue with functional magnetic resonance imaging (fMRI), we administered a modified 1-back task that required discrimination between category exemplars as well as categorization. Further, we obtained observer-specific ratings of perceived visual similarities, which predicted behavioural discrimination performance during scanning. As anticipated, we found that activity patterns in PrC and alErC predicted the structure of perceived visual similarity relationships among category exemplars, including its observer-specific component, with higher precision than any other VVS region. Our findings provide new evidence that subjective aspects of object perception that rely on fine-grained visual differentiation are reflected with highest fidelity in the medial temporal lobe.
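For readers unfamiliar with the analysis logic summarized above, the following is a minimal sketch of a representational similarity analysis (RSA) of this general kind, in which a region's neural representational dissimilarity matrix (RDM) is compared against a behavioural RDM built from an observer's perceived-similarity ratings. The toy data shapes, the distance metric, the Spearman comparison, and the function name compare_rdms are all illustrative assumptions, not the authors' actual pipeline.

```python
# Sketch of a representational similarity analysis (RSA), assuming:
# - neural_patterns: one activity pattern (voxels) per category exemplar,
#   e.g. extracted from perirhinal cortex
# - rated_dissimilarity: an exemplar-by-exemplar matrix derived from an
#   observer's perceived visual similarity ratings
# All sizes and metric choices below are illustrative only.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_exemplars, n_voxels = 10, 200          # hypothetical sizes
neural_patterns = rng.normal(size=(n_exemplars, n_voxels))

# Neural RDM: 1 - Pearson correlation between exemplar activity patterns
neural_rdm = squareform(pdist(neural_patterns, metric="correlation"))

# Behavioural RDM from (toy) perceived-similarity ratings in [0, 1]
rated_similarity = rng.uniform(size=(n_exemplars, n_exemplars))
rated_similarity = (rated_similarity + rated_similarity.T) / 2  # symmetrize
rated_dissimilarity = 1.0 - rated_similarity
np.fill_diagonal(rated_dissimilarity, 0.0)

def compare_rdms(rdm_a, rdm_b):
    """Rank-correlate the lower triangles of two square RDMs."""
    idx = np.tril_indices_from(rdm_a, k=-1)
    rho, p = spearmanr(rdm_a[idx], rdm_b[idx])
    return rho, p

rho, p = compare_rdms(neural_rdm, rated_dissimilarity)
print(f"neural-behavioural RDM correlation: rho={rho:.3f}, p={p:.3f}")
```

In the study's framing, a higher neural-behavioural correspondence for PrC and alErC than for earlier VVS regions would mirror the reported result; the paper's actual analysis (e.g. pattern estimation, noise-ceiling correction, observer-specific modelling) may differ from this sketch.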

Keywords: fMRI; human; medial temporal lobe; neuroscience; object recognition; ventral visual pathway; visual discrimination.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain Mapping
  • Entorhinal Cortex* / pathology
  • Magnetic Resonance Imaging
  • Pattern Recognition, Visual
  • Photic Stimulation
  • Temporal Lobe*

Grants and funding