Evidence for a deep, distributed and dynamic code for animacy in human ventral anterior temporal cortex

eLife. 2021 Oct 27;10:e66276. doi: 10.7554/eLife.66276.

Abstract

How does the human brain encode semantic information about objects? This paper reconciles two seemingly contradictory views. The first proposes that local neural populations independently encode semantic features; the second, that semantic representations arise as a dynamic distributed code that changes radically with stimulus processing. Combining simulations with a well-known neural network model of semantic memory, multivariate pattern classification, and human electrocorticography, we find that both views are partially correct: information about the animacy of a depicted stimulus is distributed across ventral temporal cortex in a dynamic code whose elements are feature-like posteriorly but change rapidly and nonlinearly in anterior regions. This pattern is consistent with the view that the anterior temporal lobes serve as a deep cross-modal 'hub' in an interactive semantic network, and more generally suggests that tertiary association cortices may adopt dynamic distributed codes that are difficult to detect with common brain imaging methods.
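To make the decoding logic concrete, the sketch below shows time-resolved multivariate pattern classification of the kind the abstract describes: training a separate linear classifier on the electrode pattern at each time sample and scoring it with cross-validation. This is a minimal illustration on synthetic data, not the authors' actual pipeline; the array shapes, the injected signal, and all parameter choices are assumptions for demonstration only.

```python
# Hypothetical sketch of time-resolved MVPA on ECoG-like data.
# Synthetic data and all parameters are illustrative assumptions,
# not the published analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic recordings: trials x electrodes x time samples, with binary
# animate/inanimate labels. Real input would be preprocessed ECoG epochs.
n_trials, n_electrodes, n_times = 200, 64, 150
X = rng.standard_normal((n_trials, n_electrodes, n_times))
y = rng.integers(0, 2, n_trials)  # 0 = inanimate, 1 = animate
# Inject a weak, time-limited animacy signal into a subset of electrodes.
X[y == 1, :16, 50:100] += 0.3

# Decode animacy independently at each time point: one linear classifier
# per sample, scored with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
accuracy = np.empty(n_times)
for t in range(n_times):
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

print(f"peak decoding accuracy: {accuracy.max():.2f} "
      f"at sample {accuracy.argmax()}")
```

A natural extension of this scheme, and the usual way a "dynamic" code is diagnosed, is temporal generalization: train the classifier at one time point and test it at every other. A static feature-like code generalizes broadly across time, whereas a rapidly changing code succeeds only near the training time point.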

Keywords: ECoG; cognition; human; MVPA; neural networks; neuroscience; semantic memory; temporal lobe.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adolescent
  • Adult
  • Brain Mapping
  • Electrocorticography
  • Female
  • Humans
  • Male
  • Memory / physiology*
  • Neural Networks, Computer
  • Temporal Lobe / physiology*
  • Young Adult