Visual features are processed before navigational affordances in the human brain

Sci Rep. 2024 Mar 6;14(1):5573. doi: 10.1038/s41598-024-55652-y.

Abstract

To navigate through their immediate environment, humans process scene information rapidly. How does the cascade of neural processing elicited by scene viewing unfold over time to facilitate navigational planning? To investigate, we recorded human brain responses to visual scenes with electroencephalography and related them to computational models that operationalize three aspects of scene processing (2D, 3D, and semantic information), as well as to a behavioral model capturing navigational affordances. We found a temporal processing hierarchy: navigational affordances are processed later than the other scene features (2D, 3D, and semantic) investigated. This reveals the temporal order in which the human brain computes complex scene information and suggests that the brain leverages these pieces of information to plan navigation.

MeSH terms

  • Brain*
  • Electroencephalography
  • Humans
  • Records
  • Semantics
  • Time Perception*