Psychophysical scaling reveals a unified theory of visual memory strength

Nat Hum Behav. 2020 Nov;4(11):1156-1172. doi: 10.1038/s41562-020-00938-0. Epub 2020 Sep 7.

Abstract

Almost all models of visual memory implicitly assume that errors in mnemonic representations are linearly related to distance in stimulus space. Here we show that neither memory nor perception is appropriately scaled in stimulus space; instead, both are based on a transformed similarity representation that is nonlinearly related to stimulus space. This result calls into question a foundational assumption of extant models of visual working memory. Once psychophysical similarity is taken into account, aspects of memory that have been thought to demonstrate a fixed working memory capacity of around three or four items, and to require fundamentally different representations across different stimuli, tasks and types of memory, can be parsimoniously explained with a unitary signal detection framework. These results have substantial implications for the study of visual memory and call for a reinterpretation of the relationship between perception, working memory and long-term memory.
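To make the core idea concrete, the following minimal sketch simulates the kind of framework the abstract describes: raw stimulus distance is first mapped through a nonlinear psychophysical similarity function, and reports then arise from a simple signal detection decision. The specific exponential (Shepard-style) similarity form, the max-rule decision, the color-wheel stimulus space and all parameter values here are assumptions chosen for illustration; the abstract itself only states that the similarity mapping is nonlinear and that a unitary signal detection framework accounts for the data.

```python
import numpy as np

def psychophysical_similarity(distance_deg, tau=20.0):
    """Similarity falls off nonlinearly (here, exponentially) with stimulus distance.

    The exponential form is an illustrative assumption, not taken from the abstract.
    """
    return np.exp(-np.abs(distance_deg) / tau)

def simulate_report_errors(d_prime=2.0, n_trials=10000, tau=20.0, rng=None):
    """Simulate single-item color reports on a hypothetical 360-degree wheel.

    Each candidate response receives a familiarity signal whose mean is
    d_prime * similarity(candidate, target) plus unit-variance Gaussian noise;
    the reported value is the candidate with the maximum signal (a signal
    detection decision rule, assumed here for illustration).
    """
    rng = np.random.default_rng() if rng is None else rng
    candidates = np.arange(-180, 180)                  # signed error relative to the target
    mean_strength = d_prime * psychophysical_similarity(candidates, tau)
    signals = mean_strength + rng.standard_normal((n_trials, candidates.size))
    return candidates[np.argmax(signals, axis=1)]      # signed report errors, in degrees

errors = simulate_report_errors(d_prime=2.0)
print(f"mean |error| = {np.mean(np.abs(errors)):.1f} deg")
```

In this sketch, lowering `d_prime` (memory strength) broadens the error distribution without requiring a separate "guessing" state, which illustrates how a single strength parameter defined over psychophysically scaled similarity can stand in for what other models attribute to discrete capacity limits.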

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Adult
  • Humans
  • Memory, Long-Term
  • Memory, Short-Term / physiology*
  • Models, Theoretical*
  • Psychophysics*
  • Recognition, Psychology / physiology*
  • Signal Detection, Psychological / physiology
  • Visual Perception / physiology*