Toward a unified theory of efficient, predictive, and sparse coding

Proc Natl Acad Sci U S A. 2018 Jan 2;115(1):186-191. doi: 10.1073/pnas.1711114115. Epub 2017 Dec 19.


A central goal in theoretical neuroscience is to predict the response properties of sensory neurons from first principles. To this end, "efficient coding" posits that sensory neurons encode maximal information about their inputs given internal constraints. There exist, however, many variants of efficient coding (e.g., redundancy reduction, different formulations of predictive coding, robust coding, sparse coding, etc.), differing in their regimes of applicability, in the relevance of signals to be encoded, and in the choice of constraints. It is unclear how these types of efficient coding relate or what is expected when different coding objectives are combined. Here we present a unified framework that encompasses previously proposed efficient coding models and extends to new regimes. We show that optimizing neural responses to encode predictive information can lead them to either correlate or decorrelate their inputs, depending on the stimulus statistics; in contrast, at low noise, efficiently encoding the past always predicts decorrelation. Further, we investigate coding of naturalistic movies and show that qualitatively different types of visual motion tuning and levels of response sparsity are predicted, depending on whether the objective is to recover the past or predict the future. Our approach promises a way to explain the observed diversity of sensory neural responses as arising from multiple functional goals and constraints fulfilled by different cell types and/or circuits.
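The low-noise result above (efficiently encoding the past predicts decorrelation) corresponds to the classical redundancy-reduction solution, in which an optimal linear encoder whitens its inputs. The following is a minimal illustrative sketch of that textbook special case only, not of the paper's full framework; the linear-Gaussian encoder, the AR(1) stimulus, and all variable names are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated Gaussian "stimulus": an AR(1) process, cut into windows of 4
# samples so that neighboring dimensions are strongly correlated.
T, d, rho = 20000, 4, 0.9
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.standard_normal()
X = x.reshape(-1, d)  # rows = input vectors

# Empirical input covariance: large off-diagonal entries (redundancy).
C_in = np.cov(X, rowvar=False)

# In the low-noise limit, the efficient linear code is a whitening
# transform W = C_in^{-1/2}, computed here via eigendecomposition.
evals, evecs = np.linalg.eigh(C_in)
W = evecs @ np.diag(evals ** -0.5) @ evecs.T

# The responses Y = X W are decorrelated: covariance ~ identity.
Y = X @ W
C_out = np.cov(Y, rowvar=False)
```

With prediction of the future as the objective instead, the paper shows the optimal code need not decorrelate; this sketch covers only the past-encoding, low-noise case.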

Keywords: efficient coding; information theory; neural coding; prediction; sparse coding.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Humans
  • Models, Neurological*
  • Sensory Receptor Cells / physiology*