Mutual information of population codes and distance measures in probability space

Phys Rev Lett. 2001 May 21;86(21):4958-61. doi: 10.1103/PhysRevLett.86.4958.

Abstract

We studied the mutual information between a stimulus and a system consisting of stochastic, statistically independent elements that respond to the stimulus. Using statistical-mechanical methods, we calculate the properties of the mutual information (MI) in the limit of a large system size N. For continuous-valued stimuli, the MI increases logarithmically with N and is related to the log of the Fisher information of the system. For discrete stimuli, the MI saturates exponentially with N. We find that the saturation exponent of the MI is the Chernoff distance between the response probabilities induced by different stimuli.
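A minimal numerical sketch of the discrete-stimulus result (not from the paper; the parameter values are illustrative assumptions): take two equiprobable stimuli, each driving N independent Bernoulli units with response probabilities p and q. The MI then saturates at 1 bit (ln 2 nats), and the gap ln 2 − I should shrink exponentially in N at a rate close to the Chernoff distance between the two single-unit response distributions.

```python
import math

def binom_pmf(k, n, p):
    # probability of k active units out of n, each active with prob p
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def chernoff_distance(p, q, grid=1000):
    # C = -min_{0<l<1} ln( p^l q^(1-l) + (1-p)^l (1-q)^(1-l) )
    # for two Bernoulli response distributions, via a simple grid search
    best = float("inf")
    for i in range(1, grid):
        lam = i / grid
        s = p**lam * q**(1 - lam) + (1 - p)**lam * (1 - q)**(1 - lam)
        best = min(best, math.log(s))
    return -best

def mutual_info_nats(n, p, q):
    # MI between a uniform binary stimulus S and the population response;
    # the spike count k ~ Binomial(n, p) or Binomial(n, q) is sufficient
    mi = 0.0
    for k in range(n + 1):
        pk0 = binom_pmf(k, n, p)
        pk1 = binom_pmf(k, n, q)
        pbar = 0.5 * (pk0 + pk1)
        for pk in (pk0, pk1):
            if pk > 0:
                mi += 0.5 * pk * math.log(pk / pbar)
    return mi

p, q = 0.2, 0.8          # hypothetical per-unit response probabilities
C = chernoff_distance(p, q)
d20 = math.log(2) - mutual_info_nats(20, p, q)   # gap to saturation, N=20
d40 = math.log(2) - mutual_info_nats(40, p, q)   # gap to saturation, N=40
rate = (math.log(d20) - math.log(d40)) / 20       # empirical decay exponent
```

With these symmetric parameters the Chernoff distance is attained at λ = 1/2 and equals −ln(2√(pq·(1−p)(1−q)))... here −ln 0.8 ≈ 0.223 nats per unit, and the empirically fitted decay rate of the MI gap lands close to it (with a small polynomial-prefactor correction at finite N).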

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain / physiology
  • Data Interpretation, Statistical
  • Electronic Data Processing
  • Mental Processes / physiology
  • Models, Neurological*
  • Neurons / physiology
  • Probability*
  • Stochastic Processes