Mean field and capacity in realistic networks of spiking neurons storing sparsely coded random memories

Neural Comput. 2004 Dec;16(12):2597-637. doi: 10.1162/0899766042321805.

Abstract

Mean-field (MF) theory is extended to realistic networks of spiking neurons storing, in their synaptic couplings, a set of randomly chosen stimuli of a given low coding level. The underlying synaptic matrix is the result of a generic, slow, long-term synaptic plasticity of two-state synapses, upon repeated presentation of the fixed set of stimuli to be stored. The neural populations subtending the MF description are classified by the number of stimuli to which their neurons are responsive (multiplicity). This involves 2p + 1 populations for a network storing p memories. The computational complexity of the MF description is then significantly reduced by observing that at low coding levels (f), only a few populations remain relevant: the population of mean multiplicity (~pf) and those with multiplicities within about √(pf) of the mean. The theory is used to predict bifurcation diagrams (the onset of selective delay activity and the rates in its various stationary states) and to compute the storage capacity of the network (the maximal number of items used in training for each of which the network can sustain a persistent, selective activity state). This is done in various regions of the space of constitutive parameters for the neurons and for the learning process. The capacity is computed in MF as a function of the potentiation amplitude, the ratio of potentiation to depression probability, and the coding level f. The MF results compare well with recordings of delay activity rate distributions in simulations of the underlying microscopic network of 10,000 neurons.
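As a minimal illustrative sketch (not taken from the paper), the reduction to a few relevant populations can be seen by noting that if each of the p stored stimuli recruits a neuron independently with probability f, a neuron's multiplicity follows a Binomial(p, f) distribution, which concentrates around pf with spread √(pf(1−f)). The values p = 100 and f = 0.05 below are assumed purely for illustration.

```python
# Sketch: at low coding level f, the multiplicity distribution Binomial(p, f)
# concentrates around p*f, so only a few of the 2p+1 MF populations carry
# appreciable weight. Parameter values here are illustrative assumptions.
import numpy as np
from scipy.stats import binom

p = 100   # number of stored memories (assumed for illustration)
f = 0.05  # coding level (assumed for illustration)

mean = p * f
spread = np.sqrt(p * f * (1 - f))

# Fraction of neurons whose multiplicity falls within k spreads of the mean.
for k in (1, 2, 3):
    lo = int(np.floor(mean - k * spread))
    hi = int(np.ceil(mean + k * spread))
    mass = binom.cdf(hi, p, f) - binom.cdf(lo - 1, p, f)
    n_pops = hi - lo + 1
    print(f"within {k}*sqrt(pf(1-f)): multiplicities {lo}..{hi} "
          f"({n_pops} populations) cover {mass:.3f} of neurons")
```

Running this shows that a handful of multiplicity values around pf account for nearly all neurons, which is the observation the abstract uses to cut the computational cost of the MF description.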

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Artificial Intelligence
  • Computer Simulation
  • Computer Systems
  • Memory / physiology*
  • Models, Neurological
  • Neural Networks, Computer*
  • Neurons / physiology*