Gradient learning in spiking neural networks by dynamic perturbation of conductances

Phys Rev Lett. 2006 Jul 28;97(4):048104. doi: 10.1103/PhysRevLett.97.048104. Epub 2006 Jul 28.

Abstract

We present a method of estimating the gradient of an objective function with respect to the synaptic weights of a spiking neural network. The method works by measuring the fluctuations in the objective function in response to dynamic perturbation of the membrane conductances of the neurons. It is compatible with recurrent networks of conductance-based model neurons with dynamic synapses. The method can be interpreted as a biologically plausible synaptic learning rule, if the dynamic perturbations are generated by a special class of "empiric" synapses driven by random spike trains from an external source.
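The core idea of the abstract, correlating fluctuations in the objective with injected perturbations to estimate a gradient, can be illustrated in a much simpler setting. The sketch below uses a rate-based linear network rather than the paper's spiking, conductance-based neurons, and perturbs each unit's net input directly instead of via "empiric" synapses; all variable names and the squared-error objective are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy proxy for perturbation-based gradient estimation: inject small random
# perturbations xi into each unit's net input, measure the resulting change
# in the objective R, and correlate that change with the perturbations.
# (The paper does this with dynamic conductance perturbations in spiking
# networks; this rate-based linear model is only a minimal illustration.)

n_in, n_out = 4, 3
W = rng.normal(size=(n_out, n_in))   # synaptic weights
x = rng.normal(size=n_in)            # fixed input pattern
y_target = rng.normal(size=n_out)    # fixed target output

def objective(W, xi=0.0):
    """Negative squared error; xi perturbs each unit's net input."""
    y = W @ x + xi
    return -np.sum((y - y_target) ** 2)

R0 = objective(W)          # unperturbed baseline
sigma = 1e-3               # perturbation amplitude
n_trials = 20000

grad_est = np.zeros_like(W)
for _ in range(n_trials):
    xi = sigma * rng.normal(size=n_out)   # per-neuron perturbation
    dR = objective(W, xi) - R0            # fluctuation in objective
    # E[dR * xi_i] / sigma^2 -> dR/d(net input of neuron i); multiplying
    # by the presynaptic activity x_j gives the weight gradient estimate.
    grad_est += np.outer(dR * xi, x) / sigma**2
grad_est /= n_trials

# Analytic gradient of the same objective, for comparison.
grad_true = np.outer(-2.0 * (W @ x - y_target), x)
```

For this quadratic objective the estimator is unbiased, and with enough trials `grad_est` converges to `grad_true`; the variance of the estimate grows with network size, which is the usual cost of such perturbation-based rules compared with backpropagation.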

MeSH terms

  • Action Potentials / physiology*
  • Animals
  • Biological Clocks / physiology*
  • Cell Membrane / physiology*
  • Computer Simulation
  • Electric Conductivity
  • Humans
  • Linear Models
  • Models, Neurological*
  • Nerve Net / physiology*
  • Neurons / physiology*
  • Synaptic Transmission / physiology*