Supervised and unsupervised learning with two sites of synaptic integration

K P Körding et al. J Comput Neurosci. 2001 Nov-Dec;11(3):207-15. doi: 10.1023/a:1013776130161.

Abstract

Many learning rules for neural networks derive from abstract objective functions. The weights in such networks are typically optimized by gradient ascent on the objective function. In these networks each neuron needs to store two variables. One variable, called the activity, contains the bottom-up sensory-fugal information involved in the core signal processing. The other variable typically describes the derivative of the objective function with respect to the cell's activity and is used exclusively for learning. This variable allows the derivative of the objective function with respect to each weight, and thus the weight update, to be calculated. Although this approach is widely used, the mapping of these two variables onto physiology is unclear, and such learning algorithms are often considered biologically unrealistic. However, recent research on the properties of cortical pyramidal neurons shows that these cells have at least two sites of synaptic integration, the basal and the apical dendrite, and are thus appropriately described by at least two variables. Here we discuss whether these results could constitute a physiological basis for the abstract learning rules described above. As examples, we demonstrate an implementation of the backpropagation-of-error algorithm and a specific self-supervised learning algorithm using these principles. Thus, compared with standard one-integration-site neurons, neurons with two integration sites make it possible to incorporate physiologically inspired properties into neural networks at a modest increase in complexity.
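The two-variable scheme described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical toy example, not the authors' implementation: each hidden unit of a two-layer network keeps an "activity" variable (standing in for basal integration) and an "error" variable holding the derivative of the objective with respect to that activity (standing in for a top-down apical signal). The network size, learning rate, and sigmoid nonlinearity are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer network; sizes are arbitrary for illustration.
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))   # input -> hidden (basal weights)
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))  # hidden -> output

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(x, target, lr=0.1):
    """One backpropagation step; returns the squared output error."""
    global W1, W2
    # Forward pass: the first per-neuron variable, the bottom-up activity.
    h = sigmoid(W1 @ x)   # hidden activity ("basal" integration)
    y = sigmoid(W2 @ h)   # output activity

    # Backward pass: the second per-neuron variable, the derivative of the
    # objective w.r.t. each activity ("apical", top-down signal).
    err_out = (target - y) * y * (1.0 - y)
    err_hid = (W2.T @ err_out) * h * (1.0 - h)

    # Each weight update is computed locally from the two variables
    # available at that neuron: its error term and its input's activity.
    W2 += lr * np.outer(err_out, h)
    W1 += lr * np.outer(err_hid, x)
    return float(np.sum((target - y) ** 2))
```

Running `step` repeatedly on a fixed input-target pair performs gradient ascent on the (negated) squared error, so the returned loss shrinks over iterations; the point of the sketch is that both quantities each neuron needs are stored as separate variables, mirroring the two integration sites.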

