Hebbian learning of recurrent connections: a geometrical perspective
- PMID: 22594830
- DOI: 10.1162/NECO_a_00322
Abstract
We show how a Hopfield network with modifiable recurrent connections undergoing slow Hebbian learning can extract the underlying geometry of an input space. First, we use a slow-fast analysis to derive an averaged system whose dynamics derive from an energy function and therefore always converge to equilibrium points. The equilibria reflect the correlation structure of the inputs, a global object extracted through local recurrent interactions only. Second, we use numerical methods to illustrate how learning extracts the hidden geometrical structure of the inputs. Indeed, multidimensional scaling methods make it possible to project the final connectivity matrix onto a Euclidean distance matrix in a high-dimensional space, with the neurons labeled by their spatial position within this space. The resulting network structure turns out to be roughly convolutional. The residual of the projection defines the nonconvolutional part of the connectivity, which is minimized in the process. Third, we show how restricting the dimension of the space in which the neurons live gives rise to patterns similar to cortical maps; we motivate this with an energy-efficiency argument based on wire-length minimization. Finally, we show how this approach leads to the emergence of ocular dominance or orientation columns in primary visual cortex via the self-organization of recurrent rather than feedforward connections. In addition, we establish that the nonconvolutional (or long-range) connectivity is patchy and co-aligned in the case of orientation learning.
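To make the slow-fast structure concrete, here is a minimal numerical sketch, not the paper's model: it assumes a rate-based Hopfield-style network dx/dt = -x + tanh(Wx) + I, a Hebbian rule with linear weight decay dW/dt = eps(rr^T - W), and an input ensemble whose hidden geometry is a ring. All parameter values and functional forms are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_inputs = 40, 200
eps = 0.05  # slow Hebbian learning rate (illustrative value)

# Input ensemble with a hidden ring geometry: each input is a bump of
# activity centred at a random angle, so pairwise input correlations
# encode distance on the ring.
pos = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
centers = rng.uniform(0.0, 2.0 * np.pi, n_inputs)
inputs = np.exp(3.0 * (np.cos(pos[None, :] - centers[:, None]) - 1.0))

W = np.zeros((n, n))
for I in inputs:
    # Fast dynamics: relax the rates to equilibrium at fixed weights,
    # dx/dt = -x + tanh(W x) + I (assumed Hopfield-style form).
    x = I.copy()
    for _ in range(100):
        x += 0.1 * (-x + np.tanh(W @ x) + I)
    # Slow dynamics: one Hebbian step with linear weight decay,
    # dW/dt = eps * (r r^T - W).
    r = np.tanh(x)
    W += eps * (np.outer(r, r) - W)

# After learning, W[i, j] depends mainly on the ring distance between
# neurons i and j, i.e. the matrix is roughly convolutional.
```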
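The projection step can likewise be sketched with classical (Torgerson) multidimensional scaling, continuing from the code above. The mapping from weights to dissimilarities (square root of S.max() - S) is an illustrative monotone choice, not taken from the paper; the residual of the low-dimensional embedding then stands in for the nonconvolutional part of the connectivity.

```python
# Continuing from the sketch above: embed the neurons with classical MDS.
S = (W + W.T) / 2.0                        # symmetrise the learned weights
D = np.sqrt(np.maximum(S.max() - S, 0.0))  # strong weight -> small distance
np.fill_diagonal(D, 0.0)

# Classical MDS: double-centre the squared dissimilarities and
# eigendecompose; the top eigenvectors give spatial coordinates.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
evals, evecs = np.linalg.eigh(B)
top = np.argsort(evals)[::-1][:2]          # restrict the map to 2-D
coords = evecs[:, top] * np.sqrt(np.maximum(evals[top], 0.0))

# The mismatch between the learned dissimilarities and the distances the
# embedding reproduces plays the role of the nonconvolutional residual.
D_hat = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
residual = np.linalg.norm(D - D_hat) / np.linalg.norm(D)
print(f"relative residual of the 2-D embedding: {residual:.3f}")
```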
Similar articles
- Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons. J Physiol Paris. 2007 Jan-May;101(1-3):136-48. doi: 10.1016/j.jphysparis.2007.10.003. PMID: 18042357. Review.
- Dimensional reduction for reward-based learning. Network. 2006 Sep;17(3):235-52. doi: 10.1080/09548980600773215. PMID: 17162613.
- Hebbian errors in learning: an analysis using the Oja model. J Theor Biol. 2009 Jun 21;258(4):489-501. doi: 10.1016/j.jtbi.2009.01.036. PMID: 19248792.
- Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks V: self-organization schemes and weight dependence. Biol Cybern. 2010 Nov;103(5):365-86. doi: 10.1007/s00422-010-0405-7. PMID: 20882297.
- Local networks in visual cortex and their influence on neuronal responses and dynamics. J Physiol Paris. 2004 Jul-Nov;98(4-6):429-41. doi: 10.1016/j.jphysparis.2005.09.017. PMID: 16274974. Review.
Cited by
- Closed-loop stimulation of a delayed neural fields model of parkinsonian STN-GPe network: a theoretical and computational study. Front Neurosci. 2015 Jul 10;9:237. doi: 10.3389/fnins.2015.00237. PMID: 26217171.
- Stability analysis of a neural field self-organizing map. J Math Neurosci. 2020 Dec 1;10(1):20. doi: 10.1186/s13408-020-00097-6. PMID: 33259016.
- Endotaxis: A neuromorphic algorithm for mapping, goal-learning, navigation, and patrolling. Elife. 2024 Feb 29;12:RP84141. doi: 10.7554/eLife.84141. PMID: 38420996.