Deep Learning with Dynamic Spiking Neurons and Fixed Feedback Weights
- PMID: 28095195
- DOI: 10.1162/NECO_a_00929
Abstract
Recent work in computer science has shown the power of deep learning driven by the backpropagation algorithm in networks of artificial neurons. But real neurons in the brain are different from most of these artificial ones in at least three crucial ways: they emit spikes rather than graded outputs, their inputs and outputs are related dynamically rather than by piecewise-smooth functions, and they have no known way to coordinate arrays of synapses in separate forward and feedback pathways so that they change simultaneously and identically, as they do in backpropagation. Given these differences, it is unlikely that current deep learning algorithms can operate in the brain, but we show that these problems can be solved by two simple devices: learning rules can approximate dynamic input-output relations with piecewise-smooth functions, and a variation on the feedback alignment algorithm can train deep networks without having to coordinate forward and feedback synapses. Our results also show that deep spiking networks learn much better if each neuron computes an intracellular teaching signal that reflects that cell's nonlinearity. With this mechanism, networks of spiking neurons show useful learning in synapses at least nine layers upstream from the output cells and perform well compared to other spiking networks in the literature on the MNIST digit recognition task.
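To make the feedback-alignment idea described above concrete, the following is a minimal illustrative sketch, not the paper's spiking model: it uses rate (sigmoid) units rather than dynamic spiking neurons, a toy XOR task rather than MNIST, and illustrative sizes and learning rate. The key point it demonstrates is that errors are sent backward through fixed random matrices instead of the transposed forward weights, so forward and feedback synapses never need to be coordinated, and each unit multiplies the fed-back error by its own local nonlinearity, analogous to the intracellular teaching signal described in the abstract.

```python
# Minimal sketch of feedback alignment with rate (non-spiking) units.
# Illustrative only: network sizes, learning rate, and the XOR task are
# assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task: XOR with a 2-16-16-1 network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

# Forward weights (trained).
W1 = rng.normal(0, 0.5, (2, 16))
W2 = rng.normal(0, 0.5, (16, 16))
W3 = rng.normal(0, 0.5, (16, 1))

# Fixed random feedback matrices: never updated and never tied to W3, W2.
B2 = rng.normal(0, 0.5, (1, 16))
B1 = rng.normal(0, 0.5, (16, 16))

lr = 0.5
for step in range(5000):
    # Forward pass.
    h1 = sigmoid(X @ W1)
    h2 = sigmoid(h1 @ W2)
    y = sigmoid(h2 @ W3)

    # Output error; each unit scales the fed-back error by its own
    # local derivative (its "teaching signal" in this sketch).
    e3 = (Y - y) * y * (1 - y)
    e2 = (e3 @ B2) * h2 * (1 - h2)   # feedback via fixed B2, not W3.T
    e1 = (e2 @ B1) * h1 * (1 - h1)   # feedback via fixed B1, not W2.T

    # Only the forward weights are updated.
    W3 += lr * h2.T @ e3
    W2 += lr * h1.T @ e2
    W1 += lr * X.T @ e1

print(np.round(y, 2))  # typically approaches [[0], [1], [1], [0]]
```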
Similar articles
- Backpropagation and the brain. Nat Rev Neurosci. 2020 Jun;21(6):335-346. doi: 10.1038/s41583-020-0277-3. PMID: 32303713. Review.
- Biologically plausible deep learning - but how far can we go with shallow networks? Neural Netw. 2019 Oct;118:90-101. doi: 10.1016/j.neunet.2019.06.001. PMID: 31254771.
- Deep learning in spiking neural networks. Neural Netw. 2019 Mar;111:47-63. doi: 10.1016/j.neunet.2018.12.002. PMID: 30682710. Review.
- A theory of local learning, the learning channel, and the optimality of backpropagation. Neural Netw. 2016 Nov;83:51-74. doi: 10.1016/j.neunet.2016.07.006. PMID: 27584574.
- Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition. Neural Netw. 2013 May;41:188-201. doi: 10.1016/j.neunet.2012.11.014. PMID: 23340243.
Cited by
- Backpropagation and the brain. Nat Rev Neurosci. 2020 Jun;21(6):335-346. doi: 10.1038/s41583-020-0277-3. PMID: 32303713. Review.
- Physical deep learning with biologically inspired training method: gradient-free approach for physical hardware. Nat Commun. 2022 Dec 26;13(1):7847. doi: 10.1038/s41467-022-35216-2. PMID: 36572696.
- Emergence of brain-inspired small-world spiking neural network through neuroevolution. iScience. 2024 Jan 9;27(2):108845. doi: 10.1016/j.isci.2024.108845. PMID: 38327781.
- Photons guided by axons may enable backpropagation-based learning in the brain. Sci Rep. 2022 Dec 1;12(1):20720. doi: 10.1038/s41598-022-24871-6. PMID: 36456619.
- GLSNN: A Multi-Layer Spiking Neural Network Based on Global Feedback Alignment and Local STDP Plasticity. Front Comput Neurosci. 2020 Nov 12;14:576841. doi: 10.3389/fncom.2020.576841. PMID: 33281591.