SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training
- PMID: 34803591
- PMCID: PMC8603828
- DOI: 10.3389/fnins.2021.756876
Abstract
Spiking Neural Networks (SNNs) are a promising pathway toward low-power, event-driven neuromorphic hardware because of their spatio-temporal information processing capability and high biological plausibility. Although SNNs are currently more efficient than artificial neural networks (ANNs), they are not yet as accurate. Error backpropagation is the most common method for directly training neural networks and has driven the success of ANNs across deep learning. However, because the signals transmitted in an SNN are non-differentiable discrete binary spike events, the spike-based activation function prevents gradient-based optimization algorithms from being applied to SNNs directly, leading to a performance gap (i.e., in accuracy and latency) between SNNs and ANNs. This paper introduces a new learning algorithm, called SSTDP, which bridges backpropagation (BP)-based learning and spike-time-dependent plasticity (STDP)-based learning to train SNNs efficiently. The scheme incorporates the global optimization process from BP and the efficient weight updates derived from STDP. It not only avoids differentiating the non-differentiable spike activation in the BP process but also exploits the local feature-extraction property of STDP. Consequently, our method lowers the likelihood of vanishing spikes during BP training and reduces the number of time steps, thereby lowering network latency. In SSTDP, we employ temporal coding and use the Integrate-and-Fire (IF) neuron as the neuron model to provide considerable computational benefits. Our experiments show the effectiveness of the proposed SSTDP learning algorithm, achieving the best classification accuracy among SNNs trained with other learning methods: 99.3% on the Caltech 101 dataset, 98.1% on the MNIST dataset, and 91.3% on the CIFAR-10 dataset. It also surpasses the best inference accuracy of directly trained SNNs with 25~32× lower inference latency. Moreover, we analyze event-based computations to demonstrate the efficacy of the SNN for inference in the spiking domain; SSTDP achieves 1.3~37.7× fewer addition operations per inference. The code is available at: https://github.com/MXHX7199/SNN-SSTDP.
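To make the hybrid scheme concrete, below is a minimal NumPy sketch of the ingredients the abstract describes: time-to-first-spike temporal coding, an Integrate-and-Fire neuron, and a weight update that combines a global, BP-style error on output spike times with a local, STDP-like timing kernel. All names, constants (T, THRESHOLD, TAU, LR), and the exact update form are illustrative assumptions, not the paper's implementation; see the linked repository for the authors' code.

```python
import numpy as np

# Illustrative constants -- assumptions, not the paper's hyperparameters.
T = 16           # number of discrete time steps (SSTDP targets low latency)
THRESHOLD = 1.0  # IF firing threshold
TAU = 5.0        # STDP time constant (assumed)
LR = 0.01        # learning rate (assumed)

def encode_temporal(x, t_max=T):
    """Time-to-first-spike coding: a stronger input spikes earlier."""
    return np.round((1.0 - x) * (t_max - 1)).astype(int)

def if_forward(spike_times, weights):
    """Integrate-and-Fire neuron: accumulate weighted input spikes at each
    time step and fire once when the membrane potential crosses threshold.
    Returns the output spike time (T if the neuron never fires)."""
    v = 0.0
    for t in range(T):
        v += weights[spike_times == t].sum()
        if v >= THRESHOLD:
            return t
    return T  # no spike within the time window

def sstdp_update(weights, spike_times, t_out, t_target):
    """Hybrid update in the spirit of SSTDP: a global temporal error
    (t_out - t_target), which BP would propagate through the network,
    sets the sign and magnitude of the step, while a local STDP-like
    exponential kernel over pre/post spike-time differences weights
    each synapse. The exact functional form here is an assumption."""
    err = float(t_out - t_target)   # global error on the output spike time
    dt = t_out - spike_times        # dt > 0: pre spike preceded post spike
    stdp = np.where(dt >= 0,
                    np.exp(-dt / TAU),    # causal pre spikes: potentiate
                    -np.exp(dt / TAU))    # acausal pre spikes: depress
    # If the neuron fires too late (err > 0), causal weights grow so it
    # fires earlier next time; too early (err < 0) shrinks them.
    return weights + LR * err * stdp

# Toy usage on a single neuron.
rng = np.random.default_rng(0)
x = rng.random(100)                     # toy input intensities in [0, 1]
w = rng.normal(0.05, 0.01, 100)         # toy synaptic weights
t_in = encode_temporal(x)
t_out = if_forward(t_in, w)
w = sstdp_update(w, t_in, t_out, t_target=4)
```

Because both the forward pass and the update operate only on spike times, this kind of rule stays in the event-driven regime the abstract emphasizes: no dense activations are stored, and the latency is bounded by the T-step window.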
Keywords: deep learning; efficient training; gradient descent backpropagation; neuromorphic computing; spike-time-dependent plasticity; spiking neural network.
Copyright © 2021 Liu, Zhao, Chen, Wang, Yang and Jiang.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Similar articles
- Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures. Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. PMID: 32180697. Free PMC article.
- Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks. Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. PMID: 29875621. Free PMC article.
- A biologically plausible supervised learning method for spiking neural networks using the symmetric STDP rule. Neural Netw. 2020 Jan;121:387-395. doi: 10.1016/j.neunet.2019.09.007. PMID: 31593843.
- Deep learning in spiking neural networks. Neural Netw. 2019 Mar;111:47-63. doi: 10.1016/j.neunet.2018.12.002. PMID: 30682710. Review.
- Deep Learning With Spiking Neurons: Opportunities and Challenges. Front Neurosci. 2018 Oct 25;12:774. doi: 10.3389/fnins.2018.00774. PMID: 30410432. Free PMC article. Review.
Cited by
- Efficient training of spiking neural networks with temporally-truncated local backpropagation through time. Front Neurosci. 2023 Apr 6;17:1047008. doi: 10.3389/fnins.2023.1047008. PMID: 37090791. Free PMC article.
- Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities. Sensors (Basel). 2023 Mar 11;23(6):3037. doi: 10.3390/s23063037. PMID: 36991750. Free PMC article. Review.
- Critically synchronized brain waves form an effective, robust and flexible basis for human memory and learning. Sci Rep. 2023 Mar 16;13(1):4343. doi: 10.1038/s41598-023-31365-6. PMID: 36928606. Free PMC article.
- An overview of brain-like computing: Architecture, applications, and future trends. Front Neurorobot. 2022 Nov 24;16:1041108. doi: 10.3389/fnbot.2022.1041108. PMID: 36506817. Free PMC article. Review.
- BSNN: Towards faster and better conversion of artificial neural networks to spiking neural networks with bistable neurons. Front Neurosci. 2022 Oct 12;16:991851. doi: 10.3389/fnins.2022.991851. PMID: 36312025. Free PMC article.
