Efficient spiking neural network design via neural architecture search

Neural Netw. 2024 May;173:106172. doi: 10.1016/j.neunet.2024.106172. Epub 2024 Feb 16.

Abstract

Spiking neural networks (SNNs) are brain-inspired models that transmit information through discrete, sparse spikes, which makes them energy efficient. Recent advances in learning algorithms, which automate feature engineering, have greatly improved SNN performance. While the choice of neural architecture plays a significant role in deep learning, current SNN architectures are still mostly designed by hand, a time-consuming and error-prone process. In this paper, we propose a spiking neural architecture search (NAS) method that automatically finds efficient SNNs. To address the long search times that SNNs face under NAS, the proposed method encodes candidate architectures in a branchless spiking supernet, which significantly reduces the computational requirements of the search process. Because real-world tasks favor networks that achieve the best accuracy under a limited computational budget, we propose a Synaptic Operation (SynOps)-aware optimization that automatically finds the computationally efficient subspace of the supernet. Experimental results show that, with less search time, our method finds SNNs with higher accuracy and lower computational cost than state-of-the-art SNNs. We also conduct experiments to validate the search process and to characterize the trade-off between accuracy and computational cost.
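To make the SynOps-aware selection concrete, below is a minimal, self-contained Python sketch of picking a subnet from a single-path ("branchless") supernet under a SynOps budget. All names, numbers, and helpers here (LAYER_CHOICES, synops, accuracy_proxy) are illustrative assumptions, not the paper's implementation; in particular, accuracy_proxy is a toy stand-in for validating a subnet with weights inherited from the supernet.

```python
import itertools

# Per-layer candidate operations in a toy single-path supernet:
# (op name, synaptic fan-out per input neuron, assumed mean spike rate).
# All values are made up for illustration.
LAYER_CHOICES = [
    [("conv3x3", 9 * 64, 0.12), ("conv1x1", 64, 0.15), ("skip", 0, 0.0)],
    [("conv3x3", 9 * 128, 0.10), ("conv1x1", 128, 0.14), ("skip", 0, 0.0)],
    [("conv3x3", 9 * 256, 0.08), ("conv1x1", 256, 0.11), ("skip", 0, 0.0)],
]

def synops(arch):
    # SynOps proxy: unlike parameter count, SynOps depends on spike
    # activity, so each layer's fan-out is weighted by its spike rate.
    return sum(fanout * rate for (_, fanout, rate) in arch)

def accuracy_proxy(arch):
    # Stand-in for evaluating a subnet with inherited supernet weights;
    # this toy score simply rewards capacity.
    return sum(fanout for (_, fanout, _) in arch) ** 0.5

def search(budget):
    # Exhaustive scan of the tiny toy space; a real search would sample
    # subnets from the supernet instead of enumerating them.
    best = None
    for arch in itertools.product(*LAYER_CHOICES):
        if synops(arch) <= budget and (
            best is None or accuracy_proxy(arch) > accuracy_proxy(best)
        ):
            best = arch
    return best

if __name__ == "__main__":
    arch = search(budget=150.0)
    print([op for (op, _, _) in arch], f"SynOps={synops(arch):.1f}")
```

The design point mirrored here is that constraining SynOps, rather than parameters or FLOPs, ties the search directly to the event-driven cost model of neuromorphic hardware, where energy scales with the number of spike-triggered synaptic operations.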

Keywords: Neural architecture search; Neuromorphic computing; Spiking neural networks.

MeSH terms

  • Algorithms*
  • Automation
  • Engineering
  • Neural Networks, Computer*