A Low-Power Spiking Neural Network Chip Based on a Compact LIF Neuron and Binary Exponential Charge Injector Synapse Circuits

Malik Summair Asghar et al. Sensors (Basel). 2021 Jun 29;21(13):4462. doi: 10.3390/s21134462.

Abstract

To realize a large-scale Spiking Neural Network (SNN) in hardware for mobile applications, area- and power-optimized electronic circuit design is critical. In this work, an area- and power-optimized hardware implementation of a large-scale SNN for real-time IoT applications is presented. The analog Complementary Metal Oxide Semiconductor (CMOS) implementation incorporates neuron and synapse circuits optimized for area and power consumption. The asynchronous neuron circuits benefit from higher energy efficiency and higher sensitivity. The proposed synapse circuit, based on a Binary Exponential Charge Injector (BECI), saves area and power and provides design scalability toward higher resolutions. The implemented SNN model is optimized for a 9 × 9 pixel input image and the minimum weight bit-width that satisfies the target accuracy, reducing both area and power consumption. Moreover, the spiking neural network is replicated in a fully digital implementation for area and power comparisons. The SNN chip, integrated from the neuron and synapse circuits, is capable of pattern recognition. The proposed SNN chip is fabricated in a 180 nm CMOS process, occupies a 3.6 mm2 chip core area, and achieves a classification accuracy of 94.66% on the MNIST dataset. The proposed SNN chip consumes an average power of 1.06 mW, which is 20 times lower than that of the digital implementation.
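The chip itself is an analog CMOS design; the following is only a minimal behavioral sketch in Python of the leaky integrate-and-fire (LIF) dynamics and binary-exponentially scaled synaptic charge injection described in the abstract. All function names, parameter values (bit-width, leak, threshold), and the reset behavior are illustrative assumptions, not values taken from the fabricated chip.

```python
# Minimal behavioral sketch of a LIF neuron driven by synapses whose weights are
# injected as binary-exponentially scaled charge. All names and constants are
# illustrative assumptions, not values from the chip described in the paper.

def lif_neuron(spike_trains, weights, n_bits=3, leak=0.02, threshold=1.0):
    """Simulate one LIF neuron over discrete time steps.

    spike_trains: list of input spike trains, one list of 0/1 values per synapse
    weights: signed integer weights, one per synapse (|w| < 2**n_bits)
    Returns the output spike train and the membrane-potential trace.
    """
    q_unit = 1.0 / (2 ** n_bits)        # charge injected per unit weight (assumed)
    v_mem = 0.0
    out_spikes, trace = [], []
    for t in range(len(spike_trains[0])):
        # Each presynaptic spike injects charge proportional to its weight;
        # negative weights remove charge (inhibitory synapse).
        for train, w in zip(spike_trains, weights):
            if train[t]:
                v_mem += w * q_unit
        v_mem = max(0.0, v_mem - leak)  # constant leak toward the resting level
        if v_mem >= threshold:          # threshold crossing fires an output spike
            out_spikes.append(1)
            v_mem = 0.0                 # reset after firing (assumed hard reset)
        else:
            out_spikes.append(0)
        trace.append(v_mem)
    return out_spikes, trace


if __name__ == "__main__":
    # One neuron, two synapses, 15 input spikes (cf. Figures 5 and 7):
    spikes = [[1] * 15, [1] * 15]
    excite, _ = lif_neuron(spikes, weights=[+1, 0])   # net excitation fires spikes
    cancel, _ = lif_neuron(spikes, weights=[+1, -1])  # opposing weights cancel out
    print("excitatory case spikes:", sum(excite))
    print("cancelling case spikes:", sum(cancel))
```

The two example weight settings mirror the qualitative behavior shown in the circuit simulations of Figures 5 and 7: a positive net weight charges the membrane until it fires, while matched excitatory and inhibitory weights keep it near rest.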

Keywords: CMOS; artificial intelligence; artificial neural networks; image classification; leaky integrate and fire; neuromorphic; spiking neural network.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. (a) Spiking Neural Network model with a single neuron connected to multiple input synapses; (b) Leaky Integrate-and-Fire model of the CMOS neuron cell.
Figure 2. Block diagram of the SNN implementation consisting of four fully connected layers.
Figure 3. (a) Circuit schematic of the LIF-based neuron; (b) circuit schematic of the Schmitt trigger used inside the neuron circuit.
Figure 4. Circuit schematic of the BECI-based synapse with three branches and digital gates.
Figure 5. Circuit simulation results for one neuron cell of an input layer with an input spike train of 15 pulses: (a) Vmem and spike-out when the weight value is +1; (b) Vmem and spike-out when the weight value is −1.
Figure 6. (a) Fully connected hidden layer with a large number of synapses connected to a single neuron; (b) one neuron cell of the hidden layers with two synapses connected to a single neuron.
Figure 7. Circuit simulations for a neuron cell (one neuron connected to two synapses) where the same 15 input spikes are applied in all three cases: (a) Vmem and the output spike when W1 = +1 and W2 = 0; (b) Vmem and spike-out when W1 = −1 and W2 = 0; (c) Vmem and spike-out when W1 = +1 and W2 = −1.
Figure 8. Block diagram of the fully digital implementation of the output layer. Synaptic contributions accumulate in a membrane register, and a comparator fires the digital output spike pulses (a minimal sketch of this accumulate-and-fire logic follows the figure list).
Figure 9. (a) Complete layout of the SNN implementation consisting of four fully connected layers, with area estimation; (b) bonded die micrograph highlighting the fabricated SNN chip.
Figure 10. Measurement setups: (a) the analog SNN test chip mounted on a test PCB is measured using an oscilloscope, a function generator, and a host CPU board (Raspberry Pi 4); (b) the digital SNN implementation is measured via an FPGA board interfaced with a host CPU board (Raspberry Pi 4).
Figure 11. Measurement results for spike propagation: (a) one input spike propagates through all layers; (b) seven input spikes propagate through all layers.
Figure 12. Measurement results for the digital SNN showing the correct classification of different input images, along with the failed classification of input image ‘5’.
Figure 13. Measured results on the oscilloscope for the analog SNN. The input image applied to the SNN is ‘7’, and the 7th output classifier shows the maximum spiking activity.
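The accumulate-and-fire behavior outlined in the Figure 8 caption can be sketched as follows. This is a hedged illustration only, assuming a saturating membrane register, a comparator threshold, and a hard reset on firing; register width, threshold value, and class names are assumptions, not details from the digital implementation in the paper.

```python
# Hedged sketch of the digital output-layer behavior outlined in Figure 8:
# presynaptic spikes add their signed weights into a membrane register, and a
# comparator emits a digital output spike when the register crosses a threshold.
# Register width, threshold, and reset behavior are illustrative assumptions.

class DigitalLIFUnit:
    def __init__(self, weights, threshold=16, reg_bits=8):
        self.weights = weights                   # signed integer weight per synapse
        self.threshold = threshold               # comparator reference value
        self.max_val = 2 ** (reg_bits - 1) - 1   # saturate like a signed register
        self.membrane = 0                        # membrane accumulation register

    def step(self, input_spikes):
        """Process one clock cycle of input spikes (0/1 per synapse)."""
        acc = sum(w for s, w in zip(input_spikes, self.weights) if s)
        self.membrane = min(self.max_val, max(0, self.membrane + acc))
        if self.membrane >= self.threshold:      # comparator fires the output spike
            self.membrane = 0                    # reset the register after firing
            return 1
        return 0


if __name__ == "__main__":
    unit = DigitalLIFUnit(weights=[3, -1, 2], threshold=16)
    spikes_out = [unit.step([1, 0, 1]) for _ in range(10)]
    print(spikes_out)  # periodic output spikes as the register repeatedly fills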
