Neuron type classification in rat brain based on integrative convolutional and tree-based recurrent neural networks

Tielin Zhang et al. Sci Rep. 2021 Mar 31;11(1):7291. doi: 10.1038/s41598-021-86780-4.

Abstract

The anatomy-based study of cellular complexity in the nervous system has shown that morphology offers more practical and objective advantages than molecular, physiological, or evolutionary perspectives. However, morphology-based neuron type classification across the whole rat brain is challenging, given the large number of neuron types, the limited number of reconstructed neuron samples, and the diversity of data formats. Here, we report that different types of deep neural network modules are well suited to different kinds of features and that integrating these submodules improves the representation and classification of neuron types. For SWC-format data, which are compact but unstructured, we construct a tree-based recurrent neural network (Tree-RNN) module. For 2D or 3D slice-format data, which are structured but contain large volumes of pixels, we construct a convolutional neural network (CNN) module. We also generate a virtually simulated dataset with two classes, reconstruct a CASIA rat-neuron dataset of 2.6 million unlabeled neurons, and select the NeuroMorpho-rat dataset of 35,000 neurons with hierarchical labels. On the twelve-class classification task, the proposed model achieves state-of-the-art performance compared with other models, e.g., a CNN, an RNN, and a support vector machine based on hand-designed features.
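
For context, the SWC format referenced above encodes a neuron as a list of nodes, each with an ID, a type code, 3D coordinates, a radius, and a parent ID; a tree-based network traverses the resulting parent-child structure. Below is a minimal Python sketch of parsing an SWC file into such a tree. The column layout is the SWC standard, but the SWCNode class and load_swc function are illustrative names, not code from the paper.

```python
# Minimal sketch: parse an SWC morphology file into a tree that a
# tree-structured recurrent network could traverse. The column order
# (id, type, x, y, z, radius, parent) is the SWC standard; the class and
# helper names are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SWCNode:
    node_id: int
    node_type: int           # 1 soma, 2 axon, 3 basal dendrite, 4 apical dendrite, ...
    x: float
    y: float
    z: float
    radius: float
    parent_id: int           # -1 marks the root (soma)
    children: List["SWCNode"] = field(default_factory=list)


def load_swc(path: str) -> SWCNode:
    """Read an SWC file and return the root node of the reconstructed tree."""
    nodes: Dict[int, SWCNode] = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip comments and blank lines
            nid, ntype, x, y, z, r, parent = line.split()[:7]
            nodes[int(nid)] = SWCNode(int(nid), int(ntype), float(x), float(y),
                                      float(z), float(r), int(parent))
    root = None
    for node in nodes.values():
        if node.parent_id == -1:
            root = node
        else:
            nodes[node.parent_id].children.append(node)
    return root
```

Such a tree can then be traversed from the leaves toward the soma, with each block's hidden state passed to its parent, which is the basic idea behind the Tree-RNN module described here.
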


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
(a) Comparison of the two main formats of neuronal morphology. (b) Morphological repair of a single neuron by the Trees toolbox with different hyperparameters of the “zcorr” method. (c) Virtually simulated basket-type neuron samples with different branch orders, generated with the Trees toolbox. (d) Pyramidal-type neurons generated with the Trees toolbox.
Figure 2
Neuronal morphologies from SWC-format data after repair with the Trees toolbox. The 3D neuron models are rendered with the Houdini software. Three examples are shown for each class of neurons in the NeuroMorpho-rat dataset. (a) The six types of principal cells: ganglion, granule, medium spiny, parachromaffin, Purkinje, and pyramidal cells. (b) The three types of interneurons: basket, GABAergic, and nitrergic cells. (c) The two types of glial cells: microglia and astrocytes. (d) The one type of sensory receptor cell.
Figure 3
The self-reconstructed dataset of 2.6 million unlabeled rat cells. (a) Neuron branch detection and reconstruction. (b) Neuron soma identification and surface detection. (c) Different types of neurons reconstructed from the neocortex; different colors represent different neuron types. (d) The reconstructed CASIA rat-neuron dataset with approximately 2.6 million cells from a single rat brain. (e) The density of neurons across the whole rat brain.
Figure 4
(a) Test accuracy versus the number of iterations during training of the DNN-based neuron classifier. (b) Test loss curves of the DNN-based model. (c) The IDs and names of the 12 subclasses of neuron types. (d) The performance of DNN-based models and the compared models on different types of datasets. (e) The confusion matrix, showing misclassifications in the 12-class identification task. (f) The t-SNE distribution of the 2-class DNN-based features. (g) The t-SNE distribution of the 12-class DNN-based features.
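
Panels (f) and (g) visualize the learned features with t-SNE. A minimal sketch of how such an embedding could be produced with scikit-learn and matplotlib is shown below; the feature and label arrays here are random stand-ins, not the paper's data.

```python
# Sketch of a t-SNE visualization of learned features, in the spirit of Fig. 4f,g.
# `features` and `labels` are placeholder inputs, not data shipped with the paper.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 128))    # stand-in for DNN feature vectors
labels = rng.integers(0, 12, size=500)    # stand-in for the 12 class IDs

# Project the high-dimensional features to 2D for plotting.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

plt.figure(figsize=(6, 5))
plt.scatter(embedding[:, 0], embedding[:, 1], c=labels, cmap="tab20", s=8)
plt.colorbar(label="class ID")
plt.title("t-SNE of learned neuron features (sketch)")
plt.tight_layout()
plt.show()
```
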
Figure 5
(a) Accuracy improvements from various hand-designed features. (b) The t-SNE distribution of the 2-class hand-designed features. (c) The t-SNE distribution of the 12-class hand-designed features. (d, e, f, g) Hand-designed features that contribute positively to neuron type classification, including the total length, max path distance, number of branches, and max branch order. (h) Hand-designed features that contribute negatively.
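
For illustration, the sketch below computes four of the hand-designed features named in this figure (total length, max path distance, number of branches, and max branch order) directly from SWC rows. The function and the exact branch-order and branch-count conventions are assumptions, not the authors' feature-extraction code.

```python
# Sketch of four hand-designed morphometric features from Fig. 5.
# Input is a list of SWC rows (id, type, x, y, z, radius, parent); the row
# format is the SWC standard, the function itself is illustrative only.
import math
from collections import defaultdict


def morphometric_features(rows):
    coords = {r[0]: (r[2], r[3], r[4]) for r in rows}
    parent = {r[0]: r[6] for r in rows}
    children = defaultdict(list)
    for nid, pid in parent.items():
        if pid != -1:
            children[pid].append(nid)

    def seg_len(nid):
        # Length of the segment between a node and its parent (0 for the root).
        if parent[nid] == -1:
            return 0.0
        return math.dist(coords[nid], coords[parent[nid]])

    # Total length: sum of all parent-child segment lengths.
    total_length = sum(seg_len(n) for n in coords)

    # Path distance and branch order accumulated from the root downward.
    path_dist, branch_order = {}, {}
    root = next(n for n, p in parent.items() if p == -1)
    stack = [(root, 0.0, 0)]
    while stack:
        nid, dist, order = stack.pop()
        path_dist[nid], branch_order[nid] = dist, order
        is_branch = len(children[nid]) > 1
        for c in children[nid]:
            stack.append((c, dist + seg_len(c), order + (1 if is_branch else 0)))

    # One common convention: count bifurcation points as branches.
    num_branches = sum(1 for n in coords if len(children[n]) > 1)
    return {
        "total_length": total_length,
        "max_path_distance": max(path_dist.values()),
        "num_branches": num_branches,
        "max_branch_order": max(branch_order.values()),
    }
```
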
Figure 6
The proposed integrated DNN architecture contains a ResNet CNN for 2D-image feature detection and a Tree-RNN for SWC-format feature detection. (a) Diagram of the processing procedure, including data preparation, feature learning, neuron classification, and categorization. (b) The structure of ResNet18 for image classification. The input is three-channel images from the projections along the X, Y, and Z axes. Before the residual blocks, the data are first processed by convolution, batch normalization (BN), ReLU activation, and max pooling. The following four layers have different residual block hyperparameters, as shown in Table 2. The output is the feature vector produced after average pooling and a fully connected layer. (c) The proposed Tree-RNN, including the standard structures of the RNN and LSTM modules. There are five 2-layer LSTM blocks in this tree, and black arrows represent the connections between them. Every block has the same structure, containing two hidden layers with 128 neurons each. Finally, the result is output through a fully connected layer. The dotted boxes show the basic operating units of the RNN and LSTM. (d) The submodule of the ResNet layer. (e) The submodule of the Tree-RNN, which can also be regarded as a traditional simple 2-layer RNN.
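
A minimal PyTorch sketch of how the two submodules described above might be combined is given below. The ResNet18 trunk, the tree of five 2-layer LSTM blocks with 128 hidden units, and the 12-way output follow the caption; the exact wiring of the blocks, the concatenation-based fusion, and all identifiers are assumptions for illustration rather than the authors' released implementation.

```python
# Sketch (PyTorch) of an integrated model in the spirit of Fig. 6: a ResNet18
# branch for the three-channel X/Y/Z projection images and a tree of five
# 2-layer LSTM blocks (128 hidden units each) for SWC-derived sequences.
# The block wiring and the fusion scheme are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class TreeLSTMBlock(nn.Module):
    """One 2-layer LSTM block; returns the final hidden state for a sequence."""
    def __init__(self, in_dim, hidden_dim=128):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden_dim, num_layers=2, batch_first=True)

    def forward(self, seq):                     # seq: (batch, steps, in_dim)
        _, (h, _) = self.lstm(seq)
        return h[-1]                            # (batch, hidden_dim)


class IntegratedNet(nn.Module):
    def __init__(self, num_classes=12, node_dim=7, hidden_dim=128):
        super().__init__()
        # CNN branch: ResNet18 trunk (no pretrained weights) with its head removed.
        cnn = resnet18()
        cnn.fc = nn.Identity()
        self.cnn = cnn                          # outputs 512-d image features
        # Tree-RNN branch: four leaf blocks whose outputs feed one root block.
        self.leaves = nn.ModuleList(TreeLSTMBlock(node_dim, hidden_dim) for _ in range(4))
        self.root = TreeLSTMBlock(hidden_dim, hidden_dim)
        # Fusion by concatenation, then a linear classification head.
        self.classifier = nn.Linear(512 + hidden_dim, num_classes)

    def forward(self, images, branch_seqs):
        # images: (batch, 3, H, W) X/Y/Z projections; branch_seqs: four sequences
        # of SWC node features, one per leaf block, each (batch, steps, node_dim).
        img_feat = self.cnn(images)
        leaf_feats = [leaf(seq) for leaf, seq in zip(self.leaves, branch_seqs)]
        tree_feat = self.root(torch.stack(leaf_feats, dim=1))  # leaf outputs as a sequence
        return self.classifier(torch.cat([img_feat, tree_feat], dim=1))


# Example forward pass with random tensors.
model = IntegratedNet()
images = torch.randn(2, 3, 224, 224)
branch_seqs = [torch.randn(2, 30, 7) for _ in range(4)]
logits = model(images, branch_seqs)             # shape (2, 12)
```

Concatenating the 512-d image features with the 128-d tree features before the classifier is one straightforward fusion choice; the paper's actual fusion scheme may differ.
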
