Evolutionary Shallowing Deep Neural Networks at Block Levels

IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4635-4647. doi: 10.1109/TNNLS.2021.3059529. Epub 2022 Aug 31.

Abstract

Neural networks have been demonstrated to be trainable even with hundreds of layers, exhibiting remarkable improvements in expressive power and providing significant performance gains in a variety of tasks. However, their prohibitive computational cost has become a severe challenge for deployment on resource-constrained platforms. Meanwhile, widely adopted deep neural network architectures, for example, ResNets or DenseNets, are manually crafted on benchmark datasets, which hampers their generalization to other domains. To cope with these issues, we propose an evolutionary algorithm-based method for shallowing deep neural networks (DNNs) at block levels, termed ESNB. Different from existing studies, ESNB takes an ensemble view of block-wise DNNs and adopts the multiobjective optimization paradigm to reduce the number of blocks while avoiding performance degradation. It automatically discovers shallower network architectures by pruning less informative blocks and employs knowledge distillation to recover the performance. Moreover, a novel prior knowledge incorporation strategy is proposed to improve the exploration ability of the evolutionary search process, and a correctness-aware knowledge distillation strategy is designed for better knowledge transfer. Experimental results show that the proposed method can effectively accelerate the inference of DNNs while achieving superior performance compared with state-of-the-art competing methods.
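
The abstract describes a multiobjective evolutionary search over block-wise sub-networks, so the following is a minimal, hypothetical sketch of that idea in Python: each candidate is a binary mask over the backbone's blocks, and a simple Pareto-based loop trades a (proxy) error against the number of blocks kept. The block count NUM_BLOCKS, the evaluate() placeholder (which fakes the error instead of fine-tuning a pruned sub-network with knowledge distillation), and the seeding heuristic are illustrative assumptions, not the authors' implementation.

import random

# Hypothetical sketch of block-level evolutionary pruning (not the ESNB code).
# Each candidate is a binary mask over the network's blocks: 1 = keep, 0 = prune.

NUM_BLOCKS = 16        # assumed depth of the block-wise backbone
POP_SIZE = 20
GENERATIONS = 30
MUTATION_RATE = 1.0 / NUM_BLOCKS


def evaluate(mask):
    """Placeholder for the two objectives: (validation error, number of blocks).

    In the actual method the pruned sub-network would be fine-tuned with
    knowledge distillation before measuring its error; here the error is faked
    as a decreasing function of the kept blocks purely to keep the sketch runnable.
    """
    kept = sum(mask)
    error = 1.0 / (1.0 + kept) + random.uniform(0.0, 0.01)
    return error, kept


def dominates(a, b):
    """Pareto dominance for minimization of both objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def mutate(mask):
    """Flip each bit with a small probability to explore neighboring masks."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in mask]


def pareto_front(population, fitnesses):
    """Return the candidates not dominated by any other candidate."""
    front = []
    for i, fi in enumerate(fitnesses):
        if not any(dominates(fj, fi) for j, fj in enumerate(fitnesses) if j != i):
            front.append(population[i])
    return front


def search():
    # Prior-knowledge seeding (a hedged guess at the idea): start from the full
    # network plus randomly thinned variants.
    population = [[1] * NUM_BLOCKS] + [
        [random.randint(0, 1) for _ in range(NUM_BLOCKS)] for _ in range(POP_SIZE - 1)
    ]
    for _ in range(GENERATIONS):
        offspring = [mutate(ind) for ind in population]
        combined = population + offspring
        fitnesses = [evaluate(ind) for ind in combined]
        front = pareto_front(combined, fitnesses)
        # Refill the population from the non-dominated set, padding with survivors.
        population = (front + combined)[:POP_SIZE]
    return pareto_front(population, [evaluate(ind) for ind in population])


if __name__ == "__main__":
    for mask in search():
        print(mask, sum(mask), "blocks kept")

In the paper's setting, each surviving mask would correspond to a shallower network whose remaining blocks are then refined by distilling knowledge from the original, deeper teacher.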

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Biological Evolution
  • Neural Networks, Computer*