Journal of Beijing University of Posts and Telecommunications

  • EI Core Journal

Journal of Beijing University of Posts and Telecommunications ›› 2023, Vol. 46 ›› Issue (6): 66-0.


A Self-distillation Lightweight Image Classification Network Scheme

  • Received: 2022-10-11    Revised: 2022-11-07    Online: 2023-12-28    Published: 2023-12-29

Abstract: Neural network models for image classification are often compressed to reduce the number of parameters, which typically lowers classification accuracy. To address this problem, a self-distillation lightweight image classification network scheme is proposed. First, a lightweight attention module with negligible computation and parameter overhead is introduced into the self-distillation framework, reducing the framework's parameter count and computational cost and thereby making the framework itself lightweight. Then, group convolution and depthwise separable convolution are used to compress the residual network and the VGG11 network. Next, the two compressed networks are adopted as teacher models, and multiple shallow classifiers are constructed as student models according to the depth of each teacher. Finally, a lightweight self-distillation framework is built from these components. Experimental results show that the proposed scheme preserves the effect of the original self-distillation while greatly reducing the number of parameters of the compressed image classification networks, without falling below the accuracy of the original classifiers, and eases model deployment.
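The sketch below illustrates, under stated assumptions, the two ingredients the abstract describes: depthwise separable convolutions to compress the backbone, and shallow auxiliary classifiers trained by self-distillation from the deepest classifier. This is a minimal PyTorch sketch, not the authors' implementation; the names `DSConv`, `SelfDistilNet`, and `self_distillation_loss`, the stage widths, and the loss weights `T` and `alpha` are all illustrative assumptions.

```python
# Minimal sketch (assumed PyTorch, CIFAR-style 10-class inputs); not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DSConv(nn.Module):
    """Depthwise 3x3 conv (groups = in_ch) followed by a 1x1 pointwise conv."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.pointwise(self.depthwise(x))))

class SelfDistilNet(nn.Module):
    """Compressed backbone with a shallow classifier attached after each stage."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.stages = nn.ModuleList([
            DSConv(3, 32, stride=2),
            DSConv(32, 64, stride=2),
            DSConv(64, 128, stride=2),
        ])
        # One classifier head per stage; the deepest head serves as the teacher.
        self.heads = nn.ModuleList([
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(c, num_classes))
            for c in (32, 64, 128)
        ])

    def forward(self, x):
        logits = []
        for stage, head in zip(self.stages, self.heads):
            x = stage(x)
            logits.append(head(x))
        return logits  # ordered shallow to deep

def self_distillation_loss(logits, target, T=3.0, alpha=0.5):
    """Cross-entropy on every head plus KL from the deepest head to the shallow heads."""
    teacher = logits[-1]
    loss = F.cross_entropy(teacher, target)
    for student in logits[:-1]:
        loss = loss + (1 - alpha) * F.cross_entropy(student, target)
        loss = loss + alpha * (T * T) * F.kl_div(
            F.log_softmax(student / T, dim=1),
            F.softmax(teacher.detach() / T, dim=1),
            reduction="batchmean",
        )
    return loss

# Example usage: one training step on a random batch.
model = SelfDistilNet()
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
loss = self_distillation_loss(model(x), y)
loss.backward()
```

In this reading of the scheme, the shallow heads act as student models and the deepest head as the teacher, so no separate teacher network needs to be stored or run at training time; only the compressed backbone is deployed.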

Key words: image classification, neural network, model compression, self-distillation
