Journal of Beijing University of Posts and Telecommunications (北京邮电大学学报)

Journal of Beijing University of Posts and Telecommunications ›› 2023, Vol. 46 ›› Issue (6): 66-0.

A Self-distillation Lightweight Image Classification Network Scheme

NI Shuiping, MA Xinliang

  1. Henan Polytechnic University
  • Received: 2022-10-11  Revised: 2022-11-07  Online: 2023-12-28  Published: 2023-12-29
  • Corresponding author: NI Shuiping  E-mail: nishuiping@hpu.edu.cn
  • Supported by:
     National Natural Science Foundation of China (61872126)

Abstract: Image classification networks are often compressed to reduce their parameter counts, which typically lowers classification accuracy. To address this problem, a self-distillation lightweight image classification network scheme is proposed. First, a lightweight attention module with negligible computation and parameter overhead is introduced into the self-distillation framework, reducing the framework's parameter count and computational cost and thereby making the self-distillation framework itself lightweight. Then, group convolution and depthwise separable convolution are used to compress the residual network and the VGG11 network, and the two compressed networks serve as teacher models. According to the depth of each teacher model, multiple shallow classifiers are constructed as student models, and the lightweight self-distillation framework is assembled from them. Experimental results show that the proposed scheme preserves the effect of the original self-distillation: the compressed image classification networks achieve a large reduction in parameters without falling below the original classification accuracy, which also eases model deployment.
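
The scheme above combines two standard mechanisms: convolutional compression (group convolution and depthwise separable convolution) and a self-distillation loss in which shallow classifiers attached at intermediate depths learn from the deepest classifier. The following minimal sketch, assuming PyTorch, illustrates how such components are commonly written; the class and function names, the temperature T and the weight alpha are illustrative assumptions, not the authors' implementation.

import torch.nn as nn
import torch.nn.functional as F

class DepthwiseSeparableConv(nn.Module):
    # 3x3 depthwise convolution followed by a 1x1 pointwise convolution;
    # using groups smaller than in_ch would give ordinary group convolution.
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.pointwise(self.depthwise(x))))

def self_distillation_loss(student_logits_list, teacher_logits, labels,
                           T=3.0, alpha=0.5):
    # Cross-entropy on the deepest (teacher) output, plus, for every shallow
    # (student) classifier, cross-entropy on the labels and KL distillation
    # toward the softened teacher distribution.
    loss = F.cross_entropy(teacher_logits, labels)
    for s_logits in student_logits_list:
        ce = F.cross_entropy(s_logits, labels)
        kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                      F.softmax(teacher_logits.detach() / T, dim=1),
                      reduction="batchmean") * (T * T)
        loss = loss + (1.0 - alpha) * ce + alpha * kd
    return loss

In a full training loop, blocks such as DepthwiseSeparableConv would replace the standard convolutions of the ResNet and VGG11 backbones, and self_distillation_loss would be evaluated on the logits of every shallow classifier attached along the network depth.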

Key words: image classification, neural network, model compression, self-distillation
