
Article Information

  • Title: KENet: Distilling Convolutional Networks via Knowledge Enhancement
  • Authors: Hongzhe Liu; Chi Zhang; Cheng Xu
  • Journal: IFAC PapersOnLine
  • Print ISSN: 2405-8963
  • Year: 2020
  • Volume: 53
  • Issue: 5
  • Pages: 385-390
  • DOI: 10.1016/j.ifacol.2021.04.116
  • Language: English
  • Publisher: Elsevier
  • Abstract: For practical applications, deep neural networks need to be deployed with low memory and computing resources. To achieve this goal, we design a lightweight convolutional neural network named KENet (Knowledge Enhance Network) and propose a knowledge distillation method to improve its performance. KENet is derived from a wide residual network by replacing the normal convolutions with a hybrid of group convolutions and bottleneck blocks to reduce the number of parameters. However, the use of small kernels and group convolutions loses information along both the spatial and channel-wise dimensions. To solve this problem, we further propose a knowledge distillation method that enhances the information of these two dimensions: we extract both spatial and channel-wise knowledge from a 'teacher' and improve the attention transfer features for knowledge distillation (a code sketch of these ideas follows this list). Experimental results on multiple datasets show that KENet is computationally cheap and memory-efficient with hardly any loss of precision. Moreover, we confirm that KENet can be effectively deployed in advanced detectors with strong robustness and real-time performance.
  • Keywords: machine learning; model compression; knowledge distillation; attention transfer
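
A minimal PyTorch sketch of the two ideas the abstract describes, assuming standard formulations: a residual bottleneck built from grouped 3x3 convolutions (the usual way group convolutions cut parameters), and an attention-transfer distillation loss that matches spatial and channel-wise attention maps between teacher and student. The names (GroupBottleneck, attention_transfer_loss) and hyperparameters (groups, reduction, p, beta) are illustrative assumptions, not the paper's published code or its exact improved transfer features.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GroupBottleneck(nn.Module):
    """Residual bottleneck: 1x1 reduce -> 3x3 grouped conv -> 1x1 expand.
    Grouped convolutions cut the 3x3 parameter count roughly by `groups`."""
    def __init__(self, channels, groups=4, reduction=4):
        super().__init__()
        mid = channels // reduction  # mid must be divisible by groups
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, 1, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=1, groups=groups, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, 1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return F.relu(x + self.body(x))

def spatial_attention(feat, p=2):
    # (B, C, H, W) -> L2-normalized (B, H*W): channel-wise mean of |F|^p
    return F.normalize(feat.abs().pow(p).mean(dim=1).flatten(1), dim=1)

def channel_attention(feat, p=2):
    # (B, C, H, W) -> L2-normalized (B, C): spatial mean of |F|^p
    return F.normalize(feat.abs().pow(p).mean(dim=(2, 3)), dim=1)

def attention_transfer_loss(student_feats, teacher_feats, beta=1.0):
    """Match spatial and channel-wise attention maps at paired layers.
    Channel-wise matching assumes equal channel counts at each pair
    (otherwise a 1x1 adapter would be needed; not shown here)."""
    loss = 0.0
    for fs, ft in zip(student_feats, teacher_feats):
        loss = loss + F.mse_loss(spatial_attention(fs), spatial_attention(ft))
        loss = loss + beta * F.mse_loss(channel_attention(fs), channel_attention(ft))
    return loss

# Usage sketch: distill one pair of feature maps with random stand-in tensors.
fs = torch.randn(8, 64, 32, 32)  # student features
ft = torch.randn(8, 64, 32, 32)  # teacher features
print(attention_transfer_loss([fs], [ft]).item())
```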