
Article Information

  • Title: Combine-Net: An Improved Filter Pruning Algorithm
  • Authors: Jinghan Wang; Guangyue Li; Wenzhao Zhang
  • Journal: Information
  • Electronic ISSN: 2078-2489
  • Year: 2021
  • Volume: 12
  • Issue: 7
  • Pages: 264
  • DOI: 10.3390/info12070264
  • Publisher: MDPI Publishing
  • Abstract: The powerful performance of deep learning is widely recognized. As research has deepened, neural networks have grown more complex and are not easily deployed on resource-constrained devices. The emergence of a series of model compression algorithms makes artificial intelligence on the edge possible. Among them, structured model pruning is widely used because of its versatility: it prunes the network itself, discarding relatively unimportant structures to reduce the model's size. However, problems remain in previous pruning work, such as errors in evaluating pruned networks, purely empirical choice of the pruning rate, and low retraining efficiency. Therefore, we propose Combine-Net, an accurate, objective, and efficient pruning algorithm that introduces Adaptive BN to eliminate evaluation errors, the Kneedle algorithm to determine the pruning rate objectively, and knowledge distillation to improve retraining efficiency (a sketch of these three stages follows the keyword list). Results show that, without accuracy loss, Combine-Net achieves 95% parameter compression and 83% computation compression with VGG16 on CIFAR10, and 71% parameter compression and 41% computation compression with ResNet50 on CIFAR100. Experiments on different datasets and models show that Combine-Net can efficiently compress a neural network's parameters and computation.
  • Keywords: network pruning; model compression; knowledge distillation; artificial intelligence; edge computing
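
The abstract names three components: Adaptive BN to evaluate pruned candidates accurately, the Kneedle algorithm to pick the pruning rate objectively, and knowledge distillation to speed up retraining. Below is a minimal PyTorch-style sketch of those three stages. It is not the authors' code: every function name and default value is an illustrative assumption, and the knee criterion shown (maximum perpendicular distance to the chord between the curve's endpoints) is a simplified stand-in for the full Kneedle algorithm.

```python
import numpy as np
import torch
import torch.nn.functional as F


def adaptive_bn_recalibrate(model, loader, num_batches=50):
    """Adaptive BN: re-estimate BatchNorm running statistics on a few
    training batches so a pruned candidate is scored with statistics
    that match its new structure, not stale ones inherited from the
    unpruned network."""
    for m in model.modules():
        if isinstance(m, torch.nn.BatchNorm2d):
            m.reset_running_stats()
    model.train()  # BN updates running stats only in train mode
    with torch.no_grad():
        for i, (x, _) in enumerate(loader):
            if i >= num_batches:
                break
            model(x)
    model.eval()


def knee_pruning_rate(rates, accuracies):
    """Pick the pruning rate at the knee of the accuracy-vs-rate curve.
    Simplified Kneedle-style criterion: the point with the greatest
    perpendicular distance to the chord joining the two endpoints."""
    pts = np.stack([np.asarray(rates, float),
                    np.asarray(accuracies, float)], axis=1)
    chord = pts[-1] - pts[0]
    chord /= np.linalg.norm(chord)
    rel = pts - pts[0]
    # Distance from the chord = norm of each point's rejection from it.
    dist = np.linalg.norm(rel - np.outer(rel @ chord, chord), axis=1)
    return rates[int(np.argmax(dist))]


def distillation_loss(student_logits, teacher_logits, labels,
                      T=4.0, alpha=0.9):
    """Hinton-style knowledge distillation for retraining the pruned
    student: soften teacher and student logits with temperature T and
    mix the KL term with ordinary cross-entropy (T and alpha are
    illustrative defaults, not values from the paper)."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

Under these assumptions, a pruning loop would call adaptive_bn_recalibrate on each candidate before measuring its accuracy, choose the rate with knee_pruning_rate, and then retrain the pruned network with distillation_loss against the unpruned model as teacher.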