
Article Information

  • Title: BERxiT: Early Exiting for BERT with Better Fine-Tuning and Extension to Regression
  • Authors: Ji Xin; Raphael Tang; Yaoliang Yu
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 91-104
  • DOI: 10.18653/v1/2021.eacl-main.8
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: The slow speed of BERT has motivated much research on accelerating its inference, and the early exiting idea has been proposed to make trade-offs between model quality and efficiency. This paper aims to address two weaknesses of previous work: (1) existing fine-tuning strategies for early exiting models fail to take full advantage of BERT; (2) methods to make exiting decisions are limited to classification tasks. We propose a more advanced fine-tuning strategy and a learning-to-exit module that extends early exiting to tasks other than classification. Experiments demonstrate improved early exiting for BERT, with better trade-offs obtained by the proposed fine-tuning strategy, successful application to regression tasks, and the possibility of combining it with other acceleration methods.
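
The abstract describes attaching exit points to intermediate layers and replacing softmax-confidence exit rules with a learned "learning-to-exit" signal, which is what lets early exiting cover regression tasks. Below is a minimal illustrative sketch of that idea in plain PyTorch; the layer count, hidden size, first-token pooling, exit threshold, and the tiny per-layer exit head are assumptions for illustration, not the paper's exact architecture or training procedure.

# Minimal early-exit inference sketch with a learned exit score (illustrative
# assumptions throughout; not the BERxiT authors' exact implementation).
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    def __init__(self, hidden=256, num_layers=6, heads=4, out_dim=1):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.TransformerEncoderLayer(d_model=hidden, nhead=heads, batch_first=True)
            for _ in range(num_layers)
        ])
        # One task head per layer, so any layer can produce a prediction
        # (out_dim=1 covers regression; use the number of classes for classification).
        self.task_heads = nn.ModuleList([nn.Linear(hidden, out_dim) for _ in range(num_layers)])
        # Learned exit heads: score how trustworthy the current layer's prediction is,
        # which works even for regression where there is no softmax confidence to threshold.
        self.exit_heads = nn.ModuleList([nn.Linear(hidden, 1) for _ in range(num_layers)])

    @torch.no_grad()
    def infer(self, x, exit_threshold=0.9):
        """Run layers one by one and stop once the learned exit score clears the threshold."""
        h = x
        for i, layer in enumerate(self.layers):
            h = layer(h)
            pooled = h[:, 0]                          # first-token pooling, as in BERT
            pred = self.task_heads[i](pooled)
            exit_score = torch.sigmoid(self.exit_heads[i](pooled))
            if exit_score.mean().item() >= exit_threshold or i == len(self.layers) - 1:
                return pred, i + 1                    # prediction and number of layers used

model = EarlyExitEncoder()
model.eval()
tokens = torch.randn(2, 16, 256)                      # (batch, seq_len, hidden) dummy embeddings
pred, layers_used = model.infer(tokens, exit_threshold=0.9)
print(pred.shape, layers_used)

In this sketch the exit decision is taken on the batch-averaged exit score for simplicity; per-example exiting and the joint fine-tuning of the task heads and exit heads are what the paper actually studies.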