
Article Information

  • Title: Benchmarking a transformer-FREE model for ad-hoc retrieval
  • Authors: Tiago Almeida; Sérgio Matos
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 3343-3353
  • DOI: 10.18653/v1/2021.eacl-main.293
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Transformer-based “behemoths” have grown in popularity, as well as structurally, shattering multiple NLP benchmarks along the way. However, their real-world usability remains a question. In this work, we empirically assess the feasibility of applying transformer-based models in real-world ad-hoc retrieval applications by comparing them to a “greener and more sustainable” alternative comprising only 620 trainable parameters. We present an analysis of their efficacy and efficiency and show that, under limited computational resources, the lighter model running on the CPU achieves a 3 to 20 times speedup in training and a 7 to 47 times speedup in inference while maintaining comparable retrieval performance. Code to reproduce the efficiency experiments is available at https://github.com/bioinformatics-ua/EACL2021-reproducibility/.
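
The speedup figures quoted in the abstract are ratios of wall-clock running times between the lightweight model and a transformer-based one. As a rough illustration only, the sketch below shows one way such a ratio might be measured in Python; the function names and the `time.sleep` placeholder workloads are hypothetical and are not taken from the paper's repository.

    import time

    def measure(fn, n_runs=5):
        """Return the mean wall-clock time (seconds) of calling fn over n_runs runs."""
        times = []
        for _ in range(n_runs):
            start = time.perf_counter()
            fn()
            times.append(time.perf_counter() - start)
        return sum(times) / len(times)

    # Hypothetical stand-ins: in an actual experiment these would score the same
    # query-document batch with the lightweight model and a transformer ranker.
    def light_model_inference():
        time.sleep(0.01)   # placeholder workload

    def transformer_inference():
        time.sleep(0.20)   # placeholder workload

    t_light = measure(light_model_inference)
    t_heavy = measure(transformer_inference)
    print(f"inference speedup: {t_heavy / t_light:.1f}x")

Averaging over several runs with a monotonic clock (`time.perf_counter`) is a common way to reduce noise in such comparisons; the paper's own measurement code is in the linked repository.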