Article Information

  • Title: Boosting Low-Resource Biomedical QA via Entity-Aware Masking Strategies
  • Authors: Gabriele Pergola; Elena Kochkina; Lin Gui
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 1977-1985
  • DOI: 10.18653/v1/2021.eacl-main.169
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Biomedical question answering (QA) has gained increased attention for its capability to provide users with high-quality information from a vast scientific literature. Although an increasing number of biomedical QA datasets have recently been made available, those resources are still rather limited and expensive to produce; thus, transfer learning via pre-trained language models (LMs) has been shown to be a promising approach to leverage existing general-purpose knowledge. However, fine-tuning these large models can be costly and time-consuming, and often yields limited benefits when adapting to the specific themes of specialised domains, such as the COVID-19 literature. Therefore, to further bootstrap their domain adaptation, we propose a simple yet unexplored approach, which we call the biomedical entity-aware masking (BEM) strategy, encouraging masked language models to learn entity-centric knowledge based on the pivotal entities characterizing the domain at hand, and employing those entities to drive the LM fine-tuning. The resulting strategy is a downstream process applicable to a wide variety of masked LMs, requiring no additional memory or components in the neural architecture. Experimental results show performance on par with state-of-the-art models on several biomedical QA datasets.
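
The abstract above describes masking pivotal biomedical entities during masked-LM fine-tuning rather than masking tokens uniformly at random. Below is a minimal, illustrative sketch of such an entity-aware masking step, assuming whitespace-tokenized input and precomputed entity spans; the function name entity_aware_mask and the parameters mask_prob and entity_bias are hypothetical placeholders for exposition, not the paper's actual implementation.

```python
import random

MASK_TOKEN = "[MASK]"

def entity_aware_mask(tokens, entity_spans, mask_prob=0.15, entity_bias=0.8):
    """Mask tokens, preferring positions inside entity mentions.

    tokens       : list of token strings
    entity_spans : list of (start, end) index pairs marking entity mentions
    mask_prob    : overall fraction of tokens to mask (assumed value)
    entity_bias  : share of the masking budget spent on entity tokens (assumed value)
    """
    entity_positions = {i for start, end in entity_spans for i in range(start, end)}
    other_positions = [i for i in range(len(tokens)) if i not in entity_positions]

    # Total number of positions to mask, then split between entity and non-entity tokens.
    budget = max(1, round(mask_prob * len(tokens)))
    n_entity = min(len(entity_positions), round(entity_bias * budget))
    n_other = min(len(other_positions), max(0, budget - n_entity))

    chosen = set(random.sample(sorted(entity_positions), n_entity))
    chosen |= set(random.sample(other_positions, n_other))

    masked = [MASK_TOKEN if i in chosen else tok for i, tok in enumerate(tokens)]
    labels = [tok if i in chosen else None for i, tok in enumerate(tokens)]  # MLM targets
    return masked, labels

# Example: the entity span (3, 4) covers "remdesivir", so it is masked preferentially.
tokens = "patients treated with remdesivir showed faster recovery".split()
masked, labels = entity_aware_mask(tokens, entity_spans=[(3, 4)])
print(masked)
```

In practice the entity spans would come from a biomedical named-entity recognizer run over the corpus, and the masked/label pairs would feed a standard masked-LM fine-tuning loop; no change to the model architecture is needed, which matches the abstract's claim that BEM adds no extra memory or components.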