Article Information

  • Title: BERT Prescriptions to Avoid Unwanted Headaches: A Comparison of Transformer Architectures for Adverse Drug Event Detection
  • Authors: Beatrice Portelli; Edoardo Lenzi; Emmanuele Chersoni
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 1740-1747
  • DOI: 10.18653/v1/2021.eacl-main.149
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Pretrained transformer-based models, such as BERT and its variants, have become a common choice for obtaining state-of-the-art performance in NLP tasks. In the identification of Adverse Drug Events (ADE) from social media texts, for example, BERT architectures rank first on the leaderboard. However, a systematic comparison between these models has not yet been carried out. In this paper, we aim to shed light on the differences in their performance by analyzing the results of 12 models tested on two standard benchmarks. SpanBERT and PubMedBERT emerged as the best models in our evaluation: this result clearly shows that span-based pretraining gives a decisive advantage in the precise recognition of ADEs, and that in-domain language pretraining is particularly useful when the transformer model is pretrained from scratch solely on biomedical text.
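To make the kind of setup the abstract compares concrete, below is a minimal sketch (not the authors' code) of fine-tuning one of the evaluated models for ADE detection framed as token classification, using the Hugging Face transformers library. The Hub model identifier, the example sentence, and the BIO label scheme for ADE spans are assumptions for illustration, not details from the paper.

```python
# A minimal sketch, assuming the Hugging Face "transformers" library, of
# setting up a pretrained transformer for ADE detection as token
# classification. Model ID and BIO tag set are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Hypothetical BIO tag set marking Adverse Drug Event spans.
labels = ["O", "B-ADE", "I-ADE"]

# PubMedBERT: pretrained from scratch on biomedical text (in-domain).
model_name = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels)
)  # the classification head is freshly initialized and must be fine-tuned

text = "Took ibuprofen for my back and woke up with a terrible headache."
inputs = tokenizer(text, return_tensors="pt")
logits = model(**inputs).logits      # shape: (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1)[0]  # per-token label indices

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pid in zip(tokens, pred_ids):
    print(f"{token}\t{labels[pid]}")  # head untrained here, so tags are random
```

Swapping model_name for a span-pretrained checkpoint such as SpanBERT/spanbert-base-cased would, after fine-tuning on an ADE benchmark, reflect the contrast the abstract draws between span-based and in-domain pretraining.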