
Article Information

  • Title: BERT meets Shapley: Extending SHAP Explanations to Transformer-based Classifiers
  • Authors: Enja Kokalj; Blaž Škrlj; Nada Lavrač
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 16-21
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Transformer-based neural networks offer strong classification performance across a wide range of domains, but they do not provide explanations of their predictions. While several explanation methods, including SHAP, address the problem of interpreting deep learning models, they are not adapted to operate on state-of-the-art transformer-based neural networks such as BERT. Another shortcoming of these methods is that their visualization of explanations as lists of the most relevant words ignores the sequential and structurally dependent nature of text. This paper proposes TransSHAP, a method that adapts SHAP to transformer models, including BERT-based text classifiers. It advances SHAP visualizations by presenting explanations sequentially, and human evaluators assessed it as competitive with state-of-the-art solutions.
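
The paper's own TransSHAP implementation is not reproduced here. As a rough illustration of the workflow the abstract describes (SHAP-style explanations for a transformer text classifier with a sequential, in-context visualization), the sketch below uses the shap library's built-in support for Hugging Face text pipelines. The model choice, the example sentence, and the top_k argument are illustrative assumptions, not details taken from the paper.

    # Minimal sketch (not the paper's TransSHAP implementation): explain a
    # transformer-based sentiment classifier with the shap library.
    import shap
    import transformers

    # A BERT-style sentiment classifier; top_k=None makes the pipeline
    # return a score for every class, which shap expects.
    classifier = transformers.pipeline("sentiment-analysis", top_k=None)

    # shap.Explainer detects the text pipeline and perturbs the input by
    # masking subword tokens to estimate each token's contribution.
    explainer = shap.Explainer(classifier)

    # Compute SHAP values for an example sentence (an assumed input).
    shap_values = explainer(["The movie was surprisingly good."])

    # Render the contributions over the token sequence rather than as a
    # ranked word list (displays inline in a Jupyter notebook).
    shap.plots.text(shap_values)

Per the abstract, TransSHAP itself differs in that it adapts SHAP's perturbation step specifically to BERT-based classifiers and replaces ranked word lists with a visualization that preserves the word order of the input text.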