
Article Information

  • Title: Exploring the Influence of Focal Loss on Transformer Models for Imbalanced Maintenance Data in Industry 4.0
  • Authors: Juan Pablo Usuga-Cadavid; Bernard Grabot; Samir Lamouri
  • Journal: IFAC PapersOnLine
  • Print ISSN: 2405-8963
  • Year: 2021
  • Volume: 54
  • Issue: 1
  • Pages: 1023-1028
  • DOI: 10.1016/j.ifacol.2021.08.121
  • Language: English
  • Publisher: Elsevier
  • Abstract: Harnessing data from historical maintenance databases may be challenging, as they tend to rely on text data provided by operators. Thus, they often include acronyms, jargon, typos, and other irregularities that complicate the automated analysis of such reports. Furthermore, maintenance datasets may present highly imbalanced distributions: some situations happen more often than others, which hinders the effective application of classic Machine Learning (ML) models. Hence, this paper explores the use of a recent Deep Learning (DL) architecture called Transformer, which has provided cutting-edge results in Natural Language Processing (NLP). To tackle the class imbalance, a loss function called Focal Loss (FL) is explored. Results suggest that when all the classes are equally important, the FL does not improve the classification performance. However, if the objective is to detect the minority class, the FL achieves the best performance, although by degrading the detection capacity for the majority class.
  • Keywords: Artificial Intelligence; Natural Language Processing; Predictive Maintenance; Imbalanced Classification; Deep Learning; Transformers; Transfer Learning
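The Focal Loss explored in the abstract down-weights the contribution of well-classified examples so that training focuses on hard, typically minority-class samples. The paper does not reproduce its formula here, but the standard binary form (from Lin et al., "Focal Loss for Dense Object Detection") is FL(p_t) = -α_t (1 - p_t)^γ log(p_t). A minimal illustrative sketch, not the authors' implementation:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss for one prediction (standard form, Lin et al.).

    p     -- predicted probability of the positive class, 0 < p < 1
    y     -- true label, 1 or 0
    gamma -- focusing parameter; gamma = 0 recovers weighted cross-entropy
    alpha -- class-balancing weight given to the positive class
    """
    if y == 1:
        p_t, a_t = p, alpha          # probability assigned to the true class
    else:
        p_t, a_t = 1.0 - p, 1.0 - alpha
    # (1 - p_t)^gamma shrinks the loss of confident, correct predictions
    return -a_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

With γ = 2, a confidently correct prediction (p_t = 0.9) contributes roughly 100× less loss than under plain cross-entropy, which is why FL can favor minority-class detection at the cost of majority-class performance, as the abstract reports.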