
Article Information

  • Title: LogSE: An Uncertainty-Based Multi-Task Loss Function for Learning Two Regression Tasks
  • Authors: Zeinab Ghasemi-Naraghi; Ahmad Nickabadi; Reza Safabakhsh
  • Journal: Journal of Universal Computer Science
  • Print ISSN: 0948-6968
  • Year: 2022
  • Volume: 28
  • Issue: 2
  • Pages: 141-159
  • DOI: 10.3897/jucs.70549
  • Language: English
  • Publisher: Graz University of Technology and Know-Center
  • Abstract: Multi-task learning (MTL) is a popular method in machine learning that exploits information shared across related tasks to learn each task more efficiently and accurately. Naively, one can benefit from MTL by using a weighted linear sum of the different tasks' loss functions. Manually specifying appropriate weights is difficult and typically does not improve performance, so an automatic weighting strategy for MTL is critical. Three types of uncertainty can be captured in deep learning. Epistemic uncertainty arises from a lack of data. Heteroscedastic aleatoric uncertainty depends on the input and differs from one input to another. In this paper, we focus on the third type, homoscedastic aleatoric uncertainty, which is constant across inputs and is task-dependent. Some existing methods learn uncertainty-based weights as parameters of the model. In this paper, we instead introduce a novel multi-task loss function that captures homoscedastic uncertainty in models with multiple regression tasks without increasing the complexity of the network. As the experiments show, the proposed loss function helps a network learn multiple regression tasks fairly, with higher accuracy in fewer training steps.
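The abstract contrasts the proposed loss with prior methods that learn uncertainty-based weights as extra model parameters. The exact form of LogSE is not given here; the sketch below instead illustrates the well-known baseline it refers to, the homoscedastic-uncertainty weighting of Kendall et al. (2018), in which each task's loss is scaled by a learned log-variance. The function name and the example loss values are illustrative, not from the paper.

```python
import math

def uncertainty_weighted_loss(task_losses, log_vars):
    """Homoscedastic-uncertainty weighting (Kendall et al., 2018).

    Each task i contributes exp(-s_i) * L_i + s_i, where s_i = log(sigma_i^2)
    is a per-task parameter learned jointly with the network. Tasks the model
    is uncertain about (large s_i) are down-weighted, while the +s_i term
    prevents the trivial solution of inflating every variance.
    """
    total = 0.0
    for L_i, s_i in zip(task_losses, log_vars):
        total += math.exp(-s_i) * L_i + s_i
    return total

# Two regression tasks with raw losses 4.0 and 0.5 and equal log-variances 0:
print(uncertainty_weighted_loss([4.0, 0.5], [0.0, 0.0]))  # 4.5
```

In practice the `log_vars` would be trainable parameters updated by the same optimizer as the network weights; the paper's contribution is a loss that achieves uncertainty-aware balancing without adding such parameters.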