
Article Information

  • Title: A neural network regularization method to address variance inflation in autoencoders
  • Authors: Boeun Kim; Kyung Hwan Ryu; Seongmin Heo
  • Journal: IFAC PapersOnLine
  • Print ISSN: 2405-8963
  • Year: 2022
  • Volume: 55
  • Issue: 7
  • Pages: 744-749
  • DOI: 10.1016/j.ifacol.2022.07.533
  • Language: English
  • Publisher: Elsevier
  • Abstract: There exist various machine learning techniques which can be used to reduce the dimensionality of original data while minimizing information loss. Principal component analysis (PCA) is one of the best known such techniques; it transforms the original correlated variables into uncorrelated variables called principal components. Although PCA is known to preserve the total variance of the original data during the transformation, there are cases with a potential for variance inflation, where the total variance of the principal components becomes much larger than that of the original variables. It is important to prevent variance inflation, as it can negatively affect the performance of other application systems (e.g. process monitoring systems) designed on the basis of principal components with inflated variances. Variance inflation also has a high potential to occur during the training of autoencoders, a special type of neural network that performs a nonlinear version of PCA. Although several neural network regularization methods are available to alleviate the problem of variance inflation, none of them is tailored to this task. To this end, in this work, an alternative neural network regularization method is proposed, which can strongly regulate the total variance in the feature space. Using the Tennessee Eastman process as an illustrative example, the proposed regularization method is compared with existing ones in terms of neural network overfitting, variance inflation, and training time. (See the illustrative sketch after this record.)
  • Keywords: principal component analysis; autoencoder; feature extraction; feature variance; neural network regularization
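
The sketch below is a minimal illustration, not the authors' formulation, of how a variance-related penalty can be attached to an autoencoder's reconstruction loss. The PyTorch Autoencoder architecture, the penalty weight lam, and the hinge-style penalty on inflation of the total latent variance above the total input variance are illustrative assumptions; the regularizer proposed in the paper may be defined differently.

  # Illustrative sketch only: an autoencoder loss with a penalty that
  # discourages the total variance of the latent features from exceeding
  # the total variance of the inputs. Architecture and `lam` are assumptions.
  import torch
  import torch.nn as nn

  class Autoencoder(nn.Module):
      def __init__(self, n_inputs: int, n_latent: int):
          super().__init__()
          self.encoder = nn.Sequential(nn.Linear(n_inputs, 32), nn.Tanh(),
                                       nn.Linear(32, n_latent))
          self.decoder = nn.Sequential(nn.Linear(n_latent, 32), nn.Tanh(),
                                       nn.Linear(32, n_inputs))

      def forward(self, x):
          z = self.encoder(x)          # latent features
          return z, self.decoder(z)    # features and reconstruction

  def regularized_loss(x, model, lam=0.1):
      """Reconstruction error plus a penalty on variance inflation in the
      feature space (hypothetical formulation for illustration)."""
      z, x_hat = model(x)
      recon = ((x - x_hat) ** 2).mean()
      total_var_x = x.var(dim=0, unbiased=True).sum()
      total_var_z = z.var(dim=0, unbiased=True).sum()
      # Penalize only the excess of latent variance over input variance.
      inflation = torch.clamp(total_var_z - total_var_x, min=0.0)
      return recon + lam * inflation

In such a setup, minimizing regularized_loss during training would keep the total variance of the latent features from growing far beyond that of the original variables, which is the general effect the abstract attributes to the proposed regularization method.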