
Article Information

  • Title: High-dimensional neural feature design for layer-wise reduction of training cost
  • Authors: Alireza M. Javid; Arun Venkitaraman; Mikael Skoglund
  • Journal: EURASIP Journal on Advances in Signal Processing
  • Print ISSN: 1687-6172
  • Electronic ISSN: 1687-6180
  • Year: 2020
  • Volume: 2020
  • Issue: 1
  • Pages: 1
  • DOI: 10.1186/s13634-020-00695-2
  • Publisher: Hindawi Publishing Corporation
  • Abstract: We design a rectified linear unit (ReLU)-based multilayer neural network by mapping the feature vectors to a higher-dimensional space in every layer. We design the weight matrices in every layer to ensure a reduction of the training cost as the number of layers increases. Linear projection to the target in the higher-dimensional space leads to a lower training cost if a convex cost is minimized. An ℓ2-norm convex constraint is used in the minimization to reduce the generalization error and avoid overfitting. The regularization hyperparameters of the network are derived analytically to guarantee a monotonic decrease of the training cost, thereby eliminating the need for cross-validation to find the regularization hyperparameter in each layer. We show that the proposed architecture is norm-preserving and provides an invertible feature vector, and can therefore be used to reduce the training cost of any other learning method that employs linear projection to estimate the target. (An illustrative sketch of such a layer-wise construction follows this list.)
  • Keywords: Rectified linear unit; Feature design; Neural network; Convex cost function
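
The following is a minimal, illustrative Python sketch of the layer-wise idea described in the abstract, not the authors' exact construction: each layer expands the features by stacking the previous layer's linear prediction, its negation, and a hypothetical random expansion, passes the result through a ReLU, and refits an ℓ2-regularized linear projection to the target. The identity relu(z) - relu(-z) = z means the new feature space can reproduce the previous prediction, which is the intuition behind a non-increasing training cost; the fixed ridge parameter lam and the random matrix R here are assumptions made for illustration, whereas the paper derives the regularization hyperparameters analytically.

```python
# Illustrative sketch only; weight design and hyperparameters differ in the paper.
import numpy as np

rng = np.random.default_rng(0)

def ridge_readout(H, T, lam=1e-2):
    """Linear projection to the target with an l2 (ridge) penalty (assumed fixed lam)."""
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ T)

def relu(z):
    return np.maximum(z, 0.0)

# Toy data: N samples, input dimension p, target dimension q (assumed shapes).
N, p, q, extra = 200, 20, 3, 30
X = rng.standard_normal((N, p))
T = rng.standard_normal((N, q))

H = X                      # layer-0 features
O = ridge_readout(H, T)    # layer-0 linear projection to the target
costs = [np.linalg.norm(H @ O - T) ** 2]

for layer in range(5):
    Y_prev = H @ O                                  # previous layer's prediction
    R = rng.standard_normal((H.shape[1], extra))    # hypothetical random expansion
    # Map to a higher-dimensional space; relu(y) - relu(-y) recovers y exactly,
    # so the new features contain the previous prediction.
    H = relu(np.hstack([Y_prev, -Y_prev, H @ R]))
    O = ridge_readout(H, T)
    costs.append(np.linalg.norm(H @ O - T) ** 2)

print(np.round(costs, 3))  # training cost per layer (typically non-increasing)
```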