Journal: EURASIP Journal on Advances in Signal Processing
Print ISSN: 1687-6172
Electronic ISSN: 1687-6180
Publication year: 2020
Volume: 2020
Issue: 1
Pages: 1
DOI: 10.1186/s13634-020-00695-2
Publisher: Springer (SpringerOpen)
Abstract: We design a rectified linear unit-based multilayer neural network by mapping the feature vectors to a higher-dimensional space in every layer. We design the weight matrices in every layer to ensure a reduction of the training cost as the number of layers increases. Linear projection to the target in the higher-dimensional space leads to a lower training cost if a convex cost is minimized. An ℓ2-norm convex constraint is used in the minimization to reduce the generalization error and avoid overfitting. The regularization hyperparameters of the network are derived analytically to guarantee a monotonic decrement of the training cost, thereby eliminating the need for cross-validation to find the regularization hyperparameter in each layer. We show that the proposed architecture is norm-preserving and provides an invertible feature vector, and can therefore be used to reduce the training cost of any other learning method that employs linear projection to estimate the target.
Keywords: Rectified linear unit; Feature design; Neural network; Convex cost function
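The abstract's core construction, a norm-preserving ReLU lifting to a higher-dimensional space followed by an ℓ2-regularized linear projection to the target, can be sketched as below. This is an illustrative reconstruction, not the authors' code: the random orthonormal weight matrix, the layer widths, the toy data, and the fixed regularization value `lam` are all assumptions (the paper derives the hyperparameters analytically to guarantee a monotonic cost decrease, which this sketch does not reproduce).

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def norm_preserving_lift(H, m):
    """ReLU mapping of features to a higher-dimensional space.

    Q has orthonormal columns (Q.T @ Q = I), so the linear step keeps
    every sample's Euclidean norm.  Stacking relu(Z) and relu(-Z) also
    preserves the norm and makes the map invertible, because
    Z = relu(Z) - relu(-Z).
    """
    n = H.shape[1]
    G = rng.normal(size=(m, n))      # assumed random weights
    Q, _ = np.linalg.qr(G)           # Q: (m, n) with orthonormal columns
    Z = H @ Q.T                      # norm-preserving linear map
    return np.concatenate([relu(Z), relu(-Z)], axis=1)

def ridge_projection(H, T, lam):
    # Linear projection to the target under an l2 (convex) penalty:
    # minimize ||H W - T||^2 + lam * ||W||^2.
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ T)

# Toy regression data (purely illustrative).
X = rng.normal(size=(200, 8))
T = np.sin(X @ rng.normal(size=(8, 1)))

lam = 1e-2                           # fixed here; derived analytically in the paper
H = X
costs = []
for layer in range(3):
    H = norm_preserving_lift(H, 2 * H.shape[1])
    W_out = ridge_projection(H, T, lam)
    costs.append(float(np.mean((H @ W_out - T) ** 2)))
```

After each lifting, `costs` records the training cost of the best linear projection in the enlarged space; since the lifted features linearly encode the previous layer's features, each layer's least-squares fit can only match or improve the previous one (the paper's analytic choice of the regularization hyperparameter extends this guarantee to the regularized cost).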