Abstract: Regularization of simple linear regression models for system identification is a recent, much-studied problem. Several parameterizations (“kernels”) of the regularization matrix have been suggested, together with different ways of estimating (“tuning”) its parameters. This contribution takes an asymptotic view of the problem of tuning and selecting kernels. It is shown that the SURE approach to parameter tuning provides an asymptotically consistent estimate of the optimal (in an MSE sense) hyperparameters. At the same time, it is shown that the common marginal likelihood (empirical Bayes) approach does not enjoy that property.
Keywords: Linear system identification; Gaussian process regression; Kernel-based regularization; Marginal likelihood estimators; Stein’s unbiased risk estimators
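To make the two hyperparameter-tuning criteria mentioned in the abstract concrete, the following is a minimal sketch (not the paper's code) for a kernel-regularized FIR model Y = Φθ + E with E ~ N(0, σ²I) and prior θ ~ N(0, P(η)). The simulated system, the TC kernel P_ij = c·λ^max(i,j), the known noise variance, and the grid search over λ are illustrative assumptions, not choices taken from the paper; the SURE criterion is written here for the output fit and the empirical Bayes criterion as the negative log marginal likelihood, each up to an additive constant.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate data from a stable FIR system (illustrative setup) -------------
n, N, sigma = 30, 400, 0.1                       # FIR order, data length, noise std
theta_true = 0.8 ** np.arange(n)                 # exponentially decaying impulse response
u = rng.standard_normal(N + n)                   # white-noise input
Phi = np.column_stack([u[n - k - 1:N + n - k - 1] for k in range(n)])
Y = Phi @ theta_true + sigma * rng.standard_normal(N)

def tc_kernel(c, lam, n):
    """TC kernel: P_ij = c * lam**max(i, j)."""
    idx = np.arange(1, n + 1)
    return c * lam ** np.maximum.outer(idx, idx)

def sure_criterion(P):
    """Stein's unbiased risk estimate of the output error, up to a constant.

    The regularized estimate is theta_hat = (Phi'Phi + sigma^2 P^{-1})^{-1} Phi' Y,
    so the fitted output is H @ Y with the linear smoother H below.
    """
    H = Phi @ np.linalg.solve(Phi.T @ Phi + sigma**2 * np.linalg.inv(P), Phi.T)
    resid = Y - H @ Y
    return resid @ resid + 2 * sigma**2 * np.trace(H)

def neg_log_marglik(P):
    """Negative log marginal likelihood (empirical Bayes criterion), up to a constant.

    Under the Gaussian prior, Y ~ N(0, Phi P Phi' + sigma^2 I).
    """
    Sigma = Phi @ P @ Phi.T + sigma**2 * np.eye(N)
    _, logdet = np.linalg.slogdet(Sigma)
    return Y @ np.linalg.solve(Sigma, Y) + logdet

# --- tune the kernel decay hyperparameter with each criterion ----------------
lams = np.linspace(0.5, 0.95, 10)
lam_sure = min(lams, key=lambda lam: sure_criterion(tc_kernel(1.0, lam, n)))
lam_eb = min(lams, key=lambda lam: neg_log_marglik(tc_kernel(1.0, lam, n)))
print(f"lambda chosen by SURE: {lam_sure:.2f}, by marginal likelihood: {lam_eb:.2f}")
```

The abstract's asymptotic comparison concerns how the minimizers of these two criteria behave as the data length N grows relative to the MSE-optimal hyperparameters; the sketch only shows how each criterion is formed and minimized for a single data set.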