
Article Information

  • Title: Optimal equivariant prediction for high-dimensional linear models with arbitrary predictor covariance
  • Author: Lee H. Dicker
  • Journal: Electronic Journal of Statistics
  • Print ISSN: 1935-7524
  • Year: 2013
  • Volume: 7
  • Pages: 1806-1834
  • DOI: 10.1214/13-EJS826
  • Language: English
  • Publisher: Institute of Mathematical Statistics
  • Abstract: In a linear model, consider the class of estimators that are equivariant with respect to linear transformations of the predictor basis. Each of these estimators determines an equivariant linear prediction rule. Equivariant prediction rules may be appropriate in settings where sparsity assumptions (like those common in high-dimensional data analysis) are untenable or little is known about the relevance of the given predictor basis, insofar as it relates to the outcome. In this paper, we study the out-of-sample prediction error associated with equivariant estimators in high-dimensional linear models with Gaussian predictors and errors. We show that non-trivial equivariant prediction is impossible when the number of predictors $d$ is greater than the number of observations $n$. For $d/n\to \rho \in[0,1)$, we show that a James-Stein estimator (a scalar multiple of the ordinary least squares estimator) is asymptotically optimal for equivariant out-of-sample prediction, and derive a closed-form expression for its asymptotic predictive risk. Finally, we undertake a detailed comparative analysis involving the proposed James-Stein estimator and other well-known estimators for non-sparse settings, including the ordinary least squares estimator, ridge regression, and other James-Stein estimators for the linear model. Among other things, this comparative analysis sheds light on the role of the population-level predictor covariance matrix and reveals that other previously studied James-Stein estimators for the linear model are sub-optimal in terms of out-of-sample prediction error.
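The abstract describes the optimal equivariant predictor as a James-Stein estimator obtained by rescaling the ordinary least squares fit. The following minimal sketch, assuming n > d, Gaussian-style data, and a simple positive-part plug-in shrinkage factor (a stand-in, not the paper's derived asymptotically optimal factor), illustrates the general form of such a scalar-multiple-of-OLS estimator.

```python
import numpy as np

def james_stein_scaled_ols(X, y):
    """Illustrative James-Stein-type scalar shrinkage of the OLS estimator.

    Sketch only: the shrinkage factor below is a generic positive-part
    plug-in choice, not the asymptotically optimal factor derived in the
    paper. Requires n > d so that OLS and the residual variance estimate exist.
    """
    n, d = X.shape
    beta_ols, _, _, _ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least squares fit
    resid = y - X @ beta_ols
    sigma2_hat = resid @ resid / (n - d)                    # residual variance estimate
    signal_hat = beta_ols @ (X.T @ X) @ beta_ols / n        # estimated signal energy
    # Positive-part shrinkage factor (hypothetical illustration).
    c = max(0.0, 1.0 - (d / n) * sigma2_hat / max(signal_hat, 1e-12))
    return c * beta_ols

# Usage: out-of-sample predictions at new predictors X_new are
# X_new @ james_stein_scaled_ols(X, y).
```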