
Article Information

  • Title: Optimal prediction for sparse linear models? Lower bounds for coordinate-separable M-estimators
  • Authors: Yuchen Zhang ; Martin J. Wainwright ; Michael I. Jordan
  • Journal: Electronic Journal of Statistics
  • Print ISSN: 1935-7524
  • Year: 2017
  • Volume: 11
  • Issue: 1
  • Pages: 752-799
  • DOI:10.1214/17-EJS1233
  • Language: English
  • Publisher: Institute of Mathematical Statistics
  • Abstract: For the problem of high-dimensional sparse linear regression, it is known that an $\ell_{0}$-based estimator can achieve a $1/n$ “fast” rate for prediction error without any conditions on the design matrix, whereas in the absence of restrictive conditions on the design matrix, popular polynomial-time methods only guarantee the $1/\sqrt{n}$ “slow” rate. In this paper, we show that the slow rate is intrinsic to a broad class of M-estimators. In particular, for estimators based on minimizing a least-squares cost function together with a (possibly nonconvex) coordinate-wise separable regularizer, there is always a “bad” local optimum such that the associated prediction error is lower bounded by a constant multiple of $1/\sqrt{n}$. For convex regularizers, this lower bound applies to all global optima. The theory is applicable to many popular estimators, including convex $\ell_{1}$-based methods as well as M-estimators based on nonconvex regularizers such as the SCAD penalty and the MCP regularizer. In addition, we show that bad local optima are very common, in that a broad class of local minimization algorithms with random initialization typically converge to a bad solution.
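The estimator class studied in the abstract minimizes a least-squares cost plus a coordinate-wise separable penalty; its simplest convex instance is the $\ell_{1}$-penalized lasso. As a minimal illustrative sketch (not code from the paper; the problem sizes, noise level, and regularization weight below are made-up values), coordinate descent for this objective reduces to repeated soft-thresholding:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 penalty: sign(z) * max(|z| - t, 0).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iters=200):
    """Coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1."""
    n, d = X.shape
    b = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature x_j^T x_j / n
    r = y - X @ b                      # running residual
    for _ in range(n_iters):
        for j in range(d):
            r += X[:, j] * b[j]        # partial residual excluding coordinate j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

# Synthetic sparse regression instance (illustrative parameters only).
rng = np.random.default_rng(0)
n, d, k = 200, 50, 3
X = rng.standard_normal((n, d))
beta_star = np.zeros(d)
beta_star[:k] = 1.0
y = X @ beta_star + 0.1 * rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=0.05)
# In-sample prediction error (1/n)||X(b_hat - beta_star)||^2, the quantity
# whose rate the paper's lower bounds concern.
pred_err = np.mean((X @ (b_hat - beta_star)) ** 2)
```

Each coordinate update solves the one-dimensional subproblem exactly, which is what makes the separable structure of the penalty essential; the paper's lower bounds apply to this entire class, convex or not.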