Journal: Sankhya. Series A, mathematical statistics and probability
Print ISSN: 0976-836X
Online ISSN: 0976-8378
Year: 2011
Volume: 73
Issue: 01
Pages: 55--78
Publisher: Indian Statistical Institute
Abstract: In this paper, we study the strong consistency and rates of convergence
of the Lasso estimator. It is shown that when the error variables have a
finite mean, the Lasso estimator is strongly consistent, provided the penalty
parameter (say, λ_n) is of smaller order than the sample size (say, n). We
also show that this condition on λ_n cannot be relaxed. More specifically,
we show that consistency of the Lasso estimators fails in the cases where
λ_n/n → a for some a ∈ (0, ∞]. For error variables with a finite αth moment,
1 < α ≤ 2, we also obtain convergence rates of the Lasso estimator to
the true parameter. It is noted that the convergence rates of the Lasso
estimators of the non-zero components of the regression parameter vector can
be worse than those of the corresponding least squares estimators. However, when
the design matrix satisfies some orthogonality conditions, the Lasso estimators
of the zero components are surprisingly accurate: the Lasso recovers the zero
components exactly, for large n, almost surely.
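
To make the abstract's penalty-growth condition concrete, the sketch below is an illustration of ours, not code or notation from the paper: it fits the Lasso with a penalty growing slower than n (λ_n = √n, so λ_n/n → 0) and with one growing like n (λ_n = n/2, so λ_n/n → 1/2 ∈ (0, ∞]). The mapping alpha = λ_n/(2n) follows from scikit-learn's documented objective (1/(2n))·‖y − Xβ‖² + alpha·‖β‖₁; the design, true coefficients, and error distribution are arbitrary choices for illustration.

```python
# Illustrative sketch (not from the paper): how the growth rate of the
# penalty lambda_n relative to n affects Lasso consistency.
# scikit-learn's Lasso minimizes (1/(2n))*||y - X b||^2 + alpha*||b||_1,
# so a penalty lambda_n in the abstract's parameterization corresponds
# to alpha = lambda_n / (2n).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
beta_true = np.array([3.0, 0.0, -2.0, 0.0, 1.5])  # some components exactly zero

for n in (200, 2000, 20000):
    X = rng.normal(size=(n, beta_true.size))
    y = X @ beta_true + rng.standard_t(df=3, size=n)  # heavy-tailed errors with finite mean

    # lambda_n = sqrt(n): lambda_n / n -> 0, consistency expected
    lam_small = np.sqrt(n)
    # lambda_n = n / 2: lambda_n / n -> 1/2 in (0, inf], consistency fails
    lam_large = 0.5 * n

    fit_small = Lasso(alpha=lam_small / (2 * n)).fit(X, y)
    fit_large = Lasso(alpha=lam_large / (2 * n)).fit(X, y)

    print(n,
          np.round(fit_small.coef_, 2),   # approaches beta_true as n grows
          np.round(fit_large.coef_, 2))   # remains shrunk away from beta_true
```

With the slowly growing penalty the estimates approach the true coefficients as n increases, whereas with λ_n proportional to n they stay shrunk away from the non-zero components, in line with the inconsistency result stated in the abstract.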