Journal: Sankhya, Series A: Mathematical Statistics and Probability
Print ISSN: 0976-836X
Electronic ISSN: 0976-8378
Publication year: 2017
Volume: 79
Issue: 2
Pages: 298-335
DOI: 10.1007/s13171-017-0107-5
Language: English
Publisher: Indian Statistical Institute
Abstract: Estimator selection has become a crucial issue in nonparametric estimation. Two widely used methods are penalized empirical risk minimization (such as penalized log-likelihood estimation) and pairwise comparison (such as Lepski's method). Our aim in this paper is twofold. First, we explain some general ideas about the calibration issue of estimator selection methods. We review some known results, putting the emphasis on the concept of minimal penalty, which is helpful for designing data-driven selection criteria. Second, we present a new method for bandwidth selection within the framework of kernel density estimation which is in some sense intermediate between the two main methods mentioned above. We provide theoretical results which lead to a fully data-driven selection strategy.
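To make the ideas in the abstract concrete, the sketch below illustrates a generic penalized bandwidth-selection criterion for kernel density estimation: each candidate bandwidth is scored by its L2 distance to a deliberately overfitting reference estimate plus a penalty shrinking like 1/(nh). This is only an illustrative sketch; the constant `lam`, the Gaussian kernel, the reference choice, and the helper names are assumptions for demonstration, not the authors' exact procedure or calibration.

```python
import numpy as np

def gaussian_kernel_density(x_eval, data, h):
    """Gaussian kernel density estimate with bandwidth h, evaluated at x_eval."""
    u = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def select_bandwidth(data, bandwidths, lam=1.0):
    """Pick the bandwidth minimizing (squared L2 distance to the most
    overfitting estimate) + lam * ||K||^2 / (n * h).  The constant lam is
    an illustrative placeholder, not a calibrated value."""
    n = len(data)
    grid = np.linspace(data.min() - 1.0, data.max() + 1.0, 512)
    dx = grid[1] - grid[0]
    h_ref = min(bandwidths)
    f_ref = gaussian_kernel_density(grid, data, h_ref)  # overfitting reference
    norm_K_sq = 1.0 / (2.0 * np.sqrt(np.pi))            # ||K||_2^2 for the Gaussian kernel
    best_h, best_crit = None, np.inf
    for h in bandwidths:
        f_h = gaussian_kernel_density(grid, data, h)
        dist = np.sum((f_h - f_ref) ** 2) * dx          # squared L2 distance on the grid
        pen = lam * norm_K_sq / (n * h)                 # penalty decreasing in n * h
        crit = dist + pen
        if crit < best_crit:
            best_h, best_crit = h, crit
    return best_h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(size=500)
    candidate_bandwidths = np.geomspace(0.02, 1.0, 30)
    print("selected bandwidth:", select_bandwidth(sample, candidate_bandwidths))
```

The calibration question discussed in the abstract amounts to choosing the penalty constant (here `lam`): too small a penalty selects a degenerate, overfitting bandwidth, and the minimal-penalty phenomenon can be used to detect that threshold from the data.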