Abstract: We study the problem of estimating an unknown vector $\theta$ from an observation $X$ drawn according to the normal distribution with mean $\theta$ and identity covariance matrix, under the knowledge that $\theta$ belongs to a known closed convex set $\Theta$. In this general setting, Chatterjee (2014) proved that the natural constrained least squares estimator is “approximately admissible” for every $\Theta$. We extend this result by proving that the same property holds for all convex penalized estimators as well. Moreover, we simplify and shorten the original proof considerably. We also provide explicit upper and lower bounds for the universal constant underlying the notion of approximate admissibility.
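
To make the setting concrete, the following is a minimal sketch of the model and of the two estimators the abstract refers to; the symbols $\hat\theta_{\mathrm{LS}}$ and $\hat\theta_f$, as well as the $\tfrac12$ normalization in the penalized criterion, are illustrative assumptions rather than notation taken from the paper:
$$
X \sim N_n(\theta, I_n), \qquad \theta \in \Theta \subseteq \mathbb{R}^n \ \text{closed and convex},
$$
$$
\hat\theta_{\mathrm{LS}}(X) \;=\; \operatorname*{arg\,min}_{t \in \Theta} \|X - t\|^2 \qquad \text{(constrained least squares)},
$$
$$
\hat\theta_f(X) \;=\; \operatorname*{arg\,min}_{t \in \mathbb{R}^n} \Bigl\{\tfrac12\|X - t\|^2 + f(t)\Bigr\}, \qquad f \ \text{convex} \qquad \text{(convex penalized estimator)}.
$$
Taking $f$ to be the convex indicator of $\Theta$ (zero on $\Theta$, $+\infty$ outside) recovers the constrained least squares estimator, which is why the family of convex penalized estimators generalizes the constrained case.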