Abstract: We consider the sparse regression model in which the number of parameters p is larger than the sample size n. The difficulty in high-dimensional problems is to propose estimators that achieve a good compromise between statistical and computational performance. The Lasso is the solution of a convex minimization problem, hence computable for large values of p. However, stringent conditions on the design are required to establish fast rates of convergence for this estimator. Dalalyan and Tsybakov [17–19] proposed an exponential weights procedure achieving a good compromise between the statistical and computational aspects. This estimator can be computed for reasonably large p and satisfies a sparsity oracle inequality in expectation for the empirical excess risk under only mild assumptions on the design. In this paper, we propose an exponential weights estimator similar to that of [17] but with improved statistical performance. Our main result is a sparsity oracle inequality in probability for the true excess risk.
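For orientation, a generic exponentially weighted aggregate in the spirit of [17–19] can be sketched as a posterior mean under a sparsity-favoring prior; the notation below (prior $\pi$, temperature $\beta$, design $X$) is illustrative and not necessarily that used in the paper. In the model $Y = X\theta^{*} + \xi$ with $X \in \mathbb{R}^{n \times p}$ and possibly $p \gg n$, such an estimator takes the form
\[
  \hat{\theta}^{\mathrm{EW}}
    = \int_{\mathbb{R}^{p}} \theta \, \hat{\pi}(\mathrm{d}\theta),
  \qquad
  \hat{\pi}(\mathrm{d}\theta)
    \propto \exp\!\Big(-\tfrac{1}{\beta}\,\|Y - X\theta\|_{2}^{2}\Big)\, \pi(\mathrm{d}\theta),
\]
where $\pi$ is a prior on $\mathbb{R}^{p}$ (typically heavy-tailed, so as to favor sparse vectors) and $\beta > 0$ is a temperature parameter; the integral is in practice approximated by Monte Carlo methods, which is what makes the procedure computable for reasonably large $p$.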