Results 1–10 of 19
Sharp Adaptation for Inverse Problems With Random Noise
, 2000
"... We consider a heteroscedastic sequence space setup with polynomially increasing variances of observations that allows to treat a number of inverse problems, in particular multivariate ones. We propose an adaptive estimator that attains simultaneously exact asymptotic minimax constants on every ellip ..."
Abstract

Cited by 84 (8 self)
 Add to MetaCart
We consider a heteroscedastic sequence space setup with polynomially increasing variances of observations that allows us to treat a number of inverse problems, in particular multivariate ones. We propose an adaptive estimator that simultaneously attains the exact asymptotic minimax constants on every ellipsoid of functions within a wide scale (including ellipsoids with polynomially and exponentially decreasing axes) and, at the same time, satisfies asymptotically exact oracle inequalities within any class of linear estimates having monotone nondecreasing weights. As an application, we construct sharp adaptive estimators for the problems of deconvolution and tomography.
Oracle Inequalities for Inverse Problems
, 2000
"... We consider a sequence space model of statistical linear inverse problems where we need to estimate a function f from indirect noisy observations. Let a finite set of linear estimators be given. Our aim is to mimic the estimator in that has the smallest risk on the true f . Under general conditions, ..."
Abstract

Cited by 75 (9 self)
 Add to MetaCart
We consider a sequence space model of statistical linear inverse problems in which we need to estimate a function f from indirect noisy observations. Let a finite set of linear estimators be given. Our aim is to mimic the estimator in this set that has the smallest risk on the true f. Under general conditions, we show that this can be achieved by simple minimization of an unbiased risk estimator, provided the singular values of the operator of the inverse problem decrease as a power law. The main result is a nonasymptotic oracle inequality that is shown to be asymptotically exact. This inequality can also be used to obtain sharp minimax adaptive results. In particular, we apply it to show that minimax adaptation on ellipsoids in the multivariate anisotropic case is realized by minimization of the unbiased risk estimator without any loss of efficiency with respect to optimal nonadaptive procedures. Mathematics Subject Classifications: 62G05, 62G20. Key words: statistical inverse problems, oracle inequalities.
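The selection principle in this abstract — picking from a finite family the estimator that minimizes an unbiased risk estimate — can be sketched in a simplified direct (non-inverse) sequence model. The model y_k = θ_k + σ_k ξ_k, the diagonal-weights form, and all function names below are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def ure(weights, y, sigma2):
    """Unbiased risk estimate for the diagonal linear estimator
    theta_hat_k = weights_k * y_k in the Gaussian sequence model
    y_k = theta_k + sqrt(sigma2_k) * xi_k (a simplified setup).
    Uses E[y_k^2 - sigma2_k] = theta_k^2 to replace the unknown
    theta_k^2 in the exact risk formula."""
    return float(np.sum((1 - weights) ** 2 * (y ** 2 - sigma2)
                        + weights ** 2 * sigma2))

def select_estimator(candidates, y, sigma2):
    """Mimic the oracle: return the candidate weight sequence with
    the smallest unbiased risk estimate."""
    scores = [ure(w, y, sigma2) for w in candidates]
    return candidates[int(np.argmin(scores))]

# Large signal, small noise: keeping the data (weights = 1) should win
# over killing it (weights = 0).
sigma2 = np.full(5, 0.01)
y = np.arange(1.0, 6.0)
candidates = [np.ones(5), np.zeros(5)]
best = select_estimator(candidates, y, sigma2)
```

Since E[y_k² − σ_k²] = θ_k², the criterion is an unbiased estimate of the quadratic risk of each candidate, so its minimizer mimics the risk-oracle up to stochastic error; the paper's oracle inequality quantifies that error in the inverse-problem setting.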
Linear and convex aggregation of density estimators
, 2004
"... We study the problem of learning the best linear and convex combination of M estimators of a density with respect to the mean squared risk. We suggest aggregation procedures and we prove sharp oracle inequalities for their risks, i.e., oracle inequalities with leading constant 1. We also obtain lowe ..."
Abstract

Cited by 37 (1 self)
 Add to MetaCart
(Show Context)
We study the problem of learning the best linear and convex combination of M estimators of a density with respect to the mean squared risk. We suggest aggregation procedures and prove sharp oracle inequalities for their risks, i.e., oracle inequalities with leading constant 1. We also obtain lower bounds showing that these procedures attain optimal rates of aggregation. As an example, we consider aggregation of multivariate kernel density estimators with different bandwidths. We show that linear and convex aggregates mimic the kernel oracles in an asymptotically exact sense. We prove that, for Pinsker’s kernel, the proposed aggregates are sharp asymptotically minimax simultaneously over a large scale of Sobolev classes of densities. Finally, we provide simulations demonstrating the performance of the convex aggregation procedure.
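As a rough illustration of the kind of procedure studied here, the following toy sketch aggregates two Gaussian kernel density estimators with different bandwidths by choosing the convex weight that minimizes an empirical L2-risk proxy on a hold-out sample. The split-sample scheme, the grid search, and all names are our assumptions, not the paper's aggregation procedure (which comes with sharp oracle inequalities):

```python
import numpy as np

def gauss_kde(train, h):
    """Return a univariate Gaussian kernel density estimator
    with bandwidth h, built on the training sample."""
    def f(x):
        x = np.atleast_1d(x)
        z = (x[:, None] - train[None, :]) / h
        return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(train) * h * np.sqrt(2 * np.pi))
    return f

def convex_aggregate(train, holdout, bandwidths):
    """Two-estimator convex aggregation: pick alpha in [0, 1] that
    minimizes the empirical L2-risk proxy
        int f_alpha^2 - (2/n) sum_i f_alpha(X_i)
    on the hold-out sample (the constant int f_true^2 is dropped)."""
    f1, f2 = (gauss_kde(train, h) for h in bandwidths)
    xs = np.linspace(holdout.min() - 3, holdout.max() + 3, 512)
    dx = xs[1] - xs[0]
    best = None
    for a in np.linspace(0.0, 1.0, 101):
        fa = a * f1(xs) + (1 - a) * f2(xs)
        risk = np.sum(fa ** 2) * dx - 2 * (a * f1(holdout) + (1 - a) * f2(holdout)).mean()
        if best is None or risk < best[0]:
            best = (risk, a)
    return best[1]

train = np.linspace(-2.0, 2.0, 40)
holdout = np.linspace(-1.5, 1.5, 30)
alpha = convex_aggregate(train, holdout, (0.3, 3.0))
```

Because the grid contains the endpoints 0 and 1, the aggregate's estimated risk is never worse than that of either single-bandwidth estimator — a crude finite-grid analogue of mimicking the kernel oracle.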
General empirical Bayes wavelet methods and exactly adaptive minimax estimation
, 2005
"... In many statistical problems, stochastic signals can be represented as a sequence of noisy wavelet coefficients. In this paper, we develop general empirical Bayes methods for the estimation of true signal. Our estimators approximate certain oracle separable rules and achieve adaptation to ideal risk ..."
Abstract

Cited by 25 (3 self)
 Add to MetaCart
In many statistical problems, stochastic signals can be represented as a sequence of noisy wavelet coefficients. In this paper, we develop general empirical Bayes methods for the estimation of the true signal. Our estimators approximate certain oracle separable rules and achieve adaptation to ideal risks and exact minimax risks over broad collections of classes of signals. In particular, our estimators are uniformly adaptive to the minimum risk of separable estimators and to the exact minimax risks simultaneously in Besov balls of all smoothness and shape indices, and they are uniformly superefficient in convergence rates on all compact sets in Besov spaces with a finite secondary shape parameter. Furthermore, in classes nested between Besov balls of the same smoothness index, our estimators dominate threshold and James–Stein estimators within an infinitesimal fraction of the minimax risks. More general block empirical Bayes estimators are also developed. Both white noise with drift and nonparametric regression are considered.
On minimax density estimation on R
 Bernoulli
, 2004
"... Abstract: the problem of density estimation on R from an independent sample X1,...XN with common density f is concerned. The behavior of the minimax Lprisk, 1 ≤ p ≤ ∞, is studied when f belongs to a Hölder class of regularity s on the real line. The lower bound for the minimax risk is provided. We ..."
Abstract

Cited by 21 (0 self)
 Add to MetaCart
(Show Context)
We consider the problem of density estimation on R from an independent sample X_1, ..., X_N with common density f. The behavior of the minimax L_p-risk, 1 ≤ p ≤ ∞, is studied when f belongs to a Hölder class of regularity s on the real line. A lower bound for the minimax risk is provided. We show that the linear estimator is not efficient in this setting and construct a wavelet adaptive estimator which attains (up to a logarithmic factor in N) the lower bounds involved. We show that the minimax risk depends on the parameter p when p < 2 + 1/s. Key words: nonparametric density estimation, minimax estimation, adaptive estimation.
Exact adaptive pointwise estimation on Sobolev classes of densities
 ESAIM Probab. Statist
, 2001
"... Abstract. The subject of this paper is to estimate adaptively the common probability density of n independent, identically distributed random variables. The estimation is done at a xed point x0 2 R, over the density functions that belong to the Sobolev class Wn (;L). We consider the adaptive problem ..."
Abstract

Cited by 13 (1 self)
 Add to MetaCart
(Show Context)
The subject of this paper is to estimate adaptively the common probability density of n independent, identically distributed random variables. The estimation is done at a fixed point x_0 ∈ R, over the density functions that belong to the Sobolev class W(β, L). We consider the adaptive problem setup, where the regularity parameter β is unknown and varies in a given set B_n. A sharp adaptive estimator is obtained, and the explicit asymptotic constant associated with its rate of convergence is found. Mathematics Subject Classification: 62N01, 62N02, 62G20.
Estimation of the density of regression errors
 The Annals of Statistics
, 2005
"... ..."
(Show Context)
Sparse density estimation with ℓ1 penalties
 In Proceedings of the 20th Annual Conference on Learning Theory (COLT 2007), 2007
"... Abstract. This paper studies oracle properties of ℓ1penalized estimators of a probability density. We show that the penalized least squares estimator satisfies sparsity oracle inequalities, i.e., bounds in terms of the number of nonzero components of the oracle vector. The results are valid even w ..."
Abstract

Cited by 9 (2 self)
 Add to MetaCart
(Show Context)
This paper studies oracle properties of ℓ1-penalized estimators of a probability density. We show that the penalized least squares estimator satisfies sparsity oracle inequalities, i.e., bounds in terms of the number of nonzero components of the oracle vector. The results are valid even when the dimension of the model is (much) larger than the sample size. They are applied to estimation in sparse high-dimensional mixture models, to nonparametric adaptive density estimation and to the problem of aggregation of density estimators.
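A minimal sketch of an ℓ1-penalized least-squares density estimate over a finite dictionary, in the spirit of this abstract; the proximal-gradient solver, the discretized Gram matrix, and all names are illustrative assumptions (the paper's estimator and penalty weights differ in detail):

```python
import numpy as np

def soft_threshold(v, t):
    """Componentwise soft-thresholding, the proximal map of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_density_fit(sample, dict_funcs, xs, pen, n_iter=500):
    """Fit mixture weights lam over a dictionary of functions f_j by
    minimizing  lam' G lam - 2 c' lam + pen * ||lam||_1,  where
    G_jk ~ int f_j f_k (approximated on the grid xs) and
    c_j = (1/n) sum_i f_j(X_i), via proximal gradient descent."""
    dx = xs[1] - xs[0]
    F = np.stack([f(xs) for f in dict_funcs])   # M x len(xs) evaluations
    G = F @ F.T * dx                            # approximate Gram matrix
    c = np.array([f(sample).mean() for f in dict_funcs])
    step = 1.0 / (2 * np.linalg.eigvalsh(G).max())
    lam = np.zeros(len(dict_funcs))
    for _ in range(n_iter):
        lam = soft_threshold(lam - step * (2 * G @ lam - 2 * c), step * pen)
    return lam

# Dictionary of two Gaussian bumps; the sample sits near 0, so the
# penalty should zero out the irrelevant bump at 5.
phi0 = lambda x: np.exp(-0.5 * np.asarray(x) ** 2) / np.sqrt(2 * np.pi)
phi5 = lambda x: np.exp(-0.5 * (np.asarray(x) - 5) ** 2) / np.sqrt(2 * np.pi)
xs = np.linspace(-10.0, 15.0, 2001)
sample = np.linspace(-1.0, 1.0, 50)
lam = l1_density_fit(sample, [phi0, phi5], xs, pen=0.1)
```

The point of the sparsity oracle inequality is that the risk of such an estimator scales with the number of nonzero oracle coefficients rather than with the dictionary size, so the dictionary may be much larger than the sample.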
Adaptive estimation of and oracle inequalities for probability densities
, 2004
"... The theory of adaptive estimation and oracle inequalities for the case of Gaussianshift–finiteinterval experiments has made significant progress in recent years. In particular, sharpminimax adaptive estimators and exact exponentialtype oracle inequalities have been suggested for a vast set of fu ..."
Abstract

Cited by 6 (1 self)
 Add to MetaCart
(Show Context)
The theory of adaptive estimation and oracle inequalities for the case of Gaussian-shift finite-interval experiments has made significant progress in recent years. In particular, sharp-minimax adaptive estimators and exact exponential-type oracle inequalities have been suggested for a vast set of functions, including analytic and Sobolev with any positive index, as well as for Efromovich–Pinsker and Stein blockwise-shrinkage estimators. Is it possible to obtain similar results for the more interesting applied problem of density estimation and/or the dual problem of characteristic function estimation? The answer is “yes.” In particular, the obtained results include exact exponential-type oracle inequalities which allow one to consider, for the first time in the literature, simultaneous sharp-minimax estimation of Sobolev densities with any positive index (not necessarily larger than 1/2), of infinitely differentiable densities (including analytic, entire and stable), as well as of not absolutely integrable characteristic functions. The same adaptive estimator is also rate minimax over a familiar class of distributions with bounded spectrum, where the density and the characteristic function can be estimated with the parametric rate.