Results 1 - 8 of 8
Minimax Estimation via Wavelet Shrinkage
, 1992
Abstract

Cited by 246 (32 self)
We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel- and Besov-type smoothness constraints, and asymptotically minimax over Besov bodies with p ≤ q. Linear estimates cannot achieve even the minimax rates over Triebel and Besov classes with p < 2, so our method can significantly outperform every linear method (kernel, smoothing spline, sieve, ...) in a minimax sense. Variants of our method based on simple threshold nonlinearities are nearly minimax. Our method possesses the interpretation of spatial adaptivity: it reconstructs using a kernel which may vary in shape and bandwidth from point to point, depending on the data. Least favorable distributions for certain of the Triebel and Besov scales generate objects with sparse wavelet transforms. Many real objects have similarly sparse transforms, which suggests that these minimax results are relevant for practical problems. Sequels to this paper discuss practical implementation, spatial adaptation properties and applications to inverse problems.
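A minimal sketch of the shrinkage idea described above, assuming a Haar basis and a soft-threshold rule; the function names and the sigma*sqrt(2 log n) threshold are illustrative choices, not the paper's exact construction:

```python
import numpy as np

def haar_dwt(x):
    """Full Haar decomposition of a length-2^J signal: returns the
    coarsest approximation and the detail coefficients per level."""
    x = np.asarray(x, dtype=float)
    details = []
    while x.size > 1:
        details.append((x[0::2] - x[1::2]) / np.sqrt(2))  # detail coefficients
        x = (x[0::2] + x[1::2]) / np.sqrt(2)              # coarser approximation
    return x, details

def haar_idwt(approx, details):
    """Invert haar_dwt."""
    x = approx
    for d in reversed(details):
        out = np.empty(2 * d.size)
        out[0::2] = (x + d) / np.sqrt(2)
        out[1::2] = (x - d) / np.sqrt(2)
        x = out
    return x

def soft_threshold(c, t):
    """Pull each coefficient toward zero by t; small ones become exactly 0."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def wavelet_shrink(y, sigma):
    """Denoise y by shrinking its empirical Haar detail coefficients.
    The sigma*sqrt(2 log n) threshold is one illustrative tuning."""
    approx, details = haar_dwt(y)
    t = sigma * np.sqrt(2.0 * np.log(y.size))
    return haar_idwt(approx, [soft_threshold(d, t) for d in details])
```

Because the threshold map sets small coefficients exactly to zero, the reconstruction adapts locally: smooth stretches of the data are represented by a few surviving coarse coefficients, while sharp features keep their fine-scale ones.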
Wavelet shrinkage: asymptopia
 Journal of the Royal Statistical Society, Ser. B
, 1995
Abstract

Cited by 239 (35 self)
Considerable effort has been directed recently to develop asymptotically minimax methods in problems of recovering infinite-dimensional objects (curves, densities, spectral densities, images) from noisy data. A rich and complex body of work has evolved, with nearly or exactly minimax estimators being obtained for a variety of interesting problems. Unfortunately, the results have often not been translated into practice, for a variety of reasons: sometimes similarity to known methods, sometimes computational intractability, and sometimes lack of spatial adaptivity. We discuss a method for curve estimation based on n noisy data points; one translates the empirical wavelet coefficients towards the origin by an amount sqrt(2 log(n))/sqrt(n). The method is different from methods in common use today, is computationally practical, and is spatially adaptive; thus it avoids a number of previous objections to minimax estimators. At the same time, the method is nearly minimax for a wide variety of loss functions (e.g. pointwise error, global error measured in L^p norms, pointwise and global error in estimation of derivatives) and for a wide range of smoothness classes, including standard Hölder classes, Sobolev classes, and Bounded Variation. This is a much broader near-optimality than anything previously proposed in the minimax literature. Finally, the theory underlying the method is interesting, as it exploits a correspondence between statistical questions and questions of optimal recovery and information-based complexity.
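The translate-towards-the-origin rule quoted in this abstract is simple enough to state directly in code; the function name and the noise-level parameter are illustrative assumptions:

```python
import numpy as np

def translate_to_origin(coeffs, n, noise=1.0):
    """Translate each empirical wavelet coefficient towards the origin
    by noise * sqrt(2 log n) / sqrt(n), the amount quoted in the
    abstract; coefficients smaller than that amount are set exactly
    to zero (soft thresholding)."""
    t = noise * np.sqrt(2.0 * np.log(n)) / np.sqrt(n)
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```

For n = 100 the translation amount is sqrt(2 log 100)/10, roughly 0.30, so a coefficient of 0.1 is zeroed while a coefficient of 1.0 is merely shrunk.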
Nonlinear solution of linear inverse problems by wavelet-vaguelette decomposition
, 1992
Abstract

Cited by 182 (12 self)
We describe the Wavelet-Vaguelette Decomposition (WVD) of a linear inverse problem. It is a substitute for the singular value decomposition (SVD) of an inverse problem, and it exists for a class of special inverse problems of homogeneous type, such as numerical differentiation, inversion of Abel-type transforms, certain convolution transforms, and the Radon Transform. We propose to solve ill-posed linear inverse problems by nonlinearly "shrinking" the WVD coefficients of the noisy, indirect data. Our approach offers significant advantages over traditional SVD inversion in the case of recovering spatially inhomogeneous objects. We suppose that observations are contaminated by white noise and that the object is an unknown element of a Besov space. We prove that nonlinear WVD shrinkage can be tuned to attain the minimax rate of convergence, for L^2 loss, over the entire Besov scale. The important case of Besov spaces B_{p,q}, p < 2, which model spatial inhomogeneity, is included. In comparison, linear procedures, SVD included, cannot attain optimal rates of convergence over such classes in the case p < 2. For example, our methods achieve faster rates of convergence, for objects known to lie in the Bump Algebra or in Bounded Variation, than any linear procedure.
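A schematic of the shrink-then-invert recipe, under the assumption that the data have already been expanded in the vaguelette system; the per-level quasi-singular values kappa and the single sigma*sqrt(2 log n) threshold are illustrative placeholders, not the paper's calibration:

```python
import numpy as np

def soft_threshold(c, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def wvd_shrink(vaguelette_coeffs, kappa, sigma, n):
    """Nonlinear WVD inversion sketch: threshold the noisy vaguelette
    coefficients of the indirect data level by level, then divide by
    the quasi-singular value kappa[j] of that level to recover the
    wavelet coefficients of the object."""
    t = sigma * np.sqrt(2.0 * np.log(n))
    return [soft_threshold(np.asarray(c, dtype=float), t) / kappa[j]
            for j, c in enumerate(vaguelette_coeffs)]
```

Thresholding before dividing by kappa is the point: noise is suppressed while it is still homoscedastic, instead of being amplified by the ill-posed inversion as in truncated-SVD methods.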
Unconditional bases are optimal bases for data compression and for statistical estimation
 Applied and Computational Harmonic Analysis
, 1993
Abstract

Cited by 140 (23 self)
An orthogonal basis of L^2 which is also an unconditional basis of a functional space F is a kind of optimal basis for compressing, estimating, and recovering functions in F. Simple thresholding operations, applied in the unconditional basis, work essentially better for compressing, estimating, and recovering than they do in any other orthogonal basis. In fact, simple thresholding in an unconditional basis works essentially better for recovery and estimation than other methods, period. (Performance is measured in an asymptotic minimax sense.) As an application, we formalize and prove Mallat's Heuristic, which says that wavelet bases are optimal for representing functions containing singularities, when there may be an arbitrary number of singularities, arbitrarily distributed.
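The "simple thresholding operations" for compression amount to keeping the largest-magnitude coefficients in the basis; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def best_k_term(coeffs, k):
    """Best k-term approximation: keep the k largest-magnitude
    coefficients and zero the rest. In an unconditional basis this
    simple rule is essentially the best compression strategy."""
    c = np.asarray(coeffs, dtype=float)
    out = np.zeros_like(c)
    if k > 0:
        idx = np.argsort(np.abs(c))[-k:]   # indices of the k largest |c|
        out[idx] = c[idx]
    return out
```

For a function with sparse wavelet coefficients, a small k already captures most of the energy, which is the compression side of the optimality claim.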
Density estimation by wavelet thresholding
 Ann. Statist
, 1996
Abstract

Cited by 139 (8 self)
Density estimation is a commonly used test case for nonparametric estimation methods. We explore the asymptotic properties of estimators based on thresholding of empirical wavelet coefficients. Minimax rates of convergence are studied over a large range of Besov function classes B^s_{p,q} and for a range of global L^{p'} error measures, 1 ≤ p' < ∞. A single wavelet threshold estimator is asymptotically minimax within logarithmic terms simultaneously over a range of spaces and error measures. In particular, when p' > p, some form of nonlinearity is essential, since the minimax linear estimators are suboptimal by polynomial powers of n. A second approach, using an approximation of a Gaussian white noise model in a Mallows metric, is used to attain exactly optimal rates of convergence for quadratic error (p' = 2).
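A sketch of the recipe for a sample on [0,1), assuming a Haar basis and an illustrative threshold constant; this is not the paper's calibrated estimator:

```python
import numpy as np

def haar_density_estimate(x, J=4, c=1.0):
    """Estimate a density on [0,1) by thresholding empirical Haar
    wavelet coefficients. Bin counts give the fine-scale scaling
    coefficients; a Haar transform yields the details, which are
    soft-thresholded and inverted. The constant c is illustrative."""
    x = np.asarray(x, dtype=float)
    n = x.size
    m = 2 ** J
    counts = np.bincount(np.minimum((x * m).astype(int), m - 1), minlength=m)
    s = counts / n * np.sqrt(m)            # empirical scaling coefficients
    details = []
    while s.size > 1:                      # Haar decomposition
        details.append((s[0::2] - s[1::2]) / np.sqrt(2))
        s = (s[0::2] + s[1::2]) / np.sqrt(2)
    t = c * np.sqrt(np.log(n) / n)         # illustrative threshold level
    details = [np.sign(d) * np.maximum(np.abs(d) - t, 0.0) for d in details]
    for d in reversed(details):            # invert the transform
        out = np.empty(2 * d.size)
        out[0::2] = (s + d) / np.sqrt(2)
        out[1::2] = (s - d) / np.sqrt(2)
        s = out
    return s * np.sqrt(m)                  # density value on each of m bins
```

With c = 0 the estimator reduces to a histogram with m bins; a very large c kills every detail coefficient and leaves only the flat average.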
Minimax Bayes, asymptotic minimax and sparse wavelet priors, in
 Sciences Paris (A
, 1994
Abstract

Cited by 35 (9 self)
Pinsker (1980) gave a precise asymptotic evaluation of the minimax mean squared error of estimation of a signal in Gaussian noise when the signal is known a priori to lie in a compact ellipsoid in Hilbert space. This 'Minimax Bayes' method can be applied to a variety of global nonparametric estimation settings with parameter spaces far from ellipsoidal. For example it leads to a theory of exact asymptotic minimax estimation over norm balls in Besov and Triebel spaces using simple coordinatewise estimators and wavelet bases. This paper outlines some features of the method common to several applications. In particular, we derive new results on the exact asymptotic minimax risk over weak ℓ_p balls in R^n as n → ∞, and also for a class of 'local' estimators on the Triebel scale. By its very nature, the method reveals the structure of asymptotically least favorable distributions. Thus we may simulate 'least favorable' sample paths. We illustrate this for estimation of a signal in Gaussian white noise over norm balls in certain Besov spaces. In wavelet bases, when p < 2, the least favorable priors are sparse, and the resulting sample paths are strikingly different from those observed in Pinsker's ellipsoidal setting (p = 2).
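Sample paths from a sparse two-point prior of the kind described can be simulated directly; the sparsity level, spike height, and level decay below are illustrative values, not the calibrated least-favorable ones:

```python
import numpy as np

def sparse_prior_path(J, eps=0.05, mu=3.0, seed=0):
    """Draw Haar wavelet coefficients from a sparse two-point prior
    (zero with prob. 1-eps, +/- mu with prob. eps), with an
    illustrative 2^(-j/2) decay across levels, and synthesize the
    length-2^J sample path by inverting a Haar transform."""
    rng = np.random.default_rng(seed)
    details = []
    for j in range(J):
        njp = 2 ** j
        spikes = rng.random(njp) < eps                  # which coefficients fire
        signs = rng.choice([-1.0, 1.0], size=njp)
        details.append(mu * spikes * signs * 2.0 ** (-j / 2))
    x = np.zeros(1)                                     # zero coarse coefficient
    for d in details:                                   # coarse-to-fine inverse
        out = np.empty(2 * d.size)
        out[0::2] = (x + d) / np.sqrt(2)
        out[1::2] = (x - d) / np.sqrt(2)
        x = out
    return x
```

Plotting a few such paths shows isolated localized bumps, quite unlike the stationary roughness produced by Gaussian priors in the ellipsoidal (p = 2) setting.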
Wavelet Estimators, Global Error Measures: Revisited
 Technical Report. IRISA-INRIA. Available at http://www.irisa.fr
, 1993
Abstract

Cited by 16 (3 self)
In this paper minimax rates of convergence for wavelet estimators are studied. For the problems of density estimation and nonparametric regression we establish upper bounds over a large range of functional classes and global error measures. The constructed estimator is simultaneously minimax (up to a constant) for all L_π error measures, 0 < π ≤ ∞. Keywords: minimax estimation, density estimation, nonparametric regression, Besov spaces, wavelet estimators.
On Minimax Filtering over Ellipsoids
 Math. Meth. Statist
, 1995
Abstract

Cited by 4 (2 self)
In this article, developing further the approach of [9], we describe the second-order behaviour of the minimax estimators and the quadratic minimax risk for the model (1)-(2). These results are illustrated by a number of examples. The authors are grateful to G.K. Golubev for a number of comments resulting in the improvement of some results of the paper and their better presentation.
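Minimax linear filtering over an ellipsoid uses Pinsker-type coordinatewise weights; a sketch, with the cutoff parameter mu left as a free input rather than solved from the ellipsoid radius and noise level as in the full theory:

```python
import numpy as np

def pinsker_weights(a, mu):
    """Coordinatewise linear filter weights (1 - a_k/mu)_+ for an
    ellipsoid with semi-axis coefficients a_k. The cutoff mu is a
    hypothetical input here; in the theory it is determined by the
    ellipsoid radius and the noise level."""
    return np.maximum(1.0 - np.asarray(a, dtype=float) / mu, 0.0)

def linear_minimax_filter(y, a, mu):
    """Apply the weights coordinatewise to observed coefficients y."""
    return pinsker_weights(a, mu) * np.asarray(y, dtype=float)
```

Coordinates with a_k ≥ mu are discarded entirely, so the filter shrinks progressively harder in directions where the ellipsoid constrains the signal more tightly.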