Results 1–10 of 14
De-Noising by Soft-Thresholding
, 1992
Abstract

Cited by 812 (13 self)
Donoho and Johnstone (1992a) proposed a method for reconstructing an unknown function f on [0, 1] from noisy data d_i = f(t_i) + z_i, i = 0, …, n−1, t_i = i/n, z_i iid N(0, 1). The reconstruction f̂_n is defined in the wavelet domain by translating all the empirical wavelet coefficients of d towards 0 by an amount √(2 log n)/√n. We prove two results about that estimator. [Smooth]: With high probability, f̂_n is at least as smooth as f, in any of a wide variety of smoothness measures. [Adapt]: The estimator comes nearly as close in mean square to f as any measurable estimator can come, uniformly over balls in each of two broad scales of smoothness classes. These two properties are unprecedented in several ways. Our proof of these results develops new facts about abstract statistical inference and its connection with an optimal recovery model.
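The thresholding rule this abstract describes reduces to two small functions. The sketch below is an illustrative reimplementation, not the authors' code; it assumes the empirical wavelet coefficients are already available as a NumPy array and takes the noise level as 1, as in the abstract:

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Translate each coefficient towards 0 by t (soft thresholding)."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def universal_threshold(n):
    """The sqrt(2 log n) / sqrt(n) amount quoted in the abstract (noise level 1)."""
    return np.sqrt(2.0 * np.log(n)) / np.sqrt(n)

# Example: coefficients below the threshold are set exactly to 0,
# larger ones are pulled towards 0 by the threshold amount.
d = np.array([0.9, -0.05, 0.02, -0.4])
shrunk = soft_threshold(d, universal_threshold(len(d)))
```

Note that soft thresholding, unlike hard thresholding, is continuous in the data, which is what underlies the [Smooth] property claimed above.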
Minimax Estimation via Wavelet Shrinkage
, 1992
Abstract

Cited by 246 (32 self)
We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel- and Besov-type smoothness constraints, and asymptotically minimax over Besov bodies with p ≤ q. Linear estimates cannot achieve even the minimax rates over Triebel and Besov classes with p < 2, so our method can significantly outperform every linear method (kernel, smoothing spline, sieve, …) in a minimax sense. Variants of our method based on simple threshold nonlinearities are nearly minimax. Our method possesses the interpretation of spatial adaptivity: it reconstructs using a kernel which may vary in shape and bandwidth from point to point, depending on the data. Least favorable distributions for certain of the Triebel and Besov scales generate objects with sparse wavelet transforms. Many real objects have similarly sparse transforms, which suggests that these minimax results are relevant for practical problems. Sequels to this paper discuss practical implementation, spatial adaptation properties and applications to inverse problems.
Wavelet Shrinkage: Asymptopia?
 Journal of the Royal Statistical Society, Ser. B
, 1995
Abstract

Cited by 239 (35 self)
Considerable effort has been directed recently to develop asymptotically minimax methods in problems of recovering infinite-dimensional objects (curves, densities, spectral densities, images) from noisy data. A rich and complex body of work has evolved, with nearly or exactly minimax estimators being obtained for a variety of interesting problems. Unfortunately, the results have often not been translated into practice, for a variety of reasons: sometimes, similarity to known methods, sometimes, computational intractability, and sometimes, lack of spatial adaptivity. We discuss a method for curve estimation based on n noisy data; one translates the empirical wavelet coefficients towards the origin by an amount √(2 log n)/√n. The method is different from methods in common use today, is computationally practical, and is spatially adaptive; thus it avoids a number of previous objections to minimax estimators. At the same time, the method is nearly minimax for a wide variety of loss functions (e.g. pointwise error, global error measured in L^p norms, pointwise and global error in estimation of derivatives) and for a wide range of smoothness classes, including standard Hölder classes, Sobolev classes, and Bounded Variation. This is a much broader near-optimality than anything previously proposed in the minimax literature. Finally, the theory underlying the method is interesting, as it exploits a correspondence between statistical questions and questions of optimal recovery and information-based complexity.
Unconditional bases are optimal bases for data compression and for statistical estimation
 Applied and Computational Harmonic Analysis
, 1993
Abstract

Cited by 142 (23 self)
An orthogonal basis of L² which is also an unconditional basis of a functional space F is a kind of optimal basis for compressing, estimating, and recovering functions in F. Simple thresholding operations, applied in the unconditional basis, work essentially better for compressing, estimating, and recovering than they do in any other orthogonal basis. In fact, simple thresholding in an unconditional basis works essentially better for recovery and estimation than other methods, period. (Performance is measured in an asymptotic minimax sense.) As an application, we formalize and prove Mallat's Heuristic, which says that wavelet bases are optimal for representing functions containing singularities, when there may be an arbitrary number of singularities, arbitrarily distributed.
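The "simple thresholding operation" this abstract refers to can be sketched for compression as follows; this is only a hedged illustration (the basis and the threshold value here are arbitrary, not the paper's construction):

```python
import numpy as np

def threshold_compress(coeffs, t):
    """Compress a coefficient sequence in some orthogonal basis by
    keeping only coefficients with |c| > t and zeroing the rest.
    Returns the compressed sequence and the number of kept terms."""
    kept = np.abs(coeffs) > t
    return np.where(kept, coeffs, 0.0), int(kept.sum())

# Example: a sparse-ish coefficient vector compresses to 3 nonzero terms.
c = np.array([3.0, 0.2, -1.5, 0.05, 0.8])
compressed, n_kept = threshold_compress(c, 0.5)
```

The point of the abstract is that when the basis is unconditional for F, this crude keep-or-kill rule is already near-optimal; no cleverer coefficient selection helps much.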
Density estimation by wavelet thresholding
 Ann. Statist
, 1996
Abstract

Cited by 140 (8 self)
Density estimation is a commonly used test case for nonparametric estimation methods. We explore the asymptotic properties of estimators based on thresholding of empirical wavelet coefficients. Minimax rates of convergence are studied over a large range of Besov function classes B_{s,p,q} and for a range of global L^{p′} error measures, 1 ≤ p′ < ∞. A single wavelet threshold estimator is asymptotically minimax within logarithmic terms simultaneously over a range of spaces and error measures. In particular, when p′ > p, some form of nonlinearity is essential, since the minimax linear estimators are suboptimal by polynomial powers of n. A second approach, using an approximation of a Gaussian white noise model in a Mallows metric, is used to attain exactly optimal rates of convergence for quadratic error (p′ = 2).
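A minimal sketch of the idea of density estimation by thresholding empirical wavelet coefficients, using the Haar basis on [0, 1) for concreteness; the resolution level j_max and threshold t below are arbitrary illustrative choices, not the calibrated values from the paper:

```python
import numpy as np

def haar_density_estimate(x, grid, j_max=4, t=0.1):
    """Estimate a density on [0, 1) from a sample x: compute empirical
    Haar coefficients as sample means of the basis functions,
    soft-threshold the detail coefficients, reconstruct on `grid`."""
    def psi(u, j, k):
        # Haar mother wavelet at scale j, position k
        left = (u >= k / 2**j) & (u < (k + 0.5) / 2**j)
        right = (u >= (k + 0.5) / 2**j) & (u < (k + 1) / 2**j)
        return 2 ** (j / 2) * (left.astype(float) - right.astype(float))

    # father (scaling) coefficient; phi is the indicator of [0, 1)
    est = np.full_like(grid, np.mean((x >= 0) & (x < 1)), dtype=float)
    for j in range(j_max):
        for k in range(2 ** j):
            theta = np.mean(psi(x, j, k))                      # empirical coefficient
            theta = np.sign(theta) * max(abs(theta) - t, 0.0)  # soft threshold
            est += theta * psi(grid, j, k)
    return est
```

On data with no local structure, all detail coefficients fall below the threshold and the estimate collapses to the flat scaling-function fit; spikes in the data survive thresholding and show up locally.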
Maximal Spaces with given rate of convergence for thresholding algorithms
, 1999
Abstract

Cited by 36 (7 self)
this paper is to discuss the existence and the nature of maximal spaces in the context of nonlinear methods based on thresholding (or shrinkage) procedures. Before going further, some remarks should be made:
Exact Risk Analysis of Wavelet Regression
, 1995
Abstract

Cited by 24 (2 self)
Wavelets have motivated development of a host of new ideas in nonparametric regression smoothing. Here we apply the tool of exact risk analysis, to understand the small sample behavior of wavelet estimators, and thus to check directly the conclusions suggested by asymptotics. Comparisons between some wavelet bases, and also between hard and soft thresholding, are given from several viewpoints. Our results provide insight as to why the viewpoints and conclusions of Donoho and Johnstone differ from those of Hall and Patil.

1 Introduction

In a series of papers, Donoho and Johnstone (1992 [9], 1994a [10], 1995 [13]) and Donoho, Johnstone, Kerkyacharian and Picard (1995) [14] developed nonlinear wavelet shrinkage technology in nonparametric regression. For other work relating wavelets and nonparametric estimation, see Doukhan (1988) [15], Kerkyacharian and Picard (1992) [21], Antoniadis (1994) [1] and Antoniadis, Gregoire and McKeague (1994) [2]. These papers have both introduced a new clas...
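The hard/soft comparison at the heart of this paper can be illustrated with a small Monte Carlo sketch. The paper itself works with exact risk formulas; this simulation, with an arbitrary threshold t = 2 and a single coefficient, is only meant to show the two rules side by side:

```python
import numpy as np

def hard(x, t):
    """Keep-or-kill: zero every coefficient below the threshold."""
    return np.where(np.abs(x) > t, x, 0.0)

def soft(x, t):
    """Shrink-and-kill: also pull the survivors towards 0 by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def mc_risk(rule, theta, t, n_mc=200_000, seed=0):
    """Monte Carlo estimate of E[(rule(theta + Z, t) - theta)^2], Z ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    x = theta + rng.standard_normal(n_mc)
    return np.mean((rule(x, t) - theta) ** 2)

# At theta = 0 soft thresholding has smaller risk, because hard
# thresholding passes large noise excursions through unshrunk.
r_hard = mc_risk(hard, 0.0, 2.0)
r_soft = mc_risk(soft, 0.0, 2.0)
```

Repeating the comparison over a range of theta values reproduces the familiar trade-off: soft wins near zero, hard wins for large signal, which is one source of the differing conclusions discussed above.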
Some Uses of Cumulants in Wavelet Analysis
 J. Nonparametric Statistics
, 1996
Abstract

Cited by 20 (2 self)
Cumulants are useful in studying nonlinear phenomena and in developing (approximate) statistical properties of quantities computed from random process data. Wavelet analysis is a powerful tool for the approximation and estimation of curves and surfaces. This work considers both wavelets and cumulants, developing some sampling properties of linear wavelet fits to a signal in the presence of additive stationary noise via the calculus of cumulants. Of some concern is the construction of approximate confidence bounds around a fit. Some extensions to spatial processes, irregularly observed processes and long memory processes are indicated.
Wavelets in Statistics: A Review
 Journal of the Italian Statistical Association
, 1997
Abstract

Cited by 7 (0 self)
The field of nonparametric function estimation has broadened its appeal in recent years with an array of new tools for statistical analysis. In particular, theoretical and applied research in the field of wavelets has had noticeable influence on statistical topics such as nonparametric regression, nonparametric density estimation, nonparametric discrimination and many other related topics. This is a survey article that attempts to synthesize a broad variety of work on wavelets in statistics and includes some recent developments in nonparametric curve estimation that have been omitted from review articles and books on the subject. After a short introduction to wavelet theory, wavelets are treated in the familiar context of estimation of "smooth" functions. Both "linear" and "nonlinear" wavelet estimation methods are discussed and cross-validation methods for choosing the smoothing parameters are addressed. Finally, some areas of related research are mentioned, such as hypothesis testi...
On Adaptivity Of BlockShrink Wavelet Estimator Over Besov Spaces
, 1997
Abstract

Cited by 5 (1 self)
Cai (1996b) proposed a wavelet method, BlockShrink, for estimating regression functions of unknown smoothness from noisy data by thresholding empirical wavelet coefficients in groups rather than individually. BlockShrink utilizes the information about neighboring wavelet coefficients and thus increases the estimation accuracy of the wavelet coefficients. In the present paper, we offer insights into the BlockShrink procedure and show that the minimax optimality of the BlockShrink estimators holds broadly over a wide range of Besov classes B^α_{p,q}(M). We prove that the BlockShrink estimators attain the exact optimal rate of convergence over a wide interval of Besov classes with p ≥ 2, and the BlockShrink estimators achieve the optimal convergence rate within a logarithmic factor over the Besov classes with p < 2. We also show that the BlockShrink estimators enjoy a smoothness property: if the underlying function is the zero function, then, with high probability, the BlockShrink...
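The group thresholding idea can be sketched as a James-Stein-type block shrinkage rule; this is a hedged illustration of thresholding coefficients in blocks, not Cai's exact BlockShrink procedure, and the constant lam, block length, and noise level sigma below are arbitrary choices rather than the paper's tuning:

```python
import numpy as np

def block_shrink(coeffs, block_len, lam, sigma=1.0):
    """Shrink coefficients block by block: each block is multiplied
    by a common factor that depends on the block's total energy, so
    neighboring coefficients share information.  Blocks whose energy
    is below lam * L * sigma^2 are killed entirely."""
    out = np.zeros_like(coeffs, dtype=float)
    for start in range(0, len(coeffs), block_len):
        block = coeffs[start:start + block_len]
        s2 = np.sum(block ** 2)           # block energy
        L = len(block)
        factor = max(1.0 - lam * L * sigma ** 2 / s2, 0.0) if s2 > 0 else 0.0
        out[start:start + block_len] = factor * block
    return out

# Example: a high-energy block survives (mildly shrunk),
# a low-energy block of pure noise is zeroed out as a group.
shrunk = block_shrink(np.array([3.0, 4.0, 0.1, 0.1]), block_len=2, lam=1.0)
```

Because the keep/kill decision pools a whole neighborhood, a single moderate coefficient sitting among large ones is retained where a term-by-term threshold would have dropped it, which is the source of the accuracy gain described above.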