## Ideal spatial adaptation by wavelet shrinkage (1994)

Venue: Biometrika

Citations: 884 (4 self)

### BibTeX

```bibtex
@ARTICLE{Donoho94idealspatial,
  author  = {David L. Donoho and Iain M. Johnstone},
  title   = {Ideal spatial adaptation by wavelet shrinkage},
  journal = {Biometrika},
  year    = {1994},
  volume  = {81},
  pages   = {425--455}
}
```

### Abstract

With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable-knot spline, or variable bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive kernels; however, it is a priori unclear whether such performance can be obtained by a procedure relying on the data alone. We describe a new principle for spatially-adaptive estimation: selective wavelet reconstruction. We show that variable-knot spline fits and piecewise-polynomial fits, when equipped with an oracle to select the knots, are not dramatically more powerful than selective wavelet reconstruction with an oracle. We develop a practical spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients. RiskShrink mimics the performance of an oracle for selective wavelet reconstruction as well as it is possible to do so. A new inequality in multivariate normal decision theory, which we call the oracle inequality, shows that attained performance differs from ideal performance by at most a factor of $2 \log n$, where $n$ is the sample size. Moreover, no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor $\log^2 n$ of the performance of piecewise polynomial and variable-knot spline methods equipped with an oracle. In contrast, it is unknown how or if piecewise polynomial methods could be made to function this well when denied access to an oracle and forced to rely on data alone.
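The abstract's central recipe, shrinking empirical wavelet coefficients toward zero, can be sketched in a few lines. The sketch below is ours, not the paper's: function names such as `haar_dwt` and `wavelet_shrink` are made up for illustration, the wavelet is the simple orthonormal Haar basis, and the threshold is the universal $\sigma\sqrt{2\log n}$ suggested by the abstract's $2\log n$ factor rather than RiskShrink's exact minimax thresholds.

```python
import numpy as np

def haar_dwt(x):
    """Full orthonormal Haar transform of a length-2^J signal.
    Returns a list of detail coefficient arrays (finest first)
    plus the final approximation coefficient."""
    coeffs = []
    a = x.astype(float)
    while len(a) > 1:
        s = (a[0::2] + a[1::2]) / np.sqrt(2)  # smooth (approximation)
        d = (a[0::2] - a[1::2]) / np.sqrt(2)  # detail
        coeffs.append(d)
        a = s
    coeffs.append(a)
    return coeffs

def haar_idwt(coeffs):
    """Invert haar_dwt."""
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def soft_threshold(w, t):
    """Shrink coefficients toward zero by t, killing those below t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def wavelet_shrink(y, sigma):
    """Soft-threshold the empirical wavelet coefficients of y
    (len(y) must be a power of two) at sigma * sqrt(2 log n)."""
    n = len(y)
    t = sigma * np.sqrt(2 * np.log(n))
    coeffs = haar_dwt(y)
    shrunk = [soft_threshold(d, t) for d in coeffs[:-1]] + [coeffs[-1]]
    return haar_idwt(shrunk)
```

Because thresholding is applied coefficient by coefficient, the estimator adapts spatially: coefficients carrying genuine local structure survive, while those that are pure noise are set to zero.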

### Citations

4172 | Classification and Regression Trees - Breiman, Friedman, et al. - 1984 |

1653 | Orthonormal bases of compactly supported wavelets - Daubechies - 1988 |

738 | An Introduction to Wavelets - Chui - 1992 |

249 | Subset selection in regression - Miller - 1990 |

195 | Classification and regression trees - Breiman, Friedman, et al. - 1984 |

Citation Context: ...e that $L$ is a variable. The reconstruction formula is $T_{PC}(y; \delta)(t) = \sum_{\ell=1}^{L} \mathrm{Ave}(y_i : t_i \in I_\ell)\, 1_{I_\ell}(t)$: piecewise constant reconstruction, using the mean of the data within each piece to estimate the pieces. [2]. Piecewise Polynomials $T_{PP(D)}(y; \delta)$. Here the interpretation of $\delta$ is the same as in [1], only the reconstruction uses polynomials of degree $D$: $T_{PP(D)}(y; \delta)(t) = \sum_{\ell=1}^{L} \hat{p}_\ell(t)\, 1_{I_\ell}(t)$, where $\hat{p}_\ell(t) = \sum_{k=0}^{D} \dots$
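The piecewise constant reconstruction quoted in this context is easy to mimic in code. A minimal sketch, assuming `knots` lists the interior breakpoints of the partition of $[0, 1]$ (the name `t_pc` is ours, chosen to echo the paper's $T_{PC}$ notation):

```python
import numpy as np

def t_pc(t_grid, y, knots):
    """T_PC(y; delta): piecewise constant reconstruction. Each interval
    of the partition of [0, 1] defined by the interior breakpoints
    `knots` is estimated by the mean of the observations inside it."""
    edges = np.concatenate(([0.0], np.asarray(knots, dtype=float), [1.0]))
    fhat = np.empty_like(y, dtype=float)
    for j in range(len(edges) - 1):
        lo, hi = edges[j], edges[j + 1]
        last = j == len(edges) - 2
        # close the final interval on the right so t = 1 is covered
        mask = (t_grid >= lo) & ((t_grid <= hi) if last else (t_grid < hi))
        if mask.any():
            fhat[mask] = y[mask].mean()
    return fhat
```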

150 | Littlewood-Paley Theory and the study of function spaces - Frazier, Jawerth, et al. - 1991 |

101 | Multiresolution analysis, wavelets and fast wavelet transform on an interval. Comptes Rendus Acad. Sci. Paris - Cohen, Daubechies, et al. - 1993 |

Citation Context: ...ty, and if $\hat f = T_{VK,2}(y; \delta)$ then $\hat f(t) = \frac{1}{n} \sum_{i=1}^{n} y_i\, K\!\left(\frac{t - t_i}{\delta(t)}\right) \frac{1}{\delta(t)}$ (3). More refined versions of this formula would adjust $K$ for boundary effects near $t = 0$ and $t = 1$. [5]. Variable-Bandwidth High-Order Kernels $T_{VK,D}(y; \delta)$, $D > 2$. Here $\delta$ is again the local bandwidth, and the reconstruction formula is as in (3), only $K(\cdot)$ is a $C^D$ function integrating to 1, ...
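The variable-bandwidth kernel formula (3) quoted here can be sketched directly. The names `t_vk` and `triweight` are ours; the triweight kernel is just one example of a $C^2$ kernel of compact support integrating to 1, as the context requires.

```python
import numpy as np

def triweight(u):
    """Triweight kernel: a C^2 kernel of compact support on [-1, 1]
    that integrates to 1."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1.0, (35.0 / 32.0) * (1.0 - u ** 2) ** 3, 0.0)

def t_vk(t, t_i, y, delta):
    """Variable-bandwidth kernel reconstruction at a point t:
    fhat(t) = (1/n) * sum_i y_i * K((t - t_i) / delta(t)) / delta(t),
    where delta is the local bandwidth function."""
    d = delta(t)
    u = (t - t_i) / d
    return float(np.sum(y * triweight(u)) / (len(y) * d))
```

For roughly equispaced design points on $[0, 1]$, the sum is a Riemann approximation to the kernel-smoothed signal, so constant data reconstruct to the constant away from the boundaries.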

88 | Flexible parsimonious smoothing and additive modelling (with comments and rejoinder) - Friedman, Silverman - 1989 |

53 | Robust kernel density estimation - Kim, Scott |

45 | Ondelettes sur l’intervalle - Meyer - 1991 |

39 | Learning algorithm for nonparametric filtering. Automation and Remote Control - Efromovich, Pinsker - 1984 |

36 | Locally adaptive bandwidth choice for kernel regression estimators - Brockmann, Gasser, et al. - 1993 |

Citation Context: ...$(t) = \sum_{\ell=1}^{L} \hat p_\ell(t)\, 1_{I_\ell}(t)$, where $\hat p_\ell(t) = \sum_{k=0}^{D} a_k t^k$ is determined by applying the least squares principle to the data arising for interval $I_\ell$: $\sum_{t_i \in I_\ell} (\hat p_\ell(t_i) - y_i)^2 = \min!$ [3]. Variable-Knot Splines $T_{spl,D}(y; \delta)$. Here $\delta$ defines a partition as above, and on each interval of the partition the reconstruction formula is a polynomial of degree $D$, but now the reconstructi...
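The per-interval least-squares polynomial fit quoted in this context can be sketched with `np.polyfit`. The function name `t_pp` is ours, echoing the paper's $T_{PP(D)}$; falling back to the interval mean when an interval has too few points is our own safeguard, not something the quoted text specifies.

```python
import numpy as np

def t_pp(t_grid, y, knots, D):
    """T_PP(D)(y; delta): on each interval of the partition of [0, 1]
    defined by `knots`, fit a degree-D polynomial by least squares."""
    edges = np.concatenate(([0.0], np.asarray(knots, dtype=float), [1.0]))
    fhat = np.empty_like(y, dtype=float)
    for j in range(len(edges) - 1):
        lo, hi = edges[j], edges[j + 1]
        last = j == len(edges) - 2
        mask = (t_grid >= lo) & ((t_grid <= hi) if last else (t_grid < hi))
        if mask.sum() > D:  # need more points than parameters
            coef = np.polyfit(t_grid[mask], y[mask], D)
            fhat[mask] = np.polyval(coef, t_grid[mask])
        elif mask.any():
            fhat[mask] = y[mask].mean()  # fallback for sparse intervals
    return fhat
```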

31 | Variable bandwidth kernel estimators of regression curves, Annals of Statistics - Müller, Stadtmüller - 1987 |

24 | Ondelettes et Opérateurs I: Ondelettes, Hermann Éditeurs - Meyer - 1990 |

15 | Selection of subsets of regression variables (with discussion) - Miller - 1984 |

14 | On problems of adaptive estimation in white Gaussian noise, in Topics in Nonparametric Estimation - Lepskiĭ - 1992 |

12 | Minimax estimation of a normal mean subject to doing well at a point - Bickel - 1983 |

Citation Context: ...onstruction formula with "spatial smoothing" parameter $\delta$, and $\hat\delta(y)$ is a data-adaptive choice of the spatial smoothing parameter $\delta$. A clearer picture of what we intend emerges from five examples. [1]. Piecewise Constant Reconstruction $T_{PC}(y; \delta)$. Here $\delta$ is a finite list of, say, $L$ real numbers defining a partition $(I_1, \ldots, I_L)$ of $[0, 1]$ via $I_1 = [0, \delta_1)$, $I_2 = [\delta_1, \delta_1 + \ldots$
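The partition construction quoted here, in which $\delta = (\delta_1, \ldots, \delta_L)$ lists interval lengths and $I_1 = [0, \delta_1)$, $I_2 = [\delta_1, \delta_1 + \delta_2)$, and so on, amounts to taking cumulative sums. A minimal sketch (the function name is ours):

```python
import numpy as np

def partition_from_lengths(delta):
    """Build the partition (I_1, ..., I_L) of [0, 1] from interval
    lengths delta = (delta_1, ..., delta_L): I_1 = [0, delta_1),
    I_2 = [delta_1, delta_1 + delta_2), and so on."""
    delta = np.asarray(delta, dtype=float)
    assert np.isclose(delta.sum(), 1.0), "lengths must sum to 1"
    edges = np.concatenate(([0.0], np.cumsum(delta)))
    return list(zip(edges[:-1], edges[1:]))
```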

6 | Multivariate adaptive regression splines (with discussion) - Friedman - 1991 |

4 | A learning algorithm for nonparametric filtering - Pinsker - 1984 |

1 | Superefficiency and lack of adaptability in nonparametric functional estimation. To appear, Annals of Statistics - Brown - 1993 |

Citation Context: ...mials $s(t)$ satisfying $\left(\frac{d}{dt}\right)^{k} s(\tau_\ell -) = \left(\frac{d}{dt}\right)^{k} s(\tau_\ell +)$ for $k = 0, \ldots, D-1$, $\ell = 2, \ldots, L$; subject to this constraint, one solves $\sum_{i=1}^{n} (s(t_i) - y_i)^2 = \min!$ [4]. Variable Bandwidth Kernel Methods $T_{VK,2}(y; \delta)$. Now $\delta$ is a function on $[0, 1]$; $\delta(t)$ represents the "bandwidth of the kernel at $t$"; the smoothing kernel $K$ is a $C^2$ function of compact suppor...
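The constrained least-squares spline fit quoted here can be sketched via the truncated power basis, $\{1, t, \ldots, t^D, (t - \tau_2)_+^D, \ldots\}$, which builds the continuity constraints $\left(\frac{d}{dt}\right)^k s(\tau_\ell -) = \left(\frac{d}{dt}\right)^k s(\tau_\ell +)$, $k = 0, \ldots, D-1$, directly into the parametrization. The function name `t_spl` is ours, echoing the paper's $T_{spl,D}$; the basis choice is a standard device, not something the quoted text prescribes.

```python
import numpy as np

def t_spl(t_grid, y, knots, D):
    """Variable-knot spline fit: least squares over piecewise degree-D
    polynomials with D-1 continuous derivatives at the interior knots,
    parametrized by the truncated power basis."""
    cols = [t_grid ** k for k in range(D + 1)]          # global polynomial part
    cols += [np.maximum(t_grid - tau, 0.0) ** D for tau in knots]  # knot terms
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)        # solve min sum (s(t_i)-y_i)^2
    return X @ coef
```

When the target is itself a spline of this form, the fit recovers it exactly, since the truth lies in the span of the basis.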

1 | The risk inflation of variable selection in regression - George |

1 | Ondelettes sur l'Intervalle: algorithmes rapides - Malgouyres - 1991 |

1 | Variable bandwidth kernel estimators of regression curves - Müller, Hans-Georg - 1987 |

1 | Superefficiency and lack of adaptability in nonparametric functional estimation. To appear, Annals of Statistics - Brown, Low - 1993 |

Citation Context: ...ion is chosen from among those piecewise polynomials $s(t)$ satisfying $\left(\frac{d}{dt}\right)^{k} s(\tau_\ell -) = \left(\frac{d}{dt}\right)^{k} s(\tau_\ell +)$ for $k = 0, \ldots, D-1$, $\ell = 2, \ldots, L$; subject to this constraint, one solves $\sum_{i=1}^{n} (s(t_i) - y_i)^2 = \min!$ [4]. Variable Bandwidth Kernel Methods $T_{VK,2}(y; \delta)$. Now $\delta$ is a function on $[0, 1]$; $\delta(t)$ represents the "bandwidth of the kernel at $t$"; the smoothing kernel $K$ is a $C^2$ function of compact support which is also a p...

1 | The risk inflation of variable selection in regression - George, Foster |