Results 1–7 of 7
Adapting to unknown smoothness via wavelet shrinkage
Journal of the American Statistical Association, 1995
Abstract

Cited by 675 (19 self)
We attempt to recover a function of unknown smoothness from noisy, sampled data. We introduce a procedure, SureShrink, which suppresses noise by thresholding the empirical wavelet coefficients. The thresholding is adaptive: a threshold level is assigned to each dyadic resolution level by the principle of minimizing the Stein Unbiased Estimate of Risk (Sure) for threshold estimates. The computational effort of the overall procedure is order N log(N) as a function of the sample size N. SureShrink is smoothness-adaptive: if the unknown function contains jumps, the reconstruction (essentially) does also; if the unknown function has a smooth piece, the reconstruction is (essentially) as smooth as the mother wavelet will allow. The procedure is in a sense optimally smoothness-adaptive: it is near-minimax simultaneously over a whole interval of the Besov scale; the size of this interval depends on the choice of mother wavelet. We know from a previous paper by the authors that traditional smoothing methods (kernels, splines, and orthogonal series estimates), even with optimal choices of the smoothing parameter, would be unable to perform …
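The level-wise SURE minimization described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `sure_threshold` and `soft` are hypothetical helper names, and the coefficients at a given resolution level are assumed already rescaled to unit-variance Gaussian noise.

```python
import numpy as np

def sure_threshold(coeffs):
    """Return the soft threshold minimizing Stein's Unbiased Risk Estimate,
    assuming i.i.d. unit-variance Gaussian noise.  For soft thresholding the
    risk is piecewise monotone between observed magnitudes, so searching
    over the magnitudes themselves is exact."""
    x = np.abs(np.asarray(coeffs, dtype=float))
    n = x.size
    candidates = np.sort(x)
    # SURE(t) = n - 2*#{|x_i| <= t} + sum_i min(x_i^2, t^2)
    risks = [n - 2 * np.sum(x <= t) + np.sum(np.minimum(x ** 2, t ** 2))
             for t in candidates]
    return candidates[int(np.argmin(risks))]

def soft(x, t):
    """Soft-threshold: shrink each coefficient toward zero by t."""
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)
```

In SureShrink this choice is made separately at each dyadic resolution level; sorting the coefficients dominates the cost, which is how the O(N log N) total arises.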
Minimax Estimation via Wavelet Shrinkage, 1992
Abstract

Cited by 246 (32 self)
We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel- and Besov-type smoothness constraints, and asymptotically minimax over Besov bodies with p ≤ q. Linear estimates cannot achieve even the minimax rates over Triebel and Besov classes with p < 2, so our method can significantly outperform every linear method (kernel, smoothing spline, sieve, …) in a minimax sense. Variants of our method based on simple threshold nonlinearities are nearly minimax. Our method possesses the interpretation of spatial adaptivity: it reconstructs using a kernel which may vary in shape and bandwidth from point to point, depending on the data. Least favorable distributions for certain of the Triebel and Besov scales generate objects with sparse wavelet transforms. Many real objects have similarly sparse transforms, which suggests that these minimax results are relevant for practical problems. Sequels to this paper discuss practical implementation, spatial adaptation properties and applications to inverse problems.
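The "simple threshold nonlinearities" referred to above are, in the standard formulations, soft and hard thresholding applied coefficient by coefficient; a minimal sketch (function names are ours):

```python
import numpy as np

def soft_threshold(w, t):
    """Shrink every coefficient toward zero by t (continuous nonlinearity)."""
    w = np.asarray(w, dtype=float)
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def hard_threshold(w, t):
    """Keep a coefficient unchanged if |w| > t, otherwise set it to zero."""
    w = np.asarray(w, dtype=float)
    return np.where(np.abs(w) > t, w, 0.0)
```

Because coefficients below the threshold are zeroed wherever they occur, the effective smoothing kernel adapts locally to the data, which is the spatial adaptivity mentioned in the abstract.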
Unconditional bases are optimal bases for data compression and for statistical estimation
Applied and Computational Harmonic Analysis, 1993
Abstract

Cited by 140 (23 self)
An orthogonal basis of L² which is also an unconditional basis of a functional space F is a kind of optimal basis for compressing, estimating, and recovering functions in F. Simple thresholding operations, applied in the unconditional basis, work essentially better for compressing, estimating, and recovering than they do in any other orthogonal basis. In fact, simple thresholding in an unconditional basis works essentially better for recovery and estimation than other methods, period. (Performance is measured in an asymptotic minimax sense.) As an application, we formalize and prove Mallat's Heuristic, which says that wavelet bases are optimal for representing functions containing singularities, when there may be an arbitrary number of singularities, arbitrarily distributed.
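The compression-by-thresholding operation discussed above amounts to best n-term approximation in the basis: keep the n largest-magnitude coefficients and drop the rest. A minimal sketch (the function name is ours):

```python
import numpy as np

def best_n_term(coeffs, n):
    """Keep the n largest-magnitude coefficients, zero out the rest."""
    c = np.asarray(coeffs, dtype=float)
    keep = np.argsort(np.abs(c))[::-1][:n]  # indices of the n largest |c_i|
    out = np.zeros_like(c)
    out[keep] = c[keep]
    return out
```

In an unconditional basis, the error of this greedy rule is within a constant of the best achievable over all n-term combinations, which is the sense in which such bases are "optimal" for compression.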
A data-driven block thresholding approach to wavelet estimation, 2005
Abstract

Cited by 13 (2 self)
A data-driven block thresholding procedure for wavelet regression is proposed and its theoretical and numerical properties are investigated. The procedure empirically chooses the block size and threshold level at each resolution level by minimizing Stein's unbiased risk estimate. The estimator is sharply adaptive over a class of Besov bodies and simultaneously comes within a small constant factor of the minimax risk over a wide collection of Besov bodies, including both the dense and sparse cases. The procedure is easy to implement. Numerical results show that it has superior finite-sample performance in comparison to the other leading wavelet thresholding estimators.
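A James-Stein-type block shrinkage rule illustrates the block-thresholding idea. This fixed-λ sketch follows the earlier BlockJS construction (λ ≈ 4.505 is the constant used there), not the data-driven block-size and threshold choice of this paper; the function name is ours.

```python
import numpy as np

def block_shrink(coeffs, block_size, lam=4.505, sigma=1.0):
    """Shrink each block of coefficients by the James-Stein factor
    (1 - lam * L * sigma^2 / S^2)_+, where L is the block length and
    S^2 is the block energy.  A whole block is kept or killed together."""
    c = np.asarray(coeffs, dtype=float)
    out = np.zeros_like(c)
    for start in range(0, c.size, block_size):
        blk = c[start:start + block_size]
        s2 = float(np.sum(blk ** 2))
        factor = max(0.0, 1.0 - lam * blk.size * sigma ** 2 / s2) if s2 > 0 else 0.0
        out[start:start + block_size] = factor * blk
    return out
```

Pooling neighboring coefficients lets a block of moderate coefficients survive where term-by-term thresholding would kill each one individually, which is the source of the improved adaptivity.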
ASYMPTOTIC BEHAVIOR OF L²-NORMALIZED EIGENFUNCTIONS OF THE LAPLACE-BELTRAMI OPERATOR ON A CLOSED RIEMANNIAN MANIFOLD, 2005
Abstract

Cited by 1 (0 self)
Abstract. Let e(x,y,λ) be the spectral function and χ_λ the unit band spectral projection operator, with respect to the Laplace-Beltrami operator ∆_M on a closed Riemannian manifold M. We first review the one-term asymptotic formula of e(x,x,λ) as λ → ∞ by Hörmander (1968), the one of ∂_x^α ∂_y^β e(x,y,λ)|_{x=y} as λ → ∞ in a geodesic normal coordinate chart by the author (2004), and the sharp asymptotic estimates from above of the mapping norm ‖χ_λ‖_{L²→L^p} (2 ≤ p ≤ ∞) by Sogge (1988 & 1989) and of the mapping norm ‖χ_λ‖_{L²→Sobolev L^p} by the author (2004). In the paper we show the one-term asymptotic formula for e(x,y,λ) as λ → ∞, provided that the Riemannian distance between x and y is O(1/λ). As a consequence, we obtain the sharp estimate of the mapping norm ‖χ_λ‖_{L²→C^δ} (0 < δ < 1), where C^δ(M) is the space of Hölder continuous functions with exponent δ on M. Moreover, we show a geometric property of the eigenfunction e_λ: ∆_M e_λ + λ² e_λ = 0, which says that 1/λ is comparable to the distance between the nodal set of e_λ (where e_λ vanishes) and the concentrating set of e_λ (where e_λ attains its maximum or minimum) as λ → ∞.
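For context, the one-term asymptotic of Hörmander cited above is the classical Weyl-type formula for the spectral function on the diagonal; to the best of our knowledge it reads:

```latex
% Hörmander (1968): one-term asymptotic of the spectral function on the
% diagonal.  Here n = dim M and \omega_n is the volume of the unit ball
% in R^n; the remainder is uniform in x over the closed manifold M.
\[
  e(x, x, \lambda) \;=\; (2\pi)^{-n}\,\omega_n\,\lambda^{n}
  \;+\; O\!\left(\lambda^{n-1}\right),
  \qquad \lambda \to \infty .
\]
```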
COMPARISON OF THE CLASSICAL BMO WITH THE BMO SPACES ASSOCIATED WITH OPERATORS AND APPLICATIONS, 2006
Abstract

Cited by 1 (0 self)
Abstract. Let L be a generator of a semigroup satisfying Gaussian upper bounds. In this paper we further study a new BMO_L space associated with L, which was introduced recently by Duong and Yan. We discuss applications of the new BMO_L spaces in the theory of singular integration, such as BMO_L estimates and interpolation results for fractional powers, purely imaginary powers, and spectral multipliers of self-adjoint operators. We also demonstrate that the space BMO_L may coincide with, or may be essentially different from, the classical BMO space.
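For comparison, the classical BMO seminorm is defined by mean oscillation against ball averages; in the Duong-Yan space the ball average is replaced by a semigroup average (the precise scaling t_B = r_B² below is our recollection of their definition, not quoted from this abstract):

```latex
% Classical BMO seminorm (f_B is the average of f over the ball B) and,
% hedged as our recollection, the Duong--Yan variant BMO_L in which f_B
% is replaced by the semigroup average e^{-r_B^2 L} f.
\[
  \|f\|_{BMO} = \sup_{B} \frac{1}{|B|} \int_{B} \bigl| f(x) - f_B \bigr|\,dx,
  \qquad
  \|f\|_{BMO_L} = \sup_{B} \frac{1}{|B|} \int_{B}
     \bigl| f(x) - e^{-r_B^2 L} f(x) \bigr|\,dx .
\]
```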
LIFTING OF S¹-VALUED MAPS IN SUMS OF SOBOLEV SPACES
Petru Mironescu, 2012
Abstract
Abstract. We describe, in terms of lifting, the closure of smooth S¹-valued maps in the space W^{s,p}((−1,1)^N; S¹). (Here, 0 < s < ∞ and 1 ≤ p < ∞.) This description follows from an estimate for the phase of smooth maps: let 0 < s < 1, let ϕ ∈ C^∞([−1,1]^N; R) and set u = e^{iϕ}. Then we may split ϕ = ϕ₁ + ϕ₂, where the smooth maps ϕ₁ and ϕ₂ satisfy (∗) |ϕ₁|_{W^{s,p}} ≤ C|u|_{W^{s,p}} and ‖∇ϕ₂‖_{L^{sp}} ≤ C|u|^p_{W^{s,p}}. (∗) was proved for s = 1/2, p = 2 and arbitrary space dimension N by Bourgain and Brezis [3], and for N = 1, p > 1 and s = 1/p by Nguyen [14]. Our proof is a sort of continuous version of the Bourgain-Brezis approach (based on paraproducts). Estimate (∗) answers (and generalizes) a question of Bourgain, Brezis, and the author [5].