Results 1–10 of 47
Atomic decomposition by basis pursuit
SIAM Journal on Scientific Computing, 1998
Cited by 1719 (48 self)
Abstract. The time-frequency and time-scale communities have recently developed a large number of overcomplete waveform dictionaries — stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the method of frames (MOF), matching pursuit (MP), and, for special dictionaries, the best orthogonal basis (BOB). Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l^1 norm of coefficients among all such decompositions. We give examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution. BP has interesting relations to ideas in areas as diverse as ill-posed problems, abstract harmonic analysis, total variation denoising, and multiscale edge denoising. BP in highly overcomplete dictionaries leads to large-scale optimization problems. With signals of length 8192 and a wavelet packet dictionary, one gets an equivalent linear program of size 8192 by 212,992. Such problems can be attacked successfully only because of recent advances in linear programming by interior-point methods. We obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
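As a small illustration (not from the paper, which uses interior-point solvers at far larger scale), the BP principle — minimize the l^1 norm of the coefficients subject to exact reconstruction — can be posed as a linear program by splitting the coefficients into positive and negative parts. The random Gaussian dictionary here is a hypothetical stand-in for the wavelet-packet dictionaries the abstract describes, and `scipy.optimize.linprog` is an off-the-shelf solver, not the authors' barrier method:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 8, 20                          # signal length, dictionary size (overcomplete: p > n)
Phi = rng.standard_normal((n, p))
Phi /= np.linalg.norm(Phi, axis=0)    # unit-norm atoms

# Sparse ground-truth decomposition using 2 atoms.
alpha_true = np.zeros(p)
alpha_true[[3, 11]] = [1.5, -2.0]
s = Phi @ alpha_true

# BP: minimize ||alpha||_1 subject to Phi @ alpha = s.
# Split alpha = u - v with u, v >= 0, so the objective sum(u + v) is linear.
c = np.ones(2 * p)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=s, bounds=[(0, None)] * (2 * p))
alpha_hat = res.x[:p] - res.x[p:]
```

Because the true coefficient vector is feasible, the LP solution's l^1 norm can be no larger than that of `alpha_true`; the "better sparsity" claim in the abstract is about exactly this behavior on structured dictionaries.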
Ideal spatial adaptation by wavelet shrinkage
Biometrika, 1994
Cited by 862 (4 self)
With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable-knot spline, or variable-bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive kernels; however, it is a priori unclear whether such performance can be obtained by a procedure relying on the data alone. We describe a new principle for spatially-adaptive estimation: selective wavelet reconstruction. We show that variable-knot spline fits and piecewise-polynomial fits, when equipped with an oracle to select the knots, are not dramatically more powerful than selective wavelet reconstruction with an oracle. We develop a practical spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients. RiskShrink mimics the performance of an oracle for selective wavelet reconstruction as well as it is possible to do so. A new inequality in multivariate normal decision theory, which we call the oracle inequality, shows that attained performance differs from ideal performance by at most a factor of 2 log n, where n is the sample size. Moreover, no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor log^2 n of the performance of piecewise-polynomial and variable-knot spline methods equipped with an oracle. In contrast, it is unknown how or if piecewise-polynomial methods could be made to function this well when denied access to an oracle and forced to rely on data alone.
De-Noising by Soft-Thresholding, 1992
Cited by 827 (13 self)
Donoho and Johnstone (1992a) proposed a method for reconstructing an unknown function f on [0, 1] from noisy data d_i = f(t_i) + z_i, i = 0, ..., n-1, t_i = i/n, with z_i iid N(0, 1). The reconstruction f̂_n is defined in the wavelet domain by translating all the empirical wavelet coefficients of d towards 0 by an amount sqrt(2 log n) / sqrt(n). We prove two results about that estimator. [Smooth]: With high probability, f̂_n is at least as smooth as f, in any of a wide variety of smoothness measures. [Adapt]: The estimator comes nearly as close in mean square to f as any measurable estimator can come, uniformly over balls in each of two broad scales of smoothness classes. These two properties are unprecedented in several ways. Our proof of these results develops new facts about abstract statistical inference and its connection with an optimal recovery model.
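A minimal sketch of the recipe in this abstract, assuming the simplest orthonormal wavelet (Haar) in place of the smoother wavelets the authors use: transform the noisy samples, pull every detail coefficient toward 0 by the universal threshold sqrt(2 log n) (the discrete-domain form of the amount above when the noise has unit variance), and invert. The blocky test function, the Haar helpers, and the variable names are all illustrative choices, not the paper's:

```python
import numpy as np

def soft_threshold(x, t):
    """Translate coefficients toward 0 by t; values with |x| <= t become 0."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def haar_transform(x):
    """Orthonormal Haar transform; len(x) must be a power of 2.
    Returns the coarsest smooth coefficient and detail bands, coarse to fine."""
    x = np.asarray(x, dtype=float)
    details = []
    while len(x) > 1:
        details.append((x[0::2] - x[1::2]) / np.sqrt(2))  # detail band
        x = (x[0::2] + x[1::2]) / np.sqrt(2)              # smooth band
    return x, details[::-1]

def haar_inverse(coarse, details):
    x = coarse
    for d in details:                     # coarse to fine
        up = np.empty(2 * len(x))
        up[0::2] = (x + d) / np.sqrt(2)
        up[1::2] = (x - d) / np.sqrt(2)
        x = up
    return x

rng = np.random.default_rng(1)
n = 256
t = np.arange(n) / n
f = np.where(t < 0.5, 0.2, 1.0)           # unknown function (a step)
d = f + rng.standard_normal(n)            # noisy data, sigma = 1

coarse, details = haar_transform(d)
lam = np.sqrt(2 * np.log(n))              # universal threshold for sigma = 1
details = [soft_threshold(dj, lam) for dj in details]
f_hat = haar_inverse(coarse, details)
```

The [Smooth] property is visible here in miniature: almost every pure-noise detail coefficient falls below the threshold and is zeroed, so the reconstruction is noise-free to the eye while still tracking the jump.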
The Lifting Scheme: A Construction of Second Generation Wavelets, 1997
Cited by 385 (16 self)
We present the lifting scheme, a simple construction of second generation wavelets: wavelets that are not necessarily translates and dilates of one fixed function. Such wavelets can be adapted to intervals, domains, surfaces, weights, and irregular samples. We show how the lifting scheme leads to a faster, in-place calculation of the wavelet transform. Several examples are included.

Key words: wavelet, multiresolution, second generation wavelet, lifting scheme. AMS subject classifications: 42C15.

1. Introduction. Wavelets form a versatile tool for representing general functions or data sets. Essentially we can think of them as data building blocks. Their fundamental property is that they allow for representations which are efficient and which can be computed fast. In other words, wavelets are capable of quickly capturing the essence of a data set with only a small set of coefficients. This is based on the fact that most data sets have correlation both in time (or space) and frequency ...
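To make the "in-place" claim concrete, here is the lifting factorization of the simplest case, the Haar transform: a predict step (detail = odd minus even) followed by an update step (even plus half the detail, preserving the running mean), each overwriting one half of the array. This is an illustrative sketch of the predict/update structure, not the paper's general second-generation construction:

```python
import numpy as np

def haar_lifting_forward(x):
    """One level of the Haar transform via lifting.
    Afterwards x[0::2] holds the averages and x[1::2] the details."""
    d = x[1::2] - x[0::2]     # predict: detail = odd - even
    s = x[0::2] + d / 2       # update: (even + odd) / 2, so the mean is preserved
    x[1::2] = d
    x[0::2] = s
    return x

def haar_lifting_inverse(x):
    """Invert by running the same two steps backwards with signs flipped."""
    s, d = x[0::2].copy(), x[1::2].copy()
    even = s - d / 2          # undo update
    odd = even + d            # undo predict
    x[0::2], x[1::2] = even, odd
    return x
```

Because each lifting step is trivially invertible (subtract what was added), the inverse transform falls out for free — which is exactly why lifting also works on the irregular samples and bounded domains mentioned above, where no Fourier argument is available.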
Basis pursuit
In Proceedings of the Asilomar Conference on Signals, Systems, and Computers, 1994
Interpolating Wavelet Transform, 1992
Cited by 123 (13 self)
We describe several "wavelet transforms" which characterize smoothness spaces and for which the coefficients are obtained by sampling rather than integration. We use them to reinterpret the empirical wavelet transform, i.e. the common practice of applying pyramid filters to samples of a function.
Wavelets on Closed Subsets of the Real Line
 in: Topics in the Theory and Applications of Wavelets, L.L. Schumaker and
Cited by 69 (5 self)
We construct orthogonal and biorthogonal wavelets on a given closed subset of the real line. We also study wavelets satisfying certain types of boundary conditions. We introduce the concept of "wavelet probing", which is closely related to our construction of wavelets. This technique allows us to very quickly perform a number of different numerical tasks associated with wavelets.

§1. Introduction. Wavelets and multiscale analysis have emerged in a number of different fields, from harmonic analysis and partial differential equations in pure mathematics to signal and image processing in computer science and electrical engineering. Typically a general function, signal, or image is broken up into linear combinations of translated and scaled versions of some simple, basic building blocks. Multiscale analysis comes with a natural hierarchical structure obtained by only considering the linear combinations of building blocks up to a certain scale. This hierarchical structure is particularly ...
Smooth Wavelet Decompositions with Blocky Coefficient Kernels, 1993
Cited by 54 (12 self)
We describe bases of smooth wavelets where the coefficients are obtained by integration against (finite combinations of) boxcar kernels rather than against traditional smooth wavelets. Bases of this type were first developed in work of Tchamitchian and of Cohen, Daubechies, and Feauveau. Our approach emphasizes the idea of average-interpolation — synthesizing a smooth function on the line having prescribed boxcar averages — and the link between average-interpolation and Dubuc-Deslauriers interpolation. We also emphasize characterizations of smooth functions via their coefficients. We describe boundary-corrected expansions for the interval, which have a simple and revealing form. We use these results to reinterpret the empirical wavelet transform, i.e. finite, discrete wavelet transforms of data arising from boxcar integrators (e.g. CCD devices).
Wavelet Thresholding in Anisotropic Function Classes and Application to Adaptive Estimation of Evolutionary Spectra, 1997
Cited by 45 (14 self)
We derive minimax rates for estimation in anisotropic smoothness classes. This rate is attained by a coordinatewise thresholded wavelet estimator based on a tensor-product basis with a separate scale parameter for every dimension. It is shown that this basis is superior to its one-scale multiresolution analog if different degrees of smoothness in different directions are present. As an important application, we introduce a new adaptive wavelet estimator of the time-dependent spectrum of a locally stationary time series. Using this model, which was recently developed by Dahlhaus, we show that the resulting estimator nearly attains, simultaneously over a wide range of smoothness classes, the rate that is optimal in Gaussian white noise. Moreover, our new approach overcomes the difficulty of choosing the right amount of smoothing, i.e. of adapting to the appropriate resolution, when reconstructing the local structure of the evolutionary spectrum in the time-frequency plane.