Results 1–10 of 617
Adapting to unknown smoothness via wavelet shrinkage
Journal of the American Statistical Association, 1995
"... We attempt to recover a function of unknown smoothness from noisy, sampled data. We introduce a procedure, SureShrink, which suppresses noise by thresholding the empirical wavelet coefficients. The thresholding is adaptive: a threshold level is assigned to each dyadic resolution level by the princip ..."
Abstract

Cited by 990 (20 self)
 Add to MetaCart
We attempt to recover a function of unknown smoothness from noisy, sampled data. We introduce a procedure, SureShrink, which suppresses noise by thresholding the empirical wavelet coefficients. The thresholding is adaptive: a threshold level is assigned to each dyadic resolution level by the principle of minimizing the Stein Unbiased Estimate of Risk (SURE) for threshold estimates. The computational effort of the overall procedure is of order N log(N) as a function of the sample size N. SureShrink is smoothness-adaptive: if the unknown function contains jumps, the reconstruction (essentially) does also; if the unknown function has a smooth piece, the reconstruction is (essentially) as smooth as the mother wavelet will allow. The procedure is in a sense optimally smoothness-adaptive: it is near-minimax simultaneously over a whole interval of the Besov scale; the size of this interval depends on the choice of mother wavelet. We know from a previous paper by the authors that traditional smoothing methods (kernels, splines, and orthogonal series estimates), even with optimal choices of the smoothing parameter, would be unable to perform ...
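The abstract's key step, choosing a per-level threshold by minimizing SURE, is compact enough to state in code. Here is a minimal NumPy sketch (my own illustration, not the authors' implementation; the function name is hypothetical and it omits SureShrink's hybrid fallback to the universal threshold on very sparse levels):

```python
import numpy as np

def sure_soft_threshold(coeffs, sigma=1.0):
    """Soft-threshold one dyadic level of empirical wavelet
    coefficients at the level minimizing Stein's Unbiased Risk
    Estimate (SURE). Sketch only: SureShrink proper also falls
    back to the universal threshold when the level is sparse."""
    x = np.asarray(coeffs, dtype=float) / sigma   # unit-noise scale
    n = x.size
    cand = np.sort(np.abs(x))                     # candidate thresholds
    sq = cand ** 2
    cum = np.cumsum(sq)                           # sum of k smallest squares
    k = np.arange(1, n + 1)
    # SURE(t) = n - 2*#{|x_i| <= t} + sum_i min(x_i^2, t^2)
    risk = n - 2 * k + cum + sq * (n - k)
    t = cand[np.argmin(risk)]
    shrunk = np.sign(x) * np.maximum(np.abs(x) - t, 0.0)
    return shrunk * sigma, t * sigma              # back to original scale
```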
Factoring wavelet transforms into lifting steps
J. Fourier Anal. Appl., 1998
"... ABSTRACT. This paper is essentially tutorial in nature. We show how any discrete wavelet transform or two band subband filtering with finite filters can be decomposed into a finite sequence of simple filtering steps, which we call lifting steps but that are also known as ladder structures. This dec ..."
Abstract

Cited by 573 (8 self)
 Add to MetaCart
(Show Context)
This paper is essentially tutorial in nature. We show how any discrete wavelet transform or two-band subband filtering with finite filters can be decomposed into a finite sequence of simple filtering steps, which we call lifting steps but which are also known as ladder structures. This decomposition corresponds to a factorization of the polyphase matrix of the wavelet or subband filters into elementary matrices. That such a factorization is possible is well known to algebraists (and expressed by the formula SL(n; R[z, z^{-1}]) = E(n; R[z, z^{-1}])); it is also used in linear systems theory in the electrical engineering community. We present here a self-contained derivation, building the decomposition from basic principles such as the Euclidean algorithm, with a focus on applying it to wavelet filtering. This factorization provides an alternative to the lattice factorization, with the advantage that it can also be used in the biorthogonal, i.e., non-unitary case. Like the lattice factorization, the decomposition presented here asymptotically reduces the computational complexity of the transform by a factor of two. It has other applications, such as the possibility of defining a wavelet-like transform that maps integers to integers.
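To make the factorization concrete, here is a minimal sketch of the simplest case, the unnormalized Haar filter, whose polyphase matrix factors into one predict step and one update step; the names and the omitted scaling step are my own simplifications, not code from the paper:

```python
import numpy as np

def haar_lift_forward(x):
    """One level of the Haar transform as two lifting steps
    (normalization omitted): even samples become local means,
    odd samples become differences."""
    x = np.asarray(x, dtype=float)
    s, d = x[0::2].copy(), x[1::2].copy()  # split into even/odd phases
    d -= s                                 # predict: detail = odd - even
    s += d / 2                             # update: coarse = pairwise mean
    return s, d

def haar_lift_inverse(s, d):
    """Invert by running the lifting steps backwards with flipped
    signs -- the structural reason lifting gives the inverse for free."""
    s = s - d / 2
    d = d + s
    x = np.empty(s.size + d.size)
    x[0::2], x[1::2] = s, d
    return x
```

Each step stays invertible no matter what filter it applies, which is also what permits the integer-to-integer variant mentioned at the end of the abstract: round each predict/update output before adding it.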
The Lifting Scheme: A Construction Of Second Generation Wavelets
1997
"... . We present the lifting scheme, a simple construction of second generation wavelets, wavelets that are not necessarily translates and dilates of one fixed function. Such wavelets can be adapted to intervals, domains, surfaces, weights, and irregular samples. We show how the lifting scheme leads to ..."
Abstract

Cited by 541 (16 self)
 Add to MetaCart
(Show Context)
We present the lifting scheme, a simple construction of second generation wavelets: wavelets that are not necessarily translates and dilates of one fixed function. Such wavelets can be adapted to intervals, domains, surfaces, weights, and irregular samples. We show how the lifting scheme leads to a faster, in-place calculation of the wavelet transform. Several examples are included. Key words: wavelet, multiresolution, second generation wavelet, lifting scheme. AMS subject classifications: 42C15. 1. Introduction. Wavelets form a versatile tool for representing general functions or data sets. Essentially, we can think of them as data building blocks. Their fundamental property is that they allow for representations which are efficient and which can be computed fast. In other words, wavelets are capable of quickly capturing the essence of a data set with only a small set of coefficients. This is based on the fact that most data sets have correlation both in time (or space) and frequency...
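A sketch of the in-place property the abstract mentions, using the linear-interpolating (CDF 5/3) predict/update pair; the edge handling and function name are my own assumptions:

```python
def lift53_inplace(x):
    """One in-place level of the linear-interpolating (CDF 5/3)
    lifting transform on a mutable float sequence: odd slots are
    overwritten with details, even slots with coarse samples, so
    no auxiliary array is needed."""
    n = len(x)
    for i in range(1, n, 2):               # predict: odd -= interpolation
        right = x[i + 1] if i + 1 < n else x[i - 1]   # symmetric edge
        x[i] -= (x[i - 1] + right) / 2
    for i in range(0, n, 2):               # update: even += correction
        left = x[i - 1] if i > 0 else x[i + 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] += (left + right) / 4
    return x
```

Running the two loops in reverse order with flipped signs inverts the transform exactly; invertibility never depends on the predict/update weights, which is why the construction extends to irregular samples and general domains.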
Splines: A Perfect Fit for Signal/Image Processing
IEEE Signal Processing Magazine, 1999
"... ..."
(Show Context)
Minimax Estimation via Wavelet Shrinkage
1992
"... We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coe cients. The shrinkage can be tuned to be nearly minim ..."
Abstract

Cited by 322 (32 self)
 Add to MetaCart
We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel- and Besov-type smoothness constraints, and asymptotically minimax over Besov bodies with p ≤ q. Linear estimates cannot achieve even the minimax rates over Triebel and Besov classes with p < 2, so our method can significantly outperform every linear method (kernel, smoothing spline, sieve, ...) in a minimax sense. Variants of our method based on simple threshold nonlinearities are nearly minimax. Our method possesses the interpretation of spatial adaptivity: it reconstructs using a kernel which may vary in shape and bandwidth from point to point, depending on the data. Least favorable distributions for certain of the Triebel and Besov scales generate objects with sparse wavelet transforms. Many real objects have similarly sparse transforms, which suggests that these minimax results are relevant for practical problems. Sequels to this paper discuss practical implementation, spatial adaptation properties, and applications to inverse problems.
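For readers who want to try the idea, a minimal sketch with PyWavelets (an assumed dependency; the universal threshold and MAD noise estimate below are the standard practical recipe, not the paper's minimax-calibrated thresholds):

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

def wavelet_shrink(y, wavelet="db4"):
    """Denoise a sampled signal by soft-shrinking its empirical
    wavelet coefficients (universal threshold, MAD noise estimate)."""
    coeffs = pywt.wavedec(y, wavelet)
    # Robust noise estimate from the finest-scale details
    # (0.6745 = median absolute deviation of a standard normal).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    t = sigma * np.sqrt(2 * np.log(len(y)))
    shrunk = [coeffs[0]] + [pywt.threshold(c, t, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)
```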
Texture analysis and classification with tree-structured wavelet transform
IEEE Trans. Image Processing, 1993
"... AbstractOne difficulty of texture analysis in the past was the lack of adequate tools to characterize different scales of textures effectively. Recent developments in multiresolution analysis such as the Gabor and wavelet transforms help to overcome this difficulty. In this research, we propose a m ..."
Abstract

Cited by 314 (1 self)
 Add to MetaCart
(Show Context)
One difficulty of texture analysis in the past was the lack of adequate tools to characterize different scales of textures effectively. Recent developments in multiresolution analysis, such as the Gabor and wavelet transforms, help to overcome this difficulty. In this research, we propose a multiresolution approach based on a modified wavelet transform, called the tree-structured wavelet transform or wavelet packets, for texture analysis and classification. The development of this new transform is motivated by the observation that a large class of natural textures can be modeled as quasi-periodic signals whose dominant frequencies are located in the middle frequency channels. With the transform, we are able to zoom into any desired frequency channels for further decomposition. In contrast, the conventional pyramid-structured wavelet transform performs further decomposition only in the low frequency channels. We develop a progressive texture classification algorithm which is not only computationally attractive but also has excellent performance. The performance of our new method is compared with that of several other methods using the DCT, DST, DHT, pyramid-structured wavelet transforms, Gabor filters, and Laws filters.
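The "zoom into energetic channels" idea reduces to a short recursion. A sketch using PyWavelets, with a sibling-energy stopping rule that approximates, but is not identical to, the paper's criterion (the names and the `ratio` constant are my assumptions):

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

def tree_wavelet(img, wavelet="haar", max_level=3, ratio=0.1):
    """Tree-structured (wavelet packet) decomposition: keep splitting
    any subband whose mean energy is within `ratio` of its most
    energetic sibling, so dominant mid-frequency channels of a
    quasi-periodic texture get refined, not just the lowpass band."""
    leaves = []

    def split(band, level, path):
        if level == max_level:
            leaves.append((path, band))
            return
        cA, (cH, cV, cD) = pywt.dwt2(band, wavelet)
        subs = {"a": cA, "h": cH, "v": cV, "d": cD}
        energy = {k: float(np.mean(s ** 2)) for k, s in subs.items()}
        emax = max(energy.values())
        for k, s in subs.items():
            if energy[k] >= ratio * emax:
                split(s, level + 1, path + k)   # energetic: refine
            else:
                leaves.append((path + k, s))    # quiet: stop here

    split(np.asarray(img, dtype=float), 0, "")
    return leaves  # (path, coefficients); leaf energies form the feature vector
```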
Geodesic Active Regions and Level Set Methods for Supervised Texture Segmentation
International Journal of Computer Vision, 2002
"... This paper presents a novel variational framework to deal with frame partition problems in Computer Vision. This framework exploits boundary and regionbased segmentation modules under a curvebased optimization objective function. The task of supervised texture segmentation is considered to demonst ..."
Abstract

Cited by 309 (9 self)
 Add to MetaCart
This paper presents a novel variational framework to deal with frame partition problems in computer vision. This framework exploits boundary- and region-based segmentation modules under a curve-based optimization objective function. The task of supervised texture segmentation is considered to demonstrate the potential of the proposed framework. The texture feature space is generated by filtering the given textured images using isotropic and anisotropic filters and analyzing their responses as multi-component conditional probability density functions. The texture segmentation is obtained by unifying region- and boundary-based information in an improved Geodesic Active Contour model. The defined objective function is minimized using a gradient-descent method, where a level set approach is used to implement the resulting PDE. According to this PDE, the curve propagation towards the final solution is guided by boundary- and region-based segmentation forces and is constrained by a regularity force. The level set implementation is performed using a fast front propagation algorithm where topological changes are naturally handled. The performance of our method is demonstrated on a variety of synthetic and real textured frames.
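The curve propagation described here can be illustrated with a drastically simplified two-region level set update; the texture log-likelihood maps are assumed given, and the boundary (geodesic) force of the full model is omitted, so this is a sketch of the region-plus-regularity part only:

```python
import numpy as np

def evolve_phi(phi, log_p_in, log_p_out, nu=0.2, dt=0.5, iters=200):
    """Evolve a level set function: the region force is the texture
    log-likelihood ratio, the regularity force is curvature scaled
    by `nu`; the zero level set of `phi` is the moving interface."""
    for _ in range(iters):
        gy, gx = np.gradient(phi)                 # axis 0 = y, axis 1 = x
        norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
        # Curvature: kappa = div(grad(phi) / |grad(phi)|)
        kappa = np.gradient(gy / norm)[0] + np.gradient(gx / norm)[1]
        force = (log_p_in - log_p_out) + nu * kappa
        phi = phi + dt * force * norm             # advect along the normal
    return phi
```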
Nonlinear solution of linear inverse problems by wavelet-vaguelette decomposition
1992
"... We describe the WaveletVaguelette Decomposition (WVD) of a linear inverse problem. It is a substitute for the singular value decomposition (SVD) of an inverse problem, and it exists for a class of special inverse problems of homogeneous type { such asnumerical di erentiation, inversion of Abeltype ..."
Abstract

Cited by 248 (12 self)
 Add to MetaCart
(Show Context)
We describe the Wavelet-Vaguelette Decomposition (WVD) of a linear inverse problem. It is a substitute for the singular value decomposition (SVD) of an inverse problem, and it exists for a class of special inverse problems of homogeneous type, such as numerical differentiation, inversion of Abel-type transforms, certain convolution transforms, and the Radon transform. We propose to solve ill-posed linear inverse problems by nonlinearly "shrinking" the WVD coefficients of the noisy, indirect data. Our approach offers significant advantages over traditional SVD inversion in the case of recovering spatially inhomogeneous objects. We suppose that observations are contaminated by white noise and that the object is an unknown element of a Besov space. We prove that nonlinear WVD shrinkage can be tuned to attain the minimax rate of convergence, for L^2 loss, over the entire Besov scale. The important case of Besov spaces B_{p,q} with p < 2, which model spatial inhomogeneity, is included. In comparison, linear procedures (SVD included) cannot attain optimal rates of convergence over such classes in the case p < 2. For example, our methods achieve faster rates of convergence, for objects known to lie in the Bump Algebra or in Bounded Variation, than any linear procedure.
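Schematically, WVD shrinkage differs from ordinary wavelet denoising only in that the effective noise level grows with scale according to the operator's degree of homogeneity. A loose sketch assuming PyWavelets (the constants and the coarse-to-fine indexing convention are my own simplifications, not the paper's calibration):

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

def wvd_shrink(coeffs, sigma, beta, n):
    """Level-dependent soft shrinkage of vaguelette-corrected
    coefficients. `coeffs` is a pywt-style list [cA, cD_coarse, ...,
    cD_fine]; for an operator homogeneous of degree `beta`, inversion
    amplifies the noise at scale j roughly like 2**(j*beta), so the
    universal threshold is inflated level by level."""
    out = [coeffs[0]]                             # keep the coarse band
    for j, c in enumerate(coeffs[1:], start=1):   # coarse -> fine
        t = sigma * 2 ** (j * beta) * np.sqrt(2 * np.log(n))
        out.append(pywt.threshold(c, t, mode="soft"))
    return out
```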
Filters, Random Fields and Maximum Entropy ...
International Journal of Computer Vision, 1998
"... This article presents a statistical theory for texture modeling. This theory combines filtering theory and Markov random field modeling through the maximum entropy principle, and interprets and clarifies many previous concepts and methods for texture analysis and synthesis from a unified point of vi ..."
Abstract

Cited by 233 (17 self)
 Add to MetaCart
(Show Context)
This article presents a statistical theory for texture modeling. This theory combines filtering theory and Markov random field modeling through the maximum entropy principle, and it interprets and clarifies many previous concepts and methods for texture analysis and synthesis from a unified point of view. Our theory characterizes the ensemble of images I with the same texture appearance by a probability distribution f(I) on a random field, and the objective of texture modeling is to make inference about f(I), given a set of observed texture examples. In our theory, texture modeling consists of two steps. (1) A set of filters is selected from a general filter bank to capture features of the texture; these filters are applied to the observed texture images, and the histograms of the filtered images are extracted. These histograms are estimates of the marginal distributions of f(I). This step is called feature extraction. (2) The maximum entropy principle is employed to derive a distribution p(I), which is restricted to have the same marginal distributions as those in (1). This p(I) is considered an estimate of f(I). This step is called feature fusion. A stepwise algorithm is proposed to choose filters from a general filter bank. The resulting model, called FRAME (Filters, Random fields And Maximum Entropy), is a Markov random field (MRF) model, but with a much enriched vocabulary and hence much stronger descriptive ability than previous MRF models used for texture modeling. A Gibbs sampler is adopted to synthesize texture images by drawing typical samples from p(I); the model is thus verified by checking whether the synthesized texture images have similar visual appearances.
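Step (1), feature extraction, is easy to sketch; the filter bank, bin range, and helper name below are placeholders, and the harder Gibbs-sampling synthesis of step (2) is omitted:

```python
import numpy as np
from scipy.ndimage import convolve  # SciPy, assumed installed

def filter_histograms(img, filters, bins=15, rng=(-1.0, 1.0)):
    """FRAME-style feature extraction: convolve the texture with each
    filter in the bank and record the normalized histogram of the
    response -- an estimate of one marginal distribution of f(I)."""
    feats = []
    for kernel in filters:
        resp = convolve(img, kernel, mode="wrap")   # toroidal boundary
        h, _ = np.histogram(resp, bins=bins, range=rng, density=True)
        feats.append(h)
    return np.array(feats)   # one row of texture statistics per filter
```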
Minimax Entropy Principle and Its Application to Texture Modeling
1997
"... This article proposes a general theory and methodology, called the minimax entropy principle, for building statistical models for images (or signals) in a variety of applications. This principle consists of two parts. The first is the maximum entropy principle for feature binding (or fusion): for a ..."
Abstract

Cited by 226 (46 self)
 Add to MetaCart
This article proposes a general theory and methodology, called the minimax entropy principle, for building statistical models for images (or signals) in a variety of applications. This principle consists of two parts. The first is the maximum entropy principle for feature binding (or fusion): for a certain set of feature statistics, a distribution can be built to bind these feature statistics together by maximizing the entropy over all distributions that reproduce them. The second part is the minimum entropy principle for feature selection: among all plausible sets of feature statistics, we choose the set whose maximum entropy distribution has the minimum entropy. Computational and inferential issues in both parts are addressed; in particular, a feature pursuit procedure is proposed for approximately selecting the optimal set of features. The model complexity is restricted because of the sample variation in the observed feature statistics. The minimax entropy principle is applied to texture modeling, where a novel Markov random field (MRF) model, called FRAME (Filter, Random field, And Minimax Entropy), is derived, and encouraging results are obtained in experiments on a variety of texture images. The relationship between our theory and the mechanisms of neural computation is also discussed.
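A toy version of the feature pursuit step: pick the unused filter whose histogram under the current model deviates most from the observed one. The L1 distance below stands in for the paper's entropy-based gain, so treat this as a heuristic sketch, not the published criterion:

```python
import numpy as np

def pursue_feature(obs_hists, model_hists, chosen):
    """Greedy feature selection: rows index filters; return the
    unchosen filter with the largest histogram mismatch, since
    matching it promises the largest entropy decrease."""
    gains = np.abs(obs_hists - model_hists).sum(axis=1)
    gains[list(chosen)] = -np.inf      # mask filters already selected
    best = int(np.argmax(gains))
    return best, float(gains[best])
```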