Results 1–10 of 36
Wavelet-Based Texture Retrieval Using Generalized Gaussian Density and Kullback-Leibler Distance
 IEEE Trans. Image Processing
, 2002
Abstract

Cited by 145 (4 self)
We present a statistical view of the texture retrieval problem by combining the two related tasks, namely feature extraction (FE) and similarity measurement (SM), into a joint modeling and classification scheme. We show that using a consistent estimator of texture model parameters for the FE step, followed by computing the Kullback-Leibler distance (KLD) between estimated models for the SM step, is asymptotically optimal in terms of retrieval error probability. The statistical scheme leads to a new wavelet-based texture retrieval method that is based on the accurate modeling of the marginal distribution of wavelet coefficients using a generalized Gaussian density (GGD) and on the existence of a closed form for the KLD between GGDs. The proposed method provides greater accuracy and flexibility in capturing texture information, while its simplified form closely resembles existing methods that use energy distribution in the frequency domain to identify textures. Experimental results on a database of 640 texture images indicate that the new method significantly improves retrieval rates, e.g., from 65% to 77%, compared with traditional approaches, while retaining comparable computational complexity.
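The closed form for the KLD between two GGDs that the method exploits can be sketched directly. The snippet below is a minimal illustration, not the authors' code; parameter names are mine, with `alpha` the scale and `beta` the shape of p(x) ∝ exp(−(|x|/alpha)^beta):

```python
import math

def ggd_kld(alpha1, beta1, alpha2, beta2):
    """Closed-form Kullback-Leibler distance D(p1 || p2) between two
    generalized Gaussian densities p_i(x) ~ exp(-(|x|/alpha_i)**beta_i).
    Uses lgamma throughout for numerical stability."""
    lg = math.lgamma
    return (math.log(beta1 / beta2) + math.log(alpha2 / alpha1)
            + lg(1.0 / beta2) - lg(1.0 / beta1)
            + (alpha1 / alpha2) ** beta2
            * math.exp(lg((beta2 + 1.0) / beta1) - lg(1.0 / beta1))
            - 1.0 / beta1)
```

The distance vanishes between identical models and is positive otherwise, which is what makes it usable as a retrieval similarity measure; note that it is not symmetric in its two arguments.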
Bivariate Shrinkage Functions for Wavelet-Based Denoising Exploiting Interscale Dependency
, 2002
Abstract

Cited by 135 (4 self)
Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, wavelet coefficients of natural images have significant dependencies. In this paper, we consider in detail only the dependencies between coefficients and their parents. For this purpose, new non-Gaussian bivariate distributions are proposed, and corresponding nonlinear threshold functions (shrinkage functions) are derived from the models using Bayesian estimation theory. The new shrinkage functions do not assume the independence of wavelet coefficients. We present three image-denoising examples to demonstrate the performance of these new bivariate shrinkage rules. In the second example, a simple subband-dependent, data-driven image-denoising system is described and compared with effective data-driven techniques in the literature, namely VisuShrink, SureShrink, BayesShrink, and hidden Markov models. In the third example, the same idea is applied to the dual-tree complex wavelet coefficients.
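A widely cited instance of such a bivariate rule shrinks a coefficient by the joint magnitude of the coefficient and its parent. The sketch below assumes that form and is illustrative rather than a transcription of the paper:

```python
import math

def bivariate_shrink(y1, y2, sigma_n, sigma):
    """Bivariate MAP-style shrinkage of a noisy wavelet coefficient y1
    given its noisy parent y2 (a sketch, not the paper's code).
    sigma_n is the noise std, sigma the marginal signal std."""
    r = math.hypot(y1, y2)                 # joint child/parent magnitude
    # soft-threshold the joint magnitude, then rescale y1 accordingly
    gain = max(r - math.sqrt(3.0) * sigma_n ** 2 / sigma, 0.0) / max(r, 1e-12)
    return gain * y1
```

Because the threshold acts on the joint magnitude, a coefficient with a strong parent is shrunk less than an isolated one of the same size, which is exactly the interscale dependency the independent-coefficient rules ignore.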
Bivariate Shrinkage with Local Variance Estimation
, 2002
Abstract

Cited by 74 (5 self)
The performance of image-denoising algorithms using wavelet transforms can be improved significantly by taking into account the statistical dependencies among wavelet coefficients, as demonstrated by several algorithms presented in the literature. In two earlier papers by the authors, a simple bivariate shrinkage rule is described using a coefficient and its parent. The performance can also be improved using simple models by estimating the model parameters in a local neighborhood. This letter presents a locally adaptive denoising algorithm using the bivariate shrinkage function. The algorithm is illustrated using both the orthogonal and dual-tree complex wavelet transforms. Some comparisons with the best available results are given to illustrate the effectiveness of the proposed algorithm.
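The local parameter estimates such a scheme needs can be sketched as follows; the windowed second-moment estimate and the MAD-based noise estimate are common illustrative choices, not necessarily the letter's exact ones:

```python
import math

def local_signal_std(window, sigma_n):
    """Signal std in a local neighborhood of noisy wavelet coefficients:
    sigma^2 = max(mean(y^2) - sigma_n^2, 0), clipped at zero so flat
    regions get sigma = 0 (maximal shrinkage)."""
    m2 = sum(y * y for y in window) / len(window)
    return math.sqrt(max(m2 - sigma_n ** 2, 0.0))

def noise_std_mad(diag_subband):
    """Robust noise estimate from the finest diagonal subband via the
    median absolute deviation: sigma_n = median(|y|) / 0.6745."""
    a = sorted(abs(y) for y in diag_subband)
    n = len(a)
    med = a[n // 2] if n % 2 else 0.5 * (a[n // 2 - 1] + a[n // 2])
    return med / 0.6745
```

Plugging these local estimates into the bivariate shrinkage threshold makes the rule adapt to edges and textures, which is the source of the performance gain over global parameter estimates.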
Wavelets, Approximation, and Compression
, 2001
Abstract

Cited by 51 (6 self)
The aim of this article is to look at recent wavelet advances from a signal processing perspective. In particular, approximation results are reviewed, and their implications for compression algorithms are discussed. New constructions and open problems are also addressed.
Bayesian Compressed Sensing via Belief Propagation
, 2010
Abstract

Cited by 51 (12 self)
Compressive sensing (CS) is an emerging field based on the revelation that a small collection of linear projections of a sparse signal contains enough information for stable, sub-Nyquist signal acquisition. When a statistical characterization of the signal is available, Bayesian inference can complement conventional CS methods based on linear programming or greedy algorithms. We perform asymptotically optimal Bayesian inference using belief propagation (BP) decoding, which represents the CS encoding matrix as a graphical model. Fast computation is obtained by reducing the size of the graphical model with sparse encoding matrices. To decode a length-N signal containing K large coefficients, our CS-BP decoding algorithm uses O(K log(N)) measurements and O(N log^2(N)) computation. Finally, although we focus on a two-state mixture Gaussian model, CS-BP is easily adapted to other signal models.
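The sparsity of the encoding matrix is what keeps the factor graph, and hence BP decoding, cheap: each measurement node touches only a few signal nodes. A minimal sketch of such a matrix, with an assumed fixed column weight and ±1 entries (the paper's exact construction may differ, and BP decoding itself is omitted):

```python
import random

def sparse_encoding_matrix(m, n, col_weight=3, seed=0):
    """m-by-n CS encoding matrix with exactly col_weight random +/-1
    entries per column; low node degrees in the induced bipartite
    graph keep message-passing cost low."""
    rng = random.Random(seed)
    phi = [[0] * n for _ in range(m)]
    for j in range(n):
        for i in rng.sample(range(m), col_weight):
            phi[i][j] = rng.choice((-1, 1))
    return phi

def measure(phi, x):
    """y = Phi x; each signal entry influences only col_weight measurements."""
    return [sum(row[j] * x[j] for j in range(len(x))) for row in phi]
```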
Rotation Invariant Texture Characterization and Retrieval using Steerable Wavelet-Domain Hidden Markov Models
Abstract

Cited by 40 (4 self)
A new statistical model for characterizing texture images based on wavelet-domain hidden Markov models and steerable pyramids is presented. The new model is shown to capture well both the subband marginal distributions and the dependencies across scales and orientations of the wavelet descriptors. Once it is trained for an input texture image, the model can be easily steered to characterize that texture at any other orientation. After a diagonalization operation, one obtains a rotation-invariant model of the texture image. The effectiveness of the new texture models is demonstrated in retrieval experiments with large image databases, where significant performance gains are shown. Keywords: texture characterization, image retrieval, rotation invariance, wavelets, hidden Markov models, steerable pyramids.
Signal reconstruction using sparse tree representation
 in Proc. Wavelets XI at SPIE Optics and Photonics
, 2005
Abstract

Cited by 33 (2 self)
Recent studies in linear inverse problems have recognized the sparse representation of the unknown signal in a certain basis as useful and effective prior information for solving those problems. In many multiscale bases (e.g., wavelets), signals of interest (e.g., piecewise-smooth signals) not only have few significant coefficients, but those significant coefficients are also well-organized in trees. We propose to exploit the tree-structured sparse representation as additional prior information for linear inverse problems with limited numbers of measurements. We present numerical results showing that exploiting sparse tree representations leads to better reconstruction while requiring less time compared to methods that assume only sparse representations.
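The tree prior can be sketched as a support rule in which a coefficient survives thresholding only if its parent also survives, so the retained support is a rooted subtree. The heap-indexed sketch below is illustrative, not the authors' reconstruction algorithm:

```python
def tree_support(coeffs, thresh):
    """Hard-threshold wavelet coefficients, then prune any coefficient
    whose parent was pruned, so the surviving support forms a rooted
    tree.  Heap indexing: parent(i) = (i - 1) // 2, root at index 0."""
    keep = [abs(c) > thresh for c in coeffs]
    for i in range(1, len(coeffs)):
        keep[i] = keep[i] and keep[(i - 1) // 2]   # needs a surviving parent
    return [c if k else 0.0 for c, k in zip(coeffs, keep)]
```

Note that a large coefficient is discarded if it is an orphan (its parent fell below threshold); that is the extra structure, beyond plain sparsity, that the tree prior enforces.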
Wavelet Footprints: Theory, Algorithms, and Applications
 IEEE Trans. Signal Processing
, 2003
Abstract

Cited by 32 (5 self)
In recent years, wavelet-based algorithms have been successful in different signal processing tasks. The wavelet transform is a powerful tool because it manages to represent both transient and stationary behaviors of a signal with few transform coefficients. Discontinuities often carry relevant signal information, and therefore they represent a critical part to analyze. In this paper, we study the dependency across scales of the wavelet coefficients generated by discontinuities. We start by showing that any piecewise smooth signal can be expressed as a sum of a piecewise polynomial signal and a uniformly smooth residual (see Theorem 1 in Section II). We then introduce the notion of footprints, which are scale-space vectors that model discontinuities in piecewise polynomial signals exactly. We show that footprints form an overcomplete dictionary and develop efficient and robust algorithms to find the exact representation of a piecewise polynomial function in terms of footprints. This also leads to efficient approximation of piecewise smooth functions. Finally, we focus on applications and show that algorithms based on footprints outperform standard wavelet methods in different applications such as denoising, compression, and (non-blind) deconvolution. In the case of compression, we also prove that at high rates, footprint-based algorithms attain optimal performance (see Theorem 3 in Section V).
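The scale-space footprint of a discontinuity can be seen in a toy Haar example: a single step excites exactly one detail coefficient per scale, and that chain of coefficients is the footprint. A minimal sketch (unnormalized Haar transform; helper names are my own):

```python
def haar_details(x):
    """Unnormalized Haar analysis of a length-2^J signal: at each level,
    details d = (even - odd)/2 and averages a = (even + odd)/2; returns
    the list of detail sequences from fine to coarse."""
    levels = []
    while len(x) > 1:
        levels.append([(x[i] - x[i + 1]) / 2.0 for i in range(0, len(x), 2)])
        x = [(x[i] + x[i + 1]) / 2.0 for i in range(0, len(x), 2)]
    return levels

# A single step discontinuity (between samples 2 and 3) produces exactly
# one nonzero detail coefficient per scale -- its scale-space footprint.
step = [0.0] * 3 + [1.0] * 5
```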
Covariance structure of wavelet coefficients: theory and models in a Bayesian perspective
, 1999
Abstract

Cited by 24 (4 self)
We present theoretical results on the covariance structure of random wavelet coefficients. We use simple properties of the coefficients to derive a recursive way to compute the within- and across-scale covariances. We point out a useful link between the proposed algorithm and the two-dimensional discrete wavelet transform. We then focus on Bayesian wavelet shrinkage for estimating a function from noisy data. A prior distribution is imposed on the coefficients of the unknown function. We show how our findings on the covariance structure make it possible to specify priors that take into account the full correlation between coefficients through a parsimonious number of hyperparameters. We use Markov chain Monte Carlo methods to estimate the parameters and illustrate our method on benchmark simulated signals.
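The within- and across-scale covariances the paper computes recursively can at least be checked empirically. The Monte Carlo sketch below estimates one parent-child covariance of Haar coefficients for an AR(1) process; it is an illustrative check only, not the paper's recursion:

```python
import random

def haar_two_levels(x):
    """Two-level unnormalized Haar transform; returns (d1, d2) details."""
    d1 = [(x[i] - x[i + 1]) / 2.0 for i in range(0, len(x), 2)]
    a1 = [(x[i] + x[i + 1]) / 2.0 for i in range(0, len(x), 2)]
    d2 = [(a1[i] - a1[i + 1]) / 2.0 for i in range(0, len(a1), 2)]
    return d1, d2

def parent_child_cov(rho=0.9, n=4, trials=20000, seed=1):
    """Monte Carlo estimate of Cov(d2[0], d1[0]) -- a level-2 coefficient
    and its first level-1 child -- for an AR(1) process with correlation
    rho, started at zero (so both coefficients have zero mean)."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        x, v = [], 0.0
        for _ in range(n):
            v = rho * v + rng.gauss(0.0, 1.0)
            x.append(v)
        d1, d2 = haar_two_levels(x)
        acc += d2[0] * d1[0]
    return acc / trials
```

For a strongly correlated process the across-scale covariance is clearly nonzero, which is precisely why priors assuming independent coefficients are misspecified.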