Results 1-10 of 36
Sparse Geometric Image Representations with Bandelets
, 2004
"... This paper introduces a new class of bases, called bandelet bases, which decompose the image along multiscale vectors that are elongated in the direction of a geometric flow. This geometric flow indicates directions in which the image grey levels have regular variations. The image decomposition in ..."
Abstract

Cited by 179 (4 self)
 Add to MetaCart
This paper introduces a new class of bases, called bandelet bases, which decompose the image along multiscale vectors that are elongated in the direction of a geometric flow. This geometric flow indicates directions in which the image grey levels have regular variations. The image decomposition in a bandelet basis is implemented with a fast subband filtering algorithm. Bandelet bases lead to optimal approximation rates for geometrically regular images. For image compression and noise removal applications, the geometric flow is optimized with fast algorithms, so that the resulting bandelet basis produces a minimum distortion. Comparisons are made with wavelet image compression and noise removal algorithms.
Multiresolution Markov Models for Signal and Image Processing
 Proceedings of the IEEE
, 2002
"... This paper reviews a significant component of the rich field of statistical multiresolution (MR) modeling and processing. These MR methods have found application and permeated the literature of a widely scattered set of disciplines, and one of our principal objectives is to present a single, coheren ..."
Abstract

Cited by 141 (18 self)
 Add to MetaCart
(Show Context)
This paper reviews a significant component of the rich field of statistical multiresolution (MR) modeling and processing. These MR methods have found application and permeated the literature of a widely scattered set of disciplines, and one of our principal objectives is to present a single, coherent picture of this framework. A second goal is to describe how this topic fits into the even larger field of MR methods and concepts, in particular making ties to topics such as wavelets and multigrid methods. A third is to provide several alternate viewpoints for this body of work, as the methods and concepts we describe intersect with a number of other fields. The principal focus of our presentation is the class of MR Markov processes defined on pyramidally organized trees. The attractiveness of these models stems from both the very efficient algorithms they admit and their expressive power and broad applicability. We show how a variety of methods and models relate to this framework, including models for self-similar and 1/f processes. We also illustrate how these methods have been used in practice. We discuss the construction of MR models on trees and show how questions that arise in this context make contact with wavelets, state space modeling of time series, system and parameter identification, and hidden ...
Platelets: A Multiscale Approach for Recovering Edges and Surfaces in Photon-Limited Medical Imaging
 IEEE TRANSACTIONS ON MEDICAL IMAGING
, 2003
"... The nonparametric multiscale platelet algorithms presented in this paper, unlike traditional waveletbased methods, are both well suited to photonlimited medical imaging applications involving Poisson data and capable of better approximating edge contours. This paper introduces platelets, localized ..."
Abstract

Cited by 85 (19 self)
 Add to MetaCart
The nonparametric multiscale platelet algorithms presented in this paper, unlike traditional wavelet-based methods, are both well suited to photon-limited medical imaging applications involving Poisson data and capable of better approximating edge contours. This paper introduces platelets, localized functions at various scales, locations, and orientations that produce piecewise linear image approximations, and a new multiscale image decomposition based on these functions. Platelets are well suited for approximating images consisting of smooth regions separated by smooth boundaries. For smoothness measured in certain Hölder classes, it is shown that the error of m-term platelet approximations can decay significantly faster than that of m-term approximations in terms of sinusoids, wavelets, or wedgelets. This suggests that platelets may outperform existing techniques for image denoising and reconstruction. Fast, platelet-based, maximum penalized likelihood methods for photon-limited image denoising, deblurring and tomographic reconstruction problems are developed. Because platelet decompositions of Poisson distributed images are tractable and computationally efficient, existing image reconstruction methods based on expectation-maximization type algorithms can be easily enhanced with platelet techniques. Experimental results suggest that platelet-based methods can outperform standard reconstruction methods currently in use in confocal microscopy, image restoration, and emission tomography.
Wavelet-Based Image Estimation: An Empirical Bayes Approach Using Jeffreys' Noninformative Prior
, 2001
"... The sparseness and decorrelation properties of the discrete wavelet transform have been exploited to develop powerful denoising methods. However, most of these methods have free parameters which have to be adjusted or estimated. In this paper, we propose a waveletbased denoising technique without a ..."
Abstract

Cited by 66 (10 self)
 Add to MetaCart
The sparseness and decorrelation properties of the discrete wavelet transform have been exploited to develop powerful denoising methods. However, most of these methods have free parameters which have to be adjusted or estimated. In this paper, we propose a wavelet-based denoising technique without any free parameters; it is, in this sense, a "universal" method. Our approach uses empirical Bayes estimation based on a Jeffreys' noninformative prior; it is a step toward objective Bayesian wavelet-based denoising. The result is a remarkably simple fixed nonlinear shrinkage/thresholding rule which performs better than other more computationally demanding methods.
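The fixed shrinkage rule this abstract refers to is commonly quoted as theta_hat = (y^2 - 3*sigma^2)_+ / y; assuming that form (the abstract itself does not state the formula), a minimal numpy sketch with illustrative names:

```python
import numpy as np

def abe_shrink(coeffs, sigma):
    """Shrink each wavelet coefficient y by (y^2 - 3*sigma^2)_+ / y.
    Assumed form of the parameter-free empirical-Bayes rule; the abstract
    does not give the formula, so treat this as a sketch."""
    num = np.maximum(coeffs ** 2 - 3.0 * sigma ** 2, 0.0)
    safe = np.where(coeffs == 0.0, 1.0, coeffs)  # guard against 0/0
    return np.where(coeffs == 0.0, 0.0, num / safe)
```

Coefficients with |y| <= sqrt(3)*sigma are zeroed, while large coefficients are barely shrunk, so the rule interpolates between hard and soft thresholding without any tunable parameter.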
MDL Denoising
 IEEE Transactions on Information Theory
, 1999
"... The socalled denoising problem, relative to normal models for noise, is formalized such that `noise' is defined as the incompressible part in the data while the compressible part defines the meaningful information bearing signal. Such a decomposition is effected by minimization of the ideal ..."
Abstract

Cited by 59 (10 self)
 Add to MetaCart
(Show Context)
The so-called denoising problem, relative to normal models for noise, is formalized such that 'noise' is defined as the incompressible part in the data while the compressible part defines the meaningful information-bearing signal. Such a decomposition is effected by minimization of the ideal code length, called for by the Minimum Description Length (MDL) principle, and obtained by an application of the normalized maximum likelihood technique to the primary parameters, their range, and their number. For any orthonormal regression matrix, such as defined by wavelet transforms, the minimization can be done with a threshold for the squared coefficients resulting from the expansion of the data sequence in the basis vectors defined by the matrix.
Keywords: linear regression, wavelet transforms, threshold, stochastic complexity, Kolmogorov sufficient statistics
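The threshold on squared coefficients can be illustrated with a simplified two-part code-length criterion: encode the k retained coefficients as signal and the rest as noise, and pick the k minimizing total description length. This is a rough stand-in for the normalized-maximum-likelihood computation, which the abstract does not spell out; names are illustrative.

```python
import numpy as np

def mdl_denoise(coeffs):
    """Keep the k largest-magnitude coefficients, choosing k to minimize a
    simplified two-part code length (signal energy per retained coefficient,
    noise energy per discarded one). A sketch, not Rissanen's exact NML."""
    c2 = np.sort(coeffs ** 2)[::-1]   # squared coefficients, descending
    n = len(c2)
    cum = np.cumsum(c2)
    total = cum[-1]
    best_k, best_len = 0, np.inf
    for k in range(1, n):
        s_sig = cum[k - 1]            # energy of the k retained coefficients
        s_noise = total - s_sig       # energy of the n-k discarded ones
        if s_noise <= 0:
            break
        code_len = 0.5 * ((n - k) * np.log(s_noise / (n - k))
                          + k * np.log(s_sig / k)
                          + np.log(k * (n - k)))
        if code_len < best_len:
            best_len, best_k = code_len, k
    if best_k == 0:
        return coeffs.copy()
    thresh = c2[best_k - 1]           # threshold on the squared value
    return np.where(coeffs ** 2 >= thresh, coeffs, 0.0)
```

Applied to the coefficients of an orthonormal wavelet expansion, this yields a hard threshold with no user-set noise level: the split point itself is chosen by the code-length criterion.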
A Statistical Multiscale Framework for Poisson Inverse Problems
, 2000
"... This paper describes a statistical modeling and analysis method for linear inverse problems involving Poisson data based on a novel multiscale framework. The framework itself is founded upon a multiscale analysis associated with recursive partitioning of the underlying intensity, a corresponding ..."
Abstract

Cited by 42 (4 self)
 Add to MetaCart
This paper describes a statistical modeling and analysis method for linear inverse problems involving Poisson data based on a novel multiscale framework. The framework itself is founded upon a multiscale analysis associated with recursive partitioning of the underlying intensity, a corresponding multiscale factorization of the likelihood (induced by this analysis), and a choice of prior probability distribution made to match this factorization by modeling the "splits" in the underlying partition. The class of priors used here has the interesting feature that the "noninformative" member yields the traditional maximum likelihood solution; other choices are made to reflect prior belief as to the smoothness of the unknown intensity. Adopting the expectation-maximization (EM) algorithm for use in computing the MAP estimate corresponding to our model, we find that our model permits remarkably simple, closed-form expressions for the EM update equations. The behavior of our EM algorithm ...
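In the noninformative case the MAP estimate reduces to maximum likelihood, and the classical EM iteration for a Poisson linear model y ~ Poisson(Ax) is the multiplicative Richardson-Lucy update. A self-contained sketch of that baseline (the paper's multiscale prior is not modeled here):

```python
import numpy as np

def em_poisson_ml(y, A, n_iter=200):
    """EM (Richardson-Lucy) iterations for y ~ Poisson(A @ x): multiply the
    current nonnegative estimate by the back-projected data/prediction ratio.
    This is the ML special case the abstract's noninformative prior recovers."""
    x = np.ones(A.shape[1])
    col_sums = A.sum(axis=0)
    for _ in range(n_iter):
        pred = A @ x                             # current predicted intensity
        ratio = np.where(pred > 0, y / pred, 0)  # data / prediction
        x = x * (A.T @ ratio) / col_sums         # multiplicative EM update
    return x
```

The update keeps the estimate nonnegative automatically, which is why EM-type algorithms are a natural fit for Poisson intensity estimation.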
Complexity-Regularized Image Denoising
, 1997
"... We introduce a new complexity regularization method for image denoising and explore the use of sophisticated complexity penalties. We have found improvements of the order of 2 dB in reconstructed image meansquared error over existing complexityregularized estimators. 1. INTRODUCTION In a numbe ..."
Abstract

Cited by 34 (4 self)
 Add to MetaCart
(Show Context)
We introduce a new complexity regularization method for image denoising and explore the use of sophisticated complexity penalties. We have found improvements of the order of 2 dB in reconstructed image mean-squared error over existing complexity-regularized estimators.
1. INTRODUCTION
In a number of applications, image data are corrupted by additive noise [1]. The purpose of image denoising is to reduce noise while preserving visually significant image components. This may be done effectively if suitable a priori image models are available. Images may then be denoised by application of fundamental statistical estimation principles, e.g., using Bayesian estimation or regularized maximum-likelihood (ML) estimation, depending on the type of a priori information available. In this paper, we explore the use of complexity regularization as a particular regularization technique. This method has received considerable attention in the statistical estimation community in the last 5-7 years ...
Image Denoising: A Nonlinear Robust Statistical Approach
, 2001
"... Nonlinear filtering techniques based on the theory of robust estimation are introduced. Some deterministic and asymptotic properties are derived. The proposed denoising methods are optimal over the Hubercontaminated normal neighborhood and are highly resistant to outliers. Experimental results show ..."
Abstract

Cited by 31 (5 self)
 Add to MetaCart
Nonlinear filtering techniques based on the theory of robust estimation are introduced. Some deterministic and asymptotic properties are derived. The proposed denoising methods are optimal over the Huber-contaminated normal neighborhood and are highly resistant to outliers. Experimental results showing a much improved performance of the proposed filters in the presence of Gaussian and heavy-tailed noise are analyzed and illustrated.
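One standard realization of such a robust filter (not necessarily the paper's exact construction) replaces each sample with a Huber M-estimate of location over a sliding window, computed by iteratively reweighted means; all names below are illustrative.

```python
import numpy as np

def huber_location(x, k=1.345, n_iter=20):
    """Huber M-estimate of location: an iteratively reweighted mean that
    down-weights samples lying more than k robust-scale units away."""
    mu = np.median(x)
    scale = np.median(np.abs(x - mu)) / 0.6745 + 1e-12  # MAD-based scale
    for _ in range(n_iter):
        r = (x - mu) / scale
        w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))  # Huber weights
        mu = np.sum(w * x) / np.sum(w)
    return mu

def robust_denoise(signal, half_window=2):
    """Sliding-window filter: each output sample is the Huber location
    estimate of its neighborhood, so isolated outliers are suppressed."""
    out = np.empty(len(signal))
    for i in range(len(signal)):
        lo = max(0, i - half_window)
        hi = min(len(signal), i + half_window + 1)
        out[i] = huber_location(signal[lo:hi])
    return out
```

Unlike a plain moving average, the Huber weights cap the influence of any single sample, which is what makes the filter resistant to heavy-tailed noise.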
Statistical imaging and complexity regularization
 IEEE Trans. Inf. Theory
, 2000
"... Abstract — We apply the complexity–regularization principle to statistical illposed inverse problems in imaging. We formulate a natural distortion measure in image space and develop nonasymptotic bounds on estimation performance in terms of an index of resolvability that characterizes the compressi ..."
Abstract

Cited by 21 (3 self)
 Add to MetaCart
(Show Context)
Abstract — We apply the complexity-regularization principle to statistical ill-posed inverse problems in imaging. We formulate a natural distortion measure in image space and develop nonasymptotic bounds on estimation performance in terms of an index of resolvability that characterizes the compressibility of the true image. These bounds extend previous results that were obtained in the literature under simpler observational models.
I. Statement of the Problem
A variety of imaging problems involve estimation of an image from noisy, degraded observations [1, 2]. Examples include tomography, astronomical imaging, ultrasound imaging, radar imaging, forensic science, and restoration of old movies. In some of these problems, a statistical model relating the observations ...
Unifying Probabilistic and Variational Estimation
, 2002
"... this article, we present a variational approach to MAP estimation with a more qualitative and tutorial emphasis. The key idea behind this approach is to use geometric insight in helping construct regularizing functionals and avoiding a subjective choice of a prior in MAP estimation. Using tools from ..."
Abstract

Cited by 13 (0 self)
 Add to MetaCart
In this article, we present a variational approach to MAP estimation with a more qualitative and tutorial emphasis. The key idea behind this approach is to use geometric insight in helping construct regularizing functionals and avoiding a subjective choice of a prior in MAP estimation. Using tools from robust statistics and information theory, we show that we can extend this strategy and develop two gradient descent flows for image denoising with a demonstrated performance ...