Results 1–10 of 14
An efficient TV-L1 algorithm for deblurring multichannel images corrupted by impulsive noise
SIAM J. Sci. Comput., 2009
Abstract

Cited by 50 (8 self)
We extend the alternating minimization algorithm recently proposed in [38, 39] to the case of recovering blurry multichannel (color) images corrupted by impulsive rather than Gaussian noise. The algorithm minimizes the sum of a multichannel extension of total variation (TV), either isotropic or anisotropic, and a data fidelity term measured in the L1 norm. We derive the algorithm by applying the well-known quadratic penalty function technique and prove attractive convergence properties, including finite convergence for some variables and global q-linear convergence. Under periodic boundary conditions, the main computational requirements of the algorithm are fast Fourier transforms and a low-complexity Gaussian elimination procedure. Numerical results on images with different blurs and impulsive noise are presented to demonstrate the efficiency of the algorithm. In addition, it is numerically compared to an algorithm recently proposed in [20] that uses a linear program and an interior point method for recovering grayscale images.
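The splitting scheme this abstract describes can be made concrete with a short sketch. The code below is a single-channel, anisotropic-TV version under periodic boundary conditions; the penalty parameters mu, beta, gamma and all function names are illustrative choices, not the paper's notation, and the PSF is assumed to be normalized (sum to one).

```python
import numpy as np

def psf2otf(psf, shape):
    """Zero-pad the PSF to the image size and circularly shift its center
    to (0, 0) so FFT multiplication matches periodic convolution."""
    pad = np.zeros(shape)
    pad[:psf.shape[0], :psf.shape[1]] = psf
    for ax, s in enumerate(psf.shape):
        pad = np.roll(pad, -(s // 2), axis=ax)
    return np.fft.fft2(pad)

def shrink(x, t):
    """Soft-thresholding: closed-form solution of the L1 subproblems."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def tvl1_deblur(f, psf, mu=20.0, beta=50.0, gamma=50.0, n_iter=100):
    """Illustrative quadratic-penalty alternating minimization for
    min_u TV(u) + mu*||k*u - f||_1, periodic boundaries.
    Splitting variables w ~ grad(u) and z ~ k*u - f give shrinkage
    steps; the u-step is a linear solve diagonalized by the FFT."""
    K = psf2otf(psf, f.shape)
    Dx = psf2otf(np.array([[1.0, -1.0]]), f.shape)    # horizontal diff
    Dy = psf2otf(np.array([[1.0], [-1.0]]), f.shape)  # vertical diff
    denom = beta * (np.abs(Dx)**2 + np.abs(Dy)**2) + gamma * np.abs(K)**2
    u, F = f.copy(), np.fft.fft2(f)
    for _ in range(n_iter):
        U = np.fft.fft2(u)
        ux = np.real(np.fft.ifft2(Dx * U))
        uy = np.real(np.fft.ifft2(Dy * U))
        ku = np.real(np.fft.ifft2(K * U))
        # w- and z-subproblems: closed-form shrinkage.
        wx, wy = shrink(ux, 1.0 / beta), shrink(uy, 1.0 / beta)
        z = shrink(ku - f, mu / gamma)
        # u-subproblem: normal equations, solved in the Fourier domain.
        rhs = (beta * (np.conj(Dx) * np.fft.fft2(wx)
                       + np.conj(Dy) * np.fft.fft2(wy))
               + gamma * np.conj(K) * (np.fft.fft2(z) + F))
        u = np.real(np.fft.ifft2(rhs / denom))
    return u
```

Each iteration costs a handful of FFTs: the shrinkage steps are closed-form, and the u-update is a single FFT-diagonalized linear solve, which is what makes this family of methods fast.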
A fast algorithm for edge-preserving variational multichannel image restoration
Abstract

Cited by 45 (9 self)
We generalize the alternating minimization algorithm recently proposed in [32] to efficiently solve a general, edge-preserving, variational model for recovering multichannel images degraded by within- and cross-channel blurs, as well as additive Gaussian noise. This general model allows the use of localized weights and higher-order derivatives in regularization, and includes a multichannel extension of total variation (MTV) regularization as a special case. In the MTV case, we show that the model can be derived from an extended half-quadratic transform of Geman and Yang [14]. For color images with three channels and when applied to the MTV model (either locally weighted or not), the per-iteration computational complexity of this algorithm is dominated by nine fast Fourier transforms. We establish strong convergence results for the algorithm, including finite convergence for some variables and fast q-linear convergence for the others. Numerical results on various types of blurs are presented to demonstrate the performance of our algorithm compared to that of the MATLAB deblurring functions. We also present experimental results on regularization models using weighted MTV and higher-order derivatives to demonstrate improvements in image quality provided by these models over the plain MTV model.
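The channel coupling that distinguishes MTV from per-channel TV shows up in the shrinkage step of the alternating minimization: the threshold acts on the joint gradient norm across all channels, not on each channel separately. A minimal sketch of that step (the array layout and names are our own):

```python
import numpy as np

def mtv_shrink(g, t):
    """Isotropic multichannel shrinkage. g has shape (C, 2, H, W),
    holding the x/y differences of each of the C channels; the
    threshold t acts on the joint Euclidean norm across channels and
    directions, which is exactly what couples the color channels in
    MTV regularization (per-channel TV would shrink each separately)."""
    norm = np.sqrt((g**2).sum(axis=(0, 1), keepdims=True))
    scale = np.maximum(norm - t, 0.0) / np.maximum(norm, 1e-12)
    return g * scale
```

Because edges are shrunk jointly, an edge present in all three channels survives thresholding more easily than channel-independent noise, which is the qualitative benefit of the coupling.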
Joint reconstruction of Stokes images from polarimetric measurements
J. Opt. Soc. Am. A, 2009
Abstract

Cited by 5 (2 self)
In the field of imaging polarimetry, Stokes parameters are sought and must be inferred from noisy and blurred intensity measurements. Using a penalized-likelihood estimation framework, we investigate reconstruction quality when estimating intensity images and then transforming to Stokes parameters, and when estimating Stokes parameters directly. We define our cost function for reconstruction by a weighted least-squares data-fit term and a regularization penalty. We show that for quadratic regularization the estimators of Stokes and intensity images can be made equal by an appropriate choice of regularization parameters. It is empirically shown that, when using edge-preserving regularization, estimating the Stokes parameters directly leads to lower RMS error. Also, the addition of a cross-channel regularization term further lowers the RMS error for both methods, especially in the case of low SNR.
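Both estimation routes compared in this abstract start from a linear intensity model. A sketch of the per-pixel weighted least-squares Stokes estimate, assuming an ideal linear-polarizer measurement model (the function name and argument layout are our own, not the paper's):

```python
import numpy as np

def stokes_wls(intensities, angles, weights=None):
    """Weighted least-squares estimate of linear Stokes parameters
    (S0, S1, S2) from K polarizer intensity images. Assumed model: an
    ideal linear polarizer at angle theta measures
    I = 0.5 * (S0 + S1*cos(2*theta) + S2*sin(2*theta)).
    intensities: array of shape (K, H, W); angles: length-K, radians."""
    k = len(angles)
    a = np.asarray(angles, dtype=float)
    A = 0.5 * np.stack([np.ones(k), np.cos(2 * a), np.sin(2 * a)], axis=1)
    w = np.ones(k) if weights is None else np.asarray(weights, float)
    # Closed-form WLS: S = (A^T W A)^{-1} A^T W I, applied to all pixels at once.
    P = np.linalg.solve(A.T @ (A * w[:, None]), A.T * w)  # shape (3, k)
    flat = intensities.reshape(k, -1)
    return (P @ flat).reshape((3,) + intensities.shape[1:])
```

With quadratic regularization added to this objective, estimating intensities and transforming, versus estimating Stokes directly, differ only through the regularizer, which is the equivalence the abstract exploits.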
A quaternion framework for color image smoothing and segmentation
, 2011
Abstract

Cited by 3 (0 self)
In this paper, we present feature/detail-preserving models for color image smoothing and segmentation using the Hamiltonian quaternion framework. First, we introduce a novel quaternionic Gabor filter (QGF) which can combine the color channels and the orientations in the image plane. We show that these filters are optimally localized in both the spatial and frequency domains and provide a good approximation to quaternionic quadrature filters. Using the QGFs, we extract the local orientation information in color images. Second, in order to model this derived orientation information, we propose continuous mixtures of appropriate exponential basis functions and derive analytic expressions for these models. These analytic expressions take the form of spatially varying kernels which, when convolved with a color image or the signed distance function of an evolving contour (placed in the color image), yield detail-preserving smoothing and segmentation, respectively. Several examples on widely used image databases are shown to depict the performance of our algorithms.
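The key point of the quaternion framework is that the three color channels transform jointly rather than independently. This can be illustrated with the Hamilton product, the basic operation underlying quaternionic filters; the sketch below is generic quaternion arithmetic (an RGB pixel mapped to the pure quaternion R*i + G*j + B*k), not the QGF construction itself.

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays.
    Note the product is non-commutative, unlike per-channel scaling."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def rotate_color(rgb, axis, angle):
    """Rotate a color vector about a hue axis via q v q^{-1}: a standard
    quaternion operation, shown only to illustrate how quaternion
    multiplication mixes all three channels at once."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])  # conjugate = inverse (unit q)
    return qmul(qmul(q, np.concatenate([[0.0], np.asarray(rgb, float)])), qc)[1:]
```

A 120° rotation about the gray axis (1, 1, 1) cyclically permutes the R, G, B channels, something no channel-wise filter can do.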
Polyakov action minimization for efficient color image processing.
, 2010
Abstract

Cited by 2 (0 self)
The Laplace-Beltrami operator is an extension of the Laplacian from flat domains to curved manifolds. It has proven useful for color image processing, as it models a meaningful coupling between the color channels. This coupling is naturally expressed in the Beltrami framework, in which a color image is regarded as a two-dimensional manifold embedded in a hybrid, five-dimensional, spatial-chromatic (x, y, R, G, B) space. The Beltrami filter defined by this framework minimizes the Polyakov action, adopted from high-energy physics, which measures the area of the image manifold. Minimization is usually obtained through a geometric heat equation defined by the Laplace-Beltrami operator. Though efficient simplifications such as the bilateral filter have been proposed for the single-channel case, the coupling between the color channels has so far posed a nontrivial obstacle to designing fast Beltrami filters. Here, we propose an augmented Lagrangian approach to design an efficient and accurate regularization framework for color image processing by minimizing the Polyakov action. We extend the augmented Lagrangian framework for total variation (TV) image denoising to the more general Polyakov action case for color images, and apply the proposed framework to denoise and deblur color images.
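The Polyakov action minimized here is simply the area of the image graph under the induced metric of the (x, y, R, G, B) embedding. A discrete sketch (the spatial/chromatic scale parameter beta is an assumed name; forward differences via np.gradient are our choice of discretization):

```python
import numpy as np

def polyakov_action(img, beta=1.0):
    """Discrete Polyakov action (area of the image manifold) for a color
    image embedded in (x, y, R, G, B). The induced 2x2 metric is
    g = I + beta^2 * sum_c grad(I_c) grad(I_c)^T, and the action is the
    sum over pixels of sqrt(det g). beta weighs chromatic distances
    against spatial ones; beta -> 0 recovers the flat area H*W."""
    c = img.shape[-1]
    gx = np.stack([np.gradient(img[..., i], axis=1) for i in range(c)])
    gy = np.stack([np.gradient(img[..., i], axis=0) for i in range(c)])
    g11 = 1.0 + beta**2 * (gx**2).sum(axis=0)
    g22 = 1.0 + beta**2 * (gy**2).sum(axis=0)
    g12 = beta**2 * (gx * gy).sum(axis=0)
    return np.sqrt(g11 * g22 - g12**2).sum()
```

For a constant image the metric is the identity, so the action equals the pixel count; any edge shared by the channels increases det(g) jointly, which is the channel coupling the abstract refers to.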
Recovery of polarimetric Stokes images by spatial mixture models
Abstract

Cited by 1 (1 self)
A Bayesian approach for joint restoration and segmentation of polarization-encoded images is presented, with emphasis on both the physical admissibility and the smoothness of the solution. In this probabilistic framework, two distinct models for the sought polarized radiances are used. In the first model, the polarized light at each site of the image is described by its Stokes vector, which directly follows a mixture of truncated Gaussians explicitly assigning zero probability to inadmissible configurations. In the second, the polarization at each site is represented by the coherency matrix, whose positive semidefiniteness provides a convenient way to ensure the physical admissibility of the solution. This matrix is parameterized by a set of variables assumed to be generated by a spatially varying mixture of Gaussians. Inference for both models is obtained by the Expectation-Maximization (EM) algorithm. The restored Stokes images are always physically admissible, which is not the case for the naïve pseudoinverse approach. Numerical experiments on real and synthetic images using the proposed methods assess the pertinence of the approach. Experiments on noise-degraded images confirm the robustness to noise, both in terms of visual quality and SNR improvement.
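The admissibility constraint central to both models can be checked directly: a Stokes vector is physical if and only if its coherency matrix is positive semidefinite, equivalently S0 ≥ sqrt(S1² + S2² + S3²). A sketch (sign conventions for S3 in the off-diagonal term vary across texts; the eigenvalue test is unaffected):

```python
import numpy as np

def coherency_matrix(S):
    """Coherency matrix of a Stokes vector (S0, S1, S2, S3). The vector
    is physically admissible iff this Hermitian matrix is positive
    semidefinite; its eigenvalues are (S0 +/- |(S1, S2, S3)|) / 2."""
    S0, S1, S2, S3 = S
    return 0.5 * np.array([[S0 + S1, S2 - 1j * S3],
                           [S2 + 1j * S3, S0 - S1]])

def is_admissible(S, tol=1e-12):
    """PSD test via the eigenvalues of the coherency matrix."""
    return bool(np.linalg.eigvalsh(coherency_matrix(S)).min() >= -tol)
```

A pixelwise pseudoinverse can easily violate this test under noise, which is why the abstract's truncated-Gaussian and coherency-matrix parameterizations are needed.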
Suppression of Missing Data Artifacts for Deblurring Images Corrupted by Random-Valued Noise
Abstract
For deblurring images corrupted by random-valued noise, two-phase methods first select likely-to-be reliables (data that are not corrupted by random-valued noise) and then deblur images only with the selected data. The selective use of data in two-phase methods, however, often causes missing-data artifacts. In this paper, to suppress these missing-data artifacts, we propose a blurring-model-based reliable-selection technique that selects sufficiently many reliables so that all to-be-recovered pixel values can contribute to the selected data, while accurately excluding data corrupted by random-valued noise. We also propose a normalization technique to compensate for non-uniform rates in recovering pixel values. Simulation studies show that the proposed techniques effectively suppress missing-data artifacts and, as a result, improve the performance of two-phase methods.
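The two-phase pipeline — select reliables, then deblur using only them — can be sketched with a stand-in detector. The paper's blurring-model-based selection is more refined than the local-median rule below, which is purely illustrative, as is the mask-blurring proxy for how strongly each pixel is constrained.

```python
import numpy as np

def select_reliables(f, win=1, tau=0.3):
    """Hypothetical reliable-data detector: flag a pixel as corrupted by
    random-valued noise when it deviates from its local median by more
    than tau. A stand-in for the paper's selection rule, used here only
    to make the two-phase idea concrete. Returns True = reliable."""
    H, W = f.shape
    pad = np.pad(f, win, mode='reflect')
    med = np.empty_like(f)
    for i in range(H):
        for j in range(W):
            med[i, j] = np.median(pad[i:i + 2*win + 1, j:j + 2*win + 1])
    return np.abs(f - med) <= tau

def coverage_weights(mask, psf):
    """Normalization sketch: periodically convolve the reliability mask
    with the PSF (via FFT) to measure how strongly each to-be-recovered
    pixel is constrained by the selected data; low values flag pixels
    prone to missing-data artifacts."""
    pad = np.zeros(mask.shape)
    pad[:psf.shape[0], :psf.shape[1]] = psf
    for ax, s in enumerate(psf.shape):
        pad = np.roll(pad, -(s // 2), axis=ax)
    return np.real(np.fft.ifft2(np.fft.fft2(mask.astype(float))
                                * np.fft.fft2(pad)))
```

Dividing per-pixel data-fit contributions by these coverage weights is one simple way to equalize recovery rates, in the spirit of the normalization the abstract proposes.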
Polarimetric Image Reconstruction Algorithms
, 2010
Abstract
There are many, many people (too many to name) who have helped me along the way to this degree, and I give deep, heartfelt thanks to all of them. There are a few individuals, however, without whom I would have stopped my graduate training at the Master’s level: Jeff Fessler, Brian Thelen, and Rick Paxman; my gratitude to these three outstanding scientists could not be greater. My advisor, Jeff, is the model graduate student mentor. Jeff is always genial and encouraging, qualities that are very much appreciated by bewildered graduate students. In addition to his pleasant demeanor, Jeff is a scientist of the highest caliber. Working with Jeff has been an unforgettable experience. My colleagues, Brian Thelen and Rick Paxman, have been more supportive and helpful than can be expressed in words. They have both always been willing to spend time discussing technical aspects of my research, as well as providing insights as to which direction to move. I stand indebted to them in more ways than one; I never once provided them with a charge number, check’s in the mail guys:). I will never
Digital Signal Processing Group
Abstract
There has recently been considerable interest in applying Total Variation regularization with an ℓ1 data fidelity term to the denoising of images subject to salt-and-pepper noise, but the extension of this formulation to more general problems, such as deconvolution, has received little attention. We consider this problem, comparing the performance of ℓ1-TV deconvolution, computed via our Iteratively Reweighted Norm algorithm, with an alternative variational approach based on Mumford-Shah regularization. The ℓ1-TV deconvolution method is found to have a significant advantage in reconstruction quality, with comparable computational cost.
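The core of the Iteratively Reweighted Norm idea — majorizing each ℓ1 term by a weighted quadratic — can be sketched on a plain ℓ1 regression problem. The full ℓ1-TV deconvolution applies the same reweighting to the TV term as well; names and parameters below are our own.

```python
import numpy as np

def irn_l1(A, b, n_iter=50, eps=1e-8):
    """Illustrative Iteratively Reweighted Norm iteration for
    min_x ||Ax - b||_1. Each term |r_i| is replaced by the quadratic
    surrogate w_i * r_i^2 with w_i = 1 / max(|r_i|, eps), so every
    iteration is a weighted least-squares solve; eps guards against
    division by zero as residuals vanish."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # least-squares warm start
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(A @ x - b), eps)
        # Weighted normal equations: (A^T W A) x = A^T W b.
        x = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * b))
    return x
```

On a constant-fit problem with one gross outlier, the ℓ1 solution is the median rather than the mean, which is why ℓ1 data terms handle salt-and-pepper corruption so well.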