Results 1-10 of 14
An efficient TVL1 algorithm for deblurring multichannel images corrupted by impulsive noise
SIAM J. Sci. Comput., 2009
"... We extend the alternating minimization algorithm recently proposed in [38, 39] to the case of recovering blurry multichannel (color) images corrupted by impulsive rather than Gaussian noise. The algorithm minimizes the sum of a multichannel extension of total variation (TV), either isotropic or anis ..."
Abstract
-
Cited by 50 (8 self)
- Add to MetaCart
(Show Context)
We extend the alternating minimization algorithm recently proposed in [38, 39] to the case of recovering blurry multichannel (color) images corrupted by impulsive rather than Gaussian noise. The algorithm minimizes the sum of a multichannel extension of total variation (TV), either isotropic or anisotropic, and a data fidelity term measured in the L1-norm. We derive the algorithm by applying the well-known quadratic penalty function technique and prove attractive convergence properties including finite convergence for some variables and global q-linear convergence. Under periodic boundary conditions, the main computational requirements of the algorithm are fast Fourier transforms and a low-complexity Gaussian elimination procedure. Numerical results on images with different blurs and impulsive noise are presented to demonstrate the efficiency of the algorithm. In addition, it is numerically compared to an algorithm recently proposed in [20] that uses a linear program and an interior point method for recovering grayscale images.
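Under periodic boundary conditions the u-subproblem of this scheme diagonalizes in the Fourier domain, which is where the FFTs enter. The following is a minimal single-channel numpy sketch of the splitting the abstract describes: joint 2-D shrinkage for the TV term, scalar shrinkage for the L1 fidelity, and one FFT-domain linear solve per iteration. The parameter names (mu, beta, gamma) and the psf2otf helper are ours, and the paper's multichannel algorithm additionally couples channels through the low-complexity Gaussian elimination mentioned above, which this sketch omits.

import numpy as np

def psf2otf(psf, shape):
    # Zero-pad the point-spread function to the image size and circularly
    # shift its center to (0, 0), so FFT multiplication matches periodic
    # convolution.
    pad = np.zeros(shape)
    pad[:psf.shape[0], :psf.shape[1]] = psf
    for axis, s in enumerate(psf.shape):
        pad = np.roll(pad, -(s // 2), axis=axis)
    return np.fft.fft2(pad)

def tvl1_deblur(f, kernel, mu=10.0, beta=20.0, gamma=20.0, iters=100):
    # Sketch of quadratic-penalty alternating minimization for
    # min_u TV(u) + mu * ||k*u - f||_1, single channel, periodic BCs.
    m, n = f.shape
    K = psf2otf(kernel, (m, n))
    Dx = psf2otf(np.array([[1.0, -1.0]]), (m, n))    # forward difference in x
    Dy = psf2otf(np.array([[1.0], [-1.0]]), (m, n))  # forward difference in y
    denom = beta * (np.abs(Dx)**2 + np.abs(Dy)**2) + gamma * np.abs(K)**2
    u = f.copy()
    for _ in range(iters):
        Fu = np.fft.fft2(u)
        ux = np.real(np.fft.ifft2(Dx * Fu))
        uy = np.real(np.fft.ifft2(Dy * Fu))
        r = np.real(np.fft.ifft2(K * Fu)) - f
        # w-subproblem: isotropic 2-D shrinkage of the gradient field.
        mag = np.maximum(np.sqrt(ux**2 + uy**2), 1e-12)
        scale = np.maximum(mag - 1.0 / beta, 0) / mag
        wx, wy = scale * ux, scale * uy
        # z-subproblem: scalar shrinkage of the data residual (L1 fidelity).
        z = np.sign(r) * np.maximum(np.abs(r) - mu / gamma, 0)
        # u-subproblem: normal equations, diagonal in the Fourier domain.
        rhs = (beta * (np.conj(Dx) * np.fft.fft2(wx)
                       + np.conj(Dy) * np.fft.fft2(wy))
               + gamma * np.conj(K) * np.fft.fft2(f + z))
        u = np.real(np.fft.ifft2(rhs / denom))
    return u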
A fast algorithm for edge-preserving variational multichannel image restoration
"... Abstract. We generalize the alternating minimization algorithm recently proposed in [32] to efficiently solve a general, edge-preserving, variational model for recovering multichannel images degraded by within- and cross-channel blurs, as well as additive Gaussian noise. This general model allows th ..."
Abstract
-
Cited by 45 (9 self)
- Add to MetaCart
(Show Context)
Abstract. We generalize the alternating minimization algorithm recently proposed in [32] to efficiently solve a general, edge-preserving, variational model for recovering multichannel images degraded by within- and cross-channel blurs, as well as additive Gaussian noise. This general model allows the use of localized weights and higher-order derivatives in regularization, and includes a multichannel extension of total variation (MTV) regularization as a special case. In the MTV case, we show that the model can be derived from an extended half-quadratic transform of Geman and Yang [14]. For color images with three channels and when applied to the MTV model (either locally weighted or not), the per-iteration computational complexity of this algorithm is dominated by nine fast Fourier transforms. We establish strong convergence results for the algorithm including finite convergence for some variables and fast q-linear convergence for the others. Numerical results on various types of blurs are presented to demonstrate the performance of our algorithm compared to that of the MATLAB deblurring functions. We also present experimental results on regularization models using weighted MTV and higher-order derivatives to demonstrate improvements in image quality provided by these models over the plain MTV model.
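What couples the channels in MTV is that the shrinkage step of the alternating scheme pools the gradient magnitude across channels at each pixel. Below is a minimal numpy sketch of that coupled step, under our own naming (gx, gy hold per-channel forward differences, beta is the penalty weight of the splitting); the accompanying u-subproblem with cross-channel blurs then reduces, per Fourier frequency, to a small 3x3 solve, consistent with the nine-FFT count quoted above.

import numpy as np

def mtv_shrink(gx, gy, beta):
    # Joint (multichannel) shrinkage: gx, gy have shape (C, m, n).
    # The per-pixel magnitude pools all C channels, which is exactly
    # what distinguishes MTV from channel-by-channel TV.
    mag = np.sqrt((gx**2 + gy**2).sum(axis=0))                # (m, n)
    scale = np.maximum(mag - 1.0 / beta, 0) / np.maximum(mag, 1e-12)
    return scale * gx, scale * gy                             # broadcast over C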
Joint reconstruction of Stokes images from polarimetric measurements
J. Opt. Soc. Am. A, 2009
"... In the field of imaging polarimetry Stokes parameters are sought and must be inferred from noisy and blurred intensity measurements. Using a penalized-likelihood estimation framework we investigate reconstruction quality when estimating intensity images and then transforming to Stokes parameters, a ..."
Abstract
-
Cited by 5 (2 self)
- Add to MetaCart
(Show Context)
In the field of imaging polarimetry, Stokes parameters are sought and must be inferred from noisy and blurred intensity measurements. Using a penalized-likelihood estimation framework, we investigate reconstruction quality when estimating intensity images and then transforming to Stokes parameters, and when estimating Stokes parameters directly. We define our cost function for reconstruction by a weighted least-squares data fit term and a regularization penalty. We show that for quadratic regularization the estimators of Stokes and intensity images can be made equal by appropriate choice of regularization parameters. It is empirically shown that, when using edge-preserving regularization, estimating the Stokes parameters directly leads to lower RMS error. Also, the addition of a cross-channel regularization term further lowers the RMS error for both methods, especially in the case of low SNR.
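In symbols, the cost described in the abstract has the penalized weighted-least-squares form below; the notation is ours (y the stacked intensity measurements, A the combined blur and polarization-analyzer model mapping the Stokes images s to intensities, W the noise-derived weights, R the regularizer with parameter beta):

\hat{\mathbf{s}} \;=\; \arg\min_{\mathbf{s}} \; \tfrac{1}{2}\,(\mathbf{y} - \mathbf{A}\mathbf{s})^{\mathsf{T}} \mathbf{W}\, (\mathbf{y} - \mathbf{A}\mathbf{s}) \;+\; \beta\, R(\mathbf{s})

The intensity route instead minimizes the same form over the intensity images and linearly transforms the result to Stokes parameters; per the abstract, the two estimators can be made equal for quadratic R by matching the regularization parameters, so the reported differences arise only under edge-preserving R.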
A quaternion framework for color image smoothing and segmentation
2011
"... In this paper, we present feature/detail preserving models for color image smoothing and segmentation using the Hamiltonian quaternion framework. First, we introduce a novel quaternionic Gabor filter (QGF) which can combine the color channels and the orientations in the image plane. We show that th ..."
Abstract
-
Cited by 3 (0 self)
- Add to MetaCart
In this paper, we present feature/detail preserving models for color image smoothing and segmentation using the Hamiltonian quaternion framework. First, we introduce a novel quaternionic Gabor filter (QGF) which can combine the color channels and the orientations in the image plane. We show that these filters are optimally localized both in the spatial and frequency domains and provide a good approximation to quaternionic quadrature filters. Using the QGFs, we extract the local orientation information in the color images. Second, in order to model this derived orientation information, we propose continuous mixtures of appropriate exponential basis functions and derive analytic expressions for these models. These analytic expressions take the form of spatially varying kernels which, when convolved with a color image or the signed distance function of an evolving contour (placed in the color image), yield a detail preserving smoothing and segmentation, respectively. Several examples on widely used image databases are shown to depict the performance of our algorithms.
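For orientation, one plausible way to build a quaternion-valued Gabor kernel is to modulate a Gaussian envelope by the quaternionic exponential exp(mu*phi) = cos(phi) + mu*sin(phi), where mu is a unit pure quaternion; the sketch below uses mu = (i+j+k)/sqrt(3) and a single modulation frequency, which is illustrative and not necessarily the paper's exact QGF construction. A color pixel would then be encoded as the pure quaternion R*i + G*j + B*k and filtered by quaternion multiplication.

import numpy as np

def quaternion_gabor(size, sigma, u0, v0):
    # Quaternion-valued Gabor kernel: Gaussian envelope times
    # exp(mu * phi) with mu = (i + j + k)/sqrt(3), a unit pure quaternion.
    # Returns an array of shape (size, size, 4) holding (w, i, j, k) parts.
    ax = np.arange(size) - size // 2
    X, Y = np.meshgrid(ax, ax)
    g = np.exp(-(X**2 + Y**2) / (2.0 * sigma**2))
    phi = 2.0 * np.pi * (u0 * X + v0 * Y)
    mu = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
    q = np.zeros((size, size, 4))
    q[..., 0] = g * np.cos(phi)                              # scalar part
    q[..., 1:] = (g * np.sin(phi))[..., None] * mu           # vector part
    return q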
Polyakov action minimization for efficient color image processing
2010
"... Abstract. The Laplace-Beltrami operator is an extension of the Laplacian from flat domains to curved manifolds. It was proven to be useful for color image processing as it models a meaningful coupling between the color channels. This coupling is naturally expressed in the Beltrami framework in whic ..."
Abstract
-
Cited by 2 (0 self)
- Add to MetaCart
(Show Context)
Abstract. The Laplace-Beltrami operator is an extension of the Laplacian from flat domains to curved manifolds. It was proven to be useful for color image processing as it models a meaningful coupling between the color channels. This coupling is naturally expressed in the Beltrami framework, in which a color image is regarded as a two-dimensional manifold embedded in a hybrid, five-dimensional, spatial-chromatic (x, y, R, G, B) space. The Beltrami filter defined by this framework minimizes the Polyakov action, adopted from high-energy physics, which measures the area of the image manifold. Minimization is usually obtained through a geometric heat equation defined by the Laplace-Beltrami operator. Though efficient simplifications such as the bilateral filter have been proposed for the single-channel case, so far the coupling between the color channels has posed a non-trivial obstacle to designing fast Beltrami filters. Here, we propose to use an augmented Lagrangian approach to design an efficient and accurate regularization framework for color image processing by minimizing the Polyakov action. We extend the augmented Lagrangian framework for total variation (TV) image denoising to the more general Polyakov action case for color images, and apply the proposed framework to denoise and deblur color images.
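Concretely, the Polyakov action for this embedding reduces to the area of the image manifold, computed from the induced metric of the map (x, y) -> (x, y, beta*R, beta*G, beta*B). A minimal numpy sketch of that functional, with beta (our name) controlling the relative scale of the chromatic axes:

import numpy as np

def polyakov_area(img, beta):
    # Area of the image manifold for img of shape (m, n, 3), i.e. the
    # discretized integral of sqrt(det g) for the induced metric g of
    # the 5-D spatial-chromatic embedding.
    Ix = np.gradient(img, axis=1)                  # per-channel d/dx
    Iy = np.gradient(img, axis=0)                  # per-channel d/dy
    g11 = 1.0 + beta**2 * (Ix**2).sum(axis=-1)
    g22 = 1.0 + beta**2 * (Iy**2).sum(axis=-1)
    g12 = beta**2 * (Ix * Iy).sum(axis=-1)
    return np.sqrt(g11 * g22 - g12**2).sum()

The Beltrami flow and the paper's augmented Lagrangian scheme are different routes to decreasing this same quantity.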
Recovery of polarimetric Stokes images by spatial mixture models
"... A Bayesian approach for joint restoration and segmentation of polarization encoded images is presented with emphasis on both physical admissibility and smoothness of the solution. In this probabilistic framework, two distinct models for the sought polarized radiances are used. In the first model, th ..."
Abstract
-
Cited by 1 (1 self)
- Add to MetaCart
(Show Context)
A Bayesian approach for joint restoration and segmentation of polarization encoded images is presented with emphasis on both physical admissibility and smoothness of the solution. In this probabilistic framework, two distinct models for the sought polarized radiances are used. In the first model, the polarized light at each site of the image is described by its Stokes vector which directly follows a mixture of truncated Gaussians explicitly assigning zero probability to inadmissible configurations. In the second one, the polarization at each site is represented by the coherency matrix whose positive semidefiniteness provides a convenient way to ensure the physical admissibility of the solution. This matrix is parameterized by a set of variables assumed to be generated by a spatially varying mixture of Gaussians. Inference for both models is obtained by the Expectation-Maximization (EM) algorithm. The restored Stokes images are always physically admissible, which is not the case for the naïve pseudo-inverse approach. Numerical experiments on real and synthetic images using the proposed methods assess the pertinence of the approach. Experiments on noise-degraded images confirm the robustness to noise, both in terms of visual quality as well as SNR improvement.
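The physical admissibility in question is the constraint that the degree of polarization not exceed one, S0 >= sqrt(S1^2 + S2^2 + S3^2), which is equivalent to positive semidefiniteness of the 2x2 coherency matrix. A minimal numpy check (the placement of S2 and S3 in the matrix follows one common convention; conventions vary across references):

import numpy as np

def is_admissible(S, tol=0.0):
    # S = (S0, S1, S2, S3). The coherency matrix is Hermitian with
    # eigenvalues (S0 +/- sqrt(S1^2 + S2^2 + S3^2)) / 2, so it is PSD
    # exactly when the Stokes vector is physically admissible.
    S0, S1, S2, S3 = S
    J = 0.5 * np.array([[S0 + S1, S2 + 1j * S3],
                        [S2 - 1j * S3, S0 - S1]])
    return bool(np.all(np.linalg.eigvalsh(J) >= -tol))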
Suppression of Missing Data Artifacts for Deblurring Images Corrupted by Random Valued Noise
"... For deblurring images corrupted by random valued noise, two-phase methods first select likely-to-be reliables (data that are not corrupted by random valued noise) and then deblur images only with selected data. The selective use of data in twophase methods, however, often causes missing data artifac ..."
Abstract
- Add to MetaCart
(Show Context)
For deblurring images corrupted by random-valued noise, two-phase methods first select likely-to-be reliables (data that are not corrupted by random-valued noise) and then deblur images only with the selected data. The selective use of data in two-phase methods, however, often causes missing data artifacts. In this paper, to suppress these missing data artifacts, we propose a blurring-model-based reliable-selection technique to select sufficiently many reliables so that all to-be-recovered pixel values can contribute to the selected data, while accurately excluding data corrupted by random-valued noise. We also propose a normalization technique to compensate for nonuniform rates in recovering pixel values. Simulation studies show that the proposed techniques effectively suppress missing data artifacts and, as a result, improve the performance of two-phase methods.
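For context, phase one of a generic two-phase method builds a binary mask of reliables, and phase two fits the deblurring model only where the mask is true. The sketch below uses a simple median-based detector purely for illustration; the paper's contribution is precisely a blurring-model-based selection rule and a normalization step, neither of which is reproduced here, and the threshold tau is an assumed parameter.

import numpy as np
from scipy.ndimage import median_filter

def reliable_mask(f, window=3, tau=30.0):
    # Flag pixels whose value is far from the local median as likely
    # corrupted by random-valued noise; keep the rest as reliables.
    med = median_filter(f, size=window)
    return np.abs(f - med) <= tau

# Phase two would then minimize, over u, something like
#   || M * (k conv u - f) ||_2^2 + lam * TV(u),
# where M is the mask above, so only selected data drive the fit.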
Polarimetric Image Reconstruction Algorithms
2010
"... There are many many people (too many to name) who have helped me along the way to this degree and I have a deep heart-felt thanks to all of them. There are a few individuals however, without whom, I would have stopped my graduate training at the Master’s level. Jeff Fessler, Brian Thelen, and Rick P ..."
Abstract
- Add to MetaCart
(Show Context)
There are many, many people (too many to name) who have helped me along the way to this degree, and I owe a deep, heartfelt thanks to all of them. There are a few individuals, however, without whom I would have stopped my graduate training at the Master's level: Jeff Fessler, Brian Thelen, and Rick Paxman; my gratitude to these three outstanding scientists could not be greater. My advisor, Jeff, is the model graduate student mentor. Jeff is always genial and encouraging, qualities that are very much appreciated by bewildered graduate students. In addition to his pleasant demeanor, Jeff is a scientist of the highest caliber. Working with Jeff has been an unforgettable experience. My colleagues, Brian Thelen and Rick Paxman, have been more supportive and helpful than can be expressed in words. They have both always been willing to spend time discussing technical aspects of my research as well as providing insights as to which direction to move. I stand indebted to them in more ways than one; I never once provided them with a charge number. Check's in the mail, guys :-). I will never
Digital Signal Processing Group
"... There has recently been considerable interest in applying Total Variation regularization with an ℓ 1 data fidelity term to the denoising of images subject to salt and pepper noise, but the extension of this formulation to more general problems, such as deconvolution, has received little attention. W ..."
Abstract
- Add to MetaCart
(Show Context)
There has recently been considerable interest in applying Total Variation regularization with an ℓ1 data fidelity term to the denoising of images subject to salt and pepper noise, but the extension of this formulation to more general problems, such as deconvolution, has received little attention. We consider this problem, comparing the performance of ℓ1-TV deconvolution, computed via our Iteratively Reweighted Norm algorithm, with an alternative variational approach based on Mumford-Shah regularization. The ℓ1-TV deconvolution method is found to have a significant advantage in reconstruction quality, with comparable computational cost.
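The Iteratively Reweighted Norm idea is to replace each ℓ1 term by a weighted ℓ2 term whose weights come from the previous iterate, so every outer iteration is a single linear solve. A minimal 1-D sketch with anisotropic TV and dense matrices for clarity (the actual algorithm operates on images with fast operators; the names lam, eps, and iters are ours):

import numpy as np

def irn_l1tv(y, A, D, lam, iters=30, eps=1e-6):
    # Sketch of min_u ||A u - y||_1 + lam * ||D u||_1 by iteratively
    # reweighted least squares: weights 1/|residual| make each weighted
    # l2 problem approximate the original l1 problem at its fixed point.
    u = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(iters):
        wf = 1.0 / np.maximum(np.abs(A @ u - y), eps)    # fidelity weights
        wr = 1.0 / np.maximum(np.abs(D @ u), eps)        # regularizer weights
        H = A.T @ (wf[:, None] * A) + lam * D.T @ (wr[:, None] * D)
        u = np.linalg.solve(H, A.T @ (wf * y))
    return u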