Results 1-10 of 109
Deterministic edge-preserving regularization in computed imaging
 IEEE Trans. Image Processing
, 1997
Cited by 231 (23 self)
Abstract:
Many image processing problems are ill posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. In this paper, we first give conditions for the design of such an edge-preserving regularization. Under these conditions, we show that it is possible to introduce an auxiliary variable whose role is twofold. First, it marks the discontinuities and ensures their preservation from smoothing. Second, it makes the criterion half-quadratic. The optimization is then easier. We propose a deterministic strategy, based on alternate minimizations on the image and the auxiliary variable. This leads to the definition of an original reconstruction algorithm, called ARTUR. Some theoretical properties of ARTUR are discussed. Experimental results illustrate the behavior of the algorithm. These results are shown in the field of tomography, but this method can be applied in a large number of applications in image processing.
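A minimal 1-D sketch of the half-quadratic alternation the abstract describes: the auxiliary variable b weights each difference, and each sub-problem is easy. The potential φ(t) = √(1+t²) − 1 and all parameters are illustrative choices, not necessarily those used by ARTUR.

```python
import numpy as np

def half_quadratic_denoise(y, lam=1.0, n_iter=30):
    """Alternating minimization of ||x - y||^2 + lam * sum phi(x_{i+1} - x_i)
    with the edge-preserving potential phi(t) = sqrt(1 + t^2) - 1 (illustrative)."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)            # forward-difference operator, (n-1, n)
    x = y.copy()
    for _ in range(n_iter):
        d = D @ x
        # b-step: b = phi'(d) / (2 d); small near edges, so edges are not smoothed
        b = 1.0 / (2.0 * np.sqrt(1.0 + d**2))
        # x-step: the criterion is now quadratic in x; solve (I + lam D^T diag(b) D) x = y
        A = np.eye(n) + lam * D.T @ (b[:, None] * D)
        x = np.linalg.solve(A, y)
    return x
```

With a step edge plus mild noise, the reconstruction stays flat on each side of the jump instead of blurring across it.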
Robust Solutions To Least-Squares Problems With Uncertain Data
, 1997
Cited by 149 (13 self)
Abstract:
We consider least-squares problems where the coefficient matrices A, b are unknown but bounded. We minimize the worst-case residual error using (convex) second-order cone programming, yielding an algorithm with complexity similar to one singular value decomposition of A. The method can be interpreted as a Tikhonov regularization procedure, with the advantage that it provides an exact bound on the robustness of the solution, and a rigorous way to compute the regularization parameter. When the perturbation has a known (e.g., Toeplitz) structure, the same problem can be solved in polynomial time using semidefinite programming (SDP). We also consider the case when A, b are rational functions of an unknown-but-bounded perturbation vector. We show how to minimize (via SDP) upper bounds on the optimal worst-case residual. We provide numerical examples, including one from robust identification and one from robust interpolation. Key Words: Least-squares, uncertainty, robustness, second-order cone...
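For an unstructured spectral-norm bound ||ΔA|| ≤ ρ, the worst-case residual reduces to ||Ax − b|| + ρ||x||, and the robust solution is a Tikhonov solution for some parameter. A hedged sketch (the paper computes the exact parameter via conic programming; here a plain grid search over the Tikhonov parameter stands in for that step):

```python
import numpy as np

def robust_ls(A, b, rho, mus=None):
    """Approximately minimize the worst-case residual
    max_{||dA|| <= rho} ||(A + dA) x - b|| = ||A x - b|| + rho * ||x||
    by searching over Tikhonov solutions x(mu) = (A^T A + mu I)^{-1} A^T b.
    Illustrative grid search, not the paper's exact SOCP-based parameter choice."""
    if mus is None:
        mus = np.logspace(-6, 3, 200)
    def worst_case(x):
        return np.linalg.norm(A @ x - b) + rho * np.linalg.norm(x)
    n = A.shape[1]
    best_x, best_f = None, np.inf
    for mu in mus:
        x = np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ b)
        f = worst_case(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f
```

The robust solution never has a worse worst-case residual than the ordinary least-squares solution, which the plain solve would return at mu ≈ 0.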
A new alternating minimization algorithm for total variation image reconstruction
 SIAM J. IMAGING SCI
, 2008
Cited by 97 (16 self)
Abstract:
We propose, analyze and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new half-quadratic model applicable to not only the anisotropic but also isotropic forms of total variation discretizations. The per-iteration computational complexity of the algorithm is three Fast Fourier Transforms (FFTs). We establish strong convergence properties for the algorithm including finite convergence for some variables and relatively fast exponential (or q-linear in optimization terminology) convergence for the others. Furthermore, we propose a continuation scheme to accelerate the practical convergence of the algorithm. Extensive numerical results show that our algorithm performs favorably in comparison to several state-of-the-art algorithms. In particular, it runs orders of magnitude faster than the Lagged Diffusivity algorithm for total-variation-based deblurring. Some extensions of our algorithm are also discussed.
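A toy 1-D analogue of the splitting the abstract describes: the auxiliary variable is updated by soft-thresholding (shrinkage), and the image sub-problem is a pointwise division in Fourier space because the difference operator is circulant. Parameters mu, beta and the denoising setting (no blur) are illustrative, not the paper's.

```python
import numpy as np

def tv_denoise_1d(f, mu=50.0, beta=10.0, n_iter=100):
    """Alternating minimization of sum|w| + (beta/2)||w - Dx||^2 + (mu/2)||x - f||^2,
    with periodic forward differences (Dx)_i = x_{i+1} - x_i."""
    n = len(f)
    c = np.zeros(n); c[0] = -1.0; c[-1] = 1.0     # first column of the circulant D
    lam = np.fft.fft(c)                           # eigenvalues of D
    denom = mu + beta * np.abs(lam)**2
    x = f.copy()
    for _ in range(n_iter):
        d = np.roll(x, -1) - x                    # Dx
        # w-step: 1-D shrinkage with threshold 1/beta
        w = np.sign(d) * np.maximum(np.abs(d) - 1.0 / beta, 0.0)
        # x-step: (mu I + beta D^T D) x = mu f + beta D^T w, diagonal in Fourier space
        rhs = mu * np.fft.fft(f) + beta * np.conj(lam) * np.fft.fft(w)
        x = np.real(np.fft.ifft(rhs / denom))
    return x
```

On a noisy piecewise-constant signal the output has smaller total variation than the input while the jump survives.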
Bayesian and Regularization Methods for Hyperparameter Estimation in Image Restoration
 IEEE Trans. Image Processing
, 1999
Cited by 65 (26 self)
Abstract:
In this paper, we propose the application of the hierarchical Bayesian paradigm to the image restoration problem. We derive expressions for the iterative evaluation of the two hyperparameters applying the evidence and maximum a posteriori (MAP) analysis within the hierarchical Bayesian paradigm. We show analytically that the analysis provided by the evidence approach is more realistic and appropriate than the MAP approach for the image restoration problem. We furthermore study the relationship between the evidence and an iterative approach resulting from the set theoretic regularization approach for estimating the two hyperparameters, or their ratio, defined as the regularization parameter. Finally, the proposed algorithms are tested experimentally.
A Bayesian Approach to Introducing Anatomo-Functional Priors in the EEG/MEG Inverse Problem
, 1997
Cited by 51 (2 self)
Abstract:
In this paper, we present a new approach to the recovering of dipole magnitudes in a distributed source model for magnetoencephalographic (MEG) and electroencephalographic (EEG) imaging. This method consists in introducing spatial and temporal a priori information as a cure to this ill-posed inverse problem. A nonlinear spatial regularization scheme allows the preservation of dipole moment discontinuities between some a priori noncorrelated sources, for instance, when considering dipoles located on both sides of a sulcus. Moreover, we introduce temporal smoothness constraints on dipole magnitude evolution at time scales smaller than those of cognitive processes. These priors are easily integrated into a Bayesian formalism, yielding a maximum a posteriori (MAP) estimator of brain electrical activity. Results from EEG simulations of our method are presented and compared with those of classical quadratic regularization and a now popular generalized minimum-norm technique called low-resolution electromagnetic tomography (LORETA).
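The classical quadratic-regularization baseline the abstract compares against has a closed form. A minimal sketch (the lead-field G, data y, and regularization operator L are hypothetical placeholders; this is the Tikhonov/minimum-norm baseline, not the paper's nonlinear spatial prior):

```python
import numpy as np

def quadratic_map(G, y, L, lam):
    """MAP estimate for y = G j + Gaussian noise with a Gaussian prior on L j:
    j = argmin ||y - G j||^2 + lam ||L j||^2,
    solved via the normal equations (G^T G + lam L^T L) j = G^T y."""
    return np.linalg.solve(G.T @ G + lam * L.T @ L, G.T @ y)
```

Increasing lam trades data fit for prior smoothness: ||L j|| shrinks monotonically, and as lam approaches 0 the estimate approaches the plain least-squares fit.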
Image restoration subject to a total variation constraint
 IEEE Transactions on Image Processing
, 2004
Cited by 26 (2 self)
Abstract:
Total variation has proven to be a valuable concept in connection with the recovery of images featuring piecewise smooth components. So far, however, it has been used exclusively as an objective to be minimized under constraints. In this paper, we propose an alternative formulation in which total variation is used as a constraint in a general convex programming framework. This approach places no limitation on the incorporation of additional constraints in the restoration process and the resulting optimization problem can be solved efficiently via block-iterative methods. Image denoising and deconvolution applications are demonstrated.
Regularized Constrained Total Least-Squares Image Restoration
 IEEE Trans. Image Processing
, 1995
Cited by 24 (6 self)
Abstract:
In this paper the problem of restoring an image distorted by a linear space-invariant (LSI) point-spread function (psf) which is not exactly known is formulated as the solution of a perturbed set of linear equations. The regularized constrained total least-squares (RCTLS) method is used to solve this set of equations. Using the diagonalization properties of the discrete Fourier transform (DFT) for circulant matrices, the RCTLS estimate is computed in the DFT domain. This significantly reduces the computational cost of this approach and makes its implementation possible even for large images. An error analysis of the RCTLS estimate, based on the mean-squared-error (MSE) criterion, is performed to verify its superiority over the constrained total least-squares (CTLS) estimate. Numerical experiments for different psf errors are performed to test the RCTLS estimator for this problem. Objective and visual comparisons are presented with the linear minimum mean-squared-error (LMMSE) and the re...
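The DFT diagonalization trick the abstract relies on: a circulant blur becomes pointwise multiplication in Fourier space, so a regularized estimate is a pointwise division. A 1-D sketch using a plain Tikhonov-regularized inverse filter (illustrative; not the full RCTLS estimator):

```python
import numpy as np

def dft_restore(g, h, alpha=1e-2):
    """Restore x from g = h (*) x (circular convolution) via the DFT:
    X = conj(H) G / (|H|^2 + alpha), a Tikhonov-regularized inverse filter."""
    H = np.fft.fft(h, len(g))
    G = np.fft.fft(g)
    X = np.conj(H) * G / (np.abs(H)**2 + alpha)
    return np.real(np.fft.ifft(X))
```

Every frequency is handled independently, which is why the cost stays at a few FFTs even for large images (in 2-D the same formula applies with 2-D FFTs).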
Hyperparameter estimation for satellite image restoration using a MCMC Maximum Likelihood method
 Pattern Recognition
, 2000
Cited by 22 (7 self)
Abstract:
The satellite image deconvolution problem is ill-posed and must be regularized. Herein, we use an edge-preserving regularization model using a φ-function, involving two hyperparameters. Our goal is to estimate the optimal parameters in order to automatically reconstruct images. We propose to use the Maximum Likelihood Estimator (MLE), applied to the observed image. We need sampling from prior and posterior distributions. Since the convolution prevents us from using standard samplers, we have developed a modified Geman-Yang algorithm, using an auxiliary variable and a cosine transform. We present a Markov Chain Monte Carlo Maximum Likelihood (MCMCML) technique which is able to simultaneously achieve the estimation and the reconstruction.
Inversion Of Large-Support Ill-Posed Linear . . .
 IEEE TRANSACTIONS ON IMAGE PROCESSING
, 1998
Cited by 20 (12 self)
Abstract:
We propose a method for the reconstruction of signals and images observed partially through a linear operator with a large support (e.g., a Fourier transform on a sparse set). This inverse problem is ill-posed and we resolve it by incorporating the prior information that the reconstructed objects are composed of smooth regions separated by sharp transitions. This feature is modeled by a piecewise Gaussian (PG) Markov random field (MRF), known also as the weak string in one dimension and the weak membrane in two dimensions. The reconstruction is defined as the maximum a posteriori estimate. The prerequisite ...
Variational Bayesian image restoration based on a product of t-distributions image prior
 IEEE Transactions on Image Processing
, 2008
Cited by 20 (2 self)
Abstract:
Image priors based on products have been recognized to offer many advantages because they allow simultaneous enforcement of multiple constraints. However, they are inconvenient for Bayesian inference because it is hard to find their normalization constant in closed form. In this paper, a new Bayesian algorithm is proposed for the image restoration problem that bypasses this difficulty. An image prior is defined by imposing Student-t densities on the outputs of local convolutional filters. A variational methodology, with a constrained expectation step, is used to infer the restored image. Numerical experiments are shown that compare this methodology to previous ones and demonstrate its advantages. Index Terms: Constrained variational inference, image restoration, product prior, Student-t prior, variational Bayesian inference.
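The prior energy such a product induces is easy to write down even though its normalization constant is not: each Student-t factor contributes (ν+1)/2 · log(1 + r²/ν) for a filter response r. A 1-D sketch of the (unnormalized) negative log-prior, with illustrative filters and ν, not the paper's exact parameterization:

```python
import numpy as np

def student_t_prior_energy(x, filters, nu=2.0):
    """Negative log of a product-of-Student-t prior, up to additive constants:
    sum over filters f and positions of ((nu+1)/2) * log(1 + (f * x)^2 / nu).
    Heavy tails penalize large filter responses (edges) far less than a
    Gaussian prior would."""
    e = 0.0
    for f in filters:
        r = np.convolve(x, f, mode="valid")      # local filter responses
        e += np.sum(0.5 * (nu + 1.0) * np.log1p(r**2 / nu))
    return e
```

A flat signal has zero energy under a difference filter, while a rough signal is penalized, which is exactly the smoothness preference the prior encodes.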