Results 1-10 of 110
Robust Anisotropic Diffusion
1998
"... Relations between anisotropic diffusion and robust statistics are described in this paper. Specifically, we show that anisotropic diffusion can be seen as a robust estimation procedure that estimates a piecewise smooth image from a noisy input image. The "edgestopping" function in the anisotropic d ..."
Abstract

Cited by 278 (16 self)
 Add to MetaCart
Relations between anisotropic diffusion and robust statistics are described in this paper. Specifically, we show that anisotropic diffusion can be seen as a robust estimation procedure that estimates a piecewise smooth image from a noisy input image. The "edge-stopping" function in the anisotropic diffusion equation is closely related to the error norm and influence function in the robust estimation framework. This connection leads to a new "edge-stopping" function based on Tukey's biweight robust estimator that preserves sharper boundaries than previous formulations and improves the automatic stopping of the diffusion. The robust statistical interpretation also provides a means for detecting the boundaries (edges) between the piecewise smooth regions in an image that has been smoothed with anisotropic diffusion. Additionally, we derive a relationship between anisotropic diffusion and regularization with line processes. Adding constraints on the spatial organization of the ...
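The Tukey biweight edge-stopping function the abstract describes is easy to drop into a discrete diffusion loop. Below is a minimal NumPy sketch, not the paper's exact formulation: the parameter names `sigma` (outlier scale) and `lam` (step size), the 4-neighbour stencil, and the periodic boundaries implied by `np.roll` are all illustrative assumptions.

```python
import numpy as np

def tukey_g(x, sigma):
    """Tukey biweight edge-stopping function: influence vanishes for |x| > sigma."""
    g = np.zeros_like(x, dtype=float)
    m = np.abs(x) <= sigma
    g[m] = 0.5 * (1.0 - (x[m] / sigma) ** 2) ** 2
    return g

def anisotropic_diffusion(img, n_iter=20, sigma=0.5, lam=0.2):
    """Discrete 4-neighbour diffusion; np.roll gives periodic boundaries."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences to the four nearest neighbours
        diffs = [np.roll(u, s, axis=a) - u
                 for a, s in ((0, 1), (0, -1), (1, 1), (1, -1))]
        # flux is gated by the edge-stopping function: jumps larger than
        # sigma get zero weight, so edges stop the diffusion
        u = u + lam * sum(tukey_g(d, sigma) * d for d in diffs)
    return u
```

Because `tukey_g` is exactly zero beyond `sigma`, intensity jumps larger than the scale parameter receive no flux at all, which is the "sharper boundaries" behaviour the abstract contrasts with earlier edge-stopping functions.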
Deterministic edge-preserving regularization in computed imaging
IEEE Trans. Image Processing, 1997
"... Abstract—Many image processing problems are ill posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. In this paper, we first give conditions for the design of such ..."
Abstract

Cited by 231 (23 self)
 Add to MetaCart
Abstract—Many image processing problems are ill posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. In this paper, we first give conditions for the design of such an edge-preserving regularization. Under these conditions, we show that it is possible to introduce an auxiliary variable whose role is twofold. First, it marks the discontinuities and ensures their preservation from smoothing. Second, it makes the criterion half-quadratic. The optimization is then easier. We propose a deterministic strategy, based on alternate minimizations on the image and the auxiliary variable. This leads to the definition of an original reconstruction algorithm, called ARTUR. Some theoretical properties of ARTUR are discussed. Experimental results illustrate the behavior of the algorithm. These results are shown in the field of tomography, but this method can be applied in a large number of applications in image processing.
Iterative Methods For Total Variation Denoising
 SIAM J. Sci. Comput.
"... Total Variation (TV) methods are very effective for recovering "blocky", possibly discontinuous, images from noisy data. A fixed point algorithm for minimizing a TVpenalized least squares functional is presented and compared with existing minimization schemes. A variant of the cellcentered finite ..."
Abstract

Cited by 230 (7 self)
 Add to MetaCart
Total Variation (TV) methods are very effective for recovering "blocky", possibly discontinuous, images from noisy data. A fixed point algorithm for minimizing a TV-penalized least squares functional is presented and compared with existing minimization schemes. A variant of the cell-centered finite difference multigrid method of Ewing and Shen is implemented for solving the (large, sparse) linear subproblems. Numerical results are presented for one- and two-dimensional examples; in particular, the algorithm is applied to actual data obtained from confocal microscopy.
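A minimal 1-D sketch of such a fixed point iteration, under stated assumptions: the nonlinear diffusivity 1/sqrt((Du)^2 + beta) is "lagged" (frozen at the previous iterate), so each step reduces to a symmetric positive definite linear solve. A dense `np.linalg.solve` stands in for the paper's multigrid solver, and `alpha`, `beta` are illustrative parameter names.

```python
import numpy as np

def lagged_diffusivity_tv(f, alpha=1.0, beta=1e-4, n_iter=40):
    """Fixed point iteration for
       min_u 0.5*||u - f||^2 + alpha*sum(sqrt((Du)_i^2 + beta)).
    The diffusivity is evaluated at the previous iterate ("lagged"),
    turning each step into a linear solve."""
    n = len(f)
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]          # forward differences
    u = f.astype(float).copy()
    for _ in range(n_iter):
        kappa = 1.0 / np.sqrt((D @ u) ** 2 + beta)    # lagged diffusivity
        A = np.eye(n) + alpha * D.T @ (kappa[:, None] * D)
        u = np.linalg.solve(A, f)
    return u
```

The small smoothing parameter `beta` keeps the diffusivity finite where the signal is flat; as `beta` shrinks the functional approaches pure total variation and the reconstructions become "blocky".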
A Variational Method In Image Recovery
 SIAM J. Numer. Anal., 1997
"... This paper is concerned with a classical denoising and deblurring problem in image recovery. Our approach is based on a variational method. By using the LegendreFenchel transform, we show how the nonquadratic criterion to be minimized can be split into a sequence of halfquadratic problems easier t ..."
Abstract

Cited by 101 (22 self)
 Add to MetaCart
This paper is concerned with a classical denoising and deblurring problem in image recovery. Our approach is based on a variational method. By using the Legendre-Fenchel transform, we show how the non-quadratic criterion to be minimized can be split into a sequence of half-quadratic problems that are easier to solve numerically. First we prove an existence and uniqueness result, and then we describe the algorithm for computing the solution and we give a proof of convergence. Finally, we present some experimental results for synthetic and real images.
A new alternating minimization algorithm for total variation image reconstruction
 SIAM J. Imaging Sci., 2008
"... We propose, analyze and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new halfquadratic model applicable to not only the anisotropic but also isotropic forms of total variati ..."
Abstract

Cited by 97 (16 self)
 Add to MetaCart
We propose, analyze, and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new half-quadratic model applicable not only to the anisotropic but also to the isotropic forms of total variation discretizations. The per-iteration computational complexity of the algorithm is three Fast Fourier Transforms (FFTs). We establish strong convergence properties for the algorithm, including finite convergence for some variables and relatively fast exponential (or q-linear in optimization terminology) convergence for the others. Furthermore, we propose a continuation scheme to accelerate the practical convergence of the algorithm. Extensive numerical results show that our algorithm performs favorably in comparison to several state-of-the-art algorithms. In particular, it runs orders of magnitude faster than the Lagged Diffusivity algorithm for total-variation-based deblurring. Some extensions of our algorithm are also discussed.
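The alternating scheme can be sketched for the simplest case: pure denoising (identity blur), where one FFT solve per iteration suffices. This is a hedged illustration rather than the paper's implementation; the parameters `mu` and `beta`, the periodic boundary handling via `np.roll`, and the iteration count are all assumptions.

```python
import numpy as np

def tv_denoise_alt_min(f, mu=10.0, beta=10.0, n_iter=60):
    """Alternating minimization of
       sum(|w|) + beta/2*||w - Du||^2 + mu/2*||u - f||^2
    (quadratic-penalty splitting of isotropic TV, identity blur for brevity).
    w-step: closed-form 2-D shrinkage.  u-step: one FFT solve, since D^T D
    is diagonalized by the 2-D DFT under periodic boundary conditions."""
    n1, n2 = f.shape
    # DFT eigenvalues of the difference operators (first-column/row kernels)
    kx = np.zeros((n1, n2)); kx[0, 0] = -1.0; kx[0, 1] = 1.0
    ky = np.zeros((n1, n2)); ky[0, 0] = -1.0; ky[1, 0] = 1.0
    denom = beta * (np.abs(np.fft.fft2(kx))**2 + np.abs(np.fft.fft2(ky))**2) + mu
    u = f.astype(float).copy()
    for _ in range(n_iter):
        ux = np.roll(u, -1, axis=1) - u                  # Du (forward diffs)
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2)
        shrink = np.maximum(mag - 1.0 / beta, 0.0) / np.maximum(mag, 1e-12)
        wx, wy = shrink * ux, shrink * uy                # w-step: shrinkage
        dtw = (np.roll(wx, 1, axis=1) - wx) + (np.roll(wy, 1, axis=0) - wy)
        # u-step: (beta*D^T D + mu*I) u = beta*D^T w + mu*f, solved by FFT
        u = np.real(np.fft.ifft2(np.fft.fft2(beta * dtw + mu * f) / denom))
    return u
```

With a blur operator K the u-step denominator gains a `|fft2(K)|^2` term and the right-hand side a `K^T f` term, which is where the "three FFTs per iteration" count in the abstract comes from.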
Conjugate-Gradient Preconditioning Methods for Shift-Variant PET Image Reconstruction
IEEE Trans. Image Processing, 2002
"... Gradientbased iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian mat ..."
Abstract

Cited by 51 (21 self)
 Add to MetaCart
Gradient-based iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian matrices in imaging problems. Circulant preconditioners can provide remarkable acceleration for inverse problems that are approximately shift-invariant, i.e., for those with approximately block-Toeplitz or block-circulant Hessians. However, in applications with nonuniform noise variance, such as arises from Poisson statistics in emission tomography and in quantum-limited optical imaging, the Hessian of the weighted least-squares objective function is quite shift-variant, and circulant preconditioners perform poorly. Additional shift-variance is caused by edge-preserving regularization methods based on non-quadratic penalty functions. This paper describes new preconditioners that approximate more accurately the Hessian matrices of shift-variant imaging problems. Compared to diagonal or circulant preconditioning, the new preconditioners lead to significantly faster convergence rates for the unconstrained conjugate-gradient (CG) iteration. We also propose a new efficient method for the line-search step required by CG methods. Applications to positron emission tomography (PET) illustrate the method.
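A toy 1-D sketch of the setting the abstract describes: when the weighted least-squares Hessian A = K^T W K + delta*I is shift-variant (nonuniform diagonal weights W), a circulant preconditioner can still be built, for example by freezing W at its mean, and inverted exactly with FFTs inside preconditioned CG. Everything below (the kernel `psf`, the weights `w`, `delta`, and the mean-weight construction) is an illustrative assumption, not the paper's PET preconditioners.

```python
import numpy as np

def pcg(apply_A, b, apply_Minv, n_iter=200, tol=1e-10):
    """Preconditioned conjugate gradients for a symmetric positive
    definite operator given as a function apply_A."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(n_iter):
        Ap = apply_A(p)
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Shift-variant Hessian A = K^T W K + delta*I: K is circular convolution
# with a blur kernel, W a diagonal matrix of nonuniform weights.
n = 64
rng = np.random.default_rng(0)
psf = np.zeros(n); psf[:3] = [0.5, 0.3, 0.2]
psf_hat = np.fft.fft(psf)
w = 0.5 + rng.random(n)            # nonuniform (noise-dependent) weights
delta = 0.1

def apply_A(x):
    Kx = np.real(np.fft.ifft(np.fft.fft(x) * psf_hat))
    return np.real(np.fft.ifft(np.fft.fft(w * Kx) * np.conj(psf_hat))) + delta * x

# Circulant preconditioner: freeze W at its mean, so M^{-1} is one FFT pair.
m_hat = w.mean() * np.abs(psf_hat) ** 2 + delta

def apply_Minv(r):
    return np.real(np.fft.ifft(np.fft.fft(r) / m_hat))
```

The abstract's point is that when `w` varies strongly across the image, this frozen-weight circulant M matches A poorly, which motivates preconditioners that track the shift-variant structure.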
Recovering Edges in Ill-Posed Inverse Problems: Optimality of Curvelet Frames
2000
"... We consider a model problem of recovering a function f(x1,x2) from noisy Radon data. The function f to be recovered is assumed smooth apart from a discontinuity along a C2 curve – i.e. an edge. We use the continuum white noise model, with noise level ɛ. Traditional linear methods for solving such in ..."
Abstract

Cited by 50 (14 self)
 Add to MetaCart
We consider a model problem of recovering a function f(x1, x2) from noisy Radon data. The function f to be recovered is assumed smooth apart from a discontinuity along a C^2 curve – i.e. an edge. We use the continuum white noise model, with noise level ε. Traditional linear methods for solving such inverse problems behave poorly in the presence of edges. Qualitatively, the reconstructions are blurred near the edges; quantitatively, they give in our model Mean Squared Errors (MSEs) that tend to zero with noise level ε only as O(ε^(1/2)) as ε → 0. A recent innovation – nonlinear shrinkage in the wavelet domain – visually improves edge sharpness and improves MSE convergence to O(ε^(2/3)). However, as we show here, this rate is not optimal. In fact, essentially optimal performance is obtained by deploying the recently introduced tight frames of curvelets in this setting. Curvelets are smooth, highly anisotropic elements ideally suited for detecting and synthesizing curved edges. To deploy them in the Radon setting, we construct a curvelet-based biorthogonal decomposition ...
On The Convergence Of The Lagged Diffusivity Fixed Point Method In Total Variation Image Restoration
1997
"... . In this paper we show that the lagged diffusivity fixed point algorithm introduced by Vogel and Oman in [10] to solve the problem of Total Variation denoising, proposed by Rudin, Osher and Fatemi in [9], is a particular instance of a class of algorithms introduced by Eckhardt and Voss in [11], who ..."
Abstract

Cited by 44 (4 self)
 Add to MetaCart
In this paper we show that the lagged diffusivity fixed point algorithm introduced by Vogel and Oman in [10] to solve the problem of Total Variation denoising, proposed by Rudin, Osher and Fatemi in [9], is a particular instance of a class of algorithms introduced by Eckhardt and Voss in [11], whose origins can be traced back to Weiszfeld's original work for minimizing a sum of Euclidean lengths [12]. Several proofs of the convergence of this algorithm have recently appeared [2], [3], [6]. Here we present a proof of global and linear convergence using the framework introduced in [11] and give a bound for the convergence rate of the fixed point iteration that agrees with our experimental results. These results are also valid for suitable generalizations of the fixed point algorithm. 1. Introduction. Recently, a new class of nonlinear PDE-based techniques has emerged for image restoration problems, primarily because they preserve sharp edges better. A particularly popular te...
Efficient schemes for total variation minimization under constraints in image processing
2007
"... ..."
Convex half-quadratic criteria and interacting auxiliary variables for image restoration
IEEE Trans. Image Processing, 2001
"... © 2001 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other w ..."
Abstract

Cited by 38 (13 self)
 Add to MetaCart
Abstract—This paper deals with convex half-quadratic criteria and associated minimization algorithms for the purpose of image restoration. It brings a number of original elements within a unified mathematical presentation based on convex duality. Firstly, Geman and Yang's [1] and Geman and Reynolds's [2] constructions are revisited, with a view to establishing convexity properties of the resulting half-quadratic augmented criteria when the original non-quadratic criterion is already convex. Secondly, a family of convex Gibbsian energies that incorporate interacting auxiliary variables is revealed as a potentially fruitful extension of Geman and Reynolds's construction. Index Terms—Convex duality, coordinate descent algorithms, edge-preserving restoration, Gibbs–Markov models, line processes.