Results 1–10 of 203
An Algorithm for Total Variation Minimization and Applications
2004
Cited by 349 (9 self)
We propose an algorithm for minimizing the total variation of an image, and provide a proof of convergence. We show applications to image denoising, zooming, and the computation of the mean curvature motion of interfaces.
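The dual projection iteration at the heart of this paper is simple enough to sketch directly. Below is a minimal NumPy sketch of a Chambolle-style fixed-point update on the dual field p; the step size tau = 0.125 matches the paper's convergence condition, while the discrete gradient/divergence pair and the parameter names (lam, n_iter) are illustrative choices rather than the paper's notation.

```python
import numpy as np

def grad(u):
    """Forward differences with homogeneous Neumann boundary."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[0, :] = px[0, :]; dx[1:-1, :] = px[1:-1, :] - px[:-2, :]; dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]; dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]; dy[:, -1] = -py[:, -2]
    return dx + dy

def chambolle_tv(f, lam=0.1, tau=0.125, n_iter=100):
    """Dual fixed-point iteration; returns the TV-denoised image u = f - lam * div(p)."""
    px = np.zeros_like(f); py = np.zeros_like(f)
    for _ in range(n_iter):
        # p <- (p + tau * grad(div p - f/lam)) / (1 + tau * |grad(div p - f/lam)|)
        gx, gy = grad(div(px, py) - f / lam)
        denom = 1.0 + tau * np.sqrt(gx**2 + gy**2)
        px = (px + tau * gx) / denom
        py = (py + tau * gy) / denom
    return f - lam * div(px, py)
```

Because the discrete divergence has zero total sum, the iteration preserves the image mean exactly, and a constant image is a fixed point.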
A review of image denoising algorithms, with a new one
Simul, 2005
Cited by 265 (2 self)
The search for efficient image denoising methods is still a valid challenge at the crossing of functional analysis and statistics. In spite of the sophistication of the recently proposed methods, most algorithms have not yet attained a desirable level of applicability. All show an outstanding performance when the image model corresponds to the algorithm assumptions but fail in general and create artifacts or remove image fine structures. The main focus of this paper is, first, to define a general mathematical and experimental methodology to compare and classify classical image denoising algorithms and, second, to propose a nonlocal means (NL-means) algorithm addressing the preservation of structure in a digital image. The mathematical analysis is based on the analysis of the “method noise,” defined as the difference between a digital image and its denoised version. The NL-means algorithm is proven to be asymptotically optimal under a generic statistical image model. The denoising performance of all considered methods is compared in four ways; mathematical: asymptotic order of magnitude of the method noise under regularity assumptions; perceptual-mathematical: the algorithms' artifacts and their explanation as a violation of the image model; quantitative experimental: by tables of L2 distances of the denoised version to the original image. The most powerful evaluation method seems, however, to be the visualization of the method noise on natural images. The more this method noise looks like real white noise, the better the method.
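The NL-means idea — replace each pixel by a weighted average of pixels whose surrounding patches look similar — can be sketched in a few lines. This is a deliberately naive O(N²) illustration assuming a small grayscale image; the parameter names (patch, search, h) and the reflect padding are illustrative choices, not the paper's exact scheme.

```python
import numpy as np

def nl_means(f, patch=3, search=7, h=0.1):
    """Naive non-local means for a small grayscale image (O(N^2) sketch)."""
    pad = patch // 2
    fp = np.pad(f, pad, mode='reflect')
    rows, cols = f.shape
    sw = search // 2
    out = np.zeros_like(f)
    for i in range(rows):
        for j in range(cols):
            p0 = fp[i:i + patch, j:j + patch]          # patch around (i, j)
            acc, wsum = 0.0, 0.0
            for m in range(max(0, i - sw), min(rows, i + sw + 1)):
                for n in range(max(0, j - sw), min(cols, j + sw + 1)):
                    q = fp[m:m + patch, n:n + patch]   # candidate patch
                    w = np.exp(-np.sum((p0 - q)**2) / h**2)
                    acc += w * f[m, n]
                    wsum += w
            out[i, j] = acc / wsum
    return out
```

Practical implementations restrict the search window (as here) and vectorize the patch comparisons; the quadratic loop above is only meant to make the weighting transparent.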
Iterative Methods For Total Variation Denoising
SIAM J. SCI. COMPUT
Cited by 230 (7 self)
Total Variation (TV) methods are very effective for recovering "blocky", possibly discontinuous, images from noisy data. A fixed point algorithm for minimizing a TV-penalized least squares functional is presented and compared with existing minimization schemes. A variant of the cell-centered finite difference multigrid method of Ewing and Shen is implemented for solving the (large, sparse) linear subproblems. Numerical results are presented for one- and two-dimensional examples; in particular, the algorithm is applied to actual data obtained from confocal microscopy.
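The fixed point iteration in question is often called lagged diffusivity: freeze the TV diffusion coefficients at the current iterate, solve the resulting linear system, and repeat. A 1-D sketch with a dense solve in place of the paper's multigrid; the parameters alpha (regularization weight) and beta (smoothing of the absolute value) are illustrative assumptions.

```python
import numpy as np

def tv_denoise_1d(f, alpha=0.3, beta=1e-3, n_iter=30):
    """Lagged-diffusivity fixed point for 1-D TV denoising (dense-solve sketch)."""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)                 # forward-difference matrix, (n-1) x n
    u = f.astype(float)
    for _ in range(n_iter):
        # freeze the diffusion coefficients 1 / sqrt((Du)^2 + beta^2) at the current u
        w = 1.0 / np.sqrt((D @ u)**2 + beta**2)
        # solve the linearized system (I + alpha * D^T W D) u_new = f
        A = np.eye(n) + alpha * D.T @ (w[:, None] * D)
        u = np.linalg.solve(A, f)
    return u
```

Each step is the minimizer of a quadratic majorant of the smoothed TV objective, so the objective is non-increasing; for large images the dense solve would be replaced by multigrid or another sparse solver, as in the paper.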
Modeling Textures with Total Variation Minimization and Oscillating Patterns in Image Processing
JOURNAL OF SCIENTIFIC COMPUTING, 2002
Cited by 150 (23 self)
This paper is devoted to the modeling of real textured images by functional minimization and partial differential equations. Following the ideas of Yves Meyer, in a total variation minimization framework of L. Rudin, S. Osher and E. Fatemi, we decompose a given (possibly textured) image f into a sum of two functions u + v, where u ∈ BV is a function of bounded variation (a cartoon or sketchy approximation of f), while v is a function representing the texture or noise. To model v we use the space of oscillating functions introduced by Yves Meyer, which is in some sense the dual of the BV space. The new algorithm is very simple, making use of differential equations, and is easily solved in practice. Finally, we implement the method by finite differences, and we present various numerical results on real textured images, showing the obtained decomposition u + v; we also show how the method can be used for texture discrimination and texture segmentation.
Mathematical Models for Local Nontexture Inpaintings
SIAM J. Appl. Math, 2002
Cited by 148 (30 self)
Inspired by the recent work of Bertalmio et al. on digital inpaintings [SIGGRAPH 2000], we develop general mathematical models for local inpaintings of nontexture images. On smooth regions, inpaintings are connected to the harmonic and biharmonic extensions, and inpainting orders are analyzed. For inpaintings involving the recovery of edges, we study a variational model that is closely connected to the classical total variation (TV) denoising model of Rudin, Osher, and Fatemi [Physica D, 60 (1992), pp. 259–268]. Other models are also discussed, based on the Mumford-Shah regularity [Comm. Pure Appl. Math., XLII (1989), pp. 577–685] and the curvature driven diffusions (CDD) of Chan and Shen [J. Visual Comm. Image Rep., 12 (2001)]. The broad applications of the inpainting models are demonstrated through restoring scratched old photos, disocclusion in vision analysis, text removal, digital zooming, and edge-based image coding.
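For the smooth-region case, the harmonic model mentioned above amounts to solving Laplace's equation inside the hole with the known pixels as boundary data. A minimal Jacobi-iteration sketch; the periodic shifts via np.roll are an implementation shortcut that is adequate when the hole lies in the interior of the image.

```python
import numpy as np

def harmonic_inpaint(f, mask, n_iter=500):
    """Fill mask==True pixels by solving the discrete Laplace equation (Jacobi sketch)."""
    u = f.copy()
    u[mask] = f[~mask].mean()   # crude initial guess inside the hole
    for _ in range(n_iter):
        # each masked pixel becomes the average of its four neighbors
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                      + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u[mask] = avg[mask]
    return u
```

Because a discrete linear ramp satisfies the four-point mean-value property exactly, harmonic inpainting reproduces linear image content inside the hole, which is the "inpainting order" behavior analyzed in the paper.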
Image Decomposition and Restoration Using Total Variation Minimization and the H^-1 Norm
Simul, 2002
Cited by 111 (16 self)
In this paper, we propose a new model for image restoration and decomposition, based on the total variation minimization of Rudin-Osher-Fatemi and on the results of Y. Meyer on oscillatory functions. An initial image f is decomposed into a cartoon part u and a texture or noise part v. The u component is modeled by a function of bounded variation, while the v component is modeled by an oscillatory function with bounded H^-1 norm. After some transformation, the resulting PDE is of fourth order. Finally, numerical results for image decomposition, denoising, and deblurring are shown.
The Digital TV Filter and Nonlinear Denoising
IEEE Trans. Image Process, 2001
Cited by 110 (14 self)
Motivated by the classical TV (total variation) restoration model, we propose a new nonlinear filter, the digital TV filter, for denoising and enhancing digital images or, more generally, data living on graphs. The digital TV filter is a data-dependent low-pass filter, capable of denoising data without blurring jumps or edges. In iterations, it solves a global total variation optimization problem, which differs from most statistical filters. Applications are given in the denoising of one-dimensional (1-D) signals, two-dimensional (2-D) data with irregular structures, gray scale and color images, and non-flat image features such as chromaticity.
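A minimal sketch of such a filter on the simplest graph, a 1-D chain. The weights follow the general digital-TV recipe (sums of inverse regularized local variations, plus a fitting term pulling toward the data), but the regularization constant a, the fitting weight lam, and the neighbor handling at the chain ends are illustrative assumptions.

```python
import numpy as np

def digital_tv_filter_1d(f, lam=1.0, a=1e-2, n_iter=50):
    """Digital-TV-style filter on a 1-D chain graph (sketch)."""
    n = len(f)
    u = f.astype(float)
    for _ in range(n_iter):
        # regularized local variation at every node
        d = np.zeros(n)
        d[:-1] += (u[1:] - u[:-1])**2
        d[1:] += (u[:-1] - u[1:])**2
        lv = np.sqrt(d + a**2)
        unew = np.empty(n)
        for i in range(n):
            nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
            # edge weight between i and j: 1/|grad u|_i + 1/|grad u|_j
            w = np.array([1.0 / lv[i] + 1.0 / lv[j] for j in nbrs])
            s = lam + w.sum()
            # filtered value: weighted neighbors plus a pull toward the data f
            unew[i] = (w @ u[nbrs] + lam * f[i]) / s
        u = unew
    return u
```

Because the filter coefficients at each node sum to one, constant data is a fixed point, while large local variations shrink the weights across edges and so preserve jumps.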
A Variational Method In Image Recovery
SIAM J. Numer. Anal, 1997
Cited by 101 (22 self)
This paper is concerned with a classical denoising and deblurring problem in image recovery. Our approach is based on a variational method. By using the Legendre-Fenchel transform, we show how the nonquadratic criterion to be minimized can be split into a sequence of half-quadratic problems easier to solve numerically. First we prove an existence and uniqueness result, and then we describe the algorithm for computing the solution and we give a proof of convergence. Finally, we present some experimental results for synthetic and real images.
A new alternating minimization algorithm for total variation image reconstruction
SIAM J. IMAGING SCI, 2008
Cited by 97 (16 self)
We propose, analyze and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new half-quadratic model applicable to not only the anisotropic but also the isotropic forms of total variation discretizations. The per-iteration computational complexity of the algorithm is three Fast Fourier Transforms (FFTs). We establish strong convergence properties for the algorithm, including finite convergence for some variables and relatively fast exponential (or q-linear in optimization terminology) convergence for the others. Furthermore, we propose a continuation scheme to accelerate the practical convergence of the algorithm. Extensive numerical results show that our algorithm performs favorably in comparison to several state-of-the-art algorithms. In particular, it runs orders of magnitude faster than the Lagged Diffusivity algorithm for total-variation-based deblurring. Some extensions of our algorithm are also discussed.
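The alternating scheme can be illustrated for the pure denoising case (blur operator = identity): a shrinkage step on the gradient, then an exact FFT solve of the quadratic u-subproblem under periodic boundary conditions. The splitting weights beta and mu here are illustrative, and a single fixed beta is used in place of the paper's continuation scheme.

```python
import numpy as np

def ftvd_denoise(f, mu=20.0, beta=10.0, n_iter=30):
    """Alternating minimization for anisotropic TV denoising (FFT-based sketch)."""
    # transfer functions of the periodic forward-difference operators
    dx = np.zeros_like(f); dx[0, 0] = -1.0; dx[0, -1] = 1.0
    dy = np.zeros_like(f); dy[0, 0] = -1.0; dy[-1, 0] = 1.0
    Dx, Dy = np.fft.fft2(dx), np.fft.fft2(dy)
    denom = beta * (np.abs(Dx)**2 + np.abs(Dy)**2) + mu
    F = np.fft.fft2(f)
    u = f.copy()
    for _ in range(n_iter):
        # w-step: componentwise soft shrinkage of the discrete gradient
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        wx = np.sign(ux) * np.maximum(np.abs(ux) - 1.0 / beta, 0.0)
        wy = np.sign(uy) * np.maximum(np.abs(uy) - 1.0 / beta, 0.0)
        # u-step: solve (beta D^T D + mu I) u = beta D^T w + mu f by FFT
        rhs = mu * F + beta * (np.conj(Dx) * np.fft.fft2(wx)
                               + np.conj(Dy) * np.fft.fft2(wy))
        u = np.real(np.fft.ifft2(rhs / denom))
    return u
```

With a blur kernel K, the same FFT solve works by adding |F(K)|² terms to the numerator and denominator, which is what keeps the per-iteration cost at a few FFTs.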
A New TwIST: Two-Step Iterative Shrinkage/Thresholding Algorithms for Image Restoration
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2007
Cited by 96 (19 self)
Iterative shrinkage/thresholding (IST) algorithms have been recently proposed to handle a class of convex unconstrained optimization problems arising in image restoration and other linear inverse problems. This class of problems results from combining a linear observation model with a nonquadratic regularizer (e.g., total variation or wavelet-based regularization). It happens that the convergence rate of these IST algorithms depends heavily on the linear observation operator, becoming very slow when this operator is ill-conditioned or ill-posed. In this paper, we introduce two-step IST (TwIST) algorithms, exhibiting much faster convergence rate than IST for ill-conditioned problems. For a vast class of nonquadratic convex regularizers (ℓp norms, some Besov norms, and total variation), we show that TwIST converges to a minimizer of the objective function for a given range of values of its parameters. For noninvertible observation operators, we introduce a monotonic version of TwIST (MTwIST); although the convergence proof does not apply to this scenario, we give experimental evidence that MTwIST exhibits similar speed gains over IST. The effectiveness of the new methods is experimentally confirmed on problems of image deconvolution and of restoration with missing samples.
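The two-step recursion itself is compact. A sketch for the ℓ1 regularizer, with soft thresholding as the denoising operator Ψ and assuming ||K|| ≤ 1; the default (alpha, beta) values here are illustrative, not the paper's eigenvalue-tuned choices.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding, the proximal map of t * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def twist(y, K, lam=0.05, alpha=1.9, beta=1.5, n_iter=200):
    """TwIST-style two-step iteration for min 0.5*||y - K x||^2 + lam*||x||_1."""
    x_prev = K.T @ y
    x = soft(x_prev + K.T @ (y - K @ x_prev), lam)   # one plain IST step to start
    for _ in range(n_iter):
        # two-step update: combine the two previous iterates with a fresh IST step
        x_new = ((1 - alpha) * x_prev
                 + (alpha - beta) * x
                 + beta * soft(x + K.T @ (y - K @ x), lam))
        x_prev, x = x, x_new
    return x
```

Setting alpha = beta = 1 recovers plain IST; the two-step memory term is what accelerates convergence on ill-conditioned K, at the price of parameter choices tied to the spectrum of K^T K.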