Results 1–10 of 171
Modeling Textures with Total Variation Minimization and Oscillating Patterns in Image Processing
 JOURNAL OF SCIENTIFIC COMPUTING, 2002
"... This paper is devoted to the modeling of real textured images by functional minimization and partial differential equations. Following the ideas of Yves Meyer in a total variation minimization framework of L. Rudin, S. Osher and E. Fatemi, we decompose a given (possible textured) image f into a su ..."
Abstract

Cited by 198 (24 self)
 Add to MetaCart
This paper is devoted to the modeling of real textured images by functional minimization and partial differential equations. Following the ideas of Yves Meyer, in the total variation minimization framework of L. Rudin, S. Osher, and E. Fatemi, we decompose a given (possibly textured) image f into a sum of two functions u + v, where u ∈ BV is a function of bounded variation (a cartoon or sketchy approximation of f), while v is a function representing the texture or noise. To model v we use the space of oscillating functions introduced by Yves Meyer, which is in some sense the dual of the BV space. The new algorithm is very simple, making use of differential equations, and is easily solved in practice. Finally, we implement the method by finite differences and present various numerical results on real textured images, showing the obtained decomposition u + v; we also show how the method can be used for texture discrimination and texture segmentation.
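As a concrete illustration of the u + v idea, the 1-D sketch below minimizes a smoothed ROF-type energy TV_ε(u) + λ‖f − u‖² by gradient descent and returns the cartoon part u and the oscillatory residual v = f − u. The function name, parameters, and scheme are ours, not the authors' algorithm:

```python
import numpy as np

def rof_decompose_1d(f, lam=10.0, eps=1e-3, tau=1e-3, iters=2000):
    """Split f into cartoon u and oscillatory v = f - u by gradient
    descent on the smoothed ROF energy TV_eps(u) + lam*||f - u||^2.
    (A sketch of the general u + v idea, not the paper's exact scheme.)"""
    u = f.copy()
    for _ in range(iters):
        du = np.diff(u)                          # forward differences
        w = du / np.sqrt(du * du + eps)          # smoothed sign of the gradient
        # negative adjoint of np.diff, i.e. a discrete divergence
        div = np.concatenate(([w[0]], np.diff(w), [-w[-1]]))
        u = u + tau * (div - 2.0 * lam * (u - f))
    return u, f - u

x = np.linspace(0.0, 1.0, 200)
f = (x > 0.5).astype(float) + 0.1 * np.sin(60 * np.pi * x)  # edge + texture
u, v = rof_decompose_1d(f)
```

Since gradient descent starts at u = f and decreases the energy, the cartoon u has strictly smaller total variation than f while u + v reproduces f exactly.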
Variable exponent, linear growth functionals in image processing
 SIAM Journal on Applied Mathematics, 2004
"... Abstract. We study a functional with variable exponent, 1 ≤ p(x) ≤ 2, which provides a model for image denoising, enhancement, and restoration. The diffusion resulting from the proposed model is a combination of Total Variation based regularization and Gaussian smoothing. The existence, uniqueness, ..."
Abstract

Cited by 112 (1 self)
 Add to MetaCart
(Show Context)
Abstract. We study a functional with variable exponent, 1 ≤ p(x) ≤ 2, which provides a model for image denoising, enhancement, and restoration. The diffusion resulting from the proposed model is a combination of Total Variation-based regularization and Gaussian smoothing. The existence, uniqueness, and long-time behavior of the proposed model are established. Experimental results illustrate the effectiveness of the model in image restoration.
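A minimal 1-D sketch of the variable-exponent idea: the exponent map below (a common choice, not necessarily the paper's exact formula) is close to 1 where the presmoothed gradient is large (TV-like) and close to 2 in flat regions (Gaussian-like), and one explicit step of the resulting diffusion follows. All names and constants are ours:

```python
import numpy as np

def p_exponent(f, k=50.0):
    # Exponent map in [1, 2]: near 1 (TV-like) where the presmoothed
    # gradient is large, near 2 (Gaussian-like) in flat regions.
    g = np.gradient(np.convolve(f, np.ones(5) / 5, mode="same"))
    return 1.0 + 1.0 / (1.0 + k * g * g)

def variable_p_step(u, p, tau=0.05, eps=1e-6):
    # One explicit 1-D step of u_t = ( |u'|^(p(x)-2) u' )'.
    du = np.diff(u)
    pm = 0.5 * (p[:-1] + p[1:])                   # p at cell interfaces
    flux = (np.abs(du) + eps) ** (pm - 2.0) * du
    return u + tau * np.concatenate(([flux[0]], np.diff(flux), [-flux[-1]]))

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
noisy = np.where(x > 0.5, 1.0, 0.0) + 0.05 * rng.standard_normal(200)
p = p_exponent(noisy)
den = noisy.copy()
for _ in range(30):
    den = variable_p_step(den, p)
```

In the flat regions p ≈ 2 and the scheme behaves like stable linear diffusion, so the noise there is visibly smoothed.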
Highly accurate optic flow computation with theoretically justified warping
 INTERNATIONAL JOURNAL OF COMPUTER VISION, 2006
"... ..."
(Show Context)
MINIMIZERS OF COST-FUNCTIONS INVOLVING NONSMOOTH DATA-FIDELITY TERMS. APPLICATION TO THE PROCESSING OF OUTLIERS
, 2002
"... We present a theoretical study of the recovery of an unknown vector x ∈ Rp (such as a signal or an image) from noisy data y ∈ Rq by minimizing with respect to x a regularized costfunction F(x, y) = Ψ(x, y) + αΦ(x), where Ψ is a datafidelity term, Φ is a smooth regularization term, and α> 0 i ..."
Abstract

Cited by 105 (19 self)
 Add to MetaCart
We present a theoretical study of the recovery of an unknown vector x ∈ ℝ^p (such as a signal or an image) from noisy data y ∈ ℝ^q by minimizing with respect to x a regularized cost-function F(x, y) = Ψ(x, y) + αΦ(x), where Ψ is a data-fidelity term, Φ is a smooth regularization term, and α > 0 is a parameter. Typically, Ψ(x, y) = ‖Ax − y‖², where A is a linear operator. The data-fidelity terms Ψ involved in regularized cost-functions are generally smooth functions; only a few papers make an exception to this, and they consider restricted situations. Nonsmooth data-fidelity terms are avoided in image processing. In spite of this, we consider both smooth and nonsmooth data-fidelity terms. Our goal is to capture essential features exhibited by the local minimizers of regularized cost-functions in relation to the smoothness of the data-fidelity term. In order to fix the context of our study, we consider Ψ(x, y) = Σ_i ψ(a_i^T x − y_i), where the a_i^T are the rows of A and ψ is C^m on ℝ \ {0}. We show that if ψ′(0−) < ψ′(0+), then typical data y give rise to local minimizers x̂ of F(·, y) which fit exactly a certain number of the data entries: there is a possibly large set ĥ of indexes such that a_i^T x̂ = y_i for every i ∈ ĥ. In contrast, if ψ is
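The stated phenomenon can be checked in a toy scalar case, assuming ψ(t) = |t| (so ψ′(0−) = −1 < 1 = ψ′(0+)) and Φ(x) = x²: the minimizer of the nonsmooth cost lands exactly on a data entry, while a smooth quadratic fidelity does not. The specific numbers below are ours:

```python
import numpy as np

# Scalar toy case: nonsmooth fidelity |x - y_i| fits a data entry exactly;
# smooth fidelity (x - y_i)^2 lands strictly between the data points.
y = np.array([1.0, 2.0, 4.0])
alpha = 0.1
xs = np.linspace(0.0, 5.0, 500001)          # fine grid search

F_abs = np.abs(xs[:, None] - y).sum(axis=1) + alpha * xs ** 2
F_sq = ((xs[:, None] - y) ** 2).sum(axis=1) + alpha * xs ** 2

x_abs = xs[F_abs.argmin()]   # sits at the kink x = y_2 = 2.0
x_sq = xs[F_sq.argmin()]     # away from every data entry
```

At x = 2 the one-sided derivatives of F_abs bracket zero (−0.6 on the left, +1.4 on the right), which is exactly the mechanism the abstract describes.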
On the Equivalence of Soft Wavelet Shrinkage, Total Variation Diffusion, Total Variation Regularization, and SIDEs
 SIAM J. NUMER. ANAL., 2004
"... Soft wavelet shrinkage, total variation (TV) diffusion, TV regularization, and a dynamical system called SIDEs are four useful techniques for discontinuity preserving denoising of signals and images. In this paper we investigate under which circumstances these methods are equivalent in the onedimen ..."
Abstract

Cited by 89 (18 self)
 Add to MetaCart
(Show Context)
Soft wavelet shrinkage, total variation (TV) diffusion, TV regularization, and a dynamical system called SIDEs are four useful techniques for discontinuity-preserving denoising of signals and images. In this paper we investigate under which circumstances these methods are equivalent in the one-dimensional case. First, we prove that Haar wavelet shrinkage on a single scale is equivalent to a single step of space-discrete TV diffusion or regularization of two-pixel pairs. In the translationally invariant case we show that applying cycle spinning to Haar wavelet shrinkage on a single scale can be regarded as an absolutely stable explicit discretization of TV diffusion. We prove that space-discrete TV diffusion and TV regularization are identical and that they are also equivalent to the SIDEs system when a specific force function is chosen. Afterwards, we show that wavelet shrinkage on multiple scales can be regarded as a single-step diffusion filtering or regularization of the Laplacian pyramid of the signal. We analyze possibilities to avoid Gibbs-like artifacts for multiscale Haar wavelet shrinkage by scaling the thresholds. Finally, we present experiments where hybrid methods are designed that combine the advantages of wavelets and PDE/variational approaches. These methods are based on iterated shift-invariant wavelet shrinkage at multiple scales with scaled thresholds.
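The two-pixel equivalence can be verified numerically. The sketch below uses orthonormal Haar coefficients and the step-size relation τ = θ/√2; this normalization and constant are our convention and may differ from the paper's:

```python
import numpy as np

def haar_shrink_pair(a, b, theta):
    # Single-scale orthonormal Haar transform + soft thresholding.
    c = (a + b) / np.sqrt(2.0)                    # approximation coefficient
    d = (a - b) / np.sqrt(2.0)                    # detail coefficient
    d = np.sign(d) * max(abs(d) - theta, 0.0)     # soft shrinkage
    return (c + d) / np.sqrt(2.0), (c - d) / np.sqrt(2.0)

def tv_diffusion_pair(a, b, tau):
    # One explicit step of two-pixel space-discrete TV diffusion,
    # clamped so neither pixel overshoots the common mean.
    step = min(tau, abs(a - b) / 2.0)
    s = np.sign(a - b)
    return a - step * s, b + step * s

# With tau = theta / sqrt(2), both routes give the same pixel pair.
shrunk = haar_shrink_pair(4.0, 1.0, 0.5)
diffused = tv_diffusion_pair(4.0, 1.0, 0.5 / np.sqrt(2.0))
```

When the detail coefficient falls below the threshold, both methods collapse the pair to its mean, which is the clamped regime of the diffusion step.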
High Resolution Forward and Inverse Earthquake Modeling on Terascale Computers
 In SC2003, 2003
"... For earthquake simulations to play an important role in the reduction of seismic risk, they must be capable of high resolution and high fidelity. We have developed algorithms and tools for earthquake simulation based on multiresolution hexahedral meshes. We have used this capability to carry out 1 H ..."
Abstract

Cited by 80 (27 self)
 Add to MetaCart
(Show Context)
For earthquake simulations to play an important role in the reduction of seismic risk, they must be capable of high resolution and high fidelity. We have developed algorithms and tools for earthquake simulation based on multiresolution hexahedral meshes. We have used this capability to carry out 1 Hz simulations of the 1994 Northridge earthquake in the LA Basin using 100 million grid points. Our wave propagation solver sustains 1.21 teraflop/s for 4 hours on 3000 AlphaServer processors at 80% parallel efficiency. Because of uncertainties in characterizing earthquake source and basin material properties, a critical remaining challenge is to invert for source and material parameter fields for complex 3D basins from records of past earthquakes. Towards this end, we present results for material and source inversion of high-resolution models of basins undergoing anti-plane motion using parallel scalable inversion algorithms that overcome many of the difficulties particular to inverse heterogeneous wave propagation problems.
Practical and Theoretical Aspects of Adjoint Parameter Estimation and Identifiability in . . .
, 1997
"... The present paper has two aims. One is to survey briefly the state of the art of parameter estimation in meteorology and oceanography in view of applications of 4D variational data assimilation techniques to inverse parameter estimation problems, which bear promise of serious positive impact on imp ..."
Abstract

Cited by 79 (6 self)
 Add to MetaCart
The present paper has two aims. One is to survey briefly the state of the art of parameter estimation in meteorology and oceanography, in view of applications of 4D variational data assimilation techniques to inverse parameter estimation problems, which bear promise of serious positive impact on improving model prediction. The other aim is to present crucial aspects of identifiability and stability essential for validating results of optimal parameter estimation, which have not been addressed so far in either the meteorological or the oceanographic literature. As noted by Yeh (1986, Water Resour. Res. 22, 95–108) in the context of ground water flow parameter estimation, the inverse or parameter estimation problem is often ill-posed and beset by instability and non-uniqueness, particularly if one seeks parameters distributed in the space-time domain. This approach will allow one to assess and rigorously validate results of parameter estimation, i.e. do they indeed represent a real identification of physical model parameters, or do they just compensate for model errors? A brief survey of other approaches for solving the problem of optimal parameter estimation in meteorology and oceanography is finally presented.
A MULTISCALE IMAGE REPRESENTATION USING HIERARCHICAL (BV, L²) DECOMPOSITIONS
 MULTISCALE MODEL. SIMUL., 2004
"... We propose a new multiscale image decomposition which offers a hierarchical, adaptive representation for the different features in general images. The starting point is a variational decomposition of an image, f = u0 { + v0, where [u0,v0] is the minimizer of a Jfunctional, J(f, λ0; X, Y) = infu+v= ..."
Abstract

Cited by 71 (11 self)
 Add to MetaCart
(Show Context)
We propose a new multiscale image decomposition which offers a hierarchical, adaptive representation for the different features in general images. The starting point is a variational decomposition of an image, f = u_0 + v_0, where [u_0, v_0] is the minimizer of a J-functional, J(f, λ_0; X, Y) = inf_{u+v=f} ‖u‖_X + λ_0 ‖v‖_Y^p. Such minimizers are standard tools for image manipulations
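A sketch of the hierarchical idea with a stand-in minimizer: to keep it short and exact we replace the (BV, L²) J-functional by an H¹ (Tikhonov) proxy, decompose the residual v_k with λ doubled at each level, and recover f as the sum of the u_k plus a small final residual. All names and constants here are ours:

```python
import numpy as np

def tikhonov_denoise(f, lam):
    # Proxy for the J-functional minimizer: argmin_u ||u'||^2 + lam*||f - u||^2,
    # i.e. (L / lam + I) u = f with a Neumann Laplacian L.  An H^1 stand-in
    # for the BV term, chosen only to keep the sketch short.
    n = len(f)
    L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1.0
    return np.linalg.solve(L / lam + np.eye(n), f)

def hierarchical_decompose(f, lam0=1.0, levels=6):
    # u0 captures the coarsest features; each further uk resolves the
    # previous residual at a doubled lambda, so f ~ u0 + u1 + ...
    us, v, lam = [], f.copy(), lam0
    for _ in range(levels):
        u = tikhonov_denoise(v, lam)
        us.append(u)
        v, lam = v - u, 2.0 * lam
    return us, v

x = np.linspace(0.0, 1.0, 128)
f = np.sin(2 * np.pi * x) + 0.3 * np.sin(16 * np.pi * x)
us, residual = hierarchical_decompose(f)
```

By construction the levels telescope, f = u_0 + … + u_{k} + v_k exactly, and the residual shrinks geometrically as λ doubles.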
Relations Between Regularization and Diffusion Filtering
 Journal of Mathematical Imaging and Vision, 1998
"... Regularization may be regarded as diffusion filtering with an implicit time discretization where one single step is used. Thus, iterated regularization with small regularization parameters approximates a diffusion process. The goal of this paper is to analyse relations between noniterated and iterat ..."
Abstract

Cited by 54 (13 self)
 Add to MetaCart
(Show Context)
Regularization may be regarded as diffusion filtering with an implicit time discretization in which one single step is used. Thus, iterated regularization with small regularization parameters approximates a diffusion process. The goal of this paper is to analyse relations between non-iterated and iterated regularization and diffusion filtering in image processing. In the linear setting, we show that iterated Tikhonov regularization handles noise better than non-iterated regularization. In the nonlinear framework, two filtering strategies are considered: total variation regularization and the diffusion filter of Perona and Malik. It is established that the Perona–Malik equation decreases the total variation during its evolution. While non-iterated and iterated total variation regularization are well-posed, one cannot expect to find a minimizing sequence which converges to a minimizer of the corresponding energy functional for the Perona–Malik filter. To address this shortcoming, a novel regu...
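The opening identity can be checked numerically in the linear 1-D case: one implicit (backward Euler) diffusion step, (I + τL)u = f, coincides with Tikhonov regularization, argmin_u ‖u − f‖² + τ‖Du‖², because L = DᵀD. The sketch below computes the two sides by different routes; the paper treats far more general filters:

```python
import numpy as np

n = 50
rng = np.random.default_rng(0)
f = np.sin(np.linspace(0.0, 3.0, n)) + 0.2 * rng.standard_normal(n)
tau = 2.0

D = np.diff(np.eye(n), axis=0)            # forward differences, shape (n-1, n)
L = D.T @ D                               # discrete Neumann Laplacian

# One implicit diffusion step with time step tau.
u_diffusion = np.linalg.solve(np.eye(n) + tau * L, f)

# Tikhonov minimizer computed independently as the least-squares problem
#   || [I; sqrt(tau) D] u - [f; 0] ||^2
A = np.vstack([np.eye(n), np.sqrt(tau) * D])
b = np.concatenate([f, np.zeros(n - 1)])
u_tikhonov = np.linalg.lstsq(A, b, rcond=None)[0]
```

Iterating the implicit solve with a small τ then approximates the full diffusion process, which is the "iterated regularization" viewpoint of the abstract.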