Results 1 – 5 of 5
A Nonlinear Primal-Dual Method For Total Variation-Based Image Restoration
, 1995
Abstract
Cited by 212 (22 self)
We present a new method for solving total variation (TV) minimization problems in image restoration. The main idea is to remove some of the singularity caused by the non-differentiability of the quantity |∇u| in the definition of the TV-norm before we apply a linearization technique such as Newton's method. This is accomplished by introducing an additional variable for the flux quantity appearing in the gradient of the objective function. Our method can be viewed as a primal-dual method as proposed by Conn and Overton [8] and Andersen [3] for the minimization of a sum of Euclidean norms. Experimental results show that the new method has much better global convergence behaviour than the primal Newton's method. 1. Introduction. During some phases of the manipulation of an image, some random noise and blurring is usually introduced. The presence of this noise and blurring makes the later phases of image processing difficult and inaccurate. The algorithms for noise removal and deblurring ...
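For orientation, a standard sketch of the setting described in this abstract (the symbols u, z, K, λ are the usual TV-restoration notation and are assumed here, not quoted from the paper):

```latex
% TV restoration: u is the image, z the observed data, K the blur operator,
% lambda a fit parameter.
\min_u \; \int_\Omega |\nabla u|\,dx \;+\; \frac{\lambda}{2}\,\|Ku - z\|_2^2
% The non-differentiability of |\nabla u| at \nabla u = 0 is handled by
% introducing a flux (dual) variable w satisfying
w\,|\nabla u| \;=\; \nabla u \qquad (\text{formally } w = \nabla u / |\nabla u|),
% and Newton's method is applied to the resulting primal-dual system.
```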
On implementing a primal-dual interior-point method for conic quadratic optimization
 MATHEMATICAL PROGRAMMING SER. B
, 2000
Abstract
Cited by 61 (5 self)
Conic quadratic optimization is the problem of minimizing a linear function subject to the intersection of an affine set and the product of quadratic cones. The problem is a convex optimization problem and has numerous applications in engineering, economics, and other areas of science. Indeed, linear and convex quadratic optimization is a special case. Conic quadratic optimization problems can in theory be solved efficiently using interior-point methods. In particular, it has been shown by Nesterov and Todd that primal-dual interior-point methods developed for linear optimization can be generalized to the conic quadratic case while maintaining their efficiency. Therefore, based on the work of Nesterov and Todd, we discuss an implementation of a primal-dual interior-point method for the solution of large-scale sparse conic quadratic optimization problems. The main features of the implementation are that it is based on a homogeneous and self-dual model, handles the rotated quadratic cone directly, and employs a Mehrotra-type predictor-corrector ...
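A sketch of the problem class the abstract refers to, in standard second-order cone notation (assumed, not taken from the paper):

```latex
\begin{aligned}
\min_x \;\; & c^T x\\
\text{s.t.}\;\; & Ax = b, \quad x \in \mathcal{K} = \mathcal{K}_1 \times \cdots \times \mathcal{K}_r,
\end{aligned}
% where each K_i is either the quadratic cone
\mathcal{Q}^n = \{\, x \in \mathbb{R}^n : x_1 \ge \|x_{2:n}\|_2 \,\}
% or the rotated quadratic cone
\mathcal{Q}_r^n = \{\, x \in \mathbb{R}^n : 2x_1 x_2 \ge \|x_{3:n}\|_2^2,\; x_1, x_2 \ge 0 \,\}.
```

Handling the rotated cone directly, as the abstract notes, avoids the linear change of variables that maps it onto the plain quadratic cone.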
A Computational Study of the Homogeneous Algorithm for Large-Scale Convex Optimization
, 1997
Abstract
Cited by 21 (1 self)
Recently the authors have proposed a homogeneous and self-dual algorithm for solving the monotone complementarity problem (MCP) [5]. The algorithm is a single-phase interior-point type method; nevertheless, it yields either an approximate optimal solution or detects a possible infeasibility of the problem. In this paper we specialize the algorithm to the solution of general smooth convex optimization problems that also possess nonlinear inequality constraints and free variables. We discuss an implementation of the algorithm for large-scale sparse convex optimization. Moreover, we present computational results for solving quadratically constrained quadratic programming and geometric programming problems, where some of the problems contain more than 100,000 constraints and variables. The results indicate that the proposed algorithm is also practically efficient. Department of Management, Odense University, Campusvej 55, DK-5230 Odense M, Denmark. Email: eda@busieco.ou.dk ...
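For intuition about why a single-phase method can also detect infeasibility, the LP specialization of a homogeneous and self-dual model (the Ye–Todd–Mizuno form; the MCP version in the paper generalizes this) reads:

```latex
\begin{aligned}
Ax - b\tau &= 0,\\
A^T y + s - c\tau &= 0,\\
-c^T x + b^T y - \kappa &= 0, \qquad x,\, s,\, \tau,\, \kappa \ge 0.
\end{aligned}
```

A solution with τ > 0 yields an optimal primal-dual pair (x/τ, y/τ), while τ = 0 with κ > 0 certifies primal or dual infeasibility, so one interior-point run answers both questions.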
Iterative Methods for Total Variation Image Restoration
, 1995
Abstract
Cited by 16 (4 self)
this paper. Others may involve nonlinear blurring operators, multiplicative noise, or noise with more complicated distributions and possible correlation with the image. The aim of image restoration is the estimation of the ideal true image from the recorded one. The direct problem of computing the imaging system response (blurred image) from a given image is often assumed to be known and well-posed.
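The direct (forward) problem mentioned here is usually written in the standard degradation-model notation (assumed, not quoted from the paper):

```latex
z \;=\; Ku + \eta
```

where K is the known linear blurring operator, η the noise, z the recorded image, and u the ideal image to be estimated; it is the inversion of K that is ill-posed and motivates TV regularization.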
A computational study of the homogeneous algorithm for large-scale convex optimization
, 1996
Abstract
Key words: Monotone complementarity problem, homogeneous and self-dual model, interior-point algorithms, large-scale convex optimization. 1 Introduction In 1984 Karmarkar [31] presented an interior-point method for linear programming (LP), and since then interior-point algorithms have enjoyed great publicity for two reasons. First, these algorithms solve LP problems in polynomial time, as proved by Karmarkar and many others. Secondly, interior-point algorithms have demonstrated excellent practical performance when solving large-scale LP problems; see Lustig et al. [37]. It was soon realized (see Gill et al. [25]) that Karmarkar's method was closely related to the logarithmic barrier algorithm for general nonlinear programming studied by Fiacco and McCormick [23] and others in the sixties. Hence, it is natural to investigate the efficiency of interior-point methods for solving more general classes of problems. In general, good complexity results can only be expected for convex optimization problems.
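The logarithmic barrier idea attributed above to Fiacco and McCormick can be illustrated on a toy one-dimensional problem; the problem, starting point, and barrier schedule below are invented for illustration and are not taken from the paper:

```python
def log_barrier_minimize(mu0=1.0, shrink=0.1, tol=1e-8):
    """Minimize x^2 subject to x >= 1 via the log barrier
    phi_mu(x) = x^2 - mu*log(x - 1), driving mu -> 0."""
    x = 2.0          # strictly feasible starting point (x > 1)
    mu = mu0
    while mu > tol:
        # Newton's method on the smooth barrier subproblem
        for _ in range(50):
            g = 2.0 * x - mu / (x - 1.0)      # gradient of phi_mu
            h = 2.0 + mu / (x - 1.0) ** 2     # second derivative (always > 0)
            step = g / h
            while x - step <= 1.0:            # damp to stay strictly feasible
                step *= 0.5
            x -= step
            if abs(g) < 1e-12:
                break
        mu *= shrink                          # tighten the barrier
    return x

print(round(log_barrier_minimize(), 4))  # approaches the constrained minimizer x = 1
```

Interior-point methods for LP refine the same idea: they follow the central path defined by these barrier subproblems, but take Newton steps on primal and dual variables simultaneously and reduce the barrier parameter adaptively.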