Results 11 – 20 of 586
An Extragradient-Based Alternating Direction Method for Convex Minimization
"... Abstract In this paper, we consider the problem of minimizing the sum of two convex functions subject to linear linking constraints. The classical alternating direction type methods usually assume that the two convex functions have relatively easy proximal mappings. However, many problems arising f ..."
Global Convergence of Subspace Correction Methods for Convex Optimization Problems
, 1998
"... A general space decomposition technique is used to solve nonlinear convex minimization problems. The differential of the minimization functional is required to satisfy some growth conditions that are weaker than Lipschitz continuity and strong monotonicity. Optimal rate of convergence is proved. If th ..."
Cited by 28 (5 self)
Proximal Newton-type methods for convex optimization
"... We seek to solve convex optimization problems in composite form: minimize_{x ∈ R^n} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : R^n → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. We derive a generalizatio ..."
Cited by 14 (0 self)
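The composite model in the snippet above (a smooth convex g plus a nondifferentiable h with a cheap proximal mapping) is also the setting of first-order proximal methods. A minimal proximal-gradient sketch, using the l1 norm for h so the proximal step is soft-thresholding; the lasso-style instance and all names are illustrative assumptions, not the paper's Newton-type method:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal mapping of t * ||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient(A, b, lam, step, iters=500):
    """Minimize f(x) = 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient:
    x_{k+1} = prox_{step*h}(x_k - step * grad g(x_k))."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth part g
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

The step size must stay below 1/L, where L is the largest eigenvalue of AᵀA, for the iteration to converge.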
Gradient methods for convex minimization: better rates under weaker conditions
, 2013
"... Abstract The convergence behavior of gradient methods for minimizing convex differentiable functions is one of the core questions in convex optimization. This paper shows that their well-known complexities can be achieved under conditions weaker than the commonly accepted ones. We relax the common ..."
Cited by 4 (3 self)
AN EFFECTIVE OPTIMIZATION ALGORITHM FOR LOCALLY NONCONVEX LIPSCHITZ FUNCTIONS BASED ON MOLLIFIER SUBGRADIENTS
, 2011
"... Abstract. We present an effective algorithm for minimization of locally nonconvex Lipschitz functions based on mollifier functions approximating the Clarke generalized gradient. To this aim, first we approximate the Clarke generalized gradient by mollifier subgradients. To construct this approximation, we use a set of averaged functions' gradients. Then, we show that the convex hull of this set serves as a good approximation for the Clarke generalized gradient. Using this approximation of the Clarke generalized gradient, we establish an algorithm for minimization of locally Lipschitz functions ..."
A Variable-Penalty Alternating Directions Method for Convex Optimization
"... We study a generalized version of the method of alternating directions as applied to the minimization of the sum of two convex functions subject to linear constraints. The method consists of solving consecutively in each iteration two optimization problems which contain in the objective function bot ..."
Cited by 27 (0 self)
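Both alternating-directions entries above target the minimization of a sum of two convex functions under a linear linking constraint, solving two easy subproblems per iteration. A minimal sketch of the basic fixed-penalty scheme on a toy quadratic instance with x = z as the linking constraint; all names and the instance are illustrative assumptions, not the paper's variable-penalty method:

```python
import numpy as np

def admm_average(a, b, rho=1.0, iters=100):
    """Alternating directions sketch for min 0.5||x-a||^2 + 0.5||z-b||^2
    subject to x = z. Each subproblem is a closed-form proximal step; u is
    the scaled multiplier for the linking constraint."""
    x = np.zeros_like(a); z = np.zeros_like(b); u = np.zeros_like(a)
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)   # argmin_x 0.5||x-a||^2 + (rho/2)||x - z + u||^2
        z = (b + rho * (x + u)) / (1.0 + rho)   # argmin_z 0.5||z-b||^2 + (rho/2)||x - z + u||^2
        u = u + x - z                           # multiplier (dual) update
    return x, z
```

For this instance the constrained minimizer is x = z = (a + b) / 2, so the iterates of both blocks should agree with the componentwise average at convergence.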
ON MINIMIZING THE MAXIMUM EIGENVALUE OF A SYMMETRIC MATRIX
, 1988
"... An important optimization problem that arises in control is to minimize φ(x), the largest eigenvalue (in magnitude) of a symmetric matrix function of x. If the matrix function is affine, φ(x) is convex. However, φ(x) is not differentiable, since the eigenvalues are not differentiable at points wher ..."
Cited by 75 (4 self)
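The objective described above is easy to evaluate numerically for an affine symmetric matrix function, which makes the nonsmoothness at eigenvalue crossings simple to probe. A minimal evaluation sketch; the function and argument names are illustrative assumptions, not from the paper:

```python
import numpy as np

def phi(x, A0, As):
    """phi(x) = largest eigenvalue in magnitude of the affine symmetric
    matrix function A(x) = A0 + sum_i x_i * A_i."""
    A = A0 + sum(xi * Ai for xi, Ai in zip(x, As))
    return np.max(np.abs(np.linalg.eigvalsh(A)))
```

With A0 = 0 and A_i = e_i e_iᵀ this reduces to max_i |x_i|, a convex function that is nondifferentiable wherever two coordinates tie in magnitude.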
SPG: Software for Convex-Constrained Optimization
, 2001
"... this paper we describe Fortran 77 software that implements the nonmonotone spectral projected gradient (SPG) algorithm. The SPG method applies to problems of the form min f(x) subject to x ∈ Ω, where Ω is a closed convex set in R^n. It is assumed that f is defined and has continuous partial derivatives on an open set that contains Ω. Users of the software must supply subroutines to compute the function f(x), the gradient ∇f(x), and projections of an arbitrary point x onto Ω. Information about the Hessian matrix is not required and the storage requirements are minimal. Therefore, the algorithm ..."

Cited by 11 (4 self)
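The SPG snippet above spells out exactly what a user supplies: f, ∇f, and a projection onto Ω. A minimal sketch of the spectral projected gradient idea with a box constraint standing in for Ω; this omits the nonmonotone line search of the actual Fortran 77 software, and all names are illustrative assumptions:

```python
import numpy as np

def spg_box(grad_f, lo, hi, x0, iters=200):
    """Spectral projected gradient sketch on the box [lo, hi].
    The spectral (Barzilai-Borwein) step length reuses the last step and
    gradient change; np.clip stands in for the user-supplied projection."""
    project = lambda v: np.clip(v, lo, hi)
    x = project(np.asarray(x0, dtype=float))
    g = grad_f(x)
    alpha = 1.0
    for _ in range(iters):
        x_new = project(x - alpha * g)
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0   # BB step length
        x, g = x_new, g_new
    return x
```

For a quadratic objective whose unconstrained minimizer lies outside the box, the iterates should settle at the projection of that minimizer onto the box.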
Integration of multi-view stereo and silhouettes via convex functionals on convex domains
 In European Conference on Computer Vision (ECCV)
, 2008
"... Abstract. We propose a convex framework for silhouette and stereo fusion in 3D reconstruction from multiple images. The key idea is to show that the reconstruction problem can be cast as one of minimizing a convex functional where the exact silhouette consistency is imposed as a convex constraint th ..."
Cited by 28 (6 self)
Nested iterative algorithms for convex constrained image recovery problems
 IEEE Journal of Selected Topics in Signal Processing
, 2007
"... The objective of this paper is to develop methods for solving image recovery problems subject to constraints on the solution. More precisely, we will be interested in problems which can be formulated as the minimization over a closed convex constraint set of the sum of two convex functions f and g, ..."
Cited by 30 (8 self)