Results 1 – 10 of 9,104
Subgradient algorithm on Riemannian manifolds
J. Optim. Theory Appl., 1998
"... The subgradient method is generalized to the context of Riemannian manifolds. The motivation can be seen in non-Euclidean metrics that occur in interior-point methods. In that frame, the natural curves for local steps are the geodesics relative to the specific Riemannian manifold. In this ..."
Cited by 24 (8 self)
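This entry's contribution is the Riemannian generalization; for orientation, the classical Euclidean subgradient method it starts from can be sketched as follows. This is a standard textbook sketch, not the paper's manifold version: the l1 objective, the 1/(k+1) step sizes, and the function names are illustrative assumptions.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, n_iters=500):
    """Basic Euclidean subgradient method with diminishing steps 1/(k+1).

    Tracks the best iterate seen, since subgradient steps need not
    decrease f monotonically.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(n_iters):
        g = subgrad(x)
        x = x - g / (k + 1)          # diminishing step size
        fx = f(x)
        if fx < best_f:
            best_f, best_x = fx, x.copy()
    return best_x, best_f

# Example: minimize the nonsmooth f(x) = |x1| + |x2| (minimum 0 at the origin).
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)       # a valid subgradient of the l1 norm
x_star, f_star = subgradient_method(f, subgrad, x0=[3.0, -2.0])
```

The Riemannian variant replaces the straight-line update `x - g/(k+1)` with a step along a geodesic of the manifold, which is exactly the generalization the abstract describes.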
Convergence Rate of Incremental Subgradient Algorithms
Stochastic Optimization: Algorithms and Applications, 2000
"... We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large-scale separable problems. The idea is to p ..."
Cited by 68 (7 self)
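The component-sum structure described here lends itself to a short sketch of the incremental idea: step along a subgradient of one component at a time rather than assembling the full subgradient of the sum. This is a generic illustration, not the paper's exact method or its rate analysis; the quadratic components, the cyclic sweep order, and the 1/(k+1) steps are assumptions made for the example.

```python
def incremental_subgradient(subgrads, x0, n_cycles=200):
    """Incremental subgradient method for min_x f(x) = sum_i f_i(x).

    Each cycle steps along a subgradient of one component f_i at a time,
    which is cheaper per step than forming the full subgradient when the
    number of components is large. Step size diminishes with cycle count.
    """
    x = float(x0)
    for k in range(n_cycles):
        step = 1.0 / (k + 1)
        for g_i in subgrads:         # sweep through the components in order
            x -= step * g_i(x)
    return x

# Example: f_i(x) = 0.5 * (x - c_i)**2, so sum_i f_i is minimized at the
# mean of the centers c_i.
centers = [1.0, 2.0, 6.0]
subgrads = [lambda x, c=c: x - c for c in centers]
x_star = incremental_subgradient(subgrads, x0=0.0)
# x_star approaches mean(centers) = 3.0
```

The `c=c` default argument pins each center inside its lambda; without it, Python's late binding would make every component use the last center.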
ON GENERAL AUGMENTED LAGRANGIANS AND A MODIFIED SUBGRADIENT ALGORITHM
2009
"... In this thesis we study a modified subgradient algorithm applied to the dual problem generated by augmented Lagrangians. We consider an optimization problem with equality constraints and study an exact version of the algorithm with a sharp Lagrangian in finite dimensional spaces. An inexact versi ..."
A Subgradient Algorithm for Nonlinear Integer Programming
1991
"... This paper describes a subgradient approach to nonlinear integer programming and, in particular, nonlinear 0-1 integer programming. In this approach, the objective function for a nonlinear integer program is considered as a nonsmooth function over the integer points. The subgradient and the supporting plane for the function are defined, and a necessary and sufficient condition for the optimal solution is established, based on the theory of nonsmooth analysis. A new algorithm, called the subgradient algorithm, is developed. The algorithm is in some sense an extension of Newton's method ..."
Incremental stochastic subgradient algorithms for convex optimization
SIAM J. Optim.
"... Abstract. In this paper we study the effect of stochastic errors on two constrained incremental subgradient algorithms. We view the incremental subgradient algorithms as decentralized network optimization algorithms as applied to minimize a sum of functions, when each component function is known o ..."
Cited by 48 (7 self)
The Exponentiated Subgradient Algorithm for Heuristic Boolean Programming
In Proc. IJCAI-01, 2001
"... Boolean linear programs (BLPs) are ubiquitous in AI. Satisfiability testing, planning with resource constraints, and winner determination in combinatorial auctions are all examples of this type of problem. Although increasingly well-informed by work in OR, current AI research has tended to focus on specialized algorithms for each type of BLP task and has only loosely patterned new algorithms on effective methods from other tasks. In this paper we introduce a single general-purpose local search procedure that can be simultaneously applied to the entire range of BLP problems, without ..."
Cited by 46 (2 self)
An Inexact Modified Subgradient Algorithm for Nonconvex Optimization
2008
"... We propose and analyze an inexact version of the modified subgradient (MSG) algorithm, which we call the IMSG algorithm, for nonsmooth and nonconvex optimization over a compact set. We prove that under an approximate, i.e. inexact, minimization of the sharp augmented Lagrangian, the main convergence ..."
Cited by 4 (1 self)
OSGA: A fast subgradient algorithm with optimal complexity
2014
"... This paper presents an algorithm for approximately minimizing a convex function in simple, not necessarily bounded convex domains, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter ..."
Cited by 5 (2 self)
An optimal subgradient algorithm with subspace search for costly convex optimization problems
"... This paper presents an acceleration of the optimal subgradient algorithm OSGA [30] for solving convex optimization problems, where the objective function involves costly affine and cheap nonlinear terms. We combine OSGA with a multidimensional subspace search technique, which leads to low-dimension ..."
Cited by 3 (3 self)