Results 1–10 of 32
Condition-Based Complexity Of Convex Optimization In Conic Linear Form Via The Ellipsoid Algorithm
, 1998
Abstract

Cited by 37 (17 self)
A convex optimization problem in conic linear form is an optimization problem of the form CP(d): maximize c^T x …
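The truncated abstract can be read against the standard conic linear form used in this condition-number literature; a sketch, with the cones C_X, C_Y and the data triple d = (A, b, c) assumed from that setting:

```latex
\mathrm{CP}(d):\quad \max_{x}\ c^{T}x
\quad \text{s.t.}\quad b - Ax \in C_{Y},\ \ x \in C_{X},
```

where C_X and C_Y are closed convex cones; linear programming is recovered when both are nonnegative orthants.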
On implementing a primal-dual interior-point method for conic quadratic optimization
 MATHEMATICAL PROGRAMMING SER. B
, 2000
Abstract

Cited by 36 (4 self)
Conic quadratic optimization is the problem of minimizing a linear function subject to the intersection of an affine set and the product of quadratic cones. The problem is a convex optimization problem and has numerous applications in engineering, economics, and other areas of science. Indeed, linear and convex quadratic optimization is a special case. Conic quadratic optimization problems can in theory be solved efficiently using interior-point methods. In particular it has been shown by Nesterov and Todd that primal-dual interior-point methods developed for linear optimization can be generalized to the conic quadratic case while maintaining their efficiency. Therefore, based on the work of Nesterov and Todd, we discuss an implementation of a primal-dual interior-point method for solution of large-scale sparse conic quadratic optimization problems. The main features of the implementation are that it is based on a homogeneous and self-dual model, handles the rotated quadratic cone directly, and employs a Mehrotra-type predictor-corrector …
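The quadratic (second-order) cone that this abstract refers to admits a closed-form Euclidean projection, a basic building block in conic solvers. A minimal sketch (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def project_soc(t, x):
    """Euclidean projection of the point (t, x) onto the quadratic cone
    Q = {(t, x) : t >= ||x||_2}, using the standard closed-form rule."""
    nx = np.linalg.norm(x)
    if nx <= t:                  # already inside the cone
        return t, x.copy()
    if nx <= -t:                 # inside the polar cone: project to the origin
        return 0.0, np.zeros_like(x)
    alpha = (t + nx) / 2.0       # otherwise: scale onto the cone boundary
    return alpha, (alpha / nx) * x
```

For example, `project_soc(1.0, np.array([3.0, 4.0]))` returns the boundary point `(3.0, [1.8, 2.4])`, whose vector part has norm exactly 3.0.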
On the Riemannian geometry defined by self-concordant barriers and interior-point methods
 Found. Comput. Math
Abstract

Cited by 28 (0 self)
We consider the Riemannian geometry defined on a convex set by the Hessian of a self-concordant barrier function, and its associated geodesic curves. These provide guidance for the construction of efficient interior-point methods for optimizing a linear function over the intersection of the set with an affine manifold. We show that algorithms that follow the primal-dual central path are in some sense close to optimal. The same is true for methods that follow the shifted primal-dual central path among certain infeasible-interior-point methods. We also compute the geodesics in several simple sets. ∗ Copyright (C) by Springer-Verlag. Foundations of Computational Mathematics 2 (2002), 333–361.
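The metric in question is the Hessian metric induced by the barrier; a sketch of the standard definition, with F denoting the self-concordant barrier (notation assumed):

```latex
g_x(u, v) \;=\; u^{T}\,\nabla^{2}F(x)\,v,
\qquad
\|u\|_x \;=\; \sqrt{u^{T}\,\nabla^{2}F(x)\,u}.
```

Curve lengths are measured in this position-dependent local norm, and the geodesics studied in the paper are the length-minimizing curves of this metric.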
Duality Results For Conic Convex Programming
, 1997
Abstract

Cited by 15 (10 self)
This paper presents a unified study of duality properties for the problem of minimizing a linear function over the intersection of an affine space with a convex cone in finite dimension. Existing duality results are carefully surveyed and some new duality properties are established. Examples are given to illustrate these new properties. The topics covered in this paper include Gordan-Stiemke type theorems, Farkas type theorems, perfect duality, Slater condition, regularization, Ramana's duality, and approximate dualities. The dual representations of various convex sets, convex cones and conic convex programs are also discussed.
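The primal-dual pair underlying all of these duality results is the standard conic one; as a reference point, with K a closed convex cone and K* its dual cone:

```latex
\text{(P)}\quad \min_{x}\ c^{T}x \ \ \text{s.t.}\ \ Ax = b,\ x \in K,
\qquad
\text{(D)}\quad \max_{y,s}\ b^{T}y \ \ \text{s.t.}\ \ A^{T}y + s = c,\ s \in K^{*}.
```

Weak duality follows directly: for any feasible pair, c^T x - b^T y = x^T s ≥ 0, while the strong-duality and regularization questions surveyed in the paper concern when this gap closes.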
A full-Newton step O(n) infeasible interior-point algorithm for linear optimization
, 2005
Abstract

Cited by 9 (6 self)
We present a primal-dual infeasible interior-point algorithm. As usual, the algorithm decreases the duality gap and the feasibility residuals at the same rate. Assuming that an optimal solution exists, it is shown that at most O(n) iterations suffice to reduce the duality gap and the residuals by the factor 1/e. This implies an O(n log(n/ε)) iteration bound for getting an ε-solution of the problem at hand, which coincides with the best known bound for infeasible interior-point algorithms. The algorithm constructs strictly feasible iterates for a sequence of perturbations of the given problem and its dual problem. A special feature of the algorithm is that it uses only full-Newton steps. Two types of full-Newton steps are used, so-called feasibility steps and usual (centering) steps. Starting at strictly feasible iterates of a perturbed pair, (very) close to its central path, feasibility steps serve to generate strictly feasible iterates for the next perturbed pair. By accomplishing a few centering steps for the new perturbed pair we obtain strictly feasible iterates close enough to the central path of the new perturbed pair. The algorithm finds an optimal solution or detects infeasibility or unboundedness of the given problem.
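A sketch of the perturbed pair described in the abstract, with r_b^0 and r_c^0 denoting the initial primal and dual residuals (notation assumed, not taken from the paper):

```latex
(P_{\nu}):\quad \min\ c^{T}x \ \ \text{s.t.}\ \ b - Ax = \nu\, r_b^{0},\ x \ge 0,
\qquad
(D_{\nu}):\quad \max\ b^{T}y \ \ \text{s.t.}\ \ c - A^{T}y - s = \nu\, r_c^{0},\ s \ge 0.
```

At ν = 1 the initial (infeasible) point is strictly feasible for the perturbed pair; driving ν to 0 recovers the original problems, which is why reducing the residuals and the duality gap at the same rate yields the stated iteration bound.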
A New Self-Dual Embedding Method for Convex Programming
 Journal of Global Optimization
, 2001
Abstract

Cited by 9 (2 self)
In this paper we introduce a conic optimization formulation for inequality-constrained convex programming, and propose a self-dual embedding model for solving the resulting conic optimization problem. The primal and dual cones in this formulation are characterized by the original constraint functions and their corresponding conjugate functions respectively. Hence they are completely symmetric. This allows for a standard primal-dual path-following approach for solving the embedded problem. Moreover, there are two immediate logarithmic barrier functions for the primal and dual cones. We show that these two logarithmic barrier functions are conjugate to each other. The explicit form of the conjugate functions is in fact not required to be known in the algorithm. An advantage of the new approach is that there is no need to assume an initial feasible solution to start with. To guarantee the polynomiality of the path-following procedure, we may apply the self-concordant barrier theory of Nesterov and Nemirovski. For this purpose, as one application, we prove that the barrier functions constructed this way are indeed self-concordant when the original constraint functions are convex and quadratic. Keywords: Convex Programming, Convex Cones, Self-Dual Embedding, Self-Concordant Barrier Functions.
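For orientation, the classical homogeneous self-dual embedding for linear programming, which constructions like this one generalize to conic settings (a sketch; the paper's model replaces the nonnegativity cones with the primal and dual cones built from the constraint functions):

```latex
Ax - b\tau = 0,\qquad
c\tau - A^{T}y = s,\qquad
b^{T}y - c^{T}x = \kappa,\qquad
x,\ s \ge 0,\ \ \tau,\ \kappa \ge 0.
```

The system is trivially feasible (everything zero), so no initial feasible point for the original problem is needed; at a suitable solution, τ > 0 yields the optimal pair (x/τ, y/τ), while κ > 0 certifies primal or dual infeasibility.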
Interior-point methods for optimization
, 2008
Abstract

Cited by 6 (0 self)
This article describes the current state of the art of interior-point methods (IPMs) for convex, conic, and general nonlinear optimization. We discuss the theory, outline the algorithms, and comment on the applicability of this class of methods, which have revolutionized the field over the last twenty years.