Results 1-9 of 9
LAGRANGE MULTIPLIERS AND OPTIMALITY
, 1993
Abstract

Cited by 89 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
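The classical first-order system the abstract refers to can be sketched with a toy example (my own illustration, not taken from the paper): for minimize f subject to g = 0, stationarity reads grad f = lam * grad g together with feasibility g = 0.

```python
# A minimal sketch (toy example, not from the paper) of the first-order
# Lagrange system. For  minimize f(x, y) = x**2 + y**2
# subject to  g(x, y) = x + y - 1 = 0,  the system
#     2x = lam,  2y = lam,  x + y = 1
# is linear and gives x = y = 1/2, lam = 1.

def grad_f(x, y):
    return (2.0 * x, 2.0 * y)

def grad_g(x, y):
    return (1.0, 1.0)

def solve_lagrange_system():
    # 2x = lam and 2y = lam force x = y; x + y = 1 then gives x = y = 1/2.
    x = y = 0.5
    lam = 2.0 * x
    return x, y, lam

x, y, lam = solve_lagrange_system()
gf, gg = grad_f(x, y), grad_g(x, y)
assert gf == (lam * gg[0], lam * gg[1])  # stationarity: grad f = lam * grad g
assert x + y - 1.0 == 0.0                # feasibility: g(x, y) = 0
```

The "modern applications" the abstract mentions replace the equation constraint g = 0 with more complicated side conditions, which is where the nonsmooth machinery comes in.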
Topics In Convex Optimization: Interior-Point Methods, Conic Duality and Approximations
Abstract

Cited by 1 (0 self)
Contents: Preface; Introduction; Part I, Interior-Point Methods; 1 Interior-point methods for linear optimization (1.1.1 Linear optimization; 1.1.2 The simplex method; 1.1.3 A first glimpse on interior-point methods; 1.1.4 A short historical account); 1.2 Building blocks (1.2.1 Duality; 1.2.2 Optimality conditions; 1.2.3 Newton's method; 1.2.4 Barrier function; 1.2.5 The central path ...)
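The barrier-function and central-path entries in the contents above can be illustrated with a one-variable toy problem (my own sketch, not from the thesis): for minimize x subject to x >= 1, the log-barrier subproblem is minimize x - mu*log(x - 1), and setting its derivative to zero gives the central-path point x(mu) = 1 + mu, which tends to the solution x* = 1 as mu -> 0.

```python
# Toy sketch (not from the thesis): barrier subproblem and central path for
#   minimize x   subject to   x >= 1.
# The barrier objective is  phi(x) = x - mu * log(x - 1);  its stationarity
# condition  1 - mu/(x - 1) = 0  gives the central-path point x(mu) = 1 + mu.

def barrier_minimizer(mu):
    """Closed-form minimizer of the barrier subproblem for this toy problem."""
    return 1.0 + mu

def newton_barrier(mu, iters=30):
    """Newton's method on phi(x) = x - mu*log(x - 1), started strictly
    feasible and close to the central path; should agree with 1 + mu."""
    x = 1.0 + mu / 2.0
    for _ in range(iters):
        grad = 1.0 - mu / (x - 1.0)   # phi'(x)
        hess = mu / (x - 1.0) ** 2    # phi''(x)
        x -= grad / hess              # undamped Newton step
    return x

# central-path points approach the solution x* = 1 as mu shrinks
assert barrier_minimizer(0.1) > barrier_minimizer(0.01) > 1.0
assert abs(newton_barrier(0.1) - barrier_minimizer(0.1)) < 1e-9
```

Interior-point methods follow exactly this pattern in higher dimensions: approximately minimize the barrier objective by Newton's method, then shrink mu and repeat.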
Algorithms for Convex Multiquadratic Programming
, 1997
Abstract
A convex multiquadratic program is defined as minimizing a strictly convex quadratic function subject to convex quadratic inequality constraints. The associated Lagrangian dual problem is a strictly concave maximization problem subject to nonnegativity constraints. In this thesis three methods for solving the dual program are developed. The methods are based on the Projected Gradient Method, Sequential Quadratic Programming, and Affine Scaling respectively. Furthermore, an algorithm for solving a convex quadratic program subject to a spherical constraint, which uses Hessenberg reduction of a symmetric positive semidefinite matrix, is developed. Computational results for dense and randomly generated small to medium size convex multiquadratic programs are presented. Acknowledgments: This report constitutes my master's thesis. The work presented in this thesis was conducted at the Department of Industrial and Systems Engineering (ISE), at University of Florida, Gainesville, Florida during the...
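The first dual approach mentioned can be sketched on a one-dimensional toy QP (my own example, not one of the thesis algorithms): projected gradient ascent on a concave dual with nonnegativity constraints.

```python
# Toy sketch (not the thesis implementation): projected gradient ascent on
# the Lagrangian dual of  min (1/2) x**2  s.t.  x >= 1.
# The Lagrangian L(x, lam) = x**2/2 + lam*(1 - x) is minimized at x = lam,
# giving the concave dual  g(lam) = lam - lam**2/2  over lam >= 0,
# maximized at lam* = 1 with primal recovery x* = lam* = 1.

def projected_gradient_ascent(step=0.5, iters=100):
    lam = 0.0
    for _ in range(iters):
        grad = 1.0 - lam                   # g'(lam)
        lam = max(0.0, lam + step * grad)  # ascent step, then project onto lam >= 0
    return lam

lam_star = projected_gradient_ascent()
x_star = lam_star  # x minimizes L(., lam*)
assert abs(lam_star - 1.0) < 1e-9 and abs(x_star - 1.0) < 1e-9
```

The projection here is just clipping at zero, which is why nonnegativity-constrained duals are attractive targets for gradient-type methods.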
Calculating the Cone of Directions of Constancy, H. Wolkowicz
Abstract
Abstract. This note presents an algorithm that finds the cone of directions of constancy of a differentiable, faithfully convex function. Key Words. Cone of directions of constancy, faithfully convex functions, gradient.
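A direction of constancy is one along which the function never changes value; the idea can be checked numerically on a small example (my own illustration, not the paper's algorithm).

```python
# Small numeric check (not the paper's algorithm). For the convex function
# f(x, y) = x**2, the cone of directions of constancy is the y-axis
# {(0, t)}: f never changes along such directions, while any direction with
# a nonzero x-component changes f.

def f(x, y):
    return x * x

def is_direction_of_constancy(d, x0=(0.3, -1.2), ts=(0.5, 1.0, 2.0, -3.0)):
    """Test numerically whether f is constant along the line x0 + t*d."""
    base = f(*x0)
    return all(
        abs(f(x0[0] + t * d[0], x0[1] + t * d[1]) - base) < 1e-12
        for t in ts
    )

assert is_direction_of_constancy((0.0, 1.0))       # along the y-axis
assert not is_direction_of_constancy((1.0, 0.0))   # not along the x-axis
```

For a faithfully convex function this cone is the same at every point, which is what makes a single algorithmic characterization possible.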
North-Holland Publishing Company. CHARACTERIZATIONS OF OPTIMALITY WITHOUT CONSTRAINT QUALIFICATION FOR THE ABSTRACT CONVEX PROGRAM
, 1979
Abstract
We consider the general abstract convex program (P): minimize f(x), subject to g(x) ∈ S, where f is an extended convex functional on X, g: X → Y is S-convex, S is a closed convex cone, and X and Y are topological linear spaces. We present primal and dual characterizations for (P). These characterizations are derived by reducing the problem to a standard Lagrange multiplier problem. Examples given include operator-constrained problems as well as semi-infinite programming problems.
North-Holland Publishing Company. GEOMETRY OF OPTIMALITY CONDITIONS AND CONSTRAINT QUALIFICATIONS: THE CONVEX CASE
, 1979
Abstract
The cones of directions of constancy are used to derive: new as well as known optimality conditions; weakest constraint qualifications; and regularization techniques, for the convex programming problem. In addition, the "badly behaved set" of constraints, i.e. the set of constraints which causes problems in the Kuhn-Tucker theory, is isolated, and a computational procedure for checking whether a feasible point is regular or not is presented.
Method of Reduction in Convex Programming
Abstract
Abstract. We present an algorithm which solves a convex program with faithfully convex (not necessarily differentiable) constraints. While finding a feasible starting point, the algorithm reduces the program to an equivalent program for which Slater's condition is satisfied. Included are algorithms for calculating various objects which have recently appeared in the literature. Stability of the algorithm is discussed. Key Words. Convexity, subdifferentials, cones of directions of constancy, equality set of constraints, stability.
Preprocessing and ...
, 2013
Abstract
This paper presents a backward stable preprocessing technique for (nearly) ill-posed semidefinite programming (SDP) problems, i.e., programs for which the Slater constraint qualification, existence of strictly feasible points, (nearly) fails. Current popular algorithms for semidefinite programming rely on primal-dual interior-point (p-d i-p) methods. These algorithms require the Slater constraint qualification for both the primal and dual problems. This assumption guarantees the existence of Lagrange multipliers, well-posedness of the problem, and stability of algorithms. However, there are many instances of SDPs where the Slater constraint qualification fails or nearly fails. Our backward stable preprocessing technique is based on applying the Borwein-Wolkowicz facial reduction process to find a finite number, k, of rank-revealing orthogonal rotations of the problem. After an appropriate truncation, this results in a smaller, well-posed, nearby problem that satisfies the Robinson constraint qualification, and one that can be solved by standard SDP solvers. ...
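Why facial reduction helps when Slater's condition fails can be seen on a 2x2 toy case (my own illustration, not the paper's procedure): a pinned diagonal zero collapses the feasible set onto a face of the psd cone, and restricting to that face yields a smaller problem with strictly feasible points.

```python
# Toy 2x2 illustration (not the paper's procedure). Suppose the constraints
# pin X[0][0] = 0 for a psd matrix [[x11, x12], [x12, x22]]. Psd-ness
# requires x11*x22 - x12**2 >= 0, so x11 = 0 forces x12 = 0 too: every
# feasible matrix lies on the face {diag(0, x22) : x22 >= 0}, and the
# problem reduces to a 1x1 problem in x22 that does have strictly feasible
# points (any x22 > 0), restoring the constraint qualification.

def is_psd_2x2(x11, x12, x22):
    """Positive semidefiniteness test for a symmetric 2x2 matrix."""
    return x11 >= 0.0 and x22 >= 0.0 and x11 * x22 - x12 * x12 >= 0.0

def forced_offdiagonal(x11):
    """Facial-reduction-style deduction: with x11 pinned at 0, the only
    feasible off-diagonal value is 0."""
    assert x11 == 0.0
    return 0.0

# with x11 = 0, any nonzero x12 destroys psd-ness
assert not is_psd_2x2(0.0, 0.5, 1.0)
assert is_psd_2x2(0.0, forced_offdiagonal(0.0), 1.0)
```

The paper's rank-revealing rotations generalize this deduction: each rotation identifies such a face and discards the directions it rules out.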