Results 1 – 7 of 7
LAGRANGE MULTIPLIERS AND OPTIMALITY, 1993
Abstract

Cited by 89 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
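The classical viewpoint the abstract opens with can be sketched in a few lines. This is a minimal toy problem of our own (not from the paper): minimize f(x, y) = x^2 + y^2 subject to x + y = 1, where stationarity of the Lagrangian turns the constrained problem into a small system of equations.

```python
# Minimal sketch of first-order (Lagrange) optimality conditions as a
# system of equations.  Toy problem (illustrative, not from the paper):
#   minimize  f(x, y) = x^2 + y^2   subject to  g(x, y) = x + y - 1 = 0.
#
# Stationarity of L(x, y, lam) = f(x, y) - lam * g(x, y) gives
#   2x - lam = 0,   2y - lam = 0,   x + y - 1 = 0,
# a linear system solvable by hand.

def solve_kkt():
    # From 2x = lam and 2y = lam we get x = y = lam / 2; substituting
    # into the constraint, lam / 2 + lam / 2 = 1, so lam = 1.
    lam = 1.0
    x = lam / 2.0
    y = lam / 2.0
    return x, y, lam

x, y, lam = solve_kkt()
print(x, y, lam)  # 0.5 0.5 1.0
```

Here the multiplier lam = 1 is exactly the "auxiliary variable" of the classical view; the modern interpretations surveyed in the paper (dual solutions, sensitivity of the optimal value) attach further meaning to this same number.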
THE THEORY OF 2-REGULARITY FOR MAPPINGS WITH LIPSCHITZIAN DERIVATIVES AND ITS APPLICATIONS TO OPTIMALITY CONDITIONS, 2002
Abstract

Cited by 17 (15 self)
We study the local structure of a nonlinear mapping near points where standard regularity and/or smoothness assumptions need not be satisfied. We introduce a new concept of 2-regularity (a certain kind of second-order regularity) for a once differentiable mapping whose derivative is Lipschitz continuous. Under this 2-regularity condition, we obtain the representation theorem and the covering theorem (i.e., stability with respect to “right-hand side” perturbations) under assumptions that are weaker than those previously employed in the literature for results of this type. These results are further used to derive a constructive description of the tangent cone to a set defined by (2-regular) equality constraints and optimality conditions for related optimization problems. The class of mappings introduced and studied in the paper appears to be a convenient tool for treating complementarity structures by means of an appropriate equation-based reformulation. Optimality conditions for mathematical programs with (equivalently reformulated) complementarity constraints are also discussed.
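To see why second-order notions like 2-regularity are needed, consider a toy degenerate point (our own example, not taken from the paper): the mapping F(x1, x2) = x1^2 - x2^2 satisfies F(0, 0) = 0, but its derivative vanishes at the origin, so the classical (first-order) regularity condition that the derivative be onto cannot hold there.

```python
# Toy illustration (not from the paper) of a point where classical
# first-order regularity fails: F(x1, x2) = x1^2 - x2^2 vanishes at the
# origin together with its entire derivative, so no first-order condition
# can describe the feasible set {F = 0} there -- second-order information
# (the Lipschitzian-derivative setting of 2-regularity) takes over.

def F(x1, x2):
    return x1 * x1 - x2 * x2

def grad_F(x1, x2, h=1e-6):
    # Central finite differences for the two partial derivatives of F.
    d1 = (F(x1 + h, x2) - F(x1 - h, x2)) / (2 * h)
    d2 = (F(x1, x2 + h) - F(x1, x2 - h)) / (2 * h)
    return d1, d2

g = grad_F(0.0, 0.0)
print(g)  # both components are zero: the derivative is not surjective
```

The set {F = 0} near the origin is the pair of crossing lines x1 = ±x2, whose tangent cone is not captured by the (zero) derivative alone; constructive descriptions of such cones are precisely what the paper's 2-regularity machinery delivers.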
Optimality conditions for irregular inequality-constrained problems
 SIAM J. Control Optim.
Abstract

Cited by 8 (8 self)
We consider feasible sets given by conic constraints, where the cone defining the constraints is convex with nonempty interior. We study the case where the feasible set is not assumed to be regular in the classical sense of Robinson and obtain a constructive description of the tangent cone under a certain new second-order regularity condition. This condition contains classical regularity as a special case, while being weaker when constraints are twice differentiable. Assuming that the cone defining the constraints is finitely generated, we also derive a special form of primal-dual optimality conditions for the corresponding constrained optimization problem. Our results subsume optimality conditions for both the classical regular and second-order regular cases, while still being meaningful in the more general setting in the sense that the multiplier associated with the objective function is nonzero.
CONSTRAINT QUALIFICATIONS
 ENCYCLOPEDIA OF OPERATIONS RESEARCH AND MANAGEMENT SCIENCE
Abstract

Cited by 3 (3 self)
We discuss assumptions on the constraint functions that allow a constructive description of the geometry of a given set around a given point in terms of the constraints' derivatives. Consequences for characterizing solutions of variational and optimization problems are discussed. In the optimization case, these include primal and primal-dual first- and second-order necessary optimality conditions.
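A concrete instance of such an assumption is the linear independence constraint qualification (LICQ), which asks that the gradients of the active constraints be linearly independent at the point of interest. The following sketch uses a toy example of our own (not from the encyclopedia entry): at the origin, with constraints g1(x) = x1 >= 0 and g2(x) = x1 + x2^2 >= 0 both active, the two gradients coincide, so LICQ fails.

```python
# Hedged sketch of checking LICQ for two active constraints in R^2
# (toy example, not from the encyclopedia entry).  For two gradients in
# the plane, linear independence is equivalent to a nonzero determinant.

def licq_holds(grads):
    # grads: list of two gradient vectors [(a, b), (c, d)].
    (a, b), (c, d) = grads
    return abs(a * d - b * c) > 1e-12

# At x = (0, 0):  grad g1 = (1, 0)  and  grad g2 = (1, 2*x2) = (1, 0).
g1 = (1.0, 0.0)
g2 = (1.0, 0.0)
print(licq_holds([g1, g2]))  # False: the gradients are parallel, LICQ fails
```

When LICQ (or a weaker qualification) fails like this, the linearized constraints no longer describe the set's local geometry, which is exactly the situation the surveyed constraint qualifications are designed to classify.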
Second Order Optimality Conditions in Generalized Semi-Infinite Programming
Abstract

Cited by 1 (0 self)
This paper deals with generalized semi-infinite optimization problems where the (infinite) index set of inequality constraints depends on the state variables and all involved functions are twice continuously differentiable. Necessary and sufficient second-order optimality conditions for such problems are derived under assumptions which imply that the corresponding optimal value function is second-order (parabolically) directionally differentiable and second-order epi-regular at the considered point. These sufficient conditions are, in particular, equivalent to the second-order growth condition.
Second Order Necessary and Sufficient Conditions for Efficiency in Multiobjective Programming Problems
, 1998
Second order convexity and a modified objective function method in mathematical programming
 Control and Cybernetics