Results 1–10 of 25
LAGRANGE MULTIPLIERS AND OPTIMALITY
, 1993
"... Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write firstorder optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions ..."
Abstract

Cited by 88 (7 self)
 Add to MetaCart
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a freestanding exposition of basic nonsmooth analysis as motivated by and applied to this subject.
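The classical view the abstract starts from — multipliers as auxiliary variables that turn first-order optimality conditions into a system of equations — can be made concrete for an equality-constrained quadratic program. The following sketch (plain NumPy, illustrative problem data chosen here, not taken from the paper) assembles and solves that KKT system directly:

```python
import numpy as np

# Minimize f(x) = 1/2 x^T Q x - c^T x  subject to  A x = b.
# First-order (KKT) conditions:  Q x - c + A^T lam = 0  and  A x = b,
# i.e. one linear system in the primal variables x and the multipliers lam.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])   # f(x, y) = x^2 + y^2
c = np.zeros(2)
A = np.array([[1.0, 1.0]])               # single constraint: x + y = 1
b = np.array([1.0])

n, m = Q.shape[0], A.shape[0]
kkt = np.block([[Q, A.T], [A, np.zeros((m, m))]])  # saddle-point matrix
rhs = np.concatenate([c, b])
sol = np.linalg.solve(kkt, rhs)
x, lam = sol[:n], sol[n:]

print(x)    # minimizer [0.5 0.5]
print(lam)  # multiplier of x + y = 1, here -1.0
```

With this sign convention the stationarity equation reads 2x + λ = 0 at x = (1/2, 1/2), so λ = −1; the "more complicated side conditions" of the abstract are exactly what breaks this clean linear-system picture.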
Constraint qualifications and necessary optimality conditions for optimization problems with variational inequality constraints
 SIAM J. Optim
"... Abstract. A very general optimization problem with a variational inequality constraint, inequality constraints, and an abstract constraint are studied. Fritz John type and Kuhn–Tucker type necessary optimality conditions involving Mordukhovich coderivatives are derived. Several constraint qualificat ..."
Abstract

Cited by 18 (12 self)
 Add to MetaCart
A very general optimization problem with a variational inequality constraint, inequality constraints, and an abstract constraint is studied. Fritz John type and Kuhn–Tucker type necessary optimality conditions involving Mordukhovich coderivatives are derived. Several constraint qualifications for the Kuhn–Tucker type necessary optimality conditions are introduced and their relationships are studied. Applications to bilevel programming problems are also given.
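In the notation standard for this literature (a generic form, not quoted from the paper), such a problem couples an ordinary objective and constraints with a lower-level variational inequality, which can equivalently be written as a generalized equation via the normal cone:

```latex
\min_{x,\,y}\; f(x,y)
\quad\text{s.t.}\quad g(x,y) \le 0,\;\; (x,y) \in \Omega,\;\;
y \in S(x) := \{\, y \in C : \langle F(x,y),\, z - y\rangle \ge 0 \ \ \forall\, z \in C \,\},
\qquad\text{i.e.}\quad 0 \in F(x,y) + N_C(y),
```

where \(N_C(y)\) is the normal cone to \(C\) at \(y\). It is the set-valued term \(N_C(y)\) that calls for coderivative (Mordukhovich) calculus rather than classical gradients.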
Stability theory for parametric generalized equations and variational inequalities via nonsmooth analysis
 Trans. Amer. Math. Soc
, 1994
"... In this paper we develop a stability theory for broad classes of parametric generalized equations and variational inequalities in finite dimensions. These objects have a wide range of applications in optimization, nonlinear analysis, mathematical economics, etc. Our main concern is the Lipschitzian ..."
Abstract

Cited by 15 (3 self)
 Add to MetaCart
In this paper we develop a stability theory for broad classes of parametric generalized equations and variational inequalities in finite dimensions. These objects have a wide range of applications in optimization, nonlinear analysis, mathematical economics, etc. Our main concern is the Lipschitzian stability of multivalued solution maps depending on parameters. We employ a new approach to nonsmooth analysis based on the generalized differentiation of multivalued and nonsmooth operators. This approach allows us to obtain effective sufficient conditions as well as necessary and sufficient conditions for a natural Lipschitzian behavior of solution maps. In particular, we prove new criteria for the existence of Lipschitzian multivalued and single-valued implicit functions.
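The standard setting behind this abstract (notation assumed here, not verbatim from the paper) is a parametric generalized equation and its solution map, with "natural Lipschitzian behavior" usually formalized as the Aubin (pseudo-Lipschitz) property:

```latex
% Parametric generalized equation and solution map:
0 \in f(x, p) + Q(x), \qquad
S(p) := \{\, x \;:\; 0 \in f(x, p) + Q(x) \,\},
% Aubin property of S at (\bar p, \bar x): there exist \kappa \ge 0 and
% neighborhoods U of \bar x, V of \bar p such that
S(p') \cap U \;\subset\; S(p) + \kappa\,\|p' - p\|\,\mathbb{B}
\qquad \text{for all } p,\, p' \in V.
```

Taking \(Q(x) = N_C(x)\) recovers parametric variational inequalities as the special case.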
A Survey of Subdifferential Calculus with Applications
 TMA
, 1998
"... This survey is an account of the current status of subdifferential research. It is intended to serve as an entry point for researchers and graduate students in a wide variety of pure and applied analysis areas who might profitably use subdifferentials as tools. ..."
Abstract

Cited by 14 (6 self)
 Add to MetaCart
This survey is an account of the current status of subdifferential research. It is intended to serve as an entry point for researchers and graduate students in a wide variety of pure and applied analysis areas who might profitably use subdifferentials as tools.
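The basic object such surveys treat is the subdifferential; the textbook example is the convex function f(x) = |x|, whose subdifferential at 0 is the interval [−1, 1]. A minimal numerical illustration (ours, not from the survey) checks the defining subgradient inequality f(y) ≥ f(0) + g·y on a grid:

```python
# For convex f, g is a subgradient at x iff f(y) >= f(x) + g*(y - x) for all y.
def f(x):
    return abs(x)

candidates = [-1.0, -0.5, 0.0, 0.5, 1.0]      # slopes inside [-1, 1]
ys = [i / 10.0 for i in range(-50, 51)]       # test grid on [-5, 5]

# Every slope in [-1, 1] satisfies the subgradient inequality at x = 0 ...
for g in candidates:
    assert all(f(y) >= f(0.0) + g * y for y in ys)

# ... while a slope outside [-1, 1] violates it somewhere (e.g. g = 2 at y = 1):
assert any(f(y) < f(0.0) + 2.0 * y for y in ys)
print("subdifferential of |x| at 0 contains [-1, 1]")
```

For nonconvex functions the survey's subject matter is precisely which of the competing generalized subdifferentials retains a usable calculus.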
Necessary and sufficient optimality conditions for mathematical programs with equilibrium constraints
, 2005
"... ..."
Sensitivity analysis of the value function for optimization problems with variational inequality constraints
 SIAM J. Control Optim
, 2001
"... In this paper we perform sensitivity analysis for optimization problems with variational inequality constraints (OPVICs). We provide upper estimates for the limiting subdifferential (singular limiting subdifferential) of the value function in terms of the set of normal (abnormal) coderivative (CD) m ..."
Abstract

Cited by 12 (7 self)
 Add to MetaCart
In this paper we perform sensitivity analysis for optimization problems with variational inequality constraints (OPVICs). We provide upper estimates for the limiting subdifferential (singular limiting subdifferential) of the value function in terms of the set of normal (abnormal) coderivative (CD) multipliers for OPVICs. For the case of optimization problems with complementarity constraints (OPCCs), we provide upper estimates for the limiting subdifferentials in terms of various multipliers. An example shows that the other multipliers may not provide useful information on the subdifferentials of the value function, while the CD multipliers may provide tighter bounds. Applications to sensitivity analysis of bilevel programming problems are also given.
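The link between multipliers and derivatives of the value function can be seen in a smooth special case (a classical fact, far simpler than the OPVIC setting of the paper; the example problem is ours). Perturbing the constraint of min { x² : 1 − x ≤ p } and differencing the value function recovers −λ:

```python
# V(p) = min_x { x^2 : 1 - x <= p }.  For p < 1 the constraint is active,
# x(p) = 1 - p, V(p) = (1 - p)^2, and sensitivity theory gives V'(p) = -lam,
# where lam is the KKT multiplier of the constraint.
def V(p):
    x = max(1.0 - p, 0.0)          # minimizer: constraint active for p < 1
    return x * x

p0, h = 0.0, 1e-6
dV = (V(p0 + h) - V(p0 - h)) / (2 * h)   # central difference for V'(0)
lam = 2.0 * (1.0 - p0)                   # stationarity: 2x - lam = 0 at x = 1

print(dV, -lam)   # both approximately -2.0
```

When the value function is nonsmooth, as for OPVICs, these one-sided sensitivities are exactly what the limiting-subdifferential estimates of the paper bound.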
Equivalent subgradient versions of Hamiltonian and Euler-Lagrange equations in variational analysis
 SIAM J. Control and Optimization
, 1996
"... Abstract. Much effort in recent years has gone into generalizing the classical Hamiltonian and EulerLagrange equations of the calculus of variations so as to encompass problems in optimal control and a greater variety of integrands and constraints. These generalizations, in which nonsmoothness abou ..."
Abstract

Cited by 10 (3 self)
 Add to MetaCart
Much effort in recent years has gone into generalizing the classical Hamiltonian and Euler-Lagrange equations of the calculus of variations so as to encompass problems in optimal control and a greater variety of integrands and constraints. These generalizations, in which nonsmoothness abounds and gradients are systematically replaced by subgradients, have succeeded in furnishing necessary conditions for optimality which reduce to the classical ones in the classical setting, but important issues have remained unsettled, especially concerning the exact relationship between the subgradient versions of the Hamiltonian equations and those of the Euler-Lagrange equations. Here it is shown that new, tighter subgradient versions of these equations are actually equivalent to each other. The theory of epi-convergence of convex functions provides the technical basis for this development. Key words. Euler-Lagrange equations, Hamiltonian equations, variational analysis, nonsmooth analysis, subgradients, optimality.
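For orientation, the classical smooth forms being generalized are (standard statements, not the paper's sharpened subgradient versions, in which the gradients become subgradients and the equations become inclusions):

```latex
% Euler-Lagrange equation along a trajectory x(t):
\frac{d}{dt}\,\nabla_{v} L\big(x(t), \dot x(t)\big)
  \;=\; \nabla_{x} L\big(x(t), \dot x(t)\big),
% Hamiltonian system, with H the convex conjugate of L in the velocity:
H(x, p) \;=\; \sup_{v}\,\{\langle p, v\rangle - L(x, v)\}, \qquad
\dot x(t) = \nabla_{p} H\big(x(t), p(t)\big), \quad
\dot p(t) = -\nabla_{x} H\big(x(t), p(t)\big).
```

The equivalence question the abstract raises is whether the subgradient analogues of these two systems select the same trajectories.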
New necessary conditions for the generalized problem of Bolza
 SIAM J. Control Optim
, 1996
"... Abstract. Problems of optimal control are considered in the neoclassical Bolza format, which centers on states and velocities and relies on nonsmooth analysis. Subgradient versions of the EulerLagrange equation and the Hamiltonian equation are shown to be necessary for the optimality of a trajector ..."
Abstract

Cited by 10 (3 self)
 Add to MetaCart
Problems of optimal control are considered in the neoclassical Bolza format, which centers on states and velocities and relies on nonsmooth analysis. Subgradient versions of the Euler-Lagrange equation and the Hamiltonian equation are shown to be necessary for the optimality of a trajectory, moreover in a newly sharpened form that makes these conditions equivalent to each other. At the same time, the assumptions on the Lagrangian integrand are weakened substantially over what has been required previously in obtaining such conditions. Key words. Optimal control, calculus of variations, nonsmooth analysis, problem of Bolza, Euler-Lagrange condition, Hamiltonian condition, transversality condition. AMS subject classifications. 49K15, 49K05, 49K24.
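The generalized Bolza format referred to here is usually written as follows (a standard statement of the format, with our notation):

```latex
\text{minimize}\quad
\ell\big(x(a),\, x(b)\big) \;+\; \int_a^b L\big(t,\, x(t),\, \dot x(t)\big)\, dt
\quad \text{over arcs } x : [a, b] \to \mathbb{R}^n,
```

where the endpoint function \(\ell\) and the Lagrangian integrand \(L\) may take the value \(+\infty\), so that endpoint and state-velocity constraints are absorbed as infinite penalties rather than listed separately. This is what lets one problem format cover both the calculus of variations and optimal control.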
Approximate Jacobian Matrices for Nonsmooth Continuous Maps and C¹ Optimization
 To appear in SIAM J. Control Optim
, 1997
"... A notion of approximate Jacobian matrix is introduced for a continuous vectorvalued map. It is shown for instance that the Clarke generalized Jacobian is an approximate Jacobian for a locally Lipschitz map. The approach is based on the idea of convexificators of realvalued functions. Mean value co ..."
Abstract

Cited by 9 (2 self)
 Add to MetaCart
A notion of approximate Jacobian matrix is introduced for a continuous vector-valued map. It is shown, for instance, that the Clarke generalized Jacobian is an approximate Jacobian for a locally Lipschitz map. The approach is based on the idea of convexificators of real-valued functions. Mean value conditions for continuous vector-valued maps and Taylor expansions for continuously Gâteaux differentiable functions (i.e., C¹ functions) are presented in terms of approximate Jacobians and approximate Hessians, respectively. Second-order necessary and sufficient conditions for optimality and convexity of C¹ functions are also given.
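The one-dimensional prototype of the Clarke generalized Jacobian (a standard example, not the paper's construction) is f(x) = |x| at 0: difference quotients from the two sides converge to the slopes −1 and +1, and the generalized Jacobian is their convex hull [−1, 1]. A quick numerical sketch:

```python
# For the locally Lipschitz map f(x) = |x|, one-sided difference quotients
# at 0 recover the two limiting slopes; the Clarke generalized Jacobian at 0
# is the convex hull of such limits, here the interval [-1, 1].
def f(x):
    return abs(x)

def slope(x0, h):
    return (f(x0 + h) - f(x0)) / h

right = slope(0.0, 1e-8)      # approaches +1.0 from the right
left = slope(0.0, -1e-8)      # approaches -1.0 from the left
hull = sorted([left, right])  # endpoints of the generalized Jacobian

print(hull)   # [-1.0, 1.0]
```

An approximate Jacobian in the paper's sense is any set of matrices supporting the same kind of mean value and optimality statements, possibly smaller than the Clarke set.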
METRIC INEQUALITY, SUBDIFFERENTIAL CALCULUS AND APPLICATIONS
, 2000
"... In this paper, we establish characterizations of Asplund spaces in terms of conditions ensuring the metric inequality and intersection formulae. Then we establish chain rules for the limiting Fréchet subdifferentials. Necessary conditions for constrained optimization problems with nonLipschitz dat ..."
Abstract

Cited by 8 (3 self)
 Add to MetaCart
In this paper, we establish characterizations of Asplund spaces in terms of conditions ensuring the metric inequality and intersection formulae. Then we establish chain rules for the limiting Fréchet subdifferentials. Necessary conditions for constrained optimization problems with non-Lipschitz data are derived.
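The metric inequality in question is usually stated as follows for two closed sets \(C_1, C_2\) near a point \(\bar x\) (the standard local form, often called linear regularity; notation ours):

```latex
d\big(x,\; C_1 \cap C_2\big) \;\le\; \tau\,\big[\, d(x,\, C_1) + d(x,\, C_2) \,\big]
\qquad \text{for some } \tau > 0 \text{ and all } x \text{ near } \bar x.
```

It bounds the distance to the intersection by the separate distances, which is what makes intersection formulae for normal cones and the resulting subdifferential chain rules go through.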