Results 1–10 of 59
LAGRANGE MULTIPLIERS AND OPTIMALITY
1993
Abstract

Cited by 98 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a freestanding exposition of basic nonsmooth analysis as motivated by and applied to this subject.
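The classical viewpoint described in this abstract — multipliers as auxiliary variables that turn first-order optimality into a system of equations — can be sketched numerically. The toy example below is mine, not the paper's: for the equality-constrained problem min x² + y² subject to x + y = 1, stationarity of the Lagrangian plus the constraint form a linear system in (x, y, λ).

```python
import numpy as np

# First-order optimality written formally as a system of equations:
# minimize f(x, y) = x^2 + y^2  subject to  g(x, y) = x + y - 1 = 0.
# Stationarity of the Lagrangian L = f - lam * g gives
#   2x - lam = 0,   2y - lam = 0,   together with   x + y = 1.
K = np.array([[2.0, 0.0, -1.0],
              [0.0, 2.0, -1.0],
              [1.0, 1.0,  0.0]])
rhs = np.array([0.0, 0.0, 1.0])
x, y, lam = np.linalg.solve(K, rhs)
print(x, y, lam)  # minimizer (0.5, 0.5) with multiplier lam = 1.0
```

Here λ = 1 is also the derivative of the optimal value with respect to the constraint level, illustrating the "generalized derivative of the optimal value" interpretation the abstract mentions.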
Nonsmooth Analysis of Eigenvalues
MATHEMATICAL PROGRAMMING, 1998
Abstract

Cited by 37 (10 self)
The eigenvalues of a symmetric matrix depend on the matrix nonsmoothly. This paper describes the nonsmooth analysis of these eigenvalues. In particular, I present a simple formula for the approximate (limiting Fréchet) subdifferential of an arbitrary function of the eigenvalues, subsuming earlier results on convex and Clarke subgradients. As an example I compute the subdifferential of the k-th largest eigenvalue.
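The nonsmooth dependence of eigenvalues on the matrix is easy to see numerically. A minimal sketch (mine, not from the paper): the largest eigenvalue of diag(1 + t, 1 − t) equals 1 + |t|, so the one-sided difference quotients at the eigenvalue crossing t = 0 disagree.

```python
import numpy as np

# Largest eigenvalue of the symmetric matrix A(t) = diag(1 + t, 1 - t).
# Analytically lam_max(t) = 1 + |t|: smooth for t != 0,
# but nonsmooth at the eigenvalue crossing t = 0.
def lam_max(t):
    A = np.array([[1.0 + t, 0.0], [0.0, 1.0 - t]])
    return np.linalg.eigvalsh(A)[-1]  # eigvalsh sorts eigenvalues ascending

h = 1e-6
right = (lam_max(h) - lam_max(0.0)) / h   # one-sided slope from the right: +1
left = (lam_max(0.0) - lam_max(-h)) / h   # one-sided slope from the left: -1
print(right, left)
```

The mismatch of the two slopes is exactly the situation where a subdifferential, rather than a gradient, is the right object.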
Constraint qualifications and necessary optimality conditions for optimization problems with variational inequality constraints
 SIAM J. Optim
Abstract

Cited by 22 (12 self)
A very general optimization problem with a variational inequality constraint, inequality constraints, and an abstract constraint is studied. Fritz John type and Kuhn–Tucker type necessary optimality conditions involving Mordukhovich coderivatives are derived. Several constraint qualifications for the Kuhn–Tucker type necessary optimality conditions are introduced and their relationships are studied. Applications to bilevel programming problems are also given.
Variational Analysis of Non-Lipschitz Spectral Functions
MATHEMATICAL PROGRAMMING, 1999
Abstract

Cited by 22 (13 self)
We consider spectral functions f ∘ λ, where f is any permutation-invariant mapping from C^n to R, and λ is the eigenvalue map from C^{n×n} to C^n, ordering the eigenvalues lexicographically. For example, if f is the function "maximum real part", then f ∘ λ is the spectral abscissa, while if f is "maximum modulus", then f ∘ λ is the spectral radius. Both these spectral functions are continuous, but they are neither convex nor Lipschitz. For our analysis, we use the notion of subgradient extensively analyzed in Variational Analysis by R. T. Rockafellar and R. J.-B. Wets (Springer, 1998), which is particularly well suited to the variational analysis of non-Lipschitz spectral functions. We derive a number of necessary conditions for subgradients of spectral functions. For the spectral abscissa, we give both necessary and sufficient conditions for subgradients, and precisely identify the case where subdifferential regularity holds. We conclude by introducing the notion of semistable programming...
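The two spectral functions named in this abstract are easy to compute, and the non-Lipschitz behavior is visible on a perturbed Jordan block. A minimal sketch (my example, not the paper's): perturbing the corner of a 2×2 Jordan block by ε moves the eigenvalues to ±√ε, so the spectral abscissa grows like √ε near ε = 0.

```python
import numpy as np

# Spectral abscissa (max real part) and spectral radius (max modulus)
# of a general square matrix; both continuous, neither convex nor Lipschitz.
def spectral_abscissa(A):
    return np.max(np.linalg.eigvals(A).real)

def spectral_radius(A):
    return np.max(np.abs(np.linalg.eigvals(A)))

# 2x2 Jordan block with an eps perturbation in the corner:
# eigenvalues are +/- sqrt(eps), so the abscissa is sqrt(eps).
def perturbed_jordan(eps):
    return np.array([[0.0, 1.0], [eps, 0.0]])

for eps in [1e-2, 1e-4, 1e-6]:
    print(eps, spectral_abscissa(perturbed_jordan(eps)))  # grows like sqrt(eps)
```

The square-root growth means no Lipschitz bound holds at ε = 0, which is exactly the regime the paper's non-Lipschitz analysis addresses.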
Stability theory for parametric generalized equations and variational inequalities via nonsmooth analysis
Trans. Amer. Math. Soc., 1994
Abstract

Cited by 19 (3 self)
In this paper we develop a stability theory for broad classes of parametric generalized equations and variational inequalities in finite dimensions. These objects have a wide range of applications in optimization, nonlinear analysis, mathematical economics, etc. Our main concern is the Lipschitzian stability of multivalued solution maps depending on parameters. We employ a new approach of nonsmooth analysis based on the generalized differentiation of multivalued and nonsmooth operators. This approach allows us to obtain effective sufficient conditions as well as necessary and sufficient conditions for a natural Lipschitzian behavior of solution maps. In particular, we prove new criteria for the existence of Lipschitzian multivalued and single-valued implicit functions.
Viscosity Solutions and Viscosity Subderivatives in Smooth Banach Spaces with Applications to Metric Regularity
1996
Abstract

Cited by 19 (9 self)
In Gâteaux or bornologically differentiable spaces there are two natural generalizations of the concept of a Fréchet subderivative. In this paper we study the viscosity subderivative (which is the more robust of the two) and establish refined fuzzy sum rules for it in a smooth Banach space. These rules are applied to obtain comparison results for viscosity solutions of Hamilton–Jacobi equations in β-smooth spaces. A unified treatment of metric regularity in smooth spaces completes the paper. This illustrates the flexibility of viscosity subderivatives as a tool for analysis.
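For context, the Fréchet subderivative that this paper generalizes has a standard definition (stated here from the usual literature, not quoted from the paper): for a function f on a Banach space X, finite at x,

```latex
\[
\partial_F f(x) \;=\;
\Bigl\{\, x^{*} \in X^{*} \;:\;
\liminf_{\|h\| \to 0}
\frac{f(x+h) - f(x) - \langle x^{*}, h\rangle}{\|h\|} \;\ge\; 0 \,\Bigr\}.
\]
```

The viscosity subderivative relaxes this by requiring only that f − g attain a local minimum at x for some smooth g with g′(x) = x*, which is what makes it usable in spaces with weaker (Gâteaux or bornological) smoothness.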
A Survey of Subdifferential Calculus with Applications
TMA, 1998
Abstract

Cited by 19 (6 self)
This survey is an account of the current status of subdifferential research. It is intended to serve as an entry point for researchers and graduate students in a wide variety of pure and applied analysis areas who might profitably use subdifferentials as tools.
Euler–Lagrange and Hamiltonian formalisms in dynamic optimization
 Trans. Amer. Math. Soc
Abstract

Cited by 14 (1 self)
We consider dynamic optimization problems for systems governed by differential inclusions. The main focus is on the structure of and interrelations between necessary optimality conditions stated in terms of Euler–Lagrange and Hamiltonian formalisms. The principal new results are: an extension of the recently discovered form of the Euler–Weierstrass condition to nonconvex-valued differential inclusions, and a new Hamiltonian condition for convex-valued inclusions. In both cases additional attention was given to weakening Lipschitz-type requirements on the set-valued mapping. The central role of the Euler-type condition is emphasized by showing that both the new Hamiltonian condition and the most general form of the Pontryagin maximum principle for equality-constrained control systems are consequences of the Euler–Weierstrass condition. An example is given demonstrating that the new Hamiltonian condition is strictly stronger than the previously known one.
METRIC INEQUALITY, SUBDIFFERENTIAL CALCULUS AND APPLICATIONS
2000
Abstract

Cited by 12 (4 self)
In this paper, we establish characterizations of Asplund spaces in terms of conditions ensuring the metric inequality and intersection formulae. Then we establish chain rules for the limiting Fréchet subdifferentials. Necessary conditions for constrained optimization problems with non-Lipschitz data are derived.