LAGRANGE MULTIPLIERS AND OPTIMALITY
1993
Cited by 89 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a freestanding exposition of basic nonsmooth analysis as motivated by and applied to this subject.
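The "system of equations" viewpoint that this abstract opens with can be made concrete. For minimizing f(x) subject to h_j(x) = 0 and g_i(x) ≤ 0, the classical first-order conditions read as follows (a standard textbook formulation, not quoted from the paper):

```latex
\begin{aligned}
\nabla f(x^\ast) + \sum_{j} \lambda_j \nabla h_j(x^\ast)
                 + \sum_{i} \mu_i \nabla g_i(x^\ast) &= 0, \\
h_j(x^\ast) = 0, \qquad g_i(x^\ast) &\le 0, \\
\mu_i \ge 0, \qquad \mu_i \, g_i(x^\ast) &= 0 .
\end{aligned}
```

The multipliers λ_j and μ_i are the auxiliary variables the abstract refers to; the modern themes it surveys arise when the constraints are more general than such equations and inequalities.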
Nonsmooth Analysis of Eigenvalues
MATHEMATICAL PROGRAMMING, 1998
Cited by 37 (12 self)
The eigenvalues of a symmetric matrix depend on the matrix nonsmoothly. This paper describes the nonsmooth analysis of these eigenvalues. In particular, I present a simple formula for the approximate (limiting Fréchet) subdifferential of an arbitrary function of the eigenvalues, subsuming earlier results on convex and Clarke subgradients. As an example I compute the subdifferential of the k-th largest eigenvalue.
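A two-by-two example (ours, not from the paper) makes the opening claim concrete: the largest eigenvalue of a symmetric matrix varies nonsmoothly with the entries, with a kink exactly where eigenvalues coalesce. A minimal NumPy sketch:

```python
import numpy as np

def lam_max(t):
    """Largest eigenvalue of the symmetric matrix diag(1 + t, 1 - t)."""
    A = np.diag([1.0 + t, 1.0 - t])
    return float(np.linalg.eigvalsh(A)[-1])  # eigvalsh sorts ascending

# lam_max(t) equals 1 + |t|: the matrix entries are smooth in t, yet the
# maximum eigenvalue has a kink at t = 0, where the eigenvalues coalesce.
print(lam_max(0.5), lam_max(-0.5), lam_max(0.0))
```

The one-sided slopes at t = 0 are +1 and -1, so no classical derivative exists there; this is the situation the subdifferential formula of the paper is designed to handle.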
Variational Analysis of Non-Lipschitz Spectral Functions
MATHEMATICAL PROGRAMMING, 1999
Cited by 24 (14 self)
We consider spectral functions f ∘ λ, where f is any permutation-invariant mapping from C^n to R, and λ is the eigenvalue map from C^{n×n} to C^n, ordering the eigenvalues lexicographically. For example, if f is the function "maximum real part", then f ∘ λ is the spectral abscissa, while if f is "maximum modulus", then f ∘ λ is the spectral radius. Both these spectral functions are continuous, but they are neither convex nor Lipschitz. For our analysis, we use the notion of subgradient extensively analyzed in Variational Analysis, R. T. Rockafellar and R. J.-B. Wets (Springer, 1998), which is particularly well suited to the variational analysis of non-Lipschitz spectral functions. We derive a number of necessary conditions for subgradients of spectral functions. For the spectral abscissa, we give both necessary and sufficient conditions for subgradients, and precisely identify the case where subdifferential regularity holds. We conclude by introducing the notion of semistable programming...
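The two running examples in this abstract are easy to evaluate numerically. The sketch below (illustrative only; the function names are ours) computes both for a matrix with purely imaginary eigenvalues ±2i, where the abscissa and radius differ:

```python
import numpy as np

def spectral_abscissa(A):
    # "maximum real part" composed with the eigenvalue map
    return float(np.max(np.real(np.linalg.eigvals(A))))

def spectral_radius(A):
    # "maximum modulus" composed with the eigenvalue map
    return float(np.max(np.abs(np.linalg.eigvals(A))))

A = np.array([[0.0, 2.0],
              [-2.0, 0.0]])  # eigenvalues are +2i and -2i
print(spectral_abscissa(A), spectral_radius(A))
```

Both quantities are continuous in the matrix entries, but, as the abstract notes, neither is convex or Lipschitz as a function of A, which is what motivates the variational analysis developed in the paper.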
Constraint qualifications and necessary optimality conditions for optimization problems with variational inequality constraints
 SIAM J. Optim
Cited by 18 (12 self)
A very general optimization problem with a variational inequality constraint, inequality constraints, and an abstract constraint is studied. Fritz John type and Kuhn–Tucker type necessary optimality conditions involving Mordukhovich coderivatives are derived. Several constraint qualifications for the Kuhn–Tucker type conditions are introduced and their relationships are studied. Applications to bilevel programming problems are also given.
Stability theory for parametric generalized equations and variational inequalities via nonsmooth analysis
Trans. Amer. Math. Soc., 1994
Cited by 15 (3 self)
In this paper we develop a stability theory for broad classes of parametric generalized equations and variational inequalities in finite dimensions. These objects have a wide range of applications in optimization, nonlinear analysis, mathematical economics, etc. Our main concern is the Lipschitzian stability of multivalued solution maps depending on parameters. We employ a new approach of nonsmooth analysis based on the generalized differentiation of multivalued and nonsmooth operators. This approach allows us to obtain effective sufficient conditions as well as necessary and sufficient conditions for a natural Lipschitzian behavior of solution maps. In particular, we prove new criteria for the existence of Lipschitzian multivalued and single-valued implicit functions.
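The "natural Lipschitzian behavior" of a multivalued solution map S referred to here is usually formalized as the Aubin (pseudo-Lipschitz) property. For orientation, the standard definition (not quoted from the paper) is: S has the Aubin property at (p̄, x̄) in the graph of S if there exist κ ≥ 0 and neighborhoods U of p̄ and V of x̄ such that

```latex
S(p') \cap V \;\subseteq\; S(p) + \kappa \,\|p' - p\|\, \mathbb{B}
\qquad \text{for all } p,\, p' \in U,
```

where 𝔹 denotes the closed unit ball. For single-valued maps this reduces to a local Lipschitz condition.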
Viscosity Solutions and Viscosity Subderivatives in Smooth Banach Spaces with Applications to Metric Regularity
1996
Cited by 15 (9 self)
In Gâteaux or bornologically differentiable spaces there are two natural generalizations of the concept of a Fréchet subderivative. In this paper we study the viscosity subderivative (which is the more robust of the two) and establish refined fuzzy sum rules for it in a smooth Banach space. These rules are applied to obtain comparison results for viscosity solutions of Hamilton-Jacobi equations in β-smooth spaces. A unified treatment of metric regularity in smooth spaces completes the paper. This illustrates the flexibility of viscosity subderivatives as a tool for analysis.
A Survey of Subdifferential Calculus with Applications
TMA, 1998
Cited by 14 (6 self)
This survey is an account of the current status of subdifferential research. It is intended to serve as an entry point for researchers and graduate students in a wide variety of pure and applied analysis areas who might profitably use subdifferentials as tools.
Pseudonormality and a Lagrange Multiplier Theory for Constrained Optimization
2000
Cited by 9 (2 self)
We consider optimization problems with equality, inequality, and abstract set constraints, and we explore various characteristics of the constraint set that imply the existence of Lagrange multipliers. We prove a generalized version of the Fritz John theorem, and we introduce new and general conditions that extend and unify the major constraint qualifications. Among these conditions, two new properties, pseudonormality and quasinormality, emerge as central within the taxonomy of interesting constraint characteristics. In the case where there is no abstract set constraint, these properties provide the connecting link between the classical constraint qualifications and two distinct pathways to the existence of Lagrange multipliers: one involving the notion of quasiregularity and Farkas' Lemma, and the other involving the use of exact penalty functions. The second pathway also applies in the general case where there is an abstract set constraint.
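For orientation, the classical Fritz John conditions that this paper generalizes can be stated as follows (textbook form, with only equality and inequality constraints and no abstract set constraint): at a local minimum x* of f subject to h_j(x) = 0 and g_i(x) ≤ 0, there exist multipliers, not all zero, with

```latex
\mu_0 \nabla f(x^\ast) + \sum_{j} \lambda_j \nabla h_j(x^\ast)
                       + \sum_{i} \mu_i \nabla g_i(x^\ast) = 0,
\qquad \mu_0 \ge 0, \quad \mu_i \ge 0, \quad \mu_i\, g_i(x^\ast) = 0 .
```

Constraint qualifications such as the pseudonormality and quasinormality properties introduced in the paper are conditions under which one may take μ₀ = 1, recovering genuine Lagrange multipliers.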
METRIC INEQUALITY, SUBDIFFERENTIAL CALCULUS AND APPLICATIONS
2000
Cited by 9 (2 self)
In this paper, we establish characterizations of Asplund spaces in terms of conditions ensuring the metric inequality and intersection formulae. Then we establish chain rules for the limiting Fréchet subdifferentials. Necessary conditions for constrained optimization problems with non-Lipschitz data are derived.
Lagrange Multipliers for Nonconvex Generalized Gradients with Equality, Inequality and Set Constraints
 SIAM J. Control Optim
Cited by 7 (0 self)
A Lagrange multiplier rule for finite dimensional Lipschitz problems is proven that uses a nonconvex generalized gradient. This result uses either both the linear generalized gradient and the generalized gradient of Mordukhovich, or the linear generalized gradient and a qualification condition involving the pseudo-Lipschitz behavior of the feasible set under perturbations. The optimization problem includes equality constraints, inequality constraints and a set constraint. This result extends known nonsmooth results for the Lipschitz case.
Abbreviated Title: Nonconvex gradients and Lagrange multipliers.
1991 Mathematics Subject Classification: 90C30, 49J52.
Key words and phrases: Lagrange multipliers, nonsmooth analysis, generalized gradients, optimality conditions.
1. Introduction. In this paper we derive necessary conditions for a finite dimensional constrained optimization problem. The main difference between this and other work is that a small nonconvex generalized gradient is u...