Results 1 to 10 of 20
LAGRANGE MULTIPLIERS AND OPTIMALITY
1993
Cited by 88 (7 self)
Abstract:
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
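As a point of reference for the classical picture this abstract starts from, the first-order optimality conditions for equality-constrained minimization take the familiar Lagrangian form (standard textbook notation, not drawn from the paper itself):

```latex
% Equality-constrained problem and its Lagrangian
\min_{x \in \mathbb{R}^n} f(x)
  \quad \text{subject to} \quad g_i(x) = 0, \; i = 1, \dots, m,
\qquad
L(x, y) = f(x) + \sum_{i=1}^{m} y_i \, g_i(x).
% First-order conditions: a system of n + m equations in the n + m unknowns (x, y)
\nabla_x L(x, y) = \nabla f(x) + \sum_{i=1}^{m} y_i \nabla g_i(x) = 0,
\qquad
\nabla_y L(x, y) = \big( g_1(x), \dots, g_m(x) \big) = 0.
```

It is exactly this "auxiliary variables in a square system" view that the paper argues is superseded once one-sided constraints and nonsmooth objectives enter.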
Nonsmooth Analysis of Eigenvalues
MATHEMATICAL PROGRAMMING, 1998
Cited by 35 (10 self)
Abstract:
The eigenvalues of a symmetric matrix depend on the matrix nonsmoothly. This paper describes the nonsmooth analysis of these eigenvalues. In particular, I present a simple formula for the approximate (limiting Fréchet) subdifferential of an arbitrary function of the eigenvalues, subsuming earlier results on convex and Clarke subgradients. As an example I compute the subdifferential of the kth largest eigenvalue.
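To see the kind of nonsmoothness at stake, here is a minimal sketch (the diagonal family A(t) below is my own illustration, not taken from the paper): the largest eigenvalue of A(t) = diag(1 + t, 1 - t) equals 1 + |t|, which has a kink at t = 0, so its one-sided slopes there disagree.

```python
import math

def eigvals_sym2(a, b, c):
    """Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]],
    in decreasing order, via the closed-form quadratic formula."""
    mean = (a + c) / 2.0
    rad = math.hypot((a - c) / 2.0, b)  # radius of the eigenvalue pair
    return mean + rad, mean - rad

def lam_max(t):
    # Largest eigenvalue of A(t) = [[1 + t, 0], [0, 1 - t]].
    # Analytically this is 1 + |t|: nonsmooth at t = 0.
    return eigvals_sym2(1.0 + t, 0.0, 1.0 - t)[0]

h = 1e-6
slope_right = (lam_max(h) - lam_max(0.0)) / h    # one-sided slope from the right
slope_left = (lam_max(0.0) - lam_max(-h)) / h    # one-sided slope from the left
print(slope_right, slope_left)                   # +1 and -1: the kink at t = 0
```

The convex subdifferential of this eigenvalue function at t = 0 is the whole interval [-1, 1]; formulas of the kind the paper presents describe such sets for general functions of the eigenvalues.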
Stability theory for parametric generalized equations and variational inequalities via nonsmooth analysis
Trans. Amer. Math. Soc., 1994
Cited by 15 (3 self)
Abstract:
In this paper we develop a stability theory for broad classes of parametric generalized equations and variational inequalities in finite dimensions. These objects have a wide range of applications in optimization, nonlinear analysis, mathematical economics, etc. Our main concern is the Lipschitzian stability of multivalued solution maps depending on parameters. We employ a new approach of nonsmooth analysis based on the generalized differentiation of multivalued and nonsmooth operators. This approach allows us to obtain effective sufficient conditions, as well as necessary and sufficient conditions, for a natural Lipschitzian behavior of solution maps. In particular, we prove new criteria for the existence of Lipschitzian multivalued and single-valued implicit functions.
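The parametric generalized equations in question are conventionally written as an inclusion; in standard notation (not quoted from the paper), with parameter p, base map f, and the normal cone N_C to a set C:

```latex
0 \in f(x, p) + N_C(x),
\qquad
S(p) = \{\, x \in C : 0 \in f(x, p) + N_C(x) \,\}.
% When C is convex, the inclusion is the variational inequality
% \langle f(x, p), c - x \rangle \ge 0 for all c \in C.
```

Lipschitzian stability then concerns how the (generally multivalued) solution map S varies as the parameter p is perturbed.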
A Survey of Subdifferential Calculus with Applications
TMA, 1998
Cited by 14 (6 self)
Abstract:
This survey is an account of the current status of subdifferential research. It is intended to serve as an entry point for researchers and graduate students in a wide variety of pure and applied analysis areas who might profitably use subdifferentials as tools.
Metric Regularity and Quantitative Stability in Stochastic Programs With Probabilistic Constraints
Cited by 12 (9 self)
Abstract:
Introducing probabilistic constraints leads in general to nonconvex, nonsmooth, or even discontinuous optimization models. In this paper, necessary and sufficient conditions for metric regularity of (several joint) probabilistic constraints are derived using recent results from nonsmooth analysis. The conditions apply to fairly general constraints and extend earlier work in this direction. Further, a verifiable sufficient condition for quadratic growth of the objective function in a more specific convex stochastic program is indicated and applied in order to obtain a new result on quantitative stability of solution sets when the underlying probability distribution is subjected to perturbations. This is used to derive bounds for the deviation of solution sets when the probability measure is replaced by empirical estimates.
Keywords: stochastic programming, probabilistic constraints, metric regularity, nonsmooth analysis, quadratic growth, quantitative stability, empirical approximation ...
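A prototypical chance-constrained program, in a standard form rather than the paper's own notation, already shows why such models are nonconvex and nonsmooth: the feasible set is an upper level set of a probability function of the decision variable.

```latex
\min_{x} \; g(x)
\quad \text{subject to} \quad
P\big( h(x) \ge \xi \big) \;\ge\; p,
% \xi a random right-hand side with distribution \mu, \; p \in (0, 1]:
% feasibility means \mu(\{\, \xi : h(x) \ge \xi \,\}) \ge p.
```

Metric regularity of this constraint system is what licenses calmness of the feasible set, and hence the stability estimates, under perturbation of the underlying measure.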
TILT STABILITY OF A LOCAL MINIMUM
 SIAM J. OPTIMIZATION
Cited by 12 (1 self)
Abstract:
The behavior of a minimizing point when an objective function is tilted by adding a small linear term is studied from the perspective of second-order conditions for local optimality. The classical condition of a positive-definite Hessian in smooth problems without constraints is found to have an exact counterpart much more broadly in the positivity of a certain generalized Hessian mapping. This fully characterizes the case where tilt perturbations cause the minimizing point to shift in a Lipschitzian manner.
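In the notation that has since become standard for this concept (paraphrased, not quoted from the paper), a local minimizer x-bar of f is tilt stable when the localized argmin map under linear tilts,

```latex
M_\delta(v) \;=\; \operatorname*{argmin}_{\|x - \bar{x}\| \le \delta}
  \big\{\, f(x) - \langle v, x \rangle \,\big\},
```

is single-valued and Lipschitz continuous on a neighborhood of v = 0 with M_delta(0) = x-bar. For smooth unconstrained f this reduces to positive-definiteness of the Hessian at x-bar.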
New necessary conditions for the generalized problem of Bolza
SIAM J. Control Optim., 1996
Cited by 10 (3 self)
Abstract:
Problems of optimal control are considered in the neoclassical Bolza format, which centers on states and velocities and relies on nonsmooth analysis. Subgradient versions of the Euler-Lagrange equation and the Hamiltonian equation are shown to be necessary for the optimality of a trajectory, moreover in a newly sharpened form that makes these conditions equivalent to each other. At the same time, the assumptions on the Lagrangian integrand are weakened substantially over what has been required previously in obtaining such conditions.
Key words: optimal control, calculus of variations, nonsmooth analysis, problem of Bolza, Euler-Lagrange condition, Hamiltonian condition, transversality condition.
AMS subject classifications: 49K15, 49K05, 49K24.
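The generalized problem of Bolza referred to here is conventionally posed as follows (a standard formulation, not copied from the paper):

```latex
\text{minimize} \quad
\ell\big(x(a), x(b)\big) \;+\; \int_a^b L\big(t, x(t), \dot{x}(t)\big)\, dt
\quad \text{over arcs } x : [a, b] \to \mathbb{R}^n,
```

with the endpoint cost ell and the integrand L allowed to be extended-real-valued, so that endpoint and velocity constraints are absorbed into infinite penalties. A common statement of the subgradient Euler-Lagrange condition is then the inclusion p-dot(t) in co{ w : (w, p(t)) in dL(t, x(t), x-dot(t)) }, where dL denotes limiting subgradients of L in its state and velocity arguments.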
Approximate Jacobian Matrices for Nonsmooth Continuous Maps and C¹ Optimization
To appear in SIAM Journal on Control and Optimization, 1997
Cited by 9 (2 self)
Abstract:
A notion of approximate Jacobian matrix is introduced for a continuous vector-valued map. It is shown, for instance, that the Clarke generalized Jacobian is an approximate Jacobian for a locally Lipschitz map. The approach is based on the idea of convexificators of real-valued functions. Mean value conditions for continuous vector-valued maps and Taylor expansions for continuously Gâteaux differentiable functions (i.e., C¹ functions) are presented in terms of approximate Jacobians and approximate Hessians, respectively. Second-order necessary and sufficient conditions for optimality and convexity of C¹ functions are also given.
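The mean value conditions mentioned have, in spirit, the following shape for a continuous map f between finite-dimensional spaces with approximate Jacobian map df (a schematic statement in my own notation, not the paper's exact theorem):

```latex
f(b) - f(a) \;\in\;
\overline{\operatorname{co}}\,
\big\{\, M (b - a) \;:\; M \in \partial f(z), \; z \in [a, b] \,\big\},
% [a, b] denotes the line segment joining a and b.
```

For a smooth map this collapses to the classical mean value statement with M a genuine Jacobian matrix at an intermediate point.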
Lagrange Multipliers For Nonconvex Generalized Gradients With Equality, Inequality And Set Constraints.
 SIAM J. Control Optim
Cited by 7 (0 self)
Abstract:
A Lagrange multiplier rule for finite-dimensional Lipschitz problems is proven that uses a nonconvex generalized gradient. This result uses either both the linear generalized gradient and the generalized gradient of Mordukhovich, or the linear generalized gradient and a qualification condition involving the pseudo-Lipschitz behavior of the feasible set under perturbations. The optimization problem includes equality constraints, inequality constraints, and a set constraint. This result extends known nonsmooth results for the Lipschitz case.
Abbreviated title: Nonconvex gradients and Lagrange multipliers.
1991 Mathematics Subject Classification: 90C30, 49J52.
Key words and phrases: Lagrange multipliers, nonsmooth analysis, generalized gradients, optimality conditions.
1. Introduction. In this paper we derive necessary conditions for a finite-dimensional constrained optimization problem. The main difference between this and other work is that a small nonconvex generalized gradient is u...
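Schematically, a multiplier rule of this kind asserts the existence of multipliers making a generalized-gradient inclusion hold at a local minimizer x-bar of f over the x in C with g_i(x) <= 0 and h_j(x) = 0 (a generic statement in common notation, not the paper's exact theorem):

```latex
0 \;\in\; \lambda_0\, \partial f(\bar{x})
  \;+\; \sum_{i} \lambda_i\, \partial g_i(\bar{x})
  \;+\; \sum_{j} \partial (\mu_j h_j)(\bar{x})
  \;+\; N_C(\bar{x}),
\qquad
\lambda_0, \lambda_i \ge 0, \quad \lambda_i\, g_i(\bar{x}) = 0,
```

with the multipliers not all zero and with the generalized gradient and normal cone taken in the chosen sense (here the linear or Mordukhovich constructions). The smaller the generalized gradient, the sharper the rule.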
The Linear Nonconvex Generalized Gradient And Lagrange Multipliers
SIAM J. Optimization, 1995
Cited by 3 (1 self)
Abstract:
A Lagrange multiplier rule that uses small generalized gradients is introduced. It includes both inequality and set constraints. The generalized gradient is the linear generalized gradient. It is smaller than the generalized gradients of Clarke and Mordukhovich but retains much of their nice calculus. Its convex hull is the generalized gradient of Michel and Penot if a function is Lipschitz. The tools used in the proof of this Lagrange multiplier result are a coderivative, a chain rule, and a scalarization formula for this coderivative. Many smooth and nonsmooth Lagrange multiplier results are corollaries of this result. It is shown that the technique in this paper can be used for the case of equality, inequality, and set constraints if one considers the generalized gradient of Mordukhovich. An open question is whether a Lagrange multiplier result holds when one has equality constraints and uses the linear generalized gradient.
1. Introduction. Following the work of Mordukhovich [8,10,11,...