LAGRANGE MULTIPLIERS AND OPTIMALITY
1993
Cited by 89 (7 self)

Abstract
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a freestanding exposition of basic nonsmooth analysis as motivated by and applied to this subject.
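The classical system-of-equations view described in the abstract's first sentence can be made concrete with a toy problem of our own (not taken from the paper): minimize x² + y² subject to x + y = 1, where the first-order conditions ∇f + λ∇g = 0 together with g = 0 form a small linear system.

```python
# Classical Lagrange multipliers: minimize f(x, y) = x^2 + y^2
# subject to g(x, y) = x + y - 1 = 0.  The first-order conditions
#   grad f + lam * grad g = 0,  g = 0
# give the linear system
#   2x + lam = 0
#   2y + lam = 0
#   x + y    = 1
# (Toy illustration only, not an example from the cited paper.)

def solve_lagrange_system():
    # The first two equations give x = y = -lam/2; substituting into
    # x + y = 1 yields lam = -1, hence x = y = 1/2.
    lam = -1.0
    x = y = -lam / 2.0
    return x, y, lam

x, y, lam = solve_lagrange_system()
# Verify stationarity and feasibility residuals numerically.
assert abs(2 * x + lam) < 1e-12 and abs(2 * y + lam) < 1e-12
assert abs(x + y - 1.0) < 1e-12
print(x, y, lam)  # 0.5 0.5 -1.0
```

Here the multiplier λ = -1 is exactly the "auxiliary variable" of the classical viewpoint: it closes the system so that the optimality conditions become equations.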
A Survey of Subdifferential Calculus with Applications
TMA, 1998
Cited by 14 (6 self)

Abstract
This survey is an account of the current status of subdifferential research. It is intended to serve as an entry point for researchers and graduate students in a wide variety of pure and applied analysis areas who might profitably use subdifferentials as tools.
Ample parameterization of variational inclusions
SIAM JOURNAL ON OPTIMIZATION, 2001
Cited by 14 (9 self)

Abstract
For a general category of variational inclusions in finite dimensions, a class of parameterizations, called “ample” parameterizations, is identified that is rich enough to provide a full theory of Lipschitz-type properties of solution mappings without the need for resorting to the auxiliary introduction of canonical parameters. Ample parameterizations also support a detailed description of the graphical geometry that underlies generalized differentiation of solution mappings. A theorem on proto-derivatives is thereby obtained. The case of a variational inequality over a polyhedral convex set is given special treatment along with an application to minimizing a parameterized function over such a set.
Robinson’s implicit function theorem and its extensions
MATH. PROGRAM., 2008
Cited by 7 (3 self)

Abstract
S. M. Robinson published in 1980 a powerful theorem about solutions to certain “generalized equations” corresponding to parameterized variational inequalities which could represent the first-order optimality conditions in nonlinear programming, in particular. In fact, his result covered much of the classical implicit function theorem, if not quite all, but went far beyond that in ideas and format. Here, Robinson’s theorem is viewed from the perspective of more recent developments in variational analysis as well as some lesser-known results in the implicit function literature on equations, prior to the advent of generalized equations. Extensions are presented which fully cover such results, translating them at the same time to generalized equations broader than variational inequalities. Robinson’s notion of first-order approximations in the absence of differentiability is utilized in part, but even looser forms of approximation are shown to furnish significant information about solutions.
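To show computationally what a generalized equation 0 ∈ f(x) + N_C(x) looks like, here is a hypothetical one-dimensional example of our own (a sketch under stated assumptions, not Robinson's method): for C = [0, 1] the problem is equivalent to the fixed point x = P_C(x − t f(x)), which a projected iteration can solve.

```python
# A generalized equation in Robinson's sense: 0 in f(x) + N_C(x),
# i.e. a variational inequality over C.  For the interval C = [0, 1]
# and the illustrative map f(x) = x - 2, a solution is a fixed point
# of x -> P_C(x - t f(x)) for any step size t > 0.

def project(x, lo=0.0, hi=1.0):
    """Euclidean projection P_C onto the interval C = [lo, hi]."""
    return min(max(x, lo), hi)

def solve_vi(f, x0=0.0, t=0.5, iters=200):
    """Projected fixed-point iteration x <- P_C(x - t * f(x))."""
    x = x0
    for _ in range(iters):
        x = project(x - t * f(x))
    return x

x = solve_vi(lambda x: x - 2.0)
# The natural residual x - P_C(x - f(x)) vanishes exactly at a solution.
assert abs(x - project(x - (x - 2.0))) < 1e-8
print(x)  # 1.0 (the constraint x <= 1 is active)
```

Since f(1) = -1 < 0 pushes toward the upper bound, the solution sits at the boundary point x = 1; at an interior solution the generalized equation would reduce to the ordinary equation f(x) = 0.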
Dualization of subgradient conditions for optimality
Cited by 4 (2 self)

Abstract
A basic relationship is derived between generalized subgradients of a given function, possibly nonsmooth and nonconvex, and those of a second function obtained from it by partial conjugation. Applications are made to the study of multiplier rules in finite-dimensional optimization and to the theory of the Euler-Lagrange condition and Hamiltonian condition in nonsmooth optimal control.

Keywords. Subgradients, nonsmooth analysis, Lagrange multiplier rules, Euler-Lagrange
Implicit Multifunction Theorems
Cited by 4 (0 self)

Abstract
We prove a general implicit function theorem for multifunctions with a metric estimate on the implicit multifunction and a characterization of its coderivative. Traditional open covering theorems, stability results, and sufficient conditions for a multifunction to be metrically regular or pseudo-Lipschitzian can be deduced from this implicit function theorem. We prove this implicit multifunction theorem by reducing it to an implicit function/solvability theorem for functions. This approach can also be used to prove the Robinson-Ursescu open mapping theorem. As a tool for this alternative proof of the Robinson-Ursescu theorem we also establish a refined version of the multidirectional mean value inequality which is of independent interest.

Key Words. Nonsmooth analysis, subdifferentials, coderivatives, implicit function theorem, solvability, stability, open mapping theorem, metric regularity, multidirectional mean value inequality.

AMS (1991) subject classification: 26B05.
Multiplier Rules under Mixed Assumptions of Differentiability and Lipschitz Continuity
 SIAM J. Control Optim
Cited by 3 (3 self)

Abstract
In this paper we study nonlinear programming problems with equality, inequality, and abstract constraints where some of the functions are Fréchet differentiable at the optimal solution, some of the functions are Lipschitz near the optimal solution, and the abstract constraint set may be nonconvex. We derive Fritz John type and Karush-Kuhn-Tucker (KKT) type first-order necessary optimality conditions for the above problem where Fréchet derivatives are used for the differentiable functions and subdifferentials are used for the Lipschitz continuous functions. Constraint qualifications for the KKT type first-order necessary optimality conditions to hold include the generalized Mangasarian-Fromovitz constraint qualification, the no nonzero abnormal multiplier constraint qualification, the metric regularity of the constraint region, and the calmness constraint qualification.
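The KKT conditions the abstract refers to can be checked directly on a one-variable toy problem of our own (minimize (x − 2)² subject to x − 1 ≤ 0): stationarity, dual feasibility, and complementary slackness all become elementary arithmetic.

```python
# KKT conditions for: minimize (x - 2)^2  subject to  x - 1 <= 0.
# Stationarity:            2(x - 2) + mu = 0
# Dual feasibility:        mu >= 0
# Complementary slackness: mu * (x - 1) = 0
# (Illustrative toy problem, not an example from the cited paper.)

def kkt_point():
    # Try the inactive case mu = 0: stationarity would give x = 2,
    # which violates x <= 1, so the constraint must be active.
    x = 1.0                   # active constraint: x - 1 = 0
    mu = -2.0 * (x - 2.0)     # stationarity => mu = 2
    return x, mu

x, mu = kkt_point()
assert abs(2.0 * (x - 2.0) + mu) < 1e-12   # stationarity
assert mu >= 0.0                            # dual feasibility
assert abs(mu * (x - 1.0)) < 1e-12          # complementarity
print(x, mu)  # 1.0 2.0
```

The two-case reasoning in the comments (constraint inactive vs. active) is exactly what the constraint qualifications in the abstract are designed to justify in general: they guarantee that a multiplier such as mu = 2 actually exists.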
Nonsmooth analysis and parametric optimization
in Methods of Nonconvex Analysis, Lecture Notes in, 1990
Cited by 3 (1 self)

Abstract
In an optimization problem that depends on parameters, an important issue is the effect that perturbations of the parameters can have on solutions to the problem and their associated multipliers. Under quite broad conditions the possibly multivalued mapping that gives these elements in terms of the parameters turns out to enjoy a property of “proto-differentiability.” Generalized derivatives can then be calculated by solving an auxiliary optimization problem with auxiliary parameters. This is constructed from the original problem by taking second-order epiderivatives of an essential objective function.

1. Solutions to Optimization Problems with Parameters. From an abstract point of view, a general optimization problem relative to elements x of a Banach space X can be seen in terms of minimizing an expression f(x) over all x ∈ X, where f is a function on X with values in ℝ̄ = ℝ ∪ {±∞}. The effective domain dom f := {x ∈ X ∣ f(x) < ∞} gives the “feasible” or “admissible” elements x. Under the assumption that f is lower semicontinuous and proper (the latter meaning that f(x) < ∞ for at least one x, but f(x) > −∞ for all x), a solution x̄ to the problem must satisfy 0 ∈ ∂f(x̄), where ∂f denotes subgradients in the sense of Clarke [1] (see also Rockafellar
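The optimality condition 0 ∈ ∂f(x̄) can be verified by hand for a simple nonsmooth convex function. The following sketch uses an example of our own, f(x) = |x| + cx, whose subdifferential at 0 is the interval [c − 1, c + 1]:

```python
# The optimality condition 0 in the subdifferential of f at a nonsmooth
# point, illustrated (our own example) with the convex function
# f(x) = |x| + c*x.  Its subdifferential at 0 is the interval
# [c - 1, c + 1], so x = 0 minimizes f exactly when -1 <= c <= 1.

def subdifferential_at_zero(c):
    """The interval (lo, hi) representing the subdifferential of
    f(x) = |x| + c*x at x = 0."""
    return (c - 1.0, c + 1.0)

def is_minimizer_at_zero(c):
    lo, hi = subdifferential_at_zero(c)
    return lo <= 0.0 <= hi          # the condition 0 in the subdifferential

assert is_minimizer_at_zero(0.5)    # f(x) = |x| + 0.5x is minimized at 0
assert not is_minimizer_at_zero(2)  # f(x) = |x| + 2x decreases for x < 0
# Sanity check by sampling: f(0) should be the smallest sampled value.
f = lambda x, c=0.5: abs(x) + c * x
assert all(f(0.0) <= f(x) for x in [k / 10.0 for k in range(-20, 21)])
```

Note that an ordinary derivative test fails here (f is not differentiable at 0); replacing the single gradient by the whole interval of subgradients is what makes the condition 0 ∈ ∂f(x̄) usable.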
Some Stability Concepts and Their Applications in Optimal Control Problems
Abstract
In this work we are concerned with state-constrained optimal control problems. Our aim is to derive the optimality conditions and to prove the convergence of the numerical approximations. To deal with these questions, whose difficulty is motivated by the presence of the state constraints, we consider some concepts of stability of the optimal cost functional with respect to small perturbations of the set of feasible states. While weak and strong stability on the right allow us to derive optimality conditions in a non-qualified and a qualified form respectively, the weak stability on the left is the key to proving the convergence of the numerical approximations.

1 Introduction

In this paper we consider an optimal control problem with pointwise state constraints governed by a semilinear elliptic partial differential equation. We study the influence of some stability properties in the derivation of the optimality conditions satisfied by the optimal control, obtained through the penalization o...
REGULARITY AND CONDITIONING IN THE VARIATIONAL ANALYSIS OF SOLUTION MAPPINGS
2003
Abstract
Concepts of conditioning have long been important in numerical work on solving systems of equations, but in recent years attempts have been made to extend them to feasibility conditions, optimality conditions, complementarity conditions and variational inequalities, all of which can be posed as solving “generalized equations” for set-valued mappings. Here, the conditioning of such generalized equations is systematically organized around four key notions: metric regularity, subregularity, strong regularity and strong subregularity. Various properties and characterizations already known for metric regularity itself are extended to strong regularity and strong subregularity, but metric subregularity, although widely considered, is shown to be too fragile to support stability results such as a radius of good behavior modeled on the Eckart-Young theorem.
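For a single invertible matrix, the Eckart-Young theorem mentioned at the end gives the radius explicitly: the distance to the nearest singular matrix equals the smallest singular value, 1/‖A⁻¹‖. A small sketch for the 2×2 case (our own illustration; the paper's point is extending such radius results to generalized equations):

```python
import math

# Eckart-Young for an invertible matrix A: the spectral-norm distance
# from A to the nearest singular matrix equals its smallest singular
# value sigma_min = 1 / ||A^{-1}||.  For 2x2 matrices sigma_min has a
# closed form from  sigma1^2 + sigma2^2 = ||A||_F^2  and
# sigma1 * sigma2 = |det A|.

def sigma_min_2x2(a, b, c, d):
    """Smallest singular value of the matrix [[a, b], [c, d]]."""
    T = a * a + b * b + c * c + d * d   # Frobenius norm squared
    D = abs(a * d - b * c)              # |det A|
    return math.sqrt((T - math.sqrt(T * T - 4.0 * D * D)) / 2.0)

# A = diag(3, 1): the nearest singular matrix is diag(3, 0), at distance 1.
assert abs(sigma_min_2x2(3.0, 0.0, 0.0, 1.0) - 1.0) < 1e-12
# Perturbations of norm < sigma_min cannot destroy invertibility; this is
# the "radius of good behavior" in the abstract's sense for plain equations.
print(sigma_min_2x2(3.0, 0.0, 0.0, 1.0))  # 1.0
```

The abstract's observation is that metric subregularity admits no analogue of this radius: an arbitrarily small perturbation can destroy it, which is why the paper builds its stability theory on the other three regularity notions.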