Results 1-10 of 15
LAGRANGE MULTIPLIERS AND OPTIMALITY
, 1993
Cited by 98 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a freestanding exposition of basic nonsmooth analysis as motivated by and applied to this subject.
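For orientation, the classical viewpoint the abstract starts from is the first-order optimality system for minimizing f subject to equality constraints g_i(x) = 0, written with multipliers λ_i (a standard textbook formulation, not specific to this paper):

```latex
\nabla f(\bar x) + \sum_{i=1}^{m} \lambda_i \nabla g_i(\bar x) = 0,
\qquad g_i(\bar x) = 0, \quad i = 1, \dots, m.
```

The paper's point is precisely that modern constraint structures (inequalities, set constraints, penalties) force this picture to be replaced by subgradient conditions.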
Weak Sharp Minima In Mathematical Programming
 SIAM Journal on Control and Optimization
, 1993
Cited by 40 (3 self)
The notion of a sharp, or strongly unique, minimum is extended to include the possibility of a nonunique solution set. These minima will be called weak sharp minima. Conditions necessary for the solution set of a minimization problem to be a set of weak sharp minima are developed in both the unconstrained and constrained cases. These conditions are also shown to be sufficient under the appropriate convexity hypotheses. The existence of weak sharp minima is characterized in the cases of linear and quadratic convex programming and for the linear complementarity problem. In particular, we reproduce a result of Mangasarian and Meyer showing that the solution set of a linear program is always a set of weak sharp minima whenever it is nonempty. Consequences for the convergence theory of algorithms are also examined, especially conditions yielding finite termination. 1. Introduction. Let f : X → ℝ̄ := ℝ ∪ {−∞, ∞}; we say that f has a sharp minimum at x̄ ∈ ℝⁿ if f(x) ≥ f(x̄)...
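The Mangasarian-Meyer phenomenon the abstract mentions can be checked numerically on a toy linear program (a hypothetical illustration, not taken from the paper): minimizing f(x) = x₁ over the box 0 ≤ x₁ ≤ 2, 0 ≤ x₂ ≤ 1 has the nonunique solution set S = {0} × [0, 1], yet the objective grows at least linearly with the distance to S, the defining property of weak sharp minima.

```python
import numpy as np

# Toy LP: minimize f(x) = x1 over the feasible box
#   0 <= x1 <= 2,  0 <= x2 <= 1.
# Solution set: S = {0} x [0, 1].  For every feasible x,
#   f(x) - min f = x1 = dist(x, S),
# so S is a set of weak sharp minima with modulus alpha = 1.

rng = np.random.default_rng(0)

def dist_to_S(x):
    # Nearest point of S = {0} x [0, 1] to x = (x1, x2) is
    # (0, clip(x2, 0, 1)); for feasible x this distance is just x1.
    return np.hypot(x[0], x[1] - np.clip(x[1], 0.0, 1.0))

for _ in range(1000):
    x = rng.uniform([0.0, 0.0], [2.0, 1.0])   # random feasible point
    growth = x[0]                             # f(x) - min f
    assert growth >= 1.0 * dist_to_S(x) - 1e-12
```

The same linear-growth inequality fails for, say, f(x) = x₁², which is why sharpness is genuinely stronger than mere optimality.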
Equivalent subgradient versions of Hamiltonian and Euler-Lagrange equations in variational analysis
 SIAM J. Control and Optimization
, 1996
Cited by 14 (3 self)
Abstract. Much effort in recent years has gone into generalizing the classical Hamiltonian and Euler-Lagrange equations of the calculus of variations so as to encompass problems in optimal control and a greater variety of integrands and constraints. These generalizations, in which nonsmoothness abounds and gradients are systematically replaced by subgradients, have succeeded in furnishing necessary conditions for optimality which reduce to the classical ones in the classical setting, but important issues have remained unsettled, especially concerning the exact relationship of the subgradient versions of the Hamiltonian equations versus those of the Euler-Lagrange equations. Here it is shown that new, tighter subgradient versions of these equations are actually equivalent to each other. The theory of epi-convergence of convex functions provides the technical basis for this development. Key words. Euler-Lagrange equations, Hamiltonian equations, variational analysis, nonsmooth analysis, subgradients, optimality.
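The classical smooth equations that these subgradient versions generalize are, for a Lagrangian L(t, x, v) and its Hamiltonian H(t, x, p) = sup_v {⟨p, v⟩ − L(t, x, v)} (standard calculus-of-variations facts, stated here only for context):

```latex
\frac{d}{dt}\,\nabla_{\dot x} L\bigl(t, x(t), \dot x(t)\bigr)
  = \nabla_{x} L\bigl(t, x(t), \dot x(t)\bigr)
  \quad \text{(Euler-Lagrange)},
\qquad
\dot x(t) = \nabla_p H\bigl(t, x(t), p(t)\bigr), \quad
\dot p(t) = -\nabla_x H\bigl(t, x(t), p(t)\bigr)
  \quad \text{(Hamilton)}.
```

The paper's contribution concerns the nonsmooth setting, where ∇ becomes a subgradient set and the equivalence of the two systems is no longer automatic.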
Partially Smooth Variational Principles and Applications
 CECM Research Report
Cited by 7 (5 self)
We discuss a smooth variational principle for partially smooth viscosity subdifferentials and explore its applications in nonsmooth analysis. Keywords: Smooth variational principle, fuzzy sum rules, mean value inequalities and partially smooth spaces. Short Title: Partially smooth variational principles. AMS (1991) subject classification: 49J50, 49J52. 1 Introduction Smooth variational analysis [7] has been highly successful in providing tools for the study of nonsmooth analysis and optimization problems, especially when married to viscosity concepts [10, 17]. Outside of smoothable Banach spaces (thus, notably in L¹ spaces) general constructions such as those of Ioffe [25, 28, 29] require a largely nonconstructive intersection over smooth or finite-dimensional subspaces. Equally, outside of Asplund or Fréchet spaces the most puissant results [41, 42] fail. Nonetheless, many problems inevitably lie in large (nonsmooth or non-Fréchet) spaces X. In such settings the ...
OPTIMAL CONTROL OF UNBOUNDED DIFFERENTIAL INCLUSIONS
, 1994
Cited by 7 (0 self)
We consider a Mayer problem of optimal control, whose dynamic constraint is given by a convex-valued differential inclusion. Both state and endpoint constraints are involved. We prove necessary conditions incorporating the Hamiltonian inclusion, the Euler-Lagrange inclusion, and the Weierstrass-Pontryagin maximum condition. Our results weaken the hypotheses and strengthen the conclusions of earlier works. Their main focus is to allow the admissible velocity sets to be unbounded, provided they satisfy a certain continuity hypothesis. They also sharpen the assertion of the Euler-Lagrange inclusion by replacing Clarke’s subgradient of the essential Lagrangian with a subset formed by partial convexification of limiting subgradients. In cases where the velocity sets are compact, the traditional Lipschitz condition implies the continuity hypothesis mentioned above, the assumption of “integrable boundedness” is shown to be superfluous, and our refinement of the Euler-Lagrange inclusion remains a strict improvement on previous forms of this condition.
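A Mayer problem with differential-inclusion dynamics of the kind described has the generic shape below (schematic notation, assumed rather than quoted from the paper: ℓ is the endpoint cost, F the velocity multifunction, X(t) the state constraint, E the endpoint constraint):

```latex
\text{minimize } \ell\bigl(x(T)\bigr) \text{ over arcs } x(\cdot) \text{ such that }
\dot x(t) \in F\bigl(t, x(t)\bigr) \ \text{a.e.},
\quad x(t) \in X(t),
\quad \bigl(x(0), x(T)\bigr) \in E.
```

"Unbounded velocity sets" then means the sets F(t, x) need not be bounded, which is what the paper's continuity hypothesis is designed to control.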
A Nonconvex Separation Property In Banach Spaces
, 1997
Cited by 6 (2 self)
We establish, in infinite-dimensional Banach space, a nonconvex separation property for general closed sets that is an extension of the Hahn-Banach separation theorem. We provide some consequences in optimization, in particular the existence of singular multipliers, and show the relation of our principle with the extremal principle of Mordukhovich. Keywords: subgradients, nonconvex separation, singular multiplier. 1 Introduction One of the cornerstones of functional analysis is the Hahn-Banach separation theorem. This result establishes that for two (closed) convex sets Z₁ and Z₂ of a Banach space X such that 0 ∉ int(Z₁ − Z₂), there exists a closed hyperplane separating Z₁ and Z₂. In particular, when 0 ∈ Z₁ − Z₂ there exist z ∈ Z₁ ∩ Z₂ and p ≠ 0 belonging to the topological dual X* such ...
Dualization of subgradient conditions for optimality
Cited by 4 (2 self)
Abstract. A basic relationship is derived between generalized subgradients of a given function, possibly nonsmooth and nonconvex, and those of a second function obtained from it by partial conjugation. Applications are made to the study of multiplier rules in finite-dimensional optimization and to the theory of the Euler-Lagrange condition and Hamiltonian condition in nonsmooth optimal control. Keywords. Subgradients, nonsmooth analysis, Lagrange multiplier rules, Euler-Lagrange
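In the usual Legendre-Fenchel convention, partial conjugation of a function f(x, y) in its second argument produces a new function of (x, y*) of the form below (a schematic sketch of the operation; the paper's precise sign conventions may differ):

```latex
g(x, y^{*}) \;=\; \sup_{y} \bigl\{ \langle y^{*}, y \rangle - f(x, y) \bigr\}.
```

The abstract's "basic relationship" concerns how subgradients of g relate to subgradients of f under this kind of transform.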
Nonsmooth analysis and parametric optimization
 in Methods of Nonconvex Analysis, Lecture Notes in
, 1990
Cited by 3 (1 self)
Abstract. In an optimization problem that depends on parameters, an important issue is the effect that perturbations of the parameters can have on solutions to the problem and their associated multipliers. Under quite broad conditions the possibly multivalued mapping that gives these elements in terms of the parameters turns out to enjoy a property of “proto-differentiability.” Generalized derivatives can then be calculated by solving an auxiliary optimization problem with auxiliary parameters. This is constructed from the original problem by taking second-order epi-derivatives of an essential objective function. 1. Solutions to Optimization Problems with Parameters. From an abstract point of view, a general optimization problem relative to elements x of a Banach space X can be seen in terms of minimizing an expression f(x) over all x ∈ X, where f is a function on X with values in ℝ̄ = ℝ ∪ {±∞}. The effective domain dom f := { x ∈ X | f(x) < ∞ } gives the “feasible” or “admissible” elements x. Under the assumption that f is lower semicontinuous and proper (the latter meaning that f(x) < ∞ for at least one x, but f(x) > −∞ for all x), a solution x̄ to the problem must satisfy 0 ∈ ∂f(x̄), where ∂f denotes subgradients in the sense of Clarke [1] (see also Rockafellar
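The condition 0 ∈ ∂f(x̄) has a minimal one-dimensional illustration (a hypothetical example, not drawn from the paper): for the nonsmooth convex function f(x) = |x|, the subdifferential at 0 is the whole interval [−1, 1], so 0 ∈ ∂f(0) certifies that 0 is a global minimizer even though f has no derivative there.

```python
import numpy as np

def f(x):
    return abs(x)

# For a convex f, g is a subgradient at x0 iff the supporting-line
# inequality f(x) >= f(x0) + g*(x - x0) holds for all x.  At x0 = 0
# with f = |.|, exactly the slopes g in [-1, 1] qualify.
def is_subgradient(g, x0, xs):
    return all(f(x) >= f(x0) + g * (x - x0) - 1e-12 for x in xs)

xs = np.linspace(-2.0, 2.0, 401)
assert is_subgradient(0.0, 0.0, xs)       # 0 lies in the subdifferential at 0
assert is_subgradient(1.0, 0.0, xs)       # boundary slope also qualifies
assert not is_subgradient(1.5, 0.0, xs)   # too steep: supporting line fails
```

Parametric versions of this picture, where f and hence ∂f depend on a parameter, are exactly what proto-differentiability is built to handle.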
On the Subdifferentiability of Functions of a Matrix Spectrum I: Mathematical Foundations
 Nonsmooth Optimization: Methods and Applications
, 1993
Cited by 2 (0 self)
We consider analytic matrix-valued mappings A : ℂ → ℂ^{n×n} and study the variational properties of the spectrum of A(ε). Of particular interest are α(ε) and ρ(ε), respectively the spectral abscissa and the spectral radius of A(ε). In this paper, we introduce the mathematical techniques required for this analysis. We begin with polynomials and discuss the bifurcation of the roots of a polynomial having analytic coefficients. It is this bifurcation phenomenon that leads to the non-Lipschitzian behavior of the types of functions that we wish to study. Puiseux-Newton series and diagrams are then introduced as a means for analyzing these bifurcations. It is shown how these techniques can be used to describe the tangent cone to certain sets of stable polynomials. Matrices and polynomials are connected via characteristic polynomials. Further properties of the spectrum of a matrix A(0) are obtained from a block diagonalization of A(0), where the kth diagonal block is an...
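The non-Lipschitzian behavior driven by root bifurcation can be seen already on a 2×2 perturbed Jordan block (an illustrative example chosen for this listing, not the paper's construction): the eigenvalues of A(ε) = [[0, 1], [ε, 0]] are ±√ε, so the spectral radius grows like ε^{1/2}, not O(ε).

```python
import numpy as np

def spectral_abscissa(A):
    # alpha(A): largest real part of an eigenvalue.
    return np.max(np.linalg.eigvals(A).real)

def spectral_radius(A):
    # rho(A): largest eigenvalue modulus.
    return np.max(np.abs(np.linalg.eigvals(A)))

# Perturbed nilpotent Jordan block A(eps) = [[0, 1], [eps, 0]]:
# char. polynomial lambda^2 - eps, eigenvalues +/- sqrt(eps),
# hence rho(A(eps)) = sqrt(eps) -- square-root growth in eps,
# i.e. not Lipschitz at eps = 0.
for eps in (1e-2, 1e-4, 1e-6):
    A = np.array([[0.0, 1.0], [eps, 0.0]])
    assert abs(spectral_radius(A) - np.sqrt(eps)) < 1e-9
```

This ε^{1/2} exponent is precisely what a Puiseux-Newton diagram reads off from the characteristic polynomial.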
An Open Mapping Theorem using unbounded Generalized Jacobians
Cited by 1 (0 self)
In this paper, three key theorems (the open mapping theorem, the inverse function theorem, and the implicit function theorem) for continuously differentiable maps are shown to hold for nonsmooth continuous maps which are not necessarily Lipschitz continuous. The significance of these extensions is that they are given using generalized Jacobians, called approximate Jacobians. The approximate Jacobian, which replaces the nonexistent Jacobian matrix at points of nondifferentiability of (not necessarily Lipschitzian) nonsmooth continuous maps by an unbounded set of matrices, enjoys a rich and often exact calculus and produces sharp results for Lipschitzian problems. The main tools are a generalized mean value theorem for continuous maps and the recession cones of unbounded approximate Jacobians. A general chain rule formula for these approximate Jacobians plays a crucial role in the extensions. 1 Introduction One of the key results of classical analysis is the open mapping theorem for conti...
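A one-dimensional picture of why Lipschitz continuity is dispensable (a hypothetical illustration, not the paper's machinery): f(x) = x^{1/3} is continuous but not Lipschitz near 0, since its difference quotients at 0 blow up, yet it is a homeomorphism of ℝ with inverse y ↦ y³, so openness and invertibility survive where the classical Jacobian does not exist.

```python
import math

def f(x):
    # Real cube root: continuous everywhere, not Lipschitz near 0,
    # and not differentiable in the classical sense at 0.
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def f_inv(y):
    return y ** 3

# Difference quotients f(h)/h at 0 blow up as h -> 0:
# no Lipschitz constant can exist on any neighborhood of 0.
quotients = [f(h) / h for h in (1e-2, 1e-4, 1e-6)]
assert quotients[0] < quotients[1] < quotients[2]

# Yet f is globally invertible, hence an open map on R.
for x in (-8.0, -1.0, 0.0, 0.5, 27.0):
    assert abs(f_inv(f(x)) - x) < 1e-9
```

In approximate-Jacobian terms, the blow-up of the difference quotients corresponds to an unbounded set of generalized Jacobians at 0, which the paper handles via recession cones.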