Results 1–10 of 15
LAGRANGE MULTIPLIERS AND OPTIMALITY
, 1993
Cited by 88 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
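As a reminder of the classical setting this abstract starts from (generic notation, not taken from the paper itself): for minimizing f(x) subject to equality constraints g_i(x) = 0, the first-order optimality conditions are written formally as a system of equations in (x, λ) via the Lagrangian:

```latex
\begin{aligned}
L(x,\lambda) &= f(x) + \sum_{i=1}^{m} \lambda_i \, g_i(x), \\
\nabla_x L(x,\lambda) &= \nabla f(x) + \sum_{i=1}^{m} \lambda_i \, \nabla g_i(x) = 0, \\
g_i(x) &= 0, \qquad i = 1,\dots,m.
\end{aligned}
```

The modern developments the abstract surveys generalize this picture to nonsmooth objectives and one-sided (inequality or set) constraints, where gradients are replaced by subgradients and normal vectors.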
Generalized Hessian properties of regularized nonsmooth functions
 SIAM Journal on Optimization
, 1996
Cited by 12 (4 self)
Abstract. The question of second-order expansions is taken up for a class of functions of importance in optimization, namely Moreau envelope regularizations of nonsmooth functions f. It is shown that when f is prox-regular, which includes convex functions and the extended-real-valued functions representing problems of nonlinear programming, the many second-order properties that can be formulated around the existence and stability of expansions of the envelopes of f or of their gradient mappings are linked by surprisingly extensive lists of equivalences with each other and with generalized differentiation properties of f itself. This clarifies the circumstances conducive to developing computational methods based on envelope functions, such as second-order approximations in nonsmooth optimization and variants of the proximal point algorithm. The results establish that generalized second-order expansions of Moreau envelopes, at least, can be counted on in most situations of interest in finite-dimensional optimization.
Keywords. Prox-regularity, amenable functions, primal-lower-nice functions, Hessians, first- and second-order expansions, strict proto-derivatives, proximal mappings, Moreau envelopes, regularization, subgradient mappings, nonsmooth analysis, variational analysis, proto-derivatives, second-order epi-derivatives, Attouch’s theorem.
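For concreteness, the Moreau envelope and proximal mapping that this abstract studies can be sketched numerically. The closed forms below are for the illustrative choice f = |·| (whose prox is soft-thresholding and whose envelope is the Huber function); they are a standard textbook example, not anything specific to this paper:

```python
import math

def prox_abs(x, lam):
    """Proximal mapping of f(p) = |p|:
    argmin_p |p| + (1/(2*lam)) * (x - p)**2.
    The closed form is soft-thresholding."""
    return math.copysign(max(abs(x) - lam, 0.0), x)

def moreau_env_abs(x, lam):
    """Moreau envelope e_lam f(x) = min_p |p| + (1/(2*lam)) * (x - p)**2.
    For f = |.| this equals the Huber function: the envelope is C^1
    (smooth) even though |x| itself is nonsmooth at 0."""
    p = prox_abs(x, lam)
    return abs(p) + (x - p) ** 2 / (2.0 * lam)

# The envelope smooths the kink at 0 while agreeing with |x| - lam/2
# for |x| >= lam:
print(moreau_env_abs(3.0, 1.0))   # 2.5
print(moreau_env_abs(0.5, 1.0))   # 0.125
print(moreau_env_abs(0.0, 1.0))   # 0.0
```

The equivalences the paper establishes concern when such envelopes admit second-order expansions; the example above only illustrates the first-order smoothing effect.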
Approximate Jacobian Matrices for Nonsmooth Continuous Maps and C¹-Optimization
 TO APPEAR IN SIAM JOURNAL ON CONTROL AND OPTIMIZATION
, 1997
Cited by 9 (2 self)
A notion of approximate Jacobian matrix is introduced for a continuous vector-valued map. It is shown, for instance, that the Clarke generalized Jacobian is an approximate Jacobian for a locally Lipschitz map. The approach is based on the idea of convexificators of real-valued functions. Mean value conditions for continuous vector-valued maps and Taylor's expansions for continuously Gâteaux differentiable functions (i.e., C¹ functions) are presented in terms of approximate Jacobians and approximate Hessians respectively. Second-order necessary and sufficient conditions for optimality and convexity of C¹ functions are also given.
A calculus of epiderivatives applicable to optimization
, 1992
Cited by 6 (4 self)
When an optimization problem is represented by its essential objective function, which incorporates constraints through infinite penalties, first- and second-order conditions for optimality can be stated in terms of the first- and second-order epi-derivatives of that function. Such derivatives also are the key to the formulation of subproblems determining the response of a problem’s solution when the data values on which the problem depends are perturbed. It is vital for such reasons to have available a calculus of epi-derivatives. This paper builds on a central case already understood, where the essential objective function is the composite of a convex function and a smooth mapping with certain qualifications, in order to develop differentiation rules covering operations such as addition of functions and a more general form of composition. Classes of “amenable” functions are introduced to mark out territory in which this sharper form of nonsmooth analysis can be carried out.
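For reference, the composite structure behind "amenability" in the variational-analysis literature has the following standard shape (stated here in the usual textbook form, which may differ in detail from this paper's exact hypotheses):

```latex
f(x) = g\bigl(G(x)\bigr), \qquad
G : \mathbb{R}^n \to \mathbb{R}^m \ \text{of class } \mathcal{C}^1
\ (\mathcal{C}^2 \text{ for full amenability}), \quad
g \ \text{lsc convex},
```

together with the constraint qualification that $y = 0$ is the only vector $y \in N_{\operatorname{dom} g}\bigl(G(\bar{x})\bigr)$ satisfying $\nabla G(\bar{x})^{\top} y = 0$. The essential objective function of a nonlinear program, with constraints absorbed as infinite penalties, fits this pattern locally.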
Amenable functions in optimization
 IN NONSMOOTH OPTIMIZATION METHODS AND APPLICATIONS
, 1992
Proto-derivative formulas for basic subgradient mappings in mathematical programming
, 1993
Cited by 4 (4 self)
Subgradient mappings associated with various convex and nonconvex functions are a vehicle for stating optimality conditions, and their proto-differentiability therefore plays a role in the sensitivity analysis of solutions to problems of optimization. Examples of special interest are the subgradients of the max of finitely many C² functions, and the subgradients of the indicator of a set defined by finitely many C² constraints satisfying a basic constraint qualification. In both cases the function has a property called full amenability, so the general theory of existence and calculus of proto-derivatives of subgradient mappings associated with fully amenable functions is applicable. This paper works out the details for such examples. A formula of Auslender and Cominetti in the case of a max function is improved in particular.
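For orientation, the subgradient mapping in the first example mentioned above has a well-known explicit form (a standard fact about finite max functions, not the paper's new result): when $f(x) = \max_{1 \le i \le m} f_i(x)$ with each $f_i$ of class $\mathcal{C}^2$, the subdifferential at $x$ is the convex hull of the gradients of the active pieces,

```latex
\partial f(x) \;=\; \operatorname{conv}\bigl\{\, \nabla f_i(x) \;:\; f_i(x) = f(x) \,\bigr\}.
```

The proto-derivative formulas studied in the paper describe how this set-valued mapping $x \mapsto \partial f(x)$ varies to first order.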
Nonsmooth analysis and parametric optimization
 in Methods of Nonconvex Analysis, Lecture Notes in
, 1990
Cited by 3 (1 self)
Abstract. In an optimization problem that depends on parameters, an important issue is the effect that perturbations of the parameters can have on solutions to the problem and their associated multipliers. Under quite broad conditions the possibly multivalued mapping that gives these elements in terms of the parameters turns out to enjoy a property of “proto-differentiability.” Generalized derivatives can then be calculated by solving an auxiliary optimization problem with auxiliary parameters. This is constructed from the original problem by taking second-order epi-derivatives of an essential objective function.
1. Solutions to Optimization Problems with Parameters. From an abstract point of view, a general optimization problem relative to elements x of a Banach space X can be seen in terms of minimizing an expression f(x) over all x ∈ X, where f is a function on X with values in the extended reals IR ∪ {±∞}. The effective domain dom f := {x ∈ X | f(x) < ∞} gives the “feasible” or “admissible” elements x. Under the assumption that f is lower semicontinuous and proper (the latter meaning that f(x) < ∞ for at least one x, but f(x) > −∞ for all x), a solution x̄ to the problem must satisfy 0 ∈ ∂f(x̄), where ∂f denotes subgradients in the sense of Clarke [1] (see also Rockafellar
Approximate Hessian Matrices and Second-order Optimality Conditions for Nonlinear Programming Problems with C¹ Data
, 1997
Cited by 2 (1 self)
In this paper, we present generalizations of the Jacobian matrix and the Hessian matrix to continuous maps and continuously differentiable functions respectively. We then establish second-order optimality conditions for mathematical programming problems with continuously differentiable functions. The results also sharpen the corresponding results for problems involving C^{1,1} functions.
Equi-Calmness and Epi-Derivatives That Are Pointwise Limits
, 1997
Recently Moussaoui and Seeger [1] studied the monotonicity of first- and second-order difference quotients with the primary goal of simplifying epi-limits. It is well known that epi-limits (liminf and limsup) can be written as pointwise limits in the case of a sequence of functions that is equi-lsc. In this paper we introduce equi-calmness as a condition that guarantees equi-lsc, and our primary goal is to give conditions that guarantee that first- and second-order difference quotients are equi-calm. We show that a "piecewise-C¹" function f with convex domain is epi-differentiable at any point x̄ of its domain; moreover, if x̄ + tū ∈ dom f for some t > 0, then the epi-limit at x̄ in the direction ū is given as a pointwise limit. We show that a convex "piecewise-C²" function with "polyhedral pieces" is twice epi-differentiable and that the epi-derivatives are given as pointwise limits along rays. We thus obtain a (modest) extension of Rockafellar's result concerning t...
Graphical Methods in First- and Second-Order Differentiability Theory of Integral Functionals
, 1993
Abstract. We discuss several notions of first- and second-order differentiability for integral functionals on a Hilbert space.