Results 1–9 of 9
The U-Lagrangian of a convex function
 Trans. Amer. Math. Soc.
"... Abstract. At a given point p, a convex function f is differentiable in a certain subspace U (the subspace along which ∂f(p) has 0breadth). This property opens the way to defining a suitably restricted second derivative of f at p. We do this via an intermediate function, convex on U. We call this fu ..."
Abstract

Cited by 26 (7 self)
Abstract. At a given point p, a convex function f is differentiable in a certain subspace U (the subspace along which ∂f(p) has 0-breadth). This property opens the way to defining a suitably restricted second derivative of f at p. We do this via an intermediate function, convex on U. We call this function the U-Lagrangian; it coincides with the ordinary Lagrangian in composite cases: exact penalty, semidefinite programming. Also, we use this new theory to design a conceptual pattern for superlinearly convergent minimization algorithms. Finally, we establish a connection with the Moreau–Yosida regularization.
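A hedged sketch of the construction described in the abstract (notation assumed, not quoted from the paper): pick a subgradient ḡ in the relative interior of ∂f(p), let V be the subspace spanned by ∂f(p) − ḡ and U = V⊥ its orthogonal complement; the U-Lagrangian can then be written as a partial infimum,

```latex
L_U(u) \;=\; \inf_{v \in V} \bigl\{\, f(p + u + v) - \langle \bar g, v \rangle \,\bigr\}, \qquad u \in U,
```

a convex function on U whose second-order behavior at 0 serves as the restricted second derivative of f at p.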
TILT STABILITY OF A LOCAL MINIMUM
 SIAM J. OPTIMIZATION
"... The behavior of a minimizing point when an objective function is tilted by adding a small linear term is studied from the perspective of secondorder conditions for local optimality. The classical condition of a positivedefinite Hessian in smooth problems without constraints is found to have an exa ..."
Abstract

Cited by 13 (2 self)
The behavior of a minimizing point when an objective function is tilted by adding a small linear term is studied from the perspective of second-order conditions for local optimality. The classical condition of a positive-definite Hessian in smooth problems without constraints is found to have an exact counterpart much more broadly in the positivity of a certain generalized Hessian mapping. This fully characterizes the case where tilt perturbations cause the minimizing point to shift in a Lipschitzian manner.
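A minimal numerical illustration of tilt perturbation (an assumed example, not from the paper): for the smooth strongly convex f(x) = x², adding a tilt v·x shifts the minimizer to x(v) = −v/2, which depends on v in a Lipschitz manner with modulus equal to the inverse of the Hessian factor.

```python
# Tilt perturbation of f(x) = x^2: minimize f(x) + v*x for small tilts v.
# The tilted minimizer is x(v) = -v/2, so tilting shifts the minimizer
# Lipschitz-continuously with constant 1/2 (the inverse of f'' = 2).

def tilted_minimizer(v, lo=-10.0, hi=10.0, iters=200):
    """Minimize x**2 + v*x by ternary search on the bracket [lo, hi]."""
    f = lambda x: x * x + v * x
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2          # minimizer lies in [lo, m2]
        else:
            lo = m1          # minimizer lies in [m1, hi]
    return (lo + hi) / 2

for v in (0.0, 0.1, -0.3):
    x = tilted_minimizer(v)
    assert abs(x - (-v / 2)) < 1e-6   # matches the closed form x(v) = -v/2
```

The assertion checks the Lipschitz shift against the closed-form minimizer; in nonsmooth problems the paper's generalized Hessian plays the role of f'' here.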
A calculus of epi-derivatives applicable to optimization
, 1992
"... When an optimization problem is represented by its essential objective function, which incorporates constraints through infinite penalties, first and secondorder conditions for optimality can be stated in terms of the first and secondorder epiderivatives of that function. Such derivatives also ..."
Abstract

Cited by 6 (4 self)
When an optimization problem is represented by its essential objective function, which incorporates constraints through infinite penalties, first- and second-order conditions for optimality can be stated in terms of the first- and second-order epi-derivatives of that function. Such derivatives are also the key to the formulation of subproblems determining the response of a problem’s solution when the data values on which the problem depends are perturbed. It is vital for such reasons to have available a calculus of epi-derivatives. This paper builds on a central case already understood, where the essential objective function is the composite of a convex function and a smooth mapping with certain qualifications, in order to develop differentiation rules covering operations such as addition of functions and a more general form of composition. Classes of “amenable” functions are introduced to mark out territory in which this sharper form of nonsmooth analysis can be carried out.
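As a hedged sketch of the central objects (standard definitions assumed, not quoted from the paper): for a subgradient v ∈ ∂f(x̄), the second-order difference quotients are

```latex
\Delta_t^2 f(\bar x \mid v)(w) \;=\; \frac{f(\bar x + t w) - f(\bar x) - t \langle v, w \rangle}{\tfrac{1}{2} t^2},
```

and the second-order epi-derivative d²f(x̄ | v), when it exists, is their epi-limit as t ↓ 0; the calculus in question propagates such limits through sums and compositions of amenable functions.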
Amenable functions in optimization
 IN NONSMOOTH OPTIMIZATION METHODS AND APPLICATIONS
, 1992
"... ..."
Proto-derivative formulas for basic subgradient mappings in mathematical programming
, 1993
"... Subgradient mappings associated with various convex and nonconvex functions are a vehicle for stating optimality conditions, and their protodifferentiability plays a role therefore in the sensitivity analysis of solutions to problems of optimization. Examples of special interest are the subgradient ..."
Abstract

Cited by 4 (4 self)
Subgradient mappings associated with various convex and nonconvex functions are a vehicle for stating optimality conditions, and their proto-differentiability therefore plays a role in the sensitivity analysis of solutions to problems of optimization. Examples of special interest are the subgradients of the max of finitely many C² functions, and the subgradients of the indicator of a set defined by finitely many C² constraints satisfying a basic constraint qualification. In both cases the function has a property called full amenability, so the general theory of existence and calculus of proto-derivatives of subgradient mappings associated with fully amenable functions is applicable. This paper works out the details for such examples. In particular, a formula of Auslender and Cominetti in the case of a max function is improved.
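An illustrative sketch of the max-function case (assumed one-dimensional toy example, not the paper's formula): for f = max(f₁, f₂) with smooth fᵢ, the convex subdifferential at a point is the convex hull of the gradients of the active functions, which in one dimension is an interval of derivatives.

```python
# Subdifferential of f(x) = max(f1(x), ..., fk(x)) for smooth fi in 1-D:
# the convex hull (an interval) of the derivatives of the active functions.

def subdifferential_of_max(fs, dfs, x, tol=1e-9):
    """Return the interval [lo, hi] of subgradients of max(fs) at x."""
    vals = [f(x) for f in fs]
    m = max(vals)
    active = [df(x) for v, df in zip(vals, dfs) if v >= m - tol]
    return min(active), max(active)

# f(x) = max(x**2, 1 - x): the two pieces cross where x**2 = 1 - x.
fs  = [lambda x: x * x, lambda x: 1.0 - x]
dfs = [lambda x: 2 * x, lambda x: -1.0]

xbar = (5 ** 0.5 - 1) / 2          # positive root of x**2 + x - 1 = 0
lo, hi = subdifferential_of_max(fs, dfs, xbar)
assert abs(lo - (-1.0)) < 1e-9     # derivative of the active piece 1 - x
assert abs(hi - 2 * xbar) < 1e-6   # derivative of the active piece x**2
```

At a point where only one piece is active the interval collapses to a single derivative; at a crossing point it is the full interval between the active slopes, the set whose proto-derivative the paper analyzes.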
Proto-derivatives and the geometry of solution mappings in nonlinear programming
 249–260 IN NONLINEAR OPTIMIZATION AND APPLICATIONS
, 1996
"... We quantify the sensitivity of KKT pairs associated with a parameterized family of nonlinear programming problems. Our approach involves protoderivatives, which are generalized derivatives appropriate even in cases when the KKT pairs are not unique; we investigate what the theory of such derivativ ..."
Abstract

Cited by 2 (0 self)
We quantify the sensitivity of KKT pairs associated with a parameterized family of nonlinear programming problems. Our approach involves proto-derivatives, which are generalized derivatives appropriate even in cases when the KKT pairs are not unique; we investigate what the theory of such derivatives yields in the special case when the KKT pairs are unique (locally). We demonstrate that the graph of the KKT multifunction is just a reoriented graph of a Lipschitz mapping, and use proto-differentiability to show that the graph of the KKT multifunction actually has the stronger property of being a reorientation of the graph of a B-differentiable mapping. Our results indicate that proto-derivatives provide the same kind of information for possibly set-valued mappings (like the KKT multifunction) that B-derivatives provide for single-valued mappings.
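A minimal illustrative instance of the locally unique case (an assumed example, not from the paper): for the parameterized problem min_x (x − p)² s.t. x ≥ 0, the unique KKT pair is (x(p), λ(p)) = (max(p, 0), 2 max(−p, 0)), a piecewise-linear, hence Lipschitz and B-differentiable, function of the parameter p.

```python
# KKT pair of  min_x (x - p)**2  s.t.  x >= 0, as a function of p.
# KKT conditions: 2*(x - p) - lam = 0,  lam >= 0,  x >= 0,  lam*x = 0.

def kkt_pair(p):
    if p >= 0:
        return p, 0.0          # constraint inactive: x = p, multiplier 0
    return 0.0, -2.0 * p       # constraint active: x = 0, lam = -2p > 0

for p in (-1.0, -0.25, 0.0, 0.5, 2.0):
    x, lam = kkt_pair(p)
    assert x >= 0 and lam >= 0 and lam * x == 0   # feasibility + complementarity
    assert abs(2 * (x - p) - lam) < 1e-12         # stationarity
```

The map p ↦ (x(p), λ(p)) has a kink at p = 0 but is globally Lipschitz, so its graph is exactly the kind of reoriented Lipschitz graph the abstract describes.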
Differentiable Selections of Set-Valued Mappings With Application in Stochastic Programming
"... We consider setvalued mappings defined on a linear normed space with convex closed images in R^n. Our aim is to construct selections which are (Hadamard) directionally differentiable using some approximation of the multifunction. The constructions suggested assume existence of a cone approximation ..."
Abstract

Cited by 2 (1 self)
We consider set-valued mappings defined on a linear normed space with convex closed images in R^n. Our aim is to construct selections which are (Hadamard) directionally differentiable, using some approximation of the multifunction. The constructions suggested assume the existence of a cone approximation given by a certain "derivative" of the mapping. The first construction makes use of the properties of Steiner points; the notion of the Steiner center is generalized to a class of unbounded sets, which includes the polyhedral sets. The second construction defines a continuous selection that passes through a given point of the graph of the multifunction and is Hadamard directionally differentiable at that point, with derivatives belonging to the corresponding "derivative" of the multifunction. Both constructions lead to a directionally differentiable Castaing representation of measurable multifunctions with the required differentiability properties. The results are applied to obtain statements about the asymptotic behaviour of measurable selections of random sets via the delta-approach. In particular, random sets of this kind constitute the solutions of two-stage stochastic programs.
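A toy illustration of the Steiner-point idea (one-dimensional assumed example, far simpler than the paper's general construction): for a multifunction with interval images, the Steiner point reduces to the midpoint, giving a selection that inherits the differentiability of the endpoint functions.

```python
# Toy 1-D case: F(t) = [g(t), h(t)] with smooth endpoint functions.
# The Steiner point of an interval is its midpoint, so the Steiner selection
# s(t) = (g(t) + h(t)) / 2 is a selection of F that is differentiable
# wherever g and h are.

import math

g = lambda t: -1.0 - t * t        # lower endpoint
h = lambda t: 1.0 + math.sin(t)   # upper endpoint

def steiner_selection(t):
    return (g(t) + h(t)) / 2

for t in (0.0, 0.5, 2.0):
    s = steiner_selection(t)
    assert g(t) <= s <= h(t)      # s(t) is indeed a selection of F(t)

# Directional derivative of s at t = 0 via a difference quotient:
dd = (steiner_selection(1e-6) - steiner_selection(0.0)) / 1e-6
assert abs(dd - 0.5) < 1e-4       # s'(0) = (g'(0) + h'(0)) / 2 = (0 + 1) / 2
```

A nearest-point or extreme-point selection need not be differentiable in t; the averaging built into the Steiner point is what smooths the selection, which is the design idea behind the paper's first construction.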
A DERIVATIVE-CODERIVATIVE INCLUSION IN SECOND-ORDER NONSMOOTH ANALYSIS
, 1996
"... For twice smooth functions, the symmetry of the matrix of second partial derivatives is automatic and can be seen as the symmetry of the Jacobian matrix of the gradient mapping. For nonsmooth functions, possibly even extendedrealvalued, the gradient mapping can be replaced by a subgradient mappin ..."
Abstract

Cited by 1 (1 self)
For twice smooth functions, the symmetry of the matrix of second partial derivatives is automatic and can be seen as the symmetry of the Jacobian matrix of the gradient mapping. For nonsmooth functions, possibly even extended-real-valued, the gradient mapping can be replaced by a subgradient mapping, and generalized second-derivative objects can then be introduced through graphical differentiation of this mapping, but the question of what analog of symmetry might persist has remained open. An answer is provided here in terms of a derivative-coderivative inclusion.
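A hedged sketch of the shape of the result (exact hypotheses in the paper): writing D for the graphical derivative and D* for the coderivative of the subgradient mapping ∂f, the inclusion reads

```latex
D(\partial f)(\bar x \mid \bar v)(w) \;\subset\; D^{*}(\partial f)(\bar x \mid \bar v)(w) \qquad \text{for all } w,
```

which in the C² case collapses to the statement that ∇²f(x̄) equals its transpose, i.e. the classical symmetry of second partial derivatives.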
Equi-Calmness and Epi-Derivatives That Are Pointwise Limits
, 1997
"... . Recently Moussaoui and Seeger [1] studied the monotonicity of first and secondorder difference quotients with primary goal the simplification of epilimits. It is well known that epilimits (liminf and limsup) can be written as pointwise limits in the case of a sequence of functions that is equi ..."
Abstract
Recently Moussaoui and Seeger [1] studied the monotonicity of first- and second-order difference quotients, with the primary goal of simplifying epi-limits. It is well known that epi-limits (liminf and limsup) can be written as pointwise limits in the case of a sequence of functions that is equi-lsc. In this paper we introduce equi-calmness as a condition that guarantees equi-lsc, and our primary goal is to give conditions that guarantee that first- and second-order difference quotients are equi-calm. We show that a "piecewise-C¹" function f with convex domain is epi-differentiable at any point x̄ of its domain; moreover, if x̄ + tū ∈ dom f for some t > 0, then the epi-limit at x̄ in the direction ū is given as a pointwise limit. We show that a convex "piecewise-C²" function with "polyhedral pieces" is twice epi-differentiable and that the epi-derivatives are given as pointwise limits along rays. We thus obtain a (modest) extension of Rockafellar's result concerning t...
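A minimal numerical illustration of the pointwise-limit phenomenon (an assumed example, not from the paper): for the piecewise-C¹ function f(x) = |x|, the first-order difference quotients at x̄ = 0 are constant in t, so the epi-limit is attained as a plain pointwise limit along rays, equal to |u|.

```python
# First-order difference quotients of f(x) = |x| at xbar = 0:
# q_t(u) = (f(0 + t*u) - f(0)) / t = |u| for every t > 0, so the
# epi-derivative (here just the directional derivative) is the
# pointwise limit |u| -- the behavior equi-calmness guarantees.

f = abs
xbar = 0.0

def diff_quotient(t, u):
    return (f(xbar + t * u) - f(xbar)) / t

for u in (-2.0, -0.5, 1.0, 3.0):
    for t in (1.0, 0.1, 1e-3, 1e-6):
        assert abs(diff_quotient(t, u) - abs(u)) < 1e-12
```

For general sequences of functions the epi-limit can be strictly below the pointwise limit; the paper's equi-calmness condition rules that gap out for the difference quotients above.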