Results 1–10 of 10
LAGRANGE MULTIPLIERS AND OPTIMALITY
, 1993
Cited by 98 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
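The classical first-order system the abstract refers to can be written, for the standard textbook formulation of minimizing f subject to equality constraints (a standard statement, not quoted from the paper):

```latex
\nabla f(\bar{x}) + \sum_{i=1}^{m} \bar{y}_i \,\nabla g_i(\bar{x}) = 0,
\qquad g_i(\bar{x}) = 0, \quad i = 1, \dots, m,
```

where the multipliers \(\bar{y}_i\) are the auxiliary variables that modern theory reinterprets via subdifferentiation and duality.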
TILT STABILITY OF A LOCAL MINIMUM
 SIAM J. Optimization
Cited by 13 (2 self)
The behavior of a minimizing point when an objective function is tilted by adding a small linear term is studied from the perspective of second-order conditions for local optimality. The classical condition of a positive-definite Hessian in smooth problems without constraints is found to have an exact counterpart much more broadly in the positivity of a certain generalized Hessian mapping. This fully characterizes the case where tilt perturbations cause the minimizing point to shift in a Lipschitzian manner.
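In the smooth unconstrained case the abstract mentions, the Lipschitzian shift under tilting can be seen directly on a quadratic. The following sketch is my own illustration (not from the paper): for f(x) = ½ xᵀAx with A positive definite, the tilted problem min f(x) − vᵀx has minimizer x(v) = A⁻¹v, so the minimizer moves Lipschitz-continuously in the tilt v.

```python
import numpy as np

# Hypothetical smooth example: f(x) = 0.5 x^T A x with A positive definite.
# Tilting by a linear term gives f_v(x) = f(x) - v^T x, minimized where
# A x = v, i.e. x(v) = A^{-1} v -- a Lipschitz function of the tilt v.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])            # positive definite
A_inv = np.linalg.inv(A)

def tilted_minimizer(v):
    """Minimizer of 0.5 x^T A x - v^T x."""
    return A_inv @ v

v1 = np.array([0.1, 0.0])
v2 = np.array([0.0, 0.1])
shift = np.linalg.norm(tilted_minimizer(v1) - tilted_minimizer(v2))
lipschitz_bound = np.linalg.norm(A_inv, 2) * np.linalg.norm(v1 - v2)
print(shift <= lipschitz_bound + 1e-12)
```

The spectral norm of A⁻¹ serves as the Lipschitz modulus, which is exactly the role the generalized Hessian's positivity plays in the nonsmooth extension.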
Generalized Hessian properties of regularized nonsmooth functions
 SIAM Journal on Optimization
, 1996
Cited by 13 (5 self)
Abstract. The question of second-order expansions is taken up for a class of functions of importance in optimization, namely Moreau envelope regularizations of nonsmooth functions f. It is shown that when f is prox-regular, which includes convex functions and the extended-real-valued functions representing problems of nonlinear programming, the many second-order properties that can be formulated around the existence and stability of expansions of the envelopes of f or of their gradient mappings are linked by surprisingly extensive lists of equivalences with each other and with generalized differentiation properties of f itself. This clarifies the circumstances conducive to developing computational methods based on envelope functions, such as second-order approximations in nonsmooth optimization and variants of the proximal point algorithm. The results establish that generalized second-order expansions of Moreau envelopes, at least, can be counted on in most situations of interest in finite-dimensional optimization.
Keywords: prox-regularity, amenable functions, primal-lower-nice functions, Hessians, first- and second-order expansions, strict proto-derivatives, proximal mappings, Moreau envelopes, regularization, subgradient mappings, nonsmooth analysis, variational analysis, proto-derivatives, second-order epi-derivatives, Attouch's theorem.
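A concrete instance of the envelope regularization discussed above (my own illustration, not taken from the paper): the Moreau envelope e_λ(x) = min_y f(y) + (1/(2λ))(y − x)² of the nonsmooth convex function f(y) = |y| is the smooth Huber function, showing how the envelope replaces a kink with a C¹ function.

```python
import numpy as np

lam = 0.5  # regularization parameter lambda

def envelope_numeric(x, grid=np.linspace(-5.0, 5.0, 200001)):
    """Moreau envelope of f(y) = |y|, computed by brute-force minimization."""
    return np.min(np.abs(grid) + (grid - x) ** 2 / (2.0 * lam))

def envelope_closed_form(x):
    """Known closed form: the Huber function."""
    return x ** 2 / (2.0 * lam) if abs(x) <= lam else abs(x) - lam / 2.0

for x in (-2.0, -0.3, 0.0, 0.4, 1.5):
    assert abs(envelope_numeric(x) - envelope_closed_form(x)) < 1e-4
print("numeric envelope matches the Huber closed form")
```

The quadratic zone near the origin is what makes second-order expansions of the envelope available even where f itself is nondifferentiable.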
A Gauss-Newton Method for Convex Composite Optimization
, 1993
Cited by 8 (0 self)
An extension of the Gauss-Newton method for nonlinear equations to convex composite optimization is described and analyzed. Local quadratic convergence is established for the minimization of h ∘ F under two conditions, namely that h has a set of weak sharp minima, C, and that there is a regular point of the inclusion F(x) ∈ C. This result extends a similar convergence result due to Womersley which employs the assumption of a strongly unique solution of the composite function h ∘ F. A backtracking line search is proposed as a globalization strategy. For this algorithm, a global convergence result is established, with a quadratic rate under the regularity assumption.
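To make the iteration concrete, here is a minimal Gauss-Newton sketch for the special composite with h = ½‖·‖² (classical nonlinear least squares); the paper's setting with a general convex h and weak sharp minima is broader than this illustration, and the example system is mine.

```python
import numpy as np

def F(x):
    """Toy residual map: zero exactly on the circle intersected with x0 = x1."""
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0,
                     x[0] - x[1]])

def J(x):
    """Jacobian of F."""
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

x = np.array([2.0, 0.5])
for _ in range(20):
    # Gauss-Newton step: minimize || F(x) + J(x) d ||^2 over d.
    d = np.linalg.lstsq(J(x), -F(x), rcond=None)[0]
    x = x + d

print("converged to", x)
```

Near a regular zero-residual solution the step solves the linearized inclusion exactly, which is what produces the local quadratic rate.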
Amenable functions in optimization
 IN NONSMOOTH OPTIMIZATION METHODS AND APPLICATIONS
, 1992
A calculus of epiderivatives applicable to optimization
, 1992
Cited by 6 (4 self)
When an optimization problem is represented by its essential objective function, which incorporates constraints through infinite penalties, first- and second-order conditions for optimality can be stated in terms of the first- and second-order epi-derivatives of that function. Such derivatives also are the key to the formulation of subproblems determining the response of a problem's solution when the data values on which the problem depends are perturbed. It is vital for such reasons to have available a calculus of epi-derivatives. This paper builds on a central case already understood, where the essential objective function is the composite of a convex function and a smooth mapping with certain qualifications, in order to develop differentiation rules covering operations such as addition of functions and a more general form of composition. Classes of "amenable" functions are introduced to mark out territory in which this sharper form of nonsmooth analysis can be carried out.
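The "infinite penalty" representation the abstract mentions is standard and can be written as follows (formulation mine, consistent with the abstract):

```latex
f(x) = f_0(x) + \delta_C(x),
\qquad
\delta_C(x) = \begin{cases} 0 & \text{if } x \in C, \\ +\infty & \text{if } x \notin C, \end{cases}
```

so that minimizing \(f_0\) over the constraint set \(C\) becomes unconstrained minimization of the extended-real-valued essential objective \(f\), whose epi-derivatives then encode both the objective and the constraint geometry.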
Secondorder global optimality conditions for convex composite optimization
, 1995
Cited by 1 (0 self)
In recent years second-order sufficient conditions for an isolated local minimizer of convex composite optimization problems have been established. In this paper, second-order optimality conditions for a global minimizer are obtained for convex composite problems with a non-finite-valued convex function and a twice strictly differentiable function by introducing a generalized representation condition. This result is applied to a minimization problem with a closed convex set constraint which is shown to satisfy the basic constraint qualification. In particular, second-order necessary and sufficient conditions for a solution of a variational inequality problem with convex composite inequality constraints are obtained.
OPTIMIZATION PROBLEMS WITH PERTURBATIONS: A GUIDED TOUR ∗
Abstract. This paper presents an overview of some recent, and significant, progress in the theory of optimization problems with perturbations. We put the emphasis on methods based on upper and lower estimates of the objective function of the perturbed problems. These methods allow one to compute expansions of the optimal value function and approximate optimal solutions in situations where the set of Lagrange multipliers is not a singleton, may be unbounded, or is even empty. We give rather complete results for nonlinear programming problems and describe some extensions of the method to more general problems. We illustrate the results by computing the equilibrium position of a chain that is almost vertical or horizontal.
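A minimal instance of the optimal-value expansions this survey studies (the example is mine, not from the paper): for v(u) = min { x² : x ≥ 1 + u }, the solution is x(u) = 1 + u, the value is v(u) = (1 + u)², and the derivative of the optimal value at u = 0 equals the Lagrange multiplier of the constraint.

```python
def v(u):
    """Optimal value of min { x**2 : x >= 1 + u } (closed form)."""
    return (1.0 + u) ** 2

# Central finite difference of the optimal value function at u = 0.
h = 1e-6
numeric_derivative = (v(h) - v(-h)) / (2.0 * h)

# Stationarity of L(x, y) = x**2 + y*(1 + u - x) at x = 1 gives 2*x - y = 0,
# so the multiplier is y = 2, matching v'(0).
multiplier = 2.0
print("v'(0) =", numeric_derivative, " multiplier =", multiplier)
```

When the multiplier set is not a singleton, or is unbounded or empty, this simple derivative formula fails, which is precisely the regime the survey's upper/lower estimate methods address.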
SECOND ORDER OPTIMALITY CONDITIONS BASED ON PARABOLIC SECOND ORDER TANGENT SETS ∗
Abstract. In this paper we discuss second order optimality conditions in optimization problems subject to abstract constraints. Our analysis is based on various concepts of second order tangent sets and parametric duality. We introduce a condition, called second order regularity, under which there is no gap between the corresponding second order necessary and second order sufficient conditions. We show that the second order regularity condition always holds in the case of semidefinite programming.
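A standard definition consistent with the abstract's terminology (statement mine, not quoted from the paper): the second-order tangent set to C at x in a tangent direction d collects the possible second-order terms of parabolic arcs staying in C,

```latex
T_C^2(x, d) = \left\{ w \;:\; \operatorname{dist}\!\left(x + t d + \tfrac{1}{2} t^2 w,\; C\right) = o(t^2),\ t \ge 0 \right\},
```

and it is the curvature information carried by such sets that closes the gap between the necessary and sufficient second-order conditions under second-order regularity.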
Equi-Calmness and Epi-Derivatives That Are Pointwise Limits
, 1997
Recently Moussaoui and Seeger [1] studied the monotonicity of first- and second-order difference quotients with the primary goal of simplifying epi-limits. It is well known that epi-limits (liminf and limsup) can be written as pointwise limits in the case of a sequence of functions that is equi-lsc. In this paper we introduce equi-calmness as a condition that guarantees equi-lsc, and our primary goal is to give conditions that guarantee that first- and second-order difference quotients are equi-calm. We show that a "piecewise C¹" function f with convex domain is epi-differentiable at any point x̄ of its domain; moreover, if x̄ + tū ∈ dom f for some t > 0, then the epi-limit at x̄ in the direction ū is given as a pointwise limit. We show that a convex "piecewise C²" function with "polyhedral pieces" is twice epi-differentiable and that the epi-derivatives are given as pointwise limits along rays. We thus obtain a (modest) extension of Rockafellar's result concerning t...
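The difference quotients in question can be seen on the simplest piecewise-linear example (my illustration, not from the paper): for f(x) = |x|, the first-order quotients Δ_t f(0)(w) = (f(0 + t·w) − f(0)) / t equal |w| for every t > 0, so the epi-limit defining the epi-derivative is here a plain pointwise limit.

```python
def f(x):
    """Piecewise-linear model function."""
    return abs(x)

# First-order difference quotient of f at the point 0 in direction w.
def quotient(t, w):
    return (f(0.0 + t * w) - f(0.0)) / t

for w in (-2.0, -0.5, 1.0, 3.0):
    for t in (1.0, 0.1, 0.01):
        # For f = |.| at 0 the quotient is constant in t, hence trivially
        # monotone and equi-calm, and converges pointwise to |w|.
        assert abs(quotient(t, w) - abs(w)) < 1e-12
print("difference quotients equal |w| for every t > 0")
```

Equi-calmness is what lets this pointwise-limit behavior survive for the broader "piecewise C¹" and "piecewise C²" classes treated in the paper.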