Results 1 - 9 of 9
LAGRANGE MULTIPLIERS AND OPTIMALITY
, 1993
Cited by 92 (7 self)
Abstract: Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded a deeper understanding of the concept and of how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
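The interpretation of a multiplier as a derivative of the optimal value with respect to a problem parameter can be made concrete with a small sketch. The problem and numbers below are invented for illustration and are not taken from the paper: for min x² + y² subject to x + y = b, the KKT system gives x = y = b/2 with multiplier λ = b, and the optimal value v(b) = b²/2 indeed satisfies dv/db = λ.

```python
# Hypothetical illustration: minimize x^2 + y^2 subject to x + y = b.
# KKT system: 2x = lam, 2y = lam, x + y = b  =>  x = y = b/2, lam = b.
# Optimal value v(b) = b^2 / 2, so dv/db = b = lam.

def solve(b):
    x = y = b / 2.0          # stationary point from the KKT system
    lam = 2.0 * x            # multiplier of the constraint x + y = b
    value = x**2 + y**2      # optimal value v(b)
    return value, lam

b, h = 3.0, 1e-6
v, lam = solve(b)
dv_db = (solve(b + h)[0] - solve(b - h)[0]) / (2 * h)  # central difference
print(v, lam, dv_db)   # the multiplier matches the sensitivity of v
```

The finite-difference slope of the optimal value agrees with the multiplier, which is the "generalized derivative" viewpoint mentioned in the abstract in its simplest smooth form.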
Sensitivity Analysis in (Degenerate) Quadratic Programming
 DELFT UNIVERSITY OF TECHNOLOGY
, 1996
Cited by 7 (2 self)
Abstract: In this paper we deal with sensitivity analysis in convex quadratic programming, without making assumptions on nondegeneracy, strict convexity of the objective function, or the existence of a strictly complementary solution. We show that the optimal value as a function of a right-hand side element (or an element of the linear part of the objective) is piecewise quadratic, where the pieces can be characterized by maximal complementary solutions and tripartitions. Further, we investigate the differentiability of this function. A new algorithm to compute the optimal value function is proposed. Finally, we discuss the advantages of this approach when applied to mean-variance portfolio models.
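The piecewise-quadratic behavior of the optimal value in the right-hand side can be seen already in one dimension. The following toy problem is my own illustration, not the paper's: v(b) = min{½x² : x ≥ b} equals 0 for b ≤ 0 and b²/2 for b > 0, so the pieces are quadratic and the curvature jumps where the active set changes.

```python
# Hypothetical 1-D QP: v(b) = min { 0.5*x**2 : x >= b }.
# The minimizer is x* = max(b, 0), so v is piecewise quadratic in b:
#   v(b) = 0         for b <= 0   (constraint inactive)
#   v(b) = b**2 / 2  for b >  0   (constraint active)

def v(b):
    x = max(b, 0.0)          # optimal solution of the 1-D QP
    return 0.5 * x * x

def second_diff(b, h=1e-4):
    # central second difference approximates v''(b)
    return (v(b + h) - 2 * v(b) + v(b - h)) / (h * h)

left, right = second_diff(-1.0), second_diff(1.0)
print(left, right)   # curvature jumps from 0 to 1 across b = 0
```

Here v is differentiable everywhere, but its second derivative is discontinuous at b = 0, the boundary between the two pieces, which is exactly the kind of piecewise structure the abstract describes.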
R.: Strong convexity and directional derivatives of marginal values in two-stage stochastic programming
 Kall (Eds.) "Stochastic Programming - Numerical Techniques and Engineering Applications", Springer-Verlag, Lecture Notes in Economics and Mathematical Systems 423
, 1995
Cited by 2 (1 self)
Abstract: Two-stage stochastic programs with random right-hand side are considered. Verifiable sufficient conditions for the existence of second-order directional derivatives of marginal values are presented. The central role of the strong convexity of the expected recourse function, as well as of a Lipschitz stability result for optimal sets, is emphasized.
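The way taking expectations can produce a strongly convex recourse function out of merely piecewise-linear scenario problems can be sketched with a toy model of my own (not from the paper): the second-stage problem min{y : y ≥ ξ − x, y ≥ 0} has optimal value max(ξ − x, 0), and with ξ ~ Uniform(0, 1) the expected recourse on [0, 1] is (1 − x)²/2, whose second derivative is 1 > 0.

```python
import random

# Hypothetical toy model (not from the paper): second-stage problem
#   Q(x, xi) = min { y : y >= xi - x, y >= 0 } = max(xi - x, 0).
# With xi ~ Uniform(0, 1), the expected recourse on [0, 1] is
#   EQ(x) = (1 - x)**2 / 2,  so EQ''(x) = 1 > 0: strongly convex there,
# although each single-scenario recourse is only piecewise linear.

def recourse(x, xi):
    return max(xi - x, 0.0)

def expected_recourse_mc(x, n=200_000, seed=0):
    rng = random.Random(seed)
    return sum(recourse(x, rng.random()) for _ in range(n)) / n

x = 0.3
mc = expected_recourse_mc(x)
exact = (1 - x) ** 2 / 2
print(mc, exact)   # Monte Carlo estimate vs. the closed form
```

Averaging over the continuous distribution smooths out the kinks of the individual scenarios, which is the mechanism behind the strong-convexity conditions the abstract refers to.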
SPACE MAPPING FOR OPTIMAL CONTROL OF PARTIAL DIFFERENTIAL EQUATIONS
Abstract: Solving optimal control problems for nonlinear partial differential equations represents a significant numerical challenge due to the tremendous size and possible model difficulties (e.g., nonlinearities) of the discretized problems. In this paper, a novel space-mapping technique for solving the aforementioned problem class is introduced, analyzed, and tested. The advantage of the space-mapping approach compared to classical multigrid techniques lies in the flexibility of not only using grid coarsening as a model reduction but also employing (perhaps less nonlinear) surrogates. The space mapping is based on a regularization approach which, in contrast to other space-mapping techniques, results in a smooth mapping and thus avoids certain irregular situations at kinks. A new Broyden update formula for the sensitivities of the space map is also introduced. This quasi-Newton update is motivated by the usual secant condition combined with a secant condition resulting from differentiating the space-mapping surrogate. The overall algorithm employs a trust-region framework for global convergence. Issues involved in the computations are highlighted, and a report on a few illustrative numerical tests is given.
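The paper's modified update formula is not reproduced here, but the classical Broyden rank-one update that such sensitivity updates build on is easy to sketch: given a step s and the observed change y in the mapping, B⁺ = B + (y − Bs)sᵀ/(sᵀs) is the smallest modification of B that satisfies the secant condition B⁺s = y.

```python
# Classical Broyden rank-one update (the paper's formula for space-map
# sensitivities is a variant; only the classical rule is sketched here).
# Given step s and observed difference y = F(x + s) - F(x), the update
#   B+ = B + (y - B s) s^T / (s^T s)
# satisfies the secant condition  B+ s = y.

def broyden_update(B, s, y):
    n = len(s)
    Bs = [sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]
    sts = sum(sj * sj for sj in s)
    return [[B[i][j] + (y[i] - Bs[i]) * s[j] / sts for j in range(n)]
            for i in range(n)]

B = [[1.0, 0.0], [0.0, 1.0]]       # current Jacobian estimate
s = [1.0, 2.0]                     # step taken
y = [3.0, 1.0]                     # observed change in the mapping
B_new = broyden_update(B, s, y)
check = [sum(B_new[i][j] * s[j] for j in range(2)) for i in range(2)]
print(check)   # equals y: the secant condition B+ s = y holds
```

The abstract's point is that the modified update blends this secant condition with a second one obtained by differentiating the space-mapping surrogate; the rank-one structure above is the common starting point.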
On Concepts of Directional Differentiability A.
Abstract: Various definitions of directional derivatives in topological vector spaces are compared. Directional derivatives in the sense of Gâteaux, Fréchet, and Hadamard are singled out from the general framework of cr-directional differentiability. It is pointed out that, in the case of finite-dimensional spaces and locally Lipschitz mappings, all these concepts of directional differentiability are equivalent. The chain rule for directional derivatives of a composite mapping is discussed.
Key Words: directional derivatives, positively homogeneous mapping, locally Lipschitz mapping, chain rule.
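A small illustration of my own (not from the paper) of why these one-sided notions are needed: f(x) = |x| has the directional derivative f'(0; d) = |d| at the origin, which is positively homogeneous in d but not linear, so f is directionally differentiable at 0 without being Gâteaux differentiable there.

```python
# Illustration (not from the paper): f(x) = |x| at x = 0 has
#   f'(0; d) = lim_{t -> 0+} (f(0 + t*d) - f(0)) / t = |d|,
# positively homogeneous but nonlinear in d, so f is directionally
# differentiable at 0 without being Gateaux differentiable there.

def directional_derivative(f, x, d, t=1e-8):
    return (f(x + t * d) - f(x)) / t   # one-sided difference quotient

f = abs
dplus = directional_derivative(f, 0.0, 1.0)    # direction d = +1
dminus = directional_derivative(f, 0.0, -1.0)  # direction d = -1
print(dplus, dminus)   # both 1.0: f'(0; d) = |d|
```

If the derivative were a linear map of d, the values for d = +1 and d = −1 would have opposite signs; here both are +1, exhibiting the purely one-sided, positively homogeneous behavior named in the key words.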
On Generalized Semi-Infinite Optimization and Bilevel
, 2000
Abstract: The paper studies the connections and differences between bilevel problems (BL) and generalized semi-infinite problems (GSIP). Under natural assumptions, a (GSIP) can be seen as a special case of a (BL). We consider the so-called reduction approach for (BL) and (GSIP), leading to optimality conditions and Newton-type methods for solving the problems. We show by a structural analysis that, for (GSIP) problems, the regularity assumptions for the reduction approach can be expected to hold generically at a solution, but for general (BL) problems they cannot. The genericity behavior of (BL) and (GSIP) is studied in particular for linear problems.
Characterization of the Smoothness and Curvature of a Marginal Function for a TrustRegion Problem
, 1997
Abstract: This paper studies the smoothness and curvature of a marginal function for a trust-region problem. In this problem, a quadratic function is minimized over an ellipsoid. The marginal function considered is obtained by perturbing the trust radius, i.e., by changing the size of the ellipsoidal constraint. The values of the marginal function and of its first and second derivatives are explicitly calculated in all possible scenarios. A complete study of the smoothness and curvature of this marginal function is given. The main motivation for this work arises from an application in statistics.
Keywords: marginal or value function, perturbation or sensitivity analysis, trust regions
AMS subject classifications: 65U05, 90C20, 90C30, 90C31
1 Introduction
Consider the following minimization problem:
$$\min \; q(s) \equiv g^\top s + \tfrac{1}{2} s^\top H s \quad \text{subject to} \quad \|s\| \le \Delta, \tag{1.1}$$
where $\Delta \in \mathbb{R}_+$, $s, g \in \mathbb{R}^n$, $H \in \mathbb{R}^{n \times n}$, $H = H^\top$, and $n$ is a positive integer. The function $\|\cdot\|$ ...
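For the special case H = I (a simplification of my own; the paper handles general symmetric H), the marginal value v(Δ) can be written in closed form and its smoothness checked directly: the minimizer is s = −Δg/‖g‖ while Δ < ‖g‖ and the unconstrained point s = −g afterwards.

```python
import math

# Hypothetical special case H = I (the paper treats general symmetric H):
#   v(D) = min { g.s + 0.5*||s||^2 : ||s|| <= D }.
# The minimizer is s = -D*g/||g|| while D < ||g||, and s = -g afterwards:
#   v(D) = -D*||g|| + D**2 / 2   for D <= ||g||
#   v(D) = -||g||**2 / 2         for D >  ||g||
# v and v' are continuous at D = ||g||; only v'' jumps there (1 -> 0).

def marginal_value(D, g):
    ng = math.sqrt(sum(gi * gi for gi in g))
    if D <= ng:
        return -D * ng + 0.5 * D * D
    return -0.5 * ng * ng

g = [3.0, 4.0]                      # ||g|| = 5
h = 1e-6
slope_at_norm = (marginal_value(5.0 + h, g)
                 - marginal_value(5.0 - h, g)) / (2 * h)
print(marginal_value(2.0, g), slope_at_norm)
```

The first derivative is continuous across Δ = ‖g‖ (the central difference there is essentially zero), while the curvature drops from 1 to 0, a simple instance of the scenario-by-scenario smoothness analysis the abstract describes.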