Results 1–10 of 10
LAGRANGE MULTIPLIERS AND OPTIMALITY
1993
Abstract

Cited by 89 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
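For reference, the classical view described in this abstract, multipliers as auxiliary variables that turn constrained minimization into a system of equations, is the standard textbook Lagrange form (general background, not material specific to this paper):

```latex
% Lagrangian for  min f(x)  subject to  g_i(x) = 0,  i = 1, ..., m
L(x, \lambda) \;=\; f(x) + \sum_{i=1}^{m} \lambda_i \, g_i(x)

% First-order optimality: a system of n + m equations in (x, \lambda)
\nabla_x L(x, \lambda) \;=\; \nabla f(x) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x) \;=\; 0,
\qquad g_i(x) = 0, \quad i = 1, \dots, m.
```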
Primal-dual projected gradient algorithms for extended linear-quadratic programming
 SIAM J. Optimization
Abstract

Cited by 16 (2 self)
Abstract. Many large-scale problems in dynamic and stochastic optimization can be modeled with extended linear-quadratic programming, which admits penalty terms and treats them through duality. In general the objective functions in such problems are only piecewise smooth and must be minimized or maximized relative to polyhedral sets of high dimensionality. This paper proposes a new class of numerical methods for “fully quadratic” problems within this framework, which exhibit second-order nonsmoothness. These methods, combining the idea of finite-envelope representation with that of modified gradient projection, work with local structure in the primal and dual problems simultaneously, feeding information back and forth to trigger advantageous restarts. Versions resembling steepest descent methods and conjugate gradient methods are presented. When a positive threshold of ε-optimality is specified, both methods converge in a finite number of iterations. With threshold 0, it is shown under mild assumptions that the steepest descent version converges linearly, while the conjugate gradient version still has a finite termination property. The algorithms are designed to exploit features of primal and dual decomposability of the Lagrangian, which are typically available in a large-scale setting, and they are open to considerable parallelization. Key words. Extended linear-quadratic programming, large-scale numerical optimization, finite-envelope representation, gradient projection, primal-dual methods, steepest descent methods, conjugate gradient methods. AMS(MOS) subject classifications. 65K05, 65K10, 90C20
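The gradient projection building block mentioned in this abstract can be illustrated, in a much-simplified form, by plain projected gradient descent on a small box-constrained quadratic. This is a minimal sketch under illustrative data; the paper's actual primal-dual, finite-envelope scheme is considerably more elaborate, and the function names here are made up for the example:

```python
def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi], componentwise clipping
    return [min(max(xi, l), h) for xi, l, h in zip(x, lo, hi)]

def projected_gradient(P, q, lo, hi, x0, step=0.1, iters=500):
    # Minimize 0.5 x'Px - q'x over the box [lo, hi] by projected gradient:
    #   x <- proj( x - step * (Px - q) )
    n = len(x0)
    x = list(x0)
    for _ in range(iters):
        grad = [sum(P[i][j] * x[j] for j in range(n)) - q[i] for i in range(n)]
        x = project_box([xi - step * gi for xi, gi in zip(x, grad)], lo, hi)
    return x
```

For example, with P = 2I, q = (4, -4), and the box [0, 1]^2, the unconstrained minimizer (2, -2) lies outside the box, and the iteration settles at the projected solution (1, 0).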
Newton's Method for Quadratic Stochastic Programs with Recourse
Journal of Computational and Applied Mathematics, 1995
Abstract

Cited by 10 (8 self)
Quadratic stochastic programs (QSP) with recourse can be formulated as nonlinear convex programming problems. By attaching a Lagrange multiplier vector to the nonlinear convex program, a QSP is written as a system of nonsmooth equations. A Newton-like method for solving the QSP is proposed, and global convergence and local superlinear convergence of the method are established. The current method is more general than previous methods, which were developed for box-diagonal and fully quadratic QSP. Numerical experiments are given to demonstrate the efficiency of the algorithm, and to compare the use of Monte Carlo rules and lattice rules for multiple integration in the algorithm. Keywords: Newton's method, quadratic stochastic programs, nonsmooth equations. Short title: Newton's method for stochastic programs. This work is supported by the Australian Research Council.
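The idea of rewriting optimality conditions as a system of nonsmooth equations and attacking them with a Newton-like method can be sketched on a tiny scalar complementarity problem via the Fischer–Burmeister function. This is a generic illustration only, not the paper's QSP algorithm, and the function names are invented for the example:

```python
import math

def fb(a, b):
    # Fischer-Burmeister NCP function: fb(a, b) = 0  iff  a >= 0, b >= 0, a*b = 0
    return math.sqrt(a * a + b * b) - a - b

def semismooth_newton(F, dF, x0, tol=1e-10, max_iter=50):
    # Newton-like iteration on phi(x) = fb(x, F(x)); the "derivative" used
    # is an element of the generalized derivative, valid away from the
    # kink of fb at (0, 0).
    x = x0
    for _ in range(max_iter):
        a, b = x, F(x)
        phi = fb(a, b)
        if abs(phi) < tol:
            break
        r = math.sqrt(a * a + b * b)
        dphi = (a + b * dF(x)) / r - 1.0 - dF(x)  # generalized derivative (r > 0)
        x -= phi / dphi
    return x
```

For instance, the complementarity problem 0 ≤ x ⊥ x − 1 ≥ 0 has the solution x = 1, and the iteration recovers it from the starting point x = 2.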
Large-scale extended linear-quadratic programming and multistage optimization
Advances in Numerical Partial Differential Equations and Optimization, chapter 15, 1991
Abstract

Cited by 4 (1 self)
Abstract. Optimization problems in discrete time can be modeled more flexibly by extended linear-quadratic programming than by traditional linear or quadratic programming, because penalties and other expressions that may substitute for constraints can readily be incorporated and dualized. At the same time, dynamics can be written with state vectors as in dynamic programming and optimal control. This suggests new primal-dual approaches to solving multistage problems. The special setting for such numerical methods is described. New results are presented on the calculation of gradients of the primal and dual objective functions and on the convergence effects of strict quadratic regularization.
SEMISMOOTH SQP METHOD FOR EQUALITY-CONSTRAINED OPTIMIZATION PROBLEMS WITH AN APPLICATION TO THE LIFTED REFORMULATION OF MATHEMATICAL PROGRAMS WITH COMPLEMENTARITY CONSTRAINTS
2010
Abstract

Cited by 2 (2 self)
We consider the sequential quadratic programming algorithm (SQP) applied to equality-constrained optimization problems, where the problem data is differentiable with Lipschitz-continuous first derivatives. For this setting, Dennis–Moré type analysis of primal superlinear convergence is presented. Our main motivation is a special modification of SQP tailored to the structure of the lifted reformulation of mathematical programs with complementarity constraints (MPCC). For this problem, we propose a special positive definite modification of the matrices in the generalized Hessian, which is suitable for globalization of SQP based on the penalty function, and at the same time can be expected to satisfy our general Dennis–Moré type conditions, thus preserving local superlinear convergence. (Standard quasi-Newton updates in the SQP framework require twice differentiability of the problem data at the solution for superlinear convergence.) Preliminary numerical results comparing a number of quasi-Newton versions of semismooth SQP applied to MPCC are also reported. Key words: sequential quadratic programming, semismoothness, B-differential, BD-regularity, semismooth Newton method, second-order sufficiency, mathematical programs with complementarity constraints.
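The basic SQP mechanism for equality constraints, Newton steps on the KKT system, can be sketched on a smooth toy problem: minimize x² + y² subject to xy = 1. This is a generic smooth illustration, not the semismooth or MPCC-tailored variant analyzed in the paper, and the helper names are invented for the example:

```python
def solve3(A, b):
    # Gaussian elimination with partial pivoting for a small dense system
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def sqp_equality(x, y, lam, iters=20):
    # Newton iteration on the KKT system of:  min x^2 + y^2  s.t.  x*y = 1
    for _ in range(iters):
        F = [2 * x + lam * y, 2 * y + lam * x, x * y - 1.0]   # KKT residual
        J = [[2.0, lam, y], [lam, 2.0, x], [y, x, 0.0]]       # KKT matrix
        d = solve3(J, [-f for f in F])
        x, y, lam = x + d[0], y + d[1], lam + d[2]
    return x, y, lam
```

From the starting point (x, y, λ) = (2, 1, −1) the iteration converges to the KKT point (1, 1, −2); each linear solve here plays the role of one SQP subproblem.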
A NOTE ON UPPER LIPSCHITZ STABILITY, ERROR BOUNDS, AND CRITICAL MULTIPLIERS FOR LIPSCHITZ-CONTINUOUS KKT SYSTEMS
2012
Abstract

Cited by 2 (2 self)
We prove a new local upper Lipschitz stability result and the associated local error bound for solutions of parametric Karush–Kuhn–Tucker systems corresponding to variational problems with Lipschitzian base mappings and constraints possessing Lipschitzian derivatives, and without any constraint qualifications. This property is equivalent to the notion of noncriticality of the Lagrange multiplier associated to the primal solution, appropriately extended to this nonsmooth setting; noncriticality is weaker than second-order sufficiency. All this extends several results previously known only for optimization problems with twice differentiable data, or assuming some constraint qualifications. In addition, our results are obtained in the more general variational setting.
THE JOSEPHY–NEWTON METHOD FOR SEMISMOOTH GENERALIZED EQUATIONS AND SEMISMOOTH SQP FOR OPTIMIZATION
2011
Abstract

Cited by 1 (1 self)
While generalized equations with differentiable single-valued base mappings and the associated Josephy–Newton method have been studied extensively, the setting with semismooth base mapping had not been previously considered (apart from the two special cases of usual nonlinear equations and of Karush–Kuhn–Tucker optimality systems). We introduce for the general semismooth case appropriate notions of solution regularity and prove local convergence of the corresponding Josephy–Newton method. As an application, we immediately recover the known primal-dual local convergence properties of semismooth SQP, but also obtain some new results that complete the analysis of the SQP primal rate of convergence, including its quasi-Newton variant. Key words: generalized equation, B-differential, generalized Jacobian, BD-regularity, CD-regularity,
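For orientation, the Josephy–Newton subproblem for a generalized equation 0 ∈ F(x) + N_C(x) replaces the base mapping by its linearization at the current iterate; the form below is standard in the generalized-equations literature rather than quoted from this paper:

```latex
% Josephy--Newton subproblem at iterate x^k:
0 \;\in\; F(x^k) + J_k \,(x - x^k) + N_C(x),
% where J_k = F'(x^k) in the differentiable case, while in the semismooth
% case J_k is taken from a generalized derivative, e.g. the B-differential:
J_k \in \partial_B F(x^k).
```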
RESEARCH ARTICLE Semismooth SQP Method for Equality-Constrained Optimization Problems with an Application to the Lifted Reformulation of Mathematical Programs with Complementarity Constraints
2010
Abstract
We consider the sequential quadratic programming algorithm (SQP) applied to equality-constrained optimization problems, where the problem data is differentiable with Lipschitz-continuous first derivatives. For this setting, Dennis–Moré type analysis of primal superlinear convergence is presented. Our main motivation is a special modification of SQP tailored to the structure of the lifted reformulation of mathematical programs with complementarity constraints (MPCC). For this problem, we propose a special positive definite modification of the matrices in the generalized Hessian, which is suitable for globalization of SQP based on the penalty function, and at the same time can be expected to satisfy our general Dennis–Moré type conditions, thus preserving local superlinear convergence. (Standard quasi-Newton updates in the SQP framework require twice differentiability of the problem data at the solution for superlinear convergence.) Preliminary numerical results comparing a number of quasi-Newton versions of semismooth SQP applied to MPCC are also reported. Keywords: sequential quadratic programming; semismoothness; B-differential; BD-regularity; semismooth Newton method; second-order sufficiency; mathematical programs with complementarity constraints. AMS Subject Classification: 90C30; 90C33; 90C55; 65K05