Curvilinear Stabilization Techniques for Truncated Newton Methods in Large Scale Unconstrained Optimization: the . . .
SIAM J. Optim., 1998
Abstract

Cited by 15 (4 self)
The aim of this paper is to define a new class of minimization algorithms for solving large scale unconstrained problems. In particular we describe a stabilization framework, based on a curvilinear linesearch, which uses a combination of a Newton-type direction and a negative curvature direction. The motivation for using a negative curvature direction is to take into account the local nonconvexity of the objective function. On the basis of this framework, we propose an algorithm which uses the Lanczos method for determining, at each iteration, both a Newton-type direction and an effective negative curvature direction. The results of extensive numerical testing are reported, together with a comparison with the LANCELOT package. These results show that the algorithm is very competitive, which seems to indicate that the proposed approach is promising. 1 Introduction In this work, we deal with the definition of new efficient unconstrained minimization algorithms for solving large scal...
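The curvilinear step described in this abstract can be sketched in a few lines. The sketch below assumes a Moré–Sorensen-style curve x(α) = x + α²s + αd, where s is the Newton-type direction and d a negative curvature direction oriented so that gᵀd ≤ 0, together with a simple Armijo-type acceptance test; the paper's actual stabilization conditions may differ, and all names here are illustrative.

```python
import numpy as np

def curvilinear_search(f, grad, x, s, d, alpha0=1.0, beta=0.5, gamma=1e-4, max_iter=50):
    """Backtracking search along the curve x(alpha) = x + alpha^2 * s + alpha * d,
    combining a Newton-type direction s with a negative curvature direction d.
    The Armijo-like acceptance test below is an assumption, not the paper's
    exact condition."""
    g = grad(x)
    fx = f(x)
    slope = g @ d              # first-order decrease along d (assumed <= 0)
    alpha = alpha0
    for _ in range(max_iter):
        x_new = x + alpha**2 * s + alpha * d
        if f(x_new) <= fx + gamma * alpha * slope:
            return x_new, alpha
        alpha *= beta
    return x, 0.0

# toy nonconvex example: f(x, y) = x^2 - y^2, so the Hessian has a negative eigenvalue
f = lambda x: x[0]**2 - x[1]**2
grad = lambda x: np.array([2 * x[0], -2 * x[1]])
x0 = np.array([1.0, 0.1])
s = -grad(x0)                  # descent stand-in for the Newton-type direction
d = np.array([0.0, 1.0])       # negative curvature direction, g.d <= 0
x_new, alpha = curvilinear_search(f, grad, x0, s, d)
```

For α → 0 the curve leaves x tangentially along d, so the negative curvature information dominates small steps, while for α near 1 the Newton-type term α²s provides the fast local step.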
Convergence to Second Order Stationary Points in Inequality Constrained Optimization
Mathematics of Operations Research, 1998
Abstract

Cited by 8 (1 self)
We propose a new algorithm for the nonlinear inequality constrained minimization problem, and prove that it generates a sequence converging to points satisfying the KKT second order necessary conditions for optimality. The algorithm is a line search algorithm using directions of negative curvature, and it can be viewed as a nontrivial extension of corresponding known techniques from unconstrained to constrained problems. The main tools employed in the definition and in the analysis of the algorithm are a differentiable exact penalty function and results from the theory of LC¹ functions. Key Words: Inequality constrained optimization, KKT second order necessary conditions, penalty function, LC¹ function, negative curvature direction. 1 Introduction We are concerned with the inequality constrained minimization problem (P) min f(x) s.t. g(x) ≤ 0, where f : ℝⁿ → ℝ and g : ℝⁿ → ℝᵐ are three times continuously differentiable. Our aim is to develop an algorithm that g...
Exploiting Negative Curvature Directions in Linesearch Methods for Unconstrained Optimization
, 1997
Abstract

Cited by 5 (1 self)
In this paper we consider the definition of new efficient linesearch algorithms for solving large scale unconstrained optimization problems which exploit the local nonconvexity of the objective function. Existing algorithms of this class compute, at each iteration, two search directions: a Newton-type direction, which ensures global and fast convergence, and a negative curvature direction, which enables the iterates to escape from regions of local nonconvexity. A new point is then generated by performing a movement along a curve obtained by combining these two directions. However, the respective scaling of the directions is typically ignored. We propose a new algorithm which aims to avoid the scaling problem by selecting the more promising of the two directions, and then performs a step along this direction. The selection is based on a test on the rate of decrease of the quadratic model of the objective function. We prove global convergence to second-order critical points for the ne...
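The selection idea in this abstract, choosing one of the two directions rather than combining them, can be sketched by comparing the decrease each direction yields in the local quadratic model q(p) = gᵀp + ½pᵀHp. The paper compares rates of decrease; the plain model-value comparison below is a simplified stand-in, and the setup is illustrative.

```python
import numpy as np

def select_direction(g, H, s_newton, d_neg):
    """Pick the more promising of a Newton-type direction and a negative
    curvature direction by comparing the predicted decrease of the quadratic
    model q(p) = g.p + 0.5 * p.H.p (a simplified version of the paper's
    rate-of-decrease test)."""
    q = lambda p: g @ p + 0.5 * p @ (H @ p)
    return s_newton if q(s_newton) <= q(d_neg) else d_neg

# at x = (1, 0.1) for f(x, y) = x^2 - y^2 the Hessian is indefinite
g = np.array([2.0, -0.2])
H = np.diag([2.0, -2.0])
s_newton = -np.linalg.solve(H, g)    # Newton direction (no modification)
d_neg = np.array([0.0, 1.0])         # negative curvature direction, g.d <= 0
p = select_direction(g, H, s_newton, d_neg)
```

In this example the negative curvature direction predicts the larger model decrease (q(d_neg) = -1.2 versus q(s_newton) = -0.99), so it is the one selected, which matches the intent of escaping the nonconvex region.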
Relaxing Convergence Conditions To Improve The Convergence Rate
, 1999
Abstract

Cited by 3 (0 self)
Standard global convergence proofs are examined to determine why some algorithms perform better than others. We show that relaxing the conditions required to prove global convergence can improve an algorithm's performance. Further analysis indicates that minimizing an estimate of the distance to the minimum relaxes the convergence conditions in such a way as to improve an algorithm's convergence rate. A new linesearch algorithm based on these ideas is presented that does not force a reduction in the objective function at each iteration, yet allows the objective function to increase during an iteration only if this will result in faster convergence. Unlike the nonmonotone algorithms in the literature, the new conditions dynamically adjust to account for changes between the influence of curvature and descent. The result is an optimal algorithm in the sense that an estimate of the distance to the minimum is minimized at each iteration. The algorithm is shown to be well defi...
WOMBAT: A Program for Mixed Model Analyses by Restricted Maximum Likelihood. Version 1.0 User Notes
Abstract

Cited by 2 (0 self)
This document has been typeset using LaTeX2e with the hyperref package. This gives a document which is fully navigable: all references to other sections and citations are ‘clickable’ within-document links, and all links to external URLs can be accessed from within the PDF viewer (if it is configured to do so). © Karin Meyer 2006–2010. Permission is granted to make and distribute verbatim copies of this document, provided it is preserved complete and unmodified.
Cost Approximation Algorithms With Nonmonotone Line Searches for a General Class of Nonlinear Programs
, 1996
Abstract

Cited by 1 (1 self)
When solving ill-conditioned nonlinear programs by descent algorithms, the descent requirement may force the step lengths to become very small, resulting in very poor performance. Recently, suggestions have been made to circumvent this problem, among which is a class of approaches in which the objective value may be allowed to increase temporarily. Grippo et al. [GLL91] introduce nonmonotone line searches in the class of deflected gradient methods in unconstrained differentiable optimization; this technique allows longer steps (typically of unit length) to be taken, and is successfully applied to some ill-conditioned problems. This paper extends their nonmonotone approach and convergence results to the large class of cost approximation algorithms of Patriksson [Pat93b], and to optimization problems with both convex constraints and nondifferentiable objective functions. Key Words: Nondifferentiable optimization, cost approximation, nonmonotone algorithms. Abbreviated Title...
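The nonmonotone line search of Grippo, Lampariello and Lucidi that this abstract builds on can be sketched as follows: sufficient decrease is measured against the maximum of the last few objective values rather than the current one, so the objective may rise temporarily and unit steps survive on ill-conditioned problems. Parameter names and the fixed history length are illustrative.

```python
def nonmonotone_armijo(f, x, p, g, f_hist, gamma=1e-4, beta=0.5, max_iter=50):
    """Nonmonotone Armijo backtracking in the spirit of [GLL91]: accept a step
    of length alpha along p if
        f(x + alpha * p) <= max(recent f values) + gamma * alpha * g.p,
    where g.p < 0 is the directional derivative along the descent direction p."""
    f_ref = max(f_hist)            # reference value: worst of the recent iterates
    slope = g @ p                  # directional derivative, assumed negative
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * p) <= f_ref + gamma * alpha * slope:
            return alpha
        alpha *= beta
    return 0.0
```

For example, with f(z) = ||z||², x = (1, 1), p = -∇f(x) and a recent history [2.0, 2.5], the unit step lands at f = 2.0; a strictly monotone Armijo rule (reference value f(x) = 2.0) would reject it and backtrack, but the nonmonotone test accepts it against the reference value 2.5.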
in large scale optimization
, 2007
Abstract
In this paper we deal with the iterative computation of negative curvature directions of an objective function, within large scale optimization frameworks. In particular, suitable directions of negative curvature of the objective function represent an essential tool to guarantee convergence to second order critical points. However, an “adequate” negative curvature direction is often required to have a good resemblance to an eigenvector corresponding to the smallest eigenvalue of the Hessian matrix. Thus, its computation may be a very difficult task on large scale problems. Several strategies proposed in the literature compute such a direction by relying on matrix factorizations, so that they may be inefficient or even impracticable in a large scale setting. On the other hand, the iterative methods proposed either need to store a large matrix or need to rerun the recurrence. On this guideline, in this paper we propose the use of an iterative method based on a planar Conjugate Gradient scheme. Under mild assumptions, we provide theory for using the latter method to compute adequate negative curvature directions within optimization frameworks. In our proposal any matrix storage is avoided, along with any additional rerun.
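The simplest iterative route to a negative curvature direction, which the planar CG scheme of this paper refines, is the classical truncated-Newton safeguard: run conjugate gradients on Hx = -g and stop as soon as a search direction p with pᵀHp ≤ 0 appears; that p is a negative curvature direction. The sketch below uses an explicit matrix for clarity (the paper's point is precisely to avoid storing H, e.g. by using Hessian-vector products); it is a model, not the paper's planar recurrence.

```python
import numpy as np

def cg_negative_curvature(H, g, tol=1e-8, max_iter=None):
    """Conjugate gradient on H x = -g with a curvature test: if a search
    direction p has nonpositive curvature p.H.p <= 0, return it as a negative
    curvature direction along with the current iterate."""
    n = len(g)
    max_iter = max_iter or 2 * n
    x = np.zeros(n)
    r = -g.astype(float).copy()    # residual of H x = -g
    p = r.copy()
    for _ in range(max_iter):
        curv = p @ (H @ p)
        if curv <= 0:
            return x, p            # nonpositive curvature detected
        alpha = (r @ r) / curv
        x = x + alpha * p
        r_new = r - alpha * (H @ p)
        if np.linalg.norm(r_new) < tol:
            return x, None         # system solved; no negative curvature met
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x, None
```

On an indefinite H = diag(2, -1) with g = (1, 1), the second CG direction already exposes the negative eigenvalue, illustrating why CG-type recurrences are a natural source of such directions; the drawback the paper addresses is that recovering a good direction this way may require storing iterates or rerunning the recurrence.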
Abstract
We present a general arc search algorithm for linearly constrained optimization. The method constructs and searches along smooth arcs that satisfy a small and practical set of properties. An active-set strategy is used to manage linear inequality constraints. When second derivatives are used, the method is shown to converge to a second-order critical point and to have a quadratic rate of convergence under standard conditions. The theory is applied to the methods of line search, curvilinear search, and modified gradient flow that have previously been proposed for unconstrained problems. A key issue when generalizing unconstrained methods to linearly constrained problems using an active-set strategy is the complexity of how the arc intersects hyperplanes. We introduce a new arc that is derived from the regularized Newton equation. Computing the intersection between this arc and a linear constraint reduces to finding the roots of a quadratic polynomial. The new arc scales to large problems, does not require modification to the Hessian, and is rarely dependent on the scaling of directions of negative curvature. Numerical experiments show the effectiveness of this arc search method on problems from the CUTEr test set and on a specific class of problems for which identifying negative curvature is critical. A second set of experiments demonstrates that when using SR1 quasi-Newton updates, this arc search method is competitive with a line search method using
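The root-finding reduction mentioned in this abstract can be made concrete with a model arc. Assuming an arc of the form x(t) = x + t·d₁ + t²·d₂ (a quadratic-in-t curve, used here only as a stand-in for the arcs in the text), substituting it into a linear constraint aᵀx = b gives a scalar quadratic in t that is solved in closed form; the names and degenerate-case handling below are illustrative.

```python
import numpy as np

def arc_hyperplane_intersection(x, d1, d2, a, b):
    """Return the parameter values t >= 0 at which the model arc
    x(t) = x + t*d1 + t^2*d2 meets the hyperplane a.x = b.
    Substitution yields (a.d2) t^2 + (a.d1) t + (a.x - b) = 0."""
    c2 = a @ d2
    c1 = a @ d1
    c0 = a @ x - b
    if abs(c2) < 1e-14:            # arc is affine in the constraint normal
        if abs(c1) < 1e-14:
            return []              # arc parallel to the hyperplane
        t = -c0 / c1
        return [t] if t >= 0 else []
    disc = c1 * c1 - 4.0 * c2 * c0
    if disc < 0:
        return []                  # arc never crosses the hyperplane
    sq = np.sqrt(disc)
    roots = [(-c1 - sq) / (2 * c2), (-c1 + sq) / (2 * c2)]
    return sorted(t for t in roots if t >= 0)
```

For example, with x = (0, 0), d₁ = (1, 0), d₂ = (0, 1) and the constraint x + y = 2, the quadratic t² + t - 2 = 0 gives roots t = 1 and t = -2, so the arc first meets the constraint at t = 1. In an active-set method the smallest nonnegative root bounds the feasible step along the arc, which is why a closed-form quadratic makes the intersection test cheap.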