GLOBAL CONVERGENCE PROPERTIES OF CONJUGATE GRADIENT METHODS FOR OPTIMIZATION
, 1992
Cited by 129 (3 self)
This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher–Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak–Ribière method. Numerical experiments are presented.
Line Search Algorithms With Guaranteed Sufficient Decrease
 ACM Trans. Math. Software
, 1992
Cited by 121 (0 self)
The problem of finding a point that satisfies the sufficient decrease and curvature conditions is formulated in terms of finding a point in a set T(μ). We describe a search algorithm for this problem that produces a sequence of iterates that converge to a point in T(μ) and that, except for pathological cases, terminates in a finite number of steps. Numerical results for an implementation of the search algorithm on a set of test functions show that the algorithm terminates within a small number of iterations. LINE SEARCH ALGORITHMS WITH GUARANTEED SUFFICIENT DECREASE. Jorge J. Moré and David J. Thuente. 1 Introduction. Given a continuously differentiable function φ: ℝ → ℝ defined on [0, ∞) with φ′(0) < 0, and constants μ and η in (0, 1), we are interested in finding an α > 0 such that φ(α) ≤ φ(0) + μ φ′(0) α (1.1) and |φ′(α)| ≤ η |φ′(0)|. (1.2) The development of a search procedure that satisfies these conditions is a crucial ingredient in a line search meth...
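The sufficient decrease condition (1.1) and curvature condition (1.2) above can be checked directly once φ(α) = f(x + αd) and its derivative are available. A minimal sketch, assuming illustrative function and parameter names (phi, dphi, mu, eta are not from the paper's code):

```python
# Hedged sketch (not More-Thuente's search algorithm itself): test whether
# a trial step alpha satisfies the sufficient decrease condition (1.1) and
# the curvature condition (1.2) for phi(alpha) = f(x + alpha * d).

def satisfies_wolfe(phi, dphi, alpha, mu=1e-4, eta=0.9):
    """Return True if alpha meets both (1.1) and (1.2)."""
    sufficient_decrease = phi(alpha) <= phi(0) + mu * dphi(0) * alpha  # (1.1)
    curvature = abs(dphi(alpha)) <= eta * abs(dphi(0))                 # (1.2)
    return sufficient_decrease and curvature
```

For example, with phi(a) = (a − 1)² the exact minimizer α = 1 passes both tests, while a very small step passes (1.1) but fails the curvature test (1.2).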
Theory of Algorithms for Unconstrained Optimization
, 1992
Cited by 111 (1 self)
In this article I will attempt to review the most recent advances in the theory of unconstrained optimization, and will also describe some important open questions. Before doing so, I should point out that the value of the theory of optimization is not limited to its capacity for explaining the behavior of the most widely used techniques. The question ...
A new conjugate gradient method with guaranteed descent and an efficient line search
 SIAM J. OPTIM
, 2005
Cited by 78 (6 self)
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes–Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition g_k^T d_k ≤ −(7/8) ‖g_k‖². Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A new line search scheme is developed that is efficient and highly accurate. Efficiency is achieved by exploiting properties of linear interpolants in a neighborhood of a local minimizer. High accuracy is achieved by using a convergence criterion, which we call the "approximate Wolfe" conditions, obtained by replacing the sufficient decrease criterion in the Wolfe conditions with an approximation that can be evaluated with greater precision in a neighborhood of a local minimum than the usual sufficient decrease criterion. Numerical comparisons are given with both L-BFGS and conjugate gradient methods using the unconstrained optimization problems in the CUTE library.
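One common statement of the "approximate Wolfe" test mentioned above accepts a step α when σφ′(0) ≤ φ′(α) ≤ (2δ − 1)φ′(0), with 0 < δ < 1/2 and δ ≤ σ < 1, so it can be evaluated from derivative information alone. A hedged sketch with illustrative parameter defaults, not the paper's actual implementation:

```python
# Hedged sketch of an approximate-Wolfe acceptance test: for
# phi(alpha) = f(x + alpha*d), accept alpha when
#     sigma * phi'(0) <= phi'(alpha) <= (2*delta - 1) * phi'(0).
# delta and sigma defaults are illustrative, not the paper's code.

def approximate_wolfe(dphi0, dphi_alpha, delta=0.1, sigma=0.9):
    """dphi0 = phi'(0) (assumed negative); dphi_alpha = phi'(alpha)."""
    return sigma * dphi0 <= dphi_alpha <= (2.0 * delta - 1.0) * dphi0
```

For phi(a) = (a − 1)² with φ′(0) = −2, the minimizer α = 1 (where φ′ = 0) is accepted, while α = 0 is not.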
A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
 SIAM J. Optim
, 1999
Cited by 73 (9 self)
Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. However, the strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally provided the line search satisfies the standard Wolfe conditions. The conditions on the objective function are also weak, similar to those required by the Zoutendijk condition. Key words: unconstrained optimization, new conjugate gradient method, Wolfe conditions, global convergence. AMS subject classifications: 65K, 90C. 1 Introduction. Our problem is to minimize a function of n variables, min f(x), (1.1) where f is smooth and its gradient g(x) is available. Conjugate gradient methods for solving (1.1) are iterative methods of the form x_{k+1} = x_k + α_k d_k, (1.2) where α_k > 0 is a steplength and d_k is a search direction. Normally the search direction at...
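The iteration (1.2) is easiest to see on a strictly convex quadratic, where the exact line search step is available in closed form. The sketch below is a toy illustration using the Dai–Yuan choice β_k = ‖g_{k+1}‖² / (d_k^T (g_{k+1} − g_k)); it is not the authors' implementation, and the function and variable names are illustrative:

```python
import numpy as np

# Illustrative sketch: the CG iteration x_{k+1} = x_k + alpha_k d_k on a
# convex quadratic f(x) = 0.5 x^T A x - b^T x (gradient g(x) = A x - b),
# with exact line search and the Dai-Yuan beta.

def dai_yuan_cg(A, b, x0, tol=1e-10, max_iter=50):
    x = x0.astype(float)
    g = A @ x - b            # gradient at the current iterate
    d = -g                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)          # exact minimizer along d
        x = x + alpha * d                        # iteration (1.2)
        g_new = A @ x - b
        beta = (g_new @ g_new) / (d @ (g_new - g))  # Dai-Yuan formula
        d = -g_new + beta * d
        g = g_new
    return x
```

On a quadratic with exact line searches this reduces to linear CG, so it reaches the minimizer of an n-dimensional problem in at most n steps.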
A survey of nonlinear conjugate gradient methods,
 Pacific J. Optim.,
, 2006
Limited-Memory Matrix Methods with Applications
, 1997
Cited by 33 (6 self)
Abstract. The focus of this dissertation is on matrix decompositions that use a limited amount of computer memory, thereby allowing problems with a very large number of variables to be solved. Specifically, we will focus on two application areas: optimization and information retrieval. We introduce a general algebraic form for the matrix update in limited-memory quasi-Newton methods. Many well-known methods such as limited-memory Broyden Family methods satisfy the general form. We are able to prove several results about methods which satisfy the general form. In particular, we show that the only limited-memory Broyden Family method (using exact line searches) that is guaranteed to terminate within n iterations on an n-dimensional strictly convex quadratic is the limited-memory BFGS method. Furthermore, we are able to introduce several new variations on the limited-memory BFGS method that retain the quadratic termination property. We also have a new result that shows that full-memory Broyden Family methods (using exact line searches) that skip p updates to the quasi-Newton matrix will terminate in no more than n + p steps on an n-dimensional strictly convex quadratic. We propose several new variations on the limited-memory BFGS method
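For context, the best-known limited-memory quasi-Newton update mentioned here, limited-memory BFGS, is usually implemented via the standard two-loop recursion. A minimal sketch of that textbook recursion (not the dissertation's general algebraic form; the names s_list, y_list are illustrative):

```python
import numpy as np

# Hedged sketch of the standard L-BFGS two-loop recursion. s_list/y_list
# hold the m most recent pairs s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k,
# oldest first; the result approximates the Newton direction -H^{-1} g.

def lbfgs_direction(grad, s_list, y_list):
    q = grad.copy()
    alphas = []
    # First loop: newest pair to oldest.
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial Hessian scaling gamma = s^T y / y^T y from the newest pair.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest.
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        beta = rho * (y @ q)
        q += (a - beta) * s
    return -q  # search direction
```

With an empty memory the recursion returns the steepest descent direction −g; with enough curvature pairs from a quadratic it recovers the Newton direction.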
Algorithm 851: CG DESCENT, a conjugate gradient method with guaranteed descent
 ACM Trans. Math. Softw
, 2006
Cited by 25 (4 self)
Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition g_k^T d_k ≤ −(7/8) ‖g_k‖² and which is globally convergent whenever the line search fulfills the Wolfe conditions. This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.
Convergence Properties Of Nonlinear Conjugate Gradient Methods
 Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences
, 1998
Cited by 21 (7 self)
Recently, important contributions to convergence studies of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a "sufficient descent condition" to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. Reference [6] hints that the sufficient descent condition, which was enforced by their two-stage line search algorithm, may be crucial for ensuring the global convergence of conjugate gradient methods. This paper shows that the sufficient descent condition is actually not needed in the convergence analyses of conjugate gradient methods. Consequently, convergence results on the Fletcher–Reeves-type and Polak–Ribière-type methods are established in the absence of the sufficient descent condition. To show the differences between the convergence properties of Fletcher–Reeves-type and Polak–Ribière-type methods, two examples are constructed, showing that neither the boundedness of the ...