Results 1–10 of 14
Line Search Algorithms With Guaranteed Sufficient Decrease
ACM Trans. Math. Software, 1992
Abstract

Cited by 86 (0 self)
The problem of finding a point that satisfies the sufficient decrease and curvature conditions is formulated in terms of finding a point in a set T(μ). We describe a search algorithm for this problem that produces a sequence of iterates that converge to a point in T(μ) and that, except for pathological cases, terminates in a finite number of steps. Numerical results for an implementation of the search algorithm on a set of test functions show that the algorithm terminates within a small number of iterations. LINE SEARCH ALGORITHMS WITH GUARANTEED SUFFICIENT DECREASE. Jorge J. Moré and David J. Thuente. 1 Introduction. Given a continuously differentiable function φ : ℝ → ℝ defined on [0, ∞) with φ′(0) < 0, and constants μ and η in (0, 1), we are interested in finding an α > 0 such that φ(α) ≤ φ(0) + μ φ′(0) α (1.1) and |φ′(α)| ≤ η |φ′(0)|. (1.2) The development of a search procedure that satisfies these conditions is a crucial ingredient in a line search meth...
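Conditions (1.1) and (1.2) above are the strong Wolfe conditions. A minimal Python sketch of checking them follows, with a crude backtracking loop standing in for the paper's bracketing-and-interpolation algorithm; the names `satisfies_wolfe` and `backtracking_search` and the sample φ are illustrative assumptions, not the authors' code.

```python
# Illustration of the sufficient-decrease condition (1.1) and the strong
# curvature condition (1.2) for a line-search function phi(alpha).
# NOT the More-Thuente algorithm -- only a sketch of the conditions it enforces.

def satisfies_wolfe(phi, dphi, alpha, mu=1e-4, eta=0.9):
    """(1.1): phi(a) <= phi(0) + mu*phi'(0)*a,  (1.2): |phi'(a)| <= eta*|phi'(0)|."""
    decrease = phi(alpha) <= phi(0) + mu * dphi(0) * alpha
    curvature = abs(dphi(alpha)) <= eta * abs(dphi(0))
    return decrease and curvature

def backtracking_search(phi, dphi, alpha0=1.0, mu=1e-4, eta=0.9, max_iter=50):
    """Crudely halve alpha until both conditions hold (fine on simple phi)."""
    alpha = alpha0
    for _ in range(max_iter):
        if satisfies_wolfe(phi, dphi, alpha, mu, eta):
            return alpha
        alpha *= 0.5
    return alpha

# Example: phi(alpha) = (alpha - 1)^2, so phi'(0) = -2 < 0, minimizer at 1.
phi = lambda a: (a - 1.0) ** 2
dphi = lambda a: 2.0 * (a - 1.0)
alpha = backtracking_search(phi, dphi)
```

A production search, as in the paper, brackets an interval containing a point of T(μ) and interpolates rather than only halving α; simple halving can fail to satisfy (1.2) on harder functions.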
LARGE-SCALE LINEARLY CONSTRAINED OPTIMIZATION
, 1978
Abstract

Cited by 75 (11 self)
An algorithm for solving large-scale nonlinear programs with linear constraints is presented. The method combines efficient sparse-matrix techniques as in the revised simplex method with stable quasi-Newton methods for handling the nonlinearities. A general-purpose production code (MINOS) is described, along with computational experience on a wide variety of problems.
Efficient Training of Feed-Forward Neural Networks
, 1997
Abstract

Cited by 12 (0 self)
Appendix table of contents (the indexed snippet for this entry is a TOC excerpt rather than an abstract):
A.2 Introduction
A.2.1 Motivation
A.3 Optimization strategy
A.4 The Backpropagation algorithm
A.5 Conjugate direction methods
A.5.1 Conjugate gradients
A.5.2 The CGL algorithm
A.5.3 The BFGS algorithm
A.6 The SCG algorithm
A.7 Test results
A.7.1 Comparison metric ...
2. Specification
Abstract
nag opt bounds no deriv (e04jbc) is a comprehensive quasi-Newton algorithm for finding: an unconstrained minimum of a function of several variables, or a minimum of a function of several variables subject to fixed upper and/or lower bounds on the variables. No derivatives are required. The function nag opt bounds no deriv is intended for objective functions which have continuous first and second derivatives (although it will usually work even if the derivatives have occasional discontinuities).
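The abstract describes a quasi-Newton method that needs no user-supplied derivatives. As a hedged one-dimensional illustration of the idea (not the NAG implementation, and with no bound handling), one can approximate the gradient by forward differences and replace the second derivative by a secant estimate:

```python
def fd_grad(f, x, h=1e-6):
    """Forward-difference estimate of f'(x); derivative-free quasi-Newton
    codes approximate gradients internally in some such way (an assumption
    here, not the exact NAG scheme)."""
    return (f(x + h) - f(x)) / h

def secant_quasi_newton(f, x0, x1, tol=1e-8, max_iter=100):
    """1-D quasi-Newton sketch: replace f'' by a secant approximation built
    from finite-difference gradients, so only function values are needed."""
    g0, g1 = fd_grad(f, x0), fd_grad(f, x1)
    for _ in range(max_iter):
        denom = g1 - g0
        if abs(denom) < 1e-15:
            break
        x2 = x1 - g1 * (x1 - x0) / denom   # secant step on the gradient
        x0, g0 = x1, g1
        x1, g1 = x2, fd_grad(f, x2)
        if abs(g1) < tol:
            break
    return x1

# Minimize f(x) = (x - 3)^2 + 1 from the starting pair (0, 1).
f = lambda x: (x - 3.0) ** 2 + 1.0
xmin = secant_quasi_newton(f, 0.0, 1.0)
```

In several variables the same idea becomes a BFGS-style update of an approximate Hessian, again fed by finite-difference gradients; forward differences limit the attainable accuracy to roughly the square root of machine precision.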
double e1, double e2, double *a, double *b,
Abstract
e04abc nag opt one var no deriv (e04abc) 1. Purpose nag opt one var no deriv (e04abc) searches for a minimum, in a given finite interval, of a continuous function of a single variable, using function values only. The method (based on quadratic interpolation) is intended for functions which have a continuous first derivative (although it will usually work if the derivative has occasional discontinuities). 2. Specification
e04 - Minimizing or Maximizing a Function e04lbc nag opt bounds 2nd deriv (e04lbc)
Abstract
1. Purpose nag opt bounds 2nd deriv (e04lbc) is a comprehensive modified-Newton algorithm for finding: an unconstrained minimum of a function of several variables, or a minimum of a function of several variables subject to fixed upper and/or lower bounds on the variables. First and second derivatives are required. The function nag opt bounds 2nd deriv is intended for objective functions which have continuous first and second derivatives (although it will usually work even if the derivatives have occasional discontinuities). 2. Specification
#include <nag.h>
#include <nage04.h>
void nag_opt_bounds_2nd_deriv(Integer n,
    void (*objfun)(Integer n, double x[], double *objf, double g[], Nag_Comm *comm),
    void (*hessfun)(Integer n, double x[], double h[],
3. Description
Abstract
nag opt one var no deriv (e04abc) searches for a minimum, in a given finite interval, of a continuous function of a single variable, using function values only. The method (based on quadratic interpolation) is intended for functions which have a continuous first derivative (although it will usually work if the derivative has occasional discontinuities).
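As a sketch of what minimization by quadratic interpolation from function values only looks like, here is an unsafeguarded successive-parabolic-interpolation loop. It is illustrative only; a production routine like e04abc adds bracketing safeguards that this sketch omits, and the helper names are assumptions.

```python
def parabolic_vertex(a, fa, b, fb, c, fc):
    """Vertex of the parabola through (a, fa), (b, fb), (c, fc)."""
    num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
    den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
    if den == 0.0:
        return b  # degenerate (collinear) case: give up and return b
    return b - 0.5 * num / den

def quad_interp_min(f, a, c, tol=1e-8, max_iter=60):
    """Minimize f on [a, c] by successive parabolic interpolation,
    using function values only (no safeguards)."""
    b = 0.5 * (a + c)
    fa, fb, fc = f(a), f(b), f(c)
    for _ in range(max_iter):
        x = parabolic_vertex(a, fa, b, fb, c, fc)
        if abs(x - b) < tol:
            break
        fx = f(x)
        # keep the three points with the smallest function values (crude update)
        pts = sorted([(fa, a), (fb, b), (fc, c), (fx, x)])[:3]
        (fa, a), (fb, b), (fc, c) = sorted(pts, key=lambda t: t[1])
    return b

# Example: minimum of (x - 2)^2 on [0, 5] is at x = 2.
xmin = quad_interp_min(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

On a smooth function near the minimizer each fitted parabola is an excellent local model, which is why the interpolation converges quickly; the safeguards in real codes exist for the far-from-minimum iterations this sketch handles crudely.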
3. Description
Abstract
nag opt one var deriv (e04bbc) searches for a minimum, in a given finite interval, of a continuous function of a single variable, using function and first derivative values. The method (based on cubic interpolation) is intended for functions which have a continuous first derivative (although it will usually work if the derivative has occasional discontinuities).
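The cubic-interpolation step such a routine relies on has a closed form: the minimizer of the cubic that matches f and f′ at two points is given by a standard Hermite-interpolation formula. The function below is an illustration of that formula, not the NAG code.

```python
import math

def cubic_interp_min(a, fa, ga, b, fb, gb):
    """Minimizer of the cubic matching f (fa, fb) and f' (ga, gb) at a and b.
    Standard Hermite-interpolation formula used in derivative-based line
    searches; assumes the discriminant is nonnegative (a minimizer exists)."""
    d1 = ga + gb - 3.0 * (fa - fb) / (a - b)
    d2 = math.copysign(math.sqrt(d1 * d1 - ga * gb), b - a)
    return b - (b - a) * (gb + d2 - d1) / (gb - ga + 2.0 * d2)

# Example: f(x) = (x - 1)^2 (a quadratic, reproduced exactly by the cubic).
# At a = 0: f = 1, f' = -2; at b = 3: f = 4, f' = 4; minimizer is x = 1.
xstar = cubic_interp_min(0.0, 1.0, -2.0, 3.0, 4.0, 4.0)
```

Because the step also uses derivative values, it typically converges faster than the quadratic (function-values-only) interpolation of e04abc, which is the trade-off the two NAG routines embody.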
2. Specification
Abstract
nag opt bounds 2nd deriv (e04lbc) is a comprehensive modified-Newton algorithm for finding: an unconstrained minimum of a function of several variables, or a minimum of a function of several variables subject to fixed upper and/or lower bounds on the variables. First and second derivatives are required. The function nag opt bounds 2nd deriv is intended for objective functions which have continuous first and second derivatives (although it will usually work even if the derivatives have occasional discontinuities).
#include
Abstract
e04bbc nag opt one var deriv (e04bbc) 1. Purpose nag opt one var deriv (e04bbc) searches for a minimum, in a given finite interval, of a continuous function of a single variable, using function and first derivative values. The method (based on cubic interpolation) is intended for functions which have a continuous first derivative (although it will usually work if the derivative has occasional discontinuities). 2. Specification