Results 1–10 of 14
On the limited memory BFGS method for large scale optimization
Mathematical Programming, 1989
Theory and implementation of numerical methods based on Runge-Kutta integration for solving optimal control problems
, 1996
On the Use of Element-by-Element Preconditioners to Solve Large Scale Partially Separable Optimization Problems
Abstract

Cited by 8 (5 self)
We study the solution of large-scale nonlinear optimization problems by methods which aim to exploit their inherent structure. In particular, we consider the all-pervasive property of partial separability, first studied by Griewank and Toint (1982b). A typical minimization method for nonlinear optimization problems approximately solves a sequence of simplified linearized subproblems. In this paper, we explore how partial separability may be exploited by iterative methods for solving these subproblems. We particularly address the issue of computing effective preconditioners for such iterative methods. Numerical experiments indicate the effectiveness of these preconditioners on large-scale examples. Keywords: large-scale problems, unconstrained optimization, element-by-element preconditioners, conjugate gradients. AMS(MOS) subject classifications: 65F05, 65F10, 65F15, 65F50, 65K05, 90C30. Also appeared as ENSEEIHT-IRIT report RT/APO/94/4. 1 Travel was funded, in part, by the ALLIANCE...
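The abstract above concerns preconditioning iterative solvers for linearized subproblems. As a minimal sketch (not the paper's method), here is a preconditioned conjugate gradient solver in plain Python, with a simple diagonal (Jacobi) preconditioner standing in for the element-by-element preconditioners the paper studies; the function names and the toy 2x2 system are invented for this illustration.

```python
# Preconditioned conjugate gradient (PCG) for a symmetric positive
# definite system A x = b. M_inv holds the inverse of a diagonal
# (Jacobi) preconditioner; an element-by-element preconditioner would
# be applied in the same places.

def pcg(A, b, M_inv, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                                       # residual (x = 0 initially)
    z = [M_inv[i] * r[i] for i in range(n)]        # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [M_inv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
M_inv = [1.0 / A[i][i] for i in range(2)]          # Jacobi preconditioner
x = pcg(A, b, M_inv)                               # exact solution: (1/11, 7/11)
```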
A Modified BFGS Method and Its Global Convergence in Nonconvex Minimization
, 1998
Abstract

Cited by 7 (2 self)
In this paper, we propose a modification of the BFGS method for unconstrained optimization. A remarkable feature of the proposed method is that it possesses a global convergence property even without a convexity assumption on the objective function. Under certain conditions, we also establish superlinear convergence of the method. Key words: BFGS method, global convergence, superlinear convergence 1 Present address (available before October, 1999): Department of Applied Mathematics and Physics, Graduate School of Engineering, Kyoto University, Kyoto 606, Japan, email: lidh@kuamp.kyoto-u.ac.jp 1 Introduction Let f : ℝⁿ → ℝ be continuously differentiable. Consider the following unconstrained optimization problem: min f(x), x ∈ ℝⁿ. (1.1) Among numerous iterative methods for solving (1.1), quasi-Newton methods constitute a particularly important class. Throughout the paper, we assume that f in (1.1) has Lipschitz continuous gradients, i.e. there is a constant L > 0 such that ‖g(x) − g(y)‖ ...
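For reference, the standard inverse-Hessian BFGS update that the paper modifies can be sketched in plain Python (2x2 case; all names are illustrative). The paper replaces the gradient-difference vector y to gain global convergence without convexity, but the update formula itself is unchanged, and for any pair (s, y) with yᵀs ≠ 0 it enforces the secant condition H⁺y = s checked below.

```python
# Standard inverse-Hessian BFGS update:
#   H_new = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,  rho = 1/(y^T s)

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def bfgs_inverse_update(H, s, y):
    rho = 1.0 / (y[0] * s[0] + y[1] * s[1])
    A = [[(1.0 if i == j else 0.0) - rho * s[i] * y[j] for j in range(2)]
         for i in range(2)]
    At = [[A[j][i] for j in range(2)] for i in range(2)]   # A^T = I - rho y s^T
    H_new = matmul(matmul(A, H), At)
    return [[H_new[i][j] + rho * s[i] * s[j] for j in range(2)]
            for i in range(2)]

H = [[1.0, 0.0], [0.0, 1.0]]      # current inverse-Hessian approximation
s = [1.0, 0.5]                    # step x_{k+1} - x_k
y = [2.0, 1.0]                    # gradient difference g_{k+1} - g_k
H_new = bfgs_inverse_update(H, s, y)
# Secant condition: applying H_new to y recovers s
Hy = [sum(H_new[i][j] * y[j] for j in range(2)) for i in range(2)]
```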
Algorithms for Solving Nonlinear Systems of Equations
, 1994
Abstract

Cited by 6 (1 self)
In this paper we survey numerical methods for solving nonlinear systems of equations F(x) = 0, where F : ℝⁿ → ℝⁿ. We are especially interested in large problems. We describe modern implementations of the main local algorithms, as well as their globally convergent counterparts. 1. INTRODUCTION Nonlinear systems of equations appear in many real-life problems. Moré [1989] has reported a collection of practical examples which include: Aircraft Stability problems, Inverse Elastic Rod problems, Equations of Radiative Transfer, Elliptic Boundary Value problems, etc. We have also worked with Power Flow problems, Distribution of Water on a Pipeline, Discretization of Evolution problems using Implicit Schemes, Chemical Plant Equilibrium problems, and others. The scope of applications becomes even greater if we include the family of Nonlinear Programming problems, since the first-order optimality conditions of these problems are nonlinear systems. Given F : ℝⁿ → ℝⁿ, F = (...
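As a concrete instance of the local algorithms the survey covers, here is a minimal Newton iteration for a toy 2x2 system F(x) = 0 with an analytic Jacobian. The system and names are invented for this sketch; production codes for large sparse problems solve the linear step with sparse factorizations, not Cramer's rule.

```python
# Toy system F(x, y) = (x^2 + y^2 - 1, x - y) = 0, whose positive
# solution is x = y = 1/sqrt(2).

def F(v):
    x, y = v
    return [x * x + y * y - 1.0, x - y]

def J(v):
    x, y = v
    return [[2.0 * x, 2.0 * y],
            [1.0, -1.0]]

def newton(v, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        f = F(v)
        if max(abs(f[0]), abs(f[1])) < tol:
            break
        (a, b), (c, d) = J(v)
        det = a * d - b * c
        # Newton step: solve J(v) s = -F(v) (Cramer's rule, fine for 2x2)
        s0 = (-f[0] * d + f[1] * b) / det
        s1 = (-f[1] * a + f[0] * c) / det
        v = [v[0] + s0, v[1] + s1]
    return v

root = newton([1.0, 0.5])
```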
Solving Nonlinear Systems Of Equations By Means Of Quasi-Newton Methods With A Nonmonotone Strategy
, 1997
Abstract

Cited by 6 (2 self)
A nonmonotone strategy for solving nonlinear systems of equations is introduced. The idea consists of combining efficient local methods with an algorithm that reduces monotonically the squared norm of the system in a proper way. The local methods used are Newton's method and two quasi-Newton algorithms. Global iterations are based on recently introduced box-constrained minimization algorithms. We present numerical experiments. 1 INTRODUCTION Given F : ℝⁿ → ℝⁿ, F = (f₁, …, fₙ)ᵀ, our aim is to find solutions of F(x) = 0. (1) We assume that F is well defined and has continuous partial derivatives on an open set of ℝⁿ. J(x) denotes the Jacobian matrix of partial derivatives of F(x). We are mostly interested in problems where n is large and J(x) is structurally sparse. This means that most entries of J(x) are zero for all x in the domain of F. The package Nightingale has been developed at the Department of Applied Mathematics of the University of Campinas for...
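A nonmonotone acceptance test in the spirit of the strategy above can be sketched as follows (the paper's exact rule may differ; all names and constants here are illustrative): a trial point is accepted when ‖F‖² falls sufficiently below the maximum of the last M merit values rather than below the most recent one, which lets the merit function rise temporarily without losing overall control.

```python
# Nonmonotone acceptance rule: compare the trial merit value against
# the maximum of the last few merit values instead of only the latest.
from collections import deque

def nonmonotone_accept(merit_trial, recent, gamma=1e-4):
    # recent: last M values of ||F(x_k)||^2 at accepted iterates
    return merit_trial <= max(recent) - gamma

recent = deque([2.5, 3.1, 2.8], maxlen=5)
# A monotone test would reject a trial merit of 2.9 (it exceeds the
# latest value 2.8); the nonmonotone test accepts it, since 2.9 is
# sufficiently below max(recent) = 3.1.
```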
A Globally and Superlinearly Convergent Gauss-Newton Based BFGS Method for Symmetric Equations
, 1998
Abstract

Cited by 5 (4 self)
In this paper, we present a Gauss-Newton based BFGS method for solving symmetric nonlinear equations which contain, as special cases, an unconstrained optimization problem, a saddle point problem and an equality constrained optimization problem. A suitable line search is proposed with which the presented BFGS method exhibits an approximate norm descent property. Under appropriate conditions, global convergence and superlinear convergence of the method are established. Key words: BFGS method, global convergence, superlinear convergence, symmetric equations 1 Present address (available until October, 1999): Department of Applied Mathematics and Physics, Graduate School of Engineering, Kyoto University, Kyoto 606, Japan, email: lidh@kuamp.kyoto-u.ac.jp 1 Introduction During the past two decades, much effort has been made to establish global convergence of quasi-Newton methods, especially for convex unconstrained minimization problems (e.g. [2], [3], [11], [14], [15], [16], [17]). We ...
Recursive trust-region methods for multiscale nonlinear optimization
 SIAM J. Optim
Abstract

Cited by 4 (1 self)
Abstract. A class of trust-region methods is presented for solving unconstrained nonlinear and possibly nonconvex discretized optimization problems, like those arising in systems governed by partial differential equations. The algorithms in this class make use of the discretization level as a means of speeding up the computation of the step. This use is recursive, leading to true multilevel/multiscale optimization methods reminiscent of multigrid methods in linear algebra and the solution of partial differential equations. A simple algorithm of the class is then described and its numerical performance is shown to be promising. This observation then motivates a proof of global convergence to first-order stationary points on the fine grid that is valid for all algorithms in the class.
Solution of Unassembled Linear Systems Using Block Stretching: Preliminary Experiments.
, 1997
Abstract

Cited by 1 (0 self)
We consider the so-called "matrix stretching" technique, which makes structured unassembled linear systems larger, but sparser. Our solution technique combines a direct factorization of the leading block diagonal submatrix of the stretched system with a preconditioned conjugate gradient solution of the Schur complement system which results from the factorization of the diagonal blocks. We show that matrix stretching is an effective technique, particularly for ill-conditioned systems. The Schur complement is often considerably better conditioned than the whole system. The main challenge is to find a suitable preconditioner for this matrix. We consider a range of preconditioners, including those proposed by Chan, and band approximations. We also study the use of some Element-by-Element preconditioners such as EBE and the recently introduced Subspace-by-Subspace preconditioner. We report on experiments using structured problems and examples from the Harwell-Boeing sparse matrix collection. We al...
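The two-stage solve described above can be illustrated on a toy block system with scalar "blocks" (all names and numbers are invented for this sketch): factor the leading block A directly, then solve the Schur complement system (D − C A⁻¹ B) x₂ = b₂ − C A⁻¹ b₁. The paper applies preconditioned conjugate gradients at this second stage rather than the direct division used here.

```python
# Block system [[A, B], [C, D]] [x1, x2]^T = [b1, b2]^T with scalar
# blocks for clarity; the same algebra applies to matrix blocks.
A, B = 4.0, 1.0
C, D = 2.0, 3.0
b1, b2 = 5.0, 5.0

S = D - C * (1.0 / A) * B        # Schur complement of A
rhs = b2 - C * (1.0 / A) * b1    # condensed right-hand side
x2 = rhs / S                     # (PCG here in the paper's setting)
x1 = (b1 - B * x2) / A           # back-substitution with the factor of A
```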
Global Convergence of a Class of Collinear Scaling Algorithms with Inexact Line Searches on Convex Functions
 Computing
, 1999
Abstract

Cited by 1 (1 self)
Global Convergence of a Class of Collinear Scaling Algorithms with Inexact Line Searches on Convex Functions. Ariyawansa [2] has presented a class of collinear scaling algorithms for unconstrained minimization. A certain family of algorithms contained in this class may be considered as an extension of quasi-Newton methods with the Broyden family [11] of approximants of the objective function Hessian. Byrd, Nocedal and Yuan [7] have shown that all members except the DFP [11] method of the Broyden convex family of quasi-Newton methods with Armijo [1] and Goldstein [12] line search termination criteria are globally and q-superlinearly convergent on uniformly convex functions. Extension of this result to the above class of collinear scaling algorithms of Ariyawansa [2] has been impossible because line search termination criteria for collinear scaling algorithms were not known until recently. Ariyawansa [4] has recently proposed such line search termination criteria. In this paper, we prove ...