Results 1–10 of 80
On the limited memory BFGS method for large scale optimization
 Mathematical Programming
, 1989
Representations of Quasi-Newton Matrices and Their Use in Limited Memory Methods
, 1996
Abstract

Cited by 159 (10 self)
We derive compact representations of BFGS and symmetric rank-one matrices for optimization. These representations allow us to efficiently implement limited memory methods for large constrained optimization problems. In particular, we discuss how to compute projections of limited memory matrices onto subspaces. We also present a compact representation of the matrices generated by Broyden's update for solving systems of nonlinear equations.
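The compact representation described above takes the form B_k = B_0 − W M^{-1} W^T with W = [B_0 S_k, Y_k], where S_k and Y_k collect the update pairs. As a hypothetical NumPy sketch (not the authors' code; the random test problem is illustrative), it can be checked against k ordinary dense BFGS updates:

```python
import numpy as np

def bfgs_dense(B0, S, Y):
    """Apply k standard BFGS updates with the pairs (s_i, y_i)."""
    B = B0.copy()
    for i in range(S.shape[1]):
        s, y = S[:, i], Y[:, i]
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return B

def bfgs_compact(B0, S, Y):
    """Compact form B_k = B0 - W M^{-1} W^T with W = [B0 S, Y]."""
    StY = S.T @ Y
    L = np.tril(StY, -1)            # (L)_ij = s_i^T y_j for i > j, else 0
    D = np.diag(np.diag(StY))       # D = diag(s_i^T y_i)
    BS = B0 @ S
    W = np.hstack([BS, Y])
    M = np.block([[S.T @ BS, L], [L.T, -D]])
    return B0 - W @ np.linalg.solve(M, W.T)

rng = np.random.default_rng(0)
n, k = 8, 3
R = rng.standard_normal((n, n))
A = R @ R.T + n * np.eye(n)         # SPD test matrix
S = rng.standard_normal((n, k))
Y = A @ S                           # ensures the curvature condition s_i^T y_i > 0
B0 = np.eye(n)
assert np.allclose(bfgs_dense(B0, S, Y), bfgs_compact(B0, S, Y))
```

The point of the compact form is that only the small (2k × 2k) matrix M and the tall-skinny factor W are stored, so products and projections with B_k cost O(nk) rather than O(n^2).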
Theory of Algorithms for Unconstrained Optimization
, 1992
Abstract

Cited by 111 (1 self)
In this article I will attempt to review the most recent advances in the theory of unconstrained optimization, and will also describe some important open questions. Before doing so, I should point out that the value of the theory of optimization is not limited to its capacity for explaining the behavior of the most widely used techniques. The question ...
L-BFGS-B: Fortran Subroutines for Large-Scale Bound Constrained Optimization
, 1994
Abstract

Cited by 56 (2 self)
L-BFGS-B is a limited memory algorithm for solving large nonlinear optimization problems subject to simple bounds on the variables. It is intended for problems in which information on the Hessian matrix is difficult to obtain, or for large dense problems. L-BFGS-B can also be used for unconstrained problems, and in this case performs similarly to its predecessor, algorithm L-BFGS (Harwell routine VA15). The algorithm is implemented in Fortran 77.
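These Fortran routines underlie, for instance, SciPy's 'L-BFGS-B' option in scipy.optimize.minimize. A minimal usage sketch (the Rosenbrock test function and the bound choices below are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Minimize the Rosenbrock function subject to simple bounds 0 <= x_i <= 2.
# Its unconstrained minimizer (1, ..., 1) is feasible, so the bounded
# solve should recover it.
x0 = np.zeros(5)
res = minimize(rosen, x0, jac=rosen_der, method='L-BFGS-B',
               bounds=[(0.0, 2.0)] * 5)
assert res.success
```

Only the gradient is supplied; consistent with the abstract, no Hessian information is required.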
Automatic preconditioning by limited memory quasi-Newton updating
 SIAM Journal on Optimization
, 1999
Abstract

Cited by 44 (2 self)
The paper proposes a preconditioner for the conjugate gradient method (CG) that is designed for solving systems of equations Ax = b_i with different right-hand-side vectors, or for solving a sequence of slowly varying systems A_k x = b_k. The preconditioner has the form of a limited memory quasi-Newton matrix and is generated using information from the CG iteration. The automatic preconditioner does not require explicit knowledge of the coefficient matrix A and is therefore suitable for problems where only products of A times a vector can be computed. Numerical experiments indicate that the preconditioner has most to offer when these matrix-vector products are expensive to compute, and when low accuracy in the solution is required. The effectiveness of the preconditioner is tested within a Hessian-free Newton method for optimization, and by solving certain linear systems arising in finite element models.
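A minimal pure-NumPy sketch of this idea (not the paper's implementation; the memory size of five pairs, the random SPD test system, and all helper names are illustrative): harvest (s, y) = (αp, αAp) pairs from a first CG solve, then apply the L-BFGS two-loop recursion as a preconditioner when solving a second system with the same matrix.

```python
import numpy as np

def cg(A, b, minv=None, tol=1e-10, collect=None):
    """(Preconditioned) conjugate gradients for SPD A; optionally
    collect the quasi-Newton pairs (s, y) = (alpha*p, alpha*A@p)."""
    x = np.zeros_like(b)
    r = b.copy()
    z = minv(r) if minv else r
    p = z.copy()
    rz = r @ z
    for _ in range(2 * len(b)):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        if collect is not None:
            collect.append((alpha * p, alpha * Ap))  # s^T y = alpha^2 p^T A p > 0
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = minv(r) if minv else r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

def two_loop(pairs, v):
    """L-BFGS two-loop recursion: apply the implicit inverse to v."""
    q = v.copy()
    rhos = [1.0 / (y @ s) for s, y in pairs]
    alphas = []
    for (s, y), rho in zip(reversed(pairs), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    s, y = pairs[-1]
    r = ((s @ y) / (y @ y)) * q                     # initial scaling gamma * I
    for (s, y), rho, a in zip(pairs, rhos, reversed(alphas)):
        b = rho * (y @ r)
        r = r + (a - b) * s
    return r

rng = np.random.default_rng(1)
n = 30
R = rng.standard_normal((n, n))
A = R @ R.T + n * np.eye(n)                         # SPD coefficient matrix
pairs = []
x1 = cg(A, rng.standard_normal(n), collect=pairs)   # first right-hand side
pairs = pairs[-5:]                                  # keep a limited memory of 5 pairs
b2 = rng.standard_normal(n)
x2 = cg(A, b2, minv=lambda v: two_loop(pairs, v))   # preconditioned second solve
assert np.allclose(A @ x2, b2, atol=1e-6)
```

Note that A is touched only through matrix-vector products, matching the abstract's setting where the coefficient matrix is never formed explicitly.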
Testing a marine ecosystem model: sensitivity analysis and parameter optimization
 Journal of Marine Systems
, 2001
Sensitivity analysis in variational data assimilation
 J. Meteorol. Soc. Japan
, 1997
Abstract

Cited by 24 (3 self)
Optimal control theory is applied to a variational data assimilation problem in the context of the assimilation of altimeter data in a quasi-geostrophic ocean model. Related to the issue of the minimization of the cost function, a sensitivity analysis is applied to the optimality system to derive the sensitivity of the retrieved control variable (here the initial condition) with respect to the observations. The derivation of the sensitivity of a response function in the case of data assimilation is reviewed, and a new method of performing the derivation of this sensitivity is proposed.
An Ensemble-based Four-dimensional Variational Data Assimilation Scheme
 Part II: Observing System Simulation Experiments with the Advanced Research WRF
, 2008
Abstract

Cited by 22 (0 self)
The incremental approach of four-dimensional variational (4D-Var) data assimilation (Courtier et al. 1994) and Ensemble Kalman filters (EnKF; Evensen 1994) are well known as two advanced data assimilation methods.
DassFlow v1.0: a variational data assimilation software for 2D river flows
, 2007
Abstract

Cited by 14 (7 self)
INRIA research report RR-6150 (ISSN 0249-6399, ISRN INRIA/RR-6150-FR+ENG). DassFlow v1.0: a variational data assimilation software for 2D river flows.
Enriched Methods for Large-Scale Unconstrained Optimization
, 2000
Abstract

Cited by 14 (0 self)
This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited memory matrix, and plays the dual role of preconditioning the inner conjugate gradient iteration in the HFN method and of providing a warm start for L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithms are very effective and not sensitive to the choice of parameters.
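A hypothetical SciPy sketch of the interlacing idea (the test function, the fixed cycle lengths, and the function name are all illustrative): alternate short L-BFGS and Newton-CG (Hessian-free Newton) cycles, each warm-started from the other's last iterate. Unlike the paper's method, this sketch passes only the iterate between phases, not the shared limited-memory curvature matrix, and it uses fixed rather than dynamically adjusted cycle lengths.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

def interlaced_minimize(f, grad, x0, cycles=25, lbfgs_steps=5, hfn_steps=3):
    """Alternate short L-BFGS and Hessian-free Newton-CG cycles,
    each warm-started from the other's last iterate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(cycles):
        # L-BFGS cycle (SciPy's L-BFGS-B with no bounds is plain L-BFGS).
        x = minimize(f, x, jac=grad, method='L-BFGS-B',
                     options={'maxiter': lbfgs_steps}).x
        # HFN cycle: Newton-CG forms Hessian-vector products by
        # finite differences of the gradient, so no Hessian is needed.
        x = minimize(f, x, jac=grad, method='Newton-CG',
                     options={'maxiter': hfn_steps}).x
        if np.linalg.norm(grad(x)) < 1e-8:
            break
    return x

x = interlaced_minimize(rosen, rosen_der, np.zeros(4))
```

Even this stripped-down alternation converges on the Rosenbrock test problem; the paper's contribution is the tighter coupling through the shared limited-memory matrix.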