Results 1–10 of 11
On the limited memory BFGS method for large scale optimization
 MATHEMATICAL PROGRAMMING
, 1989
CUTE: Constrained and unconstrained testing environment
, 1993
Abstract

Cited by 161 (3 self)
The purpose of this paper is to discuss the scope and functionality of a versatile environment for testing small- and large-scale nonlinear optimization algorithms. Although many of these facilities were originally produced by the authors in conjunction with the software package LANCELOT, we believe that they will be useful in their own right and should be available to researchers for their development of optimization software. The tools are available by anonymous ftp from a number of sources and may, in many cases, be installed automatically. The scope of a major collection of test problems written in the standard input format (SIF) used by the LANCELOT software package is described. Recognising that most software was not written with the SIF in mind, we provide tools to assist in building an interface between this input format and other optimization packages. These tools already provide a link between the SIF and a number of existing packages, including MINOS and OSL. In ad...
Limited-Memory Matrix Methods with Applications
, 1997
Abstract

Cited by 30 (6 self)
Abstract. The focus of this dissertation is on matrix decompositions that use a limited amount of computer memory, thereby allowing problems with a very large number of variables to be solved. Specifically, we will focus on two application areas: optimization and information retrieval. We introduce a general algebraic form for the matrix update in limited-memory quasi-Newton methods. Many well-known methods, such as limited-memory Broyden Family methods, satisfy the general form. We are able to prove several results about methods which satisfy the general form. In particular, we show that the only limited-memory Broyden Family method (using exact line searches) that is guaranteed to terminate within n iterations on an n-dimensional strictly convex quadratic is the limited-memory BFGS method. Furthermore, we are able to introduce several new variations on the limited-memory BFGS method that retain the quadratic termination property. We also have a new result that shows that full-memory Broyden Family methods (using exact line searches) that skip p updates to the quasi-Newton matrix will terminate in no more than n + p steps on an n-dimensional strictly convex quadratic. We propose several new variations on the limited-memory BFGS method
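The limited-memory quasi-Newton update this abstract describes can be made concrete. Below is a minimal, illustrative Python sketch of the classic L-BFGS two-loop recursion; the function names and the plain-list vector representation are my own, and this is a sketch of the standard algorithm rather than code from the dissertation:

```python
# Illustrative sketch of the L-BFGS two-loop recursion (standard algorithm;
# names and list-based vectors are this sketch's own choices).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lbfgs_direction(grad, s_list, y_list):
    """Apply the implicit inverse-Hessian approximation H to grad.

    s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k are the stored pairs,
    ordered oldest to newest. Returns H * grad (negate it for a descent step).
    """
    q = list(grad)
    rhos = [1.0 / dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest, peeling off curvature information.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * dot(s, q)
        alphas.append(a)
        q = [qi - a * yi for qi, yi in zip(q, y)]
    # Initial approximation H0 = gamma * I, scaled by the newest pair.
    if s_list:
        gamma = dot(s_list[-1], y_list[-1]) / dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = [gamma * qi for qi in q]
    # Second loop: oldest pair to newest, restoring the update.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * dot(y, r)
        r = [ri + (a - b) * si for ri, si in zip(r, s)]
    return r
```

With an empty memory the recursion reduces to the identity, returning the gradient itself; each stored (s, y) pair refines the implicit Hessian approximation without ever forming an n-by-n matrix.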
BFGS with update skipping and varying memory
 SIAM J. Optim
, 1998
Abstract

Cited by 13 (2 self)
Abstract. We give conditions under which limited-memory quasi-Newton methods with exact line searches will terminate in n steps when minimizing n-dimensional quadratic functions. We show that although all Broyden family methods terminate in n steps in their full-memory versions, only BFGS does so with limited memory. Additionally, we show that full-memory Broyden family methods with exact line searches terminate in at most n + p steps when p matrix updates are skipped. We introduce new limited-memory BFGS variants and test them on non-quadratic minimization problems.
Large-Scale Nonlinear Constrained Optimization: A Current Survey
, 1994
Abstract

Cited by 9 (0 self)
Much progress has been made in constrained nonlinear optimization in the past ten years, but most large-scale problems still represent a considerable obstacle. In this survey paper we will attempt to give an overview of the current approaches, including interior and exterior methods and algorithms based upon trust regions and line searches. In addition, the importance of software, numerical linear algebra and testing will be addressed. We will try to explain why the difficulties arise, how attempts are being made to overcome them and some of the problems that still remain. Although there will be some emphasis on the LANCELOT and CUTE projects, the intention is to give a broad picture of the state-of-the-art. 1 IBM T.J. Watson Research Center, P.O. Box 218, Yorktown Heights, NY 10598, USA 2 Parallel Algorithms Team, CERFACS, 42 Ave. G. Coriolis, 31057 Toulouse Cedex, France 3 Central Computing Department, Rutherford Appleton Laboratory, Chilton, Oxfordshire, OX11 0QX, England ...
Benchmark Functions for CEC’2013 Special Session and Competition on Niching Methods for Multimodal Function Optimization
Abstract

Cited by 2 (1 self)
Evolutionary Algorithms (EAs) in their original forms are usually designed for locating a single global solution. These algorithms typically converge to a single solution because of the global selection scheme used. Nevertheless, many real-world problems are “multimodal” by nature, i.e., multiple satisfactory solutions exist. It may be desirable to locate many such satisfactory solutions so that a decision maker can choose one that is most proper in his/her problem domain. Numerous techniques have been developed in the past for locating multiple optima (global or local). These techniques are commonly referred to as “niching” methods. A niching method can be incorporated into a standard EA to promote and maintain formation of multiple stable subpopulations within a single population, with an aim to locate multiple globally optimal or suboptimal solutions. Many niching methods have ...
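As a concrete illustration of the niching idea described above, here is a minimal Python sketch of fitness sharing, one classic niching scheme; the triangular sharing kernel and the sigma_share parameter are illustrative choices of this sketch, not taken from the benchmark suite:

```python
# Illustrative sketch of fitness sharing, a classic niching scheme:
# an individual's fitness is divided by the "crowding" of its niche,
# so isolated optima keep high fitness and crowded ones are penalized.
# The triangular kernel and sigma_share are this sketch's own choices.

def shared_fitness(pop, fitness, sigma_share=1.0):
    """pop: list of 1-D individuals (floats); fitness: raw values to maximize."""
    out = []
    for xi, fi in zip(pop, fitness):
        # Niche count: sum of a triangular sharing kernel over the population.
        m = sum(max(0.0, 1.0 - abs(xi - xj) / sigma_share) for xj in pop)
        out.append(fi / m)
    return out
```

For example, two individuals sitting on the same optimum split their niche's credit, while a lone individual on a distant optimum keeps its full fitness, which is what lets selection maintain several stable subpopulations at once.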
Cooperative Coevolution with Differential Grouping for Large Scale Optimization
Abstract
Abstract—Cooperative coevolution has been introduced into evolutionary algorithms with the aim of solving increasingly complex optimization problems through a divide-and-conquer paradigm. In theory, the idea of co-adapted subcomponents is desirable for solving large-scale optimization problems. However, in practice, without prior knowledge about the problem, it is not clear how the problem should be decomposed. In this paper we propose an automatic decomposition strategy called differential grouping that can uncover the underlying interaction structure of the decision variables and form subcomponents such that the interdependence between them is kept to a minimum. We show mathematically how such a decomposition strategy can be derived from a definition of partial separability. The empirical studies show that such near-optimal decomposition can greatly improve the solution quality on large-scale global optimization problems. Finally, we show how such an automated decomposition allows for a better approximation of the contribution of various subcomponents, leading to a more efficient assignment of the computational budget to various subcomponents. Index Terms—cooperative coevolution, large-scale optimization, problem decomposition, non-separability, numerical optimization
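The interaction test underlying differential grouping can be illustrated with a toy Python sketch: two variables interact (are non-separable) if the effect on f of perturbing one variable changes when the other variable is moved. The step size delta and threshold eps below are illustrative, and the paper's actual procedure involves more machinery than this:

```python
# Toy sketch of the differential-grouping interaction test: variables i and j
# interact if the change in f from perturbing x_i depends on the value of x_j.
# delta and eps are illustrative parameters of this sketch.

def interacts(f, x, i, j, delta=1.0, eps=1e-9):
    """Return True if variables i and j of f appear non-separable at x."""
    def shifted(base, k, d):
        v = list(base)
        v[k] += d
        return v
    # Change in f caused by perturbing x_i, at the original x_j ...
    d1 = f(shifted(x, i, delta)) - f(x)
    # ... and again after shifting x_j.
    xj = shifted(x, j, delta)
    d2 = f(shifted(xj, i, delta)) - f(xj)
    return abs(d1 - d2) > eps

# A separable function: perturbing x0 has the same effect regardless of x1.
f_sep = lambda x: x[0] ** 2 + x[1] ** 2
# A non-separable function: the effect of x0 depends on x1.
f_nonsep = lambda x: x[0] * x[1]
```

Variables flagged as non-interacting in this way can be placed in separate subcomponents and optimized independently, which is the decomposition the abstract describes.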
A NUMERICAL STUDY OF THE LIMITED MEMORY BFGS METHOD AND THE TRUNCATED-NEWTON METHOD FOR LARGE SCALE OPTIMIZATION
Abstract
Abstract. This paper examines the numerical performance of two methods for large-scale optimization: a limited memory quasi-Newton method (L-BFGS), and a discrete truncated-Newton method (TN). Various ways of classifying test problems are discussed in order to better understand the types of problems that each algorithm solves well. The L-BFGS and TN methods are also compared with the Polak-Ribière conjugate gradient method. Key words: large scale nonlinear optimization, limited memory method, truncated-Newton method, conjugate-gradient method. AMS(MOS) subject classifications: 65, 49. 1. Introduction. Suppose ...