Results 1 - 10 of 10
A Subspace, Interior, and Conjugate Gradient Method for Large-Scale Bound-Constrained Minimization Problems
SIAM Journal on Scientific Computing, 1999
Cited by 35 (1 self)
A subspace adaptation of the Coleman-Li trust region and interior method is proposed for solving large-scale bound-constrained minimization problems. This method can be implemented with either sparse Cholesky factorization or conjugate gradient computation. Under reasonable conditions the convergence properties of this subspace trust region method are as strong as those of its full-space version.
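The trust-region interior method above is beyond a short sketch, but the problem class it targets is easy to make concrete: minimize a smooth function subject to simple bounds. Below is a minimal projected-gradient sketch in Python, a far simpler method than the paper's, shown only to illustrate bound-constrained minimization; the quadratic test problem and all names are illustrative:

```python
import numpy as np

def projected_gradient(grad, x0, lo, hi, step, iters=500):
    """Minimize a smooth function over the box [lo, hi] by gradient
    descent followed by projection (clipping) back onto the box."""
    x = x0.astype(float)
    for _ in range(iters):
        x = np.clip(x - step * grad(x), lo, hi)
    return x

# Quadratic test problem: f(x) = 0.5 x^T A x - b^T x on the box [0, 1]^2.
# Its unconstrained minimizer (1.4, -0.2) violates the bounds, so the
# constrained solution sits on the boundary of the box, at (1, 0).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([4.0, 1.0])
grad = lambda x: A @ x - b
x = projected_gradient(grad, np.zeros(2), 0.0, 1.0, step=0.2)
```

The projection step is what distinguishes this from plain gradient descent: each iterate stays feasible, and components pressed against a bound stay there once the corresponding gradient sign allows it.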
Isotropic Effective Energy Simulated Annealing Searches for Low Energy Molecular Cluster States
1993
Cited by 19 (6 self)
The search for low energy states of molecular clusters is associated with the study of molecular conformation and especially protein folding. This paper describes a new global minimization algorithm which is effective and efficient for finding low energy states and hence stable structures of molecular clusters. The algorithm combines simulated annealing with a class of effective energy functions which are transformed from the original energy function based on the theory of renormalization groups. The algorithm converges to low energy states asymptotically, and is more efficient than a general simulated annealing method. Abbreviated title: Effective Energy Simulated Annealing for Molecular Conformation. Key words: global/local minimization, simulated annealing, renormalization group, parallel computation, protein folding. AMS (MOS) subject classification: 49M37, 68Q22, 92C40. † Department of Computer Science and Advanced Computing Research Institute, Cornell University, Ithaca, NY 148...
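The entry describes a specialized effective-energy variant; as context, a bare-bones generic simulated annealing loop (Metropolis acceptance with geometric cooling) looks like the following. The objective and all parameter values here are illustrative, not taken from the paper:

```python
import math
import random

def simulated_annealing(f, x0, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Generic simulated annealing: propose a random move, accept it with
    the Metropolis probability exp(-delta/T), and slowly cool T."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = x + rng.uniform(-0.5, 0.5)
        fy = f(y)
        if fy < fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Double-well test objective: two global minima at x = +1 and x = -1.
f = lambda x: (x * x - 1) ** 2
x_best, f_best = simulated_annealing(f, x0=2.5)
```

Accepting some uphill moves at high temperature is what lets the method escape local minima; as the temperature drops the loop degenerates into a local descent, which matches the asymptotic-convergence claim in the abstract.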
Convex Relaxations of 0-1 Quadratic Programming
1993
Cited by 17 (6 self)
We consider three parametric relaxations of the 0-1 quadratic programming problem. These relaxations are to: quadratic maximization over simple box constraints, quadratic maximization over the sphere, and the maximum eigenvalue of a bordered matrix. When minimized over the parameter, each of the relaxations provides an upper bound on the original discrete problem. Moreover, these bounds are efficiently computable. Our main result is that, surprisingly, all three bounds are equal. This author would like to thank the Department of Civil Engineering and Operations Research, Princeton University, for their support during his research leave. Key words: quadratic boolean programming, bounds, quadratic programming, trust region subproblems, min-max eigenvalue problems. AMS 1991 Subject Classification: Primary: 90C09, 90C25; Secondary: 90C27, 90C20. 1. Introduction. Consider the \(\pm 1\) quadratic programming problem (P): \(\bar{\mu} := \max\, q(x) := x^T Q x + c^T x,\ x \in F := \{-1, 1\}^n\), ...
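The sphere relaxation mentioned above is easy to check numerically in the homogeneous case (c = 0): every x in {-1, 1}^n satisfies ||x||^2 = n, so maximizing over the sphere of radius sqrt(n) yields the upper bound n * lambda_max(Q). A small brute-force comparison (illustrative code, not the paper's):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 8
Q = rng.standard_normal((n, n))
Q = (Q + Q.T) / 2  # random symmetric matrix

# Exact +-1 maximum by brute force over all 2^n sign vectors.
exact = max(np.array(s) @ Q @ np.array(s)
            for s in itertools.product([-1.0, 1.0], repeat=n))

# Sphere relaxation: the feasible set lies on the sphere ||x||^2 = n,
# where the quadratic form is maximized by the top eigenvector.
bound = n * np.linalg.eigvalsh(Q)[-1]
```

The brute-force step is exponential in n, which is exactly why efficiently computable eigenvalue bounds of this kind matter for the discrete problem.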
An Accelerated Interior Point Method Whose Running Time Depends Only on A
In Proceedings of the 26th Annual ACM Symposium on the Theory of Computing, 1993
Cited by 12 (2 self)
We propose a "layered-step" interior point (LIP) algorithm for linear programming. This algorithm follows the central path, either with short steps or with a new type of step called a "layered least squares" (LLS) step. The algorithm returns the exact global minimum after a finite number of steps; in particular, after \(O(n^{3.5} c(A))\) iterations, where c(A) is a function of the coefficient matrix. The LLS steps can be thought of as accelerating a path-following interior point method whenever near-degeneracies occur. One consequence of the new method is a new characterization of the central path: we show that it is composed of at most \(n^2\) alternating straight and curved ...
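As a toy illustration of the central path itself (ordinary log-barrier subproblems, not the paper's layered least-squares steps), consider the LP min x1 subject to x1 + x2 = 1, x >= 0. Eliminating x2, each barrier subproblem has a closed-form minimizer, and driving mu toward 0 traces the path to the optimum x1 = 0. The problem and function names are illustrative:

```python
import math

def central_path_point(mu):
    """Minimizer of x - mu*(log x + log(1 - x)) on (0, 1): the barrier
    subproblem for the LP  min x1  s.t.  x1 + x2 = 1, x >= 0."""
    # Stationarity 1 - mu/x + mu/(1-x) = 0  <=>  x^2 - (1+2mu)x + mu = 0;
    # the root inside (0, 1) is the smaller one.
    b = 1 + 2 * mu
    return (b - math.sqrt(b * b - 4 * mu)) / 2

# Central path points for mu = 1e-1, 1e-2, ..., 1e-6.
path = [central_path_point(10.0 ** -k) for k in range(1, 7)]
```

The points decrease monotonically toward the vertex solution x1 = 0, which is the behavior any path-following method, short-step or layered, exploits.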
Large Scale Unconstrained Optimization
The State of the Art in Numerical Analysis, 1996
Cited by 6 (0 self)
This paper reviews advances in Newton, quasi-Newton and conjugate gradient methods for large-scale optimization. It also describes several packages developed during the last ten years, and illustrates their performance on some practical problems. Much attention is given to the concept of partial separability, which is gaining importance with the arrival of automatic differentiation tools and of optimization software that fully exploits its properties.
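Partial separability means the objective is a sum of element functions, each depending on only a few variables, so the full Hessian can be assembled from small dense element Hessians. A minimal hand-rolled sketch with an illustrative chain-structured objective (not an example from the paper):

```python
import numpy as np

# Partially separable f(x) = sum_i f_i(x_i, x_{i+1}), f_i(a, b) = (b - a^2)^2.
# Each element touches 2 variables, so each element Hessian is 2x2 and the
# assembled Hessian is tridiagonal -- the sparsity partial separability exploits.

def element_hess(a, b):
    """2x2 Hessian of (b - a^2)^2 with respect to (a, b)."""
    return np.array([[12 * a * a - 4 * b, -4 * a],
                     [-4 * a, 2.0]])

def assemble_hessian(x):
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n - 1):
        # Scatter-add the element Hessian into the overlapping 2x2 block.
        H[i:i + 2, i:i + 2] += element_hess(x[i], x[i + 1])
    return H

x = np.array([1.0, 2.0, 0.5])
H = assemble_hessian(x)
```

Automatic differentiation of the small element functions is cheap, which is why the abstract ties the growing importance of partial separability to the arrival of such tools.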
Computing sparse Hessian and Jacobian approximations with optimal hereditary properties
Large-Scale Optimization with Applications, Part II: Optimal Design and Control, 1996
Cited by 4 (1 self)
In nonlinear optimization it is often important to estimate large sparse Hessian or Jacobian matrices, to be used for example in a trust region method. We propose an algorithm for computing a matrix B with a given sparsity pattern from a bundle of the m most recent difference vectors \(\Delta = [\delta_{k-m+1} \dots \delta_k]\), \(\Gamma = [\gamma_{k-m+1} \dots \gamma_k]\), where B should approximately map \(\Delta\) into \(\Gamma\). In this paper B is chosen such that it satisfies the m quasi-Newton conditions \(B\Delta = \Gamma\) in the least squares sense. We show that B can always be computed by solving a positive semidefinite system of equations in the nonzero components of B. We give necessary and sufficient conditions under which this system is positive definite and indicate how B can be computed efficiently using a conjugate gradient method. In the case of unconstrained optimization we use the technique to determine a Hessian approximation which is used in a trust region method. Some numerical results are presented for a range of unconstrained test problems.
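For a nonsymmetric (Jacobian-style) sparsity pattern, the least-squares conditions B Delta = Gamma decouple row by row, which makes the idea easy to sketch. The following is a simplified illustration under that assumption, not the paper's algorithm (which also handles the coupled case and uses conjugate gradients); all names are illustrative:

```python
import numpy as np

def fit_sparse_B(pattern, D, G):
    """Fit B with the given 0/1 sparsity pattern so that B @ D ~= G in the
    least squares sense. D holds the m step vectors delta as columns, G the
    corresponding difference vectors gamma. Each row of B decouples into a
    small dense least squares problem over its nonzero entries."""
    n = pattern.shape[0]
    B = np.zeros(pattern.shape, dtype=float)
    for i in range(n):
        idx = np.flatnonzero(pattern[i])
        # Row i uses only its nonzero columns: min || D[idx].T @ b - G[i] ||.
        b, *_ = np.linalg.lstsq(D[idx, :].T, G[i, :], rcond=None)
        B[i, idx] = b
    return B

# Recover a known tridiagonal Jacobian J from m = 3 random steps.
rng = np.random.default_rng(1)
pattern = np.array([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]])
J = pattern * rng.standard_normal((4, 4))
D = rng.standard_normal((4, 3))   # steps delta, one per column
G = J @ D                         # exact differences gamma = J @ delta
B = fit_sparse_B(pattern, D, G)
```

With m at least the number of nonzeros per row and generic steps, the fit recovers J exactly; with fewer or noisy pairs it returns the least-squares compromise, which is the regime the paper's secant conditions address.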
Cyberspace Geography Visualization: Mapping the World-Wide Web to help people find their way in cyberspace, heiwww.unige.ch/girardin/cgv
heiwww.unige.ch/girardin/cgv/www5/index.html
1995
Cited by 2 (0 self)
As cyberspace becomes an integral part of our daily life, mastering it becomes harder. To help, cyberspace can be represented by resources arranged in a multidimensional space. With geographical maps to exhibit the topology of this virtual space, people can gain a better visual understanding of it. In this paper, methods focusing on the construction of lower-dimension representations of this space are examined and illustrated with the World-Wide Web. It is expected that this work will contribute to addressing issues of navigation in cyberspace and, especially, avoiding the lost-in-cyberspace syndrome.
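One standard way to build such lower-dimension representations is classical multidimensional scaling (MDS), which embeds points from a pairwise-distance matrix alone. A generic sketch of that technique (not necessarily the specific method used in the paper):

```python
import numpy as np

def classical_mds(dist, k=2):
    """Classical multidimensional scaling: embed n points in R^k so that
    pairwise Euclidean distances approximate the given distance matrix."""
    n = dist.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (dist ** 2) @ J           # double-centered Gram matrix
    w, v = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]            # top-k eigenpairs
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Points that already live on a line are recovered exactly
# (up to translation and sign), so their distances are preserved.
pts = np.array([[0.0], [1.0], [3.0]])
dist = np.abs(pts - pts.T)
emb = classical_mds(dist, k=1)
```

For web resources the input distances would come from some dissimilarity measure between pages; MDS then supplies the 2-D "geography" onto which a map can be drawn.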
Survey on Nonlinear Optimization
1996
Cited by 2 (0 self)
In this survey paper, an overview of the different approaches to solving nonlinear optimization problems is given. The presentation includes theoretical, numerical, interval, and symbolic methods.
Advanced Computing Research Institute, Theory Center, Cornell University: Semiannual Research Activity Report, April 1992 - September 1992
This report consists of two parts. The first part contains a short summary of the progress made in the last six months on each of the four main projects: parallelizing compilers, computational linear algebra, computational optimization, and numerical methods for partial differential equations. Included also are a list of ACRI researchers and their research interests, a list of technical reports produced in the last six months, and a list of ACRI seminars. In the second part we highlight one of the projects, the parallelizing compiler work, where we give a more detailed introduction into this area and sketch our novel approach.
Computing sparse Hessian and Jacobian approximations with optimal hereditary properties
1996
In nonlinear optimization it is often important to estimate large sparse Hessian or Jacobian matrices, to be used for example in a trust region method. We propose an algorithm for computing a matrix B with a given sparsity pattern from a bundle of the m most recent difference vectors \(\Delta = [\delta_{k-m+1} \dots \delta_k]\), \(\Gamma = [\gamma_{k-m+1} \dots \gamma_k]\), where B should approximately map \(\Delta\) into \(\Gamma\). In this paper B is chosen such that it satisfies the m quasi-Newton conditions \(B\Delta = \Gamma\) in the least squares sense. We show that B can always be computed by solving a positive semidefinite system of equations in the nonzero components of B. We give necessary and sufficient conditions under which this system is positive definite and indicate how B can be computed efficiently using a conjugate gradient method. In the case of unconstrained optimization we use the technique to determine a Hessian approximation which is used in a trust region method. Some numerical results are presented for a range of unconstrained test problems. Keywords: sparse nonlinear equations, sparse Hessian, limited memory, Procrustes problems.