Results 1-10 of 13
A Subspace, Interior, and Conjugate Gradient Method for Large-Scale Bound-Constrained Minimization Problems
 SIAM JOURNAL ON SCIENTIFIC COMPUTING
, 1999
Abstract

Cited by 36 (1 self)
A subspace adaptation of the Coleman-Li trust region and interior method is proposed for solving large-scale bound-constrained minimization problems. This method can be implemented with either sparse Cholesky factorization or conjugate gradient computation. Under reasonable conditions the convergence properties of this subspace trust region method are as strong as those of its full-space version. Computational ...
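The bound-constrained problem class this abstract targets can be illustrated with a far simpler method. The sketch below uses plain projected gradient descent (not the Coleman-Li subspace interior method itself) on a tiny quadratic with box constraints; all names and parameters are illustrative.

```python
# Projected-gradient sketch of the bound-constrained problem class
#   min f(x)  subject to  l <= x <= u.
# NOTE: this is NOT the Coleman-Li subspace interior method of the paper,
# only a minimal illustration of the same kind of problem.

def project(x, lo, hi):
    """Clip each coordinate back into its bound interval [lo_i, hi_i]."""
    return [min(max(xi, l), h) for xi, l, h in zip(x, lo, hi)]

def projected_gradient(grad, x0, lo, hi, step=0.1, iters=500):
    x = project(x0, lo, hi)
    for _ in range(iters):
        g = grad(x)
        # Take a gradient step, then project back onto the box.
        x = project([xi - step * gi for xi, gi in zip(x, g)], lo, hi)
    return x

# Example: minimize (x0 - 2)^2 + (x1 + 1)^2 on the box [0,1] x [0,1].
grad = lambda x: [2.0 * (x[0] - 2.0), 2.0 * (x[1] + 1.0)]
x_star = projected_gradient(grad, [0.5, 0.5], [0.0, 0.0], [1.0, 1.0])
# The unconstrained minimizer (2, -1) lies outside the box, so the
# constrained solution is the box corner (1, 0).
```

Projection onto a box is just coordinate-wise clipping, which is why box constraints are the easiest constrained setting to experiment with.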
Isotropic Effective Energy Simulated Annealing Searches for Low Energy Molecular Cluster States
, 1993
Abstract

Cited by 19 (6 self)
The search for low energy states of molecular clusters is associated with the study of molecular conformation and especially protein folding. This paper describes a new global minimization algorithm which is effective and efficient for finding low energy states and hence stable structures of molecular clusters. The algorithm combines simulated annealing with a class of effective energy functions which are transformed from the original energy function based on the theory of renormalization groups. The algorithm converges to low energy states asymptotically, and is more efficient than a general simulated annealing method. Abbreviated title: Effective Energy Simulated Annealing for Molecular Conformation. Key words: global/local minimization, simulated annealing, renormalization group, parallel computation, protein folding. AMS (MOS) subject classification: 49M37, 68Q22, 92C40. † Department of Computer Science and Advanced Computing Research Institute, Cornell University, Ithaca, NY 148...
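The "general simulated annealing method" that serves as the paper's baseline can be sketched in a few lines. The code below is a plain Metropolis walk with geometric cooling on a double-well energy, not the renormalization-group effective-energy variant the paper proposes; every name and parameter is illustrative.

```python
import math
import random

# Baseline simulated annealing: Metropolis acceptance + geometric cooling.
# This is the generic method the paper improves on, NOT its
# renormalization-group "effective energy" transformation.

def anneal(energy, x0, t0=2.0, cooling=0.999, steps=5000, seed=0):
    rng = random.Random(seed)           # fixed seed for reproducibility
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.5)  # local Gaussian proposal
        ce = energy(cand)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability, so the walk can escape local wells
        # while the temperature t is still high.
        if ce < e or rng.random() < math.exp((e - ce) / t):
            x, e = cand, ce
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling
    return best_x, best_e

# Double-well "energy" with minima near x = -2 and x = +2 (the well at
# -2 is slightly deeper because of the 0.3*x tilt).
E = lambda x: (x * x - 4.0) ** 2 + 0.3 * x
best_x, best_e = anneal(E, x0=1.5)
# The walk settles near the bottom of one of the two wells.
```

The asymptotic-convergence claim in the abstract is about schedules much slower than this geometric one; in practice geometric cooling is the common compromise.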
Convex Relaxations of 0-1 Quadratic Programming
, 1993
Abstract

Cited by 17 (6 self)
We consider three parametric relaxations of the 0-1 quadratic programming problem. These relaxations are to: quadratic maximization over simple box constraints, quadratic maximization over the sphere, and the maximum eigenvalue of a bordered matrix. When minimized over the parameter, each of the relaxations provides an upper bound on the original discrete problem. Moreover, these bounds are efficiently computable. Our main result is that, surprisingly, all three bounds are equal. This author would like to thank the Department of Civil Engineering and Operations Research, Princeton University, for their support during his research leave. Key words: quadratic boolean programming, bounds, quadratic programming, trust region subproblems, min-max eigenvalue problems. AMS 1991 Subject Classification: Primary: 90C09, 90C25; Secondary: 90C27, 90C20. 1 INTRODUCTION Consider the ±1 quadratic programming problem (P): μ̄ := max q(x) := x^T Q x + c^T x, x ∈ F := {-1, 1}^n, ...
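The sphere relaxation mentioned above has a closed form in the homogeneous case: every ±1 vector satisfies ||x||² = n, so relaxing to the sphere gives the bound n·λ_max(Q) on max xᵀQx. The sketch below checks this against brute force on a small symmetric Q (chosen arbitrarily for illustration, with c = 0 for simplicity), estimating λ_max by shifted power iteration.

```python
from itertools import product

# Sphere-relaxation bound for the homogeneous +-1 quadratic program
#   max x^T Q x  over  x in {-1, +1}^n.
# Since ||x||^2 = n for every +-1 vector, the sphere relaxation gives
# the upper bound  n * lambda_max(Q).  Q is an illustrative example.

Q = [[1.0, 2.0, 0.0],
     [2.0, 0.0, 1.0],
     [0.0, 1.0, -1.0]]
n = len(Q)

def qform(v):
    return sum(Q[i][j] * v[i] * v[j] for i in range(n) for j in range(n))

# Estimate lambda_max(Q) by power iteration on Q + shift*I; the Gershgorin
# shift makes the whole spectrum positive, so the dominant eigenvalue of
# the shifted matrix corresponds to lambda_max(Q) rather than lambda_min.
shift = max(sum(abs(a) for a in row) for row in Q)
v = [1.0] * n
for _ in range(2000):
    w = [sum(Q[i][j] * v[j] for j in range(n)) + shift * v[i]
         for i in range(n)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]
lam_max = qform(v)  # Rayleigh quotient of the unit eigenvector estimate

bound = n * lam_max
discrete_max = max(qform(x) for x in product((-1.0, 1.0), repeat=n))
# discrete_max <= bound always holds; for this Q, 6.0 <= ~8.01.
```

The gap between the two numbers is exactly what minimizing over the relaxation parameter, as the paper does, tries to shrink.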
An Accelerated Interior Point Method Whose Running Time Depends Only on A
 IN PROCEEDINGS OF 26TH ANNUAL ACM SYMPOSIUM ON THE THEORY OF COMPUTING
, 1993
Abstract

Cited by 12 (2 self)
We propose a "layered-step" interior point (LIP) algorithm for linear programming. This algorithm follows the central path, either with short steps or with a new type of step called a "layered least squares" (LLS) step. The algorithm returns the exact global minimum after a finite number of steps; in particular, after O(n^3.5 c(A)) iterations, where c(A) is a function of the coefficient matrix. The LLS steps can be thought of as accelerating a path-following interior point method whenever near-degeneracies occur. One consequence of the new method is a new characterization of the central path: we show that it is composed of at most n^2 alternating straight and curved ...
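The central path that the abstract follows can be made concrete on a toy problem. The sketch below traces x(μ), the minimizer of the log-barrier function for a one-variable LP, and watches it approach the true optimum as μ shrinks; nothing here reproduces the paper's layered-least-squares steps, and all names are illustrative.

```python
# Central-path sketch for the one-variable LP:  minimize x over 0 <= x <= 1.
# The barrier subproblem  min  x - mu*(log x + log(1 - x))  has a unique
# interior minimizer x(mu); the curve mu -> x(mu) is the central path,
# and x(mu) -> 0 (the LP optimum) as mu -> 0.

def central_path_point(mu, iters=100):
    x = 0.5  # strictly interior starting point
    for _ in range(iters):
        d1 = 1.0 - mu / x + mu / (1.0 - x)       # barrier gradient
        d2 = mu / x ** 2 + mu / (1.0 - x) ** 2   # barrier Hessian (> 0)
        x = x - d1 / d2                          # Newton step
        x = min(max(x, 1e-12), 1.0 - 1e-12)      # keep strictly interior
    return x

path = [central_path_point(mu) for mu in (1.0, 0.1, 0.01, 0.001)]
# Here x(mu) = mu - mu^2 exactly, so the points shrink toward 0.
```

Path-following methods replace the limit μ → 0 with a sequence of such subproblems, each warm-started from the last; the LLS steps accelerate exactly this process near degenerate stretches of the path.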
Large Scale Unconstrained Optimization
 The State of the Art in Numerical Analysis
, 1996
Abstract

Cited by 6 (0 self)
This paper reviews advances in Newton, quasi-Newton and conjugate gradient methods for large-scale optimization. It also describes several packages developed during the last ten years, and illustrates their performance on some practical problems. Much attention is given to the concept of partial separability, which is gaining importance with the arrival of automatic differentiation tools and of optimization software that fully exploits its properties.
Computing sparse Hessian and Jacobian approximations with optimal hereditary properties
 LargeScale Optimization with Applications, Part II: Optimal Design and Control
, 1996
Abstract

Cited by 4 (1 self)
In nonlinear optimization it is often important to estimate large sparse Hessian or Jacobian matrices, to be used for example in a trust region method. We propose an algorithm for computing a matrix B with a given sparsity pattern from a bundle of the m most recent difference vectors Δ = [δ_{k-m+1} ... δ_k], Γ = [γ_{k-m+1} ... γ_k], where B should approximately map Δ into Γ. In this paper B is chosen such that it satisfies the m quasi-Newton conditions BΔ = Γ in the least squares sense. We show that B can always be computed by solving a positive semidefinite system of equations in the nonzero components of B. We give necessary and sufficient conditions under which this system is positive definite and indicate how B can be computed efficiently using a conjugate gradient method. In the case of unconstrained optimization we use the technique to determine a Hessian approximation which is used in a trust region method. Some n...
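When no symmetry is imposed on B, the least-squares fit to the secant pairs separates row by row: row i of B only involves its own nonzero positions, so each row is a tiny independent least-squares problem. The sketch below shows only this separable special case; the paper's algorithm additionally handles the coupled symmetric case and uses conjugate gradients, whereas the small dense solves here are purely illustrative.

```python
# Row-wise least-squares fit of a sparse matrix B to secant pairs,
# i.e. make  B @ delta_j ~ gamma_j  for the m most recent pairs.
# Without a symmetry requirement the problem separates by row.

def solve(A, b):
    """Gaussian elimination for a small dense system (illustration only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))  # partial pivot
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in reversed(range(n)):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def fit_sparse_rows(pattern, deltas, gammas):
    """pattern[i] = column indices allowed to be nonzero in row i.
    deltas, gammas: secant pairs, each a length-n vector.
    Returns dense B with the given pattern, fitting B@d ~ g in least squares."""
    n = len(pattern)
    B = [[0.0] * n for _ in range(n)]
    for i, J in enumerate(pattern):
        # Normal equations  A^T A b_i = A^T y  with A[k][j] = deltas[k][J[j]]
        # and y[k] = gammas[k][i].
        A = [[d[j] for j in J] for d in deltas]
        y = [g[i] for g in gammas]
        AtA = [[sum(A[k][p] * A[k][q] for k in range(len(A)))
                for q in range(len(J))] for p in range(len(J))]
        Aty = [sum(A[k][p] * y[k] for k in range(len(A))) for p in range(len(J))]
        for j, v in zip(J, solve(AtA, Aty)):
            B[i][j] = v
    return B

# Tridiagonal pattern, exact data from a known B_true: the fit recovers it.
pattern = [[0, 1], [0, 1, 2], [1, 2]]
B_true = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
deltas = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 1.0]]
gammas = [[sum(B_true[i][j] * d[j] for j in range(3)) for i in range(3)]
          for d in deltas]
B = fit_sparse_rows(pattern, deltas, gammas)
```

With consistent data and enough independent secant pairs the fit is exact; with noisy or rank-deficient data the normal equations become the positive semidefinite system the abstract refers to.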
Cyberspace geography visualization - Mapping the World-Wide Web to help people find their way in cyberspace
 OF THE WORLD-WIDE WEB, HEIWWW.UNIGE.CH/GIRARDIN/CGV/WWW5/INDEX.HTML
, 1995
Abstract

Cited by 2 (0 self)
As cyberspace becomes an integral part of our daily life, mastering it becomes harder. To help, cyberspace can be represented by resources arranged in a multidimensional space. With geographical maps that exhibit the topology of this virtual space, people can gain a better visual understanding. In this paper, methods focusing on the construction of lower-dimension representations of this space are examined and illustrated with the World-Wide Web. It is expected that this work will contribute to addressing issues of navigation in cyberspace and, especially, avoiding the lost-in-cyberspace syndrome.
Survey on Nonlinear Optimization
, 1996
Abstract

Cited by 2 (0 self)
In this survey paper, an overview of the different approaches to solving nonlinear optimization problems is given. The presentation includes theoretical, numerical, interval, and symbolic methods.
A Primal-Dual Accelerated Interior Point Method Whose Running Time Depends Only on A*
, 1994
Abstract
We propose a primal-dual "layered-step" interior point (LIP) algorithm for linear programming with data given by real numbers. This algorithm follows the central path, either with short steps or with a new type of step called a "layered least squares" (LLS) step. The algorithm returns an exact optimum after a finite number of steps; in particular, after O(n^3.5 c(A)) iterations, where c(A) is a function of the coefficient matrix. * This paper represents a simplification of an earlier manuscript "An accelerated interior point method whose running time depends only on A" by the same authors. † Department of Computer Science, Upson Hall, Cornell University, Ithaca, NY 14853. Email: vavasis@cs.cornell.edu. This work is supported in part by the National Science Foundation, the Air Force Office of Scientific Research, and the Office of Naval Research, through NSF grant DMS-8920550. Also supported in part by an NSF Presidential Young Investigator award with matching funds received from AT&T and Xerox Corp. Part of ...
A Primal-Dual Interior Point Method Whose Running Time Depends Only on the Constraint Matrix*
, 1995
Abstract
We propose a primal-dual "layered-step" interior point (LIP) algorithm for linear programming with data given by real numbers. This algorithm follows the central path, either with short steps or with a new type of step called a "layered least squares" (LLS) step. The algorithm returns an exact optimum after a finite number of steps; in particular, after O(n^3.5 c(A)) iterations, where c(A) is a function of the coefficient matrix. * This paper represents a simplification of an earlier manuscript "An accelerated interior point method whose running time depends only on A" by the same authors.