Results 1-5 of 5
Complete search in continuous global optimization and constraint satisfaction, Acta Numerica 13, 2004
A Comparison of Complete Global Optimization Solvers
Abstract

Cited by 23 (4 self)
Results are reported of testing a number of existing state-of-the-art solvers for global constrained optimization and constraint satisfaction on a set of over 1000 test problems in up to 1000 variables.
Sequential experiment designs for screening and tuning parameters of stochastic heuristics
In Proceedings of the Ninth International Conference on Parallel Problem Solving from Nature (PPSN), 2006
An Improved Unconstrained Global Optimization Algorithm, 1996
Abstract

Cited by 6 (0 self)
Global optimization is a very hard problem especially when the number of variables is large (greater than several hundred). Recently, some methods including simulated annealing, branch and bound, and an interval Newton's method have made it possible to solve global optimization problems with several hundred variables. However, this is a small number of variables when one considers that integer programming can tackle problems with thousands of variables, and linear programming is able to solve problems with millions of variables. The goal of this research is to examine the present state of the art for algorithms to solve the unconstrained global optimization problem (GOP) and then to suggest some new approaches that allow problems of a larger size to be solved with an equivalent amount of computer time. This algorithm is then implemented using portable C++ and the software will be released for general use. This new algorithm is given with some theoretical results under which the algorit...
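The abstract above names simulated annealing among the methods that made larger unconstrained global optimization problems tractable. As a point of reference, here is a minimal sketch of simulated annealing on a toy objective; all function names, parameters, and the cooling schedule are illustrative choices, not taken from the paper:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000, seed=0):
    """Minimize f over R^n by simulated annealing (illustrative sketch)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        # Propose a Gaussian perturbation of each coordinate.
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = f(cand)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(increase) / temperature).
        if fc <= fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Example: minimize a simple quadratic with its minimum at the origin.
sphere = lambda v: sum(vi * vi for vi in v)
xbest, fbest = simulated_annealing(sphere, [3.0, -2.0])
```

Because acceptance of uphill moves decays with the temperature, the search behaves like a random walk early on and like a local descent near the end, which is what lets it escape shallow local minima on multimodal objectives.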
Relaxing Convergence Conditions To Improve The Convergence Rate, 1999
Abstract

Cited by 3 (0 self)
Standard global convergence proofs are examined to determine why some algorithms perform better than other algorithms. We show that relaxing the conditions required to prove global convergence can improve an algorithm's performance. Further analysis indicates that minimizing an estimate of the distance to the minimum relaxes the convergence conditions in such a way as to improve an algorithm's convergence rate. A new linesearch algorithm based on these ideas is presented that does not force a reduction in the objective function at each iteration, yet it allows the objective function to increase during an iteration only if this will result in faster convergence. Unlike the nonmonotone algorithms in the literature, these new functions dynamically adjust to account for changes between the influence of curvature and descent. The result is an optimal algorithm in the sense that an estimate of the distance to the minimum is minimized at each iteration. The algorithm is shown to be well defi...
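The idea of a linesearch that does not force a reduction of the objective at every iteration can be illustrated with the classic max-based nonmonotone Armijo rule (often attributed to Grippo, Lampariello, and Lucidi), where a step is accepted if it improves on the worst of the last few objective values. This is a standard rule in the same spirit as the abstract, not the paper's specific algorithm; all names and parameters below are illustrative:

```python
from collections import deque

def nonmonotone_descent(f, grad, x0, window=5, c=1e-4, tau=0.5, iters=100):
    """Steepest descent with a max-based nonmonotone Armijo linesearch:
    accept step a when f(x + a*d) <= max(last `window` f values) + c*a*(g . d),
    so the objective may increase on some iterations. Illustrative sketch."""
    x = list(x0)
    hist = deque([f(x)], maxlen=window)   # sliding window of recent f values
    for _ in range(iters):
        g = grad(x)
        d = [-gi for gi in g]             # steepest-descent direction
        gd = sum(gi * di for gi, di in zip(g, d))
        fref = max(hist)                  # nonmonotone reference value
        a = 1.0
        for _ in range(40):               # backtracking on the step length
            xn = [xi + a * di for xi, di in zip(x, d)]
            if f(xn) <= fref + c * a * gd:
                break
            a *= tau
        x = xn
        hist.append(f(x))
    return x, f(x)

# Example: minimize an ill-conditioned quadratic f(x, y) = x^2 + 10 y^2.
f = lambda v: v[0] ** 2 + 10.0 * v[1] ** 2
grad = lambda v: [2.0 * v[0], 20.0 * v[1]]
xmin, fmin = nonmonotone_descent(f, grad, [4.0, 1.0])
```

Comparing against the maximum over a window rather than the previous value relaxes the acceptance test, which typically permits longer steps on ill-conditioned problems while still supporting a global convergence argument.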