Results 1–10 of 22
A Comparison of Complete Global Optimization Solvers
Abstract

Cited by 25 (4 self)
Results are reported of testing a number of existing state-of-the-art solvers for global constrained optimization and constraint satisfaction on a set of over 1000 test problems in up to 1000 variables.
Benchmarking Global Optimization and Constraint Satisfaction Codes
 Global Optimization and Constraint Satisfaction, First International Workshop on Global Constraint Optimization and Constraint Satisfaction, COCOS 2002, LNCS 2861
, 2003
Abstract

Cited by 24 (3 self)
A benchmarking suite describing over 1000 optimization problems and constraint satisfaction problems covering problems from different traditions is described, annotated with best known solutions, and accompanied by recommended benchmarking protocols for comparing test results.
Using sampling and simplex derivatives in pattern search methods
 SIAM Journal on Optimization
, 2007
Abstract

Cited by 19 (7 self)
Abstract. Pattern search methods can be made more efficient if past function evaluations are appropriately reused. In this paper we will introduce a number of ways of reusing previous evaluations of the objective function based on the computation of simplex derivatives (e.g., simplex gradients) to improve the efficiency of a pattern search iteration. At each iteration of a pattern search method, one can attempt to compute an accurate simplex gradient by identifying a sampling set of previous iterates with good geometrical properties. This simplex gradient computation can be done using only past successful iterates or by considering all past function evaluations. The simplex gradient can then be used, for instance, to reorder the evaluations of the objective function associated with the positive spanning set or positive basis used in the poll step. But it can also be used to update the mesh size parameter according to a sufficient decrease criterion. None of these modifications demands new function evaluations. A search step can also be tried along the negative simplex gradient at the beginning of the current pattern search iteration. We will present these procedures in detail and show how promising they are to enhance the practical performance of pattern search methods.
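The reordering idea this abstract describes can be illustrated in a few lines. The following is a minimal 2-D sketch with invented names, not the authors' implementation: a simplex gradient is recovered from two previously evaluated points by solving a small linear system, and the poll directions are then sorted so that the most promising ones are evaluated first.

```python
def simplex_gradient_2d(x0, f0, pts, fvals):
    """Solve (y_i - x0)^T g = f(y_i) - f(x0) for the simplex gradient g.
    Assumes the sampling set has good geometry (nonzero determinant)."""
    a11, a12 = pts[0][0] - x0[0], pts[0][1] - x0[1]
    a21, a22 = pts[1][0] - x0[0], pts[1][1] - x0[1]
    b1, b2 = fvals[0] - f0, fvals[1] - f0
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

def order_poll_directions(dirs, g):
    """Poll the directions most opposed to the simplex gradient first."""
    return sorted(dirs, key=lambda d: d[0] * g[0] + d[1] * g[1])
```

For example, near x0 = (1, 1) on f(x) = x1^2 + 2*x2^2, the two sample points (1.1, 1) and (1, 1.1) give the estimate (2.1, 4.2), close to the true gradient (2, 4), and the descent direction (0, -1) moves to the front of the poll order.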
Implementing generating set search methods for linearly constrained minimization
 Department of Computer Science, College of William and Mary
, 2005
Abstract

Cited by 16 (5 self)
Abstract. We discuss an implementation of a derivative-free generating set search method for linearly constrained minimization with no assumption of nondegeneracy placed on the constraints. The convergence guarantees for generating set search methods require that the set of search directions possesses certain geometrical properties that allow it to approximate the feasible region near the current iterate. In the hard case, the calculation of the search directions corresponds to finding the extreme rays of a cone with a degenerate vertex at the origin, a difficult problem. We discuss here how state-of-the-art computational geometry methods make it tractable to solve this problem in connection with generating set search. We also discuss a number of other practical issues of implementation, such as the careful treatment of equality constraints and the desirability of augmenting the set of search directions beyond the theoretically minimal set. We illustrate the behavior of the implementation on several problems from the CUTEr test suite. We have found it to be successful on problems with several hundred variables and linear constraints.
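For intuition, the easy case of this theory fits in a few lines: with only bound constraints, the coordinate directions ±e_i generate every tangent cone, so the degenerate-cone computation the abstract addresses never arises. The sketch below is a hypothetical toy with invented names, not the implementation the paper describes, and it simply clips trial points into the box rather than polling only feasible directions.

```python
def gss_bound_constrained(f, x, lo, hi, step=0.5, tol=1e-6):
    """Generating set search sketch for bound constraints only:
    the +/- coordinate directions suffice as generators."""
    n = len(x)
    dirs = [tuple(1.0 if j == i else 0.0 for j in range(n)) for i in range(n)]
    dirs += [tuple(-c for c in d) for d in dirs[:n]]
    fx = f(x)
    while step > tol:
        for d in dirs:
            # clip the trial point into the box (a simplification of
            # polling only feasible directions)
            y = [min(max(xi + step * di, l), h)
                 for xi, di, l, h in zip(x, d, lo, hi)]
            fy = f(y)
            if fy < fx:          # successful poll: move, keep the step
                x, fx = y, fy
                break
        else:                    # unsuccessful poll: contract the step
            step /= 2.0
    return x, fx
```

On a box-constrained quadratic whose unconstrained minimizer lies outside the box, the iterates walk to the nearest vertex or face of the box and the step size then contracts to the tolerance.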
TRESNEI, a Matlab trust-region solver for systems of nonlinear equalities and inequalities
, 2010
Abstract

Cited by 3 (1 self)
Abstract. The Matlab implementation of a trust-region Gauss-Newton method for bound-constrained nonlinear least-squares problems is presented. The solver, called TRESNEI, is adequate for zero- and small-residual problems and handles the solution of nonlinear systems of equalities and inequalities. The structure and the usage of the solver are described and an extensive numerical comparison with functions from the Matlab Optimization Toolbox is carried out. Key words. Bound-constrained nonlinear least-squares; nonlinear systems; nonlinear systems of inequalities; simple bounds; trust-region methods; algorithm design.
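The trust-region Gauss-Newton idea behind such a solver can be sketched for a square 2x2 system. This is a hypothetical toy with invented names, not TRESNEI itself; bounds and inequalities are omitted, and for a square Jacobian the Gauss-Newton step reduces to a Newton step that is truncated to the trust region.

```python
def tr_gauss_newton(res, jac, x, delta=1.0, tol=1e-10, max_iter=50):
    """Toy trust-region Gauss-Newton iteration for a 2x2 nonlinear
    system res(x) = 0 with Jacobian jac(x)."""
    def phi(v):                  # least-squares merit function
        return 0.5 * sum(r * r for r in res(v))
    for _ in range(max_iter):
        r, J = res(x), jac(x)
        # square Jacobian: solve J p = -r by Cramer's rule
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        p = [(-r[0] * J[1][1] + r[1] * J[0][1]) / det,
             (-J[0][0] * r[1] + J[1][0] * r[0]) / det]
        norm = (p[0] ** 2 + p[1] ** 2) ** 0.5
        if norm > delta:         # truncate the step to the trust region
            p = [pi * delta / norm for pi in p]
        trial = [x[0] + p[0], x[1] + p[1]]
        if phi(trial) < phi(x):  # accept and enlarge the radius
            x, delta = trial, min(2.0 * delta, 10.0)
        else:                    # reject and shrink the radius
            delta /= 2.0
        if phi(x) < tol:
            break
    return x
```

Applied to the zero-residual system x^2 + y^2 = 1, x = y, the iteration converges quadratically to (1/sqrt(2), 1/sqrt(2)) from a starting point outside the circle.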
Globally Convergent Evolution Strategies and CMA-ES
, 2012
Abstract

Cited by 1 (0 self)
In this paper we show how to modify a large class of evolution strategies (ES) to rigorously achieve a form of global convergence, meaning convergence to stationary points independently of the starting point. The type of ES under consideration recombines the parents by means of a weighted sum, around which the offspring are computed by random generation. One relevant instance of such an ES is CMA-ES. The modifications consist essentially of reducing the size of the steps whenever a sufficient decrease condition on the function values is not verified. When such a condition is satisfied, the step size can be reset to the step size maintained by the ES itself, as long as the latter is sufficiently large. We suggest a number of ways of imposing sufficient decrease for which global convergence holds under reasonable assumptions, and extend our theory to the constrained case. Given a limited budget of function evaluations, our numerical experiments have shown that the modified CMA-ES is capable of further progress in function values. Moreover, we have observed that such an improvement in efficiency comes without deteriorating the behavior of the underlying method in the presence of nonconvexity.
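The proposed safeguard can be illustrated on a plain (mu/mu, lambda)-ES with unweighted recombination. This is a hedged sketch of the sufficient-decrease mechanism only, with invented names; it is not the authors' modified CMA-ES (covariance adaptation and the reset of the step size to the ES's own value are omitted).

```python
import random

def modified_es(f, x, sigma=1.0, lam=8, mu=4, c=1e-4, iters=200, seed=0):
    """ES with a sufficient decrease condition: accept the recombined
    mean only if it decreases f by at least c * sigma^2, otherwise
    shrink the step size sigma."""
    rng = random.Random(seed)
    n, fx = len(x), f(x)
    for _ in range(iters):
        # sample lambda offspring around the current recombined mean
        off = [[xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
               for _ in range(lam)]
        off.sort(key=f)
        # recombination: unweighted mean of the mu best offspring
        trial = [sum(o[i] for o in off[:mu]) / mu for i in range(n)]
        ft = f(trial)
        if ft <= fx - c * sigma ** 2:   # sufficient decrease: accept
            x, fx = trial, ft
        else:                           # otherwise reduce the step size
            sigma *= 0.5
    return x, fx
```

By construction the best value fx is monotonically nonincreasing, which is the property the sufficient decrease condition exploits in the convergence analysis.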
unknown title
, 2009
Abstract
A framework for simulating and estimating the state and functional topology of complex dynamic geometric networks
OPTIMIZATION IN SCILAB
Abstract
In this document, we give an overview of optimization features in Scilab. The goal of this document is to present all existing and non-existing features, such that a user who wants to solve a particular optimization problem can know what to look for. In the introduction, we analyse a classification of optimization problems. In the first chapter, we analyse the flagship of Scilab in terms of nonlinear optimization: the optim function. We analyse its features, the management of the cost function, the linear algebra and the management of memory. Then we consider the algorithms which are used behind optim, depending on the type of algorithm and the constraints. In the remaining chapters, we present the algorithms available to solve quadratic problems, nonlinear least-squares problems, semidefinite programming, genetic algorithms, simulated annealing and linear matrix inequalities. A chapter focuses on optimization data files managed by Scilab, especially MPS and SIF files. Some optimization features are available in the form of toolboxes, the most important of which are the Quapro and CUTEr toolboxes. The final chapter is devoted