Results 1–10 of 130
Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems
 SIAM Journal on Optimization
, 2004
Cited by 47 (8 self)
A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate its usefulness, the algorithm is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.
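The poll step shared by GPS-type methods can be illustrated with a minimal sketch. This is plain coordinate search on continuous variables only, not the mixed-variable filter algorithm of the paper; all function and variable names here are ours:

```python
def poll(f, x, delta, directions):
    """One poll step: try x + delta*d for each direction d and accept
    the first trial that improves on f(x); otherwise the poll is
    unsuccessful and the caller refines the mesh."""
    fx = f(x)
    for d in directions:
        trial = [xi + delta * di for xi, di in zip(x, d)]
        if f(trial) < fx:
            return trial, True
    return x, False

def pattern_search(f, x0, delta=1.0, tol=1e-6):
    """Minimal coordinate-search variant of generalized pattern search."""
    n = len(x0)
    # The coordinate directions +/- e_i form a simple positive spanning set.
    dirs = [[1 if j == i else 0 for j in range(n)] for i in range(n)]
    dirs += [[-v for v in row] for row in dirs[:n]]
    x = list(x0)
    while delta > tol:
        x, success = poll(f, x, delta, dirs)
        if not success:
            delta /= 2          # unsuccessful poll: halve the mesh size
    return x

x = pattern_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```

The mesh size is reduced only after a full unsuccessful poll, which is the mechanism the convergence theory of these methods hinges on.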
A particle swarm pattern search method for bound constrained nonlinear optimization
, 2006
Worst case complexity of direct search
, 2010
Cited by 29 (3 self)
In this paper we prove that direct search of directional type shares the worst-case complexity bound of steepest descent when sufficient decrease is imposed using a quadratic function of the step-size parameter. This result is proved under smoothness of the objective function, within a framework of the generating set search (GSS) type. We also discuss the worst-case complexity of direct search when only simple decrease is imposed and when the objective function is nonsmooth.
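The sufficient-decrease rule analyzed here can be sketched as follows. This is a toy illustration with coordinate directions; the constant `c`, the halving factor, and all names are our choices, not values from the paper:

```python
def direct_search_sd(f, x0, alpha=1.0, c=1e-4, tol=1e-8):
    """Directional direct search accepting a trial point only under the
    sufficient-decrease rule f(trial) < f(x) - c*alpha**2, i.e. with a
    quadratic forcing function of the step-size parameter alpha."""
    n = len(x0)
    dirs = [[1 if j == i else 0 for j in range(n)] for i in range(n)]
    dirs += [[-v for v in row] for row in dirs[:n]]
    x, fx = list(x0), f(x0)
    while alpha > tol:
        for d in dirs:
            trial = [xi + alpha * di for xi, di in zip(x, d)]
            ft = f(trial)
            if ft < fx - c * alpha ** 2:   # sufficient, not merely simple, decrease
                x, fx = trial, ft
                break
        else:
            alpha /= 2                     # unsuccessful iteration: contract the step
    return x

x_sd = direct_search_sd(lambda v: (v[0] - 3.0) ** 2, [0.0])
```

Requiring decrease proportional to alpha squared, rather than any decrease at all, is what lets the analysis bound the number of iterations as in steepest descent.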
Analysis of Direct Searches for Discontinuous Functions
, 2010
Cited by 21 (3 self)
It is known that the Clarke generalized directional derivative is nonnegative along the limit directions generated by directional direct-search methods at a limit point of certain subsequences of unsuccessful iterates, if the function being minimized is Lipschitz continuous near the limit point. In this paper we generalize this result to discontinuous functions using Rockafellar generalized directional derivatives (upper subderivatives). We show that Rockafellar derivatives are also nonnegative along the limit directions of those subsequences of unsuccessful iterates when the function values converge to the function value at the limit point. This result is obtained assuming that the function is directionally Lipschitz with respect to the limit direction. It is also possible, under appropriate conditions, to establish more insightful results by showing that the sequence of points generated by these methods eventually approaches the limit point along the locally best branch or step function (when the number of steps is equal to two). The results of this paper are presented for constrained optimization and illustrated numerically.
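For context (standard definitions, not taken from the paper itself): the Clarke generalized directional derivative underlying the Lipschitz-continuous case is

```latex
f^{\circ}(x; d) \;=\; \limsup_{y \to x,\; t \downarrow 0} \frac{f(y + t d) - f(y)}{t},
```

and the known result states that $f^{\circ}(x^{*}; d) \ge 0$ for every refining direction $d$ at such a limit point $x^{*}$. The Rockafellar upper subderivative plays the analogous role in this paper when $f$ is only directionally Lipschitz rather than locally Lipschitz.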
ORTHOMADS: A deterministic MADS instance with orthogonal directions
, 2008
Cited by 15 (2 self)
The purpose of this paper is to introduce a new way of choosing directions for the Mesh Adaptive Direct Search (MADS) class of algorithms. The advantages of this new ORTHOMADS instantiation of MADS are that the polling directions are chosen deterministically, ensuring that the results of a given run are repeatable, and that they are orthogonal to each other; therefore, the convex cones of missed directions at each iteration are minimal in size. The convergence results for ORTHOMADS follow directly from those already published for MADS, and they hold deterministically, rather than with probability one as for LTMADS, the first MADS instance. The initial numerical results are quite good for the smooth and nonsmooth, constrained and unconstrained problems considered here.
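The orthogonal poll set can be sketched via a Householder transform, which is the core of the ORTHOMADS construction; the Halton-sequence choice of the seed vector and the mesh scaling/rounding of the real algorithm are omitted here, so this is only an illustrative fragment:

```python
def householder_poll_set(q):
    """Build 2n mutually orthogonal polling directions from one nonzero
    vector q via the Householder matrix H = I - 2 q q^T / (q^T q).
    The columns of H are orthogonal, and together with their negatives
    they form a maximal positive basis, so the convex cones of missed
    directions are as small as possible."""
    n = len(q)
    nq2 = sum(qi * qi for qi in q)
    H = [[(1.0 if i == j else 0.0) - 2.0 * q[i] * q[j] / nq2
          for j in range(n)] for i in range(n)]
    cols = [[H[i][j] for i in range(n)] for j in range(n)]
    return cols + [[-v for v in col] for col in cols]

D = householder_poll_set([1.0, 2.0])
```

Because the construction is a deterministic function of the seed vector, rerunning the algorithm reproduces the same polling directions, which is the repeatability advantage claimed in the abstract.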
Convergence analysis of the DIRECT algorithm
 North Carolina State University, Center for
, 2004
Cited by 14 (1 self)
The DIRECT algorithm is a deterministic sampling method for bound constrained Lipschitz continuous optimization. We prove a subsequential convergence result for the DIRECT algorithm that quantifies some of the convergence observations in the literature. Our results apply to several variations on the original method, including one that handles general constraints. We use techniques from nonsmooth analysis, and our framework is based on recent results for the MADS sampling algorithms.
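A drastically simplified one-dimensional sketch of DIRECT-style sampling follows. It trisects the interval whose Lipschitz-style lower bound is smallest for one fixed constant `K`; the actual DIRECT algorithm needs no such constant, instead selecting every interval that is potentially optimal for some Lipschitz constant via a convex-hull test, so this fragment only conveys the flavor of the method:

```python
def direct_1d(f, lo, hi, K=10.0, iters=30):
    """Keep a list of intervals (a, b, f(center)); repeatedly trisect
    the interval minimizing the lower bound f(c) - K*(b - a)/2, reusing
    the already-evaluated center for the middle third."""
    intervals = [(lo, hi, f((lo + hi) / 2))]
    for _ in range(iters):
        idx = min(range(len(intervals)),
                  key=lambda i: intervals[i][2]
                  - K * (intervals[i][1] - intervals[i][0]) / 2)
        a, b, fc = intervals.pop(idx)
        third = (b - a) / 3
        intervals.append((a, a + third, f(a + third / 2)))
        intervals.append((a + third, b - third, fc))   # center unchanged
        intervals.append((b - third, b, f(b - third / 2)))
    best = min(intervals, key=lambda t: t[2])
    return (best[0] + best[1]) / 2, best[2]

x_best, f_best = direct_1d(lambda t: (t - 0.3) ** 2, 0.0, 1.0)
```

The bound trades off a small center value against a large interval size, which is how DIRECT balances local exploitation with global exploration.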
Convergence of mesh adaptive direct search to second-order stationary points
, 2006
Cited by 14 (2 self)
A previous analysis of second-order behavior of generalized pattern search algorithms for unconstrained and linearly constrained minimization is extended to the more general class of mesh adaptive direct search (MADS) algorithms for general constrained optimization. Because of the ability of MADS to generate an asymptotically dense set of search directions, we are able to establish reasonable conditions under which a subsequence of MADS iterates converges to a limit point satisfying second-order necessary or sufficient optimality conditions for general set-constrained optimization problems.
Comparison of Derivative-Free Optimization Methods for Groundwater Supply and Hydraulic Capture Community Problems
 Advances in Water Resources
, 2008
Cited by 12 (5 self)
Management decisions involving groundwater supply and remediation often rely on optimization techniques to determine an effective strategy. We introduce several derivative-free sampling methods for solving constrained optimization problems that have not yet been considered in this field, and we include a genetic algorithm for completeness. Two well-documented community problems are used for illustration purposes: a groundwater supply problem and a hydraulic capture problem. The community problems were found to be challenging applications due to the objective functions being nonsmooth, nonlinear, and having many local minima. Because the results were found to be sensitive to initial iterates for some methods, guidance is provided on selecting initial iterates for these problems that improves the likelihood of achieving significant reductions in the objective function to be minimized. In addition, we suggest some potentially fruitful areas for future research.
Direct Multisearch for Multiobjective Optimization
, 2010
Cited by 11 (0 self)
In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, calling it direct multisearch (DMS), which does not aggregate any of the objective functions. Our framework is inspired by the search/poll paradigm of direct-search methods of directional type and uses the concept of Pareto dominance to maintain a list of nondominated points (from which the new iterates or poll centers are chosen). The aim of our method is to generate as many points in the Pareto front as possible from the polling procedure itself, while keeping the whole framework general enough to accommodate other disseminating strategies, in particular when using the (here also) optional search step. DMS generalizes to multiobjective optimization (MOO) all direct-search methods of directional type. We prove, under the common assumptions used in direct search for single-objective optimization, that at least one limit point of the sequence of iterates generated by DMS lies in (a stationary ...
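The Pareto-dominance bookkeeping at the heart of DMS-style methods can be sketched in a few lines (for minimization of all objectives; the function names are ours):

```python
def dominates(a, b):
    """a Pareto-dominates b if a is no worse in every objective and
    strictly better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_front(front, candidate):
    """Maintain the list of nondominated points: discard the candidate
    if some stored point dominates it; otherwise add it and evict every
    stored point it dominates."""
    if any(dominates(p, candidate) for p in front):
        return front
    return [p for p in front if not dominates(candidate, p)] + [candidate]

front = []
for p in [(1, 3), (2, 2), (3, 1), (2, 3)]:
    front = update_front(front, p)
```

In DMS the surviving list members serve as the pool from which new poll centers are drawn, so the filter above replaces the single incumbent of single-objective direct search.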
Modern continuous optimization algorithms for tuning real and integer algorithm parameters
 LNCS 6234, Proceedings of the International Conference on Swarm Intelligence (ANTS 2010)
, 2010
Cited by 10 (3 self)
To obtain peak performance from optimization algorithms, their parameters must be set appropriately. Frequently, algorithm parameters can take values from the set of real numbers, or from a large integer set. To tune this kind of parameter, it is attractive to apply state-of-the-art continuous optimization algorithms instead of using a tedious, and error-prone, hands-on approach. In this paper, we study the performance of several continuous optimization algorithms for the algorithm parameter tuning task. As case studies, we use a number of optimization algorithms from the swarm intelligence literature.