Results 1–10 of 62
Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods
 SIAM Review, Vol. 45, No. 3, pp. 385–482, 2003
Abstract

Cited by 222 (15 self)
Direct search methods are best known as unconstrained optimization techniques that do not explicitly use derivatives. Direct search methods were formally proposed and widely applied in the 1960s but fell out of favor with the mathematical optimization community by the early 1970s because they lacked coherent mathematical analysis. Nonetheless, users remained loyal to these methods, most of which were easy to program, some of which were reliable. In the past fifteen years, these methods have seen a revival due, in part, to the appearance of mathematical analysis, as well as to interest in parallel and distributed computing. This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited. Our focus then turns to a broad class of methods for which we provide a unifying framework that lends itself to a variety of convergence results. The underlying principles allow generalization to handle bound constraints and linear constraints. We also discuss extensions to problems with nonlinear constraints.
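The polling idea behind classical direct search can be illustrated with a minimal compass (coordinate) search: poll the objective along the positive and negative coordinate directions, move to the first improving point, and halve the step size when no poll point improves. This is a generic textbook sketch under illustrative parameter names, not the unifying framework of the review.

```python
import numpy as np

def compass_search(fun, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Compass search: poll x +/- step*e_i, halve the step when no poll improves."""
    x = np.asarray(x0, dtype=float)
    fx = fun(x)
    directions = np.vstack([np.eye(x.size), -np.eye(x.size)])
    for _ in range(max_iter):
        improved = False
        for d in directions:
            y = x + step * d
            fy = fun(y)
            if fy < fx:            # first improving poll point wins
                x, fx, improved = y, fy, True
                break
        if not improved:
            step *= 0.5            # no improvement: refine the mesh
            if step < tol:
                break
    return x, fx
```

The convergence theory surveyed in the review rests on the poll directions positively spanning the space, which the plus/minus coordinate set above does.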
Retrospective on Optimization
 25th Year Issue on Computers and Chemical Engineering
Abstract

Cited by 36 (1 self)
In this paper we provide a general classification of mathematical optimization problems, followed by a matrix of applications that shows the areas in which these problems have been typically applied in process systems engineering. We then provide a review of solution methods of the major types of optimization problems for continuous and discrete variable optimization, particularly nonlinear and mixed-integer nonlinear programming. We also review their extensions to dynamic optimization and optimization under uncertainty. While these areas are still subject to significant research efforts, the emphasis in this paper is on major developments that have taken place over the last twenty-five years.
Snobfit – Stable Noisy Optimization by Branch and Fit
Abstract

Cited by 26 (3 self)
This paper produces a user-specified number of suggested evaluation points in each step; proceeds by successive partitioning of the box (branch) and building local quadratic models (fit); combines local and global search and allows the user to determine which of the two should be emphasized; handles local search from the best point with the aid of trust regions; and allows for hidden constraints, assigning to such points a function value based on the function values of nearby feasible points.
On Trust Region Methods for Unconstrained Minimization Without Derivatives
2002
Abstract

Cited by 25 (1 self)
We consider some algorithms for unconstrained minimization without derivatives that form linear or quadratic models by interpolation to values of the objective function. Then a new vector of variables is calculated by minimizing the current model within a trust region. Techniques are described for adjusting the trust region radius, and for choosing positions of the interpolation points that maintain not only nonsingularity of the interpolation equations but also the adequacy of the model. Particular attention is given to quadratic models with diagonal second derivative matrices, because numerical experiments show that they are often more efficient than full quadratic models for general objective functions. Finally, some recent research on the updating of full quadratic models is described briefly, using fewer interpolation equations than before. The resultant freedom is taken up by minimizing the Frobenius norm of the change to the second derivative matrix of the model. A preliminary version of this method provides some very promising numerical results.
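A toy version of the ingredients described here (a diagonal quadratic model fit to sampled function values, minimized over a box-shaped trust region, with a ratio test driving the radius update) can be sketched as follows. This is a simplified illustration under the stated assumptions, not Powell's algorithm: it refits the model from a fresh stencil each iteration and does no interpolation-point geometry management.

```python
import numpy as np

def fit_diagonal_quadratic(S, fvals):
    """Least-squares fit of m(s) = c + g.s + 0.5 * sum_i d_i * s_i**2."""
    n = S.shape[1]
    A = np.hstack([np.ones((len(S), 1)), S, 0.5 * S ** 2])
    coef, *_ = np.linalg.lstsq(A, fvals, rcond=None)
    return coef[0], coef[1:1 + n], coef[1 + n:]

def tr_minimize(fun, x0, radius=1.0, max_iter=50, tol=1e-8):
    """Derivative-free trust-region loop with a diagonal quadratic model."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = fun(x)
    for _ in range(max_iter):
        # Sample a 2n+1 point stencil: the center and x +/- radius*e_i.
        S = radius * np.vstack([np.zeros(n), np.eye(n), -np.eye(n)])
        fvals = np.array([fun(x + s) for s in S])
        c, g, d = fit_diagonal_quadratic(S, fvals)
        # The model is separable, so minimize it coordinate-wise over the box.
        s = np.empty(n)
        for i in range(n):
            cands = [-radius, radius]
            if d[i] > 0:
                cands.append(float(np.clip(-g[i] / d[i], -radius, radius)))
            s[i] = min(cands, key=lambda t: g[i] * t + 0.5 * d[i] * t * t)
        pred = -(g @ s + 0.5 * (d * s) @ s)   # predicted decrease
        fnew = fun(x + s)
        rho = (fx - fnew) / pred if pred > 0 else -1.0
        if rho > 0.1:          # sufficient agreement: accept the step
            x, fx = x + s, fnew
            if rho > 0.75:
                radius *= 2.0  # model trustworthy: expand the region
        else:
            radius *= 0.5      # poor prediction: shrink the region
        if radius < tol:
            break
    return x, fx
```

The diagonal second-derivative model keeps the fit cheap (2n+1 evaluations suffice), which is the efficiency trade-off the abstract highlights against full quadratic models.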
Continuous lunches are free plus the design of optimal optimization algorithms
 Algorithmica, 2009
ORBIT: Optimization by radial basis function interpolation in trust-regions
 SIAM Journal on Scientific Computing, 2008
Abstract

Cited by 20 (4 self)
We present a new derivative-free algorithm, ORBIT, for unconstrained local optimization of computationally expensive functions. A trust-region framework using interpolating Radial Basis Function (RBF) models is employed. The RBF models considered often allow ORBIT to interpolate nonlinear functions using fewer function evaluations than the polynomial models considered by present techniques. Approximation guarantees are obtained by ensuring that a subset of the interpolation points is sufficiently poised for linear interpolation. The RBF property of conditional positive definiteness yields a natural method for adding additional points. We present numerical results on test problems to motivate the use of ORBIT when only a relatively small number of expensive function evaluations are available. Results on two very different application problems, calibration of a watershed model and optimization of a PDE-based bioremediation plan, are also very encouraging and support ORBIT's effectiveness on black-box functions for which no special mathematical structure is known or available.
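The kind of model ORBIT builds can be sketched by a cubic RBF interpolant with a linear polynomial tail, solving the standard augmented linear system for the RBF weights and tail coefficients. This is a minimal illustration of RBF interpolation only; ORBIT's point selection, poisedness checks, and trust-region logic are not shown, and the helper name is hypothetical.

```python
import numpy as np

def rbf_interpolant(X, f):
    """Cubic RBF interpolant with a linear tail:
    s(x) = sum_i lam_i * ||x - X_i||**3 + c[0] + c[1:] . x,
    subject to the usual orthogonality conditions P.T @ lam = 0."""
    m, n = X.shape
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    P = np.hstack([np.ones((m, 1)), X])                  # linear-tail basis
    A = np.block([[D ** 3, P], [P.T, np.zeros((n + 1, n + 1))]])
    coef = np.linalg.solve(A, np.concatenate([f, np.zeros(n + 1)]))
    lam, c = coef[:m], coef[m:]

    def s(x):
        x = np.asarray(x, dtype=float)
        r = np.linalg.norm(X - x, axis=1)
        return lam @ r ** 3 + c[0] + c[1:] @ x

    return s
```

The augmented system is nonsingular whenever the points are distinct and not collinear, a consequence of the conditional positive definiteness mentioned in the abstract.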
MNH: A Derivative-Free Optimization Algorithm Using Minimal Norm Hessians
2008
Abstract

Cited by 13 (1 self)
We introduce MNH, a new algorithm for unconstrained optimization when derivatives are unavailable, primarily targeting applications that require running computationally expensive deterministic simulations. MNH relies on a trust-region framework with an underdetermined quadratic model that interpolates the function at a set of data points. We show how to construct this interpolation set to yield computationally stable parameters for the model and, in doing so, obtain an algorithm which converges to first-order critical points. Preliminary results are encouraging and show that MNH makes effective use of the points evaluated in the course of the optimization.
The Correlated Knowledge Gradient for Simulation Optimization of Continuous Parameters Using Gaussian Process Regression
Abstract

Cited by 11 (5 self)
We extend the concept of the correlated knowledge-gradient policy for ranking and selection of a finite set of alternatives to the case of continuous decision variables. We propose an approximate knowledge gradient for problems with continuous decision variables in the context of a Gaussian process regression model in a Bayesian setting, along with an algorithm to maximize the approximate knowledge gradient. In the problem class considered, we use the knowledge gradient for continuous parameters to sequentially choose where to sample an expensive noisy function in order to find the maximum quickly. We show that the knowledge gradient for continuous decisions is a generalization of the efficient global optimization algorithm proposed by Jones, Schonlau, and Welch.
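The Gaussian process regression underlying such a policy can be sketched with a standard zero-mean GP posterior, assuming a squared-exponential kernel for illustration; the knowledge-gradient acquisition itself is not reproduced here.

```python
import numpy as np

def gp_posterior(X, y, Xs, length=1.0, noise=1e-6):
    """Zero-mean GP posterior mean/variance with a squared-exponential kernel."""
    def k(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return np.exp(-0.5 * d2 / length ** 2)

    K = k(X, X) + noise * np.eye(len(X))     # jitter for numerical stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(Xs, X)
    mu = Ks @ alpha                          # posterior mean at Xs
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v ** 2, axis=0)       # prior variance k(x,x) = 1
    return mu, var
```

A sampling policy would then score candidate points using these posterior quantities and evaluate the expensive function where the score is largest.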
High-level Approach to Modeling of Observed System Behavior
Abstract

Cited by 11 (6 self)
Current computer systems and communication networks tend to be highly complex, and they typically hide their internal structure from their users. Thus, for selected aspects of capacity planning, overload control and related applications, it is useful to have a method allowing one to find good and relatively simple approximations for the observed system behavior. This paper investigates one such approach, where we attempt to represent the latter by adequately selecting the parameters of a set of queueing models. We identify a limited number of queueing models that we use as “Building Blocks” (BBs) in our procedure. The selected BBs allow us to accurately approximate the measured behavior of a range of different systems. We propose an approach for selecting and combining suitable BBs, as well as for their calibration. Finally, we validate our methodology and discuss the potential and the limitations of the proposed approach.
Lyapunov methods in nonsmooth optimization, Part I: Quasi-Newton algorithms for Lipschitz, regular functions
Abstract

Cited by 9 (1 self)
A recent converse Lyapunov theorem for differential inclusions is used to generate a large class of algorithms for nonsmooth optimization. Particular attention is given to quasi-Newton algorithms for the minimization of locally Lipschitz, regular functions. The focus of this paper is unconstrained nonlinear programming for locally Lipschitz functions. We address the task of designing numerical algorithms that asymptotically determine a point that globally minimizes a locally Lipschitz function defined on Euclidean space. For continuously differentiable functions, this problem and its solutions have reached a very mature state, which is summarized in many excellent textbooks (see, for example, [2], [11]). The nonsmooth optimization problem is more recent. Serious attention was first given to it in the 1960s and, over the years, many authors have addressed the problem by imposing various extra assumptions, beyond Lipschitz continuity, on the function to be...