Results 1 - 5 of 5
An Implicit Filtering Algorithm For Optimization Of Functions With Many Local Minima
 SIAM J. Optim.
, 1995
Abstract

Cited by 53 (16 self)
In this paper we describe and analyze an algorithm for certain box constrained optimization problems that may have several local minima. A paradigm for these problems is one in which the function to be minimized is the sum of a simple function, such as a convex quadratic, and high-frequency, low-amplitude terms which cause local minima away from the global minimum of the simple function. Our method is gradient based, and therefore its performance can be improved by the use of quasi-Newton methods.

Key words. filtering, projected gradient algorithm, quasi-Newton method

AMS(MOS) subject classifications. 65H10, 65K05, 65K10

1. Introduction. In this paper we describe and analyze an algorithm for bound constrained optimization problems that may have several local minima. The type of problem we have in mind is one in which the function to be minimized is the sum of a simple function, such as a convex quadratic, and high-frequency, low-amplitude terms which cause the local minima. Of particul...
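The idea in the abstract, a projected gradient method whose finite-difference stepsize shrinks across an outer loop so that high-frequency, low-amplitude noise is averaged out of the gradient, can be sketched as follows. This is a hypothetical minimal implementation, not the authors' code: the function name, the Armijo constant, the stencil-failure test, and the stepsize schedule are all illustrative choices.

```python
import numpy as np

def implicit_filtering(f, x0, lower, upper, h_values, max_inner=50):
    """Sketch of implicit filtering on a box-constrained problem: projected
    gradient descent where the central-difference stepsize h is reduced on
    an outer loop, 'filtering' high-frequency noise out of the gradient."""
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    for h in h_values:                       # outer loop: shrink the stepsize
        for _ in range(max_inner):           # inner projected gradient loop
            g = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h)
                          for e in np.eye(len(x))])
            if np.linalg.norm(g) <= h:       # proxy for stencil failure: next h
                break
            t = 1.0                          # Armijo-style backtracking
            while t > 1e-10:
                x_new = np.clip(x - t * g, lower, upper)
                if f(x_new) < f(x) - 1e-4 * np.dot(g, x - x_new):
                    x = x_new
                    break
                t *= 0.5
            else:                            # line search failed: next h
                break
    return x
```

On a convex quadratic plus a small high-frequency term, a moderate final h keeps the noise out of the difference gradient, so the iterates settle near the minimum of the smooth part rather than in a nearby spurious local minimum.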
Detection And Remediation Of Stagnation In The Nelder-Mead Algorithm Using A Sufficient Decrease Condition
 SIAM J. Optim.
, 1997
Abstract

Cited by 31 (1 self)
The Nelder-Mead algorithm can stagnate and converge to a nonoptimal point, even for very simple problems. In this note we propose a test for sufficient decrease which, if passed for the entire iteration, will guarantee convergence of the Nelder-Mead iteration to a stationary point if the objective function is smooth. Failure of this condition is an indicator of potential stagnation. As a remedy we propose a new step, which we call an oriented restart: it reinitializes the simplex to a smaller one with orthogonal edges that contains an approximate steepest descent step from the current best point. We also give results that apply when the objective function is a low-amplitude perturbation of a smooth function. We illustrate our results with some numerical examples.
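The two ingredients described above, a sufficient decrease test driven by the simplex gradient and an oriented restart that rebuilds a small orthogonal-edged simplex pointing downhill, can be sketched as follows. The function names, the constant `alpha`, and the shrink factor are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simplex_gradient(vertices, fvals):
    """Linear-model (simplex) gradient: solve E g = d, where row j of E is
    the edge v_j - v_0 and d_j = f(v_j) - f(v_0)."""
    E = vertices[1:] - vertices[0]
    d = fvals[1:] - fvals[0]
    return np.linalg.solve(E, d)

def sufficient_decrease(f_best_new, f_best_old, g, alpha=1e-4):
    """Stagnation test (sketched): the best value must drop by at least
    alpha * ||simplex gradient||^2 over a full Nelder-Mead iteration."""
    return f_best_new < f_best_old - alpha * np.dot(g, g)

def oriented_restart(vertices, fvals, f, shrink=0.5):
    """Rebuild a smaller simplex with orthogonal edges around the best
    vertex, each edge signed to point roughly downhill according to the
    simplex gradient (a sketch of the oriented restart idea)."""
    g = simplex_gradient(vertices, fvals)
    best = vertices[np.argmin(fvals)].copy()
    edge = shrink * np.min(np.linalg.norm(vertices[1:] - vertices[0], axis=1))
    new = [best]
    for i in range(len(best)):
        v = best.copy()
        v[i] += -edge if g[i] > 0 else edge   # orient each edge against g
        new.append(v)
    new = np.array(new)
    return new, np.array([f(v) for v in new])
```

For f(x) = x1² + x2² on the simplex {(1,1), (1.5,1), (1,1.5)}, the simplex gradient is (2.5, 2.5), close to the true gradient (2, 2) at the best vertex, and the restarted simplex steps both coordinates downhill.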
The Simplex Gradient and Noisy Optimization Problems
 in Computational Methods in Optimal Design and Control
, 1998
Abstract

Cited by 19 (4 self)
In this paper we consider objective functions that are perturbations of simple, smooth functions. The surface on the left in Figure 1, taken from [24], and the graph on the right illustrate this type of problem.

Figure 1: Optimization Landscapes (axis tick labels from the two plots omitted)

The perturbations may be the result of discontinuities or nonsmooth effects in the underlying models, randomness in the function evaluation, or experimental or measurement errors. Conventional gradient-based methods will be trapped in local minima even if the noise is smooth. Many classes of methods for noisy optimization problems are based on function information computed on sequences of simplices. The Nelder-Mead [18], multidirectional search [8], [21], and implicit filtering [12] methods are three examples. The performance of such methods can be explained in terms of the difference approximation of the gradient that is implicit in the function evaluations they perform.
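The last point, that performance hinges on the difference approximation of the gradient implicit in the function evaluations, can be illustrated with a toy example (an assumption of ours, not from the paper): a narrow stencil tracks the derivative of the noise, while a stencil wider than the noise wavelength recovers the gradient of the smooth part.

```python
import numpy as np

def central_diff_grad(f, x, h):
    """Central-difference gradient with stencil size h."""
    return np.array([(f(x + h * e) - f(x - h * e)) / (2 * h)
                     for e in np.eye(len(x))])

# Smooth quadratic plus a high-frequency, low-amplitude perturbation.
f_smooth = lambda x: float(np.dot(x, x))
f_noisy  = lambda x: f_smooth(x) + 1e-3 * float(np.sin(1000 * x).sum())

x = np.array([1.0, -0.5])
g_true  = 2 * x
g_small = central_diff_grad(f_noisy, x, 1e-4)  # stencil inside one oscillation
g_large = central_diff_grad(f_noisy, x, 0.1)   # stencil spans many oscillations
# g_small is dominated by the noise derivative; g_large tracks g_true.
```

The noise has derivative of amplitude 1 (comparable to the true gradient), yet its contribution to the h = 0.1 difference quotient is bounded by roughly 1e-3 / h = 0.01, which is why simplex-based methods with moderate edge lengths can ignore it.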
Optimization Of Automotive Valve Train Components With Implicit Filtering
 Optimization and Engineering
, 1998
Abstract

Cited by 10 (4 self)
In this paper we show how the implicit filtering algorithm can be parallelized and applied to problems in parameter identification and optimization in automotive valve train design. We extend our previous work by using a more refined model of the valve train and exploiting parallelism in a new way. We apply the parameter identification results to obtain optimal profiles for camshaft lobes.

Key words. Noisy Optimization, Implicit Filtering, Mechanical Systems, Automotive Valve Trains

AMS subject classifications. 65K05, 65K10, 65L05, 65Y05

1. Introduction. In this paper we report on a parallel implementation of the implicit filtering algorithm [17], [19] and its application to problems in parameter identification and optimization in automotive valve train design. We extend our previous work [11], [10] on parameter identification by using a more refined model of the valve train and exploiting parallelism in a new way. We then apply the parameter identification results to obtain optim...
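The stencil evaluations behind a difference gradient are mutually independent, which is one natural way to parallelize implicit filtering. The sketch below is a generic illustration of that idea, not the paper's actual parallel scheme; the function name and executor choice are ours.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def parallel_diff_gradient(f, x, h, executor):
    """Evaluate all 2n central-difference stencil points concurrently and
    assemble the gradient from the results."""
    n = len(x)
    eye = np.eye(n)
    points = [x + h * e for e in eye] + [x - h * e for e in eye]
    vals = list(executor.map(f, points))      # all 2n evaluations in flight
    return np.array([(vals[i] - vals[n + i]) / (2 * h) for i in range(n)])

# Threads suffice for this illustration; expensive simulations (such as the
# valve train models here) would use processes or MPI ranks instead.
with ThreadPoolExecutor(max_workers=4) as pool:
    grad = parallel_diff_gradient(lambda v: float(np.dot(v, v)),
                                  np.array([1.0, 2.0]), 1e-3, pool)
```

Since each stencil point costs one full model evaluation, the speedup is close to linear in the number of workers up to 2n.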
Implicit Filtering And Optimal Design Problems
, 1994
Abstract

Cited by 4 (2 self)
Implicit filtering is a form of the gradient projection method of Bertsekas in which the stepsize in a difference approximation of the gradient is changed as the iteration progresses. In this way the algorithm is able to avoid certain types of local minima and in some cases find accurate approximations to the global minimum. The algorithm is particularly effective in avoiding local minima that are caused by high-frequency, low-amplitude terms in the objective function. In this report we discuss the algorithm and its theoretical properties. We also present applications to the modeling of subsurface contaminant transport and to high-field magnet design.