Results 1–7 of 7
Detection And Remediation Of Stagnation In The Nelder-Mead Algorithm Using A Sufficient Decrease Condition
SIAM J. Optim., 1997
Cited by 31 (1 self)
The Nelder-Mead algorithm can stagnate and converge to a nonoptimal point, even for very simple problems. In this note we propose a test for sufficient decrease which, if passed for the entire iteration, will guarantee convergence of the Nelder-Mead iteration to a stationary point if the objective function is smooth. Failure of this condition is an indicator of potential stagnation. As a remedy we propose a new step, which we call an oriented restart, which reinitializes the simplex to a smaller one with orthogonal edges which contains an approximate steepest descent step from the current best point. We also give results that apply when the objective function is a low-amplitude perturbation of a smooth function. We illustrate our results with some numerical examples.
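The sufficient decrease test and oriented restart described in this abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's exact algorithm: the function names, the value of `alpha`, and the coordinate-aligned restart edges are assumptions.

```python
import numpy as np

def simplex_gradient(vertices, fvals):
    """Simplex gradient: solve the linear system (x_j - x_0) . g = f_j - f_0
    over the simplex edges (vertices is (n+1, n), fvals is (n+1,))."""
    E = vertices[1:] - vertices[0]          # n x n matrix of edges as rows
    df = fvals[1:] - fvals[0]
    return np.linalg.solve(E, df)

def sufficient_decrease(f_best_old, f_best_new, grad, alpha=1e-4):
    """Test that the best value dropped by at least a multiple of the
    squared simplex-gradient norm; failure signals possible stagnation."""
    return f_best_new - f_best_old < -alpha * grad.dot(grad)

def oriented_restart(best, grad, sigma=0.5):
    """Reinitialize to a small simplex with orthogonal (coordinate) edges,
    each edge signed against the gradient so the simplex contains an
    approximate steepest-descent step from the current best point."""
    n = best.size
    signs = np.where(grad > 0, -1.0, 1.0)   # step opposite the gradient sign
    edges = sigma * signs[:, None] * np.eye(n)
    return np.vstack([best, best + edges])
```

On a linear function the simplex gradient is exact, which makes the pieces easy to check in isolation.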
The Simplex Gradient and Noisy Optimization Problems
in Computational Methods in Optimal Design and Control, 1998
Cited by 19 (4 self)
In this paper we consider objective functions that are perturbations of simple, smooth functions. The surface on the left in Figure 1, taken from [24], and the graph on the right illustrate this type of problem. [Figure 1: Optimization Landscapes] The perturbations may be the results of discontinuities or nonsmooth effects in the underlying models, randomness in the function evaluation, or experimental or measurement errors. Conventional gradient-based methods will be trapped in local minima even if the noise is smooth. Many classes of methods for noisy optimization problems are based on function information computed on sequences of simplices. The Nelder-Mead [18], multidirectional search [8], [21], and implicit filtering [12] methods are three examples. The performance of such methods can be explained in terms of the difference approximation of the gradient that is implicit in the function evaluations they perform.
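The effect of that implicit difference gradient can be illustrated with a toy perturbed objective. This is a sketch only: the test function, noise amplitude, and stencil sizes are invented, and a plain central-difference gradient stands in for the simplex gradient of the paper.

```python
import numpy as np

def noisy(x, amp=1e-3, freq=100.0):
    """A smooth quadratic plus a low-amplitude, high-frequency term --
    an invented stand-in for the perturbed objectives discussed."""
    x = np.asarray(x, dtype=float)
    return float(x @ x) + amp * np.sin(freq * x.sum())

def diff_gradient(f, x, h):
    """Central-difference gradient at stencil scale h."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g
```

With a stencil much wider than the noise wavelength, the differences average the perturbation away and the estimate tracks the smooth trend (gradient ≈ 2x); with a tiny stencil, the estimate also samples the slope of the perturbation.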
Algorithms for Noisy Problems in Gas Transmission Pipeline Optimization
2000
Cited by 18 (5 self)
In this paper we describe some algorithms for noisy optimization in the context of problems from the gas transmission industry. The algorithms are implicit filtering, DIRECT, and a new hybrid of these methods, which uses DIRECT to find an initial iterate for implicit filtering. We report on numerical results that illustrate the performance of the methods.
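The structure of such a hybrid, global sampling to seed a local method, can be sketched as follows. This is an illustrative reconstruction only: a plain grid sampler stands in for DIRECT (which actually uses Lipschitz-based rectangle subdivision), simple compass search stands in for implicit filtering, and all names and parameters are invented.

```python
import numpy as np

def coarse_global_sample(f, lo, hi, per_dim=7):
    """Global phase (stand-in for DIRECT): evaluate f on a coarse grid
    over the box [lo, hi] and return the best point as the initial iterate."""
    axes = [np.linspace(l, u, per_dim) for l, u in zip(lo, hi)]
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, len(lo))
    vals = np.array([f(x) for x in grid])
    return grid[np.argmin(vals)]

def local_refine(f, x, h=0.25, h_min=1e-5):
    """Local phase (compass search as a simple stand-in for implicit
    filtering): poll +/- h along each axis, shrinking h on failed polls."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    while h > h_min:
        improved = False
        for i in range(x.size):
            for s in (h, -h):
                y = x.copy()
                y[i] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            h *= 0.5            # unsuccessful poll: refine the step
    return x, fx
```

The global phase keeps the local phase from being seeded in a poor basin; the local phase supplies the accuracy the coarse grid lacks.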
Solution Of A Groundwater Control Problem With Implicit Filtering
2000
Cited by 5 (2 self)
In this paper we describe the application of a parallel implementation of the implicit filtering algorithm to a control problem from hydrology. We seek to control the temperature at a group of drinking water wells by placing barrier wells between the drinking water wells and a well that injects heated water from an industrial site.
Implicit Filtering And Optimal Design Problems
1994
Cited by 4 (2 self)
Implicit filtering is a form of the gradient projection method of Bertsekas in which the stepsize in a difference approximation of the gradient is changed as the iteration progresses. In this way the algorithm is able to avoid certain types of local minima and in some cases find accurate approximations to the global minimum. The algorithm is particularly effective in avoiding local minima that are caused by high-frequency, low-amplitude terms in the objective function. In this report we will discuss the algorithm and its theoretical properties. We will also present applications to the modeling of subsurface contaminant transport and to high-field magnet design.
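A minimal unconstrained sketch of this scheme follows: steepest descent on a difference gradient whose stencil scale h shrinks whenever the line search fails, so high-frequency noise is filtered out at the coarse scales first. The paper treats the bound-constrained, gradient-projection form; the parameter values and the Armijo constant below are assumptions.

```python
import numpy as np

def fd_grad(f, x, h):
    """Central-difference gradient at stencil scale h."""
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def implicit_filtering(f, x, h=0.5, h_min=1e-4, max_iter=200):
    """Unconstrained implicit-filtering sketch: descend on the scale-h
    difference gradient; when the backtracking line search fails, treat
    the stencil as exhausted and halve h."""
    x = np.asarray(x, dtype=float)
    while h > h_min:
        for _ in range(max_iter):
            g = fd_grad(f, x, h)
            t, fx = 1.0, f(x)
            # Armijo backtracking along the negative difference gradient
            while t > 1e-10 and f(x - t * g) >= fx - 1e-4 * t * g.dot(g):
                t *= 0.5
            if t <= 1e-10:
                break           # stencil failure: no decrease at this scale
            x = x - t * g
        h *= 0.5                # refine the difference stepsize
    return x
```

On a quadratic with a small high-frequency perturbation, the iterates settle near the smooth minimizer rather than in a noise-induced local minimum.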
Convergence analysis of sampling methods for perturbed Lipschitz functions
Pacific J. Optim.
Cited by 3 (3 self)
In this short note we observe that results of Dennis and Audet extend naturally to a wide variety of deterministic sampling methods. For bound-constrained problems, we show that any method based on coordinate search which includes a sufficiently rich set of directions, for example random directions at each stage of the sampling, will, when applied to Lipschitz continuous problems, have cluster points that satisfy generalized necessary conditions for optimality. The results also apply to the case of more general constraints, including so-called "hidden" or "yes/no" constraints.
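A toy version of such an enriched coordinate search, applied to a nonsmooth Lipschitz function, might look like the following. The direction counts, shrink factor, tolerances, and poll cap are invented for illustration.

```python
import numpy as np

def coordinate_search(f, x, h=1.0, h_min=1e-6, max_polls=10_000, seed=0):
    """Coordinate search with a 'sufficiently rich' direction set: each
    poll uses the +/- coordinate axes plus a few random unit directions;
    the step h shrinks only after a completely unsuccessful poll."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = x.size
    fx = f(x)
    polls = 0
    while h > h_min and polls < max_polls:
        polls += 1
        D = np.vstack([np.eye(n), -np.eye(n), rng.normal(size=(4, n))])
        D = D / np.linalg.norm(D, axis=1, keepdims=True)
        improved = False
        for d in D:
            y = x + h * d
            fy = f(y)
            if fy < fx:
                x, fx, improved = y, fy, True
        if not improved:
            h *= 0.5            # unsuccessful poll: refine the mesh
    return x, fx
```

On the nonsmooth Lipschitz function f(x) = |x0| + |x1|, where gradient-based methods are undefined at the solution, the polls still drive the iterates to the minimizer.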