Results 1–2 of 2
A Pattern Search Filter Method for Nonlinear Programming without Derivatives
SIAM Journal on Optimization, 2000
Abstract

Cited by 47 (12 self)
This paper presents and analyzes a pattern search method for general constrained optimization based on filter methods for step acceptance. Roughly, a filter method accepts a step that either improves the objective function value or the value of some function that measures the constraint violation. The new algorithm does not compute or approximate any derivatives, penalty constants, or Lagrange multipliers. It reduces trivially to the Torczon GPS (generalized pattern search) algorithm when there are no constraints, and indeed, it is formulated here to reduce to the version of GPS designed to handle finitely many linear constraints if they are treated explicitly. A key feature is that it preserves the useful division into search and poll steps. Assuming local smoothness, the algorithm produces a KKT point for a problem related to the original problem. Key words: pattern search algorithm, filter algorithm, surrogate-based optimization, derivative-free convergence analysis, constrained op...
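The filter acceptance rule described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm, only the generic filter idea: the filter stores pairs (h, f) of constraint violation and objective value, a trial step is acceptable if no stored pair dominates it, and accepted pairs evict the entries they dominate. All function names here are hypothetical.

```python
# Minimal sketch of filter-based step acceptance (generic filter idea,
# not the paper's specific method). Each filter entry is a pair (h, f):
# h = constraint-violation measure, f = objective value.

def dominates(a, b):
    """Pair a dominates pair b if a is no worse in both measures."""
    return a[0] <= b[0] and a[1] <= b[1]

def is_acceptable(filter_pairs, h_trial, f_trial):
    """A trial point is acceptable if no filter entry dominates it,
    i.e. it improves either the objective or the constraint violation
    relative to every stored pair."""
    return not any(dominates(p, (h_trial, f_trial)) for p in filter_pairs)

def add_to_filter(filter_pairs, h_new, f_new):
    """Insert an accepted pair and discard the entries it dominates."""
    kept = [p for p in filter_pairs if not dominates((h_new, f_new), p)]
    kept.append((h_new, f_new))
    return kept

# Example: (0.3, 4.0) is accepted because neither stored pair beats it
# in both measures; (0.6, 3.5) is rejected because (0.5, 3.0) dominates it.
flt = [(0.5, 3.0), (0.1, 5.0)]
print(is_acceptable(flt, 0.3, 4.0))  # True
print(is_acceptable(flt, 0.6, 3.5))  # False
```

In a derivative-free setting this test replaces a penalty function: no penalty constant has to be chosen, because improvement in either measure suffices for acceptance.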
Optimization using Surrogates for Engineering Design, 2002
Abstract

Cited by 1 (0 self)
The goal of these lectures is to acquaint the audience with some approaches to a class of nasty optimization problems involving nonconvex, nonlinear, extended-valued functions. Such functions arise often in multidisciplinary optimization (MDO). The first three lectures are meant to set the context for applying our algorithms. The context determines the form of the algorithms, and presenting this context requires a bit more than just a short list of assumptions. Briefly, though, the objective function and constraints depend not only on the optimization variables, but also on some ancillary variables, such as the solutions of coupled systems produced by standalone solvers for partial differential equations, table lookups, and other nonsmooth simulation codes. This has important algorithmic implications. First, the function and constraint values may be very expensive. Second, the functions may be nondifferentiable and discontinuous. In fact, they are often treated as extended-valued, since a function call may not return a value even if all the specified constraints are satisfied. The approach we treat in these lectures has been successful for some real problems in engineering design. We hope to convince engineers and mathematicians alike that not only are the algorithms given here useful, but the mathematics involved is interesting and relevant. We hope to convince mathematicians that good applied problems produce good mathematics, and that, contrary to what they may have heard, they will suffer no loss of virtue as a direct result of considering them.