Results 1-10 of 21
Optimization by direct search: New perspectives on some classical and modern methods
SIAM Review, 2003
Cited by 125 (14 self)
Abstract. Direct search methods are best known as unconstrained optimization techniques that do not explicitly use derivatives. Direct search methods were formally proposed and widely applied in the 1960s but fell out of favor with the mathematical optimization community by the early 1970s because they lacked coherent mathematical analysis. Nonetheless, users remained loyal to these methods, most of which were easy to program, some of which were reliable. In the past fifteen years, these methods have seen a revival due, in part, to the appearance of mathematical analysis, as well as to interest in parallel and distributed computing. This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited. Our focus then turns to a broad class of methods for which we provide a unifying framework that lends itself to a variety of convergence results. The underlying principles allow generalization to handle bound constraints and linear constraints. We also discuss extensions to problems with nonlinear constraints.
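As a concrete illustration of the class of methods this survey covers, here is a minimal compass-search sketch (my own illustration of the classic pattern, not the paper's unifying framework): poll the coordinate directions, and contract the step whenever no poll point improves the incumbent.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=100000):
    """Compass (coordinate) direct search: derivative-free minimization.

    Polls the 2n axis directions; halves the step size whenever no
    poll point improves on the incumbent point."""
    x, fx = list(x0), f(x0)
    while step >= tol and max_iter > 0:
        max_iter -= 1
        improved = False
        for i in range(len(x)):
            for sign in (1.0, -1.0):
                trial = list(x)
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5  # contract the stencil; the convergence theory hinges on this

    return x, fx

# Minimize a smooth convex quadratic using no derivative information at all.
xmin, fmin = compass_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                            [0.0, 0.0])
```

The step-contraction rule is what distinguishes this from ad hoc search: the step size shrinks only after a full unsuccessful poll, which is the hook for the convergence results the survey describes.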
Inverse Kinematics Positioning Using Nonlinear Programming for Highly Articulated Figures
ACM Transactions on Graphics, 1994
Cited by 101 (9 self)
An articulated figure is often modeled as a set of rigid segments connected with joints. Its configuration can be altered by varying the joint angles. Although it is straightforward to compute figure configurations given joint angles (forward kinematics), it is not so to find the joint angles for a desired configuration (inverse kinematics). Since the inverse kinematics problem is of special importance to an animator wishing to set a figure to a posture satisfying a set of positioning constraints, researchers have proposed many approaches. But when we try to follow these approaches in an interactive animation system where the object to operate on is as highly articulated as a realistic human figure, they fail in either generality or performance, and so a new approach is called for. Our approach is based on nonlinear programming techniques. It has been used for several years in the spatial constraint system in the Jack™ human figure simulation software developed at the Compute...
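The formulation can be sketched for a planar two-link arm: treat the joint angles as variables and minimize the squared end-effector position error. The sketch below uses plain finite-difference gradient descent as a stand-in for the paper's nonlinear programming solver; the link lengths, starting angles, and step size are illustrative assumptions.

```python
import math

def fk(t1, t2, l1=1.0, l2=1.0):
    """Forward kinematics: joint angles -> end-effector position (planar 2-link arm)."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))

def ik(target, t1=0.3, t2=0.3, lr=0.1, iters=3000, h=1e-6):
    """Inverse kinematics posed as unconstrained minimization of squared error."""
    def cost(a, b):
        x, y = fk(a, b)
        return (x - target[0]) ** 2 + (y - target[1]) ** 2
    for _ in range(iters):
        c = cost(t1, t2)
        g1 = (cost(t1 + h, t2) - c) / h   # finite-difference gradient
        g2 = (cost(t1, t2 + h) - c) / h
        t1, t2 = t1 - lr * g1, t2 - lr * g2
    return t1, t2

t1, t2 = ik((1.2, 0.8))
x, y = fk(t1, t2)   # end effector should land (approximately) on the target
```

Additional positioning constraints would enter as penalty terms or explicit constraints in the same objective, which is where a real NLP solver earns its keep over this toy descent loop.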
On the implementation of an algorithm for large-scale equality constrained optimization
SIAM Journal on Optimization, 1998
Cited by 38 (11 self)
Abstract. This paper describes a software implementation of Byrd and Omojokun’s trust region algorithm for solving nonlinear equality constrained optimization problems. The code is designed for the efficient solution of large problems and provides the user with a variety of linear algebra techniques for solving the subproblems occurring in the algorithm. Second derivative information can be used, but when it is not available, limited memory quasi-Newton approximations are made. The performance of the code is studied using a set of difficult test problems from the CUTE collection.
Robust Process Simulation Using Interval Methods
Comput. Chem. Eng., 1996
Cited by 31 (19 self)
Ideally, for the needs of robust process simulation, one would like a nonlinear equation solving technique that can find any and all roots to a problem, and do so with mathematical certainty. In general, currently used techniques do not provide such rigorous guarantees. One approach to providing such assurances can be found in the use of interval analysis, in particular the use of interval Newton methods combined with generalized bisection. However, these methods have generally been regarded as extremely inefficient. Motivated by recent progress in interval analysis, as well as continuing advances in computer speed and the availability of parallel computing, we consider here the feasibility of using an interval Newton/generalized bisection algorithm on process simulation problems. An algorithm designed for parallel computing on an MIMD machine is described, and results of tests on several problems are reported. Experiments indicate that the interval Newton/generalized bisection method works quite well on relatively small problems, providing a powerful method for finding all solutions to a problem. For larger problems, the method performs inconsistently with regard to efficiency, at least when reasonable initial bounds are not provided.
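The interval Newton/generalized bisection idea can be sketched in one dimension (my own serial illustration, not the authors' parallel MIMD code). The example encloses every root of f(x) = x² - 2 on [-3, 3]: boxes whose interval image excludes zero are discarded with certainty, a Newton contraction is applied where the derivative interval excludes zero, and the box is bisected otherwise.

```python
def imul(a, b):
    """Interval product [a] * [b]."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

def all_roots(lo, hi, tol=1e-10):
    """Enclose all roots of f(x) = x**2 - 2 via interval Newton + bisection."""
    f = lambda x: x * x - 2.0
    F = lambda X: tuple(v - 2.0 for v in imul(X, X))   # interval extension of f
    Fp = lambda X: (2.0 * X[0], 2.0 * X[1])            # interval derivative 2x
    roots, work = [], [(lo, hi)]
    while work:
        X = work.pop()
        FX = F(X)
        if FX[0] > 0.0 or FX[1] < 0.0:
            continue                    # 0 not in F(X): certainly no root in X
        if X[1] - X[0] < tol:
            roots.append(X)             # tight enclosure: accept as a root
            continue
        m = 0.5 * (X[0] + X[1])
        D = Fp(X)
        if D[0] > 0.0 or D[1] < 0.0:    # Newton operator N = m - f(m)/F'(X)
            q = (f(m) / D[0], f(m) / D[1])
            N = (m - max(q), m - min(q))
            a, b = max(X[0], N[0]), min(X[1], N[1])
            if a > b:
                continue                # empty intersection: no root in X
            if b - a < 0.9 * (X[1] - X[0]):
                work.append((a, b))     # good contraction: keep Newton's box
                continue
            X = (a, b)                  # weak contraction: fall through to bisect
            m = 0.5 * (a + b)
        work.append((X[0], m))
        work.append((m, X[1]))
    return roots

enclosures = all_roots(-3.0, 3.0)   # enclosures of -sqrt(2) and +sqrt(2)
```

The "mathematical certainty" the abstract refers to comes from the discard test: when the interval image of f excludes zero, no root can exist in that box, so an exhaustive search of the remaining boxes cannot miss a solution.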
A survey of nonlinear conjugate gradient methods
Pacific Journal of Optimization, 2006
Cited by 26 (3 self)
Abstract. This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.
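For concreteness, a bare-bones Fletcher-Reeves variant (one of the many beta formulas such surveys compare) might look like the sketch below; the simple backtracking search and steepest-descent restart are simplifications of the line-search conditions the global convergence theory actually requires.

```python
def grad(f, x, h=1e-6):
    """Forward-difference gradient (illustrative; real codes use exact gradients)."""
    fx = f(x)
    return [(f([xj + (h if j == i else 0.0) for j, xj in enumerate(x)]) - fx) / h
            for i in range(len(x))]

def ncg(f, x0, iters=300):
    """Nonlinear conjugate gradient with the Fletcher-Reeves beta."""
    x = list(x0)
    g = grad(f, x)
    d = [-gi for gi in g]
    for _ in range(iters):
        a, fx = 1.0, f(x)
        while a > 1e-12 and f([xi + a * di for xi, di in zip(x, d)]) >= fx:
            a *= 0.5                       # backtrack until f decreases
        if a <= 1e-12:
            d = [-gi for gi in g]          # no decrease along d: restart along -grad
            continue
        x = [xi + a * di for xi, di in zip(x, d)]
        g_new = grad(f, x)
        # Fletcher-Reeves: beta = ||g_new||^2 / ||g||^2
        beta = sum(v * v for v in g_new) / max(sum(v * v for v in g), 1e-30)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

xmin = ncg(lambda v: (v[0] - 3.0) ** 2 + 10.0 * (v[1] + 1.0) ** 2, [0.0, 0.0])
```

Swapping the beta formula (Polak-Ribiere, Hestenes-Stiefel, Dai-Yuan, ...) changes only the `beta` line; the global convergence properties given different line searches are exactly what the survey contrasts.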
A Direct Search Algorithm for Optimization With Noisy Function Evaluations
1999
Cited by 13 (0 self)
We consider the unconstrained optimization of a function when each function evaluation is subject to a random noise. We assume that there is some control over the variance of the noise term, in the sense that additional computational effort will reduce the amount of noise. This situation may occur when function evaluations involve simulation or the approximate solution of a numerical problem. It also occurs in an experimental setting when averaging repeated observations at the same point can lead to a better estimate of the underlying function value. We describe a new direct search algorithm for this type of problem. We prove convergence of the new algorithm when the noise is controlled so that the standard deviation of the noise approaches zero faster than the step size. We also report some numerical results on the performance of the new algorithm.
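The variance-control condition can be sketched as follows: average enough replications at each poll point that the noise standard deviation shrinks in proportion to the step size. The noise model, constants, and poll pattern below are illustrative assumptions, not the paper's algorithm.

```python
import random

def noisy_f(x):
    """True objective (minimum at (1, 2)) plus simulated evaluation noise."""
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2 + random.gauss(0.0, 0.1)

def averaged(x, n):
    """Averaging n replications cuts the noise std by a factor sqrt(n)."""
    return sum(noisy_f(x) for _ in range(n)) / n

def noisy_compass(x0, step=1.0, min_step=0.05, max_polls=200):
    x = list(x0)
    polls = 0
    while step > min_step and polls < max_polls:
        polls += 1
        n = int(4.0 / step ** 2)   # averaged-noise std = 0.1/sqrt(n) = 0.05 * step
        fx = averaged(x, n)
        best, fbest = None, fx
        for i in range(2):
            for s in (step, -step):
                trial = list(x)
                trial[i] += s
                ft = averaged(trial, n)
                if ft < fbest:
                    best, fbest = trial, ft
        if best is None:
            step *= 0.5            # shrink only when no poll point wins
        else:
            x = best
    return x

random.seed(0)
xmin = noisy_compass([0.0, 0.0])
```

Because the replication count grows as the step shrinks, the averaged noise vanishes faster than the step size, which mirrors the condition under which the paper proves convergence.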
Asymptotics of multi-bump blow-up self-similar solutions of the nonlinear Schrödinger equation
SIAM J. Appl. Math., 2001
Cited by 10 (3 self)
In this article we construct, both asymptotically and numerically, multi-bump, blow-up, self-similar solutions to the complex Ginzburg-Landau equation in the limit of small dispersion. Through a careful asymptotic analysis, involving a balance of both algebraic and exponential terms, we determine the parameter range over which these solutions may exist. Most intriguingly, we determine a branch of solutions that are not perturbations of solutions to the nonlinear Schrödinger equation; moreover, they are not monotone, yet they are stable. Furthermore, these ring-like solutions exist over a broader parameter regime than the monotone profile.
Nonmonotone Trust Region Methods for Nonlinear Equality Constrained Optimization without a Penalty Function
Math. Program., Ser. B, 2000
Cited by 9 (5 self)
We propose and analyze a class of penalty-function-free nonmonotone trust-region methods for nonlinear equality constrained optimization problems. The algorithmic framework yields global convergence without using a merit function and allows nonmonotonicity independently for both the constraint violation and the value of the Lagrangian function. Similar to the Byrd-Omojokun class of algorithms, each step is composed of a quasi-normal and a tangential step. Both steps are required to satisfy a decrease condition for their respective trust-region subproblems. The proposed mechanism for accepting steps combines nonmonotone decrease conditions on the constraint violation and/or the Lagrangian function, which leads to flexibility and acceptance behavior comparable to filter-based methods. We establish the global convergence of the method. Furthermore, transition to quadratic local convergence is proved. Numerical tests are presented that confirm the robustness and efficiency of the approach.
Practical quasi-Newton methods for solving nonlinear systems
2000
Cited by 7 (2 self)
Practical quasi-Newton methods for solving nonlinear systems are surveyed. The definition of quasi-Newton methods that includes Newton's method as a particular case is adopted. However, special emphasis is given to the methods that satisfy the secant equation at every iteration, which are called here, as usual, secant methods. The least-change secant update (LCSU) theory is revisited and convergence results of methods that do not belong to the LCSU family are discussed. The family of methods reviewed in this survey includes Broyden's methods, structured quasi-Newton methods, methods with direct updates of factorizations, row-scaling methods and column-updating methods. Some implementation features are also discussed. The survey includes a discussion on global convergence tools and linear-system implementations of Broyden's methods. In the final section, practical and theoretical perspectives of this area are discussed.
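As a small illustration of the secant equation B·s = y at work, here is Broyden's rank-one method on a 2x2 system. This is a toy sketch under stated assumptions (dense updates, a finite-difference initial Jacobian, no globalization), not the factorized or large-scale variants the survey covers.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for the step equation B s = -F."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def broyden(F, x0, iters=50, tol=1e-10, h=1e-7):
    """Broyden's 'good' method: rank-one secant updates of an approximate Jacobian."""
    n = len(x0)
    x, Fx = list(x0), F(x0)
    B = [[0.0] * n for _ in range(n)]   # B0: finite-difference Jacobian at x0
    for j in range(n):
        xp = list(x)
        xp[j] += h
        Fp = F(xp)
        for i in range(n):
            B[i][j] = (Fp[i] - Fx[i]) / h
    for _ in range(iters):
        if max(abs(v) for v in Fx) < tol:
            break
        s = solve(B, [-v for v in Fx])
        x_new = [xi + si for xi, si in zip(x, s)]
        F_new = F(x_new)
        y = [a - b for a, b in zip(F_new, Fx)]
        Bs = [sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]
        ss = sum(si * si for si in s)
        for i in range(n):              # secant update: B += (y - B s) s^T / (s^T s)
            for j in range(n):
                B[i][j] += (y[i] - Bs[i]) * s[j] / ss
        x, Fx = x_new, F_new
    return x

# Solve x^2 + y^2 = 4, x = y  (root at x = y = sqrt(2))
root = broyden(lambda v: (v[0] ** 2 + v[1] ** 2 - 4.0, v[0] - v[1]), [1.0, 1.0])
```

The update is the least-change choice in the Frobenius norm that satisfies the new secant equation, which is exactly the LCSU viewpoint the survey revisits.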
Large Scale Unconstrained Optimization
The State of the Art in Numerical Analysis, 1996
Cited by 6 (0 self)
This paper reviews advances in Newton, quasi-Newton and conjugate gradient methods for large scale optimization. It also describes several packages developed during the last ten years, and illustrates their performance on some practical problems. Much attention is given to the concept of partial separability, which is gaining importance with the arrival of automatic differentiation tools and of optimization software that fully exploits its properties.