Results 1–10 of 35
Numerica: a Modeling Language for Global Optimization
1997
"... Introduction Many science and engineering applications require the user to find solutions to systems of nonlinear constraints over real numbers or to optimize a nonlinear function subject to nonlinear constraints. This includes applications such the modeling of chemical engineering processes and of ..."
Abstract

Cited by 170 (11 self)
 Add to MetaCart
Introduction Many science and engineering applications require the user to find solutions to systems of nonlinear constraints over real numbers or to optimize a nonlinear function subject to nonlinear constraints. This includes applications such as the modeling of chemical engineering processes and of electrical circuits, robot kinematics, chemical equilibrium problems, and design problems (e.g., nuclear reactor design). The field of global optimization is the study of methods to find all solutions to systems of nonlinear constraints and all global optima to optimization problems. Nonlinear problems raise many issues from a computational standpoint. On the one hand, deciding if a set of polynomial constraints has a solution is NP-hard. In fact, Canny [Canny, 1988] and Renegar [Renegar, 1988] have shown that the problem is in PSPACE, and it is not known whether the problem lies in NP. Nonlinear programming problems can be so hard that some methods are designed only to solve probl...
Solving Polynomial Systems Using a Branch and Prune Approach
SIAM Journal on Numerical Analysis, 1997
"... This paper presents Newton, a branch & prune algorithm to find all isolated solutions of a system of polynomial constraints. Newton can be characterized as a global search method which uses intervals for numerical correctness and for pruning the search space early. The pruning in Newton consists in ..."
Abstract

Cited by 101 (7 self)
 Add to MetaCart
This paper presents Newton, a branch & prune algorithm to find all isolated solutions of a system of polynomial constraints. Newton can be characterized as a global search method which uses intervals for numerical correctness and for pruning the search space early. The pruning in Newton consists in enforcing, at each node of the search tree, a unique local consistency condition, called box consistency, which approximates the notion of arc consistency well known in artificial intelligence. Box consistency is parametrized by an interval extension of the constraint and can be instantiated to produce the Hansen-Sengupta narrowing operator (used in interval methods) as well as new operators which are more effective when the computation is far from a solution. Newton has been evaluated on a variety of benchmarks from kinematics, chemistry, combustion, economics, and mechanics. On these benchmarks, it outperforms the interval methods we are aware of and compares well with state-of-the-art continuation methods. Limitations of Newton (e.g., a sensitivity to the size of the initial intervals on some problems) are also discussed. Of particular interest is the mathematical and programming simplicity of the method.
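The branch & prune idea the abstract describes can be sketched in a few lines. The following is a minimal one-dimensional illustration of our own (Newton's actual box-consistency pruning is far more sophisticated): it searches for all roots of f(x) = x^2 - 2 in [-10, 10], discarding any subinterval whose interval evaluation of f provably excludes zero, and bisecting the rest.

```python
def isquare(lo, hi):
    """Interval extension of x -> x^2."""
    cands = (lo * lo, hi * hi)
    return (0.0 if lo <= 0.0 <= hi else min(cands), max(cands))

def f_range(lo, hi):
    """Interval extension of f(x) = x^2 - 2."""
    slo, shi = isquare(lo, hi)
    return (slo - 2.0, shi - 2.0)

def branch_and_prune(lo, hi, tol=1e-9):
    """Return small intervals that may contain roots of f."""
    boxes, out = [(lo, hi)], []
    while boxes:
        a, b = boxes.pop()
        rlo, rhi = f_range(a, b)
        if rlo > 0.0 or rhi < 0.0:        # prune: 0 is provably excluded
            continue
        if b - a < tol:                    # small enough: report as candidate
            out.append((a, b))
        else:                              # branch: bisect the box
            m = 0.5 * (a + b)
            boxes += [(a, m), (m, b)]
    return out

roots = branch_and_prune(-10.0, 10.0)
print(roots)   # tiny candidate intervals near -sqrt(2) and +sqrt(2)
```

The pruning test is the whole point: a box is discarded only when interval arithmetic proves it cannot contain a root, so no solution is ever lost.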
A Fortran 90 Environment for Research and Prototyping of Enclosure Algorithms for Nonlinear Equations and Global Optimization
"... An environment for general research into and prototyping of algorithms for reliable constrained and unconstrained global nonlinear optimization and reliable enclosure of all roots of nonlinear systems of equations, with or without inequality constraints, is being developed. This environment should b ..."
Abstract

Cited by 40 (19 self)
 Add to MetaCart
An environment for general research into and prototyping of algorithms for reliable constrained and unconstrained global nonlinear optimization and reliable enclosure of all roots of nonlinear systems of equations, with or without inequality constraints, is being developed. This environment should be portable, easy to learn, use, and maintain, and sufficiently fast for some production work. The motivation, design principles, uses, and capabilities for this environment are outlined. The environment includes an interval data type, a symbolic form of automatic differentiation to obtain an internal representation for functions, a special technique to allow conditional branches with operator overloading and interval computations, and generic routines to give interval and non-interval function and derivative information. Some of these generic routines use a special version of the backward mode of automatic differentiation. The package also includes dynamic data structures for exhaustive sear...
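An interval data type with operator overloading, as the abstract describes for Fortran 90, can be mimicked in a few lines of Python. This is a toy of our own design: a production package would also control the rounding direction of each endpoint operation, which plain floating point does not.

```python
class Interval:
    """A toy interval type; endpoints are NOT outward-rounded."""

    def __init__(self, lo, hi=None):
        self.lo, self.hi = lo, hi if hi is not None else lo

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # Range of products of the four endpoint combinations.
        ps = (self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi)
        return Interval(min(ps), max(ps))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
y = Interval(-1.0, 3.0)
print(x + y, x * y)   # [0.0, 5.0] [-2.0, 6.0]
```

Overloading lets ordinary expression code evaluate over intervals unchanged, which is exactly what makes such an environment convenient for prototyping.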
Robust Process Simulation Using Interval Methods
Comput. Chem. Eng., 1996
"... Ideally, for the needs of robust process simulation, one would like a nonlinear equation solving technique that can find any and all roots to a problem, and do so with mathematical certainty. In general, currently used techniques do not provide such rigorous guarantees. One approach to providing suc ..."
Abstract

Cited by 31 (19 self)
 Add to MetaCart
Ideally, for the needs of robust process simulation, one would like a nonlinear equation solving technique that can find any and all roots to a problem, and do so with mathematical certainty. In general, currently used techniques do not provide such rigorous guarantees. One approach to providing such assurances can be found in the use of interval analysis, in particular the use of interval Newton methods combined with generalized bisection. However, these methods have generally been regarded as extremely inefficient. Motivated by recent progress in interval analysis, as well as continuing advances in computer speed and the availability of parallel computing, we consider here the feasibility of using an interval Newton/generalized bisection algorithm on process simulation problems. An algorithm designed for parallel computing on an MIMD machine is described, and results of tests on several problems are reported. Experiments indicate that the interval Newton/generalized bisection method works quite well on relatively small problems, providing a powerful method for finding all solutions to a problem. For larger problems, the method performs inconsistently with regard to efficiency, at least when reasonable initial bounds are not provided.
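The interval Newton operator at the core of the method above can be illustrated in one dimension (our own scalar sketch; the algorithms discussed in the abstract are multidimensional and combine this step with generalized bisection). For f(x) = x^2 - 2, the step intersects the box X with N(X) = m - f(m)/F'(X), where m is the midpoint and F'(X) encloses the derivative over X:

```python
def newton_step(lo, hi):
    """One interval Newton contraction of [lo, hi] for f(x) = x^2 - 2."""
    m = 0.5 * (lo + hi)
    fm = m * m - 2.0
    # Interval derivative f'(x) = 2x over [lo, hi]:
    dlo, dhi = 2.0 * lo, 2.0 * hi
    if dlo <= 0.0 <= dhi:
        raise ValueError("derivative interval contains 0; bisect instead")
    # N(X) = m - f(m) / F'(X), then intersect with the original box.
    q = sorted((fm / dlo, fm / dhi))
    nlo, nhi = m - q[1], m - q[0]
    return max(lo, nlo), min(hi, nhi)

box = (1.0, 2.0)
for _ in range(4):
    box = newton_step(*box)
print(box)   # a narrow interval around sqrt(2)
```

When the derivative interval contains zero the quotient is not a single interval, which is where the "generalized bisection" part of the algorithm takes over.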
A Review Of Techniques In The Verified Solution Of Constrained Global Optimization Problems
1996
"... Elements and techniques of stateoftheart automatically verified constrained global optimization algorithms are reviewed, including a description of ways of rigorously verifying feasibility for equality constraints and a careful consideration of the role of active inequality constraints. Previousl ..."
Abstract

Cited by 25 (6 self)
 Add to MetaCart
Elements and techniques of state-of-the-art automatically verified constrained global optimization algorithms are reviewed, including a description of ways of rigorously verifying feasibility for equality constraints and a careful consideration of the role of active inequality constraints. Previously developed algorithms and general work on the subject are also listed. Limitations of present knowledge are mentioned, and advice is given on which techniques to use in various contexts. Applications are discussed. 1 INTRODUCTION, BASIC IDEAS AND LITERATURE We consider the constrained global optimization problem: minimize φ(X) subject to c_i(X) = 0, i = 1, …, m, and a_{i_j} ≤ x_{i_j} ≤ b_{i_j}, j = 1, …, q (1.1), where X = (x_1, …, x_n)^T. A general constrained optimization problem, including inequality constraints g(X) ≤ 0, can be put into this form by introducing slack variables s, replacing g(X) ≤ 0 by s + g(X) = 0, and appending the bound constraint 0 ≤ s < ∞; see §2.2. ...
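The slack-variable rewriting can be made concrete on a small instance (an illustration of our own, not taken from the review):

```latex
% Original problem:  minimize (x_1 - 1)^2  subject to  g(X) = x_1^2 - 4 <= 0.
% Introducing a slack variable s and replacing g(X) <= 0 by s + g(X) = 0:
\begin{align*}
  \min_{x_1,\,s} \quad & (x_1 - 1)^2 \\
  \text{subject to} \quad & s + (x_1^2 - 4) = 0, \\
                          & 0 \le s < \infty .
\end{align*}
```

Any feasible point of the original inequality-constrained problem corresponds to a feasible point of the equality-constrained one with s = -g(X) ≥ 0, so the two problems have the same minimizers in x_1.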
A Constraint Satisfaction Approach to a Circuit Design Problem
1998
"... A classical circuitdesign problem from Ebers and Moll [6] features a system of nine nonlinear equations in nine variables that is very challenging both for local and global methods. This system was solved globally using an interval method by Ratschek and Rokne [23] in the box [0; 10] 9 . Their ..."
Abstract

Cited by 21 (1 self)
 Add to MetaCart
A classical circuit-design problem from Ebers and Moll [6] features a system of nine nonlinear equations in nine variables that is very challenging both for local and global methods. This system was solved globally using an interval method by Ratschek and Rokne [23] in the box [0, 10]^9. Their algorithm had enormous costs (i.e., over 14 months using a network of 30 Sun Sparc1 workstations), but they state that "at this time, we know no other method which has been applied to this circuit design problem and which has led to the same guaranteed result of locating exactly one solution in this huge domain, completed with a reliable error estimate." The present paper gives a novel branch-and-prune algorithm that obtains a unique safe box for the above system within reasonable computation times. The algorithm combines traditional interval techniques with an adaptation of discrete constraint-satisfaction techniques to continuous problems. Of particular interest is the simplicity o...
Decomposition of Arithmetic Expressions to Improve the Behavior of Interval Iteration for Nonlinear Systems
1991
"... Interval iteration can be used, in conjunction with other techniques, for rigorously bounding all solutions to a nonlinear system of equations within a given region, or for verifying approximate solutions. However, because of overestimation which occurs when the interval Jacobian matrix is accumul ..."
Abstract

Cited by 20 (9 self)
 Add to MetaCart
Interval iteration can be used, in conjunction with other techniques, for rigorously bounding all solutions to a nonlinear system of equations within a given region, or for verifying approximate solutions. However, because of the overestimation which occurs when the interval Jacobian matrix is accumulated and applied, straightforward linearization of the original nonlinear system sometimes leads to nonconvergent iteration. In this paper, we examine interval iterations based on an expanded system obtained from the intermediate quantities in the original system. In this system, there is no overestimation in the entries of the interval Jacobian matrix, and nonlinearities can be taken into account to obtain sharp bounds. We present an example in detail, algorithms, and detailed experimental results obtained from applying our algorithms to the example.
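The overestimation this paper targets is the classical dependency problem of interval arithmetic, easy to see in one variable (a toy example of our own): two syntactic forms of the same function give different interval enclosures, and both are wider than the true range.

```python
def sub(a, b):
    return (a[0] - b[1], a[1] - b[0])

def mul(a, b):
    ps = (a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1])
    return (min(ps), max(ps))

X = (0.0, 1.0)
# Naive form x - x^2 treats the two occurrences of x as independent:
naive = sub(X, mul(X, X))               # (-1.0, 1.0): wide overestimate
# Factored form x*(1 - x) correlates them better, but still overestimates:
factored = mul(X, sub((1.0, 1.0), X))   # (0.0, 1.0)
print(naive, factored)  # true range of x - x^2 on [0, 1] is [0, 0.25]
```

Decomposing the system into intermediate quantities, as the paper proposes, makes every Jacobian entry depend on a single occurrence of each variable, eliminating exactly this kind of widening.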
Empirical Evaluation Of Innovations In Interval Branch And Bound Algorithms For Nonlinear Systems
SIAM J. Sci. Comput., 1994
"... . Interval branch and bound algorithms for finding all roots use a combination of a computational existence / uniqueness procedure and a tesselation process (generalized bisection). Such algorithms identify, with mathematical rigor, a set of boxes that contains unique roots and a second set within w ..."
Abstract

Cited by 18 (10 self)
 Add to MetaCart
Interval branch and bound algorithms for finding all roots use a combination of a computational existence/uniqueness procedure and a tessellation process (generalized bisection). Such algorithms identify, with mathematical rigor, a set of boxes that contains unique roots and a second set within which all remaining roots must lie. Though each root is contained in a box in one of the sets, the second set may have several boxes in clusters near a single root. Thus, the output is of higher quality if there are relatively more boxes in the first set. In contrast to previously implemented similar techniques, a box expansion technique in this paper, based on using an approximate root finder, ε-inflation, and exact set complementation, decreases the size of the second set, increases the size of the first set, and never loses roots. In addition to the expansion technique, the use of second-order extensions to eliminate small boxes that do not contain roots, and interval slopes versus interval d...
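The expansion idea can be sketched schematically (our own simplification): inflate an approximate root into a small box, then verify that one interval Newton step maps the box strictly into its own interior, which certifies a unique root inside. Here the test problem is f(x) = x^2 - 3.

```python
def inflate(x, eps=1e-3):
    """epsilon-inflation: a small box around an approximate root x."""
    w = eps * max(1.0, abs(x))
    return (x - w, x + w)

def newton_contracts(lo, hi):
    """True if one interval Newton step maps [lo, hi] into its interior."""
    m = 0.5 * (lo + hi)
    fm = m * m - 3.0
    dlo, dhi = 2.0 * lo, 2.0 * hi          # f'(x) = 2x over [lo, hi]
    if dlo <= 0.0 <= dhi:
        return False                        # cannot certify; bisect instead
    q = sorted((fm / dlo, fm / dhi))
    nlo, nhi = m - q[1], m - q[0]
    return lo < nlo and nhi < hi            # strict inclusion => unique root

approx = 1.7320508                          # from any approximate point solver
box = inflate(approx)
print(newton_contracts(*box))               # True: verified unique root in box
```

Boxes certified this way move from the "unresolved" set to the "unique root" set, which is precisely how the expansion technique improves output quality without losing roots.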
GLOPT  A Program for Constrained Global Optimization
Developments in Global Optimization, 1996
"... . GLOPT is a Fortran77 program for global minimization of a blockseparable objective function subject to bound constraints and blockseparable constraints. It finds a nearly globally optimal point that is near a true local minimizer. Unless there are several local minimizers that are nearly global, ..."
Abstract

Cited by 15 (7 self)
 Add to MetaCart
GLOPT is a Fortran 77 program for global minimization of a block-separable objective function subject to bound constraints and block-separable constraints. It finds a nearly globally optimal point that is near a true local minimizer. Unless there are several local minimizers that are nearly global, we thus find a good approximation to the global minimizer. GLOPT uses a branch and bound technique to split the problem recursively into subproblems that are either eliminated or reduced in size. This is done by extensive use of the block-separable structure of the optimization problem. In this paper we discuss a new reduction technique for boxes and new ways of generating feasible points of constrained nonlinear programs. These are implemented as the first stage of our GLOPT project. The current implementation of GLOPT uses neither derivatives nor simultaneous information about several constraints. Numerical results are already encouraging. Work on an extension using curvature inf...
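The eliminate-or-reduce loop of such a branch and bound method can be sketched in its simplest form (our own bare-bones version; GLOPT's block-separable machinery and reduction steps are not modeled). A box is discarded once a rigorous lower bound on the objective over the box exceeds the best function value found so far:

```python
def f(x):
    """Objective: f(x) = (x - 1)^2 + 0.5, minimized on [-4, 4]."""
    return (x - 1.0) ** 2 + 0.5

def f_lower(lo, hi):
    """A valid lower bound for f on [lo, hi] (exact for this simple f)."""
    if lo <= 1.0 <= hi:
        return 0.5
    d = min(abs(lo - 1.0), abs(hi - 1.0))
    return d * d + 0.5

def branch_and_bound(lo, hi, tol=1e-8):
    best = min(f(lo), f(hi))                 # incumbent from sample points
    boxes = [(lo, hi)]
    while boxes:
        a, b = boxes.pop()
        if f_lower(a, b) > best:             # eliminate: cannot beat incumbent
            continue
        m = 0.5 * (a + b)
        best = min(best, f(m))               # improve incumbent at midpoint
        if b - a > tol:                      # branch: split the box
            boxes += [(a, m), (m, b)]
    return best

print(branch_and_bound(-4.0, 4.0))           # close to the true minimum 0.5
```

The quality of `f_lower` drives everything: the tighter the bound, the earlier whole subproblems are eliminated, which is why GLOPT invests in reduction techniques that exploit problem structure.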
An Interval Branch and Bound Algorithm for Bound Constrained Optimization Problems
Journal of Global Optimization, 1992
"... In this paper, we propose modifications to a prototypical branch and bound algorithm for nonlinear optimization so that the algorithm efficiently handles constrained problems with constant bound constraints. The modifications involve treating subregions of the boundary identically to interior region ..."
Abstract

Cited by 13 (5 self)
 Add to MetaCart
In this paper, we propose modifications to a prototypical branch and bound algorithm for nonlinear optimization so that the algorithm efficiently handles constrained problems with constant bound constraints. The modifications involve treating subregions of the boundary identically to interior regions during the branch and bound process, but using reduced gradients for the interval Newton method. The modifications also involve preconditioners for the interval Gauss-Seidel method which are optimal in the sense that their application selectively gives a coordinate bound of minimum width, a coordinate bound whose left endpoint is as large as possible, or a coordinate bound whose right endpoint is as small as possible. We give experimental results on a selection of problems with different properties.
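An interval Gauss-Seidel sweep, the operation those preconditioners accelerate, can be shown on a 2x2 interval linear system A x = b (an illustration of our own, without any preconditioning; the division assumes zero is not in the diagonal entries). Each coordinate bound is updated and intersected with its previous value:

```python
def isub(a, b):
    return (a[0] - b[1], a[1] - b[0])

def imul(a, b):
    ps = (a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1])
    return (min(ps), max(ps))

def idiv(a, b):
    assert not (b[0] <= 0.0 <= b[1]), "0 in divisor interval"
    qs = (a[0]/b[0], a[0]/b[1], a[1]/b[0], a[1]/b[1])
    return (min(qs), max(qs))

def isect(a, b):
    return (max(a[0], b[0]), min(a[1], b[1]))

# Interval system with point solution near (1, 1):
A = [[(3.9, 4.1), (-1.1, -0.9)],
     [(-1.1, -0.9), (3.9, 4.1)]]
b = [(2.9, 3.1), (2.9, 3.1)]
x = [(-10.0, 10.0), (-10.0, 10.0)]          # initial enclosure

for i in range(2):                          # one Gauss-Seidel sweep
    acc = b[i]
    for j in range(2):
        if j != i:
            acc = isub(acc, imul(A[i][j], x[j]))
    x[i] = isect(x[i], idiv(acc, A[i][i]))

print(x)                                    # tighter enclosure of the solution
```

Premultiplying the system by a well-chosen point matrix before the sweep is what the paper's optimal preconditioners do; the sweep itself is unchanged.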