Results 1–10 of 11
A Review of Preconditioners for the Interval Gauss-Seidel Method
, 1991
Abstract

Cited by 50 (16 self)
Interval Newton methods in conjunction with generalized bisection can form the basis of algorithms that find all real roots within a specified box X ⊆ R^n of a system of nonlinear equations F(X) = 0 with mathematical certainty, even in finite-precision arithmetic. In such methods, the system F(X) = 0 is transformed into a linear interval system 0 = F(M) + F'(X)(X̃ - M); if interval arithmetic is then used to bound the solutions of this system, the resulting box X̃ contains all roots of the nonlinear system. We may use the interval Gauss-Seidel method to find these solution bounds. In order to increase the overall efficiency of the interval Newton / generalized bisection algorithm, the linear interval system is multiplied by a preconditioner matrix Y before the interval Gauss-Seidel method is applied. Here, we review results we have obtained over the past few years concerning computation of such preconditioners. We emphasize importance and connecting relationships, ...
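The interval Gauss-Seidel step the abstract describes can be sketched in a few lines. The tuple-based interval arithmetic and the 2x2 test system below are illustrative assumptions, not the authors' code; a production version would also round endpoints outward and handle denominators containing zero by splitting the box.

```python
# Minimal interval arithmetic; intervals are (lo, hi) tuples.
def iadd(a, b): return (a[0] + b[0], a[1] + b[1])
def isub(a, b): return (a[0] - b[1], a[1] - b[0])
def imul(a, b):
    ps = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(ps), max(ps))
def idiv(a, b):
    assert b[0] > 0 or b[1] < 0, "denominator interval must not contain 0"
    return imul(a, (1.0 / b[1], 1.0 / b[0]))
def iintersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    assert lo <= hi, "empty intersection: no root in the box"
    return (lo, hi)

def gauss_seidel_step(A, b, x):
    """One interval Gauss-Seidel sweep on A x = b, tightening the box x in place."""
    n = len(x)
    for i in range(n):
        s = b[i]
        for j in range(n):
            if j != i:
                s = isub(s, imul(A[i][j], x[j]))
        x[i] = iintersect(x[i], idiv(s, A[i][i]))
    return x

# A small made-up test system: one sweep shrinks the box substantially.
A = [[(2.0, 3.0), (0.0, 1.0)],
     [(0.0, 1.0), (2.0, 3.0)]]
b = [(0.0, 2.0), (0.0, 2.0)]
box = gauss_seidel_step(A, b, [(-10.0, 10.0), (-10.0, 10.0)])
# box shrinks from width 20 per coordinate to [(-5.0, 6.0), (-3.0, 3.5)]
```

In a full interval Newton method, A and b would come from the preconditioned linearization (Y times the interval Jacobian and residual), and the tightened box feeds the next iteration or a bisection step.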
A Fortran 90 Environment for Research and Prototyping of Enclosure Algorithms for Nonlinear Equations and Global Optimization
Abstract

Cited by 40 (19 self)
An environment for general research into and prototyping of algorithms for reliable constrained and unconstrained global nonlinear optimization and reliable enclosure of all roots of nonlinear systems of equations, with or without inequality constraints, is being developed. This environment should be portable, easy to learn, use, and maintain, and sufficiently fast for some production work. The motivation, design principles, uses, and capabilities for this environment are outlined. The environment includes an interval data type, a symbolic form of automatic differentiation to obtain an internal representation for functions, a special technique to allow conditional branches with operator overloading and interval computations, and generic routines to give interval and non-interval function and derivative information. Some of these generic routines use a special version of the backward mode of automatic differentiation. The package also includes dynamic data structures for exhaustive sear...
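The interval data type with operator overloading that the environment provides can be illustrated (in Python rather than Fortran 90, and purely as a sketch of the idea, not the package's actual interface):

```python
class Interval:
    """Closed interval [lo, hi]; arithmetic via operator overloading."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        ps = (self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi)
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# The same code that evaluates f over numbers now encloses its range:
def f(x, c):
    return x * x + c

box = f(Interval(-1.0, 1.0), Interval(2.0, 2.0))
# naive enclosure [1.0, 3.0], which contains the true range [2, 3]
```

Because the overloaded operators have the same syntax as ordinary arithmetic, existing expression code yields either point values or enclosures unchanged; a real implementation would also round endpoints outward at every operation.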
Decomposition of Arithmetic Expressions to Improve the Behavior of Interval Iteration for Nonlinear Systems
, 1991
Abstract

Cited by 20 (9 self)
Interval iteration can be used, in conjunction with other techniques, for rigorously bounding all solutions to a nonlinear system of equations within a given region, or for verifying approximate solutions. However, because of overestimation which occurs when the interval Jacobian matrix is accumulated and applied, straightforward linearization of the original nonlinear system sometimes leads to nonconvergent iteration. In this paper, we examine interval iterations based on an expanded system obtained from the intermediate quantities in the original system. In this system, there is no overestimation in entries of the interval Jacobian matrix, and nonlinearities can be taken into account to obtain sharp bounds. We present an example in detail, algorithms, and detailed experimental results obtained from applying our algorithms to the example.
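The overestimation referred to here is the classic interval dependency problem: each occurrence of a variable is treated as if it varied independently. A toy demonstration (not from the paper) of how an algebraically equivalent rearrangement removes the overestimation:

```python
# Minimal interval operations on (lo, hi) tuples.
def imul(a, b):
    ps = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(ps), max(ps))
def isub(a, b):
    return (a[0] - b[1], a[1] - b[0])

x = (1.0, 2.0)
# Naive evaluation of f(x) = x*x - 2x treats the occurrences of x independently:
naive = isub(imul(x, x), imul((2.0, 2.0), x))      # -> (-3.0, 2.0)
# The equivalent form (x - 1)^2 - 1 uses x only once, giving the exact range:
t = isub(x, (1.0, 1.0))                            # (0.0, 1.0)
sharp = isub(imul(t, t), (1.0, 1.0))               # -> (-1.0, 0.0)
```

The paper's expanded system generalizes this idea: introducing the intermediate quantities as additional variables lets the iteration account for such dependencies systematically rather than relying on manual rewriting.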
An Interval Branch and Bound Algorithm for Bound Constrained Optimization Problems
 JOURNAL OF GLOBAL OPTIMIZATION
, 1992
Abstract

Cited by 13 (5 self)
In this paper, we propose modifications to a prototypical branch and bound algorithm for nonlinear optimization so that the algorithm efficiently handles constrained problems with constant bound constraints. The modifications involve treating subregions of the boundary identically to interior regions during the branch and bound process, but using reduced gradients for the interval Newton method. The modifications also involve preconditioners for the interval Gauss-Seidel method which are optimal in the sense that their application selectively gives a coordinate bound of minimum width, a coordinate bound whose left endpoint is as large as possible, or a coordinate bound whose right endpoint is as small as possible. We give experimental results on a selection of problems with different properties.
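A generic interval branch-and-bound loop of the kind this algorithm refines can be sketched as follows. The one-dimensional objective, the exact range enclosure `f_enc`, and the simple bisection rule are assumptions for illustration only; the paper's method works on boxes, with interval Newton steps and the optimal preconditioners described above.

```python
# Sketch of interval branch-and-bound for minimizing f on [lo, hi].
# f_enc(a, b) must return a guaranteed enclosure (flo, fhi) of f's range on [a, b].
def branch_and_bound(f, f_enc, lo, hi, tol=1e-6):
    best = min(f(lo), f(hi))          # rigorous upper bound on the minimum
    work = [(lo, hi)]
    result = []
    while work:
        a, b = work.pop()
        flo, _ = f_enc(a, b)
        if flo > best:                # box cannot contain the global minimum
            continue
        m = 0.5 * (a + b)
        best = min(best, f(m))        # midpoint test sharpens the upper bound
        if b - a < tol:
            result.append((a, b))     # small enough: keep as a candidate
        else:
            work.append((a, m))       # otherwise bisect
            work.append((m, b))
    # discard candidates eliminated by the final upper bound
    result = [bx for bx in result if f_enc(*bx)[0] <= best]
    return best, result

# Illustrative objective with an exact range enclosure (possible here
# because f is unimodal with known minimizer 0.5):
def f(x):
    return (x - 0.5) ** 2

def f_enc(a, b):
    ca, cb = (a - 0.5) ** 2, (b - 0.5) ** 2
    return (0.0 if a <= 0.5 <= b else min(ca, cb)), max(ca, cb)

best, boxes = branch_and_bound(f, f_enc, 0.0, 2.0)
# best == 0.0; every surviving box contains the minimizer 0.5
```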
Test Results for an Interval Branch and Bound Algorithm for Equality-Constrained Optimization
 In: Computational Methods and Applications, Kluwer
, 1995
Abstract

Cited by 8 (1 self)
Various techniques have been proposed for incorporating constraints in interval branch and bound algorithms for global optimization. However, few reports of practical experience with these techniques have appeared to date. Such experimental results appear here. The underlying implementation includes use of an approximate optimizer combined with a careful tessellation process and rigorous verification of feasibility. The experiments include comparison of methods of handling bound constraints and comparison of two methods for normalizing Lagrange multipliers. Selected test problems from the Floudas / Pardalos monograph are used, as well as selected unconstrained test problems appearing in reports of interval branch and bound methods for unconstrained global optimization.

Keywords: constrained global optimization, verified computations, interval computations, bound constraints, experimental results

1. Introduction. We consider the constrained global optimization problem: minimize φ(X) s...
Global Optimization of Nonconvex Nonlinear Programs Using Parallel Branch and Bound
, 1995
Abstract

Cited by 8 (0 self)
A branch and bound algorithm for computing globally optimal solutions to nonconvex nonlinear programs in continuous variables is presented. The algorithm is directly suitable for a wide class of problems arising in chemical engineering design. It can solve problems defined using algebraic functions and twice differentiable transcendental functions, in which finite upper and lower bounds can be placed on each variable. The algorithm uses rectangular partitions of the variable domain and a new bounding program based on convex/concave envelopes and positive definite combinations of quadratic terms. The algorithm is deterministic and obtains convergence with final regions of finite size. The partitioning strategy uses a sensitivity analysis of the bounding program to predict the best variable to split and the split location. Two versions of the algorithm are considered, the first using a local NLP algorithm (MINOS) and the second using a sequence of lower bounding programs in the search fo...
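The convex/concave envelopes used in the bounding program can be illustrated for a single bilinear term x·y. The McCormick relaxation below is the standard envelope for that term, shown only as a sketch and not as the paper's full bounding program:

```python
def mccormick(x, y, xl, xu, yl, yu):
    """Convex under- and concave over-estimate of the bilinear term x*y
    at the point (x, y), given bounds x in [xl, xu] and y in [yl, yu]."""
    under = max(xl * y + yl * x - xl * yl, xu * y + yu * x - xu * yu)
    over = min(xu * y + yl * x - xu * yl, xl * y + yu * x - xl * yu)
    return under, over

# On the unit box the envelope brackets the true product and is exact at corners:
u, o = mccormick(0.5, 0.5, 0.0, 1.0, 0.0, 1.0)   # u <= 0.25 <= o
corner = mccormick(1.0, 1.0, 0.0, 1.0, 0.0, 1.0)  # both estimates equal 1.0
```

Replacing each bilinear term by these linear over- and under-estimators turns a nonconvex program into a convex bounding program whose optimum is a valid lower bound on each partition element.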
An Improved Unconstrained Global Optimization Algorithm
, 1996
Abstract

Cited by 6 (0 self)
Global optimization is a very hard problem, especially when the number of variables is large (greater than several hundred). Recently, some methods, including simulated annealing, branch and bound, and interval Newton's method, have made it possible to solve global optimization problems with several hundred variables. However, this is a small number of variables when one considers that integer programming can tackle problems with thousands of variables, and linear programming is able to solve problems with millions of variables. The goal of this research is to examine the present state of the art for algorithms to solve the unconstrained global optimization problem (GOP) and then to suggest some new approaches that allow problems of a larger size to be solved with an equivalent amount of computer time. The algorithm is implemented in portable C++, and the software will be released for general use. This new algorithm is given with some theoretical results under which the algorit...
Rigorous Computation of Surface Patch Intersection Curves
, 1993
Abstract

Cited by 3 (2 self)
A rigorous and efficient algorithm is presented for computing a sequence of points on all the branches of surface patch intersection curves within a given box. In the algorithm, an interval step control continuation method makes certain that the predictor algorithm will not jump from one branch to another. These reliability properties are independent of any choice of tuning parameters. Both a 3-dimensional box complement method and a containment checking method are able to guarantee that all branches are located. Initial experimental results show that, even with this reliability, the amount of computation is orders of magnitude less than a uniform tessellation of the three-dimensional viewing box.

Keywords: computational geometry, marching method, continuation method, surface patch intersections, interval computations.

1. Introduction and Notation. The goal of this paper is to present general algorithms for computing all surface / surface intersection curves that are mathematically ...
Fast Interval Branch-and-Bound Methods for Unconstrained Global Optimization with Affine Arithmetic
, 1997
Abstract

Cited by 3 (0 self)
 Add to MetaCart
We show that faster solutions to unconstrained global optimization problems can be obtained by combining previous acceleration techniques for interval branch-and-bound methods with affine arithmetic, a recent alternative to interval arithmetic that often provides tighter estimates. We support this claim by solving a few well-known problems.
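The tighter estimates affine arithmetic provides come from shared noise symbols that track correlations between quantities. The minimal sketch below (an assumption for illustration, covering only subtraction) shows how x - x collapses to exactly zero, where plain interval arithmetic would widen it:

```python
# An affine form is (center, {noise_id: coeff}); each noise symbol eps_i
# ranges over [-1, 1] and is shared between correlated quantities.
def from_interval(lo, hi, noise_id):
    return (0.5 * (lo + hi), {noise_id: 0.5 * (hi - lo)})

def aff_sub(a, b):
    coeffs = dict(a[1])
    for k, v in b[1].items():
        coeffs[k] = coeffs.get(k, 0.0) - v
    return (a[0] - b[0], coeffs)

def to_interval(a):
    radius = sum(abs(v) for v in a[1].values())
    return (a[0] - radius, a[0] + radius)

x = from_interval(1.0, 2.0, 0)
y = from_interval(1.0, 2.0, 1)   # independent quantity, its own noise symbol
# Plain interval arithmetic gives [1,2] - [1,2] = [-1, 1] in both cases.
# Affine arithmetic: x - x cancels exactly; x - y, being uncorrelated, does not.
```

Nonlinear operations such as multiplication introduce a fresh noise symbol to bound the approximation error, which is why affine forms stay tight on mildly nonlinear expressions but are not free of overestimation.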
Computational Complexity and Feasibility of Fuzzy Data Processing: Why Fuzzy Numbers, Which Fuzzy Numbers, Which Operations with Fuzzy Numbers
 Proceedings of the International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU'98)
, 1997
Abstract

Cited by 1 (1 self)
In many real-life situations, we cannot directly measure or estimate the desired quantity r. In these situations, we measure or estimate other quantities r_1, ..., r_n related to r, and then reconstruct r from the estimates for r_i. This reconstruction is called data processing. Often, we only have fuzzy information about r_i. In such cases, we have fuzzy data processing. Fuzzy data means that instead of a single number r_i, we have several numbers that describe the fuzzy knowledge about the corresponding quantity. Since we need to process more numbers, the computation time for fuzzy data processing is often much larger than for the usual non-fuzzy one. It is, therefore, desirable to select representations and processing algorithms that minimize this increase and thus make fuzzy data processing feasible. In this paper, we show that the necessity to minimize computation time explains why we use fuzzy numbers, and we describe which operations we should use.

1. Formulation of th...
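The link between fuzzy data processing and interval computations can be sketched via alpha-cuts: a fuzzy number is a nested family of intervals, and processing it level by level reduces to ordinary interval arithmetic. The triangular representation and the three levels below are illustrative assumptions, not the paper's recommended encoding.

```python
# A triangular fuzzy number (a, m, b) has, at membership level alpha,
# the alpha-cut interval [a + alpha*(m - a), b - alpha*(b - m)].
def alpha_cut(tri, alpha):
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzy_add(t1, t2, levels=(0.0, 0.5, 1.0)):
    """Add two fuzzy numbers level-wise via interval addition on alpha-cuts."""
    out = []
    for alpha in levels:
        (l1, u1), (l2, u2) = alpha_cut(t1, alpha), alpha_cut(t2, alpha)
        out.append((alpha, (l1 + l2, u1 + u2)))
    return out

result = fuzzy_add((0, 1, 2), (1, 2, 3))
# level 0.0 cut: (1.0, 5.0); level 1.0 cut: (3.0, 3.0)
```

The number of levels retained is exactly the trade-off the abstract discusses: more alpha-cuts describe the fuzzy knowledge more finely but multiply the computation time of each operation.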