Results 1 – 10 of 51
A Review of Preconditioners for the Interval Gauss–Seidel Method
, 1991
Abstract
Cited by 51 (16 self)
Interval Newton methods in conjunction with generalized bisection can form the basis of algorithms that find all real roots within a specified box X ⊆ ℝⁿ of a system of nonlinear equations F(X) = 0 with mathematical certainty, even in finite-precision arithmetic. In such methods, the system F(X) = 0 is transformed into a linear interval system 0 = F(M) + F′(X)(X̃ − M); if interval arithmetic is then used to bound the solutions of this system, the resulting box X̃ contains all roots of the nonlinear system. We may use the interval Gauss–Seidel method to find these solution bounds. In order to increase the overall efficiency of the interval Newton / generalized bisection algorithm, the linear interval system is multiplied by a preconditioner matrix Y before the interval Gauss–Seidel method is applied. Here, we review results we have obtained over the past few years concerning the computation of such preconditioners. We emphasize importance and connecting relationships, ...
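The contraction step this abstract describes can be sketched in one dimension (a simplification of the full method: naive interval arithmetic, no outward rounding, and a positive derivative interval are assumed; the function and box are illustrative, not from the paper):

```python
# Hypothetical 1-D interval Newton sketch: N(X) = m - f(m)/F'(X), intersected
# with X, encloses every root of f in X and shrinks rapidly near a simple root.

def inewton_step(lo, hi, f, dlo, dhi):
    """One interval Newton step; [dlo, dhi] bounds f' over [lo, hi] (both > 0 here)."""
    m = 0.5 * (lo + hi)
    fm = f(m)
    # divide the point value f(m) by the derivative interval [dlo, dhi]
    q = sorted([fm / dlo, fm / dhi])
    nlo, nhi = m - q[1], m - q[0]
    return max(lo, nlo), min(hi, nhi)      # intersect N(X) with X

f = lambda x: x * x - 2.0                  # roots at +/- sqrt(2)
lo, hi = 1.0, 2.0
for _ in range(3):
    # f'(x) = 2x, so [2*lo, 2*hi] encloses f' on the current box
    lo, hi = inewton_step(lo, hi, f, 2.0 * lo, 2.0 * hi)
```

Three steps narrow [1, 2] to a box of width below 1e-6 that still contains √2, illustrating the "mathematical certainty" claim: the root never escapes the enclosure.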
Detecting global optimality and extracting solutions in GloptiPoly
 Chapter in D. Henrion, A. Garulli (Editors). Positive polynomials in control. Lecture Notes in Control and Information Sciences
, 2005
Abstract
Cited by 47 (10 self)
GloptiPoly is a Matlab/SeDuMi add-on to build and solve convex linear matrix inequality (LMI) relaxations of nonconvex optimization problems with multivariate polynomial objective function and constraints, based on the theory of moments. In contrast with the dual sum-of-squares decompositions of positive polynomials, the theory of moments makes it possible to detect global optimality of an LMI relaxation and extract globally optimal solutions. In this report, we describe and illustrate the numerical linear algebra algorithm implemented in GloptiPoly for detecting global optimality and extracting solutions. We also mention some related heuristics that could be useful to reduce the number of variables in the LMI relaxations.
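A toy illustration of the moment-based optimality test (this is not GloptiPoly's code; the single-atom measure and tolerance are invented for the example): when the optimizing measure is a single Dirac atom at x*, the Hankel moment matrix has rank one and the atom can be read off from the first two moments.

```python
# Moments of a Dirac measure at x* = 2: y_k = (x*)^k.
x_true = 2.0
y = [x_true ** k for k in range(5)]
M = [[y[i + j] for j in range(3)] for i in range(3)]   # Hankel moment matrix

# Every 2x2 minor of a rank-one matrix vanishes, certifying a "flat"
# moment matrix -- the rank condition used to detect global optimality.
minors = [M[i][j] * M[k][l] - M[i][l] * M[k][j]
          for i in range(3) for k in range(i + 1, 3)
          for j in range(3) for l in range(j + 1, 3)]
flat = all(abs(m) < 1e-9 for m in minors)

x_star = y[1] / y[0]   # extract the atom (the globally optimal solution)
```

In the general multivariate case GloptiPoly replaces this ratio by a numerical linear algebra procedure on the moment matrix, but the rank/flatness idea is the same.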
"Solving Systems of Nonlinear Equations Using the Nonzero Value of the Topological Degree"; "CHABIS: A Mathematical Software Package for Locating and Evaluating Roots of Systems of Nonlinear Equations"
 ACM Trans. Math. Software
, 1988
Abstract
Cited by 40 (21 self)
Two algorithms are described here for the numerical solution of a system of nonlinear equations F(X) = 0, where 0 = (0, 0, ..., 0) ∈ ℝⁿ, and F is a given continuous mapping of a region D in ℝⁿ into ℝⁿ. The first algorithm locates at least one root of the system within an n-dimensional polyhedron, using the nonzero value of the topological degree of F at 0 relative to the polyhedron; the second algorithm applies a new generalized bisection method in order to compute an approximate solution of the system. The size of the original n-dimensional polyhedron is arbitrary, and the method is globally convergent in a residual sense. These algorithms, in the various function evaluations, only make use of the algebraic sign of F and do not require computations of the topological degree. Moreover, they can be applied to nondifferentiable continuous functions F and do not involve derivatives of F or approximations of such derivatives.
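The one-dimensional analogue of this sign-only strategy is Bolzano bisection, where the nonzero degree condition reduces to a sign change at the endpoints (the function and interval below are illustrative choices, not CHABIS itself):

```python
import math

def sign_bisect(f, a, b, tol=1e-10):
    """Locate a root of continuous f in [a, b]; only the algebraic sign of f
    is used, never derivatives or degree computations."""
    fa = f(a)
    assert fa * f(b) < 0, "the sign (degree) condition must hold on [a, b]"
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:       # sign change lies in the left half
            b = m
        else:                    # sign change lies in the right half
            a, fa = m, f(m)
    return 0.5 * (a + b)

root = sign_bisect(lambda x: math.cos(x) - x, 0.0, 1.0)
```

The paper's generalized bisection replaces the two endpoint signs by sign information on the faces of an n-dimensional polyhedron, but the convergence argument is the same residual-sense one.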
On the Complexity of Isolating Real Roots and Computing with Certainty the Topological Degree
, 2002
Abstract
Cited by 39 (20 self)
In this contribution the isolation of real roots and the computation of the topological degree...
Robust Process Simulation Using Interval Methods
 Comput. Chem. Eng
, 1996
Abstract
Cited by 31 (19 self)
Ideally, for the needs of robust process simulation, one would like a nonlinear equation solving technique that can find any and all roots to a problem, and do so with mathematical certainty. In general, currently used techniques do not provide such rigorous guarantees. One approach to providing such assurances can be found in the use of interval analysis, in particular the use of interval Newton methods combined with generalized bisection. However, these methods have generally been regarded as extremely inefficient. Motivated by recent progress in interval analysis, as well as continuing advances in computer speed and the availability of parallel computing, we consider here the feasibility of using an interval Newton/generalized bisection algorithm on process simulation problems. An algorithm designed for parallel computing on an MIMD machine is described, and results of tests on several problems are reported. Experiments indicate that the interval Newton/generalized bisection method works quite well on relatively small problems, providing a powerful method for finding all solutions to a problem. For larger problems, the method performs inconsistently with regard to efficiency, at least when reasonable initial bounds are not provided.
Reduction Of Constraint Systems
, 1993
Abstract
Cited by 29 (2 self)
Geometric modeling by constraints leads to large systems of algebraic equations. This paper studies the bipartite graphs underlying such systems of equations. It shows how these graphs make it possible to decompose the systems, in polynomial time, into well-constrained, over-constrained, and under-constrained subsystems. The paper also gives an efficient method to decompose well-constrained systems into irreducible ones. These decompositions greatly speed up the resolution of reducible systems and also allow the debugging of constraint systems. Key Words: geometric modeling, constraints, bipartite graphs, matching, maximum matching, perfect matching. 1. INTRODUCTION. Geometric modeling by constraints is an interesting approach in CAD. Typically, in 2D, geometric modeling by constraints specifies geometrical objects such as points, lines, circles, and conics by a set of constraints: distances between points, between points and lines, parallel lines, angles between lines, incidence relations between points and lines, ...
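The structural test behind such decompositions can be sketched as a maximum matching in the equation/variable bipartite graph, here via Kuhn's augmenting-path algorithm (a simplification of the paper's method; the three-equation system is a made-up example):

```python
# Equation -> variables it mentions (hypothetical tiny constraint system).
adj = {
    "e1": ["x", "y"],
    "e2": ["y"],
    "e3": ["y", "z"],
}

match = {}   # variable -> equation currently matched to it

def try_match(eq, seen):
    """Try to match eq to some variable, recursively rerouting earlier matches."""
    for v in adj[eq]:
        if v in seen:
            continue
        seen.add(v)
        if v not in match or try_match(match[v], seen):
            match[v] = eq
            return True
    return False

matched = sum(try_match(e, set()) for e in adj)
# A perfect matching (every equation paired with its own variable) is a
# necessary structural condition for the system to be well constrained.
well_constrained = matched == len(adj)
```

Equations left unmatched indicate an over-constrained subsystem; variables left unmatched indicate an under-constrained one, which is the flagging the abstract describes.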
Global optimization by continuous GRASP
 Optimization Letters
Abstract
Cited by 24 (9 self)
ABSTRACT. We introduce a novel global optimization method called Continuous GRASP (C-GRASP), which extends Feo and Resende's greedy randomized adaptive search procedure (GRASP) from the domain of discrete optimization to that of continuous global optimization. This stochastic local search method is simple to implement, is widely applicable, and does not make use of derivative information, thus making it a well-suited approach for solving global optimization problems. We illustrate the effectiveness of the procedure on a set of standard test problems as well as two hard global optimization problems.
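A minimal sketch in the spirit of C-GRASP (a simplification, not the authors' algorithm: random restarts stand in for the greedy randomized construction, and the test function, bounds, and grid schedule are invented): derivative-free coordinate search on a progressively finer grid.

```python
import random
random.seed(0)

def f(x):                                  # illustrative separable test function
    return sum((xi - 1.0) ** 2 for xi in x)

def local_search(x, h, lo, hi):
    """Move one coordinate at a time by +/- h while that improves f."""
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            for step in (-h, h):
                y = list(x)
                y[i] = min(hi, max(lo, y[i] + step))
                if f(y) < f(x):
                    x, improved = y, True
    return x

best = None
for _ in range(20):                        # randomized multistart phase
    x = [random.uniform(-5, 5) for _ in range(2)]
    h = 1.0
    while h > 1e-4:                        # refine the search grid
        x = local_search(x, h, -5, 5)
        h *= 0.5
    if best is None or f(x) < f(best):
        best = x
```

No gradients are evaluated anywhere, matching the abstract's point that the method's applicability does not depend on derivative information.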
UniCalc, a novel approach to solving systems of algebraic equations
 Proceedings of ALP’96, 5th International Conference on Algebraic and Logic Programming
, 1993
Abstract
Cited by 15 (0 self)
This paper describes a novel approach to solving systems of algebraic equations and inequalities that is based on subdefinite calculations. The use of these methods makes it possible to solve overdetermined and underdetermined systems, as well as systems with imprecise and incomplete data. The approach was implemented with the help of the methods of interval mathematics. The UniCalc solver, also described in this paper, was developed on the basis of this approach. To illustrate the capabilities of UniCalc, we give examples of problems solved with its help.
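The interval-style narrowing behind subdefinite calculations can be illustrated on a single invented constraint (domains and constraint are examples, not from the paper): the equation x + y = 10 is propagated against both domains until a fixed point.

```python
# Subdefinite-value sketch: each variable carries an interval of possible
# values, and the constraint x + y = 10 narrows both until nothing changes.
x = (0.0, 10.0)
y = (3.0, 7.0)

for _ in range(2):                         # two passes reach the fixed point here
    # rewrite the constraint as x = 10 - y and y = 10 - x,
    # intersecting each result with the current domain
    x = (max(x[0], 10 - y[1]), min(x[1], 10 - y[0]))
    y = (max(y[0], 10 - x[1]), min(y[1], 10 - x[0]))
```

The result is x ∈ [3, 7], y ∈ [3, 7]: the system stays underdetermined, but the domains are as tight as the constraint allows, which is exactly the behavior the abstract claims for underdetermined and imprecise data.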
Automatic Generation of Numerical Redundancies for Nonlinear Constraint Solving
 RELIABLE COMPUTING
, 1997
Abstract
Cited by 15 (4 self)
In this paper we present a framework for the cooperation of symbolic and propagation-based numerical solvers over the real numbers. This cooperation is expressed in terms of fixed points of closure operators over a complete lattice of constraint systems. In a second part we instantiate this framework to a particular cooperation scheme, where propagation is associated with pruning operators implementing interval algorithms enclosing the possible solutions of constraint systems, whereas symbolic methods are mainly devoted to generating redundant constraints. It is well known that the addition of carefully chosen redundant constraints drastically improves the performance of systems based on local consistency (e.g. Prolog IV or Newton). We propose here a method which computes sets of redundant polynomials, called partial Gröbner bases, and show on some benchmarks the advantages of such computations. Keywords: numerical constraints, interval constraints, approximate solving, local consistency, ...
Efficient and safe global constraints for handling numerical constraint systems
 SIAM J. NUMER. ANAL
, 2005
Abstract
Cited by 14 (3 self)
Numerical constraint systems are often handled by branch-and-prune algorithms that combine splitting techniques, local consistencies, and interval methods. This paper first recalls the principles of Quad, a global constraint that works on a tight and safe linear relaxation of quadratic subsystems of constraints. Then, it introduces a generalization of Quad to polynomial constraint systems. It also introduces a method to get safe linear relaxations and shows how to compute safe bounds of the variables of the linear constraint system. Different linearization techniques are investigated to limit the number of generated constraints. QuadSolver, a new branch-and-prune algorithm that combines Quad, local consistencies, and interval methods, is introduced. QuadSolver has been evaluated on a variety of benchmarks from kinematics, mechanics, and robotics. On these benchmarks, it outperforms classical interval methods as well as constraint satisfaction problem solvers, and it compares well with state-of-the-art optimization solvers.
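The kind of linear relaxation such a global constraint builds for a single quadratic term can be sketched as follows (a simplification without the outward rounding that makes Quad's relaxations safe; the interval [-1, 2] is an invented example): on x ∈ [a, b], the convex term y = x² is bounded below by its tangent lines and above by the chord through the endpoints.

```python
import random
random.seed(1)

a, b = -1.0, 2.0

def chord(x):
    """Secant overestimator of x^2 on [a, b]: the line through (a, a^2), (b, b^2)."""
    return (a + b) * x - a * b

def tangent(x, x0):
    """Tangent underestimator of x^2 at x0 (valid everywhere, by convexity)."""
    return 2 * x0 * x - x0 * x0

# Empirically verify the enclosure at 1000 random points of [a, b].
ok = all(
    max(tangent(x, a), tangent(x, b)) <= x * x <= chord(x)
    for x in (a + (b - a) * random.random() for _ in range(1000))
)
```

Replacing each quadratic term by such linear under- and overestimators yields a linear system whose solution set provably contains that of the original constraints, which is what lets a linear solver compute safe variable bounds inside the branch-and-prune loop.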