Results 1 - 8 of 8
A Review of Preconditioners for the Interval Gauss-Seidel Method
, 1991
Abstract

Cited by 50 (16 self)
Interval Newton methods in conjunction with generalized bisection can form the basis of algorithms that find all real roots within a specified box X ⊆ R^n of a system of nonlinear equations F(X) = 0 with mathematical certainty, even in finite-precision arithmetic. In such methods, the system F(X) = 0 is transformed into a linear interval system 0 = F(M) + F'(X)(X̃ − M); if interval arithmetic is then used to bound the solutions of this system, the resulting box X̃ contains all roots of the nonlinear system. We may use the interval Gauss-Seidel method to find these solution bounds. In order to increase the overall efficiency of the interval Newton / generalized bisection algorithm, the linear interval system is multiplied by a preconditioner matrix Y before the interval Gauss-Seidel method is applied. Here, we review results we have obtained over the past few years concerning computation of such preconditioners. We emphasize importance and connecting relationships, ...
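The interval Gauss-Seidel step on the linearized system can be sketched in code. The following is an illustrative Python sketch, not code from the paper: all names (iadd, gauss_seidel_sweep, etc.) are ours, intervals are (lo, hi) tuples, and the preconditioner Y is taken to be the identity for simplicity.

```python
# Illustrative sketch of one interval Gauss-Seidel sweep on the linearized
# system A (X - M) = -F(M), where A[i][j] encloses dF_i/dx_j over the box X.
# Names and representation (intervals as (lo, hi) tuples) are our assumption.

def iadd(a, b): return (a[0] + b[0], a[1] + b[1])
def isub(a, b): return (a[0] - b[1], a[1] - b[0])
def imul(a, b):
    p = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(p), max(p))
def idiv(a, b):
    # Division is only defined here when 0 does not lie in the divisor.
    assert b[0] > 0 or b[1] < 0, "divisor interval must not contain 0"
    p = [a[0]/b[0], a[0]/b[1], a[1]/b[0], a[1]/b[1]]
    return (min(p), max(p))
def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    assert lo <= hi, "empty intersection: the box contains no root"
    return (lo, hi)

def gauss_seidel_sweep(A, neg_FM, X, M):
    """One sweep: A is an interval matrix, neg_FM[i] = -F_i(M) (floats),
    X a list of interval coordinate bounds, M the midpoint vector."""
    X = list(X)
    for i in range(len(X)):
        acc = (neg_FM[i], neg_FM[i])        # point interval -F_i(M)
        for j in range(len(X)):
            if j != i:
                acc = isub(acc, imul(A[i][j], isub(X[j], (M[j], M[j]))))
        new_i = iadd((M[i], M[i]), idiv(acc, A[i][i]))
        X[i] = intersect(X[i], new_i)       # tightened coordinate bound
    return X
```

For a diagonal point system (e.g. 2x - 2 = 0 in each coordinate) a single sweep already collapses the box onto the root.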
A Review Of Techniques In The Verified Solution Of Constrained Global Optimization Problems
, 1996
Abstract

Cited by 25 (6 self)
Elements and techniques of state-of-the-art automatically verified constrained global optimization algorithms are reviewed, including a description of ways of rigorously verifying feasibility for equality constraints and a careful consideration of the role of active inequality constraints. Previously developed algorithms and general work on the subject are also listed. Limitations of present knowledge are mentioned, and advice is given on which techniques to use in various contexts. Applications are discussed. 1 INTRODUCTION, BASIC IDEAS AND LITERATURE We consider the constrained global optimization problem

minimize φ(X) subject to c_i(X) = 0, i = 1, ..., m,   (1.1)
a_{i_j} ≤ x_{i_j} ≤ b_{i_j}, j = 1, ..., q,

where X = (x_1, ..., x_n)^T. A general constrained optimization problem, including inequality constraints g(X) ≤ 0, can be put into this form by introducing slack variables s, replacing g(X) ≤ 0 by s + g(X) = 0, and appending the bound constraint 0 ≤ s < ∞; see §2.2. ...
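The slack-variable transformation above can be illustrated with a small sketch. The names g, c, and add_slack are ours, not the paper's; this only demonstrates that a feasible point of g(X) ≤ 0 satisfies the equality g(X) + s = 0 for some slack s ≥ 0.

```python
# Hypothetical sketch: rewriting an inequality constraint g(X) <= 0 as the
# equality residual c(X, s) = g(X) + s with a nonnegative slack s.

def add_slack(g):
    """Return the equality-constraint residual c(X, s) = g(X) + s."""
    def c(x, s):
        return g(x) + s
    return c

g = lambda x: x[0] + x[1] - 1.0        # an example inequality g(X) <= 0
c = add_slack(g)
# At the feasible point X = (0.2, 0.3), g(X) = -0.5, so the slack
# s = -g(X) = 0.5 >= 0 makes the equality residual vanish.
```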
An Interval Branch and Bound Algorithm for Bound Constrained Optimization Problems
 JOURNAL OF GLOBAL OPTIMIZATION
, 1992
Abstract

Cited by 13 (5 self)
In this paper, we propose modifications to a prototypical branch and bound algorithm for nonlinear optimization so that the algorithm efficiently handles constrained problems with constant bound constraints. The modifications involve treating subregions of the boundary identically to interior regions during the branch and bound process, but using reduced gradients for the interval Newton method. The modifications also involve preconditioners for the interval Gauss-Seidel method which are optimal in the sense that their application selectively gives a coordinate bound of minimum width, a coordinate bound whose left endpoint is as large as possible, or a coordinate bound whose right endpoint is as small as possible. We give experimental results on a selection of problems with different properties.
On Interval Weighted Three-layer Neural Networks
 In Proceedings of the 31st Annual Simulation Symposium
, 1998
Abstract

Cited by 2 (1 self)
In solving application problems, the data sets used to train a neural network may not be one hundred percent precise but only known within certain ranges. Representing data sets with intervals, we have interval neural networks. By analyzing the mathematical model, we categorize general three-layer neural network training problems into two types. One of them can be solved by finding numerical solutions of nonlinear systems of equations. The other can be transformed into nonlinear optimization problems. Reliable interval algorithms such as the interval Newton/generalized bisection method and the interval branch-and-bound algorithm are applied to obtain optimal weights for interval neural networks. The applicable state-of-the-art interval software packages are reviewed in this paper as well. I. Introduction A. Three-layer Neural Network A three-layer neural network includes I (input), H (hidden), and O (output) layers. Each of these layers contains some nodes, called neurons. There are no direct links between any...
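As a hedged illustration of how interval weights could propagate through such a network (the structure and all names below are our assumption, not the paper's method), consider a single hidden-layer neuron with a monotone sigmoid activation: because the sigmoid is increasing, the interval image of the pre-activation is bounded by the images of its endpoints.

```python
import math

# Illustrative sketch (our own, not from the paper): the interval output of
# one neuron, sigmoid(sum_i w_i * x_i + b), with interval inputs, weights,
# and bias represented as (lo, hi) tuples.

def imul(a, b):
    p = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(p), max(p))

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def neuron(inputs, weights, bias):
    """Interval enclosure of the neuron output over all point selections."""
    acc = bias
    for x, w in zip(inputs, weights):
        acc = iadd(acc, imul(x, w))
    # sigmoid is monotone increasing, so the endpoint images bound the output
    return (sigmoid(acc[0]), sigmoid(acc[1]))
```

With a zero input the weight interval has no effect and the output collapses to the point sigmoid(0) = 0.5; a nondegenerate input yields a genuinely wide output interval.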
On Interval Weighted Three-layer Neural Networks
 Proc. of the 31st Annual Simulation Symposium
, 1998
Abstract

Cited by 1 (1 self)
In solving application problems, the data sets used to train a neural network may not be one hundred percent precise but only known within certain ranges. Representing data sets with intervals, we have interval neural networks. By analyzing the mathematical model, we categorize general three-layer neural network training problems into two types. One of them can be solved by finding numerical solutions of nonlinear systems of equations. The other can be transformed into nonlinear optimization problems. Reliable interval algorithms such as the interval Newton/generalized bisection method and the interval branch-and-bound algorithm are applied to obtain optimal weights for interval neural networks. The applicable state-of-the-art interval software packages are reviewed in this paper as well.
An Online Interval Calculator
 Society for Computer Simulation
, 1998
Abstract

Cited by 1 (1 self)
In this paper, we report on the motivation, design, implementation, and usage of an online interval calculator that we developed recently.
Parallel Reliable Computing with Interval Arithmetic
Abstract

Cited by 1 (0 self)
Reliability of computational results is crucial in computational science and engineering. In this paper, we report some current research results on parallel reliable computing with interval arithmetic. In Section 1, a brief introduction to interval arithmetic is provided. In Section 2, an interval algorithm to reliably solve large-scale sparse nonlinear systems of equations is presented. In Section 3, polynomial interpolation with interval arithmetic is studied. We conclude this paper with Section 4. I. Introduction Interval arithmetic, first introduced by Moore [24] in the 1960s, has become an active research area in scientific computing. Here is the definition of interval arithmetic. Definition 1.1: Let x and y be two real intervals, and op be one of the arithmetic operations +, −, ×, ÷. Then, x op y = {x op y : x ∈ x, y ∈ y}, provided that 0 ∉ y if op represents ÷. For example, [1, 2] + [−1, 0] = [0, 2] and [2, 4] ÷ [1, 2] = [1, 4]. Some reasons for...
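Definition 1.1 and its worked examples translate directly into code. The following is an illustrative sketch (the helper names are ours), representing an interval as a (lo, hi) tuple and computing each result from the four endpoint products or quotients:

```python
# Interval arithmetic per Definition 1.1: x op y = { x op y : x in x, y in y },
# with 0 excluded from the divisor interval for division.
# Helper names (iadd, isub, imul, idiv) are our own, not from the paper.

def iadd(x, y): return (x[0] + y[0], x[1] + y[1])
def isub(x, y): return (x[0] - y[1], x[1] - y[0])
def imul(x, y):
    p = [x[0]*y[0], x[0]*y[1], x[1]*y[0], x[1]*y[1]]
    return (min(p), max(p))
def idiv(x, y):
    assert y[0] > 0 or y[1] < 0, "0 must not lie in the divisor interval"
    p = [x[0]/y[0], x[0]/y[1], x[1]/y[0], x[1]/y[1]]
    return (min(p), max(p))
```

These reproduce the paper's examples: [1, 2] + [−1, 0] = [0, 2] and [2, 4] ÷ [1, 2] = [1, 4].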
A General Iterative Sparse Linear Solver and Its Parallelization for Interval Newton Methods
Abstract
A general iterative sparse linear solver and its parallelization for interval Newton methods. CHENYI HU, ANNA FROLOV, R. BAKER KEARFOTT, and QING YANG. Interval Newton/generalized bisection methods reliably find all numerical solutions within a given domain. Both computational complexity analysis and numerical experiments have shown that solving the corresponding interval linear system generated by interval Newton's methods can be computationally expensive (especially when the nonlinear system is large). In applications, many large-scale nonlinear systems of equations result in sparse interval Jacobian matrices. In this paper, we first propose a general indexed storage scheme to store sparse interval matrices. We then present an iterative interval linear solver that utilizes the proposed index storage scheme. It is expected that the newly proposed general interval iterative sparse linear solver will improve the overall performance of interval Newton/generalized bisection methods when the Jacobian matrices are sparse. In Section 1, we briefly review interval Newton's methods. In Section 2, we review some currently used storage schemes for sparse systems. In Section 3, we introduce a new index scheme to store general sparse matrices. In Section 4, we present both sequential and parallel algorithms to evaluate a general sparse Jacobian matrix. In Section 5, we present both sequential and parallel algorithms to solve the corresponding interval linear system by the all-row preconditioned scheme. Conclusions and future work are discussed in Section 6.
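As a rough illustration of indexed storage for sparse interval matrices (this CSR-style layout and every name below are our assumption, not the scheme the paper proposes), only the nonzero interval entries are kept, together with index arrays locating them:

```python
# Sketch of compressed-row storage for a sparse interval matrix.
# Intervals are (lo, hi) tuples; layout and names are illustrative only.

class SparseIntervalMatrix:
    def __init__(self, n):
        self.n = n
        self.row_ptr = [0] * (n + 1)   # row i occupies data[row_ptr[i]:row_ptr[i+1]]
        self.col_idx = []              # column index of each stored entry
        self.data = []                 # (lo, hi) interval of each stored entry

    @classmethod
    def from_rows(cls, rows):
        """rows: list of {col: (lo, hi)} dictionaries, one per matrix row."""
        m = cls(len(rows))
        for i, row in enumerate(rows):
            for j in sorted(row):
                m.col_idx.append(j)
                m.data.append(row[j])
            m.row_ptr[i + 1] = len(m.data)
        return m

    def matvec(self, x):
        """Interval matrix-vector product with an interval vector x,
        summing the endpoint-product enclosures of the stored entries."""
        out = []
        for i in range(self.n):
            lo = hi = 0.0
            for k in range(self.row_ptr[i], self.row_ptr[i + 1]):
                a, b = self.data[k]
                c, d = x[self.col_idx[k]]
                p = [a*c, a*d, b*c, b*d]
                lo += min(p)
                hi += max(p)
            out.append((lo, hi))
        return out
```

Such a matvec is the kernel an iterative interval solver would call once per sweep, touching only the stored nonzeros rather than all n² entries.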