Results 1–10 of 190
Safety verification of hybrid systems by constraint propagation based abstraction refinement
, 2005
Cited by 55 (11 self)
This paper deals with the problem of safety verification of nonlinear hybrid systems. We start from a classical method that uses interval arithmetic to check whether trajectories can move over the boundaries in a rectangular grid. We put this method into an abstraction refinement framework and improve it by developing an additional refinement step that employs interval constraint propagation to add information to the abstraction without introducing new grid elements. Moreover, the resulting method allows switching conditions, initial states and unsafe states to be described by complex constraints instead of sets that correspond to grid elements. Nevertheless, the method can be easily implemented since it is based on a well-defined set of constraints, on which one can run any constraint-propagation-based solver. Tests of such an implementation are promising.
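The classical starting point the abstract describes can be sketched in a few lines: use interval arithmetic to decide, for each boundary of a 1-D grid, whether the flow dx/dt = f(x) can push a trajectory across it, build the resulting abstraction graph, and check reachability of unsafe cells. The dynamics f and the cell bounds below are invented for illustration; this is not the paper's implementation.

```python
# Minimal sketch: interval-arithmetic abstraction of dx/dt = f(x) on a 1-D grid.
from collections import deque

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = [self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi]
        return Interval(min(ps), max(ps))

def f(x):                       # example dynamics: dx/dt = x * (2 - x)
    return x * (Interval(2, 2) - x)

grid = [Interval(i*0.5, (i+1)*0.5) for i in range(8)]   # cells covering [0, 4]

def can_cross(boundary, direction):
    """Trajectory may cross `boundary` only if f there can have the right sign."""
    v = f(Interval(boundary, boundary))
    return v.hi > 0 if direction > 0 else v.lo < 0

# abstraction: edges between adjacent cells whose shared boundary f may cross
edges = {i: [] for i in range(len(grid))}
for i in range(len(grid) - 1):
    b = grid[i].hi
    if can_cross(b, +1): edges[i].append(i + 1)
    if can_cross(b, -1): edges[i + 1].append(i)

def reachable(start, target):
    seen, queue = {start}, deque([start])
    while queue:
        c = queue.popleft()
        if c == target:
            return True
        for n in edges[c]:
            if n not in seen:
                seen.add(n)
                queue.append(n)
    return False
```

With these dynamics, f is positive on (0, 2) and zero at x = 2, so cells beyond x = 2 are unreachable from cell 0 in the abstraction; the refinement step the paper adds then sharpens such abstractions without adding cells.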
Efficient solving of quantified inequality constraints over the real numbers
 ACM Transactions on Computational Logic
Cited by 29 (8 self)
Let a quantified inequality constraint over the reals be a formula in the first-order predicate language over the structure of the real numbers, where the allowed predicate symbols are ≤ and <. Solving such constraints is an undecidable problem when allowing function symbols such as sin or cos. In the paper we give an algorithm that terminates with a solution for all but very special, pathological inputs. We ensure the practical efficiency of this algorithm by employing constraint programming techniques.
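The simplest instance of this problem, a single universally quantified inequality ∀x ∈ [a,b] : f(x) > 0, can be attacked by interval evaluation plus bisection. The sketch below is only that basic building block, with a hand-written interval enclosure for one made-up function; the paper's algorithm handles full quantifier alternation and adds constraint-programming pruning that this does not.

```python
# Sketch: prove (or refute) f(x) = x^2 - sin(x) + c > 0 for all x in [lo, hi]
# by interval evaluation and bisection (branch and bound).
import math

def f_interval(lo, hi, c):
    """Enclosure of f(x) = x^2 - sin(x) + c on [lo, hi], valid for 0 <= lo <= hi <= pi/2."""
    # On [0, pi/2] both x^2 and sin(x) are increasing, so this crude
    # monotonicity-based enclosure is sound on that domain.
    return (lo*lo - math.sin(hi) + c, hi*hi - math.sin(lo) + c)

def prove_positive(lo, hi, c, depth=40):
    flo, fhi = f_interval(lo, hi, c)
    if flo > 0:                    # enclosure entirely positive: proved on this box
        return True
    if fhi <= 0 or depth == 0:     # refuted, or gave up refining
        return False
    mid = 0.5 * (lo + hi)          # otherwise bisect and prove both halves
    return (prove_positive(lo, mid, c, depth - 1)
            and prove_positive(mid, hi, c, depth - 1))
```

For c = 0.8 one bisection suffices to prove positivity on [0, 1.5]; for c = 0 the claim is false near x = 0 and the search fails.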
Population Variance under Interval Uncertainty: A New Algorithm
 Reliable Computing
, 2006
Cited by 20 (17 self)
In statistical analysis of measurement results, it is often beneficial to compute the range [V̲, V̄] of the population variance V = (1/n) · Σ_{i=1}^{n} (x_i − E)^2
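The problem this abstract sets up can be sketched concretely. The upper bound below uses the fact that V is convex in each x_i, so its maximum over a box is attained at a vertex (exponential-time vertex enumeration here; the cited papers give far better algorithms). The lower bound scans a common target value t and clips every x_i toward it, which brackets the minimum; it is an illustration, not the paper's algorithm.

```python
# Sketch: bounding population variance when each x_i lies in an interval.
from itertools import product

def variance(xs):
    e = sum(xs) / len(xs)
    return sum((x - e)**2 for x in xs) / len(xs)

def variance_upper(intervals):
    # V is convex in each coordinate, so the max over the box is at a vertex.
    return max(variance(v) for v in product(*intervals))

def variance_lower(intervals, steps=1000):
    # Clipping every x_i toward a common value t gives a feasible point;
    # scanning t approximates the minimum (the optimum has x_i = clip(E*, a_i, b_i)).
    lo = min(a for a, _ in intervals)
    hi = max(b for _, b in intervals)
    best = float("inf")
    for k in range(steps + 1):
        t = lo + (hi - lo) * k / steps
        clipped = [min(max(t, a), b) for a, b in intervals]
        best = min(best, variance(clipped))
    return best
```

For intervals [1,3] and [2,4], the upper bound 2.25 is attained at the vertex (1, 4), and the lower bound is 0 because the intervals overlap.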
The design of the Boost interval arithmetic library
, 2006
Cited by 20 (9 self)
We present the design of the Boost interval arithmetic library, a C++ library designed to efficiently handle mathematical intervals in a generic way. Interval computations are an essential tool for reliable computing. Increasingly, mathematical proofs have relied on global optimization problems solved using branch-and-bound algorithms with interval computations; it is therefore extremely important to have a mathematically correct implementation of interval arithmetic. Various implementations exist with diverse semantics. Our design is unique in that it uses policies to specify three independent variable behaviors: rounding, checking, and comparison. As a result, with the proper policies, our interval library is able to emulate almost any of the specialized libraries available for interval arithmetic, without any loss of performance and without sacrificing ease of use. This library is openly available at www.boost.org.
How to take into account dependence between the inputs: from interval computations to constraint-related set computations, with potential applications to nuclear safety, bio- and geosciences
 Proceedings of the Second International Workshop on Reliable Engineering Computing
Cited by 12 (12 self)
 Add to MetaCart
(Show Context)
In the traditional interval computations approach to handling uncertainty, we assume that we know the intervals x_i of possible values of different parameters x_i, and we assume that an arbitrary combination of these values is possible. In geometric terms, in the traditional interval computations approach, the set of possible combinations x = (x_1, ..., x_n) is a box x = x_1 × ... × x_n. In many real-life situations, in addition to knowing the intervals x_i of possible values of each variable x_i, we also know additional restrictions on the possible combinations of x_i; in this case, the set x of possible values of x is a subset of the original box. For example, in addition to knowing the bounds on x_1 and x_2, we may also know that the difference between x_1 and x_2 cannot exceed a certain amount. Informally speaking, the parameters x_i are no longer independent – in the sense that the set of possible values of x_i may depend on the values of other parameters. In interval computations, we start with independent inputs; as we follow computations, we get dependent intermediate results: e.g., for x_1 − x_1^2, the values of x_1
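The dependency effect in the abstract's closing example, x_1 − x_1^2, is easy to demonstrate: naive interval arithmetic treats the two occurrences of x_1 as independent, so the computed enclosure is much wider than the true range. The Interval class below is a minimal illustration, not the paper's machinery.

```python
# Sketch: the dependency problem of naive interval arithmetic on f(x) = x - x^2.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = [self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi]
        return Interval(min(ps), max(ps))

x = Interval(0.0, 1.0)
naive = x - x * x      # both occurrences of x treated as independent

# True range of f(x) = x - x^2 on [0, 1]: f(0) = f(1) = 0, max 1/4 at x = 1/2.
print((naive.lo, naive.hi), "vs true range", (0.0, 0.25))
```

The naive enclosure is [−1, 1], four times wider than the true range [0, 0.25]; tracking the dependence between occurrences, as the constraint-related set computations proposed here do, recovers the tighter result.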
Fast Quantum Algorithms for Handling Probabilistic, Interval, and Fuzzy Uncertainty
, 2003
Cited by 12 (9 self)
We show how quantum computing can speed up computations related to processing probabilistic, interval, and fuzzy uncertainty.
RealPaver: An Interval Solver using Constraint Satisfaction Techniques
 ACM Transactions on Mathematical Software
, 2006
Cited by 12 (1 self)
RealPaver is interval software for modeling and solving nonlinear systems. Reliable approximations of continuous or discrete solution sets are computed using Cartesian products of intervals. Systems are given by sets of equations or inequality constraints over integer and real variables. Moreover, they may have different natures: square or non-square, sparse or dense, linear, polynomial, or involving transcendental functions. The modeling language permits stating constraint models and tuning parameters of the solving algorithms, which efficiently combine interval methods and constraint satisfaction techniques. Several consistency techniques (box, hull, 3B) are implemented. The distribution includes C sources, executables for different machine architectures, documentation and benchmarks. Portability is ensured by the GNU C compiler.
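The kind of constraint-satisfaction narrowing such solvers combine with interval methods can be illustrated with hull consistency for a single constraint x + y = s: each variable's box is pruned using the interval projections x = s − y and y = s − x. The constraint and boxes below are toy examples, not RealPaver's API.

```python
# Sketch: hull-consistency narrowing for the constraint x + y = s,
# with boxes represented as (lo, hi) tuples.
def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None   # None signals an empty box

def narrow_sum(x, y, s):
    """Narrow the boxes of x and y under x + y = s (s a constant)."""
    x = intersect(x, (s - y[1], s - y[0]))  # projection x = s - y
    if x is None:
        return None
    y = intersect(y, (s - x[1], s - x[0]))  # projection y = s - x
    if y is None:
        return None
    return x, y

print(narrow_sum((0.0, 10.0), (0.0, 10.0), 5.0))  # both boxes shrink to [0, 5]
print(narrow_sum((0.0, 1.0), (0.0, 1.0), 5.0))    # inconsistent: None
```

Real solvers iterate such projections over all constraints to a fixed point, and fall back on splitting boxes (as in the box and 3B techniques mentioned above) when propagation alone stalls.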
Proving bounds on real-valued functions with computations
 4th International Joint Conference on Automated Reasoning. Volume 5195 of Lecture Notes in Artificial Intelligence
, 2008
Cited by 12 (2 self)
Interval-based methods are commonly used for computing numerical bounds on expressions and proving inequalities on real numbers. Yet they are hardly used in proof assistants, as the large amount of numerical computation they require keeps them out of reach of deductive proof processes. However, evaluating programs inside proofs is an efficient way of reducing the size of proof terms while performing numerous computations. This work shows how programs combining automatic differentiation with floating-point and interval arithmetic can be used as efficient yet certified solvers. They have been implemented in a library for the Coq proof system. This library provides tactics for proving inequalities on real-valued expressions.
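The combination described here, automatic differentiation carried out over intervals, can be sketched in miniature: propagate (value, derivative) pairs through each operation with interval coefficients. If the derivative enclosure over [a, b] is strictly positive, f is increasing there, so its exact range is [f(a), f(b)]. This mirrors the idea only; it is not the Coq library's tactics or code.

```python
# Sketch: forward-mode AD over intervals for f(x) = x^3 + x on a box.
def imul(a, b):
    ps = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(ps), max(ps))

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def f_with_derivative(x):
    """Return (value enclosure, derivative enclosure) of f(x) = x^3 + x over box x."""
    # dual-number style: carry (value, derivative) pairs through each operation
    v, d = x, (1.0, 1.0)
    x2 = imul(v, v)
    x3 = imul(x2, v)
    d3 = imul((3.0, 3.0), x2)   # d/dx x^3 = 3 x^2
    return iadd(x3, v), iadd(d3, d)

val, der = f_with_derivative((1.0, 2.0))
assert der[0] > 0   # derivative enclosure strictly positive on [1, 2]
# f is therefore increasing on [1, 2], so its exact range is [f(1), f(2)] = [2, 10].
```

In the certified setting, the same computation runs inside the proof assistant, so the positivity of the derivative enclosure becomes a machine-checked proof step rather than a runtime assertion.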
Computing Population Variance and Entropy under Interval Uncertainty: Linear-Time Algorithms
, 2006
Cited by 11 (7 self)
In statistical analysis of measurement results it is often necessary to compute the range [V̲, V̄] of the population variance V = (1/n) · Σ_{i=1}^{n} (x_i − E)^2, where E = (1/n) · Σ_{i=1}^{n} x_i, when we only know the intervals [x̃_i − ∆_i, x̃_i + ∆_i] of possible values of the x_i. While V̲ can be computed efficiently, the problem of computing V̄ is, in general, NP-hard. In our previous paper “Population Variance under Interval Uncertainty: A New Algorithm” (Reliable Computing, 2006, Vol. 12, No. 4, pp. 273–280) we showed that in
Interval-based Robust Statistical Techniques for Nonnegative Convex Functions, with Application to Timing Analysis of Computer Chips
 Proceedings of the Second International Workshop on Reliable Engineering Computing
, 2006
Cited by 11 (4 self)
In chip design, one of the main objectives is to decrease the clock cycle. At the design stage, this time is usually estimated by using worst-case (interval) techniques, in which we only use the bounds on the parameters that lead to delays. This analysis does not take into account that the probability of the worst-case values is usually very small; thus, the resulting estimates are over-conservative, leading to unnecessary overdesign and underperformance of circuits. If we knew the exact probability distributions of the corresponding parameters, then we could use Monte Carlo simulations (or the corresponding analytical techniques) to get the desired estimates. In practice, however, we only have partial information about the corresponding distributions, and we want to produce estimates that are valid for all distributions consistent with this information.
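The gap this abstract describes between worst-case interval bounds and realistic delay estimates is easy to see in a toy model: for a path whose stage delays are independent and uniform on known intervals, the interval estimate is the sum of the upper bounds, while even a high quantile of the simulated total delay is noticeably smaller. All numbers below are invented for illustration.

```python
# Toy comparison: worst-case (interval) clock estimate vs a Monte Carlo quantile
# for a path of 50 stages with delay uniform on [0.8, 1.2] ns each.
import random

random.seed(0)
stages = [(0.8, 1.2)] * 50

worst_case = sum(hi for _, hi in stages)   # interval bound: 50 * 1.2 = 60 ns

samples = sorted(
    sum(random.uniform(lo, hi) for lo, hi in stages) for _ in range(10_000)
)
p99 = samples[int(0.99 * len(samples))]    # 99th percentile of the total delay

print(f"worst case {worst_case:.1f} ns, simulated p99 {p99:.2f} ns")
# p99 lands well below the worst case, since all 50 stages rarely peak together
```

When only partial distribution information is available, as the abstract assumes, the goal is an estimate between these two extremes that is still valid for every distribution consistent with that information.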