Results 1 – 8 of 8
Experimental Uncertainty Estimation and Statistics for Data Having Interval Uncertainty
11733; SAND2007-0939. hal-00839639, version 1, 28 Jun 2013
Cited by 39 (20 self)
Sandia is a multiprogram laboratory operated by Sandia Corporation,
Towards combining probabilistic and interval uncertainty in engineering calculations
Proceedings of the Workshop on Reliable Engineering Computing, 2004
Cited by 4 (4 self)
Abstract. In many engineering applications, we have to combine probabilistic and interval errors. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, and correlation with other measurements. In environmental measurements, we often only know the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data. Such modifications are described in this paper.
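The simplest of the modifications this abstract describes is the sample mean: because the mean is monotone (non-decreasing) in each measurement, its exact range over interval data is obtained by averaging the lower and upper endpoints separately. A minimal illustrative sketch (the function name and the example data are assumptions, not taken from the paper):

```python
def interval_mean(intervals):
    """Exact range of the sample mean when each measurement x_i is only
    known to lie in an interval [lo_i, hi_i].  Since the mean is monotone
    in every x_i, the tightest bounds come from averaging the lower and
    upper endpoints separately."""
    n = len(intervals)
    lo = sum(l for l, _ in intervals) / n
    hi = sum(h for _, h in intervals) / n
    return lo, hi

# Example: three interval measurements of a pollution level
print(interval_mean([(0.9, 1.1), (1.4, 1.6), (2.0, 2.4)]))
```

For the variance, this endpoint-wise trick no longer works: the variance is not monotone in each measurement, and computing its exact upper bound over interval data is NP-hard in general, which is part of why the statistical algorithms must be modified rather than merely evaluated at endpoints.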
Application-motivated combinations of fuzzy, interval, and probability approaches, and their use in geoinformatics, bioinformatics, and engineering
INT. J. AUTOMATION AND CONTROL, 2007
Abstract—In traditional interval computations, we assume that the interval data corresponds to guaranteed interval bounds, and that fuzzy estimates provided by experts are correct. In practice, measuring instruments are not 100% reliable, and experts are not 100% reliable: we may have estimates which are “way off”, intervals which do not contain the actual values at all. Usually, we know the percentage of such outlier unreliable measurements. However, it is desirable to check that the reliability of the actual data is indeed within the given percentage. The problem of checking (gauging) this reliability is, in general, NP-hard; in reasonable cases, there exist feasible algorithms for solving this problem. In this paper, we show that quantum computing techniques can drastically speed up the computation of the reliability of given data.
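One of the “reasonable cases” in which gauging reliability is feasible: if all measurements estimate a single constant quantity, then every reliable interval must contain that constant, so the minimum possible number of outliers is n minus the maximum number of intervals sharing a common point, which an O(n log n) endpoint sweep computes. A hedged sketch (this formulation and the names are my assumptions for illustration, not the paper's algorithm):

```python
def min_outliers(intervals):
    """Minimum number of intervals that must be outliers, assuming all
    measurements estimate one fixed constant.  A reliable interval must
    contain that constant, so we find the point covered by the largest
    number of (closed) intervals via an endpoint sweep and subtract
    that count from n."""
    events = []
    for lo, hi in intervals:
        events.append((lo, 0))  # interval opens; 0 sorts before 1 at ties,
        events.append((hi, 1))  # so touching closed intervals count as overlapping
    events.sort()
    best = cur = 0
    for _, kind in events:
        if kind == 0:
            cur += 1
            best = max(best, cur)
        else:
            cur -= 1
    return len(intervals) - best

# Checking a claimed 10% outlier rate against four measurements:
data = [(0.0, 2.0), (1.0, 3.0), (1.5, 2.5), (5.0, 6.0)]
print(min_outliers(data))                      # prints 1: (5.0, 6.0) cannot agree
print(min_outliers(data) <= 0.1 * len(data))   # prints False: claimed rate fails
```

The sweep runs in O(n log n); the NP-hardness the abstract mentions arises in more general settings, e.g. when the measured quantity itself varies and the statistics computed from it must stay consistent.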
Designing, Understanding, and Analyzing Unconventional Computation: The Important Role of Logic and Constructive Mathematics
In this paper, we explain why, in our opinion, logic and constructive mathematics are playing – and should play – an important role in the design, understanding, and analysis of unconventional computation.
Received (Day Month Year)
Communicated by (xxxxxxxxxx) In theoretical computer science, researchers usually distinguish between feasible problems (that can be solved in polynomial time) and problems that require more computation time. A natural question is: can we use new physical processes, processes that have not been used in modern computers, to make computations drastically faster – e.g., to make intractable problems feasible? Such a possibility would occur if a physical process provides a superpolynomial (i.e., faster than polynomial) speedup. In this direction, the most active research is undertaken in quantum computing. It is well known that quantum processes can speed up computations; however, the only proven quantum speedups are polynomial. Parallelization is another potential source of speedup. In Euclidean space, parallelization only leads to a polynomial speedup. We show that in quantum spacetime, parallelization can potentially lead to a superpolynomial speedup of computations.
Towards Combining Probabilistic and Interval Uncertainty in Engineering Calculations
, 2004
Abstract. In many engineering applications, we have to combine probabilistic and interval errors. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, correlation with other measurements. In environmental measurements, we often only know the …