Results 1–10 of 20
Monte-Carlo-type techniques for processing interval uncertainty, and their potential engineering applications
Reliable Computing, 2007
Cited by 12 (6 self)
Abstract. In engineering applications, we need to make decisions under uncertainty. Traditionally, in engineering, statistical methods are used, methods assuming that we know the probability distribution of different uncertain parameters. Usually, we can safely linearize the dependence of the desired quantities y (e.g., stress at different structural points) on the uncertain parameters xi – thus enabling sensitivity analysis. Often, the number n of uncertain parameters is huge, so sensitivity analysis leads to a lot of computation time. To speed up the processing, we propose to use special Monte-Carlo-type simulations.
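The Monte-Carlo-type technique referred to here is usually realized with Cauchy deviates: for a linearized dependence, a Cauchy-distributed error in each input produces a Cauchy-distributed error in the output whose scale is exactly the desired interval half-width. A minimal sketch, assuming a hypothetical data-processing function f and linearizability around the measured values (the function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def cauchy_deviate_range(f, x_tilde, deltas, n_sim=2000, seed=0):
    """Estimate the half-width Delta = sum_i |df/dx_i| * Delta_i of the
    linearized range of f over the box [x~_i - Delta_i, x~_i + Delta_i],
    by simulating Cauchy-distributed measurement errors."""
    rng = np.random.default_rng(seed)
    x_tilde = np.asarray(x_tilde, float)
    deltas = np.asarray(deltas, float)
    y0 = f(x_tilde)
    # Sample errors delta_i ~ Cauchy(0, Delta_i); for a linear(ized) f,
    # f(x~ + delta) - f(x~) is Cauchy with scale sum_i |c_i| * Delta_i.
    errs = rng.standard_cauchy((n_sim, len(deltas))) * deltas
    dy = np.array([f(x_tilde + e) - y0 for e in errs])
    # Maximum-likelihood estimate of the Cauchy scale Delta: solve
    # sum_k 1 / (1 + (dy_k / Delta)^2) = n/2 by bisection (the left side
    # is increasing in Delta).
    lo, hi = 1e-12, np.max(np.abs(dy)) + 1e-12
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if np.sum(1.0 / (1.0 + (dy / mid) ** 2)) < n_sim / 2:
            lo = mid          # scale estimate too small -> grow
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative linear example: f(x) = 2*x1 - 3*x2 with Delta = (1, 0.5),
# so the exact half-width is 2*1 + 3*0.5 = 3.5.
f = lambda x: 2 * x[0] - 3 * x[1]
est = cauchy_deviate_range(f, [0.0, 0.0], [1.0, 0.5])
```

The key point is that the number of calls to f is fixed by the desired accuracy of the scale estimate, not by the number n of uncertain parameters, which is what makes the approach attractive when n is huge.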
Probabilities, intervals, what next? Extension of interval computations to situations with partial information about probabilities
Proceedings of the 10th IMEKO TC7 International Symposium on Advances of Measurement Science, 2004
Cited by 10 (5 self)
Abstract. In many real-life situations, we are interested in the value of a physical quantity y that is difficult or impossible to measure directly. To estimate y, we find some easier-to-measure quantities x1, ..., xn which are related to y by a known relation y = f(x1, ..., xn). Measurements are never 100% accurate; hence, the measured values x̃i are different from xi, and the resulting estimate ỹ = f(x̃1, ..., x̃n) is different from the desired value y = f(x1, ..., xn). How different? The traditional engineering approach to error estimation in data processing assumes that we know the probabilities of different measurement errors ∆xi = x̃i − xi. In many practical situations, we only know the upper bound ∆i for this error; hence, after the measurement, the only information that we have about xi is that it belongs to the interval xi = [x̃i − ∆i, x̃i + ∆i]. In this case, it is important to find the range y of all possible values of y = f(x1, ..., xn) when xi ∈ xi. We start with a brief overview of the corresponding interval computation problems. We then discuss what to do when, in addition to the upper bounds ∆i, we have some partial information about the probabilities of different values of ∆xi.
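The interval computation problem described in this abstract — enclosing the range of y = f(x1, ..., xn) when each xi is only known to lie in [x̃i − ∆i, x̃i + ∆i] — can be attacked naively with interval arithmetic, where each elementary operation is replaced by its interval counterpart. A minimal sketch (the class and function names are illustrative):

```python
class Interval:
    """Closed interval [lo, hi] with the basic operations of interval arithmetic."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        # The product range is spanned by the four endpoint products.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def measured(x_tilde, delta):
    """Interval [x~ - Delta, x~ + Delta] from a measured value and error bound."""
    return Interval(x_tilde - delta, x_tilde + delta)

# y = x1 * x2 + x1, with measurements x~1 = 2 +/- 0.1 and x~2 = 3 +/- 0.2:
x1, x2 = measured(2.0, 0.1), measured(3.0, 0.2)
y = x1 * x2 + x1   # guaranteed enclosure of the range of y
```

Note that because x1 occurs twice, naive interval arithmetic in general produces an enclosure wider than the true range (the "dependency problem"), which is one reason the range-estimation problem is nontrivial.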
Interval Computations and Interval-Related Statistical Techniques: Tools for Estimating Uncertainty of the Results of Data Processing and Indirect Measurements
Cited by 4 (1 self)
In many practical situations, we only know the upper bound ∆ on the (absolute value of the) measurement error ∆x, i.e., we only know that the measurement error is located on the interval [−∆, ∆]. The traditional engineering approach to such situations is to assume that ∆x is uniformly distributed on [−∆, ∆], and to use the corresponding statistical techniques. In some situations, however, this approach underestimates the error of indirect measurements. It is therefore desirable to directly process this interval uncertainty. Such “interval computations” methods have been developed since the 1950s. In this chapter, we provide a brief overview of related algorithms, results, and remaining open problems.
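The underestimation mentioned in this abstract can be made concrete with a simple sum of measurements. A sketch, assuming n independent measurements each with error bound ∆ (the numbers are illustrative):

```python
import math

# Sum of n measured values, each with |measurement error| <= Delta.
n, Delta = 100, 0.1

# Interval (guaranteed worst-case) bound on the total error:
interval_bound = n * Delta

# Traditional engineering assumption: each error is uniform on
# [-Delta, Delta], so the total error has standard deviation
# Delta * sqrt(n / 3); a common practical bound is then 3 sigma.
sigma = Delta * math.sqrt(n / 3.0)
three_sigma = 3.0 * sigma

# The statistical bound (about 1.73) is far below the guaranteed bound
# (10.0) -- fine if the errors really are independent, but a severe
# underestimate if they can be correlated, e.g., a shared systematic
# bias of +Delta in every sensor.
```

This is why, when independence and uniformity cannot be justified, it is safer to process the interval uncertainty directly.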
Computational Methods for Decision Making Based on Imprecise Information
2006
Cited by 4 (3 self)
In this paper, we investigate computational methods for decision making based on imprecise information in the context of engineering design. The goal is to identify the subtleties of engineering design problems that impact the choice of computational solution methods, and to evaluate some existing solution methods to determine their suitability and limitations. Although several approaches for propagating imprecise probabilities have been published in the literature, these methods are insufficient for practical engineering analysis. The dependency bounds convolution approach of Williamson and Downs and the distribution envelope determination approach of Berleant work sufficiently well only for open models (that is, models with known mathematical operations). Both of these approaches rely on interval arithmetic and are therefore limited to problems with few repeated variables. In an attempt to overcome the difficulties faced by these deterministic methods, we propose an alternative approach that utilizes both Monte Carlo simulation and optimization. The Monte Carlo/optimization hybrid approach has its own drawbacks in that it assumes that the uncertain inputs can be parameterized, that it requires the solution of a global optimization problem, and that it assumes independence between the uncertain inputs.
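The Monte Carlo/optimization hybrid mentioned in this abstract can be sketched as a nested computation: the inner loop estimates the quantity of interest by Monte Carlo for a fixed value of the imprecise parameter, and the outer loop searches the parameter's interval for the extreme values. A minimal illustration, assuming the imprecise input is parameterized by an interval-valued mean and using a crude grid search in place of a global optimizer (all names and the example model are illustrative, not from the paper):

```python
import numpy as np

def mc_estimate(mu, n=20000, rng=None):
    """Inner Monte Carlo step: estimate E[f(X)] for X ~ Normal(mu, 1)
    with f(x) = x**2 (so the exact value is mu**2 + 1)."""
    rng = rng or np.random.default_rng(0)
    x = rng.normal(mu, 1.0, n)
    return np.mean(x ** 2)

def hybrid_bounds(mu_lo, mu_hi, n_grid=41):
    """Outer optimization step: the mean mu is only known to lie in
    [mu_lo, mu_hi]; search that interval for the smallest and largest
    expected value (a grid search stands in for a global optimizer)."""
    rng = np.random.default_rng(1)
    vals = [mc_estimate(mu, rng=rng)
            for mu in np.linspace(mu_lo, mu_hi, n_grid)]
    return min(vals), max(vals)

# Mean known only to lie in [-0.5, 2.0]; exact bounds on E[X^2] are 1.0 and 5.0.
lo, hi = hybrid_bounds(-0.5, 2.0)
```

The drawbacks listed in the abstract are visible even in this toy version: the uncertain input had to be parameterized, the outer problem is a global optimization, and independence between uncertain inputs is built into the sampling.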
Propagation and provenance of probabilistic and interval uncertainty in cyberinfrastructure-related data processing and data fusion
Proceedings of the International Workshop on Reliable Engineering Computing REC'08, 2008
Cited by 3 (2 self)
Abstract. In the past, communications were much slower than computations. As a result, researchers and practitioners collected different data into huge databases located at a single location such as NASA and US Geological Survey. At present, communications are so much faster that it is possible to keep different databases at different locations, and automatically select, transform, and collect relevant data when necessary. The corresponding cyberinfrastructure is actively used in many applications. It drastically enhances scientists' ability to discover, reuse and combine a large number of resources, e.g., data and services. Because of this importance, it is desirable to be able to gauge the uncertainty of the results obtained by using cyberinfrastructure. This problem is made more urgent by the fact that the level of uncertainty associated with cyberinfrastructure resources can vary greatly – and that scientists have much less control over the quality of different resources than in the centralized database. Thus, with the cyberinfrastructure promise comes the need to analyze how data uncertainty propagates via this cyberinfrastructure. When the resulting accuracy is too low, it is desirable to produce the provenance of this inaccuracy: to find out which data points contributed most to it, and how an improved accuracy of these data points will improve the accuracy of the result. In this paper, we describe algorithms for propagating uncertainty and for finding the provenance for this uncertainty.
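Under the usual linearization assumption, the provenance question — which data points contributed most to the inaccuracy — has a direct answer: the output uncertainty decomposes as ∆y = Σi |∂f/∂xi| ∆i, and each term is one input's contribution. A sketch with numerical gradients (the fusion function and numbers are illustrative, not from the paper):

```python
import numpy as np

def uncertainty_provenance(f, x_tilde, deltas, h=1e-6):
    """Linearized uncertainty of y = f(x): Delta_y = sum_i |df/dx_i| * Delta_i.
    Returns the total Delta_y and each input's contribution |df/dx_i| * Delta_i,
    so the inputs that dominate the inaccuracy can be ranked."""
    x_tilde = np.asarray(x_tilde, float)
    grads = np.empty(len(x_tilde))
    for i in range(len(x_tilde)):
        xp, xm = x_tilde.copy(), x_tilde.copy()
        xp[i] += h
        xm[i] -= h
        grads[i] = (f(xp) - f(xm)) / (2 * h)   # central difference
    contrib = np.abs(grads) * np.asarray(deltas, float)
    return contrib.sum(), contrib

# Hypothetical fusion of three data sources with different accuracies:
f = lambda x: 0.5 * x[0] + 2.0 * x[1] - x[2]
total, contrib = uncertainty_provenance(f, [1.0, 1.0, 1.0], [0.2, 0.1, 0.05])
# contributions 0.1, 0.2, 0.05 -> the second source dominates the inaccuracy,
# so improving its accuracy improves the result the most.
```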
Estimating Probability of Failure of a Complex System Based on Partial Information about Subsystems and Components, with Potential Applications to Aircraft Maintenance
Proceedings of the International Workshop on Soft Computing Applications and Knowledge Discovery SCAKD'2011, 2011
Cited by 2 (1 self)
Abstract. In many real-life applications (e.g., in aircraft maintenance), we need to estimate the probability of failure of a complex system (such as an aircraft as a whole or one of its subsystems). Complex systems are usually built with redundancy allowing them to withstand the failure of a small number of components. In this paper, we assume that we know the structure of the system, and, as a result, for each possible set of failed components, we can tell whether this set will lead to a system failure. For each component A, we know the probability P(A) of its failure with some uncertainty: e.g., we know the lower and upper bounds P̲(A) and P̄(A) for this probability. Usually, it is assumed that failures of different components are independent events. Our objective is to use all this information to estimate the probability of failure of the entire complex system. In this paper, we describe a new efficient method for such estimation based on Cauchy deviates.
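For a small system, the setting of this abstract can be illustrated exactly: with independent component failures and a known structure function, the system failure probability is computable by enumeration, and because a (coherent) structure function is monotone, its interval bounds are attained at the all-lower and all-upper corners of the probability box. A sketch (this brute-force enumeration is for illustration only; it is not the paper's Cauchy-deviate method, which avoids the exponential cost):

```python
from itertools import product

def system_failure_prob(fails, p):
    """Exact probability that the system fails, assuming independent
    component failures with probabilities p[i] and a known structure
    function fails(state), where state[i] = True means component i failed."""
    total = 0.0
    for state in product([False, True], repeat=len(p)):
        if fails(state):
            pr = 1.0
            for s, pi in zip(state, p):
                pr *= pi if s else (1.0 - pi)
            total += pr
    return total

def failure_prob_bounds(fails, p_lo, p_hi):
    """For a monotone structure function, the system failure probability
    is nondecreasing in every P(A), so the bounds are attained at the
    all-lower and all-upper component probabilities."""
    return system_failure_prob(fails, p_lo), system_failure_prob(fails, p_hi)

# 2-out-of-3 redundancy: the system fails when at least 2 of 3 components fail.
fails = lambda s: sum(s) >= 2
lo, hi = failure_prob_bounds(fails, [0.01, 0.01, 0.01], [0.05, 0.05, 0.05])
```

The enumeration over 2^n component states is exactly what becomes infeasible for realistic n, motivating the faster estimation method the paper describes.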
Interval Computations as an Important Part of Granular Computing: An Introduction
Cited by 1 (0 self)
This chapter provides a general introduction to interval computations, especially to interval computations as an important part of granular computing.
Estimating Probability of Failure of a Complex System Based on Inexact Information about Subsystems and Components, with Potential Applications to Aircraft Maintenance
Cited by 1 (1 self)
Abstract. In many real-life applications (e.g., in aircraft maintenance), we need to estimate the probability of failure of a complex system (such as an aircraft as a whole or one of its subsystems). Complex systems are usually built with redundancy allowing them to withstand the failure of a small number of components. In this paper, we assume that we know the structure of the system, and, as a result, for each possible set of failed components, we can tell whether this set will lead to a system failure. For each component A, we know the probability P(A) of its failure with some uncertainty: e.g., we know the lower and upper bounds P̲(A) and P̄(A) for this probability. Usually, it is assumed that failures of different components are independent events. Our objective is to use all this information to estimate the probability of failure of the entire complex system. In this paper, we describe a new efficient method for such estimation based on Cauchy deviates.
Fast Algorithms for Uncertainty Propagation, and Their Applications to Structural Integrity
Abstract. In many practical situations, we need to know how uncertainty propagates through data processing algorithms, i.e., how the uncertainty in the inputs affects the results of data processing. This problem is important for all types of uncertainty: probabilistic, interval, and fuzzy. From the computational viewpoint, however, this problem is much more complex for interval and fuzzy uncertainty. Therefore, for these types of uncertainty, it is desirable to design faster algorithms. In this paper, we describe faster algorithms for two practically important situations: linearization situations, when the approximation errors are small and, therefore, the data processing algorithm can be replaced by a linear function, and monotonic situations, when the dependence of the result y of data processing on each of the inputs x1, ..., xn is either increasing or decreasing.
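The monotonic situation admits a particularly fast algorithm: if the direction of monotonicity in each input is known, the exact range of y over the input box is obtained from just two evaluations of f, at the two "opposite" corners. A minimal sketch (function and parameter names are illustrative):

```python
def monotone_range(f, boxes, signs):
    """Range of y = f(x1, ..., xn) over the boxes [lo_i, hi_i] when f is
    monotone in each input: signs[i] = +1 (nondecreasing) or -1
    (nonincreasing). Only two evaluations of f are needed."""
    x_min = [lo if s > 0 else hi for (lo, hi), s in zip(boxes, signs)]
    x_max = [hi if s > 0 else lo for (lo, hi), s in zip(boxes, signs)]
    return f(x_min), f(x_max)

# y = x1 - x2: increasing in x1, decreasing in x2.
lo, hi = monotone_range(lambda x: x[0] - x[1], [(1, 2), (3, 5)], [+1, -1])
# exact range: [1 - 5, 2 - 3] = [-4, -1]
```

Compare this with generic interval methods, whose cost can grow quickly with n; here the cost is two calls to f regardless of the number of inputs.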
Cauchy Deviate Method and Need for Intuitive Explanation
Abstract. One of the most efficient techniques for processing interval and fuzzy data is a Monte-Carlo-type technique of Cauchy deviates that uses Cauchy distributions. This technique is mathematically valid, but somewhat counterintuitive. In this paper, following the ideas of Paul Werbos, we provide a natural neural network explanation for this technique.