Results 1–10 of 21
Representation and Problem Solving with the Distribution Envelope Determination (DEnv) Method
 Reliability Engineering and System Safety
, 2004
"... Distribution Envelope Determination (DEnv) is a technique for computing descriptions of derived random variables. Derived random variables have samples that are a function of samples of other random variable(s), which are termed inputs. DEnv can compute these descriptions despite uncertainty abou ..."
Abstract

Cited by 29 (13 self)
Distribution Envelope Determination (DEnv) is a technique for computing descriptions of derived random variables. Derived random variables have samples that are a function of samples of other random variable(s), which are termed inputs. DEnv can compute these descriptions despite uncertainty about the precise forms of the probability distributions describing the inputs. For example, inputs whose distribution functions have means and variances known only to within intervals can be handled. More generally, inputs can be handled if the set of all plausible cumulative distributions describing them can be enclosed between left and right envelopes. When inputs are these envelopes rather than specific distribution functions, or when inputs are specific distribution functions but their dependency relationship is unspecified, or both, derived distributions will typically be envelopes. For example, in the case of specific input distribution functions with unspecified dependency relationships, each of the infinitely many possible dependency relationships would imply some specific output distribution, and the set of all such output distributions can be bounded with envelopes. The DEnv algorithm is one way to obtain the bounding envelopes, and is implemented in a tool used to solve problems from a benchmark set. Keywords: DEnv, p-boxes, aleatory uncertainty, epistemic uncertainty, second-order uncertainty, uncertainty quantification, reducible uncertainty, challenge problems, envelopes, derived distributions, derived random variables, random sets, Statool.
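The abstract above bounds derived distributions between envelopes when the dependency of the inputs is unknown. A minimal sketch of that idea (not the DEnv algorithm itself, which works on joint-distribution tableaus) is the classical dependency-free convolution bound on the CDF of a sum; the marginals and grid below are illustrative.

```python
# Dependency bounds on the CDF of Z = X + Y when only the marginal CDFs
# F and G are known (any dependency between X and Y allowed):
#   lower(z) = sup_{x+y=z} max(F(x) + G(y) - 1, 0)
#   upper(z) = inf_{x+y=z} min(F(x) + G(y), 1)
def cdf_bounds_sum(F, G, xs, z):
    """Lower/upper bounds on P(X + Y <= z) over all dependencies."""
    lo, hi = 0.0, 1.0
    for x in xs:
        s = F(x) + G(z - x)
        lo = max(lo, max(s - 1.0, 0.0))
        hi = min(hi, min(s, 1.0))
    return lo, hi

# Illustrative marginals: X and Y both uniform on [0, 1].
F = lambda t: min(max(t, 0.0), 1.0)
G = F
xs = [i / 1000.0 for i in range(1001)]

lo, hi = cdf_bounds_sum(F, G, xs, 1.5)   # envelope for P(X + Y <= 1.5)
```

For these uniform marginals the envelope at z = 1.5 is roughly [0.5, 1], while the independent-case value is 0.875; DEnv-style tools tighten such envelopes as more dependency information becomes available.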
Using probability trees to compute marginals with imprecise probabilities
 INTERNATIONAL JOURNAL OF APPROXIMATE REASONING
, 2002
"... This paper presents an approximate algorithm to obtain a posteriori intervals of probability, when available information is also given with intervals. The algorithm uses probability trees as a means of representing and computing with the convex sets of ..."
Abstract

Cited by 22 (2 self)
This paper presents an approximate algorithm to obtain a posteriori intervals of probability, when the available information is also given in the form of intervals. The algorithm uses probability trees as a means of representing and computing with the convex sets of ...
Inference in Credal Networks with Branch-and-Bound Algorithms
 IN INT. SYMP. ON IMPRECISE PROBABILITIES AND THEIR APPLICATIONS
, 2003
"... A credal network associates sets of probability distributions with directed acyclic graphs. Under strong independence assumptions, inference with credal networks is equivalent to a signomial program under linear constraints, a problem that is NPhard even for categorical variables and polytree mo ..."
Abstract

Cited by 11 (1 self)
A credal network associates sets of probability distributions with directed acyclic graphs. Under strong independence assumptions, inference with credal networks is equivalent to a signomial program under linear constraints, a problem that is NP-hard even for categorical variables and polytree models. We describe ...
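The hardness mentioned above stems from optimizing the query probability over combinations of local credal sets. A brute-force sketch on a hypothetical two-node network X → Y (interval CPTs are illustrative, not from the paper) shows the combinatorial search that branch-and-bound methods prune:

```python
# Brute-force inference in a tiny credal network X -> Y (hypothetical
# interval CPTs).  P(Y=1) is multilinear in the local probabilities, so its
# lower/upper values are attained at vertices of the local credal sets.
from itertools import product

px1_verts    = [0.3, 0.5]   # P(X=1)       in [0.3, 0.5]
py1_x0_verts = [0.1, 0.2]   # P(Y=1 | X=0) in [0.1, 0.2]
py1_x1_verts = [0.6, 0.9]   # P(Y=1 | X=1) in [0.6, 0.9]

def p_y1(px1, py1_x0, py1_x1):
    # Marginalize X out of the joint: P(Y=1) = sum_x P(x) P(Y=1 | x)
    return (1.0 - px1) * py1_x0 + px1 * py1_x1

values = [p_y1(a, b, c)
          for a, b, c in product(px1_verts, py1_x0_verts, py1_x1_verts)]
lower, upper = min(values), max(values)   # bounds on P(Y=1)
```

Here the eight vertex combinations give P(Y=1) in roughly [0.25, 0.55]; the number of combinations grows exponentially with network size, which is what makes pruning essential.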
A Review of Propagation Algorithms for Imprecise Probabilities
, 1999
"... This paper reviews algorithms for local computation with imprecise probabilities. These algorithms try to solve problems of inference (calculation of conditional or unconditional probabilities) in cases in which there are a large number of variables. There are two main types depending on the nature ..."
Abstract

Cited by 11 (1 self)
This paper reviews algorithms for local computation with imprecise probabilities. These algorithms try to solve problems of inference (calculation of conditional or unconditional probabilities) in cases in which there are a large number of variables. There are two main types, depending on the nature of the independence relationships assumed in each case. In both of them the global knowledge is composed of several pieces of local information. The objective is to carry out a sound global computation while mainly using the initial local representation. Keywords: Propagation algorithms, valuation-based systems, imprecise probabilities.
Sets of joint probability measures generated by weighted marginal focal sets
 Proc. 2nd International Symposium on Imprecise Probabilities and Their Applications
, 2001
"... This paper is devoted to the construction of sets of joint probability measures for the case that the marginal sets of probability measures are generated by weighted focal sets. Different conditions on the choice of the weights of the joint focal sets and on the probability measures on these sets le ..."
Abstract

Cited by 9 (0 self)
This paper is devoted to the construction of sets of joint probability measures for the case that the marginal sets of probability measures are generated by weighted focal sets. Different conditions on the choice of the weights of the joint focal sets and on the probability measures on these sets lead to different types of independence such as strong independence, random set independence, fuzzy set independence and unknown interaction. As an application the upper probabilities of failure of a beam are computed.
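The construction described above can be sketched for one of the cited notions, random set independence: joint focal sets are Cartesian products of marginal focals carrying product weights, and the upper probability of an event sums the weights of joint focals that intersect it. The focal sets and the event below are illustrative, not the paper's beam example.

```python
# Joint focal sets under random set independence (illustrative numbers).
from itertools import product

# Marginal focal sets: (interval, weight), weights summing to 1.
A = [((0.0, 2.0), 0.6), ((1.0, 3.0), 0.4)]   # variable x
B = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.5)]   # variable y

def upper_probability(event_hits, A, B):
    """Plausibility of an event: total weight of joint focals hitting it."""
    total = 0.0
    for (ia, wa), (ib, wb) in product(A, B):
        if event_hits(ia, ib):          # does the event intersect ia x ib?
            total += wa * wb            # product weight of the joint focal
    return total

# Event "x + y >= 4": the box ia x ib intersects it iff max x + max y >= 4.
hits = lambda ia, ib: ia[1] + ib[1] >= 4.0
pl = upper_probability(hits, A, B)      # upper probability of the event
```

Other conditions on the joint weights and on the measures inside the focals yield the other independence types the abstract lists, such as strong independence or unknown interaction.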
Climate Projections for the 21st Century Using Random Sets
, 2003
"... We apply random set theory to an analysis of future climate change. Bounds on cumulative probability are used to quantify uncertainties in natural and socioeconomic factors that influence estimates of global mean temperature. We explore the link of... ..."
Abstract

Cited by 6 (0 self)
We apply random set theory to an analysis of future climate change. Bounds on cumulative probability are used to quantify uncertainties in natural and socioeconomic factors that influence estimates of global mean temperature. We explore the link of...
Computing Lower Expectations with Kuznetsov's Independence Condition
 Zaffalon (Eds.), ISIPTA ’03 – Proceedings of the Third International Symposium on Imprecise Probabilities and Their Applications, Carleton Scientific
, 2003
"... Kuznetsov's condition says that variables X and Y are independent when any product of bounded functions f (X) and g(Y ) behaves in a certain way: the interval of expected values f (X)g(Y )] must be equal to the interval product f (X)]E[g(Y)]. The main result of this paper shows how to comput ..."
Abstract

Cited by 5 (1 self)
Kuznetsov's condition says that variables X and Y are independent when any product of bounded functions f(X) and g(Y) behaves in a certain way: the interval of expected values E[f(X)g(Y)] must be equal to the interval product E[f(X)]E[g(Y)]. The main result of this paper shows how to compute lower expectations using Kuznetsov's condition. We also generalize Kuznetsov's condition to conditional expectation intervals, and study the relationship between Kuznetsov's conditional condition and the semigraphoid properties.
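The interval product that Kuznetsov's condition compares against is elementary interval arithmetic; a minimal sketch with illustrative expectation bounds:

```python
# Interval product as used in Kuznetsov's condition (illustrative numbers).
def interval_product(a, b):
    """Product of real intervals a = (a1, a2), b = (b1, b2):
    the set {x * y : x in a, y in b}, attained at endpoint products."""
    candidates = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(candidates), max(candidates))

# Suppose E[f(X)] is only known to lie in [-1, 2] and E[g(Y)] in [3, 4].
# Kuznetsov-independence then requires E[f(X)g(Y)] to equal their product.
ef, eg = (-1.0, 2.0), (3.0, 4.0)
efg = interval_product(ef, eg)   # -> (-4.0, 8.0)
```

Checking all four endpoint products is needed because signs can flip the extremes, as the negative lower endpoint of E[f(X)] does here.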
Bayesian Networks with Imprecise Probabilities: Theory and Application to Classification
, 2010
"... Bayesian network are powerful probabilistic graphical models for modelling uncertainty. Among others, classification represents an important application: some of the most used classifiers are based on Bayesian networks. Bayesian networks are precise models: exact numeric values should be provided fo ..."
Abstract

Cited by 5 (2 self)
Bayesian networks are powerful probabilistic graphical models for modelling uncertainty. Among other uses, classification represents an important application: some of the most widely used classifiers are based on Bayesian networks. Bayesian networks are precise models: exact numeric values must be provided for their quantification. This requirement is sometimes too narrow; sets of distributions, instead of single distributions, can provide a more realistic description in these cases. Bayesian networks can be generalized to cope with sets of distributions. This leads to a novel class of imprecise probabilistic graphical models, called credal networks. In particular, classifiers based on Bayesian networks are generalized to so-called credal classifiers. Unlike Bayesian classifiers, which always identify a single class as the one maximizing the posterior class probability, a credal classifier may be unable to single out one class. In other words, if the available information is not sufficient, credal classifiers allow for indecision between two or more classes, thus providing a less informative but more robust conclusion than Bayesian classifiers.
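The indecision mechanism described above can be sketched with interval dominance, one common decision criterion for credal classifiers (the paper's classifiers may use maximality instead); the posterior intervals below are hypothetical.

```python
# Interval dominance: class c is dominated when some other class's lower
# posterior bound exceeds c's upper bound; the classifier returns all
# undominated classes, possibly more than one.
def undominated_classes(intervals):
    """intervals: {class: (lower, upper)} posterior probability bounds."""
    winners = []
    for c, (_, hi_c) in intervals.items():
        dominated = any(lo_o > hi_c
                        for o, (lo_o, _) in intervals.items() if o != c)
        if not dominated:
            winners.append(c)
    return sorted(winners)

# Hypothetical posteriors: 'a' and 'b' overlap, so the classifier abstains
# between them; 'c' is dominated and excluded.
posts = {'a': (0.35, 0.55), 'b': (0.30, 0.50), 'c': (0.05, 0.15)}
result = undominated_classes(posts)   # -> ['a', 'b']
```

Returning the set ['a', 'b'] rather than forcing a single choice is exactly the "less informative but more robust" behaviour the abstract describes.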
Separation Properties of Sets of Probability Measures
 In Conference on Uncertainty in Artificial Intelligence
, 2000
"... This paper analyzes independence concepts for sets of probability measures associated with directed acyclic graphs. The paper shows that epistemic independence and the standard Markov condition violate desirable separation properties. The adoption of a contraction condition leads to dseparati ..."
Abstract

Cited by 5 (1 self)
This paper analyzes independence concepts for sets of probability measures associated with directed acyclic graphs. The paper shows that epistemic independence and the standard Markov condition violate desirable separation properties. The adoption of a contraction condition leads to d-separation but still fails to guarantee a belief separation property. To overcome this unsatisfactory situation, a strong Markov condition is proposed, based on epistemic independence. The main result is that the strong Markov condition leads to strong independence and does enforce separation properties; this result implies that (1) separation properties of Bayesian networks do extend to epistemic independence and sets of probability measures, and (2) strong independence has a clear justification based on epistemic independence and the strong Markov condition.
Arithmetic on Random Variables: Squeezing the Envelopes with New Joint Distribution Constraints
"... Uncertainty is a key issue in decision analysis and other kinds of applications. Researchers have developed a number of approaches to address computations on uncertain quantities. When doing arithmetic operations on random variables, an important question has to be considered: the dependency relatio ..."
Abstract

Cited by 1 (1 self)
Uncertainty is a key issue in decision analysis and other kinds of applications. Researchers have developed a number of approaches to address computations on uncertain quantities. When doing arithmetic operations on random variables, an important question has to be considered: the dependency relationships among the variables. In practice, we often have partial information about the dependency relationship between two random variables. This information may result from experience or system requirements. We can use this information to improve bounds on the cumulative distributions of random variables derived from the marginals whose dependency is partially known.
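One way partial dependency information squeezes envelopes can be sketched with a positive-quadrant-dependence (PQD) constraint on the joint CDF. This is an illustrative constraint chosen for the sketch, not necessarily one of the paper's; the marginal CDF values are made up.

```python
# The joint CDF H(x, y) always lies between the Frechet-Hoeffding bounds:
#   max(F(x) + G(y) - 1, 0)  <=  H(x, y)  <=  min(F(x), G(y)).
# If X and Y are additionally known to be positively quadrant dependent
# (PQD), then H(x, y) >= F(x) * G(y), which tightens the lower envelope.
def joint_cdf_bounds(Fx, Gy, pqd=False):
    """Bounds on H(x, y) given marginal CDF values Fx = F(x), Gy = G(y)."""
    lower = max(Fx + Gy - 1.0, 0.0)
    upper = min(Fx, Gy)
    if pqd:
        lower = max(lower, Fx * Gy)   # PQD squeezes the lower bound
    return lower, upper

# Illustrative point: F(x) = 0.7, G(y) = 0.6.
plain    = joint_cdf_bounds(0.7, 0.6)             # about (0.3, 0.6)
squeezed = joint_cdf_bounds(0.7, 0.6, pqd=True)   # lower rises to about 0.42
```

The tightened interval propagates through arithmetic on the variables, which is the sense in which partial dependency knowledge narrows the derived envelopes.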