Results 1–10 of 16
Exact Bayesian structure discovery in Bayesian networks
 J. of Machine Learning Research
, 2004
Cited by 55 (8 self)
Abstract:
We consider a Bayesian method for learning the Bayesian network structure from complete data. Recently, Koivisto and Sood (2004) presented an algorithm that for any single edge computes its marginal posterior probability in O(n·2^n) time, where n is the number of attributes; the number of parents per attribute is bounded by a constant. In this paper we show that the posterior probabilities for all the n(n−1) potential edges can be computed in O(n·2^n) total time. This result is achieved by a forward–backward technique and fast Möbius transform algorithms, which are of independent interest. The resulting speedup by a factor of about n^2 allows us to experimentally study the statistical power of learning moderate-size networks. We report results from a simulation study that covers data sets with 20 to 10,000 records over 5 to 25 discrete attributes.
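The fast Möbius transform this abstract refers to is the standard O(n·2^n) subset-sum (zeta) transform and its inverse. A minimal sketch (a generic textbook version, not the authors' implementation), with subsets of n attributes encoded as bitmasks:

```python
def zeta_transform(f, n):
    """Subset-sum (zeta) transform: g[S] = sum of f[T] over all T subset-of S.
    One pass per bit gives O(n * 2^n) instead of the naive O(4^n)."""
    g = list(f)
    for i in range(n):
        for S in range(1 << n):
            if S & (1 << i):
                g[S] += g[S ^ (1 << i)]   # add the part of S without bit i
    return g

def mobius_transform(g, n):
    """Inverse (Mobius) transform: recovers f from g = zeta_transform(f)."""
    f = list(g)
    for i in range(n):
        for S in range(1 << n):
            if S & (1 << i):
                f[S] -= f[S ^ (1 << i)]
    return f
```

For n = 2, `zeta_transform([1, 2, 3, 4], 2)` yields `[1, 3, 4, 10]`, and `mobius_transform` inverts it exactly.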
Target Identification Based on the Transferable Belief Model Interpretation of Dempster-Shafer Model. Part I: Methodology
, 2001
Cited by 21 (3 self)
Abstract:
This paper explains how multisensor data fusion and target identification can be performed within the transferable belief model, a model for the representation of quantified uncertainty based on belief functions. The paper is presented in two parts: methodology and application. In this part, we present the underlying theory, in particular the General Bayesian Theorem needed to transform likelihoods into beliefs and the pignistic transformation needed to build the probability measure required for decision making. We end with a simple example. More sophisticated examples and some comparative studies are presented in Part II. The results presented here can be extended directly to many problems of data fusion and diagnosis.
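The pignistic transformation mentioned here has a well-known closed form, BetP(x) = Σ_{A ∋ x} m(A)/|A| for a normalized mass function with m(∅) = 0. A minimal sketch of that formula (an illustration, not the paper's own code), with focal sets represented as frozensets:

```python
def pignistic(mass):
    """Pignistic transformation BetP: spread each focal set's mass
    uniformly over its elements (assumes m(emptyset) = 0)."""
    betp = {}
    for focal, m in mass.items():
        for x in focal:
            betp[x] = betp.get(x, 0.0) + m / len(focal)
    return betp

# m({'a'}) = 0.5, m({'a','b'}) = 0.5  ->  BetP(a) = 0.75, BetP(b) = 0.25
m = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.5}
```

The result is an ordinary probability measure over singletons, which is what decision making requires.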
Matrix Calculus for Belief Functions
, 2001
Cited by 19 (0 self)
Abstract:
The mathematics of belief functions can be handled using matrix notation. This representation greatly helps the user thanks to its notational simplicity and its efficiency for proving theorems. We show how to use matrices for several problems related to belief functions and the transferable belief model.
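As a hedged illustration of the matrix view (a generic construction, not the paper's own notation): belief can be read off a mass vector by a single matrix-vector product with the subset-inclusion matrix, indexing the subsets of an n-element frame by bitmask:

```python
import numpy as np

def belief_matrix(n):
    """Inclusion matrix B with B[S, T] = 1 iff emptyset != T subset-of S,
    so that bel = B @ m maps a mass vector (indexed by subset bitmask)
    to the corresponding belief vector."""
    size = 1 << n
    B = np.zeros((size, size))
    for S in range(size):
        for T in range(1, size):
            if T & ~S == 0:       # T is a subset of S
                B[S, T] = 1.0
    return B

# Frame {a, b}: m({a}) = 0.3, m({b}) = 0.2, m({a,b}) = 0.5
m = np.array([0.0, 0.3, 0.2, 0.5])
bel = belief_matrix(2) @ m
```

Here `bel[3]` (the whole frame) comes out to 1.0 and `bel[1]` (the singleton {a}) to 0.3, matching the definition bel(S) = Σ_{∅≠T⊆S} m(T).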
Conjunctive and Disjunctive Combination of Belief Functions Induced by Non-Distinct Bodies of Evidence
 ARTIFICIAL INTELLIGENCE
, 2007
Cited by 17 (9 self)
Abstract:
Dempster’s rule plays a central role in the theory of belief functions. However, it assumes the combined bodies of evidence to be distinct, an assumption which is not always verified in practice. In this paper, a new operator, the cautious rule of combination, is introduced. This operator is commutative, associative and idempotent. The latter property makes it suitable for combining belief functions induced by reliable, but possibly overlapping, bodies of evidence. A dual operator, the bold disjunctive rule, is also introduced. This operator is also commutative, associative and idempotent, and can be used to combine belief functions issued from possibly overlapping and unreliable sources. Finally, the cautious and bold rules are shown to be particular members of infinite families of conjunctive and disjunctive combination rules based on triangular norms and conorms.
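For contrast with the cautious rule, the baseline it departs from (Dempster's rule of combination) is easy to state in code. The sketch below is a standard formulation, not the paper's; the closing note illustrates the non-idempotence that motivates the cautious rule:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination of two mass functions
    (dicts frozenset -> mass), normalized by the conflict mass."""
    combined = {}
    conflict = 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + a * b
            else:
                conflict += a * b          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {C: v / (1.0 - conflict) for C, v in combined.items()}

# Note: dempster_combine(m, m) != m in general -- the rule is not
# idempotent, so combining overlapping evidence double-counts it,
# which is exactly the problem the cautious rule addresses.
```

For example, combining m1 = {m({a}) = 0.6, m({a,b}) = 0.4} with m2 = {m({b}) = 0.5, m({a,b}) = 0.5} gives conflict 0.3 and the normalized masses 3/7, 2/7, 2/7 on {a}, {b}, {a,b}.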
Binary models for marginal independence
 JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B
, 2005
Cited by 16 (2 self)
Abstract:
A number of authors have considered multivariate Gaussian models for marginal independence. In this paper we develop models for binary data with the same independence structure. The models can be parameterized based on Möbius inversion, and maximum likelihood estimation can be performed using a version of the Iterated Conditional Fitting algorithm. The approach is illustrated on a simple example. Relations to multivariate logistic and dependence ratio models are discussed.
Dempster's rule for evidence ordered in a complete directed acyclic graph
 International Journal of Approximate Reasoning
, 1993
Cited by 15 (7 self)
Abstract:
For the case of evidence ordered in a complete directed acyclic graph, this paper presents a new algorithm for Dempster's rule with lower computational complexity than step-by-step application of Dempster's rule. In this problem, every original pair of evidences has a corresponding evidence against the simultaneous belief in both propositions. In this case, it is uncertain whether the propositions of any two evidences are in logical conflict. The original evidences are associated with the vertices and the additional evidences are associated with the edges. The original evidences are ordered, i.e., for every pair of evidences it is determinable which of the two is the earlier one. We are interested in finding the most probable completely specified path through the graph, where transitions are possible only from lower- to higher-ranked vertices. The path is here a representation for a sequence of states, for instance a sequence of snapshots of a physical object's track. A completely specified path means that the path includes no other vertices than those stated in the path representation, as opposed to an incompletely specified path that may also include other vertices than those stated. In a hierarchical network of all subsets of the frame, i.e., of all incompletely specified paths, the original and additional evidences support subsets that are not disjoint; thus it is not possible to prune the network to a tree. Instead of propagating belief, the new algorithm reasons about the logical conditions of a completely specified path through the graph. The new algorithm is O(Θ log Θ), compared to O(Θ^(log Θ)) for the classic brute-force algorithm. After a detailed presentation of the reasoning behind the new algorithm, we conclude that it is feasible to reason without approximation about completely specified paths through a complete directed acyclic graph.
Theory of evidence: a survey of its mathematical foundations, applications and computational aspects
 ZOR MATHEMATICAL METHODS OF OPERATIONS RESEARCH
, 1994
Cited by 15 (0 self)
Abstract:
The mathematical theory of evidence was introduced by Glenn Shafer in 1976 as a new approach to the representation of uncertainty. This theory can be represented under several distinct but more or less equivalent forms. Probabilistic interpretations of evidence theory have their roots in Arthur Dempster's multivalued mappings of probability spaces. This leads to random set and, more generally, to random filter models of evidence. In this probabilistic view, evidence is seen as more or less probable arguments for certain hypotheses, and they can be used to support those hypotheses to certain degrees. These degrees of support are in fact the reliabilities with which the hypotheses can be derived from the evidence. Alternatively, the mathematical theory of evidence can be founded axiomatically on the notion of belief functions or on the allocation of belief masses to subsets of a frame of discernment. These approaches aim to present evidence theory as an extension of probability theory. Evidence theory has been used to represent uncertainty in expert systems, especially in the domain of diagnostics. It can be applied to decision analysis and it gives a new perspective for statistical analysis. Among its further applications are image processing, project planning and scheduling, and risk analysis. The computational problems of evidence theory
Markov Chain Monte-Carlo Algorithms for the Calculation of Dempster-Shafer Belief, technical report, in preparation
, 1994
Cited by 10 (6 self)
Abstract:
A simple Monte-Carlo algorithm can be used to calculate Dempster-Shafer belief very efficiently unless the conflict between the evidences is very high. This paper introduces and explores Markov Chain Monte-Carlo algorithms for calculating Dempster-Shafer belief that can also work well when the conflict is high.
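The "simple Monte-Carlo algorithm" can be sketched as rejection sampling under Dempster's rule (a generic reconstruction, not the paper's own algorithm). The rejection rate equals the conflict between the evidences, which is exactly why high conflict makes this sampler inefficient:

```python
import random

def mc_belief(masses, query, trials=20000, seed=0):
    """Monte-Carlo estimate of bel(query): sample one focal set per
    evidence (proportionally to its mass), intersect them, reject
    empty intersections (the rejection rate is the conflict), and
    count how often the surviving intersection lies inside the query."""
    rng = random.Random(seed)
    hits = accepted = 0
    while accepted < trials:
        inter = None
        for mass in masses:
            focals, weights = zip(*mass.items())
            focal = rng.choices(focals, weights=weights)[0]
            inter = focal if inter is None else inter & focal
        if not inter:              # conflicting sample: resample
            continue
        accepted += 1
        if inter <= query:         # intersection supports the query
            hits += 1
    return hits / trials
```

With a single evidence m({a}) = 0.7, m({a,b}) = 0.3 and query {a}, the exact belief is 0.7 and the estimate converges to it; with highly conflicting evidences, almost every sample is rejected, motivating the Markov-chain variants studied in the paper.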
Monte-Carlo methods make Dempster-Shafer formalism feasible
 In
, 1994
Cited by 8 (3 self)
Abstract:
Abstract: One of the main obstacles to the applications of the Dempster-Shafer formalism is its computational complexity. If we combine m different pieces of knowledge, then in the general case we have to perform up to 2^m computational steps, which for large m is infeasible. For several important cases algorithms with smaller running time have been proposed. We prove, however, that if we want to compute the belief bel(Q) in any given query Q, then exponential time is inevitable. It is still inevitable if we want to compute bel(Q) with given precision ε. This restriction corresponds to the natural idea that since initial masses are known only approximately, there is no sense in trying to compute bel(Q) precisely. A further idea is that there is always some doubt in the whole knowledge, so there is always a probability P0 that the expert's knowledge is wrong. In view of that, it is sufficient to have an algorithm that gives a correct answer with probability > 1 − P0. If we use the original Dempster's combination rule, this possibility diminishes the running time, but still leaves the problem infeasible in the general case. We show that for the alternative combination rules proposed by Smets and Yager
Fast-Division Architecture for Dempster-Shafer Belief Functions
 In First Int. Joint Conf. on Qualitative and Quantitative Practical Reasoning, ECSQARU–FAPR’97, Springer, Lecture Notes in Artif. Intell.
, 1997
Cited by 4 (2 self)
Abstract:
Abstract. Given a number of Dempster-Shafer belief functions, there are different architectures which allow a compilation of the given knowledge: the Shenoy-Shafer Architecture, the Lauritzen-Spiegelhalter Architecture and the HUGIN Architecture. We propose a new architecture, called the “Fast-Division Architecture”, which is similar to the former two, but with two important advantages: (i) results of intermediate computations are always valid Dempster-Shafer belief functions, and (ii) some operations can often be performed much more efficiently.