Results 1–10 of 11
Axioms for probability and belief-function propagation
 Uncertainty in Artificial Intelligence
, 1990
"... In this paper, we describe an abstract framework and axioms under which exact local computation of marginals is possible. The primitive objects of the framework are variables and valuations. The primitive operators of the framework are combination and marginalization. These operate on valuations. We ..."
Abstract

Cited by 137 (17 self)
In this paper, we describe an abstract framework and axioms under which exact local computation of marginals is possible. The primitive objects of the framework are variables and valuations. The primitive operators of the framework are combination and marginalization. These operate on valuations. We state three axioms for these operators and we derive the possibility of local computation from the axioms. Next, we describe a propagation scheme for computing marginals of a valuation when we have a factorization of the valuation on a hypertree. Finally, we show how the problem of computing marginals of joint probability distributions and joint belief functions fits the general framework.
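To make the framework concrete, here is a minimal sketch (not from the paper) of its two primitive operators for the probability case: combination as pointwise multiplication of tables and marginalization as summing variables out. The table representation and function names are illustrative assumptions.

```python
from itertools import product

def combine(v1, vars1, v2, vars2):
    """Combine two valuations: pointwise product on the union of their variables."""
    all_vars = sorted(set(vars1) | set(vars2))
    out = {}
    for config in product([0, 1], repeat=len(all_vars)):  # binary variables
        assign = dict(zip(all_vars, config))
        k1 = tuple(assign[v] for v in vars1)
        k2 = tuple(assign[v] for v in vars2)
        out[config] = v1[k1] * v2[k2]
    return out, all_vars

def marginalize(v, vars_, keep):
    """Marginalize a valuation down to the variables in `keep` by summing."""
    kept = [x for x in vars_ if x in keep]
    out = {}
    for config, val in v.items():
        key = tuple(c for x, c in zip(vars_, config) if x in keep)
        out[key] = out.get(key, 0.0) + val
    return out, kept

# p(A) and p(B|A) as tables over (A,) and (A, B)
pA = {(0,): 0.6, (1,): 0.4}
pBgA = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}
joint, jv = combine(pA, ["A"], pBgA, ["A", "B"])
pB, _ = marginalize(joint, jv, ["B"])
# pB[(0,)] = 0.6*0.9 + 0.4*0.2 = 0.62; pB[(1,)] = 0.38
```

The point of the axioms is that, on a hypertree factorization, such marginals can be computed without ever materializing the full joint table.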
Perspectives on the Theory and Practice of Belief Functions
 International Journal of Approximate Reasoning
, 1990
"... The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answer ..."
Abstract

Cited by 86 (7 self)
The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answers to that question. The theory of belief functions is more flexible; it allows us to derive degrees of belief for a question from probabilities for a related question. These degrees of belief may or may not have the mathematical properties of probabilities; how much they differ from probabilities will depend on how closely the two questions are related. Examples of what we would now call belief-function reasoning can be found in the late seventeenth and early eighteenth centuries, well before Bayesian ideas were developed. In 1689, George Hooper gave rules for combining testimony that can be recognized as special cases of Dempster's rule for combining belief functions (Shafer 1986a). Similar rules were formulated by Jakob Bernoulli in his Ars Conjectandi, published posthumously in 1713, and by Johann Heinrich Lambert in his Neues Organon, published in 1764 (Shafer 1978). Examples of belief-function reasoning can also be found in more recent work, by authors ...
Computation in Valuation Algebras
 In Handbook of Defeasible Reasoning and Uncertainty Management Systems, Volume 5: Algorithms for Uncertainty and Defeasible Reasoning
, 1999
"... Many different formalisms for treating uncertainty or, more generally, information and knowledge, have a common underlying algebraic structure. ..."
Abstract

Cited by 22 (4 self)
Many different formalisms for treating uncertainty or, more generally, information and knowledge, have a common underlying algebraic structure.
Optimality issues in constructing a Markov tree from graphical models
 Computational and Graphical Statistics
, 1991
"... Several recent papers have described probability models which used graph and hypergraphs to represent relationships among the variables. Two related computing algorithms are commonly used to manipulate such models: the peeling algorithm which eliminates variables one at a time to find the marginal d ..."
Abstract

Cited by 16 (0 self)
Several recent papers have described probability models which use graphs and hypergraphs to represent relationships among the variables. Two related computing algorithms are commonly used to manipulate such models: the peeling algorithm, which eliminates variables one at a time to find the marginal distribution of a single variable, and the fusion and propagation algorithm, which simultaneously solves for many marginal distributions by passing messages in a tree of cliques whose nodes correspond to subsets of variables. The peeling algorithm requires an elimination order. As demonstrated in this paper, the elimination order can in turn be used to construct a tree of cliques for propagation and fusion. This paper addresses three computational issues: (1) The choice of elimination order determines the size of the largest node of the tree of cliques, which dominates the computational cost for the probability model using either peeling or fusion and propagation. We review heuristics for choosing an elimination order. (2) Inserting intersection nodes into the tree of cliques produces a junction tree which has a lower computational cost. We present an algorithm which produces junction trees with high computational efficiency. (3) Augmenting the tree of cliques with additional nodes can lead to a new tree structure which more clearly expresses the relationship between the original graphical model and the tree model.
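One of the simplest elimination-order heuristics of the kind the paper reviews is min-degree: repeatedly eliminate a vertex of smallest degree, connecting its neighbors into a clique. The sketch below is an illustrative implementation (function name and graph encoding are assumptions, not the authors' code):

```python
def min_degree_order(graph):
    """Choose an elimination order by repeatedly removing a minimum-degree vertex."""
    g = {v: set(nbrs) for v, nbrs in graph.items()}  # work on a copy
    order = []
    while g:
        v = min(g, key=lambda x: len(g[x]))   # vertex of smallest current degree
        nbrs = g.pop(v)
        order.append(v)
        for a in nbrs:                        # fill-in: connect v's neighbors
            g[a].discard(v)
            g[a] |= nbrs - {a}
    return order

# A small graph: chain A-B-C-D plus the edge B-D
g = {"A": {"B"}, "B": {"A", "C", "D"}, "C": {"B", "D"}, "D": {"B", "C"}}
order = min_degree_order(g)
print(order)  # ['A', 'B', 'C', 'D'] (ties broken by insertion order)
```

The sizes of the cliques created during elimination are exactly the node sizes of the resulting tree of cliques, which is why the choice of order matters.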
Theory of evidence – a survey of its mathematical foundations, applications and computational aspects
 ZOR – Mathematical Methods of Operations Research
, 1994
"... The mathematical theory of evidence has been introduced by Glenn Shafer in 1976 as a new approach to the representation of uncertainty. This theory can be represented under several distinct but more or less equivalent forms. Probabilistic interpretations of evidence theory have their roots in Arthur ..."
Abstract

Cited by 15 (0 self)
The mathematical theory of evidence was introduced by Glenn Shafer in 1976 as a new approach to the representation of uncertainty. This theory can be represented in several distinct but more or less equivalent forms. Probabilistic interpretations of evidence theory have their roots in Arthur Dempster's multivalued mappings of probability spaces. This leads to random set and, more generally, to random filter models of evidence. In this probabilistic view, evidence is seen as more or less probable arguments for certain hypotheses, and these arguments can be used to support those hypotheses to certain degrees. These degrees of support are in fact the reliabilities with which the hypotheses can be derived from the evidence. Alternatively, the mathematical theory of evidence can be founded axiomatically on the notion of belief functions or on the allocation of belief masses to subsets of a frame of discernment. These approaches aim to present evidence theory as an extension of probability theory. Evidence theory has been used to represent uncertainty in expert systems, especially in the domain of diagnostics. It can be applied to decision analysis and it gives a new perspective for statistical analysis. Among its further applications are image processing, project planning and scheduling, and risk analysis. The computational problems of evidence theory ...
Belief Decision Trees: Theoretical foundations
, 2000
"... This paper extends the decision tree technique to an uncertain environment where the uncertainty is represented by belief functions as interpreted in the Transferable Belief Model (TBM). This socalled belief decision tree is a new classification method adapted to uncertain data. We will be concerne ..."
Abstract

Cited by 15 (4 self)
This paper extends the decision tree technique to an uncertain environment where the uncertainty is represented by belief functions as interpreted in the Transferable Belief Model (TBM). This so-called belief decision tree is a new classification method adapted to uncertain data. We will be concerned with the construction of the belief decision tree from a training set where the knowledge about the instances' classes is represented by belief functions, and with its use for the classification of new instances where the knowledge about the attributes' values is represented by belief functions.
Keywords: Belief Functions, Decision Tree, Belief Decision Tree, Classification, Transferable Belief Model.
1 Introduction
Several learning methods have been developed to ensure classification. Among these, the decision tree method may be one of the most commonly used in supervised learning approaches. Indeed, decision trees are characterized by their capability to break down a complex decision problem ...
Using Dempster-Shafer's Belief-Function Theory in Expert Systems
 Advances in the Dempster-Shafer Theory of Evidence, 395–414
, 1994
"... The main objective of this paper is to describe how DempsterShafer’s (DS) theory of belief functions fits in the framework of valuationbased systems (VBS). Since VBS serve as a framework for managing uncertainty in expert systems, this facilitates the use of DS belieffunction theory in expert sys ..."
Abstract

Cited by 6 (6 self)
The main objective of this paper is to describe how Dempster-Shafer's (DS) theory of belief functions fits in the framework of valuation-based systems (VBS). Since VBS serve as a framework for managing uncertainty in expert systems, this facilitates the use of DS belief-function theory in expert systems.
Keywords: Dempster-Shafer's theory of belief functions, valuation-based systems, expert systems.
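In VBS terms, the combination operator for DS belief functions is Dempster's rule. A minimal sketch on a two-element frame of discernment (the representation of focal sets as frozensets and all names are illustrative assumptions):

```python
def dempster_combine(m1, m2):
    """Combine two basic belief assignments by Dempster's rule of combination."""
    out = {}
    conflict = 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = b & c                      # intersect the focal sets
            if a:
                out[a] = out.get(a, 0.0) + mb * mc
            else:
                conflict += mb * mc        # mass falling on the empty set
    # renormalize by the non-conflicting mass
    return {a: v / (1.0 - conflict) for a, v in out.items()}

# Frame {rain, sun}; two independent pieces of evidence
m1 = {frozenset({"rain"}): 0.7, frozenset({"rain", "sun"}): 0.3}
m2 = {frozenset({"sun"}): 0.6, frozenset({"rain", "sun"}): 0.4}
m12 = dempster_combine(m1, m2)
```

The combined masses again sum to one, and the mass left on the whole frame {rain, sun} reflects the remaining ignorance.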
Using possibility theory in expert systems
 Fuzzy Sets and Systems
, 1992
"... This paper has two main objectives. The first objective is to give a characterization of a qualitative description of a possibility function. A qualitative description of a possibility function is called a consistent possibilistic state. The qualitative description and its characterization serve as ..."
Abstract

Cited by 5 (3 self)
This paper has two main objectives. The first objective is to give a characterization of a qualitative description of a possibility function. A qualitative description of a possibility function is called a consistent possibilistic state. The qualitative description and its characterization serve as qualitative semantics for possibility functions. These semantics are useful in representing knowledge as possibility functions. The second objective is to describe how Zadeh's theory of possibility fits in the framework of valuation-based systems (VBS). Since VBS serve as a framework for managing uncertainty and imprecision in expert systems, this facilitates the use of possibility theory in expert systems.
Axioms for dynamic programming
 Computational Learning and Probabilistic Reasoning
, 1996
"... This paper describes an abstract framework, called valuation network (VN), for representing and solving discrete optimization problems. In VNs, we represent information in an optimization problem using functions called valuations. Valuations represent factors of an objective function. Solving a VN i ..."
Abstract

Cited by 2 (0 self)
This paper describes an abstract framework, called valuation network (VN), for representing and solving discrete optimization problems. In VNs, we represent information in an optimization problem using functions called valuations. Valuations represent factors of an objective function. Solving a VN involves using two operators called combination and marginalization. The combination operator tells us how to combine the factors of the objective function to form the global objective function (also called the joint valuation). Marginalization is either maximization or minimization. Solving a VN can be described simply as finding the marginal of the joint valuation for the empty set. We state some simple axioms that combination and marginalization need to satisfy to enable us to solve a VN using local computation. We describe a fusion algorithm for solving a VN using local computation. For optimization problems, the fusion algorithm reduces to a nonserial version of the dynamic programming method, and the axioms can be viewed as conditions that permit the use of dynamic programming. Subject classification: Dynamic programming: theory, algorithm.
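The reduction to dynamic programming can be illustrated with combination as addition and marginalization as minimization: eliminating variables one at a time, each within the factors that mention it, yields the same marginal of the joint valuation for the empty set as brute-force minimization. The factors below are made-up examples for illustration:

```python
from itertools import product

# Two valuations (factors of the objective): g on {X, Y}, h on {Y, Z}
g = {(x, y): (x - y) ** 2 for x in range(3) for y in range(3)}
h = {(y, z): (y + z - 3) ** 2 for y in range(3) for z in range(3)}

# Brute force: minimize the joint valuation g + h over all configurations
brute = min(g[(x, y)] + h[(y, z)]
            for x, y, z in product(range(3), repeat=3))

# Local computation (nonserial dynamic programming):
# eliminate Z inside h, eliminate X inside g, then minimize over Y
h_z = {y: min(h[(y, z)] for z in range(3)) for y in range(3)}  # min over Z
g_x = {y: min(g[(x, y)] for x in range(3)) for y in range(3)}  # min over X
local = min(g_x[y] + h_z[y] for y in range(3))                 # min over Y

assert brute == local
```

The local version never enumerates the full configuration space; it touches each factor only over its own variables, which is exactly what the axioms license.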