Results 1–10 of 39
Fusion, Propagation, and Structuring in Belief Networks
 ARTIFICIAL INTELLIGENCE
, 1986
Abstract

Cited by 382 (7 self)
Belief networks are directed acyclic graphs in which the nodes represent propositions (or variables), the arcs signify direct dependencies between the linked propositions, and the strengths of these dependencies are quantified by conditional probabilities. A network of this sort can be used to represent the generic knowledge of a domain expert, and it turns into a computational architecture if the links are used not merely for storing factual knowledge but also for directing and activating the data flow in the computations which manipulate this knowledge. The first part of the paper deals with the task of fusing and propagating the impacts of new information through the networks in such a way that, when equilibrium is reached, each proposition will be assigned a measure of belief consistent with the axioms of probability theory. It is shown that if the network is singly connected (e.g. tree-structured), then probabilities can be updated by local propagation in an isomorphic network of parallel and autonomous processors and that the impact of new information can be imparted to all propositions in time proportional to the longest path in the network. The second part of the paper deals with the problem of finding a tree-structured representation for a collection of probabilistically coupled propositions using auxiliary (dummy) variables, colloquially called "hidden causes." It is shown that if such a tree-structured representation exists, then it is possible to uniquely uncover the topology of the tree by observing pairwise dependencies among the available propositions (i.e., the leaves of the tree). The entire tree structure, including the strengths of all internal relationships, can be reconstructed in time proportional to n log n, where n is the number of leaves.
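The local propagation the abstract describes can be sketched as λ-message passing on a three-node chain A → B → C with evidence C = 1. All CPT numbers below are invented for illustration; this is a minimal sketch, not the paper's algorithm verbatim.

```python
# Minimal lambda-message propagation on the chain A -> B -> C,
# computing P(A | C=1) from purely local computations.
P_A = {0: 0.8, 1: 0.2}                           # prior P(A)
P_B_given_A = {0: {0: 0.7, 1: 0.3},              # P_B_given_A[a][b] = P(B=b | A=a)
               1: {0: 0.1, 1: 0.9}}
P_C_given_B = {0: 0.2, 1: 0.95}                  # P(C=1 | B=b)

# lambda message from the evidence node C to B: lambda_B(b) = P(C=1 | b)
lam_B = {b: P_C_given_B[b] for b in (0, 1)}
# lambda message from B to A: lambda_A(a) = sum_b P(b | a) * lambda_B(b)
lam_A = {a: sum(P_B_given_A[a][b] * lam_B[b] for b in (0, 1)) for a in (0, 1)}
# fuse with the prior and normalize to get P(A | C=1)
unnorm = {a: P_A[a] * lam_A[a] for a in (0, 1)}
z = sum(unnorm.values())
posterior_A = {a: v / z for a, v in unnorm.items()}
```

Each message depends only on a node's neighbors, so the evidence reaches A after a number of steps proportional to the path length, matching the abstract's complexity claim.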
Bucket Elimination: A Unifying Framework for Probabilistic Inference
, 1996
Abstract

Cited by 291 (31 self)
Probabilistic inference algorithms for belief updating, finding the most probable explanation, the maximum a posteriori hypothesis, and the maximum expected utility are reformulated within the bucket elimination framework. This emphasizes the principles common to many of the algorithms appearing in the probabilistic inference literature and clarifies the relationship of such algorithms to nonserial dynamic programming algorithms. A general method for combining conditioning and bucket elimination is also presented. For all the algorithms, bounds on complexity are given as a function of the problem's structure.
1. Overview Bucket elimination is a unifying algorithmic framework that generalizes dynamic programming to accommodate algorithms for many complex problem-solving and reasoning activities, including directional resolution for propositional satisfiability (Davis and Putnam, 1960), adaptive consistency for constraint satisfaction (Dechter and Pearl, 1987), Fourier and Gaussian el...
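For belief updating, the bucket scheme can be sketched on a tiny chain A → B → C: each bucket multiplies its factors and sums out one variable. The factor tables below are invented for illustration, not drawn from the paper.

```python
from itertools import product

# Bucket elimination sketch: compute the marginal P(C) on the chain
# A -> B -> C by eliminating A, then B.
vals = (0, 1)
fA = {(a,): (0.6, 0.4)[a] for a in vals}                                         # P(A)
fBA = {(a, b): ((0.9, 0.1), (0.2, 0.8))[a][b] for a, b in product(vals, vals)}   # P(B|A)
fCB = {(b, c): ((0.7, 0.3), (0.25, 0.75))[b][c] for b, c in product(vals, vals)} # P(C|B)

# bucket(A): multiply P(A) and P(B|A), sum out A -> message over B
msg_B = {b: sum(fA[(a,)] * fBA[(a, b)] for a in vals) for b in vals}
# bucket(B): multiply the message and P(C|B), sum out B -> marginal P(C)
P_C = {c: sum(msg_B[b] * fCB[(b, c)] for b in vals) for c in vals}
```

The elimination order (here A before B) is what determines the complexity bound the abstract mentions: each bucket's cost is exponential only in the number of variables appearing together in its factors.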
Fundamental Concepts of Qualitative Probabilistic Networks
 ARTIFICIAL INTELLIGENCE
, 1990
Abstract

Cited by 121 (6 self)
Graphical representations for probabilistic relationships have recently received considerable attention in AI. Qualitative probabilistic networks abstract from the usual numeric representations by encoding only qualitative relationships, which are inequality constraints on the joint probability distribution over the variables. Although these constraints are insufficient to determine probabilities uniquely, they are designed to justify the deduction of a class of relative likelihood conclusions that imply useful decision-making properties. Two types of qualitative relationship are defined, each a probabilistic form of monotonicity constraint over a group of variables. Qualitative influences describe the direction of the relationship between two variables. Qualitative synergies describe interactions among influences. The probabilistic definitions chosen justify sound and efficient inference procedures based on graphical manipulations of the network. These procedures answer queries about qualitative relationships among variables separated in the network and determine structural properties of optimal assignments to decision variables.
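The core of qualitative inference can be illustrated with a sign algebra: influences along a chain compose by sign product, and parallel paths combine by sign addition, yielding '?' when signs conflict. This is a simplified sketch of the idea, not the paper's full procedure.

```python
# Sign algebra for qualitative influences: '+', '-', '0', and '?'.
def sign_mul(s, t):
    """Compose two qualitative influences in series."""
    if '0' in (s, t):
        return '0'
    if '?' in (s, t):
        return '?'
    return '+' if s == t else '-'

def sign_add(s, t):
    """Combine two parallel qualitative influences."""
    if s == '0':
        return t
    if t == '0':
        return s
    return s if s == t else '?'

# A --(+)--> B --(-)--> C in series, plus a parallel path A --(+)--> D --(+)--> C
path1 = sign_mul('+', '-')     # net negative influence along the first path
path2 = sign_mul('+', '+')     # net positive influence along the second
net = sign_add(path1, path2)   # conflicting signs: the qualitative answer is '?'
```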
Towards normative expert systems: Part I. The Pathfinder project
 Methods of Information in Medicine
, 1992
Random Worlds and Maximum Entropy
 In Proc. 7th IEEE Symp. on Logic in Computer Science
, 1994
Abstract

Cited by 49 (12 self)
Given a knowledge base KB containing first-order and statistical facts, we consider a principled method, called the random-worlds method, for computing a degree of belief that some formula φ holds given KB. If we are reasoning about a world or system consisting of N individuals, then we can consider all possible worlds, or first-order models, with domain {1, …, N} that satisfy KB, and compute the fraction of them in which φ is true. We define the degree of belief to be the asymptotic value of this fraction as N grows large. We show that when the vocabulary underlying φ and KB uses constants and unary predicates only, we can naturally associate an entropy with each world. As N grows larger, there are many more worlds with higher entropy. Therefore, we can use a maximum-entropy computation to compute the degree of belief. This result is in a similar spirit to previous work in physics and artificial intelligence, but is far more general. Of equal interest to the result itself are...
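The fraction-of-worlds definition can be computed by brute force for a tiny vocabulary with one unary predicate P over domain {1, ..., N}. The KB ("at least one individual satisfies P") and the query ("P(1)") are invented examples, not taken from the paper.

```python
from itertools import combinations

# Brute-force random-worlds degree of belief in P(1) given the KB
# "at least one individual satisfies P", over a domain of size N.
def degree_of_belief(N):
    domain = range(1, N + 1)
    # every extension of the predicate P is one possible world
    worlds = [set(s) for k in range(N + 1) for s in combinations(domain, k)]
    models = [w for w in worlds if len(w) >= 1]     # worlds satisfying the KB
    holds = [w for w in models if 1 in w]           # ...in which P(1) is true
    return len(holds) / len(models)
```

Even at N = 10 the fraction is already very close to the asymptotic degree of belief 1/2, illustrating why only the large-N limit matters.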
Network delay tomography
 IEEE Transactions on Signal Processing
, 2003
Abstract

Cited by 42 (3 self)
Abstract—The substantial overhead of performing internal network monitoring motivates techniques for inferring spatially localized information about performance using only end-to-end measurements. In this paper, we present a novel methodology for inferring the queuing delay distributions across internal links in the network based solely on unicast, end-to-end measurements. The major contributions are: 1) we formulate a measurement procedure for estimation and localization of delay distribution based on end-to-end packet pairs; 2) we develop a simple way to compute maximum likelihood estimates (MLEs) using the expectation-maximization (EM) algorithm; 3) we develop a new estimation methodology based on a recently proposed nonparametric, wavelet-based density estimation method; and 4) we optimize the computational complexity of the EM algorithm by developing a new fast Fourier transform implementation. Realistic network simulations are carried out using the network-level simulator ns2 to demonstrate the accuracy of the estimation procedure. Index Terms—Computer network performance, delay estimation, Internet, tomography. I.
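The EM idea can be shown on a toy version of the problem: two links in series, each adding a unit delay with unknown probability, observed only through the end-to-end delay sum of each probe. The histogram and all numbers are invented; the actual method estimates full per-link delay distributions, not single probabilities.

```python
# Toy EM for two-link delay "tomography": recover per-link delay
# probabilities p1, p2 from end-to-end sums alone.
counts = {0: 40, 1: 50, 2: 10}     # observed end-to-end delays over 100 probes
n = sum(counts.values())
p1, p2 = 0.3, 0.6                  # asymmetric start to break the link symmetry

for _ in range(300):
    # E-step: posterior that link 1 delayed, given an end-to-end sum of 1
    q1 = p1 * (1 - p2) / (p1 * (1 - p2) + (1 - p1) * p2)
    e1 = counts[2] + counts[1] * q1        # expected link-1 delay events
    e2 = counts[2] + counts[1] * (1 - q1)  # expected link-2 delay events
    # M-step: update the per-link delay probabilities
    p1, p2 = e1 / n, e2 / n
```

Only the sum observations with value 1 are ambiguous, so the E-step needs a posterior only for those; the estimates converge to the MLE pair {0.2, 0.5} (identifiable up to a link swap).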
Performing bayesian inference by weighted model counting
 In Proceedings of the National Conference on Artificial Intelligence (AAAI
, 2005
Abstract

Cited by 28 (0 self)
Over the past decade general satisfiability testing algorithms have proven to be surprisingly effective at solving a wide variety of constraint satisfaction problems, such as planning and scheduling (Kautz and Selman 2003). Solving such NP-complete tasks by “compilation to SAT” has turned out to be an approach that is of both practical and theoretical interest. Recently, Sang et al. (2004) have shown that state-of-the-art SAT algorithms can be efficiently extended to the harder task of counting the number of models (satisfying assignments) of a formula, by employing a technique called component caching. This paper begins to investigate the question of whether “compilation to model-counting” could be a practical technique for solving real-world #P-complete problems, in particular Bayesian inference. We describe an efficient translation from Bayesian networks to weighted model counting, extend the best model-counting algorithms to weighted model counting, develop an efficient method for computing all marginals in a single counting pass, and evaluate the approach on computationally challenging reasoning problems.
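The reduction can be sketched in miniature: each complete truth assignment is weighted by the product of its CPT entries, and a marginal becomes the weighted count of the models of the query. The two-node network A → B and its numbers are invented; real encoders emit CNF with parameter variables rather than enumerate assignments.

```python
from itertools import product

# Bayesian inference as weighted model counting on the network A -> B.
P_A = {True: 0.3, False: 0.7}
P_B_given_A = {True: 0.9, False: 0.2}     # P(B=true | A)

def weight(a, b):
    """Weight of a complete assignment: product of its CPT entries."""
    pb = P_B_given_A[a]
    return P_A[a] * (pb if b else 1 - pb)

def wmc(query):
    """Weighted count over the truth assignments satisfying `query`."""
    return sum(weight(a, b)
               for a, b in product((True, False), repeat=2)
               if query(a, b))

p_b = wmc(lambda a, b: b) / wmc(lambda a, b: True)   # P(B=true)
```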
A Computational Scheme For Reasoning In Dynamic Probabilistic Networks
, 1992
Abstract

Cited by 23 (0 self)
A computational scheme for reasoning about dynamic systems using (causal) probabilistic networks is presented. The scheme is based on the framework of Lauritzen & Spiegelhalter (1988), and may be viewed as a generalization of the inference methods of classical time-series analysis in the sense that it allows description of nonlinear, multivariate dynamic systems with complex conditional independence structures. Further, the scheme provides a method for efficient backward smoothing and possibilities for efficient, approximate forecasting methods. The scheme has been implemented on top of the HUGIN shell.
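The classical special case that such schemes generalize is forward-backward smoothing on a hidden chain. The sketch below uses a two-state chain with invented transition, observation, and prior numbers; it is an illustration of the smoothing task, not the paper's HUGIN-based implementation.

```python
# Forward-backward smoothing on a two-state hidden chain.
T = [[0.9, 0.1], [0.3, 0.7]]     # T[i][j] = P(x_{t+1}=j | x_t=i)
E = [[0.8, 0.2], [0.4, 0.6]]     # E[i][o] = P(obs=o | state=i)
prior = [0.5, 0.5]
obs = [0, 0, 1]

# forward pass: alpha[t][j] = P(x_t=j, obs[0..t])
alpha = [[prior[i] * E[i][obs[0]] for i in range(2)]]
for o in obs[1:]:
    prev = alpha[-1]
    alpha.append([sum(prev[i] * T[i][j] for i in range(2)) * E[j][o]
                  for j in range(2)])
# backward pass: beta[t][i] = P(obs[t+1..] | x_t=i)
beta = [[1.0, 1.0]]
for o in reversed(obs[1:]):
    nxt = beta[0]
    beta.insert(0, [sum(T[i][j] * E[j][o] * nxt[j] for j in range(2))
                    for i in range(2)])
# smoothed posteriors P(x_t | all observations)
smoothed = []
for a, b in zip(alpha, beta):
    u = [a[i] * b[i] for i in range(2)]
    z = sum(u)
    smoothed.append([v / z for v in u])
```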
Towards Precision of Probabilistic Bounds Propagation
 PROC. OF THE 8 TH CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE
, 1992
Abstract

Cited by 19 (1 self)
The DUCK-calculus presented here is a recent approach to cope with probabilistic uncertainty in a sound and efficient way. Uncertain rules with bounds for probabilities and explicit conditional independences can be maintained incrementally. The basic inference mechanism relies on local bounds propagation, implementable by deductive databases with a bottom-up fixpoint evaluation. In situations where no precise bounds are deducible, it can be combined with simple operations research techniques on a local scope. In particular, we provide new precise analytical bounds for probabilistic entailment.
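The flavor of local bounds propagation can be shown in a small sketch: given interval-valued P(A) and conditionals, P(B) = P(A)·P(B|A) + (1−P(A))·P(B|¬A) is multilinear, so its extrema over the intervals occur at the interval endpoints. All interval values below are invented for illustration; the actual calculus maintains such bounds incrementally over rule sets.

```python
from itertools import product

# Interval bounds on P(B) from interval-valued inputs, by evaluating
# the multilinear expression at all interval endpoints.
PA = (0.2, 0.4)      # P(A) lies in [0.2, 0.4]
PB_A = (0.7, 0.9)    # P(B | A)
PB_nA = (0.1, 0.2)   # P(B | not A)

candidates = [pa * pb + (1 - pa) * pn
              for pa, pb, pn in product(PA, PB_A, PB_nA)]
lower, upper = min(candidates), max(candidates)   # bounds on P(B)
```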
Graphical Explanation in Belief Networks
 In Journal of Computational and Graphical Statistics
, 1997
Abstract

Cited by 16 (4 self)
Belief networks provide an important bridge between statistical modeling and expert systems. In this paper we present methods for visualizing probabilistic "evidence flows" in belief networks, thereby enabling belief networks to explain their behavior. Building on earlier research on explanation in expert systems, we present a hierarchy of explanations, ranging from simple colorings to detailed displays. Our approach complements parallel work on textual explanations in belief networks. GRAPHICAL-BELIEF, MathSoft Inc.'s belief network software, implements the methods.
1 Introduction A fundamental reason for building a mathematical or statistical model is to foster deeper understanding of complex, real-world systems. Consequently, explanations (descriptions of the mechanisms which comprise such models) form an important part of model validation, exploration, and use. Early tests of rule-based expert system models indicated the critical need for detailed explanations in that setting (...