Results 1-10 of 77
Exploiting Causal Independence in Bayesian Network Inference
Journal of Artificial Intelligence Research, 1996
Cited by 155 (9 self)
Abstract
A new method is proposed for exploiting causal independencies in exact Bayesian network inference.
Selectivity Estimation using Probabilistic Models
2001
Cited by 80 (3 self)
Abstract
Estimating the result size of complex queries that involve selection on multiple attributes and the join of several relations is a difficult but fundamental task in database query processing. It arises in cost-based query optimization, query profiling, and approximate query answering. In this paper, we show how probabilistic graphical models can be effectively used for this task as an accurate and compact approximation of the joint frequency distribution of multiple attributes across multiple relations. Probabilistic Relational Models (PRMs) are a recent development that extends graphical statistical models such as Bayesian Networks to relational domains. They represent the statistical dependencies between attributes within a table, and between attributes across foreign-key joins. We provide an efficient algorithm for constructing a PRM from a database, and show how a PRM can be used to compute selectivity estimates for a broad class of queries. One of the major contributions of this work is a unified framework for the estimation of queries involving both select and foreign-key join operations. Furthermore, our approach is not limited to answering a small set of predetermined queries; a single model can be used to effectively estimate the sizes of a wide collection of potential queries across multiple tables. We present results for our approach on several real-world databases. For both single-table multi-attribute queries and a general class of select-join queries, our approach produces more accurate estimates than standard approaches to selectivity estimation, using comparable space and time.
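The core idea of the abstract can be seen in a minimal sketch: approximate a joint frequency distribution by a factored model and use it for selectivity estimation. This is NOT the paper's PRM construction algorithm; the two-attribute table and its factorization P(A, B) = P(A) P(B | A) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-table data: attribute B is correlated with attribute A.
n_rows = 10_000
a = rng.integers(0, 4, size=n_rows)            # attribute A, 4 values
b = (a + rng.integers(0, 2, size=n_rows)) % 4  # attribute B, depends on A

# "Learn" the factored model from the table: P(A) and P(B | A).
p_a = np.bincount(a, minlength=4) / n_rows
p_b_given_a = np.array([np.bincount(b[a == v], minlength=4) / (a == v).sum()
                        for v in range(4)])

# Selectivity of the conjunctive predicate A = 2 AND B = 3.
naive = p_a[2] * (np.bincount(b, minlength=4) / n_rows)[3]  # independence assumption
model = p_a[2] * p_b_given_a[2, 3]                          # factored model
true = np.mean((a == 2) & (b == 3))
print(f"independence={naive:.4f}  model={model:.4f}  true={true:.4f}")
```

With only two attributes, the factored model captures the joint exactly, while the attribute-independence estimate used by standard optimizers is far off; the paper's contribution is extending this effect across foreign-key joins with comparable space.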
Local Learning in Probabilistic Networks With Hidden Variables
1995
Cited by 76 (4 self)
Abstract
Probabilistic networks, which provide compact descriptions of complex stochastic relationships among several random variables, are rapidly becoming the tool of choice for uncertain reasoning in artificial intelligence. We show that networks with fixed structure containing hidden variables can be learned automatically from data using a gradient-descent mechanism similar to that used in neural networks. We also extend the method to networks with intensionally represented distributions, including networks with continuous variables and dynamic probabilistic networks. Because probabilistic networks provide explicit representations of causal structure, human experts can easily contribute prior knowledge to the training process, thereby significantly improving the learning rate. Adaptive probabilistic networks (APNs) may soon compete directly with neural networks as models in computational neuroscience as well as in industrial and financial applications.
1 Introduction
Intelligent systems, ...
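The gradient-descent idea can be sketched on the smallest possible network with a hidden variable. This is not the paper's APN algorithm: the model (a hidden binary cause H with known conditional P(X | H), whose prior w = P(H=1) is learned by gradient ascent on the log-likelihood of observed effects) and all numbers are invented for illustration.

```python
import math
import random

random.seed(0)
P_X1_GIVEN_H = {0: 0.2, 1: 0.9}   # fixed, known conditional P(X=1 | H)

# Generate observations of X; H itself is never observed.
true_w = 0.7
data = []
for _ in range(2000):
    h = 1 if random.random() < true_w else 0
    data.append(1 if random.random() < P_X1_GIVEN_H[h] else 0)

w, lr = 0.5, 1e-4
for _ in range(200):
    # Analytic gradient of sum_x log(w*P(x|H=1) + (1-w)*P(x|H=0)) w.r.t. w.
    grad = 0.0
    for x in data:
        p1 = P_X1_GIVEN_H[1] if x else 1 - P_X1_GIVEN_H[1]
        p0 = P_X1_GIVEN_H[0] if x else 1 - P_X1_GIVEN_H[0]
        grad += (p1 - p0) / (w * p1 + (1 - w) * p0)
    w = min(0.999, max(0.001, w + lr * grad))  # ascent, clamped to (0, 1)

print(f"recovered w ~= {w:.2f} (true {true_w})")
```

The hidden-variable parameter is recovered from observable data alone, which is the mechanism the abstract describes scaled down to one parameter.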
AIS-BN: An Adaptive Importance Sampling Algorithm for Evidential Reasoning in Large Bayesian Networks
Journal of Artificial Intelligence Research, 2000
Cited by 70 (4 self)
Abstract
Stochastic sampling algorithms, while an attractive alternative to exact algorithms in very large Bayesian network models, have been observed to perform poorly in evidential reasoning with extremely unlikely evidence. To address this problem, we propose an adaptive importance sampling algorithm, AIS-BN, that shows promising convergence rates even under extreme conditions and seems to outperform the existing sampling algorithms consistently. Three sources of this performance improvement are (1) two heuristics for initialization of the importance function that are based on the theoretical properties of importance sampling in finite-dimensional integrals and the structural advantages of Bayesian networks, (2) a smooth learning method for the importance function, and (3) a dynamic weighting function for combining samples from different stages of the algorithm. We tested the performance of the AIS-BN algorithm along with two state-of-the-art general-purpose sampling algorithms, lik...
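A hedged two-stage sketch shows why adapting the importance function helps under unlikely evidence. The two-node network A -> E, its probabilities, and the simple "reuse the stage-1 posterior as the stage-2 proposal" rule are invented for illustration; this is not the AIS-BN learning procedure.

```python
import random

random.seed(1)
P_A1 = 0.5
P_E1_GIVEN_A = {0: 0.001, 1: 0.01}   # evidence E = 1 is very unlikely

def estimate(q_a1, n):
    """Importance-sample P(A=1 | E=1), drawing A from proposal q."""
    num = den = 0.0
    for _ in range(n):
        a = 1 if random.random() < q_a1 else 0
        prior = P_A1 if a else 1 - P_A1
        prop = q_a1 if a else 1 - q_a1
        w = prior * P_E1_GIVEN_A[a] / prop   # importance weight
        num += w * a
        den += w
    return num / den

# Stage 1: sample from the prior (plain likelihood weighting).
stage1 = estimate(P_A1, 5000)
# Stage 2 (the adaptive step): use the stage-1 estimate as the new
# importance function, concentrating samples where posterior mass is.
stage2 = estimate(stage1, 5000)

exact = (P_A1 * P_E1_GIVEN_A[1]
         / (P_A1 * P_E1_GIVEN_A[1] + (1 - P_A1) * P_E1_GIVEN_A[0]))
print(f"exact={exact:.4f} stage1={stage1:.4f} stage2={stage2:.4f}")
```

In stage 2 the sample weights become nearly uniform, which is exactly the low-variance regime a good importance function aims for.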
Mini-buckets: A general scheme for bounded inference
Journal of the ACM (JACM)
Cited by 57 (20 self)
Abstract
This article presents a class of approximation algorithms that extend the idea of bounded-complexity inference, inspired by successful constraint propagation algorithms, to probabilistic inference and combinatorial optimization. The idea is to bound the dimensionality of dependencies created by inference algorithms. This yields a parameterized scheme, called mini-buckets, that offers an adjustable tradeoff between accuracy and efficiency. The mini-bucket approach to optimization problems, such as finding the most probable explanation (MPE) in Bayesian networks, generates both an approximate solution and bounds on the solution quality. We present empirical results demonstrating successful performance of the proposed approximation scheme for the MPE task, both on randomly generated problems and on realistic domains such as medical diagnosis and probabilistic decoding.
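The bounding mechanism can be shown on the smallest useful case. The chain A - B - C and its factor tables are invented for this sketch (i-bound 1): exact elimination of B would combine both factors, creating a function over {A, C}, while the mini-bucket relaxation maximizes B out of each factor separately, since max_b f1(a,b)*f2(b,c) <= (max_b f1(a,b)) * (max_b f2(b,c)), which can only increase the result and so yields an upper bound on the MPE value.

```python
import itertools

vals = (0, 1)
f1 = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}   # f1(A, B)
f2 = {(0, 0): 0.2, (0, 1): 0.8, (1, 0): 0.9, (1, 1): 0.3}   # f2(B, C)

# Exact MPE value by full enumeration.
exact = max(f1[a, b] * f2[b, c]
            for a, b, c in itertools.product(vals, vals, vals))

# Mini-buckets for B: {f1} and {f2}, each maximized over B independently.
h1 = {a: max(f1[a, b] for b in vals) for a in vals}
h2 = {c: max(f2[b, c] for b in vals) for c in vals}
bound = max(h1[a] * h2[c] for a, c in itertools.product(vals, vals))
print(f"exact MPE value = {exact}, mini-bucket upper bound = {bound}")
```

Here the bound (0.81) exceeds the exact value (0.72) because the two mini-buckets are free to pick inconsistent values of B; larger i-bounds shrink that gap at higher cost, which is the adjustable tradeoff the abstract describes.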
The Sensitivity of Belief Networks to Imprecise Probabilities: An Experimental Investigation
1995
Cited by 56 (2 self)
Abstract
...probabilities may not impair diagnostic performance significantly, and that simple binary representations may often be adequate. These findings of robustness suggest that belief networks are a practical representation without requiring undue precision.
Keywords: Probabilistic Reasoning, Bayesian Networks
Affiliations: Section on Medical Informatics, Stanford University, Stanford, CA 94305; Engineering-Economic Systems, Stanford University, Stanford, CA 94305.
1 The Tradeoff Between Accuracy and Cost
1.1 Experiments on Belief Networks
Each knowledge representation or model is, by definition, a simplification of reality. When the representation is derived from a human expert, it is a simplification even of the expert's perception of reality. The question in choosing a representation is not whether the representation is completely accurate (it cannot be) but whether the model is sufficiently accurate for the purposes for which it is designed. This question is the one tha...
Hybrid Bayesian Networks for Reasoning about Complex Systems
2002
Cited by 48 (0 self)
Abstract
Many real-world systems are naturally modeled as hybrid stochastic processes, i.e., stochastic processes that contain both discrete and continuous variables. Examples include speech recognition, target tracking, and monitoring of physical systems. The task is usually to perform probabilistic inference, i.e., infer the hidden state of the system given some noisy observations. For example, we can ask what is the probability that a certain word was pronounced given the readings of our microphone, what is the probability that a submarine is trying to surface given our sonar data, and what is the probability of a valve being open given our pressure and flow readings. Bayesian networks are ...
Mini-Buckets: A General Scheme for Approximating Inference
Journal of the ACM, 1998
Cited by 45 (16 self)
Abstract
The paper presents a class of approximation algorithms that extend the idea of bounded inference, inspired by successful constraint propagation algorithms, to probabilistic inference and combinatorial optimization. The idea is to bound the dimensionality of dependencies created by inference algorithms. This yields a parameterized scheme, called mini-buckets, that offers adjustable levels of accuracy and efficiency. The mini-bucket approach generates both an approximate solution and a bound on the solution quality. We present empirical results demonstrating successful performance of the proposed approximation scheme for probabilistic tasks, both on randomly generated problems and on realistic domains such as medical diagnosis and probabilistic decoding.
1 Introduction
Automated reasoning tasks such as constraint satisfaction and optimization, probabilistic inference, decision-making, and planning are generally hard (NP-hard). One way to cope ...
A General Scheme for Automatic Generation of Search Heuristics from Specification Dependencies
Artificial Intelligence, 2001
Cited by 37 (18 self)
Abstract
The paper presents and evaluates the power of a new scheme that generates search heuristics mechanically for problems expressed using a set of functions or relations over a finite set of variables. The heuristics are extracted from a parameterized approximation scheme called Mini-Bucket elimination that allows a controlled tradeoff between computation and accuracy. The heuristics are used to guide Branch-and-Bound and Best-First search. Their performance is compared on two optimization tasks: the Max-CSP task defined on deterministic databases and the Most Probable Explanation task defined on probabilistic databases. Benchmarks were random data sets as well as applications to coding and medical diagnosis problems. Our results demonstrate that the heuristics generated are effective for both search schemes, permitting a controlled tradeoff between preprocessing (for heuristic generation) and search.
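The search side of the scheme can be sketched as a depth-first Branch-and-Bound that maximizes a product of factors, pruning with an optimistic heuristic in the mini-bucket spirit: each factor maximized independently over its unassigned variables. The toy factor tables and variable ordering below are invented; this is not one of the paper's benchmark problems.

```python
factors = [
    (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}),
    (("B", "C"), {(0, 0): 0.2, (0, 1): 0.8, (1, 0): 0.9, (1, 1): 0.3}),
]
order = ["A", "B", "C"]

def upper_bound(assign):
    """Product over factors of the max entry consistent with `assign`.
    Fully assigned factors contribute their exact value, so on a full
    assignment this is the exact objective; otherwise it is optimistic."""
    val = 1.0
    for scope, table in factors:
        val *= max(v for key, v in table.items()
                   if all(s not in assign or assign[s] == key[i]
                          for i, s in enumerate(scope)))
    return val

best = 0.0
def dfs(i, assign):
    global best
    if upper_bound(assign) <= best:
        return                      # prune: cannot beat the incumbent
    if i == len(order):
        best = upper_bound(assign)  # full assignment: value is exact
        return
    for v in (0, 1):
        assign[order[i]] = v
        dfs(i + 1, assign)
        del assign[order[i]]

dfs(0, {})
print(f"MPE value found by Branch-and-Bound: {best}")
```

Because the heuristic is an upper bound on any completion of a partial assignment, pruning never discards the optimum, so the search returns the exact MPE value while visiting fewer nodes than full enumeration.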
Why is Diagnosis Using Belief Networks Insensitive to Imprecision in Probabilities?
1996
Cited by 35 (0 self)
Abstract
Recent research has found that diagnostic performance with Bayesian belief networks is often surprisingly insensitive to imprecision in the numerical probabilities. For example, the authors have recently completed an extensive study in which they applied random noise to the numerical probabilities in a set of belief networks for medical diagnosis, subsets of the CPCS network, a subset of the QMR (Quick Medical Reference) focused on liver and bile diseases. The diagnostic performance in terms of the average probabilities assigned to the actual diseases showed little sensitivity even to large amounts of noise. In this paper, we summarize the findings of this study and discuss possible explanations of this low sensitivity. One reason is that the criterion for performance is average probability of the true hypotheses, rather than average error in probability, which is insensitive to symmetric noise distributions. But we show that even asymmetric, log-odds-normal noise has modest effects. A ...
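The noise model the abstract names can be illustrated on a toy one-finding diagnosis problem (parameters invented; this is not the CPCS study): perturb each probability with log-odds-normal noise, i.e., Gaussian noise added in logit space, and measure how far the posterior moves.

```python
import math
import random

random.seed(0)

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def perturb(p, sigma):
    # Log-odds-normal noise: Gaussian in logit space, mapped back to (0, 1).
    return inv_logit(logit(p) + random.gauss(0, sigma))

def posterior(prior, p_f_d, p_f_nd):
    # P(disease | finding) by Bayes' rule for a single binary finding.
    return prior * p_f_d / (prior * p_f_d + (1 - prior) * p_f_nd)

prior, p_f_d, p_f_nd = 0.3, 0.8, 0.1
exact = posterior(prior, p_f_d, p_f_nd)

sigma = 0.5   # substantial noise: roughly a factor-of-1.6 odds perturbation
shifts = [abs(posterior(perturb(prior, sigma),
                        perturb(p_f_d, sigma),
                        perturb(p_f_nd, sigma)) - exact)
          for _ in range(5000)]
mean_shift = sum(shifts) / len(shifts)
print(f"exact posterior = {exact:.3f}, mean absolute shift = {mean_shift:.3f}")
```

Even with sizable logit-space noise, the average posterior displacement in this toy model stays modest, which mirrors the qualitative robustness finding the entry reports.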