Results 1-10 of 22
A Bayesian method for the induction of probabilistic networks from data
 Machine Learning
, 1992
Abstract

Cited by 1081 (27 self)
Abstract. This paper presents a Bayesian method for constructing probabilistic networks from databases. In particular, we focus on constructing Bayesian belief networks. Potential applications include computer-assisted hypothesis testing, automated scientific discovery, and automated construction of probabilistic expert systems. We extend the basic method to handle missing data and hidden (latent) variables. We show how to perform probabilistic inference by averaging over the inferences of multiple belief networks. Results are presented of a preliminary evaluation of an algorithm for constructing a belief network from a database of cases. Finally, we relate the methods in this paper to previous work, and we discuss open problems.
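The closed-form structure score at the heart of this line of work can be sketched in a few lines (a minimal illustration assuming uniform Dirichlet priors and complete data, in the style of the Cooper-Herskovits formula; the function name and all counts below are invented for the example):

```python
from math import lgamma

def k2_log_score(child_counts):
    """Log marginal likelihood of one node given its parent set,
    prod_j (r-1)!/(N_j+r-1)! * prod_k N_jk!, computed in log space.
    child_counts: one vector of child-state counts N_jk per parent
    configuration j."""
    log_score = 0.0
    for counts in child_counts:        # one row per parent configuration j
        r = len(counts)                # number of child states
        n_j = sum(counts)
        log_score += lgamma(r) - lgamma(n_j + r)   # (r-1)!/(N_j+r-1)!
        for n_jk in counts:
            log_score += lgamma(n_jk + 1)          # N_jk!
    return log_score

# Compare two candidate structures for a binary child X: with parent Y
# (rows are Y=0 and Y=1) versus with no parents (a single configuration).
with_parent = [[8, 2], [1, 9]]
no_parent = [[9, 11]]
better = k2_log_score(with_parent) > k2_log_score(no_parent)
```

Summing such per-node scores over all nodes gives the score of a whole network structure, which is what a search procedure would maximize.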
Fundamental Concepts of Qualitative Probabilistic Networks
 ARTIFICIAL INTELLIGENCE
, 1990
Abstract

Cited by 119 (6 self)
Graphical representations for probabilistic relationships have recently received considerable attention in AI. Qualitative probabilistic networks abstract from the usual numeric representations by encoding only qualitative relationships, which are inequality constraints on the joint probability distribution over the variables. Although these constraints are insufficient to determine probabilities uniquely, they are designed to justify the deduction of a class of relative likelihood conclusions that imply useful decision-making properties. Two types of qualitative relationship are defined, each a probabilistic form of monotonicity constraint over a group of variables. Qualitative influences describe the direction of the relationship between two variables. Qualitative synergies describe interactions among influences. The probabilistic definitions chosen justify sound and efficient inference procedures based on graphical manipulations of the network. These procedures answer queries about qualitative relationships among variables separated in the network and determine structural properties of optimal assignments to decision variables.
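For binary variables, a positive qualitative influence reduces to a simple inequality: raising X never lowers the probability of Y, in every context of Y's other parents. A small sketch of that check (the function name and conditional-probability layout are illustrative, not from the paper):

```python
def positive_influence(p_y1):
    """Check a positive qualitative influence of binary X on binary Y:
    for every context z of Y's other parents,
    P(Y=1 | X=1, z) >= P(Y=1 | X=0, z).
    p_y1[z] = [P(Y=1 | X=0, z), P(Y=1 | X=1, z)]."""
    return all(row[1] >= row[0] for row in p_y1)

# In both contexts z, setting X=1 makes Y=1 at least as likely,
# so X positively influences Y (illustrative numbers).
cpt = [[0.2, 0.6],   # context z = 0
       [0.5, 0.9]]   # context z = 1
influence_holds = positive_influence(cpt)
```

For multi-valued variables the definition strengthens to first-order stochastic dominance, but the all-contexts quantification shown here is the same.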
Variational Probabilistic Inference and the QMR-DT Network
 JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
, 1999
Abstract

Cited by 57 (3 self)
We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search. We describe a variational approach to the problem of diagnostic inference in the "Quick Medical Reference" (QMR) network. The QMR network is a large-scale probabilistic graphical model built on statistical and expert knowledge. Exact probabilistic inference is infeasible in this model for all but a small set of cases. We evaluate our variational inference algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method.
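The QMR model ties findings to diseases through noisy-OR gates, and the intractability comes from summing over all disease configurations. The sketch below shows the noisy-OR likelihood and a brute-force posterior on a toy three-disease model, which makes visible the exponential sum that the variational method is designed to avoid (all priors and parameters are invented; this is not the variational bound itself):

```python
from itertools import product

def p_finding_present(d, leak, q):
    """Noisy-OR: P(f=1 | diseases d) = 1 - (1-leak) * prod_i (1-q[i])^d[i]."""
    p_absent = 1.0 - leak
    for di, qi in zip(d, q):
        if di:
            p_absent *= 1.0 - qi
    return 1.0 - p_absent

# Toy QMR-style fragment: 3 diseases with independent priors, one
# positive finding observed. Numbers are illustrative only.
priors = [0.1, 0.2, 0.05]
leak, q = 0.01, [0.8, 0.3, 0.6]

# Exact posterior P(d1=1 | f=1) by enumerating all 2^n disease states --
# precisely the sum that becomes infeasible at QMR scale.
num = den = 0.0
for d in product([0, 1], repeat=3):
    p = p_finding_present(d, leak, q)
    for di, pr in zip(d, priors):
        p *= pr if di else 1.0 - pr
    den += p
    if d[0] == 1:
        num += p
posterior_d1 = num / den
```

Observing the finding raises the posterior of disease 1 well above its prior; variational methods replace the enumeration with optimized deterministic bounds.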
Thinking backward for knowledge acquisition
 AI Magazine
, 1987
Abstract

Cited by 20 (4 self)
This article examines the direction in which knowledge bases are constructed for diagnosis and decision making. When building an expert system, it is traditional to elicit knowledge from an expert in the direction in which the knowledge is to be applied, namely, from observable evidence toward unobservable hypotheses. However, experts usually find it simpler to reason in the opposite direction, from hypotheses to observable evidence, because this direction reflects causal relationships. Therefore, we argue that a knowledge base be constructed following the expert's natural reasoning direction.
A Bayesian Analysis of Simulation Algorithms for Inference in Belief Networks
 Networks
, 1993
Abstract

Cited by 17 (3 self)
A belief network is a graphical representation of the underlying probabilistic relationships in a complex system. Belief networks have been employed as a representation of uncertain relationships in computer-based diagnostic systems. These diagnostic systems provide assistance by assigning likelihoods to alternative explanatory hypotheses in response to a set of findings or observations. Approximation algorithms have been used to compute likelihoods of hypotheses in large networks. We analyze the performance of leading Monte Carlo approximation algorithms for computing posterior probabilities in belief networks. The analysis differs from earlier attempts to characterize the behavior of simulation algorithms in our explicit use of Bayesian statistics: we update a probability distribution over target probabilities of interest with information from randomized trials. For real ε, δ < 1 and for a probabilistic inference Pr[x|e], the output of an inference approximation algorithm is an (ε, δ)-estimate of Pr[x|e] if, with probability at least 1 − δ, the output is within relative error ε of Pr[x|e]. We construct a stopping rule for the number of simulations required by logic sampling, randomized approximation schemes, and likelihood weighting to provide (ε, δ)-estimates of Pr[x|e]. With probability 1 − δ, the stopping rule is optimal in the sense that the algorithm performs the minimum number of required simulations. We prove that our stopping rules are insensitive to the prior probability distribution on Pr[x|e].
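Likelihood weighting, one of the algorithms analyzed, can be sketched on a two-node network (a minimal illustration with invented parameters; the paper's contribution is the Bayesian stopping rule that chooses the number of samples adaptively, which is not reproduced here, so a fixed sample count is used instead):

```python
import random

def likelihood_weighting(n_samples, rng):
    """Estimate Pr[A=1 | B=1] in the toy network A -> B by likelihood
    weighting: sample the non-evidence variable A from its prior and
    weight each sample by the likelihood of the evidence B=1 given the
    sampled value of A."""
    p_a = 0.3                          # P(A=1), illustrative
    p_b_given_a = {0: 0.1, 1: 0.8}     # P(B=1 | A), illustrative
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if rng.random() < p_a else 0
        w = p_b_given_a[a]             # weight = P(evidence | sample)
        den += w
        num += w * a
    return num / den

rng = random.Random(0)
est = likelihood_weighting(20000, rng)
exact = 0.3 * 0.8 / (0.3 * 0.8 + 0.7 * 0.1)   # Bayes' rule, about 0.774
```

An (ε, δ)-stopping rule would monitor such an estimate and halt once the relative error is within ε with probability at least 1 − δ.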
Variational probabilistic inference and the QMR-DT database
 Journal of Artificial Intelligence Research
, 1999
Abstract

Cited by 16 (3 self)
We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search. We describe a variational approach to the problem of diagnostic inference in the "Quick Medical Reference" (QMR) database. The QMR database is a large-scale probabilistic graphical model built on statistical and expert knowledge. Exact probabilistic inference is infeasible in this model for all but a small set of cases. We evaluate our variational inference algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method.
Sensitivity Analysis: an Aid for Belief-network Quantification
, 2000
Abstract

Cited by 16 (7 self)
When building a Bayesian belief network, usually a large number of probabilities have to be assessed by experts in the domain of application. Experience shows that experts often are reluctant to assess all probabilities required, feeling that they are unable to give assessments with a high level of accuracy. We argue that the elicitation of probabilities from experts can be supported to a large extent by iteratively performing sensitivity analyses of the belief network in the making, starting with rough, initial assessments. Since it gives insight into which probabilities require a high level of accuracy and which do not, performing a sensitivity analysis allows for focusing further elicitation efforts. We propose an elicitation procedure in which, alternately, sensitivity analyses are performed and probability assessments are refined, until satisfactory behaviour of the belief network is obtained or until the costs of further elicitation outweigh the benefits of higher accuracy.
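The core loop of a one-way sensitivity analysis can be sketched on a two-node network: vary one rough assessment over a plausible range, hold the others fixed, and see how much the posterior of interest moves (all numbers and names below are invented; real analyses derive the posterior as an exact function of the varied parameter across the whole network):

```python
def posterior_a_given_b(p_a, p_b_a1, p_b_a0):
    """P(A=1 | B=1) in the two-node network A -> B, by Bayes' rule."""
    num = p_a * p_b_a1
    return num / (num + (1.0 - p_a) * p_b_a0)

def one_way_sensitivity(baseline, values, index):
    """Vary one probability assessment over candidate values, keep the
    other rough assessments fixed, and report the resulting range of the
    posterior of interest."""
    outs = []
    for v in values:
        params = list(baseline)
        params[index] = v
        outs.append(posterior_a_given_b(*params))
    return min(outs), max(outs)

baseline = (0.3, 0.8, 0.1)     # rough initial assessments (illustrative)
lo, hi = one_way_sensitivity(baseline, [0.6, 0.7, 0.8, 0.9], index=1)
# A wide (lo, hi) interval marks P(B=1 | A=1) as worth eliciting carefully;
# a narrow one says the rough assessment is already good enough.
```

Iterating this over all parameters, then refining only the influential ones, is the alternation the abstract describes.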
A method for learning belief networks that contain hidden variables
 in Proceedings of the Workshop on Knowledge Discovery in Databases
, 1994
Abstract

Cited by 10 (4 self)
This paper presents a Bayesian method for computing the probability of a Bayesian belief-network structure from a database. In particular, the paper focuses on computing the probability of a belief-network structure that contains a hidden (latent) variable. A hidden variable represents a postulated entity about which we have no data. For example, we may wish to postulate the existence of a hidden variable ...
An evaluation of the diagnostic accuracy of Pathfinder
 Computers and Biomedical Research
, 1992
Abstract

Cited by 8 (5 self)
This work is an adaptation of Heckerman (1991). All figures and tables are printed with permission from MIT Press. We present an evaluation of the diagnostic accuracy of Pathfinder, an expert system that assists pathologists with the diagnosis of lymph-node diseases. We evaluate two versions of the system using both informal and decision-theoretic metrics of performance. In one version of Pathfinder, we assume incorrectly that all observations are conditionally independent. In the other version, we use a belief network to represent accurately the probabilistic dependencies among the observations. In both versions, we make the assumption, reasonable for this domain, that diseases are mutually exclusive and exhaustive. The results of the study show that (1) it is cost effective to represent probabilistic dependencies among observations in the lymph-node domain, and (2) the diagnostic accuracy of the more complex version of Pathfinder is at least as good as that of the Pathfinder expert. In addition, the study illustrates how informal and decision-theoretic metrics for performance complement one another.
A Method of Learning Implication Networks from Empirical Data: Algorithm and Monte-Carlo Simulation-Based Validation
 IEEE Transactions on Knowledge and Data Engineering
, 1997
Abstract

Cited by 8 (3 self)
This paper describes an algorithmic means for inducing implication networks from empirical data samples. The induced network enables efficient inferences about the values of network nodes if certain observations are made. This implication induction method is approximate in nature, as probabilistic-network requirements are relaxed in the construction of dependence relationships based on statistical testing. In order to examine the effectiveness and validity of the induction method, several Monte Carlo simulations were conducted in which theoretical Bayesian networks were used to generate empirical data samples; some of these samples were used to induce implication relations, whereas others were used to verify the results of evidential reasoning with the induced networks. The values in the implication networks were predicted by applying a modified version of the Dempster-Shafer belief-updating scheme. The results of the predictions were, furthermore, compared to the ones generated by Pearl's stochastic ...
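The statistical-testing step that decides whether to induce a dependence link between two nodes can be sketched with a Pearson chi-square test on their co-occurrence counts (a generic illustration; the paper's own test statistic and thresholds may differ, and the counts below are invented):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table of two
    binary node values. A large value rejects the hypothesis that the
    nodes are independent, supporting a dependence link between them."""
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    n = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Observed co-occurrence counts for two binary nodes (illustrative):
table = [[30, 10],
         [5, 55]]
dependent = chi_square_2x2(table) > 3.84   # 5% critical value, 1 d.o.f.
```

Repeating this test over pairs of nodes, and orienting the accepted links, yields the induced implication network on which the evidential reasoning is run.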