Results 1 – 10 of 15
Learning Bayesian networks: The combination of knowledge and statistical data
Machine Learning, 1995
Abstract

Cited by 1142 (36 self)
We describe scoring metrics for learning Bayesian networks from a combination of user knowledge and statistical data. We identify two important properties of metrics, which we call event equivalence and parameter modularity. These properties have been mostly ignored, but when combined, greatly simplify the encoding of a user’s prior knowledge. In particular, a user can express his knowledge—for the most part—as a single prior Bayesian network for the domain.
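As a hedged illustration of the kind of scoring metric the abstract describes: the per-variable building block of the Bayesian (BD-family) metrics is the Dirichlet–multinomial marginal likelihood. The sketch below assumes a single binary variable with illustrative counts and a uniform Dirichlet(1, 1) prior; it is not the paper's full metric, which combines such terms across all variable–parent families.

```python
from math import lgamma

def dirichlet_marginal_loglik(counts, alphas):
    """Log marginal likelihood of multinomial counts under a
    Dirichlet(alphas) prior -- the per-family building block of
    BD-family scoring metrics for Bayesian-network structures."""
    n, a = sum(counts), sum(alphas)
    ll = lgamma(a) - lgamma(a + n)
    for c, al in zip(counts, alphas):
        ll += lgamma(al + c) - lgamma(al)
    return ll

# Toy example: a binary variable observed 7 true / 3 false,
# with an illustrative uniform Dirichlet(1, 1) prior.
score = dirichlet_marginal_loglik([7, 3], [1.0, 1.0])
```

A full structure score sums such log terms over every (variable, parent configuration) family and compares candidate graphs; the prior counts encode the user's prior network.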
A Tutorial on Learning Bayesian Networks
Communications of the ACM, 1995
Abstract

Cited by 363 (13 self)
We examine a graphical representation of uncertain knowledge called a Bayesian network. The representation is easy to construct and interpret, yet has formal probabilistic semantics making it suitable for statistical manipulation. We show how we can use the representation to learn new knowledge by combining domain knowledge with statistical data.
1 Introduction
Many techniques for learning rely heavily on data. In contrast, the knowledge encoded in expert systems usually comes solely from an expert. In this paper, we examine a knowledge representation, called a Bayesian network, that lets us have the best of both worlds. Namely, the representation allows us to learn new knowledge by combining expert domain knowledge and statistical data. A Bayesian network is a graphical representation of uncertain knowledge that most people find easy to construct and interpret. In addition, the representation has formal probabilistic semantics, making it suitable for statistical manipulation (Howard, ...
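A minimal sketch of the representation the tutorial describes: a Bayesian network factors a joint distribution via the chain rule over the graph. The two-node network and its CPT numbers below are hypothetical, chosen only to show the factorization and inference-by-enumeration.

```python
# Hypothetical two-node network Rain -> WetGrass; the CPT
# numbers are illustrative, not taken from the paper.
p_rain = 0.2
p_wet_given_rain = {True: 0.9, False: 0.1}

def joint(rain, wet):
    """Chain-rule factorization P(rain, wet) = P(rain) * P(wet | rain)."""
    pr = p_rain if rain else 1 - p_rain
    pw = p_wet_given_rain[rain] if wet else 1 - p_wet_given_rain[rain]
    return pr * pw

# Inference by enumeration: P(rain | wet) = P(rain, wet) / P(wet)
p_wet = joint(True, True) + joint(False, True)
p_rain_given_wet = joint(True, True) / p_wet
```

The same factorization scales to larger graphs, which is what makes the representation amenable to both expert assessment and statistical learning.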
Causal independence for probability assessment and inference using Bayesian networks
IEEE Trans. on Systems, Man and Cybernetics, 1994
Abstract

Cited by 74 (4 self)
A Bayesian network is a probabilistic representation for uncertain relationships, which has proven to be useful for modeling real-world problems. When there are many potential causes of a given effect, however, both probability assessment and inference using a Bayesian network can be difficult. In this paper, we describe causal independence, a collection of conditional independence assertions and functional relationships that are often appropriate to apply to the representation of the uncertain interactions between causes and effect. We show how the use of causal independence in a Bayesian network can greatly simplify probability assessment as well as probabilistic inference.
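The best-known instance of causal independence is the noisy-OR model, under which assessing n single-cause link probabilities replaces assessing a full table of 2^n entries. A hedged sketch with illustrative probabilities:

```python
def noisy_or(active_probs, leak=0.0):
    """P(effect | active causes) under the noisy-OR model of causal
    independence: each active cause independently fails to produce
    the effect with probability 1 - p_i; `leak` is the probability
    the effect occurs with no active cause."""
    q = 1.0 - leak
    for p in active_probs:
        q *= 1.0 - p
    return 1.0 - q

# Three active causes with illustrative link probabilities:
p = noisy_or([0.8, 0.5, 0.3])
```

Here p = 1 - (0.2)(0.5)(0.7) = 0.93; only one parameter per cause needs to be assessed, which is the simplification the abstract refers to.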
A Bayesian approach to learning causal networks
In Uncertainty in AI: Proceedings of the Eleventh Conference, 1995
Abstract

Cited by 72 (11 self)
Whereas acausal Bayesian networks represent probabilistic independence, causal Bayesian networks represent causal relationships. In this paper, we examine Bayesian methods for learning both types of networks. Bayesian methods for learning acausal networks are fairly well developed. These methods often employ assumptions to facilitate the construction of priors, including the assumptions of parameter independence, parameter modularity, and likelihood equivalence. We show that although these assumptions also can be appropriate for learning causal networks, we need additional assumptions in order to learn causal networks. We introduce two sufficient assumptions, called mechanism independence and component independence. We show that these new assumptions, when combined with parameter independence, parameter modularity, and likelihood equivalence, allow us to apply methods for learning acausal networks to learn causal networks.
Decision-Theoretic Foundations for Causal Reasoning
Journal of Artificial Intelligence Research, 1995
Abstract

Cited by 60 (10 self)
We present a definition of cause and effect in terms of decision-theoretic primitives and thereby provide a principled foundation for causal reasoning. Our definition departs from the traditional view of causation in that causal assertions may vary with the set of decisions available. We argue that this approach provides added clarity to the notion of cause. Also in this paper, we examine the encoding of causal relationships in directed acyclic graphs. We describe a special class of influence diagrams, those in canonical form, and show its relationship to Pearl's representation of cause and effect. Finally, we show how canonical form facilitates counterfactual reasoning.
1. Introduction
Knowledge of cause and effect is crucial for modeling the effects of actions. For example, if we observe a statistical correlation between smoking and lung cancer, we cannot conclude from this observation alone that our chances of getting lung cancer will change if we stop smoking. If, however, we als...
Sequential troubleshooting under uncertainty
Communications of the ACM, 1994
Abstract

Cited by 49 (8 self)
We develop a series of approximations for decision-theoretic troubleshooting under uncertainty. Our approach generates troubleshooting plans in the face of uncertainty in the relationships among components and device status, observations, as well as the effect of actions on device status. Included in our approach is a Bayesian-network representation of these relationships. We have applied our technique successfully to troubleshooting problems with printing, photocopier feeders, automobiles, and gas turbines. We report empirical findings demonstrating the high quality of plans produced by our approach.
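A well-known heuristic from the decision-theoretic troubleshooting literature (not necessarily the paper's exact algorithm) is, under a single-fault assumption, to inspect or repair components in decreasing order of fault probability per unit cost, which minimizes expected repair cost. A sketch with hypothetical component data:

```python
def troubleshooting_order(components):
    """Order components by fault-probability-to-cost ratio
    (highest first) -- a classic single-fault heuristic for
    minimizing expected troubleshooting cost."""
    return sorted(components,
                  key=lambda c: c["p_fault"] / c["cost"],
                  reverse=True)

# Hypothetical printer-troubleshooting components (names and
# numbers are illustrative only):
parts = [
    {"name": "cable",  "p_fault": 0.1, "cost": 1.0},
    {"name": "driver", "p_fault": 0.4, "cost": 5.0},
    {"name": "toner",  "p_fault": 0.5, "cost": 2.0},
]
plan = [c["name"] for c in troubleshooting_order(parts)]
```

In practice the fault probabilities would come from a Bayesian network over components and observations, updated after each troubleshooting step.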
A Clinician's Tool for Analyzing Noncompliance
1996
Abstract

Cited by 24 (13 self)
We describe a computer program to assist a clinician with assessing the efficacy of treatments in experimental studies for which treatment assignment is random but subject compliance is imperfect. The major difficulty in such studies is that treatment efficacy is not "identifiable", that is, it cannot be estimated from the data, even when the number of subjects is infinite, unless additional knowledge is provided. Our system combines Bayesian learning with Gibbs sampling using two inputs: (1) the investigator's prior probabilities of the relative sizes of subpopulations and (2) the observed data from the experiment. The system outputs a histogram depicting the posterior distribution of the average treatment effect, that is, the probability that the average outcome (e.g., survival) would attain a given level, had the treatment been taken uniformly by the entire population. This paper describes the theoretical basis for the proposed approach and presents experimental results on ...
Learning Causal Networks from Data: A survey and a new algorithm for recovering possibilistic causal networks
1997
Abstract

Cited by 20 (5 self)
Introduction
Reasoning in terms of cause and effect is a strategy that arises in many tasks. For example, diagnosis is usually defined as the task of finding the causes (illnesses) from the observed effects (symptoms). Similarly, prediction can be understood as the description of a future plausible situation where observed effects will be in accordance with the known causal structure of the phenomenon being studied. Causal models are a summary of the knowledge about a phenomenon expressed in terms of causation. Many areas of the applied sciences (econometrics, biomedicine, engineering, etc.) have used such a term to refer to models that yield explanations, allow for prediction and facilitate planning and decision making. Causal reasoning can be viewed as inference guided by a causation theory. That kind of inference can be further specialised into induc...
[Footnote: This work has been partially supported by the Spanish Comisión Interministerial de Ciencia y Tecnología, Project CICYT TIC96-0878.]
Model Building with Belief Networks and Influence Diagrams
Abstract

Cited by 7 (1 self)
Belief networks and influence diagrams use directed graphs to represent models for probabilistic reasoning and decision making under uncertainty. They have proven to be effective at facilitating communication with decision makers and with computers. Many of the important relationships among uncertainties, decisions, and values can be captured in the structure of these diagrams, explicitly revealing irrelevance and the flow of information. We explore a variety of examples illustrating some of these basic structures, along with an algorithm that efficiently analyzes their model structure. We also show how algorithms based on these structures can be used to resolve
Believing Change and Changing Belief
IEEE Transactions on Systems, Man, and Cybernetics Special Issue on Higher-Order Uncertainty, 1996
Abstract

Cited by 4 (0 self)
We present a first-order logic of time, chance, and probability that is capable of expressing the four types of higher-order probability sentences relating subjective probability and objective chance at different times. We define a causal notion of objective chance and show how it can be used in conjunction with subjective probability to distinguish between causal and evidential correlation by distinguishing between conditions, events, and actions that 1) influence the agent's belief in chance and 2) the agent believes to influence chance. Furthermore, the semantics of the logic captures some commonsense inferences concerning objective chance and causality. We show that an agent's subjective probability is the expected value of its beliefs concerning objective chance. We also prove that an agent using this representation believes with certainty that the past cannot be causally influenced. To appear in IEEE SMC special issue on Higher-Order Probability.
1 Introduction
Temporal probab...