Results 11 – 19 of 19
A Statistical Semantics for Causation
, 1991
Abstract

Cited by 13 (0 self)
We propose a model-theoretic definition of causation, and show that, contrary to common folklore, genuine causal influences can be distinguished from spurious covariations following standard norms of inductive reasoning. We also establish a complete characterization of the conditions under which such a distinction is possible. Finally, we provide a proof-theoretic procedure for inductive causation and show that, for a large class of data and structures, effective algorithms exist that uncover the direction of causal influences as defined above.

1 The Model

We view the task of causal modeling as an identification game which scientists play against Nature. Nature possesses stable causal mechanisms which, on a microscopic level, are deterministic functional relationships between variables, some of which are unobservable. These mechanisms are organized in the form of an acyclic schema which the scientist attempts to identify.

Definition 1 A causal model over a set of variables U is a di...
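The deterministic mechanisms described above can be sketched as a tiny structural model. This is an illustrative assumption, not the paper's Definition 1; the variable names and mechanisms are made up for the example:

```python
import random

def sample_causal_model():
    """One draw from a toy acyclic causal model: each variable is a
    deterministic function of its parents plus an exogenous disturbance."""
    u_x = random.gauss(0, 1)   # unobservable disturbance for X
    u_y = random.gauss(0, 1)   # unobservable disturbance for Y
    x = u_x                    # X has no observed parents
    y = 2.0 * x + u_y          # mechanism for Y with parent X (X -> Y)
    return {"X": x, "Y": y}
```

The asymmetry that separates causation from mere covariation shows up under intervention: setting X by hand shifts the distribution of Y, while setting Y by hand leaves X untouched.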
Fuzzy Bayesian Networks – A General Formalism for Representation, Inference and Learning with Hybrid Bayesian Networks
Abstract

Cited by 5 (0 self)
This paper proposes a general formalism for representation, inference and learning with general hybrid Bayesian networks in which continuous and discrete variables may appear anywhere in a directed acyclic graph. The formalism fuzzifies a hybrid Bayesian network into two alternative forms: The first form replaces each continuous variable in the given directed acyclic graph (DAG) by a partner discrete variable and adds a directed link from the partner discrete variable to the continuous one. The mapping between the two variables is not a crisp quantization but is approximated (fuzzified) by a conditional Gaussian (CG) distribution. The CG model is equivalent to a fuzzy set, but no fuzzy logic formalism is employed. The conditional distribution of a discrete variable given its discrete parents is still assumed to be multinomial, as in discrete Bayesian networks. The second form only replaces each continuous variable whose descendants include discrete variables by a partner discrete variable a...
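The "partner discrete variable" idea can be sketched as follows: a continuous variable X is paired with a discrete partner D, and X given D = d follows a Gaussian whose parameters depend on d, so state membership is soft rather than a crisp quantization. The states and parameters below are assumptions for illustration, not the paper's model:

```python
import math

CG_PARAMS = {            # state of partner variable D -> (mean, std) of X
    "low":  (0.0, 1.0),
    "high": (5.0, 1.0),
}

def cg_density(x, d):
    """Density p(x | D = d) under a conditional Gaussian (CG) model --
    a soft (fuzzified) membership of x in state d."""
    mu, sigma = CG_PARAMS[d]
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def soft_assignment(x):
    """Normalized responsibility of each discrete state for the value x."""
    dens = {d: cg_density(x, d) for d in CG_PARAMS}
    total = sum(dens.values())
    return {d: v / total for d, v in dens.items()}
```

For example, `soft_assignment(0.0)` puts almost all its weight on the "low" state, while a value near 2.5 would be split between the two states.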
A Parallel Learning Algorithm for Bayesian Inference Networks
 IEEE Transactions on Knowledge and Data Engineering
Abstract

Cited by 4 (0 self)
We present a new parallel algorithm for learning Bayesian inference networks from data. Our learning algorithm exploits both properties of the MDL-based score metric and a distributed, asynchronous, adaptive search technique called nagging. Nagging is intrinsically fault tolerant, has dynamic load balancing features, and scales well. We demonstrate the viability, effectiveness, and scalability of our approach empirically with several experiments using on the order of 20 machines. More specifically, we show that our distributed algorithm can provide optimal solutions for larger problems as well as good solutions for Bayesian networks of up to 150 variables.

Keywords: Machine Learning, Bayesian Networks, Minimum Description Length Principle, Distributed Systems

Support for this research was provided by the Office of Naval Research through grant N00149411178, and by the Advanced Research Project Agency through Rome Laboratory Contract Number F3060293C0018 via Odyssey Research As...
Learning Bayesian Networks for Solving Real-World Problems
, 1998
Abstract

Cited by 3 (0 self)
Bayesian networks, which provide a compact graphical way to express complex probabilistic relationships among several random variables, are rapidly becoming the tool of choice for dealing with uncertainty in knowledge-based systems. However, approaches based on Bayesian networks have often been dismissed as unfit for many real-world applications, since probabilistic inference is intractable for most problems of realistic size, and algorithms for learning Bayesian networks impose the unrealistic requirement that datasets be complete. In this thesis, I present practical solutions to these two problems, and demonstrate their effectiveness on several real-world problems. The solution proposed to the first problem is to learn selective Bayesian networks, i.e., ones that use only a subset of the given attributes to model a domain. The aim is to learn networks that are smaller, and henc...
A Critique of Inductive Causation
 Proc. 5th European Conf. on Symbolic and Quantitative Approaches to Reasoning and Uncertainty (ECSQARU '99, London), LNAI 1638, 68-79
, 1999
Abstract

Cited by 2 (1 self)
In this paper we consider the problem of inducing causal relations from statistical data. Although it is well known that a correlation does not justify the claim of a causal relation between two measures, the question seems not to be settled. Research in the field of Bayesian networks revived an approach suggested in [16]. It is based on the idea that there are relationships between the causal structure of a domain and its corresponding probability distribution, which could be exploited to infer at least part of the causal structure from a set of dependence and independence statements. This idea was developed into the inductive causation algorithm [14]. We review this algorithm and examine the assumptions underlying it.

1 Introduction

If A causes B, an occurrence of A should be accompanied or (closely) followed by an occurrence of B. That causation implies conjunction is the basis of all reasoning about causation in statistics. But is this enough to infer causal relations from stati...
Improving High-Dimensional Bayesian Network Structure Learning by Exploiting Search Space Information
, 2006
Abstract

Cited by 1 (0 self)
Bayesian networks are frequently used to model statistical dependencies in data. Without prior knowledge of dependencies in the data, the structure of a Bayesian network is learned from the data. Bayesian network structure learning is commonly posed as an optimization problem where search is used to find structures that maximize a scoring function. Since the structure search space is super-exponential in the number of variables in a network, heuristics are applied to constrain the search space of high-dimensional networks. Greedy hill climbing is then applied in the reduced search space. The constrained search space of high-dimensional networks contains many local maxima that greedy hill climbing cannot overcome. This issue has so far been addressed only by augmenting greedy search with tabu lists or random moves, which is not a holistic solution to the problem. By using a search algorithm that is global in nature, we are not confined to results in a particular region of the search space, as previous approaches are. We present Model-Based Search (MBS) [1] applied to Bayesian network structure learning. MBS uses information gained during search to explore promising search space regions. Maintaining this search space information keeps a global view of the search task and helps find structures at higher maxima than greedy hill climbing does. We show that MBS performs better than hill climbing in the Max-Min Parents and Children (MMPC) [30] search space and can find better high-dimensional network structures than other leading structure learning algorithms.
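As a point of reference, the greedy hill climbing baseline mentioned above can be sketched in a few lines: toggle one edge at a time, keep the move if it preserves acyclicity and improves the score. The scoring function below is an illustrative stand-in, not the MDL or Bayesian scores used in practice:

```python
def is_acyclic(nodes, edges):
    """DFS cycle check on the directed graph given by `edges`."""
    adj = {n: [] for n in nodes}
    for a, b in edges:
        adj[a].append(b)
    state = {n: 0 for n in nodes}  # 0 unvisited, 1 on stack, 2 done
    def dfs(n):
        state[n] = 1
        for m in adj[n]:
            if state[m] == 1 or (state[m] == 0 and not dfs(m)):
                return False   # back edge found: cycle
        state[n] = 2
        return True
    return all(state[n] != 0 or dfs(n) for n in nodes)

def hill_climb(nodes, score):
    """Greedily toggle single edges while the score improves."""
    edges = set()
    improved = True
    while improved:
        improved = False
        for a in nodes:
            for b in nodes:
                if a == b:
                    continue
                cand = edges ^ {(a, b)}     # add or remove one edge
                if is_acyclic(nodes, cand) and score(cand) > score(edges):
                    edges, improved = cand, True
    return edges

# Toy score (an assumption for illustration): reward a known target structure.
target = {("A", "B"), ("B", "C")}
score = lambda e: len(e & target) - len(e - target)
print(sorted(hill_climb(["A", "B", "C"], score)))  # [('A', 'B'), ('B', 'C')]
```

With a unimodal toy score like this, greedy search reaches the optimum; the abstract's point is that realistic constrained search spaces are full of local maxima where this loop gets stuck.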
Lecture 11: Conditional Independence Learning
Abstract
There are two basic approaches:
• Learning from conditional independencies (CI learning)
• Learning using a scoring metric (Metric learning)

Reasoning Under Uncertainty (Korb)

CI learning (Verma and Pearl, 1991): Suppose you have an Oracle who can answer yes or no to any question of the type: X ⊥ Y | S? Then two rules allow discovery of the set of causal models consistent with all such answers ("patterns"):
1. Principle I: Put an undirected link between any two variables X and Y iff for every S such that X, Y ∉ S, ¬(X ⊥ Y | S).
2. Principle II: For every undirected v-structure X − Z − Y, orient the arcs X → Z ← Y.
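The two principles in the excerpt can be sketched against a hand-coded independence oracle. The oracle below encodes the toy collider X → Z ← Y and is an assumption for illustration; note that the full rule for orienting v-structures also checks that Z lies outside the separating set, a condition the truncated excerpt omits:

```python
from itertools import combinations

VARS = ["X", "Y", "Z"]

def indep(a, b, cond):
    """Assumed oracle answering "a _|_ b | cond?" for the collider X -> Z <- Y:
    X and Y are marginally independent but dependent given Z; each of X and Y
    is dependent with Z under every conditioning set."""
    if {a, b} == {"X", "Y"}:
        return "Z" not in cond
    return False

def skeleton():
    """Principle I: link a-b iff a and b are dependent given every S
    with a, b not in S."""
    edges = set()
    for a, b in combinations(VARS, 2):
        rest = [v for v in VARS if v not in (a, b)]
        subsets = [set(s) for r in range(len(rest) + 1)
                   for s in combinations(rest, r)]
        if all(not indep(a, b, s) for s in subsets):
            edges.add(frozenset((a, b)))
    return edges

def v_structures(edges):
    """Principle II: for each a - z - b with a, b non-adjacent,
    orient a -> z <- b."""
    oriented = set()
    for a, b in combinations(VARS, 2):
        if frozenset((a, b)) in edges:
            continue
        for z in VARS:
            if frozenset((a, z)) in edges and frozenset((b, z)) in edges:
                oriented |= {(a, z), (b, z)}
    return oriented

sk = skeleton()
print(sorted(v_structures(sk)))   # [('X', 'Z'), ('Y', 'Z')]
```

Principle I recovers the undirected skeleton X − Z − Y (no X − Y link, since X ⊥ Y marginally), and Principle II orients both arcs into Z, recovering the collider.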