Learning Bayesian Networks from Data: An Information-Theory Based Approach
Cited by 93 (5 self)
This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct, as well as empirical evidence (from real-world applications and simulation tests) demonstrating that these systems work efficiently and reliably in practice.
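The information-theoretic CI tests this abstract refers to are typically thresholded estimates of conditional mutual information computed from data counts. A minimal sketch of such a test (the function names and the threshold value are illustrative, not taken from the paper):

```python
from collections import Counter
from math import log2

def conditional_mutual_information(samples, x, y, z):
    """Estimate I(X;Y|Z) in bits from a list of sample dicts."""
    n = len(samples)
    cxyz = Counter((s[x], s[y], s[z]) for s in samples)
    cxz = Counter((s[x], s[z]) for s in samples)
    cyz = Counter((s[y], s[z]) for s in samples)
    cz = Counter(s[z] for s in samples)
    cmi = 0.0
    for (xv, yv, zv), nxyz in cxyz.items():
        # p(x,y,z) * log2( p(x,y,z) p(z) / (p(x,z) p(y,z)) ), counts cancel n
        cmi += (nxyz / n) * log2(nxyz * cz[zv] / (cxz[(xv, zv)] * cyz[(yv, zv)]))
    return cmi

def ci_test(samples, x, y, z, threshold=0.01):
    """Declare X independent of Y given Z when the estimated CMI is near zero."""
    return conditional_mutual_information(samples, x, y, z) < threshold
```

With exact frequencies the estimate is zero precisely when the empirical distribution satisfies the independence; on finite samples the threshold absorbs estimation noise.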
Conditional Independence Relations Have No Finite Complete Characterization
, 1990
Cited by 43 (6 self)
The hypothesis that a finite characterization of conditional-independence relations (CIRs) exists is refuted. This result is shown to be equivalent to the nonexistence of a simple deductive system describing relationships among CI statements (a certain type of syntactic description). However, the existence of a countable characterization of CIRs is shown. Finally, the problem of characterizing CIRs is shown to be different from the analogous problem of axiomatizing EMVDs arising in the theory of relational databases.
INTRODUCTION
Let [ξ_i]_{i∈N} be a random vector (2 ≤ card N < ∞) and let us suppose for simplicity that its components are finite-valued random variables. Then we can define a ternary disjoint relation I on exp N (disjoint means that its domain is the set of triplets of pairwise disjoint subsets of N): I(A; B|C) holds iff [ξ_i]_{i∈A} is conditionally independent of [ξ_i]_{i∈B} given [ξ_i]_{i∈C}. We shall ca...
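For a small discrete distribution, the ternary relation I(A; B|C) above can be checked directly from the definition: A is independent of B given C iff p(a,b,c)·p(c) = p(a,c)·p(b,c) for every cell. A sketch under that factorization test (function names are illustrative):

```python
def marginal(p, idx):
    """Marginalize a joint distribution p (full-assignment tuple -> prob)
    onto the variable indices listed in idx."""
    m = {}
    for assign, pr in p.items():
        key = tuple(assign[i] for i in idx)
        m[key] = m.get(key, 0.0) + pr
    return m

def ci_holds(p, A, B, C, tol=1e-9):
    """Test I(A; B | C): p(a,b,c) * p(c) == p(a,c) * p(b,c) for all cells.
    A, B, C are disjoint lists of variable indices."""
    pABC, pAC, pBC, pC = (marginal(p, idx) for idx in (A + B + C, A + C, B + C, C))
    for key, pr in pABC.items():
        a, b, c = key[:len(A)], key[len(A):len(A) + len(B)], key[len(A) + len(B):]
        if abs(pr * pC[c] - pAC[a + c] * pBC[b + c]) > tol:
            return False
    return True
```

For example, for three binary variables with X = Z and Y = Z deterministically, I({X}; {Y} | {Z}) holds while the unconditional I({X}; {Y} | ∅) fails.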
On the toric algebra of graphical models
, 2006
Cited by 36 (6 self)
We formulate necessary and sufficient conditions for an arbitrary discrete probability distribution to factor according to an undirected graphical model, or a log-linear model, or other more general exponential models. For decomposable graphical models these conditions are equivalent to a set of conditional independence statements similar to the Hammersley–Clifford theorem; however, we show that for non-decomposable graphical models they are not. We also show that non-decomposable models can have non-rational maximum likelihood estimates. These results are used to give several novel characterizations of decomposable graphical models.
On the Implication Problem for Probabilistic Conditional Independency
, 2000
Cited by 35 (30 self)
The implication problem is to test whether a given set of independencies logically implies another independency. This problem is crucial in the design of a probabilistic reasoning system. We advocate that Bayesian networks are a generalization of standard relational databases. In contrast, it has been suggested that Bayesian networks differ from relational databases because the implication problems of the two systems do not coincide for some classes of probabilistic independencies. This remark, however, does not take into consideration one important issue, namely the solvability of the implication problem.
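One half of the implication problem can be mechanized: the semi-graphoid axioms (symmetry, decomposition, weak union, contraction) are sound inference rules for probabilistic CI, so closing a statement set under them derives implied independencies, although, by the non-axiomatizability result above, no finite rule set is complete. A small fixpoint sketch over triples of frozensets (representation is illustrative):

```python
from itertools import combinations

def subsets(s):
    """All subsets of s as frozensets."""
    s = sorted(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def semigraphoid_closure(stmts):
    """Close a set of CI triples (A, B, C), each a frozenset, meaning I(A; B | C),
    under the semi-graphoid axioms. Terminates: the universe of triples is finite."""
    closure = set(stmts)
    while True:
        new = set()
        for (A, B, C) in closure:
            new.add((B, A, C))                        # symmetry
            for W in subsets(B):
                Y = B - W
                if Y and W:
                    new.add((A, Y, C))                # decomposition
                    new.add((A, Y, C | W))            # weak union
        for (A1, Y, CW) in closure:                   # contraction:
            for (A2, W, C) in closure:                # I(A;Y|C∪W) & I(A;W|C)
                if A1 == A2 and C <= CW and CW - C == W and not (W & Y):
                    new.add((A1, Y | W, C))           # => I(A; Y∪W | C)
        if new <= closure:
            return closure
        closure |= new
```

The subset enumeration is exponential in the size of B, so this is only a toy for small variable sets, but it illustrates what a syntactic solver for the implication problem manipulates.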
Learning Bayesian Networks from Data: An Efficient Approach Based on Information Theory
, 1997
Cited by 35 (0 self)
This paper addresses the problem of learning Bayesian network structures from data by using an information-theoretic dependency analysis approach. Based on our three-phase construction mechanism, two efficient algorithms have been developed. One of our algorithms deals with a special case where the node ordering is given; the algorithm requires only O(N^2) CI tests and is correct given that the underlying model is DAG-Faithful [Spirtes et al., 1996]. The other algorithm deals with the general case and requires O(N^4) conditional independence (CI) tests. It is correct given that the underlying model is monotone DAG-Faithful (see Section 4.4). A system based on these algorithms has been developed and distributed through the Internet. The empirical results show that our approach is efficient and reliable.
1 Introduction
The Bayesian network is a powerful knowledge representation and reasoning tool under conditions of uncertainty. A Bayesian network is a directed acyclic graph ...
Using Path Diagrams as a Structural Equation Modelling Tool
, 1997
Cited by 29 (7 self)
In this paper, we will show how path diagrams can be used to solve a number of important problems in structural equation modelling. There are a number of problems associated with structural equation modelling. These problems include:
Constructing the Dependency Structure of a Multiagent Probabilistic Network
 IEEE Transactions on Knowledge and Data Engineering
, 2001
Cited by 26 (16 self)
In this paper, we propose an automated process for constructing the combined dependency structure of a ########## probabilistic network. Each domain expert supplies any known conditional independency information, and not necessarily an explicit dependency structure. Our method determines a succinct representation of all the supplied independency information, called a ####### #####. This process involves detecting all ############ information and removing all ######### information. A ###### dependency structure of the multiagent probabilistic network can be constructed directly from this minimal cover. The main result of this paper is that the constructed dependency structure is a ########### of the minimal cover. That is, every probabilistic conditional independency logically implied by the minimal cover can be inferred from the dependency structure, and every probabilistic conditional independency inferred from the dependency structure is logically implied by the minimal cover.
On Chain Graph Models For Description Of Conditional Independence Structures
 Ann. Statist
, 1998
Cited by 19 (3 self)
This paper deals with chain graphs (CGs), which allow both directed and undirected edges. This class of graphs, introduced by Lauritzen and Wermuth [15], generalizes both UGs and DAGs. To establish the semantics of CGs one should associate an independency model to every CG. Some steps were already made. Lauritzen and Wermuth [16] intended to use CGs to describe independency models for strictly positive probability distributions and introduced the concept of the chain Markov property, which is analogous to the concept of causal input list for DAGs. Lauritzen and Frydenberg [17, 9] generalized the concept of moral graph and introduced a moralization criterion for reading independency statements from a CG. Frydenberg [9] characterized CGs with the same Markov property (that is, producing the same CG model), and Andersson, Madigan and Perlman [3] used special CGs to represent uniquely classes of Markov equivalent DAGs. Whittaker [31] in his book gave several examples of the use of CGs, and other recent works also deal with them [6, 20, 23, 30]; the most comprehensive account is provided by the book [19]. Several results proved here were already presented (without proof) in our previous conference contribution [5]. An alternative approach to the generalization of UGs and DAGs was started by Cox and Wermuth [7], who introduced a wider class of joint-response chain graphs which also allow 'dashed' directed and undirected edges in addition to the classic 'solid' directed and undirected edges treated in this paper. Andersson, Madigan and Perlman [1] introduced an alternative Markov property to give an interpretation to those joint-response CGs which combine dashed directed edges with solid undirected edges (of course, another independency model is associated...
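In the DAG special case, the moralization criterion mentioned above reduces to the familiar moral-graph construction: connect ("marry") every pair of parents sharing a child, then drop edge directions. A sketch for DAGs (the chain-graph version marries parents of whole chain components instead; the dict-of-parents representation is illustrative):

```python
def moralize(dag):
    """Moral graph of a DAG given as {node: set of parents}.
    Returns the set of undirected edges, each edge a frozenset of two nodes."""
    edges = set()
    for child, parents in dag.items():
        for p in parents:
            edges.add(frozenset({p, child}))       # keep each edge, undirected
        for p in parents:
            for q in parents:
                if p != q:
                    edges.add(frozenset({p, q}))   # marry co-parents
    return edges
```

On the collider a -> c <- b, moralization adds the edge {a, b}, which is exactly why separation must be read off the moral graph rather than the skeleton.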
Conditional Independence
, 1997
Cited by 14 (0 self)
This article has been prepared as an entry for the Wiley Encyclopedia of Statistical Sciences (Update). It gives a brief overview of fundamental properties and applications of conditional independence.
ESS Update. A. P. Dawid. Conditional Independence. Keywords: ancillarity; axioms; graphical models; Markov properties; sufficiency.
The concepts of independence and conditional independence (CI) between random variables originate in Probability Theory, where they are introduced as properties of an underlying probability measure P on the sample space (see CONDITIONAL PROBABILITY AND EXPECTATION). Much of traditional Probability Theory and Statistics involves analysis of distributions having such properties: for example, limit theorems for independent and identically distributed variables, or the theory of MARKOV PROCESSES. More recently, it has become apparent that it is fruitful to treat conditional independence (and its special case, independence) as a primitive concept, with an intuitive meaning, ...