Results 1–10 of 12
A Guide to the Literature on Learning Probabilistic Networks From Data
, 1996
Abstract

Cited by 172 (0 self)
This literature review discusses different methods under the general rubric of learning Bayesian networks from data, and includes some overlapping work on more general probabilistic networks. Connections are drawn between the statistical, neural network, and uncertainty communities, and between the different methodological communities, such as Bayesian, description length, and classical statistics. Basic concepts of learning and of Bayesian networks are introduced, and methods are then reviewed. Methods are discussed for learning the parameters of a probabilistic network, for learning its structure, and for learning hidden variables. The presentation avoids formal definitions and theorems, as these are plentiful in the literature, and instead illustrates key concepts with simplified examples. Keywords: Bayesian networks, graphical models, hidden variables, learning, learning structure, probabilistic networks, knowledge discovery. I. Introduction. Probabilistic networks or probabilistic gra...
An Alternative Markov Property for Chain Graphs
 Scand. J. Statist
, 1996
Abstract

Cited by 49 (4 self)
Graphical Markov models use graphs, either undirected, directed, or mixed, to represent possible dependences among statistical variables. Applications of undirected graphs (UDGs) include models for spatial dependence and image analysis, while acyclic directed graphs (ADGs), which are especially convenient for statistical analysis, arise in such fields as genetics and psychometrics and as models for expert systems and Bayesian belief networks. Lauritzen, Wermuth, and Frydenberg (LWF) introduced a Markov property for chain graphs, which are mixed graphs that can be used to represent simultaneously both causal and associative dependencies and which include both UDGs and ADGs as special cases. In this paper an alternative Markov property (AMP) for chain graphs is introduced, which in some ways is a more direct extension of the ADG Markov property than is the LWF property for chain graphs. 1. Introduction. Graphical Markov models use graphs, either undirected, directed, or mixed, to represent...
Bayesian Model Averaging And Model Selection For Markov Equivalence Classes Of Acyclic Digraphs
 Communications in Statistics: Theory and Methods
, 1996
Abstract

Cited by 38 (5 self)
Acyclic digraphs (ADGs) are widely used to describe dependences among variables in multivariate distributions. In particular, the likelihood functions of ADG models admit convenient recursive factorizations that often allow explicit maximum likelihood estimates and that are well suited to building Bayesian networks for expert systems. There may, however, be many ADGs that determine the same dependence (= Markov) model. Thus, the family of all ADGs with a given set of vertices is naturally partitioned into Markov-equivalence classes, each class being associated with a unique statistical model. Statistical procedures, such as model selection or model averaging, that fail to take these equivalence classes into account may incur substantial computational or other inefficiencies. Recent results have shown that each Markov-equivalence class is uniquely determined by a single chain graph, the essential graph, that is itself Markov-equivalent simultaneously to all ADGs in the equivalence clas...
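The Markov-equivalence classes discussed in this abstract admit a well-known graphical characterization (Verma-Pearl): two ADGs are Markov equivalent iff they share the same skeleton and the same v-structures. The sketch below is not from the paper; the dictionary-based graph representation is a hypothetical choice for illustration.

```python
# Minimal sketch of the Verma-Pearl criterion for Markov equivalence of
# acyclic digraphs.  A graph is a (made-up) dict mapping node -> list of parents.

def skeleton(dag):
    """Undirected edge set: one frozenset per directed edge."""
    return {frozenset((p, v)) for v, ps in dag.items() for p in ps}

def v_structures(dag):
    """Colliders a -> c <- b with a and b non-adjacent."""
    skel = skeleton(dag)
    colliders = set()
    for c, ps in dag.items():
        for a in ps:
            for b in ps:
                if a < b and frozenset((a, b)) not in skel:
                    colliders.add((a, c, b))
    return colliders

def markov_equivalent(g1, g2):
    return skeleton(g1) == skeleton(g2) and v_structures(g1) == v_structures(g2)

# A -> B -> C and A <- B <- C encode the same independencies ...
chain1 = {"A": [], "B": ["A"], "C": ["B"]}
chain2 = {"A": ["B"], "B": ["C"], "C": []}
# ... while the collider A -> C' ... here A -> B <- C does not.
collider = {"A": [], "B": ["A", "C"], "C": []}
print(markov_equivalent(chain1, chain2))    # True
print(markov_equivalent(chain1, collider))  # False
```

The essential graph mentioned above is precisely the canonical representative of such a class: edges directed the same way in every member stay directed, the rest become undirected.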
Probabilistic Network Construction Using the Minimum Description Length Principle
, 1994
Abstract

Cited by 28 (1 self)
Probabilistic networks can be constructed from a database of cases by selecting the network that has the highest quality with respect to this database according to a given measure. A new measure is presented for this purpose based on a minimum description length (MDL) approach. This measure is compared with a commonly used measure based on a Bayesian approach, both from a theoretical and an experimental point of view. We show that the two measures have the same properties for infinitely large databases. For smaller databases, however, the MDL measure assigns equal quality to networks that represent the same set of independencies while the Bayesian measure does not. Preliminary test results suggest that an algorithm for learning probabilistic networks using the minimum description length approach performs comparably to a learning algorithm using the Bayesian approach. However, the former is slightly faster.
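As a rough illustration of the kind of MDL measure this abstract describes (not the paper's exact formula), a common MDL score for a discrete network is the negative log-likelihood of the data plus a parameter cost of (k/2) log2 N per variable family, where k is the number of free parameters; lower is better. All names and the toy data below are assumptions for illustration.

```python
from collections import Counter
from math import log2

def mdl_score(data, structure, states):
    """MDL-style score (bits): data is a list of dicts var -> value,
    structure maps var -> list of parents, states maps var -> value list."""
    n = len(data)
    score = 0.0
    for var, parents in structure.items():
        joint = Counter(tuple(r[p] for p in parents) + (r[var],) for r in data)
        marg = Counter(tuple(r[p] for p in parents) for r in data)
        # negative log-likelihood of var given its parents (ML estimates)
        for key, c in joint.items():
            score -= c * log2(c / marg[key[:-1]])
        # parameter cost: (|states of var| - 1) * product of parent state counts
        k = len(states[var]) - 1
        for p in parents:
            k *= len(states[p])
        score += 0.5 * k * log2(n)
    return score

# Toy data in which B copies A: the dependent structure should win.
data = [{"A": 0, "B": 0}] * 8 + [{"A": 1, "B": 1}] * 8
states = {"A": [0, 1], "B": [0, 1]}
dependent = {"A": [], "B": ["A"]}
independent = {"A": [], "B": []}
print(mdl_score(data, dependent, states) < mdl_score(data, independent, states))  # True
```

Note how the score trades fit against complexity: the extra parameters of the edge A -> B cost 0.5 * 2 * log2(16) bits here, but save far more in likelihood.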
Chain Graphs for Learning
 In Uncertainty in Artificial Intelligence
, 1995
Abstract

Cited by 27 (1 self)
Chain graphs combine directed and undirected graphs, and their underlying mathematics combines properties of the two. This paper gives a simplified definition of chain graphs based on a hierarchical combination of Bayesian (directed) and Markov (undirected) networks. Examples of chain graphs are multivariate feedforward networks, clustering with conditional interaction between variables, and forms of Bayes classifiers. Chain graphs are then extended using the notation of plates so that samples and data analysis problems can be represented in a graphical model as well. Implications for learning are discussed in the conclusion. 1. Introduction. Probabilistic networks are a notational device that allows one to abstract forms of probabilistic reasoning without getting lost in the mathematical detail of the underlying equations. They offer a framework whereby many forms of probabilistic reasoning can be combined and performed on probabilistic models without careful hand programming. Efforts ...
Split models for contingency tables
, 2003
Abstract

Cited by 8 (1 self)
A framework is introduced for loglinear models with context-specific independence structures, i.e., conditional independencies that hold only for specific values of the conditioning variables. This framework is constituted by the class of split models. A software package named YGGDRASIL, designed for statistical inference in split models, is also presented. Split models are an extension of graphical models for contingency tables. The treatment of split models includes estimation, representation, and a Markov property for reading off independencies holding in a specific context. Two examples, including an illustration of the use of YGGDRASIL, are
Approximation of Bayesian networks through edge removals
, 1993
Abstract

Cited by 5 (0 self)
Due to the general inherent complexity of inference in Bayesian networks, the need to compromise the exactness of inference arises frequently. A scheme for reduction of complexity by enforcing additional conditional independences is investigated. The enforcement of independences is achieved through edge removals in a triangulated graph. The removal of a single edge may imply an enormous reduction of complexity, since other edges may become superfluous by its removal. The approximation scheme presented has several appealing features. Most notably among these, a bound on the overall approximation error can be computed locally, the bound on the error of a series of approximations equals the sum of the bounds of the errors of the individual approximations, and the influence of an approximation attenuates with increasing 'distance' from the edge removed. The scheme compares in some cases very favorably with the approximation method suggested by Jensen & Andersen (1990).
Bayesian Graphical Models
, 2000
Abstract

Cited by 1 (0 self)
ions and Land use have an impact on River flow. The essential property concerning the structural model is that it reflects conditional independence relations. Definition: Two variables A and B are independent if knowledge of A does not change the belief about B (and vice versa). A and B are conditionally independent given C if they are independent whenever the state of C is known. [Figure 1: Example of a simple Bayesian network representing a river basin, with nodes Level of domestic consumption, Industrial output, River flow, Rainfall, Land use, Industrial abstractions, Domestic abstractions, Agricultural abstractions, and Percentage of agricultural land irrigated; Abstraction and River flow have 7 states, the other variables have 4 states.] In the language of probabilities, conditional independence is defined as P(A | B, C) = P(A | C). (1) The definition of conditional independence generalizes in a straightforward way to sets of variables. The structural part of a Bayes...
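The definition P(A | B, C) = P(A | C) can be checked numerically given a full joint table. The sketch below is illustrative only; the joint distributions are made up, not the river-basin model of the abstract.

```python
from itertools import product

def cond_independent(joint, tol=1e-9):
    """joint[(a, b, c)] -> probability; test P(a | b, c) == P(a | c) everywhere."""
    A = {a for a, _, _ in joint}
    B = {b for _, b, _ in joint}
    C = {c for _, _, c in joint}
    for a, b, c in product(A, B, C):
        p_bc = sum(joint[(x, b, c)] for x in A)          # P(b, c)
        if p_bc == 0:
            continue                                     # P(a | b, c) undefined
        p_c = sum(joint[(x, y, c)] for x in A for y in B)  # P(c)
        p_a_bc = joint[(a, b, c)] / p_bc
        p_a_c = sum(joint[(a, y, c)] for y in B) / p_c
        if abs(p_a_bc - p_a_c) > tol:
            return False
    return True

# Joint built to factorize as P(c) P(a|c) P(b|c): A, B independent given C.
pa = {0: 0.2, 1: 0.7}   # P(A=1 | C=c)
pb = {0: 0.3, 1: 0.6}   # P(B=1 | C=c)
joint = {(a, b, c): 0.5 * (pa[c] if a else 1 - pa[c]) * (pb[c] if b else 1 - pb[c])
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}
print(cond_independent(joint))    # True

# Joint in which A always equals B: not conditionally independent given C.
coupled = {(a, b, c): (0.25 if a == b else 0.0)
           for a in (0, 1) for b in (0, 1) for c in (0, 1)}
print(cond_independent(coupled))  # False
```

In a real network one never enumerates the full joint; the point of the graphical structure is exactly to encode such independencies without tabulating it.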
Learning Structures from Data and Experts
Abstract

Cited by 1 (0 self)
In modelling complex stochastic systems, graphical association models provide a convenient framework. With graphical models the overall structure of association among variables is described in terms of conditional independence, and this structure can be represented graphically. Statistical methods for revealing these basic structures of association on the basis of data and expert knowledge are described. Suggestions are made on how to carry out more detailed modelling, and it is illustrated how to implement such models in a causal probabilistic network. As an illustration, a model is established for the incidence of fungi attacks and yield in relation to various cultural factors in winter wheat. 1. Introduction. This paper deals with problems related to identifying and quantifying structures of association in complex domains on the basis of data and expert knowledge. Modelling complex stochastic systems involving many variables can be difficult. In such situations it is tempting o...