Results 1–10 of 13
Cumulative distribution networks and the derivative-sum-product algorithm
Cited by 17 (6 self)
We introduce a new type of graphical model called a ‘cumulative distribution network’ (CDN), which expresses a joint cumulative distribution as a product of local functions. Each local function can be viewed as providing evidence about possible orderings, or rankings, of variables. Interestingly, we find that the conditional independence properties of CDNs are quite different from other graphical models. We also describe a message-passing algorithm that efficiently computes conditional cumulative distributions. Due to the unique independence properties of the CDN, these messages do not in general have a one-to-one correspondence with messages exchanged in standard algorithms, such as belief propagation. We demonstrate the application of CDNs for structured ranking learning using a previously studied multiplayer gaming dataset.
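The abstract's core construction — a joint CDF expressed as a product of local functions, each itself a CDF over its scope — can be illustrated with a minimal sketch. Everything below is a hypothetical illustration, not code from the paper: the chain structure, the logistic marginals, and the FGM copula factors are all assumptions chosen for simplicity.

```python
# Hypothetical sketch of a tiny CDN over a chain x1 -- x2 -- x3: the joint
# CDF is a product of two local functions, each a valid bivariate CDF
# (an FGM copula over logistic marginals). Per the CDN construction, a
# product of CDFs over overlapping scopes is again a CDF.
import math

def logistic_cdf(x):
    """Univariate logistic CDF, used as a marginal."""
    return 1.0 / (1.0 + math.exp(-x))

def fgm(u, v, alpha=0.5):
    """Farlie-Gumbel-Morgenstern bivariate copula (valid for |alpha| <= 1)."""
    return u * v * (1.0 + alpha * (1.0 - u) * (1.0 - v))

def joint_cdf(x1, x2, x3):
    """CDN joint CDF: product of local functions over {x1,x2} and {x2,x3}."""
    phi_a = fgm(logistic_cdf(x1), logistic_cdf(x2))
    phi_b = fgm(logistic_cdf(x2), logistic_cdf(x3))
    return phi_a * phi_b

# Sanity checks of CDF behaviour: limits at +/- infinity and monotonicity.
print(round(joint_cdf(30, 30, 30), 6))   # -> 1.0 (approaches 1)
print(round(joint_cdf(-30, 0, 0), 6))    # -> 0.0 (approaches 0)
print(joint_cdf(0, 0, 0) < joint_cdf(1, 0, 0))  # -> True (monotone in x1)
```

Differentiating such a product to obtain densities or conditionals is exactly where the paper's derivative-sum-product message-passing scheme comes in; the sketch only verifies that the product behaves as a CDF.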
Wishart distributions for decomposable covariance graph models, Ann. Statist
, 2010
A Bayesian Approach to Constraint Based Causal Inference
Cited by 7 (1 self)
We target the problem of accuracy and robustness in causal inference from finite data sets. Some state-of-the-art algorithms produce clear output complete with solid theoretical guarantees but are susceptible to propagating erroneous decisions, while others are very adept at handling and representing uncertainty, but need to rely on undesirable assumptions. Our aim is to combine the inherent robustness of the Bayesian approach with the theoretical strength and clarity of constraint-based methods. We use a Bayesian score to obtain probability estimates on the input statements used in a constraint-based procedure. These are subsequently processed in decreasing order of reliability, letting more reliable decisions take precedence in case of conflicts, until a single output model is obtained. Tests show that a basic implementation of the resulting Bayesian Constraint-based Causal Discovery (BCCD) algorithm already outperforms established procedures such as FCI and Conservative PC. It can also indicate which causal decisions in the output have high reliability and which do not.
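The processing loop the abstract describes — scored statements handled in decreasing order of reliability, with earlier (more reliable) decisions taking precedence on conflict — can be sketched in a few lines. This is a hedged toy illustration, not the authors' implementation: the statement encoding, the `conflicts` rule, and the example probabilities are all hypothetical.

```python
# Toy sketch of BCCD-style reliability-ordered processing. Each input
# statement carries a Bayesian probability estimate; statements are handled
# from most to least reliable, and a statement is skipped when it conflicts
# with one already accepted. Claims and probabilities here are made up.
statements = [
    (0.95, ("cause", "A", "B")),
    (0.90, ("no-cause", "B", "C")),
    (0.60, ("no-cause", "A", "B")),   # conflicts with the 0.95 claim
    (0.55, ("cause", "B", "C")),      # conflicts with the 0.90 claim
]

def conflicts(claim, accepted):
    """A claim conflicts if its negation on the same pair was accepted."""
    kind, x, y = claim
    negation = ("no-cause" if kind == "cause" else "cause", x, y)
    return negation in accepted

accepted = set()
for prob, claim in sorted(statements, reverse=True):
    if not conflicts(claim, accepted):
        accepted.add(claim)

print(sorted(accepted))  # -> [('cause', 'A', 'B'), ('no-cause', 'B', 'C')]
```

The two low-probability claims are discarded because the more reliable, contradictory decisions were processed first — the precedence behaviour the abstract attributes to BCCD.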
Cumulative distribution networks: Inference, estimation and applications of graphical models for cumulative distribution functions
, 2009
Mixed Cumulative Distribution Networks
Cited by 5 (1 self)
Directed acyclic graphs (DAGs) are a popular framework to express multivariate probability distributions. Acyclic directed mixed graphs (ADMGs) are generalizations of DAGs that can succinctly capture much richer sets of conditional independencies, and are especially useful in modeling the effects of latent variables implicitly. Unfortunately, there are currently no parameterizations of general ADMGs. In this paper, we apply recent work on cumulative distribution networks and copulas to propose one general construction for ADMG models. We consider a simple parameter estimation approach, and report some encouraging experimental results. Reading off independence constraints from an ADMG can be done with a procedure essentially identical to d-separation (Pearl, 1988; Richardson and Spirtes, 2002). Given a graphical structure, the challenge is to provide a procedure to parameterize models that correspond to the independence constraints of the graph. For example, bidirected edges correspond to some hidden common parent that has been marginalized; in the Gaussian case, this has an easy interpretation as constraints in the marginal covariance matrix of the remaining variables.
Editors
Cited by 2 (0 self)
The working papers published in the Series constitute work in progress circulated to stimulate discussion and critical comments. Views expressed represent exclusively the authors' own opinions and do not necessarily reflect those of the editors. Ruhr Economic Papers #407
A MCMC Approach for Learning the Structure of Gaussian Acyclic Directed Mixed Graphs
Cited by 1 (1 self)
Graphical models are widely used to encode conditional independence constraints and causal assumptions, the directed acyclic graph (DAG) being one of the most common types of models. However, DAGs are not closed under marginalization: that is, a chosen marginal of a distribution Markov to a DAG might not be representable with another DAG, unless one discards some of the structural independencies. Acyclic directed mixed graphs (ADMGs) generalize DAGs so that closure under marginalization is possible. In a previous work, we showed how to perform Bayesian inference to infer the posterior distribution of the parameters of a given Gaussian ADMG model, where the graph is fixed. In this paper, we extend this procedure to allow for priors over graph structures.
Bayesian Inference in Cumulative Distribution Fields
One approach for constructing copula functions is by multiplication. Given that products of cumulative distribution functions (CDFs) are also CDFs, an adjustment to this multiplication will result in a copula model, as discussed by Liebscher (J Mult Analysis, 2008). Parameterizing models via products of CDFs has some advantages, both from the copula perspective (e.g., it is well-defined for any dimensionality) and from general multivariate analysis (e.g., it provides models where small-dimensional marginal distributions can be easily read off from the parameters). Independently, Huang and Frey (J Mach Learn Res, 2011) showed the connection between certain sparse graphical models and products of CDFs, as well as message-passing (dynamic programming) schemes for computing the likelihood function of such models. Such schemes allow models to be estimated with likelihood-based methods. We discuss and demonstrate MCMC approaches for estimating such models in a Bayesian context, their application in copula modeling, and how message-passing can be strongly simplified. Importantly, our view of message-passing opens up possibilities for scaling up such methods, given that even dynamic programming is not a scalable solution for calculating likelihood functions in many models.
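The "multiplication with an adjustment" idea the abstract opens with can be made concrete with a small sketch of Liebscher's construction: a product of copulas evaluated at power-transformed arguments stays a copula when the exponents for each coordinate sum to one. The specific factors and exponents below are assumptions chosen for illustration, not taken from the paper.

```python
# Illustrative sketch of copula construction by adjusted multiplication:
# C(u,v) = C1(u^a, v^b) * C2(u^(1-a), v^(1-b)) is again a copula
# (Liebscher's construction). The factors and exponents are hypothetical.

def c_indep(u, v):
    """Independence copula."""
    return u * v

def c_min(u, v):
    """Comonotonicity (Frechet upper bound) copula."""
    return min(u, v)

def c_product(u, v, a=0.3, b=0.6):
    """Adjusted product of two copulas; exponents per coordinate sum to 1."""
    return c_indep(u**a, v**b) * c_min(u**(1 - a), v**(1 - b))

# The adjustment preserves uniform marginals: C(u, 1) = u and C(1, v) = v,
# which a naive product of CDFs would not satisfy.
print(round(c_product(0.4, 1.0), 6))  # -> 0.4
print(round(c_product(1.0, 0.7), 6))  # -> 0.7
```

The exponent adjustment is exactly why a plain product of CDFs needs modification to yield a copula: without it, C(u, 1) would be a product of two copies of u-like terms rather than u itself.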
Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence: Bayesian Probabilities for Constraint-Based Causal Discovery
We target the problem of accuracy and robustness in causal inference from finite data sets. Our aim is to combine the inherent robustness of the Bayesian approach with the theoretical strength and clarity of constraint-based methods. We use a Bayesian score to obtain probability estimates on the input statements used in a constraint-based procedure. These are subsequently processed in decreasing order of reliability, letting more reliable decisions take precedence in case of conflicts, until a single output model is obtained. Tests show that a basic implementation of the resulting Bayesian Constraint-based Causal Discovery (BCCD) algorithm already outperforms established procedures such as FCI and Conservative PC. It indicates which causal decisions in the output have high reliability and which do not. The approach is easily adapted to other application areas such as complex independence tests.