Results 1–6 of 6
Cumulative distribution networks and the derivative-sum-product algorithm
Abstract

Cited by 12 (6 self)
We introduce a new type of graphical model called a ‘cumulative distribution network’ (CDN), which expresses a joint cumulative distribution as a product of local functions. Each local function can be viewed as providing evidence about possible orderings, or rankings, of variables. Interestingly, we find that the conditional independence properties of CDNs are quite different from those of other graphical models. We also describe a message-passing algorithm that efficiently computes conditional cumulative distributions. Due to the unique independence properties of the CDN, these messages do not in general have a one-to-one correspondence with messages exchanged in standard algorithms, such as belief propagation. We demonstrate the application of CDNs to structured ranking learning using a previously studied multiplayer gaming dataset.
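The core construction can be sketched in miniature: if every local function is itself a CDF, their product is a valid joint CDF, and the density follows by distributing the mixed partial derivatives over the factors via the product rule, which is the computation the derivative-sum-product algorithm organizes on the graph. The sketch below is illustrative only and assumes SciPy; the particular local functions (a bivariate Gaussian CDF and a logistic CDF) are arbitrary choices, and finite differences stand in for the paper's exact message computations.

```python
import numpy as np
from scipy.stats import multivariate_normal, logistic

RHO = 0.5
mvn = multivariate_normal(mean=[0, 0], cov=[[1, RHO], [RHO, 1]])

def phi_a(x, y):
    # Local function over {x, y}: a bivariate Gaussian CDF.
    return mvn.cdf([x, y])

def phi_b(y):
    # Local function over {y}: a logistic CDF.
    return logistic.cdf(y)

def F(x, y):
    # CDN joint CDF: a product of local functions. Each factor being a
    # CDF is sufficient for the product to be a valid CDF.
    return phi_a(x, y) * phi_b(y)

def density(x, y, h=0.05):
    # Derivative-sum-product idea in miniature: the density is the mixed
    # partial d^2 F / dx dy, expanded by the product rule over factors:
    #   p = (d^2 phi_a / dx dy) * phi_b + (d phi_a / dx) * (d phi_b / dy)
    # approximated here with central finite differences.
    d2a = (phi_a(x + h, y + h) - phi_a(x + h, y - h)
           - phi_a(x - h, y + h) + phi_a(x - h, y - h)) / (4 * h * h)
    da_dx = (phi_a(x + h, y) - phi_a(x - h, y)) / (2 * h)
    db_dy = (phi_b(y + h) - phi_b(y - h)) / (2 * h)
    return d2a * phi_b(y) + da_dx * db_dy
```

Note how the product rule scatters derivatives across factors: the resulting terms are exactly the "messages" that, per the abstract, need not line up one-to-one with belief-propagation messages.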
Cumulative distribution networks: Inference, estimation and applications of graphical models for cumulative distribution functions
, 2009
Mixed Cumulative Distribution Networks
Abstract

Cited by 3 (1 self)
Directed acyclic graphs (DAGs) are a popular framework to express multivariate probability distributions. Acyclic directed mixed graphs (ADMGs) are generalizations of DAGs that can succinctly capture much richer sets of conditional independencies, and are especially useful in modeling the effects of latent variables implicitly. Unfortunately, there are currently no parameterizations of general ADMGs. In this paper, we apply recent work on cumulative distribution networks and copulas to propose one general construction for ADMG models. We consider a simple parameter estimation approach, and report some encouraging experimental results. Reading off independence constraints from an ADMG can be done with a procedure essentially identical to d-separation (Pearl, 1988; Richardson and Spirtes, 2002). Given a graphical structure, the challenge is to provide a procedure to parameterize models that correspond to the independence constraints of the graph, as illustrated below. Example 1: Bidirected edges correspond to some hidden common parent that has been marginalized. In the Gaussian case, this has an easy interpretation as constraints in the marginal covariance matrix of the remaining variables. Consider the two graphs below.
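The Gaussian reading of Example 1 can be checked numerically: marginalizing a hidden common parent leaves its trace only as an entry in the marginal covariance matrix, which is what a bidirected edge encodes. This is an illustrative simulation, not code from the paper; the structural coefficients (0.8, 0.6, 0.5) are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# DAG with a hidden common parent: h -> x, h -> y, plus x -> z.
# Marginalizing h leaves a bidirected edge x <-> y in the ADMG.
h = rng.normal(size=n)
x = 0.8 * h + rng.normal(size=n)
y = 0.6 * h + rng.normal(size=n)
z = 0.5 * x + rng.normal(size=n)   # z receives no arrow from h

cov = np.cov(np.vstack([x, y, z]))
# cov[0, 1] ~ 0.8 * 0.6 = 0.48: induced purely by the marginalized parent.
# cov[1, 2] ~ 0.5 * 0.48 = 0.24: y's association with z flows through x.

# d-separation-style constraint: y and z are independent given x, so the
# partial covariance cov(y, z | x) should vanish.
cond = cov[1, 2] - cov[0, 1] * cov[0, 2] / cov[0, 0]
```

The vanishing partial covariance is exactly the kind of constraint the graph structure imposes and a parameterization of the ADMG must respect.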
A Bayesian Approach to Constraint Based Causal Inference
Abstract

Cited by 2 (1 self)
We target the problem of accuracy and robustness in causal inference from finite data sets. Some state-of-the-art algorithms produce clear output complete with solid theoretical guarantees but are susceptible to propagating erroneous decisions, while others are very adept at handling and representing uncertainty, but need to rely on undesirable assumptions. Our aim is to combine the inherent robustness of the Bayesian approach with the theoretical strength and clarity of constraint-based methods. We use a Bayesian score to obtain probability estimates on the input statements used in a constraint-based procedure. These are subsequently processed in decreasing order of reliability, letting more reliable decisions take precedence in case of conflicts, until a single output model is obtained. Tests show that a basic implementation of the resulting Bayesian Constraint-based Causal Discovery (BCCD) algorithm already outperforms established procedures such as FCI and Conservative PC. It can also indicate which causal decisions in the output have high reliability and which do not.
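The reliability-ordered decision loop described here can be sketched in a few lines. This is a hypothetical rendering of the idea only: the statement encoding and the conflict test are illustrative stand-ins, not the BCCD paper's actual scoring or causal machinery.

```python
def bccd_select(statements, conflicts):
    """Greedy reliability-ordered selection, sketching the BCCD loop.

    statements: list of (claim, reliability) pairs, reliability in [0, 1].
    conflicts:  set of frozenset pairs of mutually exclusive claims.
    """
    accepted = []
    # Process from most to least reliable, so more reliable decisions
    # take precedence whenever two statements conflict.
    for claim, rel in sorted(statements, key=lambda s: -s[1]):
        if any(frozenset((claim, a)) in conflicts for a in accepted):
            continue  # a more reliable, conflicting decision already won
        accepted.append(claim)
    return accepted

# Toy input: an independence claim, a conflicting dependence claim, and
# an unrelated claim (names and probabilities are made up).
stmts = [("X _||_ Y", 0.95), ("X -> Y", 0.80), ("Y -> Z", 0.70)]
confl = {frozenset(("X _||_ Y", "X -> Y"))}
result = bccd_select(stmts, confl)
# -> ["X _||_ Y", "Y -> Z"]: the dependence claim loses to the more
#    reliable independence claim.
```

Because acceptance is ordered by the Bayesian probability estimates, each output decision also carries a natural reliability ranking, matching the abstract's claim that BCCD can flag which decisions are trustworthy.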
A MCMC Approach for Learning the Structure of Gaussian Acyclic Directed Mixed Graphs
Abstract
Graphical models are widely used to encode conditional independence constraints and causal assumptions, the directed acyclic graph (DAG) being one of the most common types of models. However, DAGs are not closed under marginalization: that is, a chosen marginal of a distribution Markov to a DAG might not be representable with another DAG, unless one discards some of the structural independencies. Acyclic directed mixed graphs (ADMGs) generalize DAGs so that closure under marginalization is possible. In a previous work, we showed how to perform Bayesian inference to infer the posterior distribution of the parameters of a given Gaussian ADMG model, where the graph is fixed. In this paper, we extend this procedure to allow for priors over graph structures.
Bayesian Probabilities for Constraint-Based Causal Discovery
, Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence
Abstract
We target the problem of accuracy and robustness in causal inference from finite data sets. Our aim is to combine the inherent robustness of the Bayesian approach with the theoretical strength and clarity of constraint-based methods. We use a Bayesian score to obtain probability estimates on the input statements used in a constraint-based procedure. These are subsequently processed in decreasing order of reliability, letting more reliable decisions take precedence in case of conflicts, until a single output model is obtained. Tests show that a basic implementation of the resulting Bayesian Constraint-based Causal Discovery (BCCD) algorithm already outperforms established procedures such as FCI and Conservative PC. It indicates which causal decisions in the output have high reliability and which do not. The approach is easily adapted to other application areas such as complex independence tests.