Results 1 – 3 of 3
Cumulative distribution networks and the derivative-sum-product algorithm
Abstract

Cited by 12 (6 self)
We introduce a new type of graphical model called a ‘cumulative distribution network’ (CDN), which expresses a joint cumulative distribution as a product of local functions. Each local function can be viewed as providing evidence about possible orderings, or rankings, of variables. Interestingly, we find that the conditional independence properties of CDNs are quite different from other graphical models. We also describe a message-passing algorithm that efficiently computes conditional cumulative distributions. Due to the unique independence properties of the CDN, these messages do not in general have a one-to-one correspondence with messages exchanged in standard algorithms, such as belief propagation. We demonstrate the application of CDNs for structured ranking learning using a previously studied multiplayer gaming dataset.
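To make the product-of-local-functions idea concrete, here is a minimal Python sketch of a three-variable CDN with two local functions, F(x1, x2, x3) = φa(x1, x2) · φb(x2, x3). The Gumbel-type local function φ(x, y) = exp(−(e^−x + e^−y)) is an illustrative choice of ours, not taken from the paper; we only verify basic CDF properties numerically.

```python
import math

def phi(x, y):
    # A bivariate Gumbel-type CDF used as a local function (illustrative choice).
    return math.exp(-(math.exp(-x) + math.exp(-y)))

def joint_cdf(x1, x2, x3):
    # CDN structure: the joint CDF is a product of local functions defined
    # over overlapping subsets of the variables (here x2 is shared).
    return phi(x1, x2) * phi(x2, x3)

# Basic sanity checks: F is nondecreasing in each argument,
# tends to 1 as all arguments grow, and to 0 as any argument shrinks.
assert joint_cdf(0, 0, 0) < joint_cdf(1, 0, 0) < joint_cdf(1, 1, 0)
print(joint_cdf(10, 10, 10))   # close to 1
print(joint_cdf(-5, 0, 0))     # close to 0
```

Differentiating such an F with respect to all of its arguments recovers the density, which is what the derivative-sum-product algorithm of the title computes efficiently by message passing on the graph.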
Cumulative distribution networks: Inference, estimation and applications of graphical models for cumulative distribution functions
, 2009
Maximum-likelihood learning of cumulative distribution functions on graphs
 13th International Conference on Artificial Intelligence and Statistics, AISTATS
, 2010
Abstract

Cited by 2 (1 self)
For many applications, a probability model can be more easily expressed as a cumulative distribution function (CDF) than as a probability density or mass function (PDF/PMF). One advantage of CDF models is the simplicity of representing multivariate heavy-tailed distributions. Examples of fields that can benefit from the use of graphical models for CDFs include climatology and epidemiology, where data follow heavy-tailed distributions and exhibit spatial correlations, so that dependencies between model variables must be accounted for. However, in most cases the problem of learning from data consists of optimizing the log-likelihood function with respect to model parameters, where we are required to optimize a log-PDF/PMF and not a log-CDF. Given a CDF defined on a graph, we present a message-passing algorithm called the gradient-derivative-product (GDP) algorithm that allows us to learn the model in terms of the log-likelihood function, whereby messages correspond to local gradients of the likelihood with respect to model parameters. We demonstrate the GDP algorithm on real-world rainfall and H1N1 mortality data, and we show that the heavy-tailed multivariate distributions that arise in these problems can both be naturally parameterized and tractably estimated from data using our algorithm.
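As a one-dimensional analogue of the idea (a sketch of ours, not the authors' graph-structured GDP algorithm): when a model is specified as a CDF F(x; θ), the likelihood is its derivative in x, and learning ascends the gradient of the resulting log-likelihood in θ. Below, a Pareto CDF F(x; α) = 1 − x^−α (a heavy-tailed example chosen for illustration) yields the density f(x; α) = α·x^−(α+1), and gradient ascent recovers the closed-form MLE α̂ = n / Σ ln x_i.

```python
import math
import random

random.seed(0)

# Heavy-tailed model given as a CDF: F(x; alpha) = 1 - x**(-alpha), x >= 1.
# Differentiating the CDF in x gives the density f(x; alpha) = alpha * x**(-(alpha + 1)).
alpha_true = 2.0
n = 500
# Sample by inverting the CDF: x = (1 - u)**(-1/alpha).
samples = [(1.0 - random.random()) ** (-1.0 / alpha_true) for _ in range(n)]

# Gradient of the log-likelihood in the parameter:
# d/d_alpha sum(log f) = n/alpha - sum(log x_i).
S = sum(math.log(x) for x in samples)
alpha = 1.0
for _ in range(5000):
    grad = n / alpha - S
    alpha += 1e-3 * grad

print(alpha)   # gradient ascent estimate
print(n / S)   # closed-form MLE; should agree
```

The GDP algorithm generalizes this pattern to graph-structured CDFs, where the derivative of the log-likelihood factorizes and the local gradient terms are computed by message passing.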