## The Multiinformation Function As A Tool For Measuring Stochastic Dependence (1998)

Venue: Learning in Graphical Models

Citations: 32 (0 self)

### BibTeX

@INPROCEEDINGS{Studeny98themultiinformation,

author = {M. Studený and J. Vejnarová},

title = {The Multiinformation Function As A Tool For Measuring Stochastic Dependence},

booktitle = {Learning in Graphical Models},

year = {1998},

pages = {261--298},

publisher = {Kluwer Academic Publishers}

}

### Abstract

Given a collection of random variables (ξ_i)_{i∈N}, where N is a finite nonempty set, the corresponding multiinformation function ascribes to every subset A ⊆ N the relative entropy of the joint distribution of (ξ_i)_{i∈A} with respect to the product of the distributions of the individual random variables ξ_i for i ∈ A. We argue that it is a useful tool for problems concerning stochastic (conditional) dependence and independence (at least in the discrete case). First, it makes it possible to express the conditional mutual information between (ξ_i)_{i∈A} and (ξ_i)_{i∈B} given (ξ_i)_{i∈C} (for every disjoint A, B, C ⊆ N), which can be considered a good measure of conditional stochastic dependence. Second, one can introduce reasonable measures of dependence of level r among the variables (ξ_i)_{i∈A} (where A ⊆ N, 1 ≤ r < card A) which are expressible by means of the multiinformation function. Third, it enables one to derive theoretical results on (nonexistence of an) axiomatic characterization of stochastic c...
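To make the first point of the abstract concrete, the sketch below computes the multiinformation M(A) of a discrete joint distribution (sum of marginal entropies minus the joint entropy, which equals the stated relative entropy) and then expresses conditional mutual information through the identity I(A;B|C) = M(A∪B∪C) + M(C) − M(A∪C) − M(B∪C). This is a minimal illustration, not the authors' code; all function names and the XOR example distribution are assumptions chosen for clarity.

```python
# Illustrative sketch (not from the paper): multiinformation of a discrete
# joint distribution, and conditional mutual information expressed via it.
from itertools import product
from math import log2

def marginal(joint, subset):
    """Marginal distribution over the variables indexed by `subset`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in subset)
        out[key] = out.get(key, 0.0) + p
    return out

def entropy(dist):
    """Shannon entropy in bits of a {outcome: probability} dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def multiinformation(joint, A):
    """M(A) = sum_{i in A} H(xi_i) - H((xi_i)_{i in A}); zero if |A| <= 1."""
    if len(A) <= 1:
        return 0.0
    singles = sum(entropy(marginal(joint, (i,))) for i in A)
    return singles - entropy(marginal(joint, tuple(A)))

def cmi(joint, A, B, C):
    """I(A;B|C) = M(A+B+C) + M(C) - M(A+C) - M(B+C)  (A, B, C disjoint)."""
    return (multiinformation(joint, A + B + C) + multiinformation(joint, C)
            - multiinformation(joint, A + C) - multiinformation(joint, B + C))

# Example: xi_2 = xi_0 XOR xi_1, with xi_0, xi_1 independent fair bits.
# The three variables are pairwise independent but jointly dependent.
joint = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}
```

On the XOR example, `cmi(joint, (0,), (1,), ())` is 0 (pairwise independence) while `cmi(joint, (0,), (1,), (2,))` is 1 bit, showing how conditioning can create dependence; the multiinformation of all three variables is likewise 1 bit.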