## A Hilbert space embedding for distributions (2007)


Venue: Algorithmic Learning Theory: 18th International Conference (ALT 2007)

Citations: 53 (26 self)

### BibTeX

@INPROCEEDINGS{Smola07ahilbert,
  author    = {Alex Smola and Arthur Gretton and Le Song and Bernhard Schölkopf},
  title     = {A Hilbert space embedding for distributions},
  booktitle = {Algorithmic Learning Theory: 18th International Conference},
  year      = {2007},
  pages     = {13--31},
  publisher = {Springer-Verlag}
}


### Abstract

We describe a technique for comparing distributions without requiring density estimation as an intermediate step. Our approach relies on mapping the distributions into a reproducing kernel Hilbert space. Applications of this technique include two-sample tests, which determine whether two sets of observations arise from the same distribution, as well as covariate shift correction, local learning, measures of independence, and density estimation.

Kernel methods are widely used in supervised learning [1, 2, 3, 4]; however, they are much less established in the testing, estimation, and analysis of probability distributions, where information-theoretic approaches [5, 6] have long been dominant. Recent examples include [7] in the context of constructing graphical models, [8] in the context of feature extraction, and [9] in the context of independent component analysis. These methods by and large share a common issue: to compute quantities such as the mutual information, entropy, or Kullback-Leibler divergence, we require sophisticated space partitioning and/or
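The two-sample test mentioned in the abstract compares the means of the two samples after embedding them in an RKHS; the squared distance between these mean embeddings is the maximum mean discrepancy (MMD), which can be computed from kernel evaluations alone. The following is a minimal sketch of the (biased) empirical MMD statistic with a Gaussian kernel; the function names, the choice of kernel, and the bandwidth parameter `sigma` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    # Pairwise squared Euclidean distances via the expansion
    # ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    """Biased estimate of the squared MMD between the samples X and Y.

    This is ||mu_X - mu_Y||^2 in the RKHS, where mu_X and mu_Y are the
    empirical mean embeddings of the two samples.
    """
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
```

A statistic near zero is consistent with the two samples coming from the same distribution, while a large value indicates a difference; turning this into a formal test requires a threshold, e.g. from a permutation procedure.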