Results 1 - 4 of 4
Diffusion Kernels on Statistical Manifolds, 2004
Abstract

Cited by 87 (6 self)
A family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. The kernels are based on the heat equation on the Riemannian manifold defined by the Fisher information metric associated with a statistical family, and generalize the Gaussian kernel of Euclidean space. As an important special case, kernels based on the geometry of multinomial families are derived, leading to kernel-based learning algorithms that apply naturally to discrete data. Bounds on covering numbers and Rademacher averages for the kernels are proved using bounds on the eigenvalues of the Laplacian on Riemannian manifolds. Experimental results are presented for document classification, for which the use of multinomial geometry is natural and well motivated, and improvements are obtained over the Gaussian and linear kernels that have been standard for text classification.
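The multinomial special case described in this abstract has a convenient geometric form: under the Fisher information metric the probability simplex is isometric to a portion of a sphere, so the geodesic distance between two distributions is an arccosine of their Bhattacharyya coefficient, and the leading-order (parametrix) approximation of the heat kernel is a Gaussian in that distance. A minimal sketch of this approximation follows; the function name and example vectors are illustrative, not taken from the paper:

```python
import numpy as np

def multinomial_diffusion_kernel(theta, theta_prime, t=0.1):
    """Leading-order heat-kernel approximation on the multinomial simplex.

    Under the Fisher metric, the geodesic distance between distributions is
        d = 2 * arccos( sum_i sqrt(theta_i * theta'_i) ),
    and the parametrix approximation of the heat kernel at time t is
        K_t ~ exp( -d^2 / (4 t) ).
    """
    # Bhattacharyya coefficient, clipped to [-1, 1] for numerical safety
    # before taking the arccosine.
    bc = np.clip(np.sum(np.sqrt(theta * theta_prime)), -1.0, 1.0)
    d = 2.0 * np.arccos(bc)
    return np.exp(-d**2 / (4.0 * t))

# Two term-frequency vectors normalized to probability distributions,
# as one would obtain from documents in a text-classification task.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(multinomial_diffusion_kernel(p, q))
```

The diffusion time t plays the role of the bandwidth in a Gaussian kernel: small t makes the kernel sharply peaked, large t smooths it out.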
ON THE LOWER BOUND ESTIMATES OF SECTIONS OF THE CANONICAL BUNDLES OVER A RIEMANN, 1999
Abstract

Cited by 2 (0 self)
2. A lower bound estimate; 3. A counterexample; 4. Partial uniform estimates
Information Diffusion Kernels, Advances in Neural Information Processing Systems 15, 2002
Abstract
A new family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. Based on the heat equation on the Riemannian manifold defined by the Fisher information metric, information diffusion kernels generalize the Gaussian kernel of Euclidean space, and provide a natural way of combining generative statistical modeling with nonparametric discriminative learning. As a special case, the kernels give a new approach to applying kernel-based learning algorithms to discrete data. Bounds on covering numbers for the new kernels are proved using spectral theory in differential geometry, and experimental results are presented for real data sets.
ON A MULTIPARTICLE MOSER-TRUDINGER INEQUALITY, Hao Fang, 2004
Abstract
Abstract. We verify a conjecture of Gillet-Soulé. We prove that the determinant of the Laplacian on a line bundle over CP^1 is always bounded from above. This can also be viewed as a multiparticle generalization of the Moser-Trudinger inequality. Furthermore, we conjecture that this functional achieves its maximum at the canonical metric. We give some evidence for this conjecture, as well as links to other fields of analysis.