Results 1 - 4 of 4
Diffusion Kernels on Statistical Manifolds, 2004
"... A family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. The kernels are based on the heat equation on the Riemannian manifold defined by the Fisher information metric associated with a statistical family, and generalize the Gaussian ker ..."
Abstract

Cited by 87 (6 self)
A family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. The kernels are based on the heat equation on the Riemannian manifold defined by the Fisher information metric associated with a statistical family, and generalize the Gaussian kernel of Euclidean space. As an important special case, kernels based on the geometry of multinomial families are derived, leading to kernel-based learning algorithms that apply naturally to discrete data. Bounds on covering numbers and Rademacher averages for the kernels are proved using bounds on the eigenvalues of the Laplacian on Riemannian manifolds. Experimental results are presented for document classification, for which the use of multinomial geometry is natural and well motivated; improvements are obtained over the Gaussian and linear kernels that have been standard for text classification.
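For the multinomial special case mentioned in the abstract, the Fisher information metric makes the simplex isometric to a portion of a sphere, so the geodesic distance has a closed form and the heat kernel can be approximated by a Gaussian in that distance. A minimal sketch of this construction (function name and the parametrix-style approximation K_t ≈ exp(-d²/4t) are illustrative choices here, not an exact transcription of the paper's formulas):

```python
import numpy as np

def multinomial_diffusion_kernel(theta, theta_prime, t=0.1):
    """Approximate heat kernel on the multinomial simplex.

    Under the Fisher information metric the geodesic distance between
    two multinomial parameter vectors is
        d(theta, theta') = 2 * arccos( sum_i sqrt(theta_i * theta'_i) ),
    and a first-order approximation of the heat kernel at time t is
        K_t(theta, theta') ~ exp(-d^2 / (4 t)),
    which reduces to a Gaussian-like kernel in the simplex geometry.
    """
    theta = np.asarray(theta, dtype=float)
    theta_prime = np.asarray(theta_prime, dtype=float)
    # Bhattacharyya affinity; clip to guard against rounding outside [-1, 1].
    affinity = np.clip(np.sum(np.sqrt(theta * theta_prime)), -1.0, 1.0)
    d = 2.0 * np.arccos(affinity)      # geodesic distance on the simplex
    return np.exp(-d ** 2 / (4.0 * t))  # unnormalized heat-kernel value

p = [0.2, 0.3, 0.5]
q = [0.1, 0.4, 0.5]
print(multinomial_diffusion_kernel(p, p))  # identical distributions -> 1.0
print(multinomial_diffusion_kernel(p, q))  # strictly less than 1.0
```

For text classification, `theta` would typically be a document's normalized term-frequency vector, so the kernel compares documents by the spherical geometry of their word distributions rather than by Euclidean distance.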
Information Diffusion Kernels, Advances in Neural Information Processing Systems 15, 2002
"... A new family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. Based on the heat equation on the Riemannian manifold defined by the Fisher information metric, information diffusion kernels generalize the Gaussian kernel of Euclidean sp ..."
Abstract
A new family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. Based on the heat equation on the Riemannian manifold defined by the Fisher information metric, information diffusion kernels generalize the Gaussian kernel of Euclidean space, and provide a natural way of combining generative statistical modeling with nonparametric discriminative learning. As a special case, the kernels give a new approach to applying kernel-based learning algorithms to discrete data. Bounds on covering numbers for the new kernels are proved using spectral theory in differential geometry, and experimental results are presented for real data sets.
Remarks on mean value properties
"... Dedicated to Paul H. Rabinowitz on the occasion of his 70th birthday We consider mean value properties for solutions of certain linear elliptic and parabolic equations in Euclidean and hyperbolic spaces which generalize standard mean value properties for solutions to the Laplace and the heat equatio ..."
Abstract
Dedicated to Paul H. Rabinowitz on the occasion of his 70th birthday. We consider mean value properties for solutions of certain linear elliptic and parabolic equations in Euclidean and hyperbolic spaces, which generalize the standard mean value properties for solutions to the Laplace and heat equations.