Results 1–10 of 15
Diffusion Kernels on Statistical Manifolds
, 2004
Abstract
Cited by 116 (8 self)
A family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. The kernels are based on the heat equation on the Riemannian manifold defined by the Fisher information metric associated with a statistical family, and generalize the Gaussian kernel of Euclidean space. As an important special case, kernels based on the geometry of multinomial families are derived, leading to kernel-based learning algorithms that apply naturally to discrete data. Bounds on covering numbers and Rademacher averages for the kernels are proved using bounds on the eigenvalues of the Laplacian on Riemannian manifolds. Experimental results are presented for document classification, for which the use of multinomial geometry is natural and well motivated, and improvements are obtained over the Gaussian and linear kernels that are standard for text classification.
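The multinomial case described in this abstract admits a simple closed-form approximation: under the Fisher information metric the probability simplex is isometric to a portion of a sphere, so the geodesic distance is d(p, q) = 2 arccos(Σᵢ √(pᵢ qᵢ)), and the heat kernel can be approximated by a Gaussian in that distance. The sketch below shows only this leading-order (Gaussian) term, not the paper's full parametrix expansion; the function name and the default diffusion time t are assumptions for illustration.

```python
import numpy as np

def multinomial_diffusion_kernel(p, q, t=0.1):
    """Leading-order approximation of the diffusion (heat) kernel between
    two distributions p, q on the probability simplex.

    d(p, q) = 2 * arccos(sum_i sqrt(p_i * q_i))   (Fisher geodesic distance)
    K_t(p, q) ~ exp(-d^2 / (4 t))                 (Gaussian leading term)
    """
    # Bhattacharyya coefficient; clip guards against rounding past [-1, 1].
    bc = np.clip(np.sum(np.sqrt(p) * np.sqrt(q)), -1.0, 1.0)
    d = 2.0 * np.arccos(bc)
    return np.exp(-d * d / (4.0 * t))

# Identical distributions are at geodesic distance 0, so the kernel is 1;
# disjoint supports are antipodal on the sphere, giving a tiny kernel value.
k_same = multinomial_diffusion_kernel(np.array([0.5, 0.5]), np.array([0.5, 0.5]))
k_far = multinomial_diffusion_kernel(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

For text data, p and q would be (smoothed) term-frequency vectors, which is how the abstract's document-classification experiments apply the kernel.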
conformal field theories, limit sets of Kleinian groups and holography, J. Geom. Phys.
Reflecting diffusions and hyperbolic Brownian motions in multidimensional spheres
 Lithuanian Mathematical Journal
, 2013
Abstract
Cited by 1 (1 self)
Diffusion processes (X_d(t))_{t≥0} moving inside spheres S^d_R ⊂ R^d and reflecting orthogonally on their surfaces ∂S^d_R are considered. The stochastic differential equations governing the reflecting diffusions are presented and their kernels and distributions explicitly derived. Reflection is obtained by means of the inversion with respect to the sphere S^d_R. The particular cases of the Ornstein–Uhlenbeck process and Brownian motion are examined in detail. The hyperbolic Brownian motion on the Poincaré half-space H^d is examined in the last part of the paper, and its reflecting counterpart within hyperbolic spheres is studied. Finally, a section is devoted to reflecting hyperbolic Brownian motion in the Poincaré disc D within spheres concentric with D.
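The reflection-by-inversion mechanism in this abstract can be illustrated with a naive Euler-scheme simulation: whenever the discretized walk leaves the sphere of radius R, the point x is mapped back inside by the inversion x ↦ R²x/|x|² with respect to the sphere's surface. This is an illustrative sketch of the mechanism only, not the paper's construction; the function name, step size, and defaults are assumptions.

```python
import numpy as np

def reflecting_bm_in_sphere(R=1.0, d=2, T=1.0, n_steps=1000, seed=0):
    """Euler simulation of Brownian motion inside the sphere of radius R in R^d.

    Whenever a step exits the sphere (|x| = r > R), the point is replaced by
    its inversion R^2 * x / r^2, whose norm R^2 / r is strictly less than R,
    so the path stays inside the sphere.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.zeros(d)               # start at the center
    path = [x.copy()]
    for _ in range(n_steps):
        x = x + np.sqrt(dt) * rng.standard_normal(d)   # Brownian increment
        r = np.linalg.norm(x)
        if r > R:                                      # exited: invert w.r.t. the sphere
            x = (R * R / (r * r)) * x
        path.append(x.copy())
    return np.array(path)
```

Note that points far outside the sphere are pulled deep inside by the inversion, while points just past the surface land just inside it, which is what makes the map a reflection mechanism near the boundary.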
Global existence, scattering and blow-up for the focusing NLS on the hyperbolic space. arXiv preprint arXiv:1411.0846
, 2014
NORMALIZATION OF THE DIFFUSIVE FILTERS THAT REPRESENT THE INHOMOGENEOUS COVARIANCE OPERATORS OF VARIATIONAL ASSIMILATION, USING ASYMPTOTIC EXPANSIONS AND TECHNIQUES OF NON-EUCLIDEAN GEOMETRY; PART II: RIEMANNIAN GEOMETRY AND THE GENERIC PARAMETRIX EXPANSION
, 2008
Abstract
Cited by 1 (0 self)
This is an unreviewed manuscript, primarily intended for informal exchange of information among the NCEP staff members.
Information Diffusion Kernels
 Advances in Neural Information Processing Systems 15
, 2002
Abstract
A new family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. Based on the heat equation on the Riemannian manifold defined by the Fisher information metric, information diffusion kernels generalize the Gaussian kernel of Euclidean space, and provide a natural way of combining generative statistical modeling with nonparametric discriminative learning. As a special case, the kernels give a new approach to applying kernel-based learning algorithms to discrete data. Bounds on covering numbers for the new kernels are proved using spectral theory in differential geometry, and experimental results are presented for real data sets.
Explicit bounds on eigenfunctions and spectral functions on manifolds hyperbolic near a point
, 2013
Contents
Abstract: We construct a new class of entanglement measures by extending the usual definition of Rényi entropy to include a chemical potential. These charged Rényi entropies measure the degree of entanglement in different charge sectors of the theory and are given by Euclidean path integrals with the insertion of a Wilson line encircling the entangling surface. We compute these entropies for a spherical entangling surface in CFTs with holographic duals, where they are related to entropies of charged black holes with hyperbolic horizons. We also compute charged Rényi entropies in free field theories. ArXiv ePrint: 1310.nnnn [hep-th]