Results 1 - 5 of 5
Diffusion Kernels on Statistical Manifolds
, 2004
Abstract

Cited by 92 (6 self)
A family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. The kernels are based on the heat equation on the Riemannian manifold defined by the Fisher information metric associated with a statistical family, and generalize the Gaussian kernel of Euclidean space. As an important special case, kernels based on the geometry of multinomial families are derived, leading to kernel-based learning algorithms that apply naturally to discrete data. Bounds on covering numbers and Rademacher averages for the kernels are proved using bounds on the eigenvalues of the Laplacian on Riemannian manifolds. Experimental results are presented for document classification, for which the use of multinomial geometry is natural and well motivated, and improvements are obtained over Gaussian and linear kernels, which have been the standard for text classification.
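As a concrete illustration of the multinomial special case described above: under the Fisher information metric, the geodesic distance on the probability simplex has the closed form d(θ, θ′) = 2 arccos(Σᵢ √(θᵢ θ′ᵢ)), and the leading term of the heat kernel is Gaussian in that distance. A minimal sketch (the function name and the diffusion-time parameter `t` are illustrative choices, not taken from the paper's text):

```python
import numpy as np

def multinomial_diffusion_kernel(theta1, theta2, t=0.1):
    """Leading-order heat (diffusion) kernel on the multinomial simplex.

    Uses the Fisher-metric geodesic distance
        d = 2 * arccos(sum_i sqrt(theta1_i * theta2_i))
    plugged into the Gaussian-like term exp(-d^2 / (4 t)).
    Inputs are probability vectors (nonnegative, summing to 1).
    """
    # Bhattacharyya coefficient, clipped to arccos's domain for safety
    bc = np.clip(np.sum(np.sqrt(theta1 * theta2)), -1.0, 1.0)
    d = 2.0 * np.arccos(bc)
    return np.exp(-d ** 2 / (4.0 * t))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.5, 0.3, 0.2])
print(multinomial_diffusion_kernel(p, p))  # identical distributions: distance 0
print(multinomial_diffusion_kernel(p, q))
```

For text, θ would typically be a document's (possibly smoothed) term-frequency vector, which is what makes this kernel a drop-in replacement for the Gaussian kernel in an SVM.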
Toward a Generic Framework for Recognition Based on Uncertain Geometric Features
 Journal of Computer Vision Research
, 1998
"... this paper is to ..."
Contents
, 2011
Abstract
1 Introduction: Why do we need to compute on manifolds? 7
1.1 Computational anatomy 7
1.2 Shapes, forms and deformations 8
1.3 Problems with the processing of manifold data 9
Intrinsic Statistics on Riemannian Manifolds: Basic Tools for Geometric Measurements
, 2011
Abstract
In medical image analysis and high-level computer vision, there is an intensive use of geometric features like orientations, lines, and geometric transformations ranging from simple ones (orientations, lines, rigid body or affine transformations, etc.) to very complex ones like curves, surfaces, or general diffeomorphic transformations. The measurement of such geometric primitives is generally noisy in real applications, and we need to use statistics either to reduce the uncertainty (estimation), to compare observations, or to test hypotheses. Unfortunately, even simple geometric primitives often belong to manifolds that are not vector spaces. In previous works [1, 2], we investigated invariance requirements to build some statistical tools on transformation groups and homogeneous manifolds that avoid paradoxes. In this paper, we consider finite-dimensional manifolds with a Riemannian metric as the basic structure. Based on this metric, we develop the notions of mean value and covariance matrix of a random element, normal law, Mahalanobis distance and χ² law. We provide a new proof of the characterization of Riemannian centers of mass and an original gradient descent algorithm to efficiently compute them. The notion of normal law we propose is based on the maximization of the entropy knowing the mean and covariance of the distribution. The resulting family of pdfs spans the whole range from uniform (on compact manifolds) to the point mass distribution. Moreover, we were able to provide tractable approximations (with their limits) for small variances which show that ...
Classification of Curve-Curve Intersections from the CAD/CAM Viewpoint
Abstract
Intersection points of curves are the basic elements for many algorithms in the CAD/CAM area. In most circumstances, the key property of an intersection point is the behavior of the curves in a neighborhood of that intersection point. This paper presents the classification of intersection points from the differential-geometric point of view in terms of the movement of particles along curves. The movement is evaluated in the neighborhood of an intersection point under the constraint that the particle moves along one curve with respect to the other. Applications of this classification and a universal algorithm for boolean operations on areas enclosed by continuous, simple curves show the advantages of this classification for today's CAD/CAM problems.
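As a first-order illustration of neighborhood-based classification: a transversal planar intersection can already be distinguished from a tangential contact by the tangent directions of the two curves alone. This simple test is an assumption for illustration (the paper's full classification also considers higher-order behavior such as curvature), and the labels below are hypothetical names:

```python
def classify_intersection(t1, t2, eps=1e-9):
    """Classify a planar curve-curve intersection from unit tangents t1, t2.

    A nonzero 2D cross product means the curves cross transversally (its sign
    gives the crossing orientation); a near-zero cross product flags a
    tangential contact, which needs curvature information to resolve.
    """
    cross = t1[0] * t2[1] - t1[1] * t2[0]
    if abs(cross) < eps:
        return "tangential"
    return "transversal-left" if cross > 0 else "transversal-right"

print(classify_intersection((1.0, 0.0), (0.0, 1.0)))
print(classify_intersection((1.0, 0.0), (-1.0, 0.0)))
```

In a boolean-operation sweep over curve segments, only the tangential case would fall through to a more expensive higher-order test, which matches the abstract's emphasis on local behavior at the intersection.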