Results 1 - 10 of 57,951
Intrinsic polynomials for regression on Riemannian manifolds
Journal of Mathematical Imaging and Vision, 2014
"... In this paper we develop the theory of parametric polynomial regression in Riemannian manifolds and Lie groups. We show application of Riemannian polynomial regression to shape analysis in Kendall shape space. Results are presented, showing the power of polynomial regression on the classic rat skull ..."
Cited by 8 (1 self)
Filling Riemannian manifolds
J. of Differential Geometry, 1983
"... We want to discuss here several unsolved problems concerning metric invariants of a Riemannian manifold V = (V, g) which mediate between the curvature and topology of V. ..."
Cited by 325 (6 self)
Ricci Flow with Surgery on Three-Manifolds
"... This is a technical paper, which is a continuation of [I]. Here we verify most of the assertions made in [I, §13]; the exceptions are (1) the statement that a 3-manifold which collapses with a local lower bound for sectional curvature is a graph manifold; this is deferred to a separate paper, as the ..."
Cited by 454 (2 self)
Discrete Differential-Geometry Operators for Triangulated 2-Manifolds
2002
"... This paper provides a unified and consistent set of flexible tools to approximate important geometric attributes, including normal vectors and curvatures on arbitrary triangle meshes. We present a consistent derivation of these first and second order differential properties using averaging Voronoi ..."
"... they respect most intrinsic properties of the continuous differential operators."
Cited by 453 (17 self)
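The operators this entry describes include the well-known cotangent-weight discrete Laplace-Beltrami operator on a triangle mesh. A minimal sketch of that cotangent formula follows; it omits the per-vertex area normalization, uses one common sign convention, and all names are ours, not the paper's.

```python
import numpy as np

def cotan_laplacian_rows(V, F):
    """Apply an (unnormalized) cotangent-weight discrete Laplacian to the
    vertex positions of a triangle mesh.
    V: (n, 3) float vertex positions; F: (m, 3) integer vertex indices.
    Returns an (n, 3) array; at interior vertices of a flat mesh it is zero,
    and on a curved mesh it points along the mean-curvature normal."""
    L = np.zeros_like(V, dtype=float)
    for tri in F:
        for k in range(3):
            i, j, o = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]
            # Cotangent of the angle at vertex o, opposite edge (i, j).
            u, w = V[i] - V[o], V[j] - V[o]
            cot = (u @ w) / np.linalg.norm(np.cross(u, w))
            # Each triangle containing edge (i, j) contributes one cot term.
            L[i] += 0.5 * cot * (V[j] - V[i])
            L[j] += 0.5 * cot * (V[i] - V[j])
    return L
```

On a planar 3x3 grid mesh, the result at the interior vertex vanishes, which is the discrete counterpart of a flat surface having zero mean curvature.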
Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering
Advances in Neural Information Processing Systems 14, 2001
"... Drawing on the correspondence between the graph Laplacian, the Laplace-Beltrami operator on a manifold, and the connections to the heat equation, we propose a geometrically motivated algorithm for constructing a representation for data sampled from a low dimensional manifold embedded in a higher ..."
Cited by 664 (8 self)
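The construction this abstract describes builds a nearest-neighbor graph over the samples, weights its edges with a heat kernel, and embeds the data using low eigenvectors of the graph Laplacian. A minimal sketch under those assumptions (function and parameter names are ours, not the paper's):

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=5, n_components=2, t=1.0):
    """Sketch of a graph-Laplacian embedding: k-NN graph with heat-kernel
    weights, then eigenvectors of the normalized Laplacian."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Heat-kernel weights on a symmetrized k-nearest-neighbor graph.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]  # skip self at position 0
        W[i, idx] = np.exp(-d2[i, idx] / t)
    W = np.maximum(W, W.T)
    D = W.sum(1)                      # vertex degrees
    L = np.diag(D) - W                # unnormalized graph Laplacian
    # Generalized problem L y = lambda D y via the symmetric normalization
    # D^{-1/2} L D^{-1/2}, then rescale the eigenvectors back by D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(D)
    Lsym = d_inv_sqrt[:, None] * L * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(Lsym)
    # Drop the trivial constant eigenvector (eigenvalue ~0).
    return d_inv_sqrt[:, None] * vecs[:, 1:n_components + 1]
```

Each embedding coordinate is a smooth function on the graph, which is what ties this construction back to the Laplace-Beltrami operator mentioned in the abstract.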
Hierarchies from Fluxes in String Compactifications
2002
"... Warped compactifications with significant warping provide one of the few known mechanisms for naturally generating large hierarchies of physical scales. We demonstrate that this mechanism is realizable in string theory, and give examples involving orientifold compactifications of IIB string theory and F-theory compactifications on Calabi-Yau fourfolds. In each case, the hierarchy of scales is fixed by a choice of RR and NS fluxes in the compact manifold. Our solutions involve compactifications of the Klebanov-Strassler gravity dual to a confining N = 1 supersymmetric gauge theory ..."
Cited by 724 (33 self)
Locally weighted learning
Artificial Intelligence Review, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias ..."
Cited by 594 (53 self)
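Locally weighted linear regression, the focus of this survey, fits a separate weighted least-squares model at every query point, with training points weighted by their distance to the query. A minimal sketch using a Gaussian weighting function (the function name and the bandwidth parameter tau are our choices, not the survey's):

```python
import numpy as np

def lwlr_predict(x0, X, y, tau=0.5):
    """Predict the response at query point x0 by weighted least squares.
    X: (n, d) training inputs, y: (n,) targets, tau: kernel bandwidth."""
    # Gaussian weights: nearby training points dominate the local fit.
    w = np.exp(-((X - x0) ** 2).sum(1) / (2.0 * tau ** 2))
    # Design matrix with an intercept column.
    A = np.hstack([np.ones((X.shape[0], 1)), X])
    # Solve the weighted normal equations A^T W A beta = A^T W y.
    WA = w[:, None] * A
    beta = np.linalg.solve(A.T @ WA, A.T @ (w * y))
    return beta[0] + beta[1:] @ x0
```

Because the fit is redone for each query, there is no training phase at all; this is the "lazy," memory-based aspect the abstract refers to, and tau plays the role of the smoothing parameter the survey discusses.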
Just Relax: Convex Programming Methods for Identifying Sparse Signals in Noise
2006
"... This paper studies a difficult and fundamental problem that arises throughout electrical engineering, applied mathematics, and statistics. Suppose that one forms a short linear combination of elementary signals drawn from a large, fixed collection. Given an observation of the linear combination that ..."
"... This paper studies a method called convex relaxation, which attempts to recover the ideal sparse signal by solving a convex program. This approach is powerful because the optimization can be completed in polynomial time with standard scientific software. The paper provides general conditions which ensure ..."
Cited by 496 (2 self)
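The convex program in this setting is typically an l1-penalized least-squares relaxation (basis pursuit denoising / lasso), min 0.5||Phi x - s||^2 + lam ||x||_1. One standard polynomial-time solver is iterative soft-thresholding (ISTA); the sketch below is a generic illustration of convex relaxation, not the paper's own algorithm, and all names are our assumptions.

```python
import numpy as np

def ista(Phi, s, lam=0.1, n_iter=500):
    """Iterative soft-thresholding for the l1-relaxed sparse recovery problem.
    Phi: (m, n) dictionary of elementary signals, s: (m,) observation."""
    x = np.zeros(Phi.shape[1])
    # Step size 1/L, where L = ||Phi||_2^2 is the Lipschitz constant
    # of the gradient of the smooth least-squares term.
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2
    for _ in range(n_iter):
        g = Phi.T @ (Phi @ x - s)        # gradient of 0.5 * ||Phi x - s||^2
        z = x - step * g                 # gradient step
        # Proximal step for lam * ||x||_1: entrywise soft-thresholding.
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x
```

With a small penalty and enough iterations, the nonzero entries of the relaxed solution typically land on the support of the ideal sparse signal, which is the recovery behavior whose conditions the paper analyzes.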
An Efficient Boosting Algorithm for Combining Preferences
1999
"... The problem of combining preferences arises in several applications, such as combining the results of different search engines. This work describes an efficient algorithm for combining multiple preferences. We first give a formal framework for the problem. We then describe and analyze a new boosting ..."
"... Boost to nearest-neighbor and regression algorithms."
Cited by 707 (18 self)