Results 1–10 of 48
Functional Maps: A Flexible Representation of Maps Between Shapes
Cited by 48 (12 self)

Abstract
Figure 1: Horse algebra: the functional representation and map inference algorithm allow us to go beyond point-to-point maps. The source shape (top left corner) was mapped to the target shape (left) by posing descriptor-based functional constraints which do not disambiguate symmetries (i.e. without landmark constraints). By further adding correspondence constraints, we obtain a near-isometric map which reverses orientation, mapping left to right (center). The representation allows for algebraic operations on shape maps, so we can subtract this map from the ambivalent map, to retrieve the orientation-preserving near-isometry (right). Each column shows the first 20 x 20 block of the functional map representation (bottom), and the action of the map by transferring colors from the source shape to the target shape (top).

We present a novel representation of maps between pairs of shapes that allows for efficient inference and manipulation. Key to our approach is a generalization of the notion of map that puts in correspondence real-valued functions rather than points on the shapes. By choosing a multiscale basis for the function space on each shape, such as the eigenfunctions of its Laplace-Beltrami operator, we obtain a representation of a map that is very compact, yet fully suitable for global inference. Perhaps more remarkably, most ...
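As a rough illustration of the representation, the following sketch estimates a small functional map matrix C from corresponding function pairs by least squares. Everything here is a hypothetical stand-in: random orthonormal bases replace the Laplace-Beltrami eigenfunctions, and a random permutation replaces the point-to-point map.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 20  # vertices per shape, size of the truncated function basis

# Stand-in orthonormal bases for the two shapes (in the paper these would
# be Laplace-Beltrami eigenfunctions; random bases keep the sketch
# self-contained).
phi, _ = np.linalg.qr(rng.standard_normal((n, k)))
psi, _ = np.linalg.qr(rng.standard_normal((n, k)))

# A ground-truth point-to-point map, here just a random permutation pi:
# a function f on the source pushes forward to g with g[pi[i]] = f[i].
pi = rng.permutation(n)
F = rng.standard_normal((n, 50))   # 50 probe functions on the source
G = np.empty_like(F)
G[pi] = F                          # their images on the target

# Express both sides in the truncated bases and solve C A ~= B in the
# least-squares sense: C is the k x k functional map matrix.
A = phi.T @ F
B = psi.T @ G
C = B @ np.linalg.pinv(A)

# C transfers the coefficients of any new function from source to target.
f_new = rng.standard_normal(n)
g_coeffs = C @ (phi.T @ f_new)
```

The entire map lives in the small k x k matrix C, which is what makes algebraic operations on shape maps (composition, differences, as in the horse example above) plain matrix operations.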
A Cheeger inequality for the graph connection Laplacian
, 2012
Cited by 24 (14 self)

Abstract
The O(d) synchronization problem consists of estimating a set of n unknown orthogonal d x d matrices O_1, ..., O_n from noisy measurements of a subset of the pairwise ratios O_i O_j^{-1}. We formulate and prove a Cheeger-type inequality that relates a measure of how well it is possible to solve the O(d) synchronization problem with the spectrum of an operator, the graph connection Laplacian. We also show how this inequality provides a worst-case performance guarantee for a spectral method to solve this problem.
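A minimal sketch of the operator the inequality is stated for, on an assumed toy instance (n = 8, d = 3, noiseless ratios on the complete graph — all sizes are arbitrary choices, not from the paper): with consistent measurements, the graph connection Laplacian has exactly d zero eigenvalues, and noise lifts them.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 8, 3  # toy sizes (assumed)

def random_orthogonal(d, rng):
    # Draw from O(d) via QR with a sign correction on the diagonal of R.
    q, r = np.linalg.qr(rng.standard_normal((d, d)))
    return q * np.sign(np.diag(r))

# Ground-truth group elements and noiseless ratios O_i O_j^{-1} = O_i O_j^T.
O = [random_orthogonal(d, rng) for _ in range(n)]

# Graph connection Laplacian L = D - W with blocks W_ij = O_i O_j^T.
L = np.zeros((n * d, n * d))
for i in range(n):
    L[i*d:(i+1)*d, i*d:(i+1)*d] = (n - 1) * np.eye(d)  # degree block
    for j in range(n):
        if i != j:
            L[i*d:(i+1)*d, j*d:(j+1)*d] = -O[i] @ O[j].T

eigvals = np.linalg.eigvalsh(L)
# The vector stacking the O_i spans a d-dimensional null space of L; noisy
# ratios lift these d eigenvalues away from zero, and that spectral gap is
# the quantity the Cheeger-type inequality relates to synchronizability.
```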
EXACT AND STABLE RECOVERY OF ROTATIONS FOR ROBUST SYNCHRONIZATION
, 1211
Cited by 22 (9 self)

Abstract
The synchronization problem over the special orthogonal group SO(d) consists of estimating a set of unknown rotations R_1, R_2, ..., R_n from noisy measurements of a subset of their pairwise ratios R_i^{-1} R_j. The problem has found applications in computer vision, computer graphics, and sensor network localization, among others. Its least-squares solution can be approximated by either spectral relaxation or semidefinite programming followed by a rounding procedure, analogous to the approximation algorithms for Max-Cut. The contribution of this paper is threefold: first, we introduce a robust penalty function involving the sum of unsquared deviations and derive a relaxation that leads to a convex optimization problem; second, we apply the alternating direction method to minimize the penalty function; finally, under a specific model of the measurement noise and the measurement graph, we prove that the rotations are exactly and stably recovered, exhibiting a phase-transition behavior in terms of the proportion of noisy measurements. Numerical simulations confirm the phase-transition behavior for our method as well as its improved accuracy compared to existing methods.

Key words: synchronization of rotations; least unsquared deviation; semidefinite relaxation; alternating direction method.
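The spectral relaxation mentioned above can be sketched in a few lines, on an assumed noiseless toy instance (n = 10, d = 3, complete measurement graph — sizes are illustrative choices): stack the pairwise ratios into a block matrix, take its top d eigenvectors, and round each d x d block back to a rotation.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 10, 3  # toy sizes (assumed)

def random_rotation(d, rng):
    q, r = np.linalg.qr(rng.standard_normal((d, d)))
    q = q * np.sign(np.diag(r))
    if np.linalg.det(q) < 0:      # force det = +1
        q[:, -1] = -q[:, -1]
    return q

R = [random_rotation(d, rng) for _ in range(n)]

# Block measurement matrix H with H_ij = R_i^{-1} R_j (noiseless here).
H = np.zeros((n * d, n * d))
for i in range(n):
    for j in range(n):
        H[i*d:(i+1)*d, j*d:(j+1)*d] = R[i].T @ R[j]

# Spectral relaxation: the top d eigenvectors of H recover the stacked
# rotations up to one global orthogonal transformation.
_, vecs = np.linalg.eigh(H)
V = vecs[:, -d:]

def polar(M):
    # Rounding step: nearest orthogonal matrix via SVD.
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt

E = [polar(V[i*d:(i+1)*d, :]).T for i in range(n)]
# Resolve the global O(d) ambiguity down to SO(d): if the blocks came out
# as reflections, flip them all by one fixed reflection.
if np.linalg.det(E[0]) < 0:
    J = np.diag([1.0] * (d - 1) + [-1.0])
    E = [J @ Ei for Ei in E]

# All pairwise ratios are reproduced exactly in the noiseless case.
err = max(np.linalg.norm(E[i].T @ E[j] - R[i].T @ R[j])
          for i in range(n) for j in range(n))
```

The paper's contribution replaces the squared deviations behind this relaxation with unsquared ones for robustness; the rounding step is the same.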
Image cosegmentation via consistent functional maps
 In Proc. Int. Conf. on Comp. Vis. (2013), IEEE
Cited by 13 (2 self)

Abstract
Joint segmentation of image sets has great importance for object recognition, image classification, and image retrieval. In this paper, we aim to jointly segment a set of images starting from a small number of labeled images or none at all. To allow the images to share segmentation information with each other, we build a network that contains segmented as well as unsegmented images, and extract functional maps between connected image pairs based on image appearance features. These functional maps act as general property transporters between the images and, in particular, are used to transfer segmentations. We define and operate in a reduced functional space optimized so that the functional maps approximately satisfy cycle-consistency under composition in the network. A joint optimization framework is proposed to simultaneously generate all segmentation functions over the images so that they both align with local segmentation cues in each particular image, and agree with each other under network transportation. This formulation allows us to extract segmentations even with no training data, but can also exploit such data when available. The collective effect of the joint processing using functional maps leads to accurate information sharing among images and yields superior segmentation results, as shown on the iCoseg, MSRC, and PASCAL data sets.
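Cycle-consistency under composition, the key constraint above, is easy to state in code. In this minimal sketch (assumed basis size k = 15, three synthetic "images") the pairwise functional maps are built from latent per-image bases, so every loop composes to the identity by construction:

```python
import numpy as np

rng = np.random.default_rng(3)
k = 15  # reduced functional space dimension (assumed)

# Latent orthogonal change-of-basis per image; orthogonality keeps the
# inverses exact. Pairwise maps C_ij = Y_j Y_i^{-1} transport function
# coefficients from image i to image j.
Y = [np.linalg.qr(rng.standard_normal((k, k)))[0] for _ in range(3)]
C = {(i, j): Y[j] @ Y[i].T
     for i in range(3) for j in range(3) if i != j}

# Composing around the 3-cycle 0 -> 1 -> 2 -> 0 returns every function to
# itself; deviations from the identity measure the inconsistency that the
# joint optimization in the paper penalizes.
cycle = C[(2, 0)] @ C[(1, 2)] @ C[(0, 1)]
cycle_err = np.linalg.norm(cycle - np.eye(k))
```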
SPECTRAL CONVERGENCE OF THE CONNECTION LAPLACIAN FROM RANDOM SAMPLES
, 1306
Cited by 9 (4 self)

Abstract
Spectral methods that are based on eigenvectors and eigenvalues of discrete graph Laplacians, such as Diffusion Maps and Laplacian Eigenmaps, are extremely useful for manifold learning. It was previously shown by Belkin and Niyogi [4] that the eigenvectors and eigenvalues of the graph Laplacian converge to the eigenfunctions and eigenvalues of the Laplace-Beltrami operator of the manifold in the limit of infinitely many uniformly sampled data points. Recently, we introduced Vector Diffusion Maps and showed that the connection Laplacian of the tangent bundle of the manifold can be approximated from random samples. In this paper, we present a unified framework for approximating other connection Laplacians over the manifold by considering its principal bundle structure. We prove that the eigenvectors and eigenvalues of these Laplacians converge in the limit of infinitely many random samples. Our results for spectral convergence also hold in the case where the data points are sampled from a non-uniform distribution, and for manifolds with and without boundary.
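The Belkin-Niyogi convergence cited above is easy to observe numerically. A small sketch under assumed parameters (n = 500 uniform samples on the unit circle, Gaussian kernel bandwidth eps = 0.05): the symmetric normalized graph Laplacian has a zero eigenvalue, and the next eigenvalues come in near-equal pairs, mirroring the sin/cos eigenfunction pairs of the Laplace-Beltrami operator on S^1.

```python
import numpy as np

rng = np.random.default_rng(4)
n, eps = 500, 0.05  # sample size and kernel bandwidth (assumed)

# Uniform random samples on the unit circle S^1 in R^2.
theta = rng.uniform(0.0, 2.0 * np.pi, n)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# Gaussian kernel and symmetric normalized graph Laplacian.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq / eps)
deg = W.sum(axis=1)
L = np.eye(n) - W / np.sqrt(np.outer(deg, deg))

eigs = np.linalg.eigvalsh(L)
# eigs[0] ~ 0 (constant eigenfunction); eigs[1] and eigs[2] are nearly
# equal, reflecting the two-dimensional eigenspace span{sin, cos} of the
# Laplace-Beltrami operator on the circle.
```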
The Spectrum of Random Inner-product Kernel Matrices
, 1202
Cited by 6 (0 self)

Abstract
We consider n-by-n matrices whose (i,j)-th entry is f(X_i^T X_j), where X_1, ..., X_n are i.i.d. standard Gaussian random vectors in R^p, and f is a real-valued function. The eigenvalue distribution of these random kernel matrices is studied in the "large p, large n" regime. It is shown that, when p, n -> infinity with p/n = γ a constant, and f is properly scaled so that Var(f(X_i^T X_j)) is O(p^{-1}), the spectral density converges weakly to a limiting density on R. The limiting density is dictated by a cubic equation involving its Stieltjes transform. While for smooth kernel functions the limiting spectral density has been previously shown to be the Marchenko-Pastur distribution, our analysis is applicable to non-smooth kernel functions, resulting in a new family of limiting densities.
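A quick empirical illustration under the stated scaling, using the non-smooth kernel f = sign (assumed toy sizes p = n = 200, so γ = 1): the scaled kernel matrix has a bounded spectrum even though f is discontinuous at 0.

```python
import numpy as np

rng = np.random.default_rng(5)
p = n = 200  # gamma = p/n = 1 (assumed toy sizes)

# i.i.d. standard Gaussian vectors X_1, ..., X_n in R^p (as columns).
X = rng.standard_normal((p, n))

# Non-smooth kernel f = sign, scaled so that Var(f(X_i^T X_j)) = O(1/p).
K = np.sign(X.T @ X) / np.sqrt(p)
np.fill_diagonal(K, 0.0)  # drop the deterministic diagonal

eigs = np.linalg.eigvalsh(K)
# The empirical spectral density stays supported on a bounded interval;
# for smooth f the limit would be Marchenko-Pastur, while sign falls into
# the new family of limiting densities the paper describes.
```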
Local linear regression on manifolds and its geometric interpretation
 Journal of the American Statistical Association
, 2013
Cited by 5 (3 self)

Abstract
High-dimensional data analysis has been an active area, and the main focuses have been variable selection and dimension reduction. In practice, it often occurs that the variables are located on an unknown, lower-dimensional nonlinear manifold. Under this manifold assumption, one purpose of this paper is regression and gradient estimation on the manifold, and another is developing a new tool for manifold learning. For the first aim, we suggest directly reducing the dimensionality to the intrinsic dimension d of the manifold, and performing the popular local linear regression (LLR) on a tangent plane estimate. An immediate consequence is a dramatic reduction in the computation time when the ambient space dimension p >> d. We provide rigorous theoretical justification of the convergence of the proposed regression and gradient estimators by carefully analyzing the curvature, boundary, and non-uniform sampling effects. A bandwidth selector that can handle heteroscedastic errors is proposed. For the second aim, we analyze carefully the behavior of our regression estimator both in the interior and near the boundary of the manifold, and make explicit its relationship with manifold learning, in particular estimating the Laplace-Beltrami operator of the manifold. In this context, we also make clear that it is important to use a smaller bandwidth in the tangent plane estimation than in the LLR. Simulation studies and the Isomap face data example are used to illustrate the computational speed and estimation accuracy of our methods.
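A one-dimensional sketch of the two-bandwidth procedure (hypothetical setup: a circle embedded in R^3, response m(θ) = sin θ with small noise; the bandwidths h_pca < h are assumed values, not the paper's selector): estimate the tangent line by local PCA with the smaller bandwidth, then run weighted local linear regression in the tangent coordinate.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2000
theta = rng.uniform(0.0, 2.0 * np.pi, n)
X = np.column_stack([np.cos(theta), np.sin(theta), np.zeros(n)])  # circle in R^3
y = np.sin(theta) + 0.01 * rng.standard_normal(n)                 # noisy response

x0 = np.array([np.cos(0.3), np.sin(0.3), 0.0])  # query point on the manifold
diffs = X - x0
dist = np.linalg.norm(diffs, axis=1)

# Step 1: tangent estimate by local PCA, using a *smaller* bandwidth than
# the regression, as the paper recommends.
h_pca, h = 0.1, 0.2  # assumed bandwidths
nb = diffs[dist < h_pca]
_, _, Vt = np.linalg.svd(nb - nb.mean(axis=0), full_matrices=False)
tangent = Vt[0]      # intrinsic dimension d = 1

# Step 2: weighted local linear regression in the tangent coordinate t.
t = diffs @ tangent
w = np.exp(-(dist / h) ** 2) * (dist < 3 * h)  # truncated Gaussian weights
sw = np.sqrt(w)
A = np.column_stack([np.ones(n), t])
coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
y_hat = coef[0]      # LLR fit at x0; coef[1] estimates the gradient along
                     # the manifold, up to the sign of the tangent direction
```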
Alternating Projection, Ptychographic Imaging and Phase Synchronization. ArXiv e-prints
, 2014
Cited by 5 (1 self)

Abstract
We demonstrate necessary and sufficient conditions for the global convergence of the alternating projection algorithm to a unique solution up to a global phase factor. Additionally, for the ptychographic imaging problem, we discuss phase synchronization and the connection graph Laplacian, and show how to construct an accurate initial guess that accelerates convergence for the large imaging data expected in the coming new light source era.
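A generic alternating projection loop of the kind analyzed above, on a hypothetical 1-D toy (not the paper's ptychographic measurement model): recover a real, nonnegative, compactly supported signal from the magnitudes of its DFT. Each iteration projects onto the magnitude constraint and then onto the object constraint; the residual is non-increasing, and the choice of initial guess, which the paper builds via phase synchronization, largely determines how fast and to which fixed point the iteration converges.

```python
import numpy as np

rng = np.random.default_rng(7)
m, s = 128, 16  # signal length and support size (assumed toy values)

# Ground truth: a nonnegative signal supported on its first s entries.
x_true = np.zeros(m)
x_true[:s] = rng.random(s)
b = np.abs(np.fft.fft(x_true))  # measured Fourier magnitudes

# Initial guess inside the object constraint set (random, uninformed).
x = np.zeros(m)
x[:s] = rng.random(s)

errs = []
for _ in range(200):
    Xf = np.fft.fft(x)
    errs.append(np.linalg.norm(np.abs(Xf) - b))
    # Projection onto the magnitude constraint: keep the phase, set |Xf| = b.
    Xf = b * np.exp(1j * np.angle(Xf))
    x = np.fft.ifft(Xf).real
    # Projection onto the object constraint: real, supported, nonnegative.
    x[s:] = 0.0
    x = np.maximum(x, 0.0)
```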
Embedding Riemannian Manifolds by the Heat Kernel of the Connection Laplacian, preprint
Linear-projection diffusion on smooth Euclidean submanifolds, Submitted to Applied and Computational Harmonic Analysis
Cited by 4 (2 self)

Abstract
To process massive high-dimensional datasets, we utilize the underlying assumption that data on a manifold are approximately linear in sufficiently small patches (or neighborhoods of points) that are sampled with sufficient density from the manifold. Under this assumption, each patch can be represented (up to a small approximation error) by the tangent space of the manifold in its area and the point of tangency of this tangent space. We extend previously obtained results [1] for the finite construction of a linear-projection diffusion (LPD) super-kernel by exploring its properties when it becomes continuous. Specifically, its infinitesimal generator and the stochastic process defined by it are explored. We show that the resulting infinitesimal generator of this super-kernel converges to a natural extension of the original diffusion operator from scalar functions to vector fields. This operator is shown to be locally equivalent to a composition of linear projections between tangent spaces and the vector Laplacians on them. We define an LPD process by using the LPD super-kernel as a transition operator while extending the process to be continuous. The obtained LPD process is demonstrated on a synthetic manifold.
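The "linear projections between tangent spaces" can be sketched directly (synthetic manifold: the unit sphere S^2 sampled in R^3; the bandwidth and sizes are assumed values). Local PCA gives an orthonormal tangent basis at each point, and the d x d matrix B_i^T B_j projects one tangent space onto the other; for nearby points it is close to an orthogonal transformation, which is what lets the super-kernel act on vector fields like a connection.

```python
import numpy as np

rng = np.random.default_rng(8)
n, d = 4000, 2  # sample size and intrinsic dimension (assumed)

# Synthetic manifold: uniform samples on the unit sphere S^2 in R^3.
Z = rng.standard_normal((n, 3))
X = Z / np.linalg.norm(Z, axis=1, keepdims=True)

def tangent_basis(x0, X, h=0.2, d=2):
    # Local PCA: the top-d right singular vectors of the centered
    # neighborhood give an orthonormal basis of the tangent plane at x0.
    nb = X[np.linalg.norm(X - x0, axis=1) < h]
    _, _, Vt = np.linalg.svd(nb - nb.mean(axis=0), full_matrices=False)
    return Vt[:d].T  # 3 x d, orthonormal columns

x1 = X[0]
x2 = X[np.argsort(np.linalg.norm(X - x1, axis=1))[50]]  # a nearby point

B1 = tangent_basis(x1, X)
B2 = tangent_basis(x2, X)

# Linear projection between the two tangent spaces: a d x d matrix whose
# singular values are the cosines of the principal angles between the
# planes; values near 1 mean the projection is nearly orthogonal.
P = B1.T @ B2
sv = np.linalg.svd(P, compute_uv=False)
```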