Results 1–10 of 26
Algorithms for manifold learning
, 2005
Abstract

Cited by 23 (0 self)
Manifold learning is a popular recent approach to nonlinear dimensionality reduction. Algorithms for this task are based on the idea that the dimensionality of many data sets is only artificially high; though each data point consists of perhaps thousands of features, it may be described as a function of only a few underlying parameters. That is, the data points are actually samples from a low-dimensional manifold that is embedded in a high-dimensional space. Manifold learning algorithms attempt to uncover these parameters in order to find a low-dimensional representation of the data. In this paper, we discuss the motivation, background, and algorithms proposed for manifold learning. Isomap, Locally Linear Embedding, Laplacian Eigenmaps, Semidefinite Embedding, and a host of variants of these algorithms are examined.
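As a concrete illustration of several of the algorithms the survey examines, the following sketch runs scikit-learn's implementations of Isomap, Locally Linear Embedding, and Laplacian Eigenmaps (SpectralEmbedding) on a synthetic "swiss roll" — 3-D points that really live on a 2-D sheet. This uses scikit-learn's APIs, not code from the paper:

```python
# Minimal sketch: three manifold learning methods on a swiss roll.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap, LocallyLinearEmbedding, SpectralEmbedding

X, _ = make_swiss_roll(n_samples=500, random_state=0)  # shape (500, 3)

# Each algorithm recovers a 2-D embedding of the same 3-D point cloud.
embeddings = {
    "Isomap": Isomap(n_neighbors=10, n_components=2).fit_transform(X),
    "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=2,
                                  random_state=0).fit_transform(X),
    "Laplacian Eigenmaps": SpectralEmbedding(n_neighbors=10, n_components=2,
                                             random_state=0).fit_transform(X),
}
for name, Y in embeddings.items():
    print(name, Y.shape)  # every method maps the 500 points to 2 coordinates
```

All three methods share the pattern the abstract describes: they assume the few underlying parameters can be read off from local neighborhood structure.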
Dimensionality Estimation, Manifold Learning and Function Approximation using Tensor Voting
, 2010
Abstract

Cited by 13 (5 self)
We address instance-based learning from a perceptual organization standpoint and present methods for dimensionality estimation, manifold learning and function approximation. Under our approach, manifolds in high-dimensional spaces are inferred by estimating geometric relationships among the input instances. Unlike conventional manifold learning, we do not perform dimensionality reduction, but instead perform all operations in the original input space. For this purpose we employ a novel formulation of tensor voting, which allows an N-D implementation. Tensor voting is a perceptual organization framework that has mostly been applied to computer vision problems. Analyzing the estimated local structure at the inputs, we are able to obtain reliable dimensionality estimates at each instance, instead of a global estimate for the entire data set. Moreover, these local dimensionality and structure estimates enable us to measure geodesic distances and perform nonlinear interpolation for data sets with varying density, outliers, perturbation and intersections that cannot be handled by state-of-the-art methods. Quantitative results on the estimation of local manifold structure using ground truth data are presented. In addition, we compare our approach with several leading methods for manifold learning at the task of measuring geodesic distances. Finally, we show competitive function approximation results on real data.
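The per-instance dimensionality estimates described above can be illustrated with a much cruder stand-in than tensor voting: local PCA on each point's neighborhood, counting the eigenvalues needed to capture most of the local variance. This is only a sketch of the idea, not the paper's method:

```python
# Sketch: local intrinsic dimensionality via local PCA (not tensor voting).
import numpy as np

rng = np.random.default_rng(0)
# A 1-D curve (circle) embedded in 3-D, plus small coordinate noise.
t = rng.uniform(0, 2 * np.pi, 400)
X = np.c_[np.cos(t), np.sin(t), np.zeros_like(t)] \
    + 0.01 * rng.standard_normal((400, 3))

def local_dim(X, i, k=20, energy=0.9):
    """Estimate the intrinsic dimension at point i from its k nearest neighbors."""
    d2 = ((X - X[i]) ** 2).sum(axis=1)
    nbrs = X[np.argsort(d2)[:k]]
    cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
    ev = np.sort(np.linalg.eigvalsh(cov))[::-1]
    # smallest number of eigenvalues capturing `energy` of the local variance
    return int(np.searchsorted(np.cumsum(ev) / ev.sum(), energy) + 1)

print(local_dim(X, 0))  # locally the circle looks like a line: dimension 1
```

Tensor voting refines this by propagating structure estimates between points, which is what makes it robust to outliers and intersections.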
A sensorimotor approach to sound localization
 Neural Computation
, 2008
Abstract

Cited by 12 (0 self)
Sound localization is known to be a complex phenomenon, combining multisensory information processing, experience-dependent plasticity and movement. Here we present a sensorimotor model that addresses the question of how an organism could learn to localize sound sources without any a priori neural representation of its head-related transfer function (HRTF) or prior experience with auditory spatial information. We demonstrate quantitatively that the experience of the sensory consequences of its voluntary motor actions allows an organism to learn the spatial location of any sound source. Using examples from humans and echolocating bats, our model shows that a naive organism can learn the auditory space based solely on acoustic inputs and their relation to motor states.
Parameterless isomap with adaptive neighborhood selection
 In Proceedings of the 28th DAGM Symposium
, 2006
Abstract

Cited by 6 (0 self)
Isomap is a highly popular manifold learning and dimensionality reduction technique that effectively performs multidimensional scaling on estimates of geodesic distances. However, the resulting output is extremely sensitive to the parameters that control the selection of neighbors at each point. To date, no principled way of setting these parameters has been proposed; in practice they are often tuned ad hoc, sometimes empirically based on prior knowledge of the desired output. In this paper we propose a parameterless technique that adaptively defines the neighborhood at each input point based on intrinsic dimensionality and local tangent orientation. In addition to eliminating the guesswork associated with parameter configuration, the adaptive nature of this technique enables it to select optimal neighborhoods locally at each point, resulting in superior performance.
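The sensitivity the paper targets can be reproduced with the standard, non-adaptive Isomap in scikit-learn: the recovered embedding changes noticeably with the neighborhood size k, which is exactly the parameter the proposed technique eliminates. A minimal demonstration:

```python
# Sketch: standard Isomap's dependence on the neighborhood size k.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=400, random_state=0)

for k in (5, 10, 50):
    emb = Isomap(n_neighbors=k, n_components=2).fit_transform(X)
    # The spread of the recovered coordinates shifts with k, because a
    # large k "short-circuits" across the folds of the roll.
    print(k, emb.std(axis=0).round(2))
```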
Locally Linear Hashing for Extracting Non-Linear Manifolds
Abstract

Cited by 3 (0 self)
Previous efforts in hashing intend to preserve data variance or pairwise affinity, but neither is adequate in capturing the manifold structures hidden in most visual data. In this paper, we tackle this problem by reconstructing the locally linear structures of manifolds in the binary Hamming space, which can be learned by locality-sensitive sparse coding. We cast the problem as a joint minimization of reconstruction error and quantization loss, and show that, despite its NP-hardness, a local optimum can be obtained efficiently via alternating optimization. Our method distinguishes itself from existing methods in its remarkable ability to extract the nearest neighbors of the query from the same manifold, instead of from the ambient space. In extensive experiments on various image benchmarks, our results improve the previous state-of-the-art by 28–74% typically, and by 6–27% on the Yale face data.
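The locally linear structure the method preserves is, at its core, an affine reconstruction of each point from its neighbors. A minimal sketch of that reconstruction step — the classic LLE weight computation, not the paper's locality-sensitive sparse-coding variant:

```python
# Sketch: affine reconstruction weights over a point's neighbors,
# the locally linear structure that such hashing methods preserve.
import numpy as np

def reconstruction_weights(x, neighbors, reg=1e-3):
    """Weights w with sum(w) = 1 minimizing ||x - w @ neighbors||^2."""
    Z = neighbors - x                        # shift so x is the origin
    G = Z @ Z.T                              # local Gram matrix
    G += reg * np.trace(G) * np.eye(len(G))  # regularize for stability
    w = np.linalg.solve(G, np.ones(len(G)))
    return w / w.sum()                       # enforce the affine constraint

rng = np.random.default_rng(0)
nbrs = rng.standard_normal((5, 3))
x = nbrs.mean(axis=0)        # x lies in the neighbors' affine hull
w = reconstruction_weights(x, nbrs)
print(np.allclose(w @ nbrs, x, atol=1e-2))  # True: x is reconstructed
```

The paper's contribution is to make such weights sparse and locality-sensitive, and to preserve them under the binary Hamming-space constraint.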
TANGENT SPACE ESTIMATION FOR SMOOTH EMBEDDINGS OF RIEMANNIAN MANIFOLDS
Abstract

Cited by 3 (0 self)
Numerous dimensionality reduction problems in data analysis involve the recovery of low-dimensional models or the learning of manifolds underlying sets of data. Many manifold learning methods require the estimation of the tangent space of the manifold at a point from locally available data samples. Local sampling conditions such as (i) the size of the neighborhood (sampling width) and (ii) the number of samples in the neighborhood (sampling density) affect the performance of learning algorithms. In this work, we propose a theoretical analysis of local sampling conditions for the estimation of the tangent space at a point P lying on an m-dimensional Riemannian manifold S in R^n. Assuming a smooth embedding of S in R^n, we estimate the tangent space T_P S by performing a Principal Component Analysis (PCA) on points sampled from the neighborhood of P on S. Our analysis explicitly takes into account the second-order properties of the manifold at P, namely the principal curvatures as well as the higher-order terms. We consider a random sampling framework and leverage recent results from random matrix theory to derive conditions on the sampling width and the local sampling density for an accurate estimation of tangent subspaces. We measure the estimation accuracy by the angle between the estimated tangent space and the true tangent space T_P S, and we give conditions for this angle to be bounded with high probability. In particular, we observe that the local sampling conditions are highly dependent on the correlation between the components in the second-order local approximation of the manifold. We finally provide numerical simulations to validate our theoretical findings.
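The setting the analysis studies can be sketched numerically: sample a neighborhood of a point P on a curved surface, estimate the tangent space by PCA, and measure the principal angle to the true tangent space. The paraboloid below is an illustrative choice, not from the paper:

```python
# Sketch: PCA tangent-space estimation and its angular error.
import numpy as np

rng = np.random.default_rng(0)
# Paraboloid z = x^2 + y^2; at P = origin the true tangent space is the
# xy-plane, and the curvature supplies the second-order terms.
u = rng.uniform(-0.1, 0.1, (200, 2))           # sampling width 0.1
X = np.c_[u, (u ** 2).sum(axis=1)]

# PCA around P: the top-2 right singular vectors span the estimate.
_, _, Vt = np.linalg.svd(X - X.mean(axis=0))
T_est = Vt[:2]                                  # (2, 3) orthonormal basis

T_true = np.array([[1.0, 0, 0], [0, 1.0, 0]])   # basis of the xy-plane
# Largest principal angle between the two 2-D subspaces of R^3.
s = np.linalg.svd(T_true @ T_est.T, compute_uv=False)
angle = np.degrees(np.arccos(np.clip(s.min(), -1, 1)))
print(round(angle, 3))  # small: a narrow neighborhood gives a good estimate
```

Widening the sampling neighborhood lets the curvature terms leak into the leading principal components, which is precisely the trade-off the paper's conditions quantify.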
Consensus of kNNs for Robust Neighborhood Selection on GraphBased Manifolds
Abstract

Cited by 2 (0 self)
Propagating similarity information along the data manifold requires careful selection of the local neighborhood. Selecting a “good” neighborhood in an unsupervised setting, given an affinity graph, has been a difficult task. The most common way to select a local neighborhood has been to use the k-nearest neighbor (kNN) selection criterion; however, it has a tendency to include noisy edges. In this paper, we propose a way to select a robust neighborhood using the consensus of multiple rounds of kNNs. We explain how using consensus information can give better control over neighborhood selection. We also explain in detail the problems with another recently proposed neighborhood selection criterion, Dominant Neighbors, and show that our method is immune to those problems. Finally, we show the results from experiments in which we compare our method to other neighborhood selection approaches. The results corroborate our claims that consensus of kNNs does indeed help in selecting more robust and stable localities.
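One simple way to realize a consensus of kNNs (an illustrative variant, not necessarily the paper's exact construction) is to build kNN graphs over several random subsamples of the data and keep only the edges that recur in a majority of rounds, pruning unstable edges:

```python
# Sketch: consensus kNN graph via repeated subsampling.
import numpy as np
from collections import Counter

def knn_edges(X, idx, k):
    """Directed kNN edges among the points selected by `idx`."""
    edges = set()
    for a in idx:
        d2 = ((X[idx] - X[a]) ** 2).sum(axis=1)
        for b in idx[np.argsort(d2)[1:k + 1]]:  # skip self at position 0
            edges.add((int(a), int(b)))
    return edges

def consensus_knn(X, k=5, rounds=10, frac=0.8, thresh=0.5, seed=0):
    rng = np.random.default_rng(seed)
    votes = Counter()
    for _ in range(rounds):
        idx = rng.choice(len(X), size=int(frac * len(X)), replace=False)
        votes.update(knn_edges(X, idx, k))
    # keep edges appearing in at least `thresh` of the rounds
    return {e for e, c in votes.items() if c >= thresh * rounds}

X = np.random.default_rng(1).standard_normal((100, 2))
E = consensus_knn(X)
print(len(E))  # edges that survive the vote across rounds
```

Edges created by a single noisy point tend not to reappear across rounds, so the vote filters them out while stable local neighborhoods survive.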
Learning a Manifold as an Atlas
 In CVPR
Abstract

Cited by 2 (1 self)
In this work, we return to the underlying mathematical definition of a manifold and directly characterise learning a manifold as finding an atlas, or a set of overlapping charts, that accurately describe local structure. We formulate the problem of learning the manifold as an optimisation that simultaneously refines the continuous parameters defining the charts, and the discrete assignment of points to charts. In contrast to existing methods, this direct formulation of a manifold does not require “unwrapping” the manifold into a lower-dimensional space and allows us to learn closed manifolds of interest to vision, such as those corresponding to gait cycles or camera pose. We report state-of-the-art results for manifold-based nearest neighbour classification on vision datasets, and show how the same techniques can be applied to the 3D reconstruction of human motion from a single image.
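A heavily simplified sketch of the atlas idea: assign points to charts by clustering and fit each chart as a local linear (PCA) parameterisation. The paper's actual formulation uses overlapping charts and a joint optimisation; this toy version only shows why a closed manifold such as a circle needs several charts:

```python
# Sketch: a crude "atlas" for a circle via clustering + per-chart PCA.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# A closed manifold (circle in R^2): no single chart covers it, which is
# exactly the case the paper targets (gait cycles, camera pose).
t = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(t), np.sin(t)]

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
charts = []
for c in range(4):
    pts = X[labels == c]
    chart = PCA(n_components=1).fit(pts)  # 1-D parameterisation of the arc
    charts.append(chart)
    # each chart explains its own arc almost entirely
    print(c, round(chart.explained_variance_ratio_[0], 2))
```

No single 1-D PCA could parameterise the whole circle, but each arc is nearly linear, which is the intuition behind learning the manifold as a set of charts.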
Estimation of Tangent Planes for Neighborhood Graph Correction
Abstract

Cited by 1 (0 self)
Local algorithms for nonlinear dimensionality reduction [1], [2], [3], [4], [5] and semi-supervised learning algorithms [6], [7] use spectral decomposition based on a nearest-neighborhood graph. In the presence of shortcuts (connections between two points whose distance along the submanifold is actually large), the resulting embedding will be unsatisfactory. This paper proposes an algorithm to correct wrong graph connections based on the tangent subspace of the manifold at each point. This leads to the estimation of a proper, adaptive number of neighbors for each point in the dataset. Experiments show improvement in graph construction.
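The correction criterion can be sketched as follows: estimate a tangent subspace at each point (here by local PCA) and flag an edge as a suspected shortcut when its direction deviates strongly from that subspace. The threshold and the PCA-based tangent estimate are illustrative choices, not the paper's exact procedure:

```python
# Sketch: detecting shortcut edges against a local tangent estimate.
import numpy as np

def tangent_basis(X, i, k=10, d=1):
    """d-dimensional tangent estimate at point i from its k nearest neighbors."""
    d2 = ((X - X[i]) ** 2).sum(axis=1)
    nbrs = X[np.argsort(d2)[1:k + 1]]
    _, _, Vt = np.linalg.svd(nbrs - nbrs.mean(axis=0))
    return Vt[:d]                      # rows span the tangent subspace

def is_shortcut(X, i, j, max_angle_deg=30, **kw):
    e = X[j] - X[i]
    e = e / np.linalg.norm(e)
    T = tangent_basis(X, i, **kw)
    cos = np.linalg.norm(T @ e)        # norm of the projection onto the tangent
    return np.degrees(np.arccos(np.clip(cos, 0, 1))) > max_angle_deg

# Two parallel segments: close in R^2, but far along the submanifold.
xs = np.linspace(0, 1, 50)
X = np.r_[np.c_[xs, np.zeros(50)], np.c_[xs, 0.3 * np.ones(50)]]
print(is_shortcut(X, 0, 50))  # edge jumping between the segments: True
print(is_shortcut(X, 0, 1))   # edge along a segment: False
```

Edges flagged this way can then be dropped from the neighborhood graph before the spectral embedding step, which is the correction the paper proposes.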