Results 1–10 of 27
Unsupervised Learning of Image Manifolds by Semidefinite Programming, 2004
Cited by 162 (9 self)
Can we detect low dimensional structure in high dimensional data sets of images and video? The problem of dimensionality reduction arises often in computer vision and pattern recognition. In this paper, we propose a new solution to this problem based on semidefinite programming. Our algorithm can be used to analyze high dimensional data that lies on or near a low dimensional manifold. It overcomes certain limitations of previous work in manifold learning, such as Isomap and locally linear embedding. We illustrate the algorithm on easily visualized examples of curves and surfaces, as well as on actual images of faces, handwritten digits, and solid objects.
Principal manifolds and nonlinear dimensionality reduction via tangent space alignment, by Zhenyue Zhang and Hongyuan Zha, SIAM Journal on Scientific Computing, 2004
Cited by 133 (8 self)
Nonlinear manifold learning from unorganized data points is a very challenging unsupervised learning and data visualization problem with a great variety of applications. In this paper we present a new algorithm for manifold learning and nonlinear dimension reduction. Based on a set of unorganized data points sampled with noise from the manifold, we represent the local geometry of the manifold using tangent spaces learned by fitting an affine subspace in a neighborhood of each data point. Those tangent spaces are aligned to give the internal global coordinates of the data points with respect to the underlying manifold by way of a partial eigendecomposition of the neighborhood connection matrix. We present a careful error analysis of our algorithm and show that the reconstruction errors are of second-order accuracy. We illustrate our algorithm using curves and surfaces both in 2D/3D and higher dimensional Euclidean spaces, and 64-by-64 pixel face images with various pose and lighting conditions. We also address several theoretical and algorithmic issues for further research and improvements.
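The two-stage procedure this abstract describes (fit a tangent space per neighborhood, then align the patches through a partial eigendecomposition of a neighborhood connection matrix) can be sketched in plain numpy. This is an illustrative reconstruction, not the authors' implementation; the function name `ltsa_embed` and its parameter choices are ours.

```python
import numpy as np

def ltsa_embed(X, n_neighbors=8, n_components=1):
    """Local tangent space alignment (sketch).

    X: (n_samples, n_features) points sampled from a manifold.
    Returns (n_samples, n_components) global coordinates.
    """
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(d2, axis=1)[:, :n_neighbors]   # neighborhood includes the point

    B = np.zeros((n, n))                             # alignment matrix
    for i in range(n):
        idx = nbrs[i]
        Xi = X[idx] - X[idx].mean(axis=0)            # center the neighborhood
        # tangent space fit: top left-singular vectors give orthonormal
        # local coordinates for the affine subspace through this patch
        U, _, _ = np.linalg.svd(Xi, full_matrices=False)
        G = U[:, :n_components]
        # projector onto the complement of span{constant, local coordinates}:
        # penalizes embeddings that disagree with this patch's tangent frame
        H = np.hstack([np.ones((n_neighbors, 1)) / np.sqrt(n_neighbors), G])
        W = np.eye(n_neighbors) - H @ H.T
        B[np.ix_(idx, idx)] += W

    # global coordinates: bottom eigenvectors of B, skipping the constant one
    vals, vecs = np.linalg.eigh(B)
    return vecs[:, 1:n_components + 1]
```

On a clean 1-D curve such as a helix, the single recovered coordinate is (up to sign and scale) the curve parameter.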
A Kernel View of the Dimensionality Reduction of Manifolds, 2003
Cited by 110 (7 self)
We interpret several well-known algorithms for dimensionality reduction of manifolds as kernel methods. Isomap, graph Laplacian eigenmap, and locally linear embedding (LLE) all utilize local neighborhood information to construct a global embedding of the manifold. We show how all three algorithms can be described as kernel PCA on specially constructed Gram matrices, and illustrate the similarities and differences between the algorithms with representative examples.
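The shared machinery behind this unifying view, kernel PCA applied to a Gram matrix, can be sketched as follows. The function name `kernel_pca_embed` is illustrative; each algorithm's specific construction of K (geodesic distances for Isomap, reconstruction weights for LLE, and so on) is left to the caller.

```python
import numpy as np

def kernel_pca_embed(K, n_components=2):
    """Embed n points given only an (n, n) symmetric PSD Gram matrix K."""
    n = K.shape[0]
    # double-center so the embedded points have zero mean
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]  # take the top ones
    # coordinates = top eigenvectors scaled by sqrt(eigenvalues)
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

With the linear kernel K = XXᵀ this reduces to ordinary PCA: the embedding reproduces the pairwise distances of the (centered) data exactly.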
Generative Modeling for Continuous Non-Linearly Embedded Visual Inference, in ICML, 2004
Cited by 75 (12 self)
Many difficult visual perception problems, like 3D human motion estimation, can be formulated in terms of inference using complex generative models, defined over high-dimensional state spaces. Despite progress, optimizing such models is difficult because prior knowledge cannot be flexibly integrated in order to reshape an initially designed representation space. Nonlinearities, inherent sparsity of high-dimensional training sets, and lack of global continuity make dimensionality reduction challenging and low-dimensional search inefficient. To address these problems, we present a learning and inference algorithm that restricts visual tracking to automatically extracted, nonlinearly embedded, low-dimensional spaces. This formulation produces a layered generative model with reduced state representation that can be estimated using efficient continuous optimization methods. Our prior flattening method allows a simple analytic treatment of low-dimensional intrinsic curvature constraints, and allows consistent interpolation operations.
Analysis and extension of spectral methods for nonlinear dimensionality reduction, in Proceedings of the Twenty-Second International Conference on Machine Learning (ICML-05), 2005
Cited by 32 (6 self)
Many unsupervised algorithms for nonlinear dimensionality reduction, such as locally linear embedding (LLE) and Laplacian eigenmaps, are derived from the spectral decompositions of sparse matrices. While these algorithms aim to preserve certain proximity relations on average, their embeddings are not explicitly designed to preserve local features such as distances or angles. In this paper, we show how to construct a low dimensional embedding that maximally preserves angles between nearby data points. The embedding is derived from the bottom eigenvectors of LLE and/or Laplacian eigenmaps by solving an additional (but small) problem in semidefinite programming, whose size is independent of the number of data points. The solution obtained by semidefinite programming also yields an estimate of the data’s intrinsic dimensionality. Experimental results on several data sets demonstrate the merits of our approach.
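As a rough sketch of the first stage this paper builds on, the following numpy code computes the bottom eigenvectors of a k-nearest-neighbor graph Laplacian (Laplacian eigenmaps). The paper's additional small semidefinite program, which re-weights these eigenvectors to maximally preserve angles, is not reproduced here, and the function name is ours.

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=4, n_components=2):
    """Bottom nonconstant eigenvectors of the unnormalized graph Laplacian."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]  # exclude the point itself
    W = np.zeros((n, n))
    for i in range(n):
        W[i, nbrs[i]] = 1.0
    W = np.maximum(W, W.T)         # symmetrize the kNN graph
    L = np.diag(W.sum(axis=1)) - W # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]  # skip the constant eigenvector
```

On evenly spaced points around a circle the two recovered coordinates are Fourier modes of the angle, so the embedded points again lie on a circle.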
Regularization on graphs with function-adapted diffusion process, 2006
Cited by 23 (5 self)
Harmonic analysis and diffusion on discrete data have been shown to lead to state-of-the-art algorithms for machine learning tasks, especially in the context of semi-supervised and transductive learning. The success of these algorithms rests on the assumption that the function(s) to be studied (learned, interpolated, etc.) are smooth with respect to the geometry of the data. In this paper we present a method for modifying the given geometry so the function(s) to be studied are smoother with respect to the modified geometry, and thus more amenable to treatment using harmonic analysis methods. Among the many possible applications, we consider the problems of image denoising and transductive classification. In both settings, our approach improves on standard diffusion-based methods.
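A minimal sketch of the diffusion machinery this abstract refers to: smoothing a function on a weighted graph by iterating a lazy random walk. The paper's actual contribution, adapting the graph weights to the function being studied, is not shown; this is only the standard diffusion baseline it improves upon, with illustrative names.

```python
import numpy as np

def diffuse(W, f, steps=10):
    """Smooth a function f on a weighted graph by iterating the
    row-stochastic random-walk operator P = D^{-1} W."""
    P = W / W.sum(axis=1, keepdims=True)
    for _ in range(steps):
        f = P @ f
    return f

# example graph: a path of 30 nodes
n = 30
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
W += np.eye(n)  # self-loops make the walk lazy, avoiding oscillation on bipartite graphs
```

Constant functions are fixed points of the walk, and a few diffusion steps sharply reduce the total variation of a noisy signal along the path.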
Isometric Embedding and Continuum ISOMAP, in Proceedings of the Twentieth International Conference on Machine Learning, 2003
Cited by 18 (1 self)
Recently, the Isomap algorithm has been proposed for learning a nonlinear manifold from a set of unorganized high-dimensional data points. It is based on extending the classical multidimensional scaling method for dimension reduction. In this paper, we present a continuous version of Isomap which we call continuum Isomap and show that manifold learning in the continuous framework is reduced to an eigenvalue problem of an integral operator. We also show that continuum Isomap can perfectly recover the underlying natural parametrization if the nonlinear manifold can be isometrically embedded into a Euclidean space. Several numerical examples are given to illustrate the algorithm.
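For orientation, the discrete Isomap algorithm that this paper takes to the continuum limit can be sketched in numpy: geodesic distances estimated by shortest paths on a k-nearest-neighbor graph, followed by classical multidimensional scaling. This is an illustrative sketch of the discrete algorithm, not the paper's integral-operator formulation.

```python
import numpy as np

def isomap(X, n_neighbors=4, n_components=1):
    """Discrete Isomap sketch: kNN-graph geodesics + classical MDS."""
    n = X.shape[0]
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    # keep only k-nearest-neighbor edges
    G = np.full((n, n), np.inf)
    np.fill_diagonal(G, 0.0)
    order = np.argsort(D, axis=1)
    for i in range(n):
        for j in order[i, 1:n_neighbors + 1]:
            G[i, j] = G[j, i] = D[i, j]
    # all-pairs shortest paths (Floyd-Warshall)
    for k in range(n):
        G = np.minimum(G, G[:, k:k + 1] + G[k:k + 1, :])
    # classical MDS on squared geodesic distances
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (G ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

On a circular arc, which is isometric to an interval, the single recovered coordinate is (up to sign and scale) the arc-length parameter, matching the perfect-recovery claim above in the discrete setting.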
Distance functions and geodesics on point clouds, 2003
Cited by 14 (3 self)
A new paradigm for computing intrinsic distance functions and geodesics on submanifolds of R^d given by point clouds is introduced in this paper. The basic idea is that, as shown here, intrinsic distance functions and geodesics on general codimension submanifolds of R^d can be accurately approximated by extrinsic Euclidean ones computed inside a thin offset band surrounding the manifold. This permits the use of computationally optimal algorithms for computing distance functions in Cartesian grids. We use these algorithms, modified to deal with spaces with boundaries, and obtain a computationally optimal approach also for the case of intrinsic distance functions on submanifolds of R^d. For point clouds, the offset band is constructed without the need to explicitly find the underlying manifold, thereby computing intrinsic distance functions and geodesics on point clouds while skipping the manifold reconstruction step. The case of point clouds representing noisy samples of a submanifold of Euclidean space is studied as well. All the underlying theoretical results are presented along with experimental examples for diverse applications and comparisons to graph-based distance algorithms.
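The graph-based distance algorithms this paper compares against can be sketched as Dijkstra's algorithm on a k-nearest-neighbor graph of the point cloud. This is the baseline approach, not the paper's offset-band method, and the function name is ours.

```python
import heapq
import numpy as np

def graph_geodesic(X, source=0, n_neighbors=3):
    """Approximate geodesic distances from one point of a point cloud
    via Dijkstra's algorithm on the k-nearest-neighbor graph."""
    n = X.shape[0]
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    nbrs = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]  # exclude the point itself
    dist = np.full(n, np.inf)
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue  # stale heap entry
        for v in nbrs[u]:
            nd = d + D[u, v]
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

Note that this kNN graph is directed (each point relaxes only its own neighbors), which is a common simplification in such sketches. For colinear points the graph geodesic from the first point reproduces the Euclidean distances exactly.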
Distance Functions and Geodesics on Submanifolds of R^d and Point Clouds, 2005
Cited by 14 (4 self)
A theoretical and computational framework for computing intrinsic distance functions and geodesics on submanifolds of R^d given by point clouds is introduced and developed in this paper. The basic idea is that, as shown here, intrinsic distance functions and geodesics on general codimension submanifolds of R^d can be accurately approximated by extrinsic Euclidean ones computed inside a thin offset band surrounding the manifold. This permits the use of computationally optimal algorithms for computing distance functions in Cartesian grids. We use these algorithms, modified to deal with spaces with boundaries, and obtain a computationally optimal approach also for the case of intrinsic distance functions on submanifolds of R^d. For point clouds, the offset band is constructed without the need to explicitly find the underlying manifold, thereby computing intrinsic distance functions and geodesics on point clouds while skipping the manifold reconstruction step. The case of point clouds representing noisy samples of a submanifold of Euclidean space is studied as well. All the underlying theoretical results are presented along with experimental examples for diverse applications and comparisons to graph-based distance algorithms.
A duality view of spectral methods for dimensionality reduction, in ICML ’06: Proceedings of the 23rd International Conference on Machine Learning, 2006
Cited by 12 (0 self)
We present a unified duality view of several recently emerged spectral methods for nonlinear dimensionality reduction, including Isomap, locally linear embedding, Laplacian eigenmaps, and maximum variance unfolding. We discuss the duality theory for the maximum variance unfolding problem, and show that the other methods are directly related to either its primal formulation or its dual formulation, or can be interpreted from the optimality conditions. This duality framework reveals close connections between these seemingly quite different algorithms. In particular, it resolves the apparent puzzle that some of these methods use the top eigenvectors of a dense matrix while others use the bottom eigenvectors of a sparse matrix: the two eigenspaces are exactly aligned at primal-dual optimality.