Results 11–20 of 260
Patch alignment for dimensionality reduction
IEEE Trans. Knowl. Data Eng., 2009
Cited by 37 (12 self)
Topology-Invariant Similarity of Nonrigid Shapes
2009
Cited by 33 (3 self)
Abstract:
This paper explores the problem of similarity criteria between nonrigid shapes. Broadly speaking, such criteria are divided into intrinsic and extrinsic, the first referring to the metric structure of the object and the latter to how it is laid out in the Euclidean space. Both criteria have their advantages and disadvantages: extrinsic similarity is sensitive to nonrigid deformations, while intrinsic similarity is sensitive to topological noise. In this paper, we approach the problem from the perspective of metric geometry. We show that by unifying the extrinsic and intrinsic similarity criteria, it is possible to obtain a stronger topology-invariant similarity, suitable for comparing deformed shapes with different topology. We construct this new joint criterion as a trade-off between the extrinsic and intrinsic similarity and use it as a set-valued distance. Numerical results demonstrate the efficiency of our approach in cases where using either extrinsic or intrinsic criteria alone would fail.
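The trade-off the abstract describes can be pictured with a toy sketch, not the authors' actual construction (which works with Gromov–Hausdorff-type distances between metric spaces): scan a weight lambda between the two criteria and return the whole trade-off curve as a set-valued answer. The function name and the simple convex combination are illustrative assumptions.

```python
import numpy as np

def joint_similarity(d_extrinsic, d_intrinsic, lambdas=None):
    """Illustrative set-valued trade-off between an extrinsic and an
    intrinsic dissimilarity value: each lambda in [0, 1] gives one
    point on the trade-off curve (a stand-in, not the paper's method)."""
    if lambdas is None:
        lambdas = np.linspace(0.0, 1.0, 11)
    return [(lam, lam * d_extrinsic + (1.0 - lam) * d_intrinsic)
            for lam in lambdas]

# Endpoints of the curve recover the pure criteria.
curve = joint_similarity(d_extrinsic=0.8, d_intrinsic=0.2)
```

At lambda = 0 the value is purely intrinsic; at lambda = 1 purely extrinsic; the intermediate points form the set-valued distance.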
Sparse Manifold Clustering and Embedding
Cited by 31 (1 self)
Abstract:
We propose an algorithm called Sparse Manifold Clustering and Embedding (SMCE) for simultaneous clustering and dimensionality reduction of data lying in multiple nonlinear manifolds. Similar to most dimensionality reduction methods, SMCE finds a small neighborhood around each data point and connects each point to its neighbors with appropriate weights. The key difference is that SMCE finds both the neighbors and the weights automatically. This is done by solving a sparse optimization problem, which encourages selecting nearby points that lie in the same manifold and approximately span a low-dimensional affine subspace. The optimal solution encodes information that can be used for clustering and dimensionality reduction using spectral clustering and embedding. Moreover, the size of the optimal neighborhood of a data point, which can be different for different points, provides an estimate of the dimension of the manifold to which the point belongs. Experiments demonstrate that our method can effectively handle multiple manifolds that are very close to each other, manifolds with non-uniform sampling and holes, as well as estimate the intrinsic dimensions of the manifolds.
A Distributed SDP approach for Large-scale Noisy Anchor-free Graph Realization with Applications to Molecular Conformation
2007
Cited by 27 (3 self)
Abstract:
We propose a distributed algorithm for solving Euclidean metric realization problems arising from large 3D graphs, using only noisy distance information, and without any prior knowledge of the positions of any of the vertices. In our distributed algorithm, the graph is first subdivided into smaller subgraphs using intelligent clustering methods. Then a semidefinite programming relaxation and gradient search method is used to localize each subgraph. Finally, a stitching algorithm is used to find affine maps between adjacent clusters and the positions of all points in a global coordinate system are then derived. In particular, we apply our method to the problem of finding the 3D molecular configurations of proteins based on a limited number of given pairwise distances between atoms. The protein molecules, all with known molecular configurations, are taken from the Protein Data Bank. Our algorithm is able to reconstruct reliably and efficiently the configurations of large protein molecules from a limited number of pairwise distances corrupted by noise, without incorporating domain knowledge such as the minimum separation distance constraints derived from van der Waals interactions.
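The stitching step can be sketched in isolation: given points shared between a local cluster and the global frame, a least-squares fit recovers the affine map aligning them. This is only a sketch of that final stage under an assumed exact-correspondence setup; the paper's full pipeline also includes the SDP localization and gradient refinement.

```python
import numpy as np

def affine_stitch(local_pts, global_pts):
    """Least-squares affine map (A, b) with A @ p + b ~= q for points
    shared between a local cluster frame and the global frame."""
    P = np.hstack([local_pts, np.ones((local_pts.shape[0], 1))])
    # Solve P @ M ~= global_pts in the least-squares sense, where
    # M stacks A^T on top of b.
    M, *_ = np.linalg.lstsq(P, global_pts, rcond=None)
    return M[:-1].T, M[-1]

# Synthetic check: recover a known rigid motion from 8 shared points.
rng = np.random.default_rng(0)
local = rng.normal(size=(8, 3))
A_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])  # a rotation
b_true = np.array([1.0, 2.0, 3.0])
world = local @ A_true.T + b_true
A, b = affine_stitch(local, world)
```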
Adaptive Manifold Learning
2004
Cited by 26 (0 self)
Abstract:
Recently, there have been several advances in the machine learning and pattern recognition communities for developing manifold learning algorithms to construct nonlinear low-dimensional manifolds from sample data points embedded in high-dimensional spaces. In this paper, we develop algorithms that address two key issues in manifold learning: 1) the adaptive selection of the neighborhood sizes; and 2) better fitting the local geometric structure to account for the variations in the curvature of the manifold and its interplay with the sampling density of the data set. We also illustrate the effectiveness of our methods on some synthetic data sets.
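Issue 1), adaptive selection of the neighborhood size, can be illustrated with a toy rule that is not the paper's actual criterion: grow k while the candidate neighbors stay well fit by a low-rank local PCA, and stop when the off-subspace residual exceeds a tolerance. The dimension, bounds, and tolerance below are arbitrary choices for the sketch.

```python
import numpy as np

def adaptive_k(X, i, k_min=4, k_max=12, tol=0.05):
    """Toy adaptive-neighborhood rule: enlarge the neighborhood of point
    i while its neighbors remain close to a rank-d local subspace
    (small residual singular-value energy)."""
    d = 1  # assumed local manifold dimension for this sketch
    order = np.argsort(np.linalg.norm(X - X[i], axis=1))
    k = k_min
    while k < k_max:
        nbrs = X[order[1:k + 2]]                  # candidate neighborhood
        C = nbrs - nbrs.mean(axis=0)
        s = np.linalg.svd(C, compute_uv=False)
        resid = s[d:].sum() / (s.sum() + 1e-12)   # energy off the local subspace
        if resid > tol:                           # curvature detected: stop growing
            break
        k += 1
    return k

# On perfectly linear data the neighborhood grows to its maximum size.
t = np.linspace(0, 1, 30)
line = np.c_[t, 2 * t]
k = adaptive_k(line, i=15)
```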
Learning a Maximum Margin Subspace for Image Retrieval
2008
Cited by 25 (5 self)
Abstract:
One of the fundamental problems in Content-Based Image Retrieval (CBIR) has been the gap between low-level visual features and high-level semantic concepts. To narrow down this gap, relevance feedback is introduced into image retrieval. With the user-provided information, a classifier can be learned to distinguish between positive and negative examples. However, in real-world applications, the number of user feedbacks is usually too small compared to the dimensionality of the image space. In order to cope with the high dimensionality, we propose a novel semi-supervised method for dimensionality reduction called Maximum Margin Projection (MMP). MMP aims at maximizing the margin between positive and negative examples at each local neighborhood. Different from traditional dimensionality reduction algorithms such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), which effectively see only the global Euclidean structure, MMP is designed for discovering the local manifold structure. Therefore, MMP is likely to be more suitable for image retrieval, where nearest neighbor search is usually involved. After projecting the images into a lower dimensional subspace, the relevant images get closer to the query image; thus, the retrieval performance can be enhanced. The experimental results on the Corel image database demonstrate the effectiveness of our proposed algorithm.
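A classical LDA-style sketch conveys the flavor of a margin-maximizing projection, though MMP itself works on local neighborhood graphs rather than the global scatter matrices used here; everything below is the textbook generalized-eigenproblem form, not the paper's algorithm.

```python
import numpy as np

def margin_projection(X, y, dim=1, reg=1e-6):
    """LDA-style stand-in: find directions separating positive from
    negative feedback via the generalized eigenproblem
    S_w^{-1} S_b w = lambda w (global scatter, unlike MMP's local graphs)."""
    mu = X.mean(axis=0)
    Sb = np.zeros((X.shape[1], X.shape[1]))
    Sw = reg * np.eye(X.shape[1])                 # small ridge for stability
    for c in np.unique(y):
        Xc = X[y == c]
        diff = (Xc.mean(axis=0) - mu)[:, None]
        Sb += len(Xc) * diff @ diff.T             # between-class scatter
        Sw += (Xc - Xc.mean(axis=0)).T @ (Xc - Xc.mean(axis=0))
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1][:dim]     # largest ratio first
    return vecs[:, order].real

# Two clusters separated along the first axis: the projection should
# load mostly on that axis.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.],
              [10., 0.], [10., 1.], [11., 0.], [11., 1.]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
W = margin_projection(X, y, dim=1)
```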
Isometric Embedding and Continuum ISOMAP
In Proceedings of the Twentieth International Conference on Machine Learning, 2003
Cited by 24 (1 self)
Abstract:
Recently, the Isomap algorithm has been proposed for learning a nonlinear manifold from a set of unorganized high-dimensional data points. It is based on extending the classical multidimensional scaling method for dimension reduction. In this paper, we present a continuous version of Isomap which we call continuum isomap and show that manifold learning in the continuous framework is reduced to an eigenvalue problem of an integral operator. We also show that the continuum isomap can perfectly recover the underlying natural parametrization if the nonlinear manifold can be isometrically embedded into a Euclidean space. Several numerical examples are given to illustrate the algorithm.
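The discrete core that continuum Isomap generalizes is classical multidimensional scaling: double-center the squared distance matrix and read the embedding off the top eigenvectors of the resulting Gram matrix. A minimal implementation of that standard step:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical MDS: double-center squared distances into a Gram
    matrix, then embed with its leading eigenpairs."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(B)               # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:dim]           # keep the largest ones
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Sanity check: Euclidean distances of planar points are reproduced exactly.
rng = np.random.default_rng(1)
X = rng.normal(size=(10, 2))
D = np.linalg.norm(X[:, None] - X[None], axis=2)
Y = classical_mds(D, dim=2)
D_rec = np.linalg.norm(Y[:, None] - Y[None], axis=2)
```

Isomap replaces the Euclidean distances in `D` with graph-geodesic estimates; the eigen-decomposition step is unchanged.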
Riemannian manifold learning for nonlinear dimensionality reduction
2006
Cited by 22 (1 self)
Abstract:
In recent years, nonlinear dimensionality reduction (NLDR) techniques have attracted much attention in visual perception and many other areas of science. We propose an efficient algorithm called Riemannian manifold learning (RML). A Riemannian manifold can be constructed in the form of a simplicial complex, and thus its intrinsic dimension can be reliably estimated. Then the NLDR problem is solved by constructing Riemannian normal coordinates (RNC). Experimental results demonstrate that our algorithm can learn the data’s intrinsic geometric structure, yielding uniformly distributed and well-organized low-dimensional embedding data.
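One ingredient of Riemannian normal coordinates, the radial coordinate (geodesic distance from a base point), can be approximated with shortest paths on a k-NN graph. This sketch keeps only that radius; the full RNC construction also recovers angular coordinates, and the graph parameters here are arbitrary.

```python
import heapq
import numpy as np

def geodesic_radii(X, n_neighbors=3, base=0):
    """Graph-geodesic distance from a base point: build a k-NN graph
    with Euclidean edge lengths, then run Dijkstra from `base`."""
    n = X.shape[0]
    D = np.linalg.norm(X[:, None] - X[None], axis=2)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in np.argsort(D[i])[1:n_neighbors + 1]:
            adj[i].append((j, D[i, j]))           # duplicate edges are harmless
            adj[j].append((i, D[i, j]))
    dist = np.full(n, np.inf)
    dist[base] = 0.0
    pq = [(0.0, base)]
    while pq:
        d0, u = heapq.heappop(pq)
        if d0 > dist[u]:
            continue                              # stale queue entry
        for v, w in adj[u]:
            if d0 + w < dist[v]:
                dist[v] = d0 + w
                heapq.heappush(pq, (d0 + w, v))
    return dist

# Points on a half circle: the geodesic radius to the far end should
# exceed the straight-line chord (length 2) and approach the arc length.
theta = np.linspace(0, np.pi, 20)
pts = np.c_[np.cos(theta), np.sin(theta)]
r = geodesic_radii(pts, n_neighbors=3, base=0)
```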
TRACE OPTIMIZATION AND EIGENPROBLEMS IN DIMENSION REDUCTION METHODS
Cited by 20 (1 self)
Abstract:
This paper gives an overview of the eigenvalue problems encountered in areas of data mining that are related to dimension reduction. Given some input high-dimensional data, the goal of dimension reduction is to map them to a low-dimensional space such that certain properties of the initial data are preserved. Optimizing the above properties among the reduced data can be typically posed as a trace optimization problem that leads to an eigenvalue problem. There is a rich variety of such problems and the goal of this paper is to unravel relations between them as well as to discuss effective solution techniques. First, we make a distinction between projective methods that determine an explicit linear projection from the high-dimensional space to the low-dimensional space, and nonlinear methods where the mapping between the two is nonlinear and implicit. Then, we show that all of the eigenvalue problems solved in the context of explicit projections can be viewed as the projected analogues of the so-called nonlinear or implicit projections. We also discuss kernels as a means of unifying both types of methods and revisit some of the equivalences between methods established in this way. Finally, we provide some illustrative examples to showcase the behavior and the particular characteristics of the various dimension reduction methods on real-world data sets.
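The basic reduction the overview is built on can be stated in a few lines: maximizing trace(V^T A V) over matrices V with orthonormal columns is solved by the top-d eigenvectors of the symmetric matrix A (the Ky Fan theorem), and the optimum equals the sum of the d largest eigenvalues.

```python
import numpy as np

def trace_maximize(A, d):
    """Solve max trace(V^T A V) over orthonormal n x d matrices V:
    take the d eigenvectors of symmetric A with largest eigenvalues."""
    vals, vecs = np.linalg.eigh(A)                # ascending eigenvalues
    V = vecs[:, np.argsort(vals)[::-1][:d]]       # top-d eigenvectors
    return V, float(np.trace(V.T @ A @ V))

# On a diagonal matrix the optimum is the sum of the two largest entries.
A = np.diag([5.0, 3.0, 1.0])
V, t = trace_maximize(A, d=2)
```

Each dimension reduction method in the survey supplies a different A (a scatter matrix, a graph Laplacian, a Gram matrix); this eigen-step is the shared computational core.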