## Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment (2002)

### Download Links

- [pca.narod.ru]
- [arxiv.org]
- [www.cse.psu.edu]
- DBLP

### Other Repositories/Bibliography

Venue: SIAM Journal on Scientific Computing

Citations: 135 (8 self)

### BibTeX

@ARTICLE{Zhang02principalmanifolds,
  author  = {Zhenyue Zhang and Hongyuan Zha},
  title   = {Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment},
  journal = {SIAM Journal on Scientific Computing},
  year    = {2002},
  volume  = {26},
  pages   = {313--338}
}

### Abstract

We present a new algorithm for manifold learning and nonlinear dimension reduction. Given a set of unorganized data points sampled, possibly with noise, from a manifold, the algorithm learns the local geometry of the manifold by constructing a local tangent space at each data point; these tangent subspaces are then aligned to give the internal global coordinates of the data points with respect to the underlying manifold. We also present a careful error analysis of the algorithm and show that the reconstruction errors are of second-order accuracy. We illustrate the algorithm using curves and surfaces in both 2D/3D and higher-dimensional Euclidean spaces, and we address several theoretical and algorithmic issues for further research and improvement.

### Citations

2043 |
The Elements of Statistical Learning
- HASTIE, TIBSHIRANI, et al.
- 2001
Citation Context: ...Traditional dimension reduction techniques such as principal component analysis and factor analysis usually work well when the data points lie close to a linear (affine) subspace in the input space [7]. They cannot, in general, discover nonlinear structures embedded in the set of data points. Recently, there has been much renewed interest in developing efficient algorithms for constructing nonli...
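The contrast drawn in this context can be made concrete: classical PCA projects centered data onto its top-d principal directions, which is exact when the data lie in an affine subspace but cannot unroll a curved manifold. A minimal numpy sketch (illustrative, not taken from the paper):

```python
import numpy as np

def pca(X, d):
    """Classical PCA: project centered data onto its top-d principal directions."""
    Xc = X - X.mean(0)                       # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T                     # (N, d) linear coordinates
```

For data lying exactly in a d-dimensional subspace, this projection preserves all of the (centered) variance; for a nonlinear manifold, no linear projection can.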

1682 |
A Global Geometric Framework for Nonlinear Dimensionality Reduction
- Tenenbaum, Silva, et al.
- 2000
Citation Context: ...ional nonlinear manifold. Discovering the structure of the manifold from a set of data points sampled from the manifold possibly with noise represents a very challenging unsupervised learning problem [2, 3, 4, 8, 9, 10, 13, 14, 15, 17, 18]. The discovered low-dimensional structures can be further used for classification, clustering, outlier detection and data visualization. Example low-dimensional manifolds embedded in high-dimensional...

465 | Functional Data Analysis
- Ramsay, Silverman
- 1997
Citation Context: ...ional nonlinear manifold. Discovering the structure of the manifold from a set of data points sampled from the manifold possibly with noise represents a very challenging unsupervised learning problem [2, 3, 4, 8, 9, 10, 13, 14, 15, 17, 18]. The discovered low-dimensional structures can be further used for classification, clustering, outlier detection and data visualization. Example low-dimensional manifolds embedded in high-dimensional...

330 |
Modern Applied Statistics with S-PLUS
- Venables, Ripley
- 1999
Citation Context: ...i=1 to construct the principal manifold underlying the set of points xi. Here each of the component functions fj(τ) can be constructed separately; for example, we have used the simple loess function [19] in some of our experiments for generating the principal manifolds. In general, when the low-dimensional coordinates τi are available, we can construct a mapping from the τ-space (feature space) to t...
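The component-wise construction described in this context can be sketched as follows. This is an illustrative stand-in assuming a one-dimensional feature space; a polynomial least-squares fit replaces the loess smoother the authors used:

```python
import numpy as np

def fit_component_maps(tau, X, deg=3):
    """Fit each coordinate map f_j : tau -> x_j separately (1-D tau).
    Polynomial least squares stands in for the loess smoother here."""
    return [np.polynomial.Polynomial.fit(tau, X[:, j], deg)
            for j in range(X.shape[1])]

def evaluate_manifold(maps, tau_new):
    """Map feature-space points tau_new back into the input space."""
    return np.stack([f(tau_new) for f in maps], axis=1)
```

Because each component is fitted independently, any scalar smoother (loess, splines, kernel regression) can be dropped in per coordinate.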

292 |
Principal curves
- Hastie, Stuetzle
- 1989
Citation Context: ...s into a low-dimensional space that best preserves the geodesic distances. Another line of research follows the long tradition starting with self-organizing maps (SOM) [10], principal curves/surfaces [6] and topology-preserving networks [11]. The key idea is that the information about the global structure of a nonlinear manifold can be obtained from a careful analysis of the interactions of the overl...

252 | Think globally, fit locally: Unsupervised learning of low dimensional manifolds
- Saul, Roweis
Citation Context: ...riant to translations and orthogonal transformations in a neighborhood of each data point and seeks to project the data points into a low-dimensional space that best preserves those local geometries [14, 16]. Our approach draws inspiration from and improves upon the work in [14, 16], which opens up new directions in nonlinear manifold learning with many fundamental problems requiring to be further investi...

178 |
Topology representing networks
- Martinetz, Schulten
- 1994
Citation Context: ...est preserves the geodesic distances. Another line of research follows the long tradition starting with self-organizing maps (SOM) [10], principal curves/surfaces [6] and topology-preserving networks [11]. The key idea is that the information about the global structure of a nonlinear manifold can be obtained from a careful analysis of the interactions of the overlapping local structures. In particular...

118 | Stochastic neighbor embedding
- Hinton, Roweis
- 2003
Citation Context: ...ional nonlinear manifold. Discovering the structure of the manifold from a set of data points sampled from the manifold possibly with noise represents a very challenging unsupervised learning problem [2, 3, 4, 8, 9, 10, 13, 14, 15, 17, 18]. The discovered low-dimensional structures can be further used for classification, clustering, outlier detection and data visualization. Example low-dimensional manifolds embedded in high-dimensional...

76 | Global coordination of local linear models - Roweis, Saul, et al. - 2002

47 | Deriving action and behavior primitives from human motion data
- Jenkins, Mataric
- 2002

38 |
When does isomap recover the natural parameterization of families of articulated images
- Donoho, Grimes
- 2002

26 | Grouping and dimensionality reduction by locally linear embedding
- Perona, Polito
- 2002
Citation Context: ...o extract the global coordinate information needs more careful analysis of the eigenvector matrix of B and various models of the noise. Some preliminary results on this problem have been presented in [12]. 2. The selection of the set of points to estimate the local tangent space is very crucial to the success of the algorithm. Ideally, we want this set of points to be close to the tangent space. Howev...

13 |
Automatic Alignment of Hidden Representations
- Teh, Roweis
- 2002

11 | Efficient simplicial reconstructions of manifolds from their samples
- Freedman

10 |
Nonlinear dimension reduction by locally linear embedding, Science 290
- Roweis, Saul
- 2000

3 |
Self-organizing Maps. Springer-Verlag, 3rd Edition
- Kohonen
- 2000

2 |
Local ISOMAP perfectly recovers the underlying parametrization for families of occluded/lacunary images
- Donoho, Grimes
- 2003

1 |
Laplacian eigenmaps for dimension reduction and data representation
- Belkin, Niyogi
- 2001
Citation Context: ...as a means for clustering the data points xi's. The situation is illustrated by Figure 6. The data set consists of three bivariate Gaussians with covariance matrices 0.2I2 and mean vectors located at [1, 1], [1, −1], [−1, 0]. There are 100 sample points from each Gaussian. [Figure 6: side-by-side embedding panels labeled "lle" and "ltsa"]
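The toy data set described in this context can be reproduced (up to random sampling) with a few lines of numpy; the random seed is an arbitrary choice for reproducibility:

```python
import numpy as np

rng = np.random.default_rng(0)
# three bivariate Gaussians: covariance 0.2 * I_2, 100 samples each
means = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 0.0]])
X = np.vstack([rng.multivariate_normal(m, 0.2 * np.eye(2), size=100)
               for m in means])
labels = np.repeat(np.arange(3), 100)        # cluster membership, (300,)
```

With 100 samples per cluster, each sample mean lies close to its true mean (the standard error per coordinate is about 0.045), which is what makes the three clusters separable in the embeddings of Figure 6.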