## Charting a Manifold (2003)

Venue: Advances in Neural Information Processing Systems 15

Citations: 170 (7 self)

### BibTeX

@INPROCEEDINGS{Brand03chartinga,
  author = {Matthew Brand},
  title = {Charting a Manifold},
  booktitle = {Advances in Neural Information Processing Systems 15},
  year = {2003},
  pages = {961--968},
  publisher = {MIT Press}
}

### Abstract

In this paper we use $m_i(j) \propto \mathcal{N}(y_j;\, y_i, \sigma)$, with the scale parameter $\sigma$ specifying the expected size of a neighborhood on the manifold in sample space. A reasonable choice is $\sigma = r/2$, so that $\mathrm{erf}(2) > 99.5\%$ of the density of $m_i(j)$ is contained in the area around $y_i$ where the manifold is expected to be locally linear. With uniform $p_i$ and $\mu_i$, and $m_i(j)$ held fixed, the MAP estimates of the GMM covariances are

$$\Sigma_i = \frac{\sum_j m_i(j)\left[(y_j-\mu_i)(y_j-\mu_i)^\top + (\mu_j-\mu_i)(\mu_j-\mu_i)^\top + \Sigma_j\right]}{\sum_j m_i(j)}. \qquad (3)$$

Note that each covariance $\Sigma_i$ depends on all the other $\Sigma_j$. The MAP estimators for all covariances can be arranged into a set of fully constrained linear equations and solved exactly for their mutually optimal values. This key step brings nonlocal information about the manifold's shape into the local description of each neighborhood, ensuring that adjoining neighborhoods have similar covariances and small angles between their respective subspaces. Even if a local subset of data points is dense in a direction perpendicular to the manifold, the prior encourages the local chart to orient parallel to the manifold as part of a globally optimal solution, protecting against a pathology noted in [8]. Equation (3) is easily adapted to give a reduced number of charts and/or charts centered on local centroids.

**4. Connecting the charts.** We now build a connection for any set of charts specified as an arbitrary nondegenerate GMM. A GMM gives a soft partitioning of the dataset into neighborhoods of mean $\mu_k$ and covariance $\Sigma_k$. The optimal variance-preserving low-dimensional coordinate system for each neighborhood derives from its weighted principal component analysis, which is exactly specified by the eigenvectors of its covariance matrix: eigendecompose $V_k \Lambda_k V_k^\top \leftarrow \Sigma_k$ with …
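The two local ingredients in this excerpt, the Gaussian neighborhood weights $m_i(j)$ and the weighted-PCA chart axes from the eigendecomposition of $\Sigma_i$, can be sketched numerically. This is an illustrative sketch, not the paper's solver: it evaluates equation (3) once as a fixed-point step (with $\mu_i = y_i$, so the two outer-product terms coincide), whereas the paper solves all $\Sigma_i$ jointly as a linear system. Function names are mine.

```python
import numpy as np

def neighborhood_weights(Y, sigma):
    """m_i(j) proportional to N(y_j; y_i, sigma^2 I), rows normalized."""
    d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    return W / W.sum(axis=1, keepdims=True)

def map_covariances_once(Y, W, Sigma_prev):
    """One evaluation of equation (3) with mu_i = y_i and mu_j = y_j,
    so the two outer products coincide (hence the factor 2).  Because W's
    rows are normalized, the division by sum_j m_i(j) is built in.  The
    paper instead solves all Sigma_i jointly; this single pass is only an
    illustrative fixed-point step."""
    n, D = Y.shape
    Sigma = np.zeros((n, D, D))
    for i in range(n):
        diff = Y - Y[i]                             # y_j - mu_i
        outer = np.einsum('jd,je->jde', diff, diff)
        Sigma[i] = np.einsum('j,jde->de', W[i], 2 * outer + Sigma_prev)
    return Sigma

def chart_axes(Sigma_i, d):
    """Weighted-PCA chart coordinate system: top-d eigenvectors of Sigma_i."""
    lam, V = np.linalg.eigh(Sigma_i)                # ascending eigenvalues
    return V[:, ::-1][:, :d]                        # descending order, top d
```

For points sampled along a line in 3-D, the dominant chart axis recovered this way aligns with the line direction, which is the behavior the prior is designed to encourage.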

### Citations

1792 | A global geometric framework for nonlinear dimensionality reduction
- Tenenbaum, Silva, et al.
- 2000

Citation Context: "…es aim for embeddings: Gomes and Mojsilovic [4] treat manifold completion as an anisotropic diffusion problem, iteratively expanding points until they connect to their neighbors. The ISOMAP algorithm [12] represents remote distances as sums of a trusted set of distances between immediate neighbors, then uses multidimensional scaling to compute a low-dimensional embedding that minimally distorts all di…"
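The ISOMAP recipe summarized in this snippet (trust only distances between immediate neighbors, sum them into geodesic estimates via shortest paths, then embed with classical multidimensional scaling) can be sketched compactly. This is an illustrative reimplementation of the idea, not the authors' code; names and parameters are mine.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import shortest_path

def isomap_sketch(Y, n_neighbors, d):
    """Sketch of the ISOMAP idea: geodesics from a trusted neighbor graph,
    then classical MDS on the geodesic distance matrix."""
    D = squareform(pdist(Y))
    n = len(Y)
    G = np.full((n, n), np.inf)          # inf entries = no edge (dense graph)
    for i in range(n):
        idx = np.argsort(D[i])[1:n_neighbors + 1]   # the "trusted set"
        G[i, idx] = D[i, idx]
    geo = shortest_path(G, method='D', directed=False)  # geodesic estimates
    # classical MDS: double-center the squared distances, then eigendecompose
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (geo ** 2) @ J
    lam, V = np.linalg.eigh(B)
    lam, V = lam[::-1][:d], V[:, ::-1][:, :d]
    return V * np.sqrt(np.maximum(lam, 0))
```

On points along a half-circle arc, the 1-D embedding spreads the endpoints by roughly the arc length (about 3.14) rather than the chord length (2.0), which is the distortion ISOMAP is built to avoid.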

1725 | Nonlinear dimensionality reduction by locally linear embedding
- Roweis, Saul
- 2000

Citation Context: "…wer points, higher noise levels, no possibility of an isometric mapping, and uneven sampling, this is arguably a much more challenging problem than the “swiss roll” and “s-curve” problems featured in [12, 9, 8, 1]. Figure 2 LEFT contrasts the (unique) output of charting and the best outputs obtained from ISOMAP and LLE (considering all neighborhood sizes between 2 and 20 points). ISOMAP and LLE show catastrophi…"
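For reference, the LLE baseline named in this comparison can be sketched as well: reconstruct each point from its nearest neighbors, then find low-dimensional coordinates that preserve those reconstruction weights. This is an illustrative sketch of the published idea, not Roweis and Saul's code; the regularization constant is my choice.

```python
import numpy as np

def lle_sketch(Y, n_neighbors, d, reg=1e-3):
    """Sketch of locally linear embedding: per-point reconstruction weights,
    then the bottom non-constant eigenvectors of (I-W)^T (I-W)."""
    n = len(Y)
    D = ((Y[:, None] - Y[None, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[1:n_neighbors + 1]
        Z = Y[idx] - Y[i]                       # neighbors centered on y_i
        C = Z @ Z.T
        C += reg * np.trace(C) * np.eye(len(idx))   # regularize near-singular C
        w = np.linalg.solve(C, np.ones(len(idx)))
        W[i, idx] = w / w.sum()                 # weights sum to one
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    lam, V = np.linalg.eigh(M)
    return V[:, 1:d + 1]                        # skip the constant eigenvector
```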

318 | Principal curves
- Hastie, Stuetzle
- 1989

Citation Context: "…and Cottrell [3] proposed using auto-encoding neural networks with a hidden layer “bottleneck,” effectively casting dimensionality reduction as a compression problem. Hastie defined principal curves [5] as nonparametric 1D curves that pass through the center of “nearby” data points. A rich literature has grown up around properly regularizing this approach and extending it to surfaces. Smola and coll…"

111 | Non-linear dimensionality reduction
- DeMers, Cottrell
- 1993

Citation Context: "…neutral NLDR algorithms can be divided into those that compute mappings, and those that directly compute low-dimensional embeddings. The field has its roots in mapping algorithms: DeMers and Cottrell [3] proposed using auto-encoding neural networks with a hidden layer “bottleneck,” effectively casting dimensionality reduction as a compression problem. Hastie defined principal curves [5] as nonparamet…"
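The bottleneck idea in this snippet can be illustrated with a toy autoencoder. This is a deliberate simplification: a linear encoder/decoder trained by gradient descent, which can only recover a PCA-like subspace, whereas the cited work uses nonlinear hidden layers. All names and hyperparameters are mine.

```python
import numpy as np

def train_linear_autoencoder(Y, d, lr=0.05, steps=3000, seed=0):
    """Toy bottleneck autoencoder: compress D-dim samples through a d-dim
    code and train encoder/decoder by gradient descent on squared error."""
    rng = np.random.default_rng(seed)
    n, D = Y.shape
    E = rng.normal(scale=0.1, size=(D, d))      # encoder weights
    Dec = rng.normal(scale=0.1, size=(d, D))    # decoder weights
    for _ in range(steps):
        H = Y @ E                 # bottleneck codes
        R = H @ Dec               # reconstructions
        G = 2 * (R - Y) / n       # gradient of mean squared error w.r.t. R
        Dec -= lr * H.T @ G       # chain rule through the decoder
        E -= lr * Y.T @ (G @ Dec.T)   # chain rule through the encoder
    return E, Dec
```

Trained on data lying on a 2-D plane inside 3-D space, the 2-unit bottleneck reconstructs the data far better than the trivial all-zero reconstruction, which is the "compression" framing the snippet describes.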

108 | Dimension reduction by local principal component analysis
- Kambhatla, Leen
- 1997

Citation Context: "…ortions in the mapping, and gave an expectation-maximization (EM) training rule. Innovative use of variational methods highlighted the difficulty of even hill-climbing their multimodal posterior. Like [2, 7, 6, 8], the method we develop below is a decomposition of the manifold into locally linear neighborhoods. It bears closest relation to global coordination [8], although by a different construction of the pr…"

66 | The Isomap algorithm and topological stability
- Balasubramanian, Schwartz

Citation Context: "…point-to-point relationships that has the lowest signal-to-noise ratio; small changes to the trusted set can induce large changes in the set of constraints on the embedding, making solutions unstable [1]. In a return to mapping, Roweis and colleagues [8] proposed global coordination—learning a mixture of locally linear projections from sample to coordinate space. They constructed a posterior that pen…"

54 | Nonlinear Image Interpolation using Manifold Learning
- Bregler, Omohundro

Citation Context: "…ortions in the mapping, and gave an expectation-maximization (EM) training rule. Innovative use of variational methods highlighted the difficulty of even hill-climbing their multimodal posterior. Like [2, 7, 6, 8], the method we develop below is a decomposition of the manifold into locally linear neighborhoods. It bears closest relation to global coordination [8], although by a different construction of the pr…"

34 | Regularized principal manifolds
- Smola, Mika, et al.
- 2001

Citation Context: "…parametric 1D curves that pass through the center of “nearby” data points. A rich literature has grown up around properly regularizing this approach and extending it to surfaces. Smola and colleagues [10] analyzed the NLDR problem in the broader framework of regularized quantization methods. More recent advances aim for embeddings: Gomes and Mojsilovic [4] treat manifold completion as an anisotropic d…"

14 | Automatic alignment of hidden representations
- Teh, Roweis

Citation Context: "…factors/PCAs density model; thus large eigenproblems can be avoided by connecting just a small number of charts that cover the data. We thank reviewers for calling our attention to Teh & Roweis ([11], in this volume), which shows how to connect a set of given local dimensionality reducers in a generalized eigenvalue problem that is related to equation (8). …"

10 | A variational approach to recovering a manifold from sample points
- Gomes, Mojsilovic
- 2002

Citation Context: "…tending it to surfaces. Smola and colleagues [10] analyzed the NLDR problem in the broader framework of regularized quantization methods. More recent advances aim for embeddings: Gomes and Mojsilovic [4] treat manifold completion as an anisotropic diffusion problem, iteratively expanding points until they connect to their neighbors. The ISOMAP algorithm [12] represents remote distances as sums of a t…"

5 | Modeling the manifolds of handwritten digits
- Hinton, Dayan, et al.
- 1997

Citation Context: "…ortions in the mapping, and gave an expectation-maximization (EM) training rule. Innovative use of variational methods highlighted the difficulty of even hill-climbing their multimodal posterior. Like [2, 7, 6, 8], the method we develop below is a decomposition of the manifold into locally linear neighborhoods. It bears closest relation to global coordination [8], although by a different construction of the pr…"

2 | Global coordination of linear models
- Roweis, Saul, et al.
- 2002

Citation Context: "…gnal-to-noise ratio; small changes to the trusted set can induce large changes in the set of constraints on the embedding, making solutions unstable [1]. In a return to mapping, Roweis and colleagues [8] proposed global coordination—learning a mixture of locally linear projections from sample to coordinate space. They constructed a posterior that penalizes distortions in the mapping, and gave an expec…"
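The phrase "mixture of locally linear projections" in this snippet has a concrete form that is easy to show: each chart maps a sample into its local coordinates, and posterior responsibilities blend the per-chart guesses of the global coordinate. The sketch below illustrates only that blended-map form; the chart parameters (`axes`, `offsets`) are assumed given, and learning them jointly is the hard problem the cited work (and the charting paper itself) addresses. All names are mine.

```python
import numpy as np

def blended_coordinates(Y, centers, axes, offsets, sigma):
    """Mixture of locally linear projections (illustrative form):
    g(y) = sum_k p_k(y) * ((y - center_k) @ axes_k + offset_k),
    where p_k(y) are Gaussian responsibilities.  Here each chart's local
    frame is assumed already oriented like the global frame, a
    simplification of the general affine chart-to-global maps."""
    d2 = ((Y[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    P = np.exp(-d2 / (2 * sigma ** 2))
    P /= P.sum(axis=1, keepdims=True)           # responsibilities p_k(y)
    G = np.zeros((len(Y), axes.shape[2]))
    for k in range(len(centers)):
        local = (Y - centers[k]) @ axes[k]      # project into chart k
        G += P[:, k:k + 1] * (local + offsets[k])
    return G
```

With two charts along a line whose offsets are chosen consistently, the blended coordinate is continuous across the chart boundary, which is exactly the agreement-between-charts property that global coordination enforces during learning.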