## Locality Preserving Projections (2002)


Citations: 205 (15 self)

### BibTeX

@MISC{He02localitypreserving,
  author = {Xiaofei He and Partha Niyogi},
  title  = {Locality Preserving Projections},
  year   = {2002}
}

### Abstract

Many problems in information processing involve some form of dimensionality reduction. In this paper, we introduce Locality Preserving Projections (LPP). These are linear projective maps that arise by solving a variational problem that optimally preserves the neighborhood structure of the data set. LPP should be seen as an alternative to Principal Component Analysis (PCA) -- a classical linear technique that projects the data along the directions of maximal variance. When the high dimensional data lies on a low dimensional manifold embedded in the ambient space, the Locality Preserving Projections are obtained by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold. As a result, LPP shares many of the data representation properties of nonlinear techniques such as Laplacian Eigenmaps or Locally Linear Embedding. Yet LPP is linear and more crucially is defined everywhere in ambient space rather than just on the training data points. This is borne out by illustrative examples on some high dimensional data sets.
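The construction the abstract describes can be sketched concretely: build a nearest-neighbor graph on the data, weight edges with a heat kernel, form the graph Laplacian, and solve a generalized eigenproblem for the projection directions. The following Python sketch is an illustration under those assumptions, not the authors' reference implementation; the neighborhood size `k`, heat-kernel width `t`, and the small ridge term added for numerical stability are illustrative choices:

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, k=5, t=1.0):
    """Sketch of Locality Preserving Projections.

    X: (n_samples, n_features) data matrix.
    Returns a projection matrix A of shape (n_features, n_components);
    the embedding of the data is then X @ A.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbor adjacency with heat-kernel weights.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(sq[i])[1:k + 1]   # skip the point itself
        W[i, idx] = np.exp(-sq[i, idx] / t)
    W = np.maximum(W, W.T)                 # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                              # graph Laplacian
    # Generalized eigenproblem: X^T L X a = lambda X^T D X a.
    A_mat = X.T @ L @ X
    B_mat = X.T @ D @ X
    # Small ridge keeps B positive definite when X^T D X is singular.
    B_mat += 1e-9 * np.eye(B_mat.shape[0])
    vals, vecs = eigh(A_mat, B_mat)
    # Projection directions are the eigenvectors with smallest eigenvalues.
    return vecs[:, :n_components]
```

Because the result is an explicit linear map, a new test point `x` is embedded simply as `x @ A` — the out-of-sample property the abstract emphasizes over nonlinear methods.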

### Citations

2868 | UCI Repository of Machine Learning Databases - Merz, Murphy - 1996 |
> Context: "...principal direction obtained by PCA, while they are well separated in the principal direction obtained by LPP. 4.2. 2-D Data Visualization: An experiment was conducted with the Multiple Features Database [3]. This dataset consists of features of handwritten numbers ('0'-'9') extracted from a collection of Dutch utility maps. 200 patterns per class (for a total of 2,000 patterns) have been digitized..."

2791 | Eigenfaces for recognition - Turk, Pentland - 1991 |
> Context: "...path (linked by solid line), illustrating one particular mode of variability in pose. 4.4. Face Recognition: PCA and LDA are the two most widely used subspace learning techniques for face recognition [1][7]. These methods project the training sample faces to a low dimensional representation space where the recognition is carried out. The main supposition behind this procedure is that the face space..."

2721 | Indexing by Latent Semantic Analysis - Deerwester, Dumais, et al. - 1990 |

2590 | Normalized cuts and image segmentation - Shi, Malik - 2000 |

2534 | An Introduction to the Bootstrap - Efron, Tibshirani - 1993 |

2362 | Modern Information Retrieval - Baeza-Yates, Ribeiro-Neto - 1999 |

2014 | Principal Component Analysis - Jolliffe - 2002 |

1688 | A Global Geometric Framework for Nonlinear Dimensionality Reduction - Tenenbaum, Silva, et al. |
> Context: "...properties (1) and (2) above, we know of no other linear projective technique that has such a property. 4. LPP is defined everywhere. Recall that nonlinear dimensionality reduction techniques like ISOMAP [6], LLE [5], Laplacian eigenmaps [2] are defined only on the training data points and it is unclear how to evaluate the map for new test points. In contrast, the Locality Preserving Projection may be simply..."

1614 | Nonlinear Dimensionality Reduction by Locally Linear Embedding - Roweis, Saul |
> Context: "...(1) and (2) above, we know of no other linear projective technique that has such a property. 4. LPP is defined everywhere. Recall that nonlinear dimensionality reduction techniques like ISOMAP [6], LLE [5], Laplacian eigenmaps [2] are defined only on the training data points and it is unclear how to evaluate the map for new test points. In contrast, the Locality Preserving Projection may be simply applied..."

1503 | Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection - Belhumeur, Hespanha, et al. - 1997 |
> Context: "...path (linked by solid line), illustrating one particular mode of variability in pose. 4.4. Face Recognition: PCA and LDA are the two most widely used subspace learning techniques for face recognition [1][7]. These methods project the training sample faces to a low dimensional representation space where the recognition is carried out. The main supposition behind this procedure is that the face space..."

1097 | On spectral clustering: Analysis and an algorithm - Ng, Jordan, et al. - 2001 |

1048 | Nonlinear component analysis as a kernel eigenvalue problem - Schölkopf, Smola, et al. - 1998 |

1031 | Wrappers for feature subset selection - Kohavi, John - 1997 |

989 | Learning the parts of objects by non-negative matrix factorization - Lee, Seung - 1999 |

958 | Visual Learning and Recognition of 3-D Objects from Appearance - Murase, Nayar - 1995 |

941 | Face recognition using eigenfaces - Turk, Pentland - 1991 |

863 | The Elements of Statistical Learning - Hastie, Tibshirani, et al. - 2009 |

785 | Probabilistic latent semantic indexing - Hofmann - 1999 |

563 | Probabilistic visual learning for object recognition - Moghaddam, Pentland - 1997 |

438 | Newsweeder: Learning to filter netnews - Lang - 1995 |

424 | Laplacian eigenmaps and spectral techniques for embedding and clustering - Belkin, Niyogi |
> Context: "...tion in a certain sense. The representation map generated by the algorithm may be viewed as a linear discrete approximation to a continuous map that naturally arises from the geometry of the manifold [2]. The new algorithm is interesting from a number of perspectives. 1. The maps are designed to minimize a different objective criterion from the classical linear techniques. 2. The locality preserving..."

319 | Segmentation using eigenvectors: A unifying view - Weiss - 1999 |

289 | PCA versus LDA - Martinez, Kak |

254 | Think globally, fit locally: unsupervised learning of low dimensional manifolds - Saul, Roweis |

254 | The Jackknife and Bootstrap - Shao, Tu |

249 | Latent semantic indexing: a probabilistic analysis - Papadimitriou, Raghavan, et al. |

223 | Personalized Information Delivery: An Analysis of Information Filtering Methods - Foltz, Dumais - 1992 |

223 | Face recognition by elastic bunch graph matching - Wiskott, Fellous, et al. - 1997 |

218 | Generalized discriminant analysis using a kernel approach - Baudat, Anouar - 2000 |

204 | Feature selection for SVMs - Weston, Mukherjee, et al. |

186 | Face Recognition Using Laplacianfaces - He, Yan, et al. - 2005 |

178 | Document clustering based on non-negative matrix factorization - Xu, Liu, et al. - 2003 |

170 | Two algorithms for nearest-neighbor search in high dimensions - Kleinberg - 1997 |

161 | Charting a manifold - Brand - 2003 |

153 | Two-dimensional PCA: a new approach to appearance-based face representation and recognition - Yang, Zhang - 2004 |

137 | Random Projection in Dimensionality Reduction: Applications to Image and Text - Bingham, Mannila - 2001 |

120 | Kernel eigenfaces vs. kernel Fisherfaces: Face recognition using kernel methods - Yang |

111 | Introduction to smooth manifolds - Lee - 2000 |

111 | Video-based face recognition using probabilistic appearance manifolds - Lee, Ho, et al. - 2003 |

109 | Spectral Graph Theory, volume 92 - Chung - 1994 |

98 | Subspace Linear Discriminant Analysis for Face Recognition - Zhao, Chellappa, et al. - 1999 |

94 | Out-of-sample extensions for LLE, Isomap, MDS, eigenmaps, and spectral clustering. Advances in Neural Information Processing Systems - Bengio, Paiement, et al. |

92 | Support Vector Machines Applied to Face Recognition - Phillips - 1999 |

90 | Some applications of Laplace eigenvalues of graphs, in Graph Symmetry: Algebraic Methods and Applications - Mohar - 1997 |

89 | An O(n log n) Algorithm for the All-Nearest-Neighbor Problem - Vaidya - 1989 |

88 | Nonlinear Dimensionality Reduction by Locally Linear Embedding - Roweis, Saul - 2000 |

87 | Experiments with random projection - Dasgupta - 2000 |

79 | Matching Theory, Akadémiai Kiadó - Lovász, Plummer - 1986 |

77 | Global coordination of local linear models - Roweis, Saul, et al. |

76 | A semidiscrete matrix decomposition for latent semantic indexing information retrieval - Kolda, O’Leary - 1998 |