## On Spectral Clustering: Analysis and an algorithm (2001)

Venue: Advances in Neural Information Processing Systems

Citations: 1173 (14 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Ng01onspectral,
  author    = {Andrew Y. Ng and Michael I. Jordan and Yair Weiss},
  title     = {On Spectral Clustering: Analysis and an algorithm},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2001},
  pages     = {849--856},
  publisher = {MIT Press}
}
```

### Abstract

Despite many empirical successes of spectral clustering methods -- algorithms that cluster points using eigenvectors of matrices derived from the distances between the points -- there are several unresolved issues. First, there is a wide variety of algorithms that use the eigenvectors in slightly different ways. Second, many of these algorithms have no proof that they will actually compute a reasonable clustering. In this paper, we present a simple spectral clustering algorithm that can be implemented using a few lines of Matlab. Using tools from matrix perturbation theory, we analyze the algorithm, and give conditions under which it can be expected to do well. We also show surprisingly good experimental results on a number of challenging clustering problems.
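The "few lines of Matlab" the abstract mentions can be sketched in NumPy as a stand-in. The function name, the σ handling, and the minimal farthest-point-initialized k-means are illustrative choices for this sketch, not details taken from the paper:

```python
import numpy as np

def njw_spectral_clustering(points, k, sigma=1.0, n_iter=50):
    """Sketch of the Ng-Jordan-Weiss spectral clustering steps."""
    # 1. Affinity matrix A with a Gaussian kernel; A_ii = 0 by convention.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    A = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    # 2. Normalized affinity L = D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    L = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    # 3. X = matrix whose columns are the k largest eigenvectors of L.
    _, V = np.linalg.eigh(L)          # eigenvalues come back ascending
    X = V[:, -k:]
    # 4. Y = X with each row renormalized to unit length.
    Y = X / np.linalg.norm(X, axis=1, keepdims=True)
    # 5. Cluster the rows of Y. A minimal Lloyd's k-means with
    #    farthest-point initialization stands in for any k-means here.
    centers = Y[[0]]
    for _ in range(k - 1):
        dists = ((Y[:, None, :] - centers[None, :, :]) ** 2).sum(-1).min(1)
        centers = np.vstack([centers, Y[dists.argmax()]])
    for _ in range(n_iter):
        labels = ((Y[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        centers = np.vstack([Y[labels == j].mean(0) if (labels == j).any()
                             else centers[j] for j in range(k)])
    return labels
```

On two well-separated Gaussian blobs this recovers the blobs exactly, since the rows of Y for the two clusters concentrate near two orthogonal unit vectors.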

### Citations

1125 | Nonlinear component analysis as a kernel eigenvalue problem
- Scholkopf, Smola, et al.
- 1998
Citation Context: ...There are some intriguing similarities between spectral clustering methods and Kernel PCA, which has been empirically observed to perform clustering [7, 2]. The main difference between the first steps of our algorithm and Kernel PCA with a Gaussian kernel is the normalization of A (to form L) and X. These normalizations do improve the performance of the a...

661 | Matrix Perturbation Theory
- STEWART, SUN
- 1990
Citation Context: ...ws of Y to cluster similarly to the rows of Ŷ? Specifically, when will the eigenvectors of L, which we now view as a perturbed version of L̂, be "close" to those of L̂? Matrix perturbation theory [10] indicates that the stability of the eigenvectors of a matrix is determined by the eigengap. More precisely, the subspace spanned by L̂'s first 3 eigenvectors will be stable to small changes to L̂ if ...
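The eigengap argument in this context is easy to check numerically. The sketch below uses an illustrative construction of my own (not the paper's): an ideal block-constant L̂ whose gap below the top-3 eigenvalues is large, perturbed by a small symmetric E, with the subspace shift measured via projector difference:

```python
import numpy as np

# Ideal 3-cluster affinity: block-constant, so the top-3 eigenvalues are
# all 5 and the rest are 0 -- an eigengap of 5 below the invariant subspace.
L_hat = np.kron(np.eye(3), np.ones((5, 5)))
_, V = np.linalg.eigh(L_hat)
U = V[:, -3:]                                  # top-3 eigenvector subspace

# Small symmetric perturbation E, playing the role of L = L_hat + E.
rng = np.random.default_rng(0)
E = rng.normal(0.0, 0.01, (15, 15))
E = (E + E.T) / 2.0
_, V_pert = np.linalg.eigh(L_hat + E)
U_pert = V_pert[:, -3:]

# Distance between subspaces: spectral norm of the projector difference.
# Perturbation theory bounds this by roughly ||E|| / eigengap.
subspace_shift = np.linalg.norm(U @ U.T - U_pert @ U_pert.T, 2)
```

With ‖E‖ on the order of 0.05 and an eigengap of 5, the top-3 subspace barely moves, which is exactly the stability the quoted passage relies on.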

331 | Segmentation using eigenvectors: A unifying view
- Weiss
- 1999
Citation Context: ...including computer vision and VLSI design [5, 1]. But despite their empirical successes, different authors still disagree on exactly which eigenvectors to use and how to derive clusters from them (see [11] for a review). Also, the analysis of these algorithms, which we briefly review below, has tended to focus on simplified algorithms that only use one eigenvector at a time. One line of analysis makes th...

313 | Contour and Texture Analysis for Image Segmentation
- Malik, Belongie, et al.
- 2001
Citation Context: ...g. Here, one uses the top eigenvectors of a matrix derived from the distance between points. Such algorithms have been successfully used in many applications including computer vision and VLSI design [5, 1]. But despite their empirical successes, different authors still disagree on exactly which eigenvectors to use and how to derive clusters from them (see [11] for a review). Also, the analysis of these ...

109 | Learning segmentation by random walks
- Meila, Shi
- 2000
Citation Context: ...perimentally it has been observed that using more eigenvectors and directly computing a k-way partitioning is better (e.g. [5, 1]). Here, we build upon the recent work of Weiss [11] and Meila and Shi [6], who analyzed algorithms that use k eigenvectors simultaneously in simple settings. We propose a particular manner to use the k eigenvectors simultaneously, and give conditions under which the algori...

93 | Spectral Graph Theory. Number 92
- Chung
- 1997
Citation Context: ..., in which the second eigenvector of a graph's Laplacian is used to define a semi-optimal cut. Here, the eigenvector is seen as solving a relaxation of an NP-hard discrete graph partitioning problem [3], and it can be shown that cuts based on the second eigenvector give a guaranteed approximation to the optimal cut [9, 3]. This analysis can be extended to clustering by building a weighted graph in w...
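The second-eigenvector cut described in this context can be illustrated directly. The toy graph below is my own construction, not from the cited work: two cliques joined by a single edge, where the sign pattern of the Laplacian's second eigenvector (the Fiedler vector) recovers the two sides of the natural cut:

```python
import numpy as np

# Two 4-cliques (nodes 0-3 and 4-7) joined by one bridging edge.
A = np.kron(np.eye(2), np.ones((4, 4))) - np.eye(8)
A[0, 4] = A[4, 0] = 1.0
Lap = np.diag(A.sum(axis=1)) - A        # unnormalized graph Laplacian

# Second-smallest eigenvector of the Laplacian (Fiedler vector); its
# sign pattern is the relaxed solution to the discrete partitioning problem.
_, V = np.linalg.eigh(Lap)
fiedler = V[:, 1]
side = fiedler > 0
```

Thresholding the Fiedler vector at zero places nodes 0-3 on one side and 4-7 on the other, i.e. it cuts exactly the single bridging edge.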

41 | Feature grouping by 'relocalisation' of eigenvectors of the proximity matrix
- Scott
- 1990
Citation Context: ..., they form tight clusters (Figure 1h) from which our method obtains the good clustering shown in Figure 1e. We note that the clusters in Figure 1h lie at 90° to each other relative to the origin (cf. [8]). 1 Readers familiar with spectral graph theory [3] may be more familiar with the Laplacian I − L. But as replacing L with I − L would complicate our later discussion, and only changes the eigenvalues (f...

23 | On clusterings: good, bad and spectral
- Kannan, Vempala, et al.
Citation Context: ...the performance of the algorithm, but it is also straightforward to extend our analysis to prove conditions under which Kernel PCA will indeed give clustering. While different in detail, Kannan et al. [4] give an analysis of spectral clustering that also makes use of matrix perturbation theory, for the case of an affinity matrix with row sums equal to one. They also present a clustering algorithm based o...

10 | Spectral Partitioning: The More Eigenvectors, The Better
- Alpert, Yao
- 1995
Citation Context: ...g. Here, one uses the top eigenvectors of a matrix derived from the distance between points. Such algorithms have been successfully used in many applications including computer vision and VLSI design [5, 1]. But despite their empirical successes, different authors still disagree on exactly which eigenvectors to use and how to derive clusters from them (see [11] for a review). Also, the analysis of these ...

10 | Spectral partitioning works: planar graphs and finite element meshes
- Spielman, Teng
- 1996
Citation Context: ...seen as solving a relaxation of an NP-hard discrete graph partitioning problem [3], and it can be shown that cuts based on the second eigenvector give a guaranteed approximation to the optimal cut [9, 3]. This analysis can be extended to clustering by building a weighted graph in which nodes correspond to datapoints and edges are related to the distance between the points. Since the majority of analy...

7 | Spectral kernel methods for clustering
- Christianini, Shawe-Taylor, et al.
- 2002
Citation Context: ...There are some intriguing similarities between spectral clustering methods and Kernel PCA, which has been empirically observed to perform clustering [7, 2]. The main difference between the first steps of our algorithm and Kernel PCA with a Gaussian kernel is the normalization of A (to form L) and X. These normalizations do improve the performance of the a...