Results 11–20 of 317
Some Applications of Laplace Eigenvalues of Graphs
Graph Symmetry: Algebraic Methods and Applications, Volume 497 of NATO ASI Series C
, 1997
Abstract

Cited by 129 (0 self)
In the last decade important relations between Laplace eigenvalues and eigenvectors of graphs and several other graph parameters were discovered. In these notes we present some of these results and discuss their consequences. Attention is given to the partition and the isoperimetric properties of graphs, the max-cut problem and its relation to semidefinite programming, rapid mixing of Markov chains, and to extensions of the results to infinite graphs.
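The link between Laplacian eigenvalues and graph partitions that these notes survey can be sketched in a few lines. This is a toy example of our own (not from the paper): the eigenvector of the second-smallest Laplacian eigenvalue, the Fiedler vector, suggests a two-way partition through its sign pattern.

```python
import numpy as np

# Toy graph: two triangles joined by a single bridge edge (hypothetical example).
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

# Graph Laplacian L = D - A: symmetric positive semidefinite,
# smallest eigenvalue 0 with the constant eigenvector.
L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)

# Fiedler vector: eigenvector of the second-smallest eigenvalue.
# Splitting vertices by its sign yields a partition that cuts few edges;
# here it separates the two triangles across the bridge.
fiedler = eigvecs[:, 1]
side = fiedler >= 0
```

On this graph the sign pattern recovers the two triangles exactly, cutting only the bridge edge.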
Spectral learning
 In IJCAI
, 2003
Abstract

Cited by 106 (5 self)
We present a simple, easily implemented spectral learning algorithm which applies equally whether we have no supervisory information, pairwise link constraints, or labeled examples. In the unsupervised case, it performs consistently with other spectral clustering algorithms. In the supervised case, our approach achieves high accuracy on the categorization of thousands of documents given only a few dozen labeled training documents for the 20 Newsgroups data set. Furthermore, its classification accuracy increases with the addition of unlabeled documents, demonstrating effective use of unlabeled data. By using normalized affinity matrices which are both symmetric and stochastic, we also obtain both a probabilistic interpretation of our method and certain guarantees of performance.
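A normalization that is both symmetric and stochastic, as the abstract requires, can be obtained by dividing a nonnegative symmetric affinity matrix by its maximum degree and placing the slack on the diagonal. The sketch below shows one such construction; it is not necessarily the paper's exact normalization.

```python
import numpy as np

def symmetric_stochastic(A):
    """Normalize a nonnegative symmetric affinity matrix A so the result
    is still symmetric and every row sums to 1: scale by the maximum row
    sum and absorb the remainder into the diagonal (one possible
    construction, given here as an illustration)."""
    A = np.asarray(A, dtype=float)
    deg = A.sum(axis=1)
    dmax = deg.max()
    return A / dmax + np.diag(1.0 - deg / dmax)

# Example affinity matrix with hypothetical similarity values.
W = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 0.5],
              [1.0, 0.5, 0.0]])
N = symmetric_stochastic(W)
```

Because N is symmetric with unit row sums, it can be read both as a random-walk transition matrix and as a symmetric operator with real eigenvalues, which is what enables the probabilistic interpretation the abstract mentions.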
Landscapes and Their Correlation Functions
, 1996
Abstract

Cited by 105 (16 self)
Fitness landscapes are an important concept in molecular evolution. Many important examples of landscapes in physics and combinatorial optimization, which are widely used as model landscapes in simulations of molecular evolution and adaptation, are "elementary", i.e., they are (up to an additive constant) eigenfunctions of a graph Laplacian. It is shown that elementary landscapes are characterized by their correlation functions. The correlation functions are in turn uniquely determined by the geometry of the underlying configuration space and the nearest neighbor correlation of the elementary landscape. Two types of correlation functions are investigated here: the correlation of a time series sampled along a random walk on the landscape and the correlation function with respect to a partition of the set of all vertex pairs.
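An elementary landscape can be exhibited concretely on the smallest interesting configuration space. In this toy example of our own (not taken from the paper), the parity function on the 3-cube is an exact eigenfunction of the graph Laplacian:

```python
import numpy as np
from itertools import product

# Configuration space: binary strings of length 3, adjacent when they
# differ in exactly one bit (the 3-dimensional hypercube).
verts = list(product([0, 1], repeat=3))
n = len(verts)
A = np.zeros((n, n))
for i, u in enumerate(verts):
    for j, v in enumerate(verts):
        if sum(a != b for a, b in zip(u, v)) == 1:
            A[i, j] = 1.0
L = np.diag(A.sum(axis=1)) - A

# The parity landscape f(x) = (-1)^(x1+x2+x3) flips sign across every
# edge, so L f = 3 f - (-3 f) = 6 f: an eigenfunction of the Laplacian,
# i.e. an elementary landscape with eigenvalue 6.
f = np.array([(-1.0) ** sum(v) for v in verts])
```

Adding any constant to f leaves it elementary in the paper's sense, since the constant vector lies in the Laplacian's kernel.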
The Principal Components Analysis of a Graph, and its Relationships to Spectral Clustering
 Proceedings of the 15th European Conference on Machine Learning (ECML 2004). Lecture Notes in Artificial Intelligence
, 2004
Abstract

Cited by 105 (20 self)
This work presents a novel procedure for computing (1) distances between nodes of a weighted, undirected, graph, called the Euclidean Commute Time Distance (ECTD), and (2) a subspace projection of the nodes of the graph that preserves as much variance as possible, in terms of the ECTD, a principal components analysis of the graph. It is based on a Markov-chain model of random walk through the graph. The model assigns transition probabilities to the links between nodes, so that a random walker can jump from node to node. A quantity, called the average commute time, computes the average time taken by a random walker for reaching node j when starting from node i, and coming back to node i. The square root of this quantity, the ECTD, is a distance measure between any two nodes, and has the nice property of decreasing when the number of paths connecting two nodes increases and when the "length" of any path decreases. The ECTD can be computed from the pseudoinverse of the Laplacian matrix of the graph, which is a kernel. We finally define the Principal Components Analysis (PCA) of a graph as the subspace projection that preserves as much variance as possible, in terms of the ECTD. This graph PCA has some interesting links with spectral graph theory, in particular spectral clustering.
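Given the Laplacian pseudoinverse, the ECTD computation described above reduces to a few lines. The sketch below assumes the standard form of the average commute time, n(i, j) = V (Lp[i, i] + Lp[j, j] - 2 Lp[i, j]), with V the sum of all degrees:

```python
import numpy as np

def ectd(A):
    """Euclidean Commute Time Distances of a weighted undirected graph.

    Average commute time: n(i, j) = V * (Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]),
    where V is the graph volume (sum of all degrees) and Lp is the
    Moore-Penrose pseudoinverse of the Laplacian; the ECTD is sqrt(n)."""
    A = np.asarray(A, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)
    V = A.sum()
    d = np.diag(Lp)
    n_ct = V * (d[:, None] + d[None, :] - 2.0 * Lp)
    return np.sqrt(np.maximum(n_ct, 0.0))  # clip tiny negatives from rounding

# Path graph 0 - 1 - 2: the average commute time is 4 between adjacent
# nodes (ECTD 2) and 8 between the endpoints (ECTD sqrt(8)).
P3 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
D = ectd(P3)
```

The monotonicity property quoted in the abstract is visible here: the endpoint distance sqrt(8) exceeds the neighbour distance 2 because the only connecting path is longer.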
Graph Laplacians and Stabilization of Vehicle Formations
, 2001
Abstract

Cited by 95 (7 self)
Control of vehicle formations has emerged as a topic of significant interest to the controls community. In this paper, we merge tools from graph theory and control theory to derive stability criteria for formation stabilization. The interconnection between vehicles (i.e., which vehicles are sensed by other vehicles) is modeled as a graph, and the eigenvalues of the Laplacian matrix of the graph are used in stating a Nyquist-like stability criterion for vehicle formations. The location of the Laplacian eigenvalues can be correlated to the graph structure, and therefore used to identify desirable and undesirable formation interconnection topologies.
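The role the Laplacian eigenvalues play in stability shows up already in the simplest consensus dynamics x' = -Lx. This is an illustrative toy model, not the paper's vehicle dynamics: the zero eigenvalue preserves the average state, and the positive eigenvalues set the rates at which disagreement decays.

```python
import numpy as np

# Interconnection graph: a 4-cycle (a hypothetical sensing topology).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Forward-Euler simulation of x' = -L x; the discrete step is stable
# when dt < 2 / lambda_max (here lambda_max = 4, so dt = 0.05 is safe).
x = np.array([1.0, 3.0, -2.0, 6.0])
dt = 0.05
for _ in range(2000):
    x = x - dt * (L @ x)
# Every state converges to the initial average, here 2.0.
```

The connectivity of the graph matters: if the graph were disconnected, L would have a repeated zero eigenvalue and the states would only agree within components.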
A Decentralized Algorithm for Spectral Analysis
, 2004
Abstract

Cited by 93 (1 self)
In many large network settings, such as computer networks, social networks, or hyperlinked text documents, much information can be obtained from the network’s spectral properties. However, traditional centralized approaches for computing eigenvectors struggle with at least two obstacles: the data may be difficult to obtain (both due to technical reasons and because of privacy concerns), and the sheer size of the networks makes the computation expensive. A decentralized, distributed algorithm addresses both of these obstacles: it utilizes the computational power of all nodes in the network and their ability to communicate, thus speeding up the computation with the network size. And as each node knows its incident edges, the data collection problem is avoided as well. Our main result is a simple decentralized algorithm for computing the top k eigenvectors of a symmetric weighted adjacency matrix, and a proof that it converges essentially in O(τmix log^2 n) rounds of communication and computation, where τmix is the mixing time of a random walk on the network. An additional contribution of our work is a decentralized way of actually detecting convergence, and diagnosing the current error. Our protocol scales well, in that the amount of computation performed at any node in any one round, and the sizes of messages sent, depend polynomially on k, but not at all on the (typically much larger) number n of nodes.
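A centralized simulation conveys the core idea. This is a plain power iteration for the top eigenvector only, not the paper's full k-eigenvector protocol: in each round, every node replaces its value with a weighted sum of its neighbours' values (one multiplication by the adjacency matrix), followed by a normalization that a real protocol would itself have to decentralize.

```python
import numpy as np

def power_iteration(A, rounds=500, seed=0):
    """Each round corresponds to one exchange with the neighbours
    (a multiplication by A) plus a global renormalization; the vector
    converges to the top eigenvector when A has a dominant eigenvalue."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(rounds):
        x = A @ x
        x /= np.linalg.norm(x)
    return x

# Example graph: a triangle with a pendant vertex (non-bipartite, so the
# top eigenvalue strictly dominates in magnitude and iteration converges).
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
x = power_iteration(A)
lam = x @ A @ x   # Rayleigh quotient estimates the top eigenvalue
```

The number of rounds needed is governed by the spectral gap, which is where the mixing-time dependence in the paper's O(τmix log^2 n) bound comes from.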
A spectral algorithm for envelope reduction of sparse matrices
ACM/IEEE Conference on Supercomputing
, 1993
Abstract

Cited by 87 (5 self)
The problem of reordering a sparse symmetric matrix to reduce its envelope size is considered. A new spectral algorithm for computing an envelope-reducing reordering is obtained by associating a Laplacian matrix with the given matrix and then sorting the components of a specified eigenvector of the Laplacian. This Laplacian eigenvector solves a continuous relaxation of a discrete problem related to envelope minimization called the minimum 2-sum problem. The permutation vector computed by the spectral algorithm is a closest permutation vector to the specified Laplacian eigenvector. Numerical results show that the new reordering algorithm usually computes smaller envelope sizes than those obtained from the current standards such as the Gibbs-Poole-Stockmeyer (GPS) algorithm or the reverse Cuthill-McKee (RCM) algorithm in SPARSPAK, in some cases reducing the envelope by more than a factor of two.
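The core of the spectral reordering step can be sketched as follows. This is a simplification: the paper computes a closest permutation vector to the eigenvector, while here we simply argsort the Fiedler vector.

```python
import numpy as np

def spectral_ordering(A):
    """Order the rows/columns of a symmetric matrix by sorting the
    components of the Fiedler vector of the associated Laplacian
    (built here from the off-diagonal nonzero pattern)."""
    A = np.asarray(A, dtype=float)
    mask = (A != 0).astype(float)
    np.fill_diagonal(mask, 0.0)
    L = np.diag(mask.sum(axis=1)) - mask
    _, vecs = np.linalg.eigh(L)
    return np.argsort(vecs[:, 1])

def envelope_size(A, perm):
    """Sum over rows of the distance from the diagonal to the leftmost
    nonzero of the reordered matrix (one common envelope measure)."""
    B = np.asarray(A)[np.ix_(perm, perm)]
    total = 0
    for i in range(B.shape[0]):
        nz = np.flatnonzero(B[i, : i + 1])
        if nz.size:
            total += i - nz[0]
    return total

# A path graph under a scrambled labelling: the spectral ordering
# recovers the path and brings the envelope down to its minimum, n - 1.
path = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)   # 5-node path
sigma = np.array([2, 0, 4, 1, 3])
scrambled = path[np.ix_(sigma, sigma)]
perm = spectral_ordering(scrambled)
```

The path example is the idealized case; on general sparse matrices the spectral ordering reduces the envelope without necessarily reaching the minimum.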
Supervised Learning of Large Perceptual Organization: Graph Spectral Partitioning and Learning Automata
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 2000
Abstract

Cited by 74 (6 self)
Relationship-based Clustering and Cluster Ensembles for High-dimensional Data Mining
, 2002
Dynamic Load Balancing for PDE Solvers on Adaptive Unstructured Meshes
, 1992
Abstract

Cited by 65 (15 self)
Modern PDE solvers written for time-dependent problems increasingly employ adaptive unstructured meshes (see Flaherty et al. [4]) in order to both increase efficiency and control the numerical error. If a distributed memory parallel computer is to be used, there arises the significant problem of dividing up the domain equally amongst the processors whilst minimising the inter-subdomain dependencies. A number of graph based algorithms have recently been proposed for steady state calculations, for example [6] & [11]. This paper considers an extension to such methods which renders them more suitable for time-dependent problems in which the mesh may be changed frequently.

1 Introduction

Modern PDE solvers for time-dependent applications are currently being written so as to obtain accurate solutions to real-life problems with the solution process as automatic as possible. The use of an unstructured mesh allows the code to cater for completely general geometries and hence a wide range of pro...