Results 1–10 of 45
Random-walk computation of similarities between nodes of a graph, with application to collaborative recommendation
IEEE Transactions on Knowledge and Data Engineering, 2006
Cited by 116 (14 self)
Abstract—This work presents a new perspective on characterizing the similarity between elements of a database or, more generally, nodes of a weighted and undirected graph. It is based on a Markov-chain model of random walk through the database. More precisely, we compute quantities (the average commute time, the pseudoinverse of the Laplacian matrix of the graph, etc.) that provide similarities between any pair of nodes, having the nice property of increasing when the number of paths connecting those elements increases and when the “length” of paths decreases. It turns out that the square root of the average commute time is a Euclidean distance and that the pseudoinverse of the Laplacian matrix is a kernel matrix (its elements are inner products closely related to commute times). A principal component analysis (PCA) of the graph is introduced for computing the subspace projection of the node vectors in a manner that preserves as much variance as possible in terms of the Euclidean commute-time distance. This graph PCA provides a nice interpretation of the “Fiedler vector,” widely used for graph partitioning. The model is evaluated on a collaborative-recommendation task where suggestions are made about which movies people should watch based upon what they watched in the past. Experimental results on the MovieLens database show that the Laplacian-based similarities perform well in comparison with other methods. The model, which nicely fits into the so-called “statistical relational learning” framework, could also be used to compute document or word similarities and, more generally, could be applied to machine-learning and pattern-recognition tasks involving a relational database. Index Terms—Graph analysis, graph and database mining, collaborative recommendation, graph kernels, spectral clustering, Fiedler vector, proximity measures, statistical relational learning.
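The commute-time similarity described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name and toy graph are hypothetical, and it assumes the standard identity n(i,j) = V_G (l+_ii + l+_jj − 2 l+_ij), where L+ is the Moore-Penrose pseudoinverse of the graph Laplacian and V_G the graph volume.

```python
import numpy as np

def average_commute_time(A):
    """Average commute times between all node pairs of a weighted,
    undirected graph given its adjacency matrix A.

    Uses n(i, j) = V_G * (l+_ii + l+_jj - 2 * l+_ij), where L+ is the
    pseudoinverse of the graph Laplacian L = D - A and V_G is the graph
    volume (sum of all node degrees)."""
    d = A.sum(axis=1)              # node degrees
    L = np.diag(d) - A             # graph Laplacian
    Lp = np.linalg.pinv(L)         # pseudoinverse L+
    vol = d.sum()                  # graph volume V_G
    diag = np.diag(Lp)
    return vol * (diag[:, None] + diag[None, :] - 2 * Lp)

# Two nodes joined by a unit-weight edge: the walk steps across and
# back in exactly two steps, so the average commute time is 2.
A = np.array([[0.0, 1.0], [1.0, 0.0]])
C = average_commute_time(A)
```

The square root of `C` is the Euclidean commute-time distance the abstract refers to, and `Lp` itself is the kernel matrix.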
The Numerical Solution of Stochastic Automata Networks
1994
Cited by 44 (10 self)
Stochastic Automata Networks (SANs) have recently received attention in the literature as an efficient means of modelling parallel systems such as communicating processes, concurrent processors, shared memory, etc. The advantage that the SAN approach has over generalized stochastic Petri nets, and indeed over any Markovian analysis that requires the generation of a transition matrix, is that its representation remains compact even as the number of states in the underlying Markov chain begins to explode. Our concern in this paper is with the numerical issues that are involved in solving SANs. We introduce stochastic automata and consider the numerical difficulties that result from their interaction. We examine how the product of a vector with a compact SAN descriptor may be formed, for this operation is basic to all iterative solution methods. We describe possible solution methods, including the power method, the method of Arnoldi, and GMRES, and show that the latter two methods...
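The vector-descriptor product highlighted here exploits the Kronecker structure of the SAN descriptor. Below is a minimal sketch for a single two-automaton term A ⊗ B, assuming NumPy; the function name is hypothetical, and real SAN solvers handle many automata and functional transition rates, not just this one term.

```python
import numpy as np

def vec_kron_product(v, A, B):
    """Compute v @ np.kron(A, B) without forming the Kronecker product.

    With A of size m x m and B of size n x n, reshaping v into an
    m x n matrix V gives v @ kron(A, B) == vec(A.T @ V @ B), at cost
    O(mn(m + n)) instead of the O(m^2 n^2) of the dense product."""
    m, n = A.shape[0], B.shape[0]
    V = v.reshape(m, n)            # row-major reshape matches kron ordering
    return (A.T @ V @ B).reshape(m * n)

rng = np.random.default_rng(0)
A = rng.random((3, 3))
B = rng.random((4, 4))
v = rng.random(12)

dense = v @ np.kron(A, B)          # explicit (expensive) product
compact = vec_kron_product(v, A, B)  # same result from the compact form
```

This compact multiply is the kernel operation inside the power method, Arnoldi, and GMRES iterations the abstract compares.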
Perturbation realization, potentials, and sensitivity analysis of Markov processes
IEEE Transactions on Automatic Control, 1997
Cited by 39 (1 self)
Abstract—Two fundamental concepts and quantities, realization factors and performance potentials, are introduced for Markov processes. The relations among these two quantities and the group inverse of the infinitesimal generator are studied. It is shown that the sensitivity of the steady-state performance with respect to a change of the infinitesimal generator can be easily calculated by using any of these three quantities, and that these quantities can be estimated by analyzing a single sample path of a Markov process. Based on these results, algorithms for estimating performance sensitivities on a single sample path of a Markov process can be proposed. The potentials in this paper are defined through realization factors and are shown to be the same as those defined by Poisson equations. The results provide a uniform framework of perturbation realization for infinitesimal perturbation analysis (IPA) and non-IPA approaches to the sensitivity analysis of steady-state performance; they also provide a theoretical background for the PA algorithms developed in recent years. Index Terms—Perturbation analysis, Poisson equations, sample-path analysis.
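A discrete-time sketch of the potentials/group-inverse connection, assuming NumPy. Here I − P stands in for the (negated) infinitesimal generator, the two-state chain and reward vector are made up for illustration, and the group inverse is formed with Meyer's formula; the resulting potentials satisfy the Poisson equation mentioned in the abstract.

```python
import numpy as np

# Hypothetical two-state chain; discrete-time analogue of the paper's
# continuous-time setting, with A = I - P playing the generator's role.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
f = np.array([1.0, 3.0])           # per-state performance function
n = P.shape[0]

# Stationary distribution: solve pi (I - P) = 0 with the last balance
# equation replaced by the normalization sum(pi) = 1.
M = (np.eye(n) - P).T
M[-1, :] = 1.0
pi = np.linalg.solve(M, np.eye(n)[-1])
eta = pi @ f                       # steady-state average performance

# Group inverse of A = I - P via Meyer's formula A# = Z - 1 pi^T,
# where Z = (I - P + 1 pi^T)^{-1}.
one = np.ones((n, 1))
Z = np.linalg.inv(np.eye(n) - P + one @ pi[None, :])
A_sharp = Z - one @ pi[None, :]

# Performance potentials: g = A# f solves the Poisson equation
# (I - P) g = f - eta * 1, with pi @ g = 0.
g = A_sharp @ f
```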
Comparison of perturbation bounds for the stationary distribution of a Markov chain
Proceedings of the Twenty-Sixth International Conference on Very Large Databases, 2000
Cited by 35 (2 self)
The purpose of this paper is to review and compare the existing perturbation bounds for the stationary distribution of a finite, irreducible, homogeneous Markov chain.
Markov Chain Sensitivity Measured by Mean First Passage Times
1999
Cited by 27 (3 self)
The purpose of this article is to present results concerning the sensitivity of the stationary probabilities for an n-state, time-homogeneous, irreducible Markov chain in terms of the mean first passage times in the chain.
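The mean first passage times this entry builds on can be computed from the fundamental matrix Z = (I − P + 1πᵀ)⁻¹ via the standard formula m_ij = (z_jj − z_ij)/π_j, with m_jj = 1/π_j the mean return time. A small sketch under that formula, assuming NumPy; the function name and toy chain are hypothetical.

```python
import numpy as np

def mean_first_passage_times(P):
    """Mean first passage matrix M of an irreducible chain with
    transition matrix P, via the fundamental matrix
    Z = (I - P + 1 pi^T)^{-1}:
        m_ij = (z_jj - z_ij) / pi_j   for i != j,
        m_jj = 1 / pi_j               (mean return time)."""
    n = P.shape[0]
    A = (np.eye(n) - P).T
    A[-1, :] = 1.0                       # normalization row: sum(pi) = 1
    pi = np.linalg.solve(A, np.eye(n)[-1])
    Z = np.linalg.inv(np.eye(n) - P + np.ones((n, 1)) @ pi[None, :])
    M = (np.diag(Z)[None, :] - Z) / pi[None, :]
    np.fill_diagonal(M, 1.0 / pi)
    return M, pi

# Symmetric two-state chain: either state is reached from the other in
# a geometric number of steps with mean 1 / 0.5 = 2.
P = np.array([[0.5, 0.5], [0.5, 0.5]])
M, pi = mean_first_passage_times(P)
```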
Uniform Stability of Markov Chains
SIAM J. Matrix Anal. Appl., 1994
Cited by 24 (7 self)
By deriving a new set of tight perturbation bounds, it is shown that all stationary probabilities of a finite irreducible Markov chain react essentially in the same way to perturbations in the transition probabilities. In particular, if at least one stationary probability is insensitive in a relative sense, then all stationary probabilities must be insensitive in an absolute sense. New measures of sensitivity are related to more traditional ones, and it is shown that all relevant condition numbers for the Markov chain problem are small multiples of each other. Finally, the implications of these findings for the computation of stationary probabilities by direct methods are discussed, and the results are applied to stability issues in nearly transient chains.
Towards Exploiting Link Evolution
2001
Cited by 21 (0 self)
Exploiting hyperlink information has revolutionized search algorithms for the Web [8, 2]. As the Web grows, its link structure, along with content, evolves at a rapid rate. Consequently, large-scale static hyperlink-based ranking computations become too expensive to be performed frequently.
Sensitivity of the Stationary Distribution of a Markov Chain
SIAM Journal on Matrix Analysis and Applications, 1994
Cited by 20 (2 self)
It is well known that if the transition matrix of an irreducible Markov chain of moderate size has a subdominant eigenvalue which is close to 1, then the chain is ill conditioned in the sense that there are stationary probabilities which are sensitive to perturbations in the transition probabilities. However, the converse of this statement has heretofore been unresolved. The purpose of this article is to address this issue by establishing upper and lower bounds on the condition number of the chain such that the bounding terms are functions of the eigenvalues of the transition matrix. Furthermore, it is demonstrated how to obtain estimates for the condition number of an irreducible chain with little or no extra computational effort over that required to compute the stationary probabilities by means of an LU or QR factorization. Key words. Markov chains, stationary distribution, stochastic matrix, sensitivity analysis, perturbation theory, character of a Markov chain, condition numbers ...
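A rough illustration of the connection this abstract draws: compute the stationary vector by a direct solve, then read a crude sensitivity indicator off the subdominant eigenvalue. The chain below and the indicator 1/(1 − |λ₂|) are illustrative assumptions, not the paper's actual bounds.

```python
import numpy as np

# Hypothetical nearly uncoupled chain: states leak between neighbours
# with probability 0.01, so the subdominant eigenvalue sits near 1.
P = np.array([[0.99, 0.01, 0.00],
              [0.01, 0.98, 0.01],
              [0.00, 0.01, 0.99]])
n = P.shape[0]

# Direct method: solve pi (I - P) = 0 with a normalization row, the
# kind of solve an LU or QR factorization would perform.
A = (np.eye(n) - P).T
A[-1, :] = 1.0
pi = np.linalg.solve(A, np.eye(n)[-1])

# Subdominant eigenvalue magnitude |lambda_2| of P; values close to 1
# signal possible ill-conditioning of the stationary probabilities.
eigs = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
lambda2 = eigs[1]
sensitivity_indicator = 1.0 / (1.0 - lambda2)
```

For this symmetric, doubly stochastic example the stationary vector is uniform, yet |λ₂| = 0.99 flags that small changes to the couplings could move it substantially.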
Stationary distributions and mean first passage times of perturbed Markov chains
Asia-Pacific Journal of Operational Research, 1992
Cited by 18 (4 self)
Stationary distributions of perturbed finite irreducible discrete time Markov chains are intimately connected with the behaviour of associated mean first passage times. This interconnection is explored through the use of generalized matrix inverses. Some interesting qualitative results regarding the nature of the relative and absolute changes to the stationary probabilities are obtained together with some improved bounds. AMS classification: 15A51; 60J10; 60J20; 65F20; 65F35
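The generalized-inverse connection can be checked numerically. A sketch assuming NumPy: the base chain and perturbation are made up, and the identity used, π̃ᵀ − πᵀ = π̃ᵀ E A# with A = I − P and A# its group inverse, is the standard exact update formula for perturbed stationary distributions.

```python
import numpy as np

def stationary(P):
    """Stationary vector: replace one balance equation by sum(pi) = 1."""
    n = P.shape[0]
    A = (np.eye(n) - P).T
    A[-1, :] = 1.0
    return np.linalg.solve(A, np.eye(n)[-1])

# Hypothetical base chain and a perturbation E with zero row sums,
# chosen so that P + E is again a stochastic matrix.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])
E = np.array([[ 0.02, -0.01, -0.01],
              [-0.02,  0.01,  0.01],
              [ 0.00,  0.03, -0.03]])

pi = stationary(P)
pi_t = stationary(P + E)

# Group inverse A# of A = I - P via Meyer's formula A# = Z - 1 pi^T.
n = P.shape[0]
one = np.ones((n, 1))
Z = np.linalg.inv(np.eye(n) - P + one @ pi[None, :])
A_sharp = Z - one @ pi[None, :]

# Exact update relating the perturbed and unperturbed distributions:
# pi_t - pi = pi_t @ E @ A#.
lhs = pi_t - pi
rhs = pi_t @ E @ A_sharp
```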
A Parallel Solver for LargeScale Markov Chains
Appl. Numer. Math., 2002
Cited by 17 (7 self)
We consider the parallel computation of the stationary probability distribution vector of ergodic Markov chains with large state spaces by preconditioned Krylov subspace methods. The parallel preconditioner is obtained as an explicit approximation, in factorized form, of a particular generalized inverse of the infinitesimal generator of the Markov process. Conditions that guarantee the existence of the preconditioner are given, and the results of a parallel implementation are presented.
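For scale, the baseline that such Krylov solvers improve upon is plain power iteration on the transition matrix. A minimal dense sketch, assuming NumPy; a real large-scale solver would use sparse storage and a preconditioned Krylov method such as GMRES, as in the paper, and the chain here is a toy example.

```python
import numpy as np

def stationary_power(P, tol=1e-12, max_iter=100_000):
    """Power iteration x <- x P for the stationary distribution, the
    simplest iterative scheme that Krylov subspace methods accelerate."""
    n = P.shape[0]
    x = np.full(n, 1.0 / n)        # start from the uniform distribution
    for _ in range(max_iter):
        x_new = x @ P
        x_new /= x_new.sum()       # guard against rounding drift
        if np.abs(x_new - x).max() < tol:
            return x_new
        x = x_new
    return x

# Small aperiodic birth-death chain with known stationary vector
# [0.25, 0.5, 0.25].
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = stationary_power(P)
```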