Results 1–10 of 98
Non-Uniform Random Variate Generation
, 1986
Cited by 646 (21 self)
Abstract. This is a survey of the main methods in nonuniform random variate generation, and highlights recent research on the subject. Classical paradigms such as inversion, rejection, guide tables, and transformations are reviewed. We provide information on the expected time complexity of various algorithms, before addressing modern topics such as indirectly specified distributions, random processes, and Markov chain methods.
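Two of the classical paradigms named in this abstract, inversion and rejection, can be sketched in a few lines. This is a minimal illustration, not code from the survey; the exponential target, the Beta(2,2) example density, and the envelope constant c = 1.5 are choices made here for demonstration.

```python
import math
import random

def exponential_inversion(lam, u=None):
    """Inversion: if U ~ Uniform(0,1), then -ln(1-U)/lam ~ Exponential(lam)."""
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / lam

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, c):
    """Rejection: draw x from the proposal and accept it with probability
    target_pdf(x) / (c * proposal_pdf(x)); valid whenever the envelope
    c * proposal_pdf dominates target_pdf everywhere."""
    while True:
        x = proposal_sample()
        if random.random() * c * proposal_pdf(x) <= target_pdf(x):
            return x

# Example: sample the Beta(2,2) density 6x(1-x) on [0,1] from a uniform
# proposal; its maximum is 1.5, so c = 1.5 gives a tight envelope.
beta22 = lambda x: 6.0 * x * (1.0 - x)
draw = lambda: rejection_sample(beta22, random.random, lambda x: 1.0, 1.5)
```

Inversion needs the inverse CDF in closed form; rejection needs only a dominating envelope, at the cost of a random number of trials (c per accepted sample on average here).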
Multiple Indicators, partially ordered sets, and linear extensions: Multicriterion ranking and prioritization
, 2004
Distributed Routing in Small-World Networks
, 2007
Cited by 20 (3 self)
So-called small-world networks – clustered networks with small diameters – are thought to be prevalent in nature, especially appearing in people’s social interactions. Many models exist for this phenomenon, with some of the most recent explaining how it is possible to find short routes between nodes in such networks. Searching for such routes, however, always depends on nodes knowing what their own and their neighbors’ positions are relative to the destination. In real applications where one may wish to search a small-world network, such as peer-to-peer computer networks, this cannot always be assumed to be true. We propose and explore a method of routing that does not depend on such knowledge, and which can be implemented in a completely distributed way without any global elements. The Markov chain Monte Carlo based algorithm takes only a graph as input, and requires no further information about the nodes themselves. The proposed method is tested against simulated and real-world data.
Settings in social networks: A measurement model
 Sociological Methodology
, 2003
Cited by 12 (1 self)
A class of statistical models is proposed which aims to recover latent settings structures in social networks. Settings may be regarded as clusters of vertices. The measurement model builds on two assumptions. The observed network is assumed to be generated by hierarchically nested latent transitive structures, expressed by ultrametrics. It is assumed that expected tie strength decreases with ultrametric distance. The approach could be described as model-based clustering with an ultrametric space as the underlying metric to capture the dependence in the observations. Maximum likelihood methods as well as Bayesian methods are applied for statistical inference. Both approaches are implemented using Markov chain Monte Carlo methods.
Error bounds for computing the expectation by Markov chain Monte Carlo
 preprint, arXiv:0906.2359
, 2009
Cited by 10 (0 self)
Abstract. We study the error of reversible Markov chain Monte Carlo methods for approximating the expectation of a function. Explicit error bounds with respect to the l2-, l4- and l∞-norms of the function are proven. By the estimation the well-known asymptotic limit of the error is attained, i.e. there is no gap between the estimate and the asymptotic behavior. We discuss the dependence of the error on the burn-in of the Markov chain. Furthermore, we suggest and justify a specific burn-in for optimizing the algorithm.
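The estimator this abstract analyses – a time average over the chain after discarding a burn-in prefix – can be sketched as follows. This is a toy random-walk Metropolis sampler for a standard normal target; the step size and burn-in length below are arbitrary example values, not the optimized burn-in derived in the paper.

```python
import math
import random

def metropolis_chain(log_pdf, x0, step, n_steps):
    """Random-walk Metropolis: a simple reversible Markov chain Monte Carlo
    sampler whose stationary density is proportional to exp(log_pdf)."""
    x, path = x0, []
    for _ in range(n_steps):
        y = x + random.uniform(-step, step)
        # accept with probability min(1, pi(y)/pi(x)), computed in log space
        if math.log(random.random()) < log_pdf(y) - log_pdf(x):
            x = y
        path.append(x)
    return path

def mcmc_expectation(f, path, burn_in):
    """Estimate E[f] by averaging f over the chain after a burn-in prefix."""
    kept = path[burn_in:]
    return sum(f(x) for x in kept) / len(kept)
```

For the standard normal target, log π(x) = -x²/2 up to a constant, so `mcmc_expectation(lambda x: x * x, path, 3000)` estimates the second moment E[X²] = 1.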
Extending the Lifetime of a Network of Battery-Powered Mobile Devices by Remote Processing: A Markovian . . .
, 2003
Cited by 9 (1 self)
This paper addresses the problem of extending the lifetime of a battery-powered mobile host in a client-server wireless network by using task migration and remote processing. This problem is solved by first constructing a stochastic model of the client-server system based on the theory of continuous-time Markovian decision processes. Next, the dynamic power management problem with task migration is formulated as a policy optimization problem and solved exactly by using a linear programming approach. Based on the offline optimal policy derived in this way, an online adaptive policy is proposed, which dynamically monitors the channel conditions and the server behavior and adopts a client-side power management policy with task migration that results in optimum energy consumption in the client. Experimental results demonstrate that the proposed method outperforms existing heuristic methods by as much as 35% in terms of overall energy savings.
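The policy-optimization step can be illustrated with a much simplified stand-in. The paper formulates a continuous-time Markovian decision process and solves it exactly by linear programming; the sketch below instead uses discrete-time value iteration on a hypothetical two-state local/remote model, with made-up transition structure and reward (negative energy cost) values chosen only for illustration.

```python
def value_iteration(states, actions, P, r, gamma=0.9, tol=1e-9):
    """Compute V(s) = max_a [ r(s,a) + gamma * sum_t P(s,a,t) * V(t) ]
    by fixed-point iteration, plus the greedy policy it induces.
    P maps (state, action) to a dict of successor-state probabilities."""
    V = {s: 0.0 for s in states}
    while True:
        def q(s, a, val):
            return r[s, a] + gamma * sum(p * val[t] for t, p in P[s, a].items())
        newV = {s: max(q(s, a, V) for a in actions) for s in states}
        if max(abs(newV[s] - V[s]) for s in states) < tol:
            policy = {s: max(actions, key=lambda a: q(s, a, newV)) for s in states}
            return newV, policy
        V = newV

# Hypothetical model: a task runs on the client ("local") or the server ("remote").
states = ["local", "remote"]
actions = ["stay", "migrate"]
P = {("local", "stay"): {"local": 1.0},
     ("local", "migrate"): {"remote": 1.0},
     ("remote", "stay"): {"remote": 1.0},
     ("remote", "migrate"): {"local": 1.0}}
r = {("local", "stay"): -2.0,     # local execution drains the battery
     ("local", "migrate"): -1.0,  # one-off communication cost of migrating
     ("remote", "stay"): -0.5,    # remote execution is cheap for the client
     ("remote", "migrate"): -2.0}
V, policy = value_iteration(states, actions, P, r)
```

With these numbers the optimal policy migrates the task and then keeps it remote, mirroring the paper's qualitative conclusion that migration can pay off despite its communication cost.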
PageRank: Three Distributed Algorithms
 Department of Computing, Imperial College
, 2004
Cited by 7 (2 self)
This paper shows for the first time that there are multiple PageRank definitions, and that more is required to justify PageRank than is offered in the literature ([1], [2] and [5]). Adopting a formal approach, this paper provides a justification of PageRank. It also shows that the irreducibility restriction on the PageRank transition matrix [5] is unnecessary. This is important because it gives the personalisation vector more intuitive force. Noting the difficulties in calculating PageRank centrally, the paper then shows, via the Chazan-Miranker theorem [10], that distributed calculation algorithms are possible; and it presents three such novel algorithms. The first of these takes documents/pages as being of principal interest; the second and third, in an attempt to reduce communication overheads, focus attention on nodes/web servers. Empirical test results are shown. These confirm reduced message numbers for algorithms 2 and 3. They also show that more investigation is required into suitable ɛ-thresholds within asynchronous environments.
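For reference, the centralised computation whose cost motivates the distributed algorithms is a power iteration with a damping factor and a personalisation vector. The sketch below is a generic textbook version, not one of the paper's three algorithms; the dangling-node handling (redistributing a sink's rank according to the personalisation vector) is one common convention among the multiple definitions the paper discusses.

```python
def pagerank(links, d=0.85, tol=1e-10, personalization=None):
    """Power iteration for PageRank.
    links: dict node -> list of out-neighbours (may be empty for sinks)."""
    nodes = list(links)
    n = len(nodes)
    v = personalization or {u: 1.0 / n for u in nodes}  # personalisation vector
    r = {u: 1.0 / n for u in nodes}                     # initial uniform rank
    while True:
        nxt = {u: (1.0 - d) * v[u] for u in nodes}      # teleportation mass
        for u in nodes:
            out = links[u]
            if out:                       # distribute rank along out-links
                share = d * r[u] / len(out)
                for w in out:
                    nxt[w] += share
            else:                         # dangling node: redistribute via v
                for w in nodes:
                    nxt[w] += d * r[u] * v[w]
        if max(abs(nxt[u] - r[u]) for u in nodes) < tol:
            return nxt
        r = nxt
```

Each iteration preserves total rank mass, so the result stays a probability distribution over nodes.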
Fitting time-series by continuous-time Markov chains: A quadratic programming approach
, 2006
Cited by 6 (0 self)
Construction of stochastic models that describe the effective dynamics of observables of interest is a useful instrument in various fields of application, such as physics, climate science, and finance. We present a new technique for the construction of such models. From the time-series of an observable, we construct a discrete-in-time Markov chain and calculate the eigenspectrum of its transition probability (or stochastic) matrix. As a next step we aim to find the generator of a continuous-time Markov chain whose eigenspectrum resembles the observed eigenspectrum as closely as possible, using an appropriate norm. The generator is found by solving a minimization problem: the norm is chosen such that the objective function is quadratic and convex, so that the minimization problem can be solved using quadratic programming techniques. The technique is illustrated on various toy problems as well as on datasets stemming from simulations of molecular dynamics and of atmospheric flows.
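The first step of such a procedure – estimating a discrete-time transition matrix from the observed series and relating it to a continuous-time generator – can be sketched as below. This is a minimal illustration only: it uses the first-order relation exp(Qτ) ≈ I + Qτ, i.e. Q ≈ (P - I)/τ, as a crude stand-in for the paper's eigenspectrum-matching quadratic program, and the sampling interval τ is a parameter of the example.

```python
def empirical_transition_matrix(series, n_states):
    """Estimate a discrete-time transition matrix by counting one-step
    transitions in an integer-valued state series."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(series, series[1:]):
        counts[a][b] += 1
    P = []
    for row in counts:
        total = sum(row)
        # rows with no observed transitions fall back to a uniform row
        P.append([c / total if total else 1.0 / n_states for c in row])
    return P

def generator_first_order(P, tau):
    """Crude generator estimate Q = (P - I)/tau from the first-order
    expansion exp(Q*tau) ≈ I + Q*tau; rows of Q sum to zero."""
    n = len(P)
    return [[(P[i][j] - (1.0 if i == j else 0.0)) / tau for j in range(n)]
            for i in range(n)]
```

A proper generator must have non-negative off-diagonal entries and zero row sums; the constrained quadratic program in the paper enforces such structure while matching the observed eigenspectrum, which this first-order shortcut does not guarantee in general.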