Results 1–10 of 18
Extension of Fill’s perfect rejection sampling algorithm to general chains (extended abstract)
Pages 37–52 in Monte Carlo Methods, 2000
Cited by 42 (13 self)
Abstract:
By developing and applying a broad framework for rejection sampling using auxiliary randomness, we provide an extension of the perfect sampling algorithm of Fill (1998) to general chains on quite general state spaces, and describe how use of bounding processes can ease computational burden. Along the way, we unearth a simple connection between the Coupling From The Past (CFTP) algorithm originated by Propp and Wilson (1996) and our extension of Fill’s algorithm. Key words and phrases: Fill’s algorithm, Markov chain Monte Carlo, perfect sampling, exact sampling, rejection sampling, interruptibility, coupling from the past, read-once coupling from the past, monotone transition rule, realizable monotonicity, stochastic monotonicity, partially ordered set, coalescence, imputation, ...
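The monotone transition rules and coalescence mentioned in this abstract are the ingredients of the Propp–Wilson CFTP algorithm. As a minimal illustration (a toy reflecting-walk chain of my own choosing, not an example from the paper), monotone CFTP only needs to track the trajectories started from the minimal and maximal states, reusing the same randomness as it looks further into the past:

```python
import random

N = 4  # states are {0, 1, ..., N}; the stationary law of this chain is uniform

def phi(x, u):
    """Monotone update rule: step up if u < 0.5, down otherwise, clipped to [0, N]."""
    return min(x + 1, N) if u < 0.5 else max(x - 1, 0)

def monotone_cftp(rng=random.random):
    """Propp-Wilson CFTP: return an exact sample from the stationary distribution."""
    t = 1
    us = []  # randomness for times -t+1, ..., 0; reused (not redrawn) when t grows
    while True:
        while len(us) < t:
            us.insert(0, rng())          # prepend randomness for earlier times
        lo, hi = 0, N                    # start extremal chains at time -t
        for u in us:
            lo, hi = phi(lo, u), phi(hi, u)
        if lo == hi:                     # coalescence: all start states agree
            return lo
        t *= 2                           # look twice as far into the past

sample = monotone_cftp()                 # an exact draw, here uniform on {0,...,4}
```

Reusing the `us` array when doubling `t` is essential: redrawing it would bias the output, which is the classic CFTP pitfall.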
Optimal Coding and Sampling of Triangulations
2003
Cited by 39 (5 self)
Abstract:
We present a simple encoding of plane triangulations (a.k.a. maximal planar graphs) by plane trees with two leaves per inner node. Our encoding is a bijection taking advantage of the minimal Schnyder tree decomposition of a plane triangulation. Coding and decoding take linear time. As a byproduct we derive: (i) a simple interpretation of the formula for the number of plane triangulations with n vertices, (ii) a linear-time random sampling algorithm, (iii) an explicit and simple information-theory optimal encoding.
Mathematical foundations of the Markov chain Monte Carlo method
In Probabilistic Methods for Algorithmic Discrete Mathematics, 1998
Cited by 30 (1 self)
Abstract:
7.2 was jointly undertaken with Vivek Gore, and is published here for the first time. I also thank an anonymous referee for carefully reading and providing helpful comments on a draft of this chapter.

1. Introduction

The classical Monte Carlo method is an approach to estimating quantities that are hard to compute exactly. The quantity z of interest is expressed as the expectation z = E[Z] of a random variable (r.v.) Z for which some efficient sampling procedure is available. By taking the mean of some sufficiently large set of independent samples of Z, one may obtain an approximation to z. For example, suppose S = {(x, y) ∈ [0, 1]² : p_i(x, y) ≥ 0 for all i} ...
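The estimator described here — write z = E[Z] and average independent samples of Z — can be sketched directly. The region below (a quarter disc, whose area is π/4) is an illustrative stand-in for the constraint set S = {(x, y) ∈ [0, 1]² : p_i(x, y) ≥ 0 for all i} in the abstract:

```python
import random

def mc_area(n_samples, rng=random.random):
    """Classical Monte Carlo: area of S = fraction of uniform points landing in S.

    Here Z is the indicator of the quarter disc {x^2 + y^2 <= 1} in [0,1]^2,
    so E[Z] = pi/4; the sample mean of Z estimates that area.
    """
    hits = sum(1 for _ in range(n_samples)
               if rng() ** 2 + rng() ** 2 <= 1.0)
    return hits / n_samples

estimate = mc_area(100_000)   # converges to pi/4 as n_samples grows
```

The error of such an estimator shrinks like O(1/√n), which is why the chapter's interest turns to how efficiently one can sample Z at all.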
Parallel computing and Monte Carlo algorithms
1999
Cited by 21 (0 self)
Abstract:
We argue that Monte Carlo algorithms are ideally suited to parallel computing, and that "parallel Monte Carlo" should be more widely used. We consider a number of issues that arise, including dealing with slow or unreliable computers. We also discuss the possibilities of parallel Markov chain Monte Carlo. We illustrate our results with actual computer experiments.
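The reason Monte Carlo parallelizes so well is that independent runs need no communication: each worker produces its own estimate and the results are simply averaged. A small sketch of that pattern (my own toy example, estimating an area as the mean of an indicator; the worker function and seeds are illustrative, not from the paper):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def worker(n_samples, seed):
    """One independent Monte Carlo run with its own seeded generator."""
    rng = random.Random(seed)            # per-worker RNG: streams don't interfere
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return hits / n_samples              # this worker's estimate of pi/4

# Independent runs can execute on separate workers and be averaged at the end.
with ThreadPoolExecutor(max_workers=4) as pool:
    estimates = list(pool.map(worker, [25_000] * 4, range(4)))

combined = sum(estimates) / len(estimates)
```

A slow or failed worker costs only its share of the samples, which echoes the paper's point about tolerating unreliable machines.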
Randomised Techniques in Combinatorial Algorithmics
1999
Cited by 20 (7 self)
Abstract:
Chapter 1 Introduction
  1.1 Algorithmic Background
  1.2 Technical Preliminaries
    1.2.1 Problems
    1.2.2 Parallel Computational Complexity
    1.2.3 Probability
    1.2.4 Graphs
    1.2.5 Random Graphs
    1.2.6 Group Theory
  1.3 Concluding Remarks
Chapter 2 Parallel Uniform Generation of Unlabelled Graphs
  2.1 Introduction
  2.2 Sampling O...
Propp-Wilson algorithms and finitary codings for high noise Markov random fields
1999
Cited by 13 (1 self)
Abstract:
In this paper, we combine two previous works, the first being by the first author and K. Nelander, and the second by J. van den Berg and the second author, to show (1) that one can carry out a Propp-Wilson exact simulation for all Markov random fields on Z^d satisfying a certain high noise assumption, and (2) that all such random fields are a finitary image of a finite state i.i.d. process. (2) is a strengthening of the previously known fact that such random fields are so-called Bernoulli shifts.

1 Introduction

A random field with finite state space S indexed by the integer lattice Z^d is a random mapping X : Z^d → S, or it can equivalently be seen as a random element of S^{Z^d}. Here we focus on so-called Markov random fields, characterized by having a dependency structure which only propagates via interactions between nearest neighbors in Z^d. We specialize further to Markov random fields satisfying a certain high noise assumption, which says that these interactions shou...
Note on Rejection sampling and exact sampling with the Metropolised Independence Sampler
2004
Abstract:
Introduction

This short note shows a close relationship between standard rejection sampling and exact sampling by coupling from the past applied to a Metropolised independence sampler. Little background is assumed, but [1] provides a clear review of all required material. I now know that this idea, first presented as a ten-minute teatime talk, is probably a duplicate of an unavailable work [3], and is closely related to a paper by Jun S. Liu [2], who provides a much more detailed analysis. Perhaps this exposition will be of interest to some readers.

2 Rejection sampling

Rejection sampling [4] is a method to draw independent samples from a probability distribution P(x) = P*(x)/Z_P. We may not know the normalising constant Z_P, but we assume that we can evaluate P*(x) at any position x we choose. It does not matter here if the function P(x) gives probabilities for discrete x or describes a probability density function over continuous x. Firstly we choose a distribution Q(x) ...
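The scheme the note describes — draw x from a proposal Q, accept with probability P*(x)/(c·Q(x)) where c bounds P*(x)/Q(x) — can be sketched concretely. The target and proposal below are illustrative choices of my own (an unnormalised half-normal target with an Exponential(1) proposal), not taken from the note:

```python
import math
import random

def p_star(x):
    return math.exp(-x * x / 2.0)        # unnormalised target P*(x) on x >= 0

def q_pdf(x):
    return math.exp(-x)                  # Exponential(1) proposal density Q(x)

C = math.exp(0.5)                        # max of P*(x)/Q(x), attained at x = 1

def rejection_sample(rng=random.random):
    """Draw one exact sample from P by rejection against the envelope C*Q."""
    while True:
        x = -math.log(1.0 - rng())       # inverse-CDF draw: x ~ Exponential(1)
        if rng() * C * q_pdf(x) <= p_star(x):
            return x                     # accepted: x is distributed as P
```

Accepted draws are exact and independent, and the expected number of proposals per accepted sample is C·Z_Q/Z_P, so a tight envelope constant C matters in practice.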
Exact and Approximate Sampling from the Stationary Distribution of a Markov Chain
1998
Abstract:
When simulating a physical system with discrete states, one often would like to generate a sample from the stationary distribution of a Markov chain. This report focuses on three sampling methodologies which do not rely on explicitly computing the stationary distribution. Two of these lead to algorithms which can generate an exact sample in finite time. The third yields a sample whose distribution approximates, and can be made arbitrarily close to, the stationary distribution from which one desires a sample. The approximate and one of the exact methodologies are illustrated with examples from statistical mechanics.

Contents
1 Introduction
2 Markov Chain Monte Carlo for Statistical Mechanics
  2.1 Ising Model
  2.2 Metropolis and Heat Bath Chains
3 Coupling From the Past
  3.1 The Algorithm ...
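The Metropolis chain for the Ising model named in Section 2.2 has a compact single-site form: propose flipping one spin and accept with probability min(1, exp(-β·ΔE)). A minimal sketch (my own toy parameters on a small torus, not code from the report):

```python
import math
import random

def metropolis_sweep(spins, L, beta, rng=random.random):
    """One sweep of single-site Metropolis updates for the 2-D Ising model
    on an L x L torus (L*L proposed spin flips)."""
    for _ in range(L * L):
        i, j = int(rng() * L), int(rng() * L)
        nb = (spins[(i - 1) % L][j] + spins[(i + 1) % L][j]
              + spins[i][(j - 1) % L] + spins[i][(j + 1) % L])
        dE = 2 * spins[i][j] * nb        # energy change if spin (i, j) flips
        if dE <= 0 or rng() < math.exp(-beta * dE):
            spins[i][j] = -spins[i][j]   # accept the proposed flip
    return spins

L, beta = 8, 0.3
spins = [[1] * L for _ in range(L)]      # start from the all-up configuration
for _ in range(50):
    metropolis_sweep(spins, L, beta)
```

Run long enough, the chain's distribution only approximates the Gibbs measure; the coupling-from-the-past method in Chapter 3 is precisely what turns such a chain into an exact sampler.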