Results 1–10 of 103
Markov Chain Algorithms for Planar Lattice Structures, 1995
Cited by 110 (11 self)
Abstract:
Consider the following Markov chain, whose states are all domino tilings of a 2n x 2n chessboard: starting from some arbitrary tiling, pick a 2 x 2 window uniformly at random. If the four squares appearing in this window are covered by two parallel dominoes, rotate the dominoes 90° in place. Repeat many times. This process is used in practice to generate a random tiling, and is a widely used tool in the study of the combinatorics of tilings and the behavior of dimer systems in statistical physics. Analogous Markov chains are used to randomly generate other structures on various two-dimensional lattices. This paper presents techniques which prove for the first time that, in many interesting cases, a small number of random moves suffices to obtain a uniform distribution.
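The rotation chain described above is short enough to sketch directly. The following minimal illustration is ours, not from the paper (helper names such as `rotation_step` are hypothetical); it represents a tiling as a map from each cell to its domino partner:

```python
import random

def initial_tiling(n):
    """All-horizontal tiling of the 2n x 2n board: partner[cell] = its domino mate."""
    partner = {}
    for r in range(2 * n):
        for c in range(0, 2 * n, 2):
            partner[(r, c)] = (r, c + 1)
            partner[(r, c + 1)] = (r, c)
    return partner

def rotation_step(partner, n):
    """One move of the chain: pick a random 2x2 window; if it holds two
    parallel dominoes, rotate them 90 degrees in place."""
    r = random.randrange(2 * n - 1)
    c = random.randrange(2 * n - 1)
    a, b = (r, c), (r, c + 1)            # top-left, top-right
    d, e = (r + 1, c), (r + 1, c + 1)    # bottom-left, bottom-right
    if partner[a] == b and partner[d] == e:      # two horizontal dominoes
        partner[a], partner[d] = d, a            # make them vertical
        partner[b], partner[e] = e, b
    elif partner[a] == d and partner[b] == e:    # two vertical dominoes
        partner[a], partner[b] = b, a            # make them horizontal
        partner[d], partner[e] = e, d

def sample_tiling(n, steps, seed=None):
    """Run the rotation chain for `steps` moves from the brick tiling."""
    random.seed(seed)
    t = initial_tiling(n)
    for _ in range(steps):
        rotation_step(t, n)
    return t
```

Each move either rotates a matched pair or does nothing, so every intermediate state is a valid tiling; the paper's contribution is bounding how many such moves are needed.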
Analyzing Glauber Dynamics by Comparison of Markov Chains, Journal of Mathematical Physics, 1999
Cited by 71 (16 self)
Abstract:
A popular technique for studying random properties of a combinatorial set is to design a Markov chain Monte Carlo algorithm. For many problems there are natural Markov chains connecting the set of allowable configurations which are based on local moves, or "Glauber dynamics." Typically these single-site update algorithms are difficult to analyze, so often the Markov chain is modified to update several sites simultaneously. Recently there has been progress in analyzing these more complicated algorithms for several important combinatorial problems. In this work we use the comparison technique of Diaconis and Saloff-Coste to show that several of the natural single-point update algorithms are efficient. The strategy is to relate the mixing rate of these algorithms to the corresponding non-local algorithms which have already been analyzed. This allows us to give polynomial bounds for single-point update algorithms for problems such as generating planar tilings and random triangulations of c...
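As a concrete example of the kind of single-site Glauber dynamics the abstract refers to, here is a minimal heat-bath update for the Ising model on a cycle; the choice of model and the function names are ours, for illustration only:

```python
import math
import random

def glauber_step(spins, neighbors, beta):
    """One heat-bath (Glauber) update: choose a site uniformly at random and
    resample its spin from the conditional Ising distribution given its neighbors."""
    v = random.randrange(len(spins))
    field = sum(spins[u] for u in neighbors[v])           # local field from neighbors
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))  # P(spin = +1 | neighbors)
    spins[v] = 1 if random.random() < p_plus else -1

# Example: Ising model on an n-cycle at inverse temperature beta.
n = 20
neighbors = {v: ((v - 1) % n, (v + 1) % n) for v in range(n)}
spins = [1] * n
random.seed(0)
for _ in range(10000):
    glauber_step(spins, neighbors, beta=0.3)
```

Comparison arguments of the kind the paper uses bound the mixing time of such single-site chains via an already-analyzed non-local chain on the same state space.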
Mathematical aspects of mixing times in Markov chains, 2006
Cited by 43 (4 self)
Abstract:
In the past few years we have seen a surge in the theory of finite Markov chains, by way of new techniques for bounding the convergence to stationarity. These include functional techniques such as logarithmic Sobolev and Nash inequalities, refined spectral and entropy techniques, and isoperimetric techniques such as the average and blocking conductance and the evolving-set methodology. We attempt to give a more or less self-contained treatment of some of these modern techniques, after reviewing several preliminaries. We also review classical and modern lower bounds on mixing times. There have been other important contributions to this theory, such as variants on coupling techniques and decomposition methods, which are not included here; our choice was to keep the analytical methods as the theme of this presentation. We illustrate the strength of the main techniques by way of simple examples, a recent result on the Pollard rho random walk to compute the discrete logarithm, as well as an improved analysis of the Thorp shuffle.
Gibbs sampling, exponential families and orthogonal polynomials, Statistical Science, 2008
Cited by 40 (10 self)
Abstract:
We give families of examples where sharp rates of convergence to stationarity of the widely used Gibbs sampler are available. The examples involve standard exponential families and their conjugate priors. In each case, the transition operator is explicitly diagonalizable with classical orthogonal polynomials as eigenfunctions. Key words and phrases: Gibbs sampler, running time analyses, exponential families, conjugate priors, location families, orthogonal polynomials, singular value decomposition.
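A standard instance of the setting described (an exponential family with its conjugate prior) is the Binomial/Beta pair. The following sketch of the resulting two-component Gibbs sampler is ours and uses only that textbook conjugacy, not the paper's spectral machinery:

```python
import random

def gibbs_beta_binomial(n, a, b, steps, seed=0):
    """Two-component Gibbs sampler for the pair (X, p) with
    X | p ~ Binomial(n, p) and prior p ~ Beta(a, b).  The p-update uses
    the conjugate posterior p | X ~ Beta(a + X, b + n - X)."""
    rng = random.Random(seed)
    p = rng.betavariate(a, b)
    trace = []
    for _ in range(steps):
        x = sum(rng.random() < p for _ in range(n))   # X | p ~ Binomial(n, p)
        p = rng.betavariate(a + x, b + n - x)         # p | X via conjugacy
        trace.append((x, p))
    return trace
```

For this chain the paper-style analysis diagonalizes the transition operator exactly; the sketch above just runs the sampler.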
Random walks on finite groups, in Probability on Discrete Structures, Encyclopedia of Mathematical Sciences, 2004
Sampling Adsorbing Staircase Walks Using a New Markov Chain Decomposition Method, in Proceedings of the 41st Annual Symposium on Foundations of Computer Science
Cited by 32 (5 self)
Abstract:
Staircase walks are lattice paths from (0,0) to (2n,0) which take diagonal steps and never fall below the x-axis. A path hitting the x-axis k times is assigned a weight of λ^k, where λ > 0. A simple local Markov chain which connects the state space and converges to the Gibbs measure (which normalizes these weights) is known to be rapidly mixing when λ = 1, and can easily be shown to be rapidly mixing when λ < 1. We give the first proof that this Markov chain is also rapidly mixing in the more interesting case of λ > 1, known in the statistical physics community as adsorbing staircase walks. The main new ingredient is a decomposition technique which allows us to analyze the Markov chain in pieces, applying different arguments to analyze each piece. 1. Introduction. 1.1. The model. Staircase walks (also called Dyck paths) are walks in Z² from (0,0) to (n,n) which stay above the diagonal x = y. Rotating by 45°, they correspond to walks from (0,0) to (2n,0) which take diagon...
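The local chain described above can be sketched as a Metropolis walk on Dyck paths, with the weight λ^k entering through the acceptance ratio. This minimal illustration is ours (helper names hypothetical) and stores a path as a list of ±1 steps:

```python
import random

def touches(path):
    """Number of times the walk returns to the x-axis (including the endpoint)."""
    h, k = 0, 0
    for s in path:
        h += s
        if h == 0:
            k += 1
    return k

def min_height(path):
    """Lowest height the path ever reaches."""
    h, m = 0, 0
    for s in path:
        h += s
        m = min(m, h)
    return m

def mcmc_step(path, lam, rng):
    """Local Metropolis move with stationary weight lam**touches(path):
    pick an adjacent pair of steps and try to swap a peak with a valley."""
    i = rng.randrange(len(path) - 1)
    if path[i] == path[i + 1]:
        return                       # two steps in the same direction: no move
    new = path[:]
    new[i], new[i + 1] = new[i + 1], new[i]
    if min_height(new) < 0:
        return                       # proposal would fall below the x-axis
    ratio = lam ** (touches(new) - touches(path))
    if rng.random() < min(1.0, ratio):
        path[:] = new                # accept in place

rng = random.Random(1)
n = 10
path = [1, -1] * n                   # zigzag starting path of length 2n
for _ in range(20000):
    mcmc_step(path, lam=2.0, rng=rng)   # lam > 1: the adsorbing regime
```

The swap proposal is symmetric, so the Metropolis ratio λ^(Δk) makes the chain reversible with respect to the Gibbs measure the abstract describes.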
Layered multishift coupling for use in perfect sampling algorithms (with a primer to CFTP), Fields Institute Communications Series, American Mathematical Society, 2000
Cited by 25 (1 self)
Abstract:
In this article we describe a new coupling technique which is useful in a variety of perfect sampling algorithms. A multishift coupler generates a random function f(·) so that for each x ∈ R, f(x) − x is governed by the same fixed probability distribution, such as a normal distribution. We develop the class of layered multishift couplers, which are simple and have several useful properties. For the standard normal distribution, for instance, the layered multishift coupler generates an f(·) which (surprisingly) maps an interval of length ℓ to fewer than 2 + ℓ/2.35 points, which is useful in applications that perform computations on each such image point. The layered multishift coupler improves and simplifies algorithms for generating perfectly random samples from several distributions, including the autogamma distribution, posterior distributions for Bayesian inference, and the steady-state distribution for certain storage systems. We also use the layered multishift coupler to develop a Markov-chain-based perfect sampling algorithm for the autonormal distribution. At the request of the organizers, we begin by giving a primer on CFTP (coupling from the past); CFTP and Fill's algorithm are the two predominant techniques for generating perfectly random samples using coupled Markov chains.
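The layered construction itself is more involved, but the basic idea of a multishift coupler for uniform increments can be sketched in a few lines. This simplified, non-layered version (our sketch, with hypothetical names) couples all starting points onto a single shifted lattice:

```python
import math
import random

def shift_coupler(v):
    """Basic multishift coupler for Uniform(0,1] increments.  Given one draw
    v ~ Uniform[0,1), return f with f(x) - x = ceil(x - v) - (x - v), which is
    Uniform(0,1] for every fixed x, while f maps all of R onto the lattice
    {v + integers}."""
    def f(x):
        return v + math.ceil(x - v)
    return f

random.seed(0)
f = shift_coupler(random.random())
# Any interval of length L is mapped to at most floor(L) + 1 lattice points,
# so coupled chains started at nearby states quickly share a common state.
```

The paper's layered version extends this idea beyond uniform increments (e.g. to normal increments) while keeping the image of an interval small.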
Random walks on combinatorial objects, in Surveys in Combinatorics 1999, 1999
Cited by 22 (8 self)
Abstract:
Approximate sampling from combinatorially-defined sets, using the Markov chain Monte Carlo method, is discussed from the perspective of combinatorial algorithms. We also examine the associated problem of discrete integration over such sets. Recent work is reviewed, and we re-examine the underlying formal foundational framework in the light of this. We give a detailed treatment of the coupling technique, a classical method for analysing the convergence rates of Markov chains. The related topic of perfect sampling is examined. In perfect sampling, the goal is to sample exactly from the target set. We conclude with a discussion of negative results in this area. These are results which imply that there are no polynomial-time algorithms of a particular type for a particular problem.
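As a minimal illustration of the coupling technique the survey treats, here is the classic coordinate-wise coupling for the lazy random walk on the hypercube (our sketch, not taken from the survey): both copies always apply the identical update, so they coalesce once every coordinate has been refreshed.

```python
import random

def coupled_hypercube_walks(n, x, y, seed=0):
    """Run two copies of the lazy random walk on {0,1}^n under the classic
    coupling: both copies pick the same coordinate i and the same new bit b.
    Returns the number of steps until the copies coalesce."""
    rng = random.Random(seed)
    x, y = list(x), list(y)
    steps = 0
    while x != y:
        i = rng.randrange(n)
        b = rng.randrange(2)
        x[i] = y[i] = b        # identical update in both chains
        steps += 1
    return steps

# Started from opposite corners, coalescence is a coupon-collector time,
# which is how the coupling argument yields an O(n log n) mixing bound.
t = coupled_hypercube_walks(8, [0] * 8, [1] * 8)
```

The coupling inequality then bounds the total-variation distance from stationarity by the probability the two copies have not yet met.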
Mixing times of the biased card shuffling and the asymmetric exclusion process, Trans. Amer. Math. Soc., 2005
Cited by 21 (1 self)
Abstract:
Consider the following method of card shuffling. Start with a deck of N cards numbered 1 through N. Fix a parameter p between 0 and 1. In this model a “shuffle” consists of uniformly selecting a pair of adjacent cards and then flipping a coin that is heads with probability p. If the coin comes up heads, then we arrange the two cards so that the lower-numbered card comes before the higher-numbered card. If the coin comes up tails, then we arrange the cards with the higher-numbered card first. In this paper we prove that for all p ≠ 1/2, the mixing time of this card shuffling is O(N²), as conjectured by Diaconis and Ram (2000). Our result is a rare case of an exact estimate for the convergence rate of the Metropolis algorithm. A novel feature of our proof is that the analysis of an infinite (asymmetric exclusion) process plays an essential role in bounding the mixing time of a finite process.
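The shuffle described above is straightforward to simulate; this sketch (function names ours) implements one biased adjacent transposition per step:

```python
import random

def biased_shuffle_step(deck, p, rng):
    """One move of the biased card shuffle: pick an adjacent pair uniformly at
    random; with probability p put the lower-numbered card first, otherwise
    put the higher-numbered card first."""
    i = rng.randrange(len(deck) - 1)
    lo, hi = sorted((deck[i], deck[i + 1]))
    if rng.random() < p:
        deck[i], deck[i + 1] = lo, hi    # heads: ascending order
    else:
        deck[i], deck[i + 1] = hi, lo    # tails: descending order

rng = random.Random(0)
deck = list(range(1, 11))        # N = 10 cards
rng.shuffle(deck)
for _ in range(5000):            # the paper proves O(N^2) steps mix when p != 1/2
    biased_shuffle_step(deck, p=0.8, rng=rng)
# With p well above 1/2 the dynamics drive the deck toward sorted order.
```

For p ≠ 1/2 the stationary distribution is biased toward the sorted arrangement, which is why the paper connects this chain to the asymmetric exclusion process.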