Results 1–10 of 25
A New Look at Survey Propagation and its Generalizations
Cited by 66 (11 self)
Abstract:
We study the survey propagation algorithm [19, 5, 4], which is an iterative technique that appears to be very effective in solving random k-SAT problems even with densities close to threshold. We first describe how any SAT formula can be associated with a novel family of Markov random fields (MRFs), parameterized by a real number ρ. We then show that applying belief propagation—a well-known “message-passing” technique—to this family of MRFs recovers various algorithms, ranging from pure survey propagation at one extreme (ρ = 1) to standard belief propagation on the uniform distribution over SAT assignments at the other extreme (ρ = 0). Configurations in these MRFs have a natural interpretation as generalized satisfiability assignments, on which a partial order can be defined. We isolate cores as minimal elements in this partial …
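The "message-passing" machinery the abstract refers to can be sketched in a few lines. Below is a minimal, self-contained illustration of sum-product belief propagation on a toy tree-shaped MRF (a 3-variable binary path with made-up pairwise potentials) — not the paper's ρ-parameterized family — with the BP marginals checked against brute-force enumeration.

```python
import itertools

# Toy MRF: a path 0 - 1 - 2 of binary variables with one pairwise
# potential per edge (illustrative values, not from the paper).
edges = [(0, 1), (1, 2)]
psi = {e: [[2.0, 1.0], [1.0, 3.0]] for e in edges}

def neighbors(v):
    return [u for e in edges for u in e if v in e and u != v]

def message(u, v, msgs):
    # Sum-product message u -> v: sum over x_u of psi(x_u, x_v)
    # times all messages flowing into u except the one from v.
    out = [0.0, 0.0]
    for xv in (0, 1):
        for xu in (0, 1):
            p = psi[(u, v)][xu][xv] if (u, v) in psi else psi[(v, u)][xv][xu]
            for w in neighbors(u):
                if w != v:
                    p *= msgs[(w, u)][xu]
            out[xv] += p
    return out

# Parallel message updates; exact after diameter-many sweeps on a tree.
msgs = {(u, v): [1.0, 1.0] for e in edges for (u, v) in (e, e[::-1])}
for _ in range(3):
    msgs = {(u, v): message(u, v, msgs) for (u, v) in msgs}

def marginal(v):
    # Belief at v: product of incoming messages, normalized.
    b = [1.0, 1.0]
    for w in neighbors(v):
        b = [b[x] * msgs[(w, v)][x] for x in (0, 1)]
    z = sum(b)
    return [x / z for x in b]

def brute(v):
    # Brute-force marginal by enumerating all assignments.
    tot = [0.0, 0.0]
    for x in itertools.product((0, 1), repeat=3):
        w = 1.0
        for (a, b_) in edges:
            w *= psi[(a, b_)][x[a]][x[b_]]
        tot[x[v]] += w
    z = sum(tot)
    return [t / z for t in tot]
```

On a tree these beliefs are exact marginals; the paper's point is that pure survey propagation arises from the same update rule applied to the ρ = 1 member of its MRF family.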
Information flow on trees
 Ann. Appl. Probab
Cited by 56 (14 self)
Abstract:
Consider a tree network T, where each edge acts as an independent copy of a given channel M, and information is propagated from the root. For which T and M does the configuration obtained at level n of T typically contain significant information on the root variable? This problem arose independently in biology, information theory and statistical physics.
• For all b, we construct a channel for which the variable at the root of the b-ary tree is independent of the configuration at level 2 of that tree, yet for sufficiently large B > b, the mutual information between the configuration at level n of the B-ary tree and the root variable is bounded away from zero. This is related to certain secret-sharing protocols.
• We improve the upper bounds on information flow for asymmetric binary channels (which correspond to the Ising model with an external field) and for symmetric q-ary channels (which correspond to Potts models).
• Let λ₂(M) denote the second largest eigenvalue of M, in absolute value. A CLT of Kesten and Stigum (1966) implies that if bλ₂(M)² > 1, then the census of the variables at any level of the b-ary tree contains significant information on the root variable. We establish a converse: if bλ₂(M)² < 1, then the census of the variables at level n of the b-ary tree is asymptotically independent of the root variable. This contrasts with examples where bλ₂(M)² < 1, yet the configuration at level n is not asymptotically independent of the root variable.
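The census statement can be probed empirically. The sketch below (illustrative parameters, not from the paper) broadcasts a root bit down a b-ary tree through a binary symmetric channel with flip probability eps, for which λ₂ = 1 − 2·eps; with b = 2 and eps = 0.1 the Kesten–Stigum quantity is b·λ₂² = 2·(0.8)² = 1.28 > 1, so a majority vote over the leaf census should beat random guessing.

```python
import random

def sample_leaves(b, eps, depth, root, rng):
    # Broadcast: each node passes its bit to b children, each copy
    # flipped independently with probability eps (BSC channel).
    level = [root]
    for _ in range(depth):
        level = [x ^ (rng.random() < eps) for x in level for _ in range(b)]
    return level

def majority_guess(leaves):
    # Reconstruct the root from the leaf census by majority vote.
    return int(2 * sum(leaves) > len(leaves))

rng = random.Random(0)
trials = 400
correct = 0
for _ in range(trials):
    root = rng.randrange(2)
    leaves = sample_leaves(2, 0.1, 8, root, rng)
    correct += majority_guess(leaves) == root
accuracy = correct / trials
```

In the supercritical regime the empirical accuracy stays bounded away from 1/2 as the depth grows, which is exactly what "the census contains significant information on the root" asserts.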
Phase transitions in phylogeny
 Trans. Amer. Math. Soc
, 2003
Cited by 32 (8 self)
Abstract:
We apply the theory of Markov random fields on trees to derive a phase transition in the number of samples needed in order to reconstruct phylogenies. We consider the Cavender–Farris–Neyman model of evolution on trees, where all the inner nodes have degree at least 3, and the net transition on each edge is bounded by ε. Motivated by a conjecture by M. Steel, we show that if 2(1 − 2ε)² > 1, then for balanced trees, the topology of the underlying tree, having n leaves, can be reconstructed from O(log n) samples (characters) at the leaves. On the other hand, we show that if 2(1 − 2ε)² < 1, then there exist topologies which require at least n^Ω(1) samples for reconstruction. Our results are the first rigorous results to establish the role of phase transitions for Markov random fields on trees, as studied in probability, statistical physics and information theory, for the study of phylogenies in mathematical biology.
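The threshold 2(1 − 2ε)² > 1 is an instance of the Kesten–Stigum condition bλ₂(M)² > 1 from the information-flow entry above: the Cavender–Farris–Neyman (binary symmetric) channel with flip probability ε has second eigenvalue 1 − 2ε, and a balanced binary tree has branching number b = 2.

```latex
% CFN channel and its second eigenvalue
M_\varepsilon = \begin{pmatrix} 1-\varepsilon & \varepsilon \\ \varepsilon & 1-\varepsilon \end{pmatrix},
\qquad \lambda_2(M_\varepsilon) = 1 - 2\varepsilon .

% Kesten--Stigum condition $b\,\lambda_2(M)^2 > 1$ with $b = 2$:
2\,(1 - 2\varepsilon)^2 > 1 .
```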
Mixing in Time and Space for Lattice Spin Systems: A Combinatorial View
 ALG
, 2004
Cited by 29 (7 self)
Abstract:
The paper considers spin systems on the d-dimensional integer lattice Z^d with nearest-neighbor interactions. A sharp equivalence is proved between exponential decay with distance of spin correlations (a spatial property of the equilibrium state) and "superfast" mixing time of the Glauber dynamics (a temporal property of a Markov chain Monte Carlo algorithm). While such …
Reconstruction for models on random graphs
 In FOCS ’07: Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science
, 2007
Cited by 27 (4 self)
Abstract:
The reconstruction problem requires one to estimate a random variable given ‘far away’ observations. Several theoretical results (and simple algorithms) are available when the underlying probability distribution is Markov with respect to a tree. In this paper we establish several exact thresholds for loopy graphs. More precisely, we consider models on random graphs that converge locally to trees. We establish the reconstruction thresholds for the Ising model both with attractive and random interactions (respectively, ‘ferromagnetic’ and ‘spin glass’). Remarkably, in the first case the result does not coincide with the corresponding tree threshold. Among other tools, we develop a sufficient condition for the tree and graph reconstruction problems to coincide. We apply this condition to antiferromagnetic colorings of random graphs.
Robust Reconstruction on Trees is Determined By the Second Eigenvalue
, 2002
Cited by 23 (8 self)
Abstract:
Consider information propagation from the root of the infinite B-ary tree, where each edge of the tree acts as an independent copy of some channel M. The reconstruction problem is solvable if the nth level of the tree contains a non-vanishing amount of information on the root of the tree as n → ∞.
Elementary bounds on Poincaré and log-Sobolev constants for decomposable Markov chains
 Annals of Applied Probability
, 2004
Cited by 22 (0 self)
Abstract:
We consider finite-state Markov chains that can be naturally decomposed into smaller “projection” and “restriction” chains. Possibly this decomposition will be inductive, in that the restriction chains will be smaller copies of the initial chain. We provide expressions for Poincaré (resp. log-Sobolev) constants of the initial Markov chain in terms of Poincaré (resp. log-Sobolev) constants of the projection and restriction chains, together with a further parameter. In the case of the Poincaré constant, our bound is always at least as good as existing ones and, depending on the value of the extra parameter, may be much better. There appears to be no previously published decomposition result for the log-Sobolev constant. Our proofs are elementary and self-contained. 1. The setting. In a number of applications, one is interested in finding tight, non-asymptotic upper bounds on the mixing time, that is, the rate of convergence to stationarity, of finite-state Markov chains. One important example arises in the analysis of Markov chain Monte Carlo algorithms. These are algorithms for …
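For context, the Poincaré and log-Sobolev constants referred to here are the best constants in the following standard inequalities for a reversible chain with transition matrix P and stationary distribution π (standard textbook definitions, not taken from the paper; normalization conventions vary across the literature):

```latex
% Dirichlet form of a reversible chain $(P, \pi)$
\mathcal{E}(f,f) = \tfrac12 \sum_{x,y} \pi(x)\,P(x,y)\,\bigl(f(x)-f(y)\bigr)^2 .

% Poincar\'e inequality: the Poincar\'e constant is the least $C_P$ with
\operatorname{Var}_\pi(f) \le C_P\,\mathcal{E}(f,f) \quad \text{for all } f .

% Log-Sobolev inequality: the log-Sobolev constant is the least $C_{LS}$ with
\operatorname{Ent}_\pi(f^2) \le C_{LS}\,\mathcal{E}(f,f), \qquad
\operatorname{Ent}_\pi(f^2) = \mathbb{E}_\pi\!\Bigl[f^2 \log \frac{f^2}{\mathbb{E}_\pi f^2}\Bigr].
```

Bounds on these constants translate into the non-asymptotic mixing-time bounds the abstract mentions, e.g. mixing in time of order C_P · log(1/π_min).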
Reconstruction thresholds on regular trees
 in DMTCS volume Proc. Discrete Random Walks
, 2003
Cited by 17 (0 self)
Abstract:
We consider the model of broadcasting on a tree, with binary state space, on the infinite rooted tree T_k in which each node has k children. The root of the tree takes a random value 0 or 1, and then each node passes a value independently to each of its children according to a 2 × 2 transition matrix P. We say that reconstruction is possible if the values at the dth level of the tree contain non-vanishing information about the value at the root as d → ∞. Extending a method of Brightwell and Winkler, we obtain new conditions under which reconstruction is impossible, both in the general case and in the special case p₁₁ = 0. The latter case is closely related to the hard-core model from statistical physics; a corollary of our results is that, for the hard-core model on the (k + 1)-regular tree with activity λ = 1, the unique simple invariant Gibbs measure is extremal in the set of Gibbs measures, for any k ≥ 2.
Exact Thresholds for Ising–Gibbs Samplers on General Graphs
Cited by 14 (3 self)
Abstract:
We establish tight results for rapid mixing of Gibbs samplers for the ferromagnetic Ising model on general graphs. We show that if (d − 1) tanh β < 1, then there exists a constant C such that the discrete-time mixing time of Gibbs samplers for the ferromagnetic Ising model on any graph of n vertices and maximal degree d, where all interactions are bounded by β and external fields are arbitrary, is bounded by C n log n. We further show that when d tanh β < 1, with high probability over the Erdős–Rényi random graph on n vertices with average degree d, the mixing time of Gibbs samplers is n^(1+Θ(1/log log n)). Both results are tight, as it is known that the mixing time for random regular and Erdős–Rényi random graphs is, with high probability, exponential in n when (d − 1) tanh β > 1 and d tanh β > 1, respectively.
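The Markov chain whose mixing time is bounded here is easy to write down. Below is a minimal sketch of single-site (heat-bath) Glauber dynamics for the ferromagnetic Ising model on an arbitrary graph; the graph, β and step count are illustrative choices, not from the paper.

```python
import math
import random

def glauber_step(spins, adj, beta, rng):
    # Pick a uniform vertex and resample its spin from the conditional
    # Ising distribution given its neighbours (heat-bath update).
    v = rng.randrange(len(spins))
    field = sum(spins[u] for u in adj[v])          # local field at v
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
    spins[v] = 1 if rng.random() < p_plus else -1

def run(adj, beta, steps, rng):
    spins = [rng.choice((-1, 1)) for _ in adj]     # uniform random start
    for _ in range(steps):
        glauber_step(spins, adj, beta, rng)
    return spins

# Example: a 4-cycle, so d = 2; (d - 1) * tanh(beta) ~ 0.29 < 1 at
# beta = 0.3, i.e. the regime where the abstract gives O(n log n) mixing.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
rng = random.Random(1)
final = run(adj, 0.3, 2000, rng)
```

The condition (d − 1) tanh β < 1 says the influence of any neighbour (at most tanh β per edge) is too weak to propagate along the d − 1 outgoing edges of a vertex, which is what drives the fast-mixing proof regime.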
Rapid mixing of Gibbs sampling on graphs that are sparse on average
, 2007
Cited by 13 (3 self)
Abstract:
Gibbs sampling, also known as Glauber dynamics, is a popular technique for sampling high-dimensional distributions defined on graphs. Of special interest is the behavior of Gibbs sampling on the Erdős–Rényi random graph G(n, d/n), where each edge is chosen independently with probability d/n and d is fixed. While the average degree in G(n, d/n) is d(1 − o(1)), it contains many nodes of degree of order log n / log log n. The existence of nodes of almost logarithmic degree implies that for many natural distributions defined on G(n, p), such as uniform coloring (with a constant number of colors) or the Ising model at any fixed inverse temperature β, the mixing time of Gibbs sampling is at least n^(1+Ω(1/log log n)). Recall that the Ising model with inverse temperature β defined on a graph G = (V, E) is the distribution over {±1}^V given by P(σ) = 1 …