Results 1–10 of 79
Loopy belief propagation: Convergence and effects of message errors
 Journal of Machine Learning Research
, 2005
Cited by 104 (9 self)
Belief propagation (BP) is an increasingly popular method of performing approximate inference on arbitrary graphical models. At times, even further approximations are required, whether due to quantization of the messages or model parameters, from other simplified message or model representations, or from stochastic approximation methods. The introduction of such errors into the BP message computations has the potential to adversely affect the obtained solution. We analyze the effect resulting from message approximation under two particular measures of error, and show bounds on the accumulation of errors in the system. This analysis leads to convergence conditions for traditional BP message passing, and both strict bounds and estimates of the resulting error in systems of approximate BP message passing.
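For concreteness, here is a minimal sum-product BP loop on a binary pairwise MRF, with optional message damping, one of the simple message modifications whose error such analyses bound. This is our own illustrative sketch, not the paper's algorithm or analysis:

```python
import itertools
import numpy as np

def sum_product(n, unary, pairwise, edges, iters=50, damping=0.0):
    """Synchronous sum-product BP on a binary pairwise MRF.
    unary[i]: shape-(2,) factor phi_i; pairwise[(i, j)]: shape-(2, 2)
    factor psi_ij indexed [x_i, x_j]; edges: list of (i, j) pairs.
    Returns normalized single-node beliefs."""
    nbrs = {v: [] for v in range(n)}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    # one message per directed edge, initialized uniform
    msgs = {(a, b): np.full(2, 0.5) for i, j in edges for a, b in [(i, j), (j, i)]}
    for _ in range(iters):
        new = {}
        for (a, b) in msgs:
            psi = pairwise[(a, b)] if (a, b) in pairwise else pairwise[(b, a)].T
            h = unary[a].copy()
            for k in nbrs[a]:
                if k != b:
                    h *= msgs[(k, a)]   # all incoming messages except from b
            m = psi.T @ h               # sum out x_a
            m /= m.sum()                # normalize for numerical stability
            new[(a, b)] = damping * msgs[(a, b)] + (1 - damping) * m
        msgs = new
    beliefs = {}
    for v in range(n):
        b = unary[v].copy()
        for k in nbrs[v]:
            b *= msgs[(k, v)]
        beliefs[v] = b / b.sum()
    return beliefs
```

On a tree the fixed point is exact, which gives an easy sanity check against brute-force marginalization.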
Consensus propagation
 IEEE Transactions on Information Theory
Cited by 89 (5 self)
Abstract — We propose consensus propagation, an asynchronous distributed protocol for averaging numbers across a network. We establish convergence, characterize the convergence rate for regular graphs, and demonstrate that the protocol exhibits better scaling properties than pairwise averaging, an alternative that has received much recent attention. Consensus propagation can be viewed as a special case of belief propagation, and our results contribute to the belief propagation literature. In particular, beyond singly-connected graphs, there are very few classes of relevant problems for which belief propagation is known to converge. Index Terms — belief propagation, distributed averaging, distributed consensus, distributed signal processing, Gaussian Markov random fields, message-passing algorithms, max-product algorithm, min-sum algorithm, sum-product algorithm.
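The pairwise-averaging baseline the abstract compares against is easy to sketch (an illustrative toy, not the consensus-propagation protocol itself): repeatedly pick a random edge and replace both endpoint values by their average. The global mean is invariant at every step, and on a connected graph all values converge to it:

```python
import random

def pairwise_averaging(values, edges, steps=20000, seed=0):
    """Gossip-style pairwise averaging on a graph: each step picks a
    random edge (i, j) and sets both endpoints to their average.  The
    sum of all values (hence the mean) is preserved at every step."""
    vals = list(values)
    rng = random.Random(seed)
    for _ in range(steps):
        i, j = rng.choice(edges)
        vals[i] = vals[j] = (vals[i] + vals[j]) / 2
    return vals
```

The paper's point is that this baseline mixes slowly on poorly connected graphs, which is where consensus propagation's better scaling matters.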
MAP Estimation, Linear Programming and Belief Propagation with Convex Free Energies
, 2007
Cited by 74 (4 self)
Finding the most probable assignment (MAP) in a general graphical model is known to be NP-hard, but good approximations have been attained with max-product belief propagation (BP) and its variants. In particular, it is known that using BP on a single-cycle graph or tree-reweighted BP on an arbitrary graph will give the MAP solution if the beliefs have no ties. In this paper we extend the setting under which BP can be used to provably extract the MAP. We define convex BP as the class of BP algorithms based on a convex free energy approximation and show that this class includes ordinary BP on single-cycle graphs, tree-reweighted BP, and many other BP variants. We show that when there are no ties, fixed points of convex max-product BP will provably give the MAP solution. We also show that convex sum-product BP at sufficiently small temperatures can be used to solve linear programs that arise from relaxing the MAP problem. Finally, we derive a novel condition that allows us to obtain the MAP solution even if some of the convex BP beliefs have ties. In experiments, we show that our theorems allow us to find the MAP in many real-world instances of graphical models where exact inference using the junction tree is impossible.
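On a tree, max-product BP with tie-free beliefs already recovers the MAP exactly; a minimal sketch for the simplest case, a chain, where the forward max-message pass plus backtracking is just Viterbi (our own illustration, not the paper's convex BP):

```python
import itertools
import numpy as np

def max_product_chain(unary, pairwise):
    """Max-product BP on a chain: a forward pass of max-messages
    followed by backtracking (i.e. the Viterbi algorithm).
    unary: list of (k,) arrays; pairwise: list of (k, k) arrays,
    one per consecutive pair.  Returns the MAP assignment,
    assuming the beliefs have no ties."""
    msg = unary[0]
    back = []
    for t in range(1, len(unary)):
        scores = msg[:, None] * pairwise[t - 1]  # scores[x_prev, x_t]
        back.append(scores.argmax(axis=0))       # best x_prev for each x_t
        msg = scores.max(axis=0) * unary[t]
    x = [int(msg.argmax())]
    for ptr in reversed(back):
        x.append(int(ptr[x[-1]]))
    return x[::-1]
```

A brute-force sweep over all assignments confirms the fixed point gives the exact MAP on chains.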
Sufficient conditions for convergence of the sum-product algorithm
 IEEE Trans. IT
, 2007
Cited by 62 (2 self)
Abstract—Novel conditions are derived that guarantee convergence
Norm-Product Belief Propagation: Primal-Dual Message-Passing for Approximate Inference
, 2008
Cited by 53 (11 self)
Inference problems in graphical models can be represented as a constrained optimization of a free energy function. In this paper we treat both forms of probabilistic inference, estimating marginal probabilities of the joint distribution and finding the most probable assignment, through a unified message-passing algorithm architecture. In particular we generalize the sum-product and max-product Belief Propagation (BP) algorithms and the tree-reweighted (TRW) sum- and max-product algorithms (TRBP), and introduce a new set of convergent algorithms based on "convex free energies" and on Linear-Programming (LP) relaxation as a zero-temperature limit of a convex free energy. The main idea of this work arises from taking a general perspective on the existing BP and TRBP algorithms while observing that they are all reductions from the basic optimization formula of f + ∑_i h_i
Convexity arguments for efficient minimization of the Bethe and Kikuchi free energies.
 Journal of Artificial Intelligence Research,
, 2006
Cited by 44 (0 self)
Abstract Loopy and generalized belief propagation are popular algorithms for approximate inference in Markov random fields and Bayesian networks. Fixed points of these algorithms have been shown to correspond to extrema of the Bethe and Kikuchi free energy, both of which are approximations of the exact Helmholtz free energy. However, belief propagation does not always converge, which motivates approaches that explicitly minimize the Kikuchi/Bethe free energy, such as CCCP and UPS. Here we describe a class of algorithms that solves this typically nonconvex constrained minimization problem through a sequence of convex constrained minimizations of upper bounds on the Kikuchi free energy. Intuitively one would expect tighter bounds to lead to faster algorithms, which is indeed convincingly demonstrated in our simulations. Several ideas are applied to obtain tight convex bounds that yield dramatic speedups over CCCP.
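The "sequence of convex upper bounds" strategy is an instance of majorize-minimize: at each step, minimize a convex bound that touches the objective at the current iterate, which can only decrease the objective. A scalar toy example of the principle (our own illustration, unrelated to the paper's Kikuchi bounds):

```python
import math

def majorize_minimize(x0, iters=200, L=1.2):
    """Minimize the non-convex f(x) = cos(x) + x^2/10 by repeatedly
    minimizing the convex quadratic upper bound
        g_y(x) = f(y) + f'(y)(x - y) + (L/2)(x - y)^2,
    which is valid because |f''(x)| <= 1 + 0.2 <= L everywhere.
    The exact minimizer of g_y is x = y - f'(y)/L, so the objective
    can never increase along the iterates."""
    f = lambda x: math.cos(x) + x * x / 10
    df = lambda x: -math.sin(x) + x / 5
    xs, x = [x0], x0
    for _ in range(iters):
        x = x - df(x) / L   # minimize the current convex bound
        xs.append(x)
    return xs, f
```

As the paper argues, tighter (less conservative) bounds give bigger steps and faster convergence; here a smaller valid L would play that role.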
Efficient belief propagation for vision using linear constraint nodes
 in CVPR
, 2007
Cited by 39 (7 self)
Belief propagation over pairwise-connected Markov Random Fields has become a widely used approach, and has been successfully applied to several important computer vision problems. However, pairwise interactions are often insufficient to capture the full statistics of the problem, and higher-order interactions are sometimes required. Unfortunately, the complexity of belief propagation is exponential in the size of the largest clique. In this paper, we introduce a new technique to compute belief propagation messages in time linear with respect to clique size for a large class of potential functions over real-valued variables. We demonstrate this technique in two applications. First, we perform efficient inference in graphical models where the spatial prior of natural images is captured by 2 × 2 cliques. This approach shows significant improvement over the commonly used pairwise-connected models, and may benefit a variety of applications using belief propagation to infer images or range images. Second, we apply these techniques to shape-from-shading and demonstrate significant improvement over previous methods, both in quality and in flexibility.
Loop series and Bethe variational bounds in attractive graphical models
, 2008
Cited by 33 (0 self)
Variational methods are frequently used to approximate or bound the partition or likelihood function of a Markov random field. Methods based on mean field theory are guaranteed to provide lower bounds, whereas certain types of convex relaxations provide upper bounds. In general, loopy belief propagation (BP) often provides accurate approximations, but not bounds. We prove that for a class of attractive binary models, the so-called Bethe approximation associated with any fixed point of loopy BP always lower-bounds the true likelihood. Empirically, this bound is much tighter than the naive mean field bound, and requires no further work than running BP. We establish these lower bounds using a loop series expansion due to Chertkov and Chernyak, which we show can be derived as a consequence of the tree reparameterization characterization of BP fixed points.
Sufficient conditions for convergence of loopy belief propagation
 In Proc. Conference on Uncertainty in Artificial Intelligence (UAI)
, 2005
Cited by 30 (3 self)
We derive novel conditions that guarantee convergence of Loopy Belief Propagation (also known as the Sum-Product algorithm) to a unique fixed point. Our results are provably stronger than existing sufficient conditions. We show that the improvement can be quite substantial; in particular, for binary variables with (anti)ferromagnetic interactions, our conditions seem to be sharp.
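For binary (Ising-type) pairwise models with couplings J_ij, conditions of this kind bound how strongly one message update can amplify perturbations of its inputs. A deliberately conservative sup-norm check in the spirit of these results (our illustration; much weaker than the paper's sharp spectral conditions):

```python
import math

def conservative_bp_check(J):
    """For a symmetric coupling matrix J of a binary pairwise (Ising)
    model, return True if sum_j tanh(|J_ij|) < 1 at every node i.
    Since each outgoing BP message depends on the incoming messages
    with sensitivity at most tanh(|J|) per edge, this makes the update
    a sup-norm contraction, so BP converges to a unique fixed point.
    (A conservative sufficient condition, not the paper's.)"""
    n = len(J)
    return all(
        sum(math.tanh(abs(J[i][j])) for j in range(n) if j != i) < 1
        for i in range(n)
    )
```

Weak couplings pass the check; strong (anti)ferromagnetic couplings fail it, even though BP may still converge in some such cases, which is exactly the gap the paper's sharper conditions close.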
Convergent messagepassing algorithms for inference over general graphs with convex free energy
 In The 24th Conference on Uncertainty in Artificial Intelligence (UAI)
, 2008
Cited by 25 (8 self)
Inference problems in graphical models can be represented as a constrained optimization of a free energy function. It is known that when the Bethe free energy is used, the fixed points of the belief propagation (BP) algorithm correspond to the local minima of the free energy. However, BP fails to converge in many cases of interest. Moreover, the Bethe free energy is non-convex for graphical models with cycles, which makes it difficult to derive efficient algorithms for finding local minima of the free energy for general graphs. In this paper we introduce two efficient BP-like algorithms, one sequential and the other parallel, that are guaranteed to converge to the global minimum, for any graph, over the class of energies known as "convex free energies". In addition, we propose an efficient heuristic for setting the parameters of the convex free energy based on the structure of the graph.