## Loopy Belief Propagation for Approximate Inference: An Empirical Study (1999)

Venue: Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI), 1999

Citations: 491 (18 self)

### BibTeX

@INPROCEEDINGS{Murphy99loopybelief,
  author    = {Kevin P. Murphy and Yair Weiss and Michael I. Jordan},
  title     = {Loopy Belief Propagation for Approximate Inference: An Empirical Study},
  booktitle = {Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI)},
  year      = {1999},
  pages     = {467--475}
}


### Abstract

Recently, researchers have demonstrated that "loopy belief propagation" --- the use of Pearl's polytree algorithm in a Bayesian network with loops --- can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performance of "Turbo Codes" --- codes whose decoding algorithm is equivalent to loopy belief propagation in a chain-structured Bayesian network. In this paper we ask: is there something special about the error-correcting code context, or does loopy propagation work as an approximate inference scheme in a more general setting? We compare the marginals computed using loopy propagation to the exact ones in four Bayesian network architectures, including two real-world networks: ALARM and QMR. We find that the loopy beliefs often converge and when they do, they give a good approximation to the correct marginals. However, on the QMR network, the loopy beliefs oscillated and had no obvious relationship ...
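The "loopy belief propagation" the abstract refers to is Pearl's sum-product message passing run unchanged on a graph with cycles, updating all messages in parallel and checking whether they settle. A minimal sketch on an invented 3-node binary cycle (a small pairwise model with made-up potentials, not one of the paper's networks), compared against brute-force exact marginals:

```python
from itertools import product

# Hypothetical 3-node binary cycle x0 - x1 - x2 - x0; all numbers invented.
nodes = [0, 1, 2]
edges = [(0, 1), (1, 2), (2, 0)]
phi = {0: [0.6, 0.4], 1: [0.5, 0.5], 2: [0.7, 0.3]}   # unary potentials
psi = {e: [[1.2, 1.0], [1.0, 1.2]] for e in edges}    # mild pairwise potentials

nbrs = {i: [] for i in nodes}
for a, b in edges:
    nbrs[a].append(b)
    nbrs[b].append(a)

def pair_pot(i, j, xi, xj):
    """Look up the pairwise potential regardless of edge orientation."""
    return psi[(i, j)][xi][xj] if (i, j) in psi else psi[(j, i)][xj][xi]

def loopy_bp(max_iters=500, tol=1e-10):
    """Sum-product BP with parallel message updates; returns beliefs and #iterations."""
    msg = {(i, j): [1.0, 1.0] for i in nodes for j in nbrs[i]}
    for it in range(max_iters):
        new, delta = {}, 0.0
        for (i, j) in msg:
            m = [0.0, 0.0]
            for xj in (0, 1):
                for xi in (0, 1):
                    term = phi[i][xi] * pair_pot(i, j, xi, xj)
                    for k in nbrs[i]:
                        if k != j:
                            term *= msg[(k, i)][xi]   # incoming messages, excluding j
                    m[xj] += term
            z = sum(m)
            m = [v / z for v in m]                    # normalize to avoid underflow
            delta = max(delta, abs(m[0] - msg[(i, j)][0]))
            new[(i, j)] = m
        msg = new
        if delta < tol:                               # messages have converged
            break
    beliefs = {}
    for i in nodes:
        b = [phi[i][x] for x in (0, 1)]
        for k in nbrs[i]:
            b = [b[x] * msg[(k, i)][x] for x in (0, 1)]
        z = sum(b)
        beliefs[i] = [v / z for v in b]
    return beliefs, it + 1

def exact_marginals():
    """Brute-force marginals by enumerating all 2^3 joint assignments."""
    Z, marg = 0.0, {i: [0.0, 0.0] for i in nodes}
    for xs in product((0, 1), repeat=len(nodes)):
        w = 1.0
        for i in nodes:
            w *= phi[i][xs[i]]
        for a, b in edges:
            w *= pair_pot(a, b, xs[a], xs[b])
        Z += w
        for i in nodes:
            marg[i][xs[i]] += w
    return {i: [v / Z for v in marg[i]] for i in nodes}

beliefs, iters = loopy_bp()
exact = exact_marginals()
err = max(abs(beliefs[i][x] - exact[i][x]) for i in nodes for x in (0, 1))
print(f"converged in {iters} iterations, max marginal error {err:.4f}")
```

On a single loop with non-deterministic potentials the messages do converge (the Weiss result cited below), and, as the paper reports for its networks, the converged beliefs approximate but need not equal the exact marginals.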

### Citations

7314 | Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference
- Pearl
- 1988
Citation Context: ...ameter regimes for which each scheme works best. In this paper we investigate the approximation performance of "loopy belief propagation". This refers to using the well-known Pearl polytree algorithm [12] on a Bayesian network with loops (undirected cycles). The algorithm is an exact inference algorithm for singly-connected networks --- the beliefs converge to the correct marginals in a number of itera...

1360 | Near Shannon limit error-correcting coding and decoding: Turbo-codes
- Berrou, Glavieux, et al.
Citation Context: ...ing [footnote: This assumes parallel updating of all nodes. The algorithm can also be implemented in a centralized fashion, in which case it converges in two iterations [13].] code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivale...

602 | The computational complexity of probabilistic inference using Bayesian belief networks
- Cooper
- 1990
Citation Context: ...show that some simple methods of preventing them lead to the wrong results. 1 Introduction The task of calculating posterior marginals on nodes in an arbitrary Bayesian network is known to be NP-hard [5]. This is true even for the seemingly easier task of calculating approximate posteriors [6]. Nevertheless, due to the obvious practical importance of this task, there has been considerable interest in...

336 | Turbo decoding as an instance of Pearl's "belief propagation" algorithm
- McEliece, MacKay, et al.
- 1998
Citation Context: ...code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes "represent a genuine, and perhaps his...

260 | Approximating probabilistic inference in Bayesian belief networks is NP-hard
- Dagum, Luby
- 1993
Citation Context: ...The task of calculating posterior marginals on nodes in an arbitrary Bayesian network is known to be NP-hard [5]. This is true even for the seemingly easier task of calculating approximate posteriors [6]. Nevertheless, due to the obvious practical importance of this task, there has been considerable interest in assessing the quality of different approximation schemes, in an...

244 | The ALARM monitoring system: A case study with two probabilistic inference techniques for belief networks
- Beinlich
- 1989
Citation Context: ...hosen uniformly in the range [0, 1]. 3.3 The ALARM network Figure 3 shows the structure of the ALARM network --- a Bayesian network for monitoring patients in intensive care. This network was used by [3] to compare various inference algorithms. The arity of the nodes ranges from two to four and all conditional distributions are represented by tables. The structure and the CPTs were downloaded from Ni...

185 | Correctness of local probability propagation in graphical models with loops
- Weiss
- 2000
Citation Context: ...robability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expr...

162 | Simulation approaches to general probabilistic inference on belief networks
- Shachter, Peot
- 1989
Citation Context: ...ime was roughly comparable (to within an order of magnitude) to loopy propagation. We did not implement some of the more sophisticated versions of likelihood weighting, such as Markov blanket scoring [16], since our goal in this paper was to evaluate loopy propagation rather than exhaustively compare the performance of alternative algorithms. (For a more careful evaluation of likelihood weighted sampl...

125 | Mean field theory for sigmoid belief networks
- Saul, Jaakkola, et al.
- 1996
Citation Context: ...ayer and observations only at the bottom layer. We chose this structure because networks of this type are often used in image analysis --- the bottom layer would correspond to pixels (see for example [15]). All nodes were binary and the conditional probabilities were represented by tables --- entries in the conditional probability tables (CPTs) were chosen uniformly in the range [0, 1]. 3.2 The toyQMR...

115 | Iterative decoding of compound codes by probability propagation in graphical models
- Kschischang
- 1998
Citation Context: ...code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes "represent a genuine, and perhaps his...

72 | Belief Propagation and Revision in Networks with Loops
- Weiss
- 1997
Citation Context: ...robability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expr...

57 | Variational probabilistic inference and the QMR-DT network
- Jaakkola, Jordan
- 1999
Citation Context: ...aluate loopy propagation rather than exhaustively compare the performance of alternative algorithms. (For a more careful evaluation of likelihood weighted sampling in the case of the QMR network, see [8].) 3 The networks We used two synthetic networks, PYRAMID and toyQMR, and two real world networks, ALARM and QMR. The synthetic networks are sufficiently small that we can perform exact inference, usi...

53 | The geometry of turbo-decoding dynamics
- Richardson
- 2000
Citation Context: ...ment of values to the hidden nodes. This result is independent of the arity of the nodes and whether the nodes are inside or outside the loop. For the case of networks with multiple loops, Richardson [14] has analyzed the special case of Turbo codes. He has shown that fixed points of the sum-product version always exist, and has given sufficient conditions under which they will be unique and stable (a...

52 | Fusion and propagation with multiple observations in belief networks
- Peot, Shachter
Citation Context: ...f this performance is in an error correcting [footnote: This assumes parallel updating of all nodes. The algorithm can also be implemented in a centralized fashion, in which case it converges in two iterations [13].] code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown...

50 | A tractable inference algorithm for diagnosing multiple diseases
- Heckerman
- 1989
Citation Context: ...ximately 4000 finding nodes, with a number of observed findings that varies per case. Due to the form of the noisy-or CPTs the complexity of inference is exponential in the number of positive findings [7]. Following [8], we focused on the four CPC cases for which the number of positive findings is less than 20, so that exact inference is possible (using the QUICKSCORE algorithm [7]). 4 Results 4.1 Ini...

44 | Iterative decoding on graphs with a single cycle - Aji, Horn, et al. - 1998

36 | The turbo decision algorithm
- McEliece, Rodemich, et al.
- 1995
Citation Context: ...t converges in two iterations [13]. code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes...

29 | An empirical analysis of likelihood-weighting simulation on a large, multiply connected medical belief network
- Shwe, Cooper
- 1991
Citation Context: ...es were initialized to a vector of ones; random initialization yielded similar results, since the initial conditions rapidly get "washed out". For comparison, we also implemented likelihood weighting [17], which is a simple form of importance sampling. Like any sampling algorithm, the errors can be driven towards zero by running the algorithm for long enough; in this paper, we usually used 200 samples...
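Likelihood weighting, the sampling baseline mentioned in the context above, samples each unobserved node from its conditional distribution in topological order and weights the sample by the probability of the observed evidence. A toy sketch on an invented three-node network A → B, A → C (all names and CPT values are hypothetical, not from the paper):

```python
import random

# Hypothetical binary network A -> B, A -> C; CPT numbers are made up.
p_a = 0.3                        # P(A=1)
p_b_given_a = {0: 0.2, 1: 0.8}   # P(B=1 | A=a)
p_c_given_a = {0: 0.1, 1: 0.9}   # P(C=1 | A=a)

def likelihood_weighting(n_samples, seed=0):
    """Estimate P(B=1 | C=1) by weighted forward sampling."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if rng.random() < p_a else 0             # sample A ~ P(A)
        b = 1 if rng.random() < p_b_given_a[a] else 0  # sample B ~ P(B | a)
        w = p_c_given_a[a]   # evidence C=1 is not sampled; its likelihood is the weight
        num += w * b
        den += w
    return num / den

# Closed-form answer for comparison:
# P(B=1 | C=1) = sum_a P(a) P(C=1|a) P(B=1|a) / sum_a P(a) P(C=1|a)
exact = (0.7 * 0.1 * 0.2 + 0.3 * 0.9 * 0.8) / (0.7 * 0.1 + 0.3 * 0.9)
est = likelihood_weighting(5000)
print(f"estimate {est:.3f}, exact {exact:.3f}")
```

As the context notes, the error can be driven toward zero by drawing more samples; with many observed findings (as in QMR) the weights become extreme and the effective sample size collapses, which is why the paper treats it only as a baseline.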

16 | Variational probabilistic inference and the QMR-DT database
- Jaakkola, Jordan
- 1999
Citation Context: ...aper was to evaluate loopy propagation rather than exhaustively compare the performance of alternative algorithms. (For a more careful evaluation of the performance of sampling algorithms on QMR, see [8].) 3 The networks We used two synthetic networks, PYRAMID and toyQMR, and two real world networks, ALARM and QMR. The synthetic networks are sufficiently small that we can perform exact inference, usi...

11 | The structure of Bayes networks for visual recognition
- Agosta
- 1990
Citation Context: ...robability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expr...

6 | Does the wake-sleep algorithm learn good density estimators - Frey, Hinton, et al. - 1996