
## Loopy belief propagation for approximate inference: An empirical study (1999)

Venue: | Proceedings of Uncertainty in AI (UAI) |

Citations: | 676 (15 self) |

### Citations

8903 | Probabilistic reasoning in intelligent systems: networks of plausible inference
- Pearl
- 1988
Citation Context: ...ameter regimes for which each scheme works best. In this paper we investigate the approximation performance of "loopy belief propagation". This refers to using the well-known Pearl polytree algorithm [12] on a Bayesian network with loops (undirected cycles). The algorithm is an exact inference algorithm for singly-connected networks: the beliefs converge to the correct marginals in a number of itera...

1776 | Near Shannon Limit Error-Correcting Coding and Decoding
- Berrou, Glavieux, et al.
- 1993
Citation Context: ...ing code scheme known as "Turbo Codes" [4]. (Footnote 1: This assumes parallel updating of all nodes; the algorithm can also be implemented in a centralized fashion, in which case it converges in two iterations [13].) These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivale...

720 | Computational complexity of probabilistic inference using Bayesian belief networks (research note)
- Cooper
- 1990
Citation Context: ...show that some simple methods of preventing them lead to the wrong results. 1 Introduction The task of calculating posterior marginals on nodes in an arbitrary Bayesian network is known to be NP-hard [5]. This is true even for the seemingly easier task of calculating approximate posteriors [6]. Nevertheless, due to the obvious practical importance of this task, there has been considerable interest in...

404 | Turbo decoding as an instance of Pearl’s ‘belief propagation’ algorithm
- McEliece, MacKay, et al.
- 1998
Citation Context: ...code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes "represent a genuine, and perhaps his...

292 | Approximating probabilistic inference in Bayesian belief networks is NP-hard
- Dagum, Luby
- 1993
Citation Context: ...The task of calculating posterior marginals on nodes in an arbitrary Bayesian network is known to be NP-hard [5]. This is true even for the seemingly easier task of calculating approximate posteriors [6]. Nevertheless, due to the obvious practical importance of this task, there has been considerable interest in assessing the quality of different approximation schemes, in an...

285 | The ALARM monitoring system: A case study with two probabilistic inference techniques for belief networks
- Beinlich, Suermondt, et al.
- 1988
Citation Context: ...hosen uniformly in the range [0, 1]. 3.3 The ALARM network Figure 3 shows the structure of the ALARM network, a Bayesian network for monitoring patients in intensive care. This network was used by [3] to compare various inference algorithms. The arity of the nodes ranges from two to four and all conditional distributions are represented by tables. The structure and the CPTs were downloaded from Ni...

231 | Correctness of local probability propagation in graphical models with loops
- Weiss
Citation Context: ...robability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expr...

179 | Simulation Approaches to General Probabilistic Inference on Belief Networks.
- Shachter, Peot
- 1989
Citation Context: ...ime was roughly comparable (to within an order of magnitude) to loopy propagation. We did not implement some of the more sophisticated versions of likelihood weighting, such as Markov blanket scoring [16], since our goal in this paper was to evaluate loopy propagation rather than exhaustively compare the performance of alternative algorithms. (For a more careful evaluation of likelihood weighted sampl...

154 | Mean field theory for sigmoid belief networks.
- Saul, Jaakkola, et al.
- 1996
Citation Context: ...ayer and observations only at the bottom layer. We chose this structure because networks of this type are often used in image analysis; the bottom layer would correspond to pixels (see for example [15]). All nodes were binary and the conditional probabilities were represented by tables; entries in the conditional probability tables (CPTs) were chosen uniformly in the range [0, 1]. 3.2 The toyQMR...

139 | Iterative decoding of compound codes by probability propagation in graphical models.
- Kschischang, Frey
- 1998
Citation Context: ...code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes "represent a genuine, and perhaps his...

77 | Belief propagation and revision in networks with loops.
- Weiss
- 1997
Citation Context: ...robability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expr...

65 | Variational probabilistic inference and the QMR-DT network.
- Jaakkola, Jordan
- 1999
Citation Context: ...aluate loopy propagation rather than exhaustively compare the performance of alternative algorithms. (For a more careful evaluation of likelihood weighted sampling in the case of the QMR network, see [8].) 3 The networks We used two synthetic networks, PYRAMID and toyQMR, and two real-world networks, ALARM and QMR. The synthetic networks are sufficiently small that we can perform exact inference, usi...

65 | The geometry of turbo decoding dynamics.
- Richardson
- 1999
Citation Context: ...ment of values to the hidden nodes. This result is independent of the arity of the nodes and whether the nodes are inside or outside the loop. For the case of networks with multiple loops, Richardson [14] has analyzed the special case of Turbo codes. He has shown that fixed points of the sum-product version always exist, and has given sufficient conditions under which they will be unique and stable (a...

62 | Fusion and propagation with multiple observations in belief networks
- Peot, Shachter
- 1991
Citation Context: ...f this performance is in an error-correcting code scheme known as "Turbo Codes" [4]. (Footnote 1: This assumes parallel updating of all nodes; the algorithm can also be implemented in a centralized fashion, in which case it converges in two iterations [13].) These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown ...

56 | A tractable inference algorithm for diagnosing multiple diseases. In
- Heckerman
- 1989
Citation Context: ...ximately 4000 finding nodes, with a number of observed findings that varies per case. Due to the form of the noisy-or CPTs the complexity of inference is exponential in the number of positive findings [7]. Following [8], we focused on the four CPC cases for which the number of positive findings is less than 20, so that exact inference is possible (using the QUICKSCORE algorithm [7]). 4 Results 4.1 Ini...
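The noisy-or CPT referenced in this excerpt can be sketched directly. The following is a hypothetical illustration (the inhibition probabilities are invented, not taken from QMR): a finding stays off only if the leak is inactive and every active parent disease independently fails to cause it, so the update is a simple product over active parents.

```python
# Hypothetical noisy-or gate, as used in QMR-style networks.
def noisy_or_prob(active_parents, inhibit, leak=0.0):
    """P(finding = 1 | parent states) under a noisy-or CPT.

    active_parents: indices of parent diseases that are present
    inhibit: inhibit[i] = probability parent i fails to cause the finding
    leak: probability the finding appears with no active parents
    """
    p_off = 1.0 - leak                 # chance the leak does not fire
    for i in active_parents:
        p_off *= inhibit[i]            # each active parent must be inhibited
    return 1.0 - p_off

inhibit = {0: 0.2, 1: 0.5}             # illustrative inhibition probabilities
print(noisy_or_prob([], inhibit))      # no leak, no active parents: 0.0
print(noisy_or_prob([0], inhibit))     # 1 - 0.2 = 0.8
print(noisy_or_prob([0, 1], inhibit))  # 1 - 0.2 * 0.5 = 0.9
```

Each positive finding couples all of its parent diseases at once, which is the source of the exponential cost in the number of positive findings noted in the excerpt.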

50 | On the convergence of iterative decoding on graphs with a single cycle.
- Aji, Horn, et al.
- 1998
Citation Context: ...conjectured that the performance of loopy belief propagation on the Turbo code structure was a special case of a more general phenomenon: We believe there are general undiscovered theorems about the performance of belief propagation on loopy DAGs. These theorems, which may have nothing directly to do with coding or decoding, will show that in some sense belief propagation "converges with high probability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expression relating the correct marginals to the loopy marginals. The approximation error is related to the convergence rate of the messages: the faster the convergence, the more exact the approximation. • If the hidden nodes are binary, then thresholding the loopy beliefs is guaranteed to give the most probable assignment, even though the numerical value of the beliefs may be incorrect. This resul...
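The single-loop claims in this excerpt can be checked numerically. The sketch below runs sum-product on an undirected pairwise model: a three-node binary cycle with attractive couplings and a bias on one node. The pairwise formulation and all potentials are illustrative assumptions, not the cited construction. The loopy beliefs converge, and thresholding them recovers the exact most probable assignment found by enumeration:

```python
import itertools

# Three binary nodes in a single loop 0-1-2-0; illustrative potentials.
phi = [[1.0, 2.0], [1.0, 1.0], [1.0, 1.0]]   # unary: node 0 biased toward 1

def psi(a, b):
    return 2.0 if a == b else 1.0            # attractive pairwise coupling

edges = [(0, 1), (1, 2), (2, 0)]
msg = {(i, j): [1.0, 1.0]                    # directed messages, uniform init
       for a, b in edges for (i, j) in [(a, b), (b, a)]}

for _ in range(100):                          # iterate sum-product to convergence
    new = {}
    for (i, j) in msg:
        k = ({0, 1, 2} - {i, j}).pop()       # i's other neighbor on the cycle
        m = [sum(phi[i][xi] * psi(xi, xj) * msg[(k, i)][xi] for xi in (0, 1))
             for xj in (0, 1)]
        z = sum(m)
        new[(i, j)] = [v / z for v in m]
    msg = new

beliefs = []
for i in range(3):                            # belief = unary times both messages
    b = [phi[i][xi] * msg[((i + 1) % 3, i)][xi] * msg[((i + 2) % 3, i)][xi]
         for xi in (0, 1)]
    z = sum(b)
    beliefs.append([v / z for v in b])

def weight(x):                                # exact joint weight, for enumeration
    return (phi[0][x[0]] * phi[1][x[1]] * phi[2][x[2]]
            * psi(x[0], x[1]) * psi(x[1], x[2]) * psi(x[2], x[0]))

map_state = max(itertools.product((0, 1), repeat=3), key=weight)
thresholded = tuple(int(b[1] > 0.5) for b in beliefs)
print(beliefs, thresholded, map_state)        # thresholded beliefs match the MAP
```

The loopy beliefs themselves are not the exact marginals, but their threshold agrees with the exact MAP assignment, consistent with the binary single-loop result quoted above.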

39 | The Turbo decision algorithm.
- McEliece, Rodemich, et al.
- 1995
Citation Context: ...t converges in two iterations [13]. ...code scheme known as "Turbo Codes" [4]. These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes ...

33 | An Empirical Analysis of Likelihood-Weighting Simulation on a Large, Multiply Connected Belief Network.
- Shwe, Cooper
- 1990
Citation Context: ...es were initialized to a vector of ones; random initialization yielded similar results, since the initial conditions rapidly get "washed out". For comparison, we also implemented likelihood weighting [17], which is a simple form of importance sampling. Like any sampling algorithm, the errors can be driven towards zero by running the algorithm for long enough; in this paper, we usually used 200 samples...
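Likelihood weighting, mentioned in this excerpt, clamps the evidence nodes and weights each sample by the likelihood of the evidence given the sampled parents. A minimal sketch on a hypothetical two-node network A -> B (probabilities invented for illustration):

```python
import random

P_A = 0.3                           # P(A = 1), illustrative prior
P_B_GIVEN_A = {1: 0.9, 0: 0.2}      # P(B = 1 | A), illustrative CPT

def likelihood_weighting(b_observed, n_samples=20000, seed=0):
    """Estimate P(A = 1 | B = b_observed) by likelihood weighting."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if rng.random() < P_A else 0          # sample unobserved A
        p_b = P_B_GIVEN_A[a]
        w = p_b if b_observed == 1 else 1.0 - p_b   # weight by evidence likelihood
        num += w * a
        den += w
    return num / den

# Exact posterior for comparison: 0.3*0.9 / (0.3*0.9 + 0.7*0.2) ~ 0.659
print(likelihood_weighting(1))
```

When the evidence is improbable, most weights are tiny and many more samples than the 200 used in the paper may be needed for a stable estimate.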

18 | Variational probabilistic inference and the QMR-DT database.
- Jaakkola, Jordan
- 1999
Citation Context: ...aper was to evaluate loopy propagation rather than exhaustively compare the performance of alternative algorithms. (For a more careful evaluation of the performance of sampling algorithms on QMR, see [8].) 3 The networks We used two synthetic networks, PYRAMID and toyQMR, and two real-world networks, ALARM and QMR. The synthetic networks are sufficiently small that we can perform exact inference, usi...

13 | The structure of Bayes networks for visual recognition.
- Agosta
- 1990
Citation Context: ...robability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expr...

6 | Does the wake-sleep algorithm learn good density estimators? - Frey, Hinton, et al. - 1996

1 | Approximate probabilistic inference in Bayesian networks is NP-hard.
- Dagum, Luby
- 1993
Citation Context: ...opy beliefs often converge and when they do, they give a good approximation to the correct marginals. However, on the QMR network, the loopy beliefs oscillated and had no obvious relationship to the correct posteriors. We present some initial investigations into the cause of these oscillations, and show that some simple methods of preventing them lead to the wrong results. 1 Introduction The task of calculating posterior marginals on nodes in an arbitrary Bayesian network is known to be NP-hard [5]. This is true even for the seemingly easier task of calculating approximate posteriors [6]. Nevertheless, due to the obvious practical importance of this task, there has been considerable interest in assessing the quality of different approximation schemes, in an attempt to delimit the types of networks and parameter regimes for which each scheme works best. In this paper we investigate the approximation performance of "loopy belief propagation". This refers to using the well-known Pearl polytree algorithm [12] on a Bayesian network with loops (undirected cycles). The algorithm is an exact inference algorithm for singly-connected networks: the beliefs converge to the corre...

1 | Iterative decoding of compound codes by probability propagation in graphical models.
- Kschischang, Frey
- 1998
Citation Context: ...tal results by using this approximation scheme by running algorithms equivalent to Pearl's algorithm on networks with loops. Perhaps the most dramatic instance of this performance is in an error-correcting code scheme known as "Turbo Codes" [4]. (Footnote 1: This assumes parallel updating of all nodes; the algorithm can also be implemented in a centralized fashion, in which case it converges in two iterations [13].) These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes "represent a genuine, and perhaps historic, breakthrough" [11], a theoretical understanding of their performance has yet to be achieved. Yet McEliece et al. conjectured that the performance of loopy belief propagation on the Turbo code structure was a special case of a more general phenomenon: We believe there are general undiscovered theorems about the performance of belief propagation on loopy DAGs. These theorems, which ...

1 | Turbo decoding as an instance of Pearl's 'belief propagation' algorithm.
- McEliece, MacKay, et al.
- 1998
Citation Context: ...tal results by using this approximation scheme by running algorithms equivalent to Pearl's algorithm on networks with loops. Perhaps the most dramatic instance of this performance is in an error-correcting code scheme known as "Turbo Codes" [4]. (Footnote 1: This assumes parallel updating of all nodes; the algorithm can also be implemented in a centralized fashion, in which case it converges in two iterations [13].) These codes have been described as "the most exciting and potentially important development in coding theory in many years" [11] and have recently been shown [9, 10] to utilize an algorithm equivalent to belief propagation in a network with loops. Although there is widespread agreement in the coding community that these codes "represent a genuine, and perhaps historic, breakthrough" [11], a theoretical understanding of their performance has yet to be achieved. Yet McEliece et al. conjectured that the performance of loopy belief propagation on the Turbo code structure was a special case of a more general phenomenon: We believe there are general undiscovered theorems about the performance of belief propagation on loopy DAGs. These theorems, which ...

1 | Fusion and propagation with multiple observations in belief networks.
- Peot, Shachter
- 1991
Citation Context: ...ch it can be guaranteed to work well. In this paper we investigate loopy propagation empirically under a wider range of conditions. Is there something special about the error-correcting code setting, or does loopy propagation work as an approximation scheme for a wider range of networks? 2 The algorithm For completeness, we briefly summarize Pearl's belief propagation algorithm. Each node X computes a belief BEL(x) = P(X = x | E), where E denotes the observed evidence, by combining messages from its children λ_{Y_j}(x) and messages from its parents π_X(u_k). (Following Peot and Shachter [13], we incorporate evidence by letting a node send a message to itself, λ_X(x).) The belief is BEL(x) = α λ(x) π(x) (1), where λ^(t)(x) = λ_X(x) ∏_j λ^(t)_{Y_j}(x) (2) and π^(t)(x) = Σ_u P(X = x | U = u) ∏_k π^(t)_X(u_k) (3). The message X passes to its parent U_i is given by (4), and the message X sends to its child Y_j is given by π^(t+1)_{Y_j}(x) = α π^(t)(x) λ_X(x) ∏_{k≠j} λ^(t)_{Y_k}(x) (5). For noisy-or links between parents and children, there exists an analytic expression for π(x) and λ_X(u_i) that avoids the exhaustive enumeration over parent configurations [12]. We made a slight modification to the update rules i...
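The π/λ message combination summarized in this excerpt can be illustrated on a tiny chain. The sketch below uses a hypothetical network A -> B -> C with invented CPTs and evidence C = 1: the π message into B comes from its parent, the λ message from its child, and their normalized product is the belief, which matches brute-force enumeration here because a chain has no loops.

```python
# Hypothetical chain A -> B -> C, all nodes binary, evidence C = 1.
P_A = [0.6, 0.4]                            # P(A)
P_B_GIVEN_A = [[0.7, 0.3], [0.1, 0.9]]      # P(B | A), rows indexed by a
P_C_GIVEN_B = [[0.5, 0.5], [0.2, 0.8]]      # P(C | B), rows indexed by b

# pi message into B from parent A (A unobserved): pi(b) = sum_a P(b|a) * pi_A(a)
pi_b = [sum(P_A[a] * P_B_GIVEN_A[a][b] for a in range(2)) for b in range(2)]
# lambda message into B from child C with C = 1 observed: lambda(b) = P(C=1|b)
lam_b = [P_C_GIVEN_B[b][1] for b in range(2)]
# BEL(b) = alpha * lambda(b) * pi(b)
unnorm = [pi_b[b] * lam_b[b] for b in range(2)]
bel_b = [u / sum(unnorm) for u in unnorm]

# Brute-force check: P(B = b | C = 1) by enumerating over A.
joint = [sum(P_A[a] * P_B_GIVEN_A[a][b] for a in range(2)) * P_C_GIVEN_B[b][1]
         for b in range(2)]
exact = [j / sum(joint) for j in joint]
print(bel_b, exact)                          # the two should agree on a chain
```

On a network with loops, the same local updates are simply iterated, and the resulting beliefs are no longer guaranteed to match the exact marginals.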

1 | Correctness of local probability propagation in graphical models with loops. Neural Computation, to appear,
- Weiss
- 1999
Citation Context: ...conjectured that the performance of loopy belief propagation on the Turbo code structure was a special case of a more general phenomenon: We believe there are general undiscovered theorems about the performance of belief propagation on loopy DAGs. These theorems, which may have nothing directly to do with coding or decoding, will show that in some sense belief propagation "converges with high probability to a near-optimum value" of the desired belief on a class of loopy DAGs [10]. Progress in the analysis of loopy belief propagation has been made for the case of networks with a single loop [18, 19, 2, 1]. For the sum-product (or "belief update") version it can be shown that: • Unless all the conditional probabilities are deterministic, belief propagation will converge. • There is an analytic expression relating the correct marginals to the loopy marginals. The approximation error is related to the convergence rate of the messages: the faster the convergence, the more exact the approximation. • If the hidden nodes are binary, then thresholding the loopy beliefs is guaranteed to give the most probable assignment, even though the numerical value of the beliefs may be incorrect. This resul...