## The Factor Graph Approach to Model-Based Signal Processing (2007)

Citations: 33 (12 self)

### BibTeX

@MISC{Loeliger07thefactor,
  author = {Hans-Andrea Loeliger and others},
  title = {The Factor Graph Approach to Model-Based Signal Processing},
  year = {2007}
}

### Citations

8089 | Maximum likelihood from incomplete data via the EM algorithm
- Dempster, Laird, et al.
- 1977
Citation Context ... ˜σ²_Y (˜σ² + ˜m²). (114) VIII. EXPECTATION MAXIMIZATION AS MESSAGE PASSING Another classical method to deal with the estimation of coefficients like C in Fig. 16 is expectation maximization (EM) [57]–[59]. It turns out that EM can be put into message passing form, where it essentially boils down to a message computation rule that differs from the sum-product and max-product rules [60]. For example... |
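The EM-as-message-passing idea can be illustrated on a toy coefficient-estimation problem. The model below (a single unknown gain c, standing in loosely for a coefficient like C above), the variable names, and the closed-form E/M steps are illustrative assumptions, not the paper's actual Fig. 16 setup:

```python
import numpy as np

# Toy model (assumed for illustration): y_k = c * x_k + z_k,
# with hidden x_k ~ N(0, 1) and noise z_k ~ N(0, s2); c is unknown.
rng = np.random.default_rng(0)
s2, c_true, n = 0.1, 2.0, 5000
x = rng.standard_normal(n)
y = c_true * x + np.sqrt(s2) * rng.standard_normal(n)

c = 0.5                                # initial guess
for _ in range(300):
    # E-step: Gaussian posterior of each hidden x_k given y_k and current c
    v = s2 / (c**2 + s2)               # posterior variance (same for all k)
    m = c * y / (c**2 + s2)            # posterior means
    # M-step: maximize E[log p(y, x | c)] over c
    c = np.sum(y * m) / np.sum(m**2 + v)

print(c)  # converges toward the ML estimate, close to c_true = 2.0
```

Each iteration only exchanges sufficient statistics (m, v) between the "node" and the coefficient, which is the flavor of the message computation rule alluded to in the snippet.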

7052 |
Probabilistic Reasoning in Intelligent Systems
- Pearl
- 1988
Citation Context ...on (EM) to sequential Monte Carlo methods (particle filters). Factor graphs are graphical models [4]–[7]. In many respects, the different notation systems for graphical models (Bayesian networks [7], [8], Markov random fields [7], [9], junction graphs [10], [11]...) are essentially equivalent, but there are some real differences when it comes to practical ... |

1167 | Factor graphs and the sum-product algorithm
- Kschischang, Frey, et al.
Citation Context ...-free graphs, or with cycle-free subgraphs of complex system models. We will encounter some factor graphs with cycles, though. The message passing approach to signal processing was suggested in [22], [2] and has been used, e.g., in [23]–[39] for tasks like equalization, multi-user detection, MIMO detection, channel estimation, etc. The literature on graphical models for general inference problems is ... |

624 | Probabilistic networks and expert systems - Cowell, Dawid, et al. - 1999 |

611 | Learning in Graphical Models
- Jordan, editor
- 1999
Citation Context ...riety of techniques ranging from classical Gaussian and gradient techniques over expectation maximization (EM) to sequential Monte Carlo methods (particle filters). Factor graphs are graphical models [4]–[7]. In many respects, the different notation systems for graphical models (Bayesian networks [7], [8], Markov random fields [7], [9], junction graphs [10], [11]...) are essentially equivalent, bu... |

413 | Constructing Free-Energy Approximations and Generalized Belief Propagation Algorithms - Yedidia, Freeman, et al. |

397 | Generalized belief propagation - Yedidia, Freeman, et al. - 2001 |

324 | Clustering by passing messages between data points
- Frey, Dueck
- 2007
Citation Context ... on graphical models for general inference problems is vast. For example, factor graphs have also been used for link monitoring in wireless networks [40], for genome analysis [41], and for clustering [42]. This paper begins with an introduction to factor graphs that complements [2] and [3]. We then turn to an in-depth discussion of Gaussian message passing for linear models, i.e., Kalman filtering and... |

308 | Expectation Propagation for Approximate Bayesian Inference - Minka - 2001 |

293 |
The generalized distributive law
- Aji, McEliece
- 2000
Citation Context ...filters). Factor graphs are graphical models [4]–[7]. In many respects, the different notation systems for graphical models (Bayesian networks [7], [8], Markov random fields [7], [9], junction graphs [10], [11]...) are essentially equivalent, but there are some real differences when it comes to practical ... |

284 | Iterative (Turbo) Soft Interference Cancellation and Decoding for Coded CDMA - Wang, Poor - 1999 |

265 |
A family of algorithms for approximate Bayesian inference
- Minka
- 2001
Citation Context ... (93) = 1 − ←m²_X. (94) It should be noted, however, that (92) and (94) need not be optimal even for graphs without cycles. An alternative way to compute a Gaussian message ←µ_Y is proposed in [18] and [49]. VI. BEYOND GAUSSIANS We have seen in the previous section that, for continuous variables, working out the sum-product or max-product message computation rules for particular nodes / factors... |

265 | Codes and Decoding on General Graphs
- Wiberg
- 1996
Citation Context ... cycle-free graphs, or with cycle-free subgraphs of complex system models. We will encounter some factor graphs with cycles, though. The message passing approach to signal processing was suggested in [22], [2] and has been used, e.g., in [23]–[39] for tasks like equalization, multi-user detection, MIMO detection, channel estimation, etc. The literature on graphical models for general inference problem... |

263 | A unifying review of linear Gaussian models, Neural Computation 11 - Roweis, Ghahramani - 1999 |

216 |
The EM algorithm for graphical association models with missing data, Computational Statistics and Data Analysis
- Lauritzen
- 1995
Citation Context ... (˜σ² + ˜m²). (114) VIII. EXPECTATION MAXIMIZATION AS MESSAGE PASSING Another classical method to deal with the estimation of coefficients like C in Fig. 16 is expectation maximization (EM) [57]–[59]. It turns out that EM can be put into message passing form, where it essentially boils down to a message computation rule that differs from the sum-product and max-product rules [60]. For example, con... |

215 |
Adaptive Filter Theory, 3rd ed.
- Haykin
- 1996
Citation Context ...arge matrices, such conversions are costly. The inversion of large matrices can often be avoided by using the message computation rules given in Table IV (which follow from the Matrix Inversion Lemma [47], cf. Theorem 7 in Appendix II). The point of these rules is that the dimension of Y may be much smaller than the dimension of X and Z; in particular, Y may be a scalar. Table III shows the propagatio... |
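The payoff described in this snippet can be checked numerically. The sketch below (with assumed dimensions and variable names, not the paper's Table IV notation) uses the Matrix Inversion Lemma in the case where Y is a scalar observation of an n-dimensional X, so only a scalar quantity is ever "inverted":

```python
import numpy as np

# Updating an n x n covariance V after a scalar observation y = a.x + z
# via the Matrix Inversion Lemma (Woodbury identity): no n x n inverse needed.
rng = np.random.default_rng(1)
n = 200
B = rng.standard_normal((n, n))
V = B @ B.T + np.eye(n)          # prior covariance of X (positive definite)
a = rng.standard_normal(n)       # observation row
s2 = 0.5                         # noise variance of z

# Direct route: invert the full n x n posterior precision matrix.
W_post = np.linalg.inv(V) + np.outer(a, a) / s2
V_direct = np.linalg.inv(W_post)

# Lemma route: only the scalar g = a^T V a + s2 is inverted.
g = a @ V @ a + s2
V_lemma = V - np.outer(V @ a, a @ V) / g

print(np.allclose(V_direct, V_lemma, atol=1e-8))  # True
```

The lemma route costs O(n²) per scalar observation instead of O(n³), which is why such rules matter for large state dimensions.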

143 | Markov Random Fields and Their Applications - Kindermann, Snell, et al. - 1980 |

122 |
Codes on Graphs: Normal Realizations
- Forney
- 2001
Citation Context ...an distributions and LMMSE estimation is summarized in Appendix I. Appendix II contains the proofs for Section V. Appendix III reviews some results by Forney on the Fourier transform on factor graphs [43] and adapts them to the setting of the present paper. The following notation will be used. The transpose of a matrix (or vector) A is denoted by A^T; A^H denotes the complex conjugate of A^T; A^# denot... |

121 | An introduction to factor graphs - Loeliger - 2004 |

96 | Codes and iterative decoding on general graphs - Wiberg - 1996 |

83 | Variational message passing - Winn, Bishop |

61 | Iterative multiuser joint decoding: unified framework and asymptotic analysis - Boutros, Caire - 2002 |

60 | Unified design of iterative receivers using factor graphs
- Worthen, Stark
- 2001
Citation Context ... subgraphs of complex system models. We will encounter some factor graphs with cycles, though. The message passing approach to signal processing was suggested in [22], [2] and has been used, e.g., in [23]–[39] for tasks like equalization, multi-user detection, MIMO detection, channel estimation, etc. The literature on graphical models for general inference problems is vast. For example, factor graphs ... |

57 |
Particle filtering
- Djuric, Kotecha, et al.
Citation Context ...II). Gaussians (cf. Section V and Appendix I). Gaussian mixtures. List of samples: A probability density can be represented by a list of samples. This message type allows to describe particle filters [52], [53] as message passing algorithms (see, e.g., [35], [36], [38], [39], [54]–[56]). Compound messages consisting of the “product” of other message types. All these message types, and many different me... |
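The sample-list message type is easy to demonstrate: a density is carried as samples, and pushing it through a node just means applying the node's function to each sample. The model and names below are illustrative assumptions, not an example from the paper:

```python
import numpy as np

# A message represented as a list of samples: push the density of X
# through a nonlinear node Y = X**2 + Z and read off moments of Y.
rng = np.random.default_rng(2)
n = 100_000
x_samples = rng.normal(1.0, 0.5, n)    # incoming message on X: N(1, 0.25)
z_samples = rng.normal(0.0, 0.2, n)    # noise Z: N(0, 0.04)
y_samples = x_samples**2 + z_samples   # outgoing message on Y, as samples

# Analytically E[Y] = E[X^2] = 1^2 + 0.25 = 1.25 (Z has zero mean),
# even though no closed-form density for Y is ever computed.
print(y_samples.mean())                # ≈ 1.25
```

This is the basic mechanism that lets particle filters be phrased as message passing: nonlinear nodes that have no tractable closed-form message still act trivially on a list of samples.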

51 | On the Optimality of Tree-reweighted Max-product Message-passing - Kolmogorov, Wainwright - 2007 |

44 |
Loop series for discrete statistical models on graphs
- Chertkov, Chernyak
- 2006
Citation Context ...r, in Sections IV, V, and Appendix III) is not easily expressed in other notation systems. While much interesting recent work on graphical models specifically addresses graphs with cycles (e.g., [12]–[21]), the present paper is mostly concerned with cycle-free graphs, or with cycle-free subgraphs of complex system models. We will encounter some factor graphs with cycles, though. The message passing ap... |

42 | An analysis of belief propagation on the turbo decoding graph with Gaussian densities
- Rusmevichientong, Roy
- 2001
Citation Context ...n message passing in an appropriate linear model. If the sum-product algorithm converges in a Gaussian factor graph with cycles, then the means of the marginals are correct (despite the cycles) [12], [13]. It is thus obvious that Gaussian message passing in linear models encompasses much of classical signal processing. We now turn to the actual message computations. In this section, all messages will ... |
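The elementary Gaussian sum-product step behind such algorithms is just the product of two Gaussian messages on an edge. A minimal scalar sketch (function name and parameterization are my own, not the paper's tables):

```python
# Scalar Gaussian message combination: the marginal on an edge is the
# (renormalized) product of the forward and backward Gaussian messages.
def combine(m_fwd, v_fwd, m_bwd, v_bwd):
    """Product of N(m_fwd, v_fwd) and N(m_bwd, v_bwd), as mean/variance."""
    w = 1.0 / v_fwd + 1.0 / v_bwd            # precisions add
    m = (m_fwd / v_fwd + m_bwd / v_bwd) / w  # precision-weighted means add
    return m, 1.0 / w

m, v = combine(0.0, 1.0, 2.0, 1.0)
print(m, v)  # 1.0 0.5: equally confident messages meet in the middle,
             # and the combined variance is halved
```

The multivariate version of this rule, with means replaced by mean vectors and precisions by precision matrices, is the building block of the Kalman-type algorithms discussed above.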

41 |
The generalized distributive law and free energy minimization
- Aji, McEliece
- 2003
Citation Context ...s). Factor graphs are graphical models [4]–[7]. In many respects, the different notation systems for graphical models (Bayesian networks [7], [8], Markov random fields [7], [9], junction graphs [10], [11]...) are essentially equivalent, but there are some real differences when it comes to practical ... |

27 | Factor graphs and algorithms
- Frey, Kschischang, et al.
- 1997
Citation Context ...s. In particular, a large number of algorithms in these fields have been shown to be special cases of the basic sum-product and max-product algorithms that operate by message passing in a factor graph [1]–[3]. In this paper, we elaborate on this topic with an emphasis on signal processing. We hope to convey that factor graphs continue to grow more useful for the design of practical algorithms for mode... |

27 | A factor graph approach to link loss monitoring in wireless sensor networks
- Mao, Kschischang, et al.
- 2005
Citation Context ... detection, channel estimation, etc. The literature on graphical models for general inference problems is vast. For example, factor graphs have also been used for link monitoring in wireless networks [40], for genome analysis [41], and for clustering [42]. This paper begins with an introduction to factor graphs that complements [2] and [3]. We then turn to an in-depth discussion of Gaussian message pa... |

26 | Algorithms for iterative decoding in the presence of strong phase noise
- Colavolpe, Barbieri, et al.
- 2005
Citation Context ... number of parameters works nicely, the prime example being linear Gaussian models as in Section V. However, beyond the Gaussian case, this does not seem to happen often. (An interesting exception is [27], which uses a family of Tikhonov distributions.) In general, therefore, one has to resort to simplified messages for continuous variables. The following message types are widely applicable. ... |

21 | LP decoding - Feldman, Karger, et al. - 2003 |

20 |
Phase estimation by message passing
- Dauwels, Loeliger
Citation Context ...xtures. List of samples: A probability density can be represented by a list of samples. This message type allows to describe particle filters [52], [53] as message passing algorithms (see, e.g., [35], [36], [38], [39], [54]–[56]). Compound messages consisting of the “product” of other message types. All these message types, and many different message computation rules, can coexist in large system model... |

20 |
Sequential auxiliary particle belief propagation
- Briers, Doucet, et al.
- 2005
Citation Context ...mples: A probability density can be represented by a list of samples. This message type allows to describe particle filters [52], [53] as message passing algorithms (see, e.g., [35], [36], [38], [39], [54]–[56]). Compound messages consisting of the “product” of other message types. All these message types, and many different message computation rules, can coexist in large system models. ... |

18 |
Genome-wide analysis of mouse transcripts using exon microarrays and factor graphs
- Frey, Mohammad, et al.
- 2005
Citation Context ...tion, etc. The literature on graphical models for general inference problems is vast. For example, factor graphs have also been used for link monitoring in wireless networks [40], for genome analysis [41], and for clustering [42]. This paper begins with an introduction to factor graphs that complements [2] and [3]. We then turn to an in-depth discussion of Gaussian message passing for linear models, i... |

18 | Cyclic minimizers, majorization techniques, and the expectation-maximization algorithm: a refresher - Stoica, Selén - 2004 |

17 |
Markov Random Fields and Their Applications
- Kindermann, Snell
- 1980
Citation Context ...lo methods (particle filters). Factor graphs are graphical models [4]–[7]. In many respects, the different notation systems for graphical models (Bayesian networks [7], [8], Markov random fields [7], [9], junction graphs [10], [11]...) are essentially equivalent, but there are some real differences when it comes to practical ... |

16 | On the application of factor graphs and the sum-product algorithm to ISI channels - Colavolpe, Germi - 2005 |

15 |
Least squares and Kalman filtering on Forney graphs
- Loeliger
Citation Context ...g and some of its ramifications. In particular, we will present tabulated message computation rules for multivariate Gaussian messages that are extensions and refinements of similar tables in [3] and [33]. With these tables, it is possible to write down (essentially without any computation) efficient Gaussian / LMMSE (linear minimum-mean-squared-error) estimation algorithms for a wide variety of applic... |
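The kind of algorithm such tables let one write down directly can be sketched in one dimension: a forward Gaussian message recursion through a chain of state-transition and observation nodes, i.e., a 1-D Kalman filter. The model, parameter values, and function name below are illustrative assumptions, not the paper's (or [33]'s) notation:

```python
# Forward Gaussian message passing on a scalar state-space chain
# (assumed toy model): X_k = a*X_{k-1} + U_k,  Y_k = X_k + W_k,
# with U_k ~ N(0, q) and W_k ~ N(0, r).
def kalman_1d(ys, a=0.9, q=0.1, r=0.5, m0=0.0, v0=1.0):
    m, v = m0, v0                       # forward message: N(m, v)
    out = []
    for y in ys:
        # predict: push the message through the state-transition node
        m, v = a * m, a * a * v + q
        # update: multiply by the Gaussian observation message
        k = v / (v + r)                 # Kalman gain
        m, v = m + k * (y - m), (1 - k) * v
        out.append((m, v))
    return out

est = kalman_1d([1.0, 0.8, 1.2])
print(est[-1])                          # filtered mean and shrinking variance
```

Each loop iteration is one row of a message-computation table: a closed-form Gaussian-in, Gaussian-out rule, composed without any explicit optimization.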

14 | Some remarks on factor graphs
- Loeliger
Citation Context ... replaced by a single point ˆx, which may be viewed as a temporary or final decision on the value of the variable X. Function value and derivative / gradient at a point selected by the receiving node [50], [51] (to be described in Section VII). Gaussians (cf. Section V and Appendix I). Gaussian mixtures. List of samples: A probability density can be represented by a list of samples. This message type ... |

11 |
On the optimality of the max-product belief propagation algorithm in arbitrary graphs
- Weiss, Freeman
- 2001
Citation Context ...icular, in Sections IV, V, and Appendix III) is not easily expressed in other notation systems. While much interesting recent work on graphical models specifically addresses graphs with cycles (e.g., [12]–[21]), the present paper is mostly concerned with cycle-free graphs, or with cycle-free subgraphs of complex system models. We will encounter some factor graphs with cycles, though. The message passi... |

11 | Expectation maximization as message passing
- Dauwels, Korl, et al.
Citation Context ...ization (EM) [57]–[59]. It turns out that EM can be put into message passing form, where it essentially boils down to a message computation rule that differs from the sum-product and max-product rules [60]. For example, consider the factor graph of Fig. 19 with the single node / factor g(x, y, θ) (which should be considered as a part of some larger factor graph). Along the edge Θ, we receive only the s... |

9 | Kalman Filters, Factor Graphs, and Electrical Networks. Internal report INT/200202
- Vontobel
- 2002
Citation Context ...this paper has grown over many years into its present shape. The first author wishes to particularly acknowledge the enjoyable collaborations with Niclas Wiberg [22], [44] and with Pascal O. Vontobel [62], [63]. Both the first and the last author are immensely grateful to G. David Forney, Jr., for his continued encouragement and feedback. Sascha Korl and Andi Loeliger are indebted to Allen Lindgren fo... |

8 | A factor graph approach to iterative channel estimation and LDPC decoding over fading channels - Niu, Shen, et al. - 2005 |

8 | Signal processing with factor graphs: examples - Loeliger, Dauwels, et al. |

8 | Hot coupling: a particle approach to inference and normalization on pairwise undirected graphs - Hamze, Freitas - 2006 |

7 | The factor graph EM algorithm: applications for LDPC codes - Eckford |

7 |
A Factor Graph Approach to Signal Modelling, System Identification and Filtering
- Korl
- 2005
Citation Context .... List of samples: A probability density can be represented by a list of samples. This message type allows to describe particle filters [52], [53] as message passing algorithms (see, e.g., [35], [36], [38], [39], [54]–[56]). Compound messages consisting of the “product” of other message types. All these message types, and many different message computation rules, can coexist in large system models. The... |

7 | On factor graphs and the Fourier transform - Mao, Kschischang - 2005 |

7 | On factor graphs and electrical networks
- Vontobel, Loeliger
- 2003
Citation Context ...aper has grown over many years into its present shape. The first author wishes to particularly acknowledge the enjoyable collaborations with Niclas Wiberg [22], [44] and with Pascal O. Vontobel [62], [63]. Both the first and the last author are immensely grateful to G. David Forney, Jr., for his continued encouragement and feedback. Sascha Korl and Andi Loeliger are indebted to Allen Lindgren for many... |