Results 1–10 of 17
Factor Graphs and the Sum-Product Algorithm
IEEE Transactions on Information Theory, 1998
Cited by 1163 (67 self)

Abstract:
A factor graph is a bipartite graph that expresses how a "global" function of many variables factors into a product of "local" functions. Factor graphs subsume many other graphical models, including Bayesian networks, Markov random fields, and Tanner graphs. Following one simple computational rule, the sum-product algorithm operates in factor graphs to compute, either exactly or approximately, various marginal functions by distributed message-passing in the graph. A wide variety of algorithms developed in artificial intelligence, signal processing, and digital communications can be derived as specific instances of the sum-product algorithm, including the forward/backward algorithm, the Viterbi algorithm, the iterative "turbo" decoding algorithm, Pearl's belief propagation algorithm for Bayesian networks, the Kalman filter, and certain fast Fourier transform algorithms.
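The message-passing computation this abstract describes can be sketched in a few lines. The following is a minimal illustration, not code from the paper; the factor tables and variable names are invented for the example:

```python
# Hypothetical sketch: sum-product message passing on a 3-variable chain
# factor graph  x1 -- f12 -- x2 -- f23 -- x3, with binary variables.
# Factor values below are made up purely for illustration.

f12 = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}
f23 = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.6}
VALS = (0, 1)

# Messages toward x2 from each neighbouring factor
# (leaf variables send the constant message 1).
msg_f12_to_x2 = {x2: sum(f12[(x1, x2)] for x1 in VALS) for x2 in VALS}
msg_f23_to_x2 = {x2: sum(f23[(x2, x3)] for x3 in VALS) for x2 in VALS}

# The marginal of x2 is the normalised product of incoming messages.
unnorm = {x2: msg_f12_to_x2[x2] * msg_f23_to_x2[x2] for x2 in VALS}
Z = sum(unnorm.values())
p_x2 = {x2: unnorm[x2] / Z for x2 in VALS}

# Brute-force check: sum the global product over all configurations.
brute = {x2: sum(f12[(x1, x2)] * f23[(x2, x3)]
                 for x1 in VALS for x3 in VALS) for x2 in VALS}
Zb = sum(brute.values())
assert all(abs(p_x2[x2] - brute[x2] / Zb) < 1e-12 for x2 in VALS)
print(p_x2)
```

On a tree-structured graph such as this chain, the message-passing result matches exhaustive marginalization exactly, which is the "exact" case the abstract refers to; on graphs with cycles the same rule yields the approximate, iterative variants.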
Probabilistic independence networks for hidden Markov probability models
1996
Cited by 167 (12 self)

Abstract:
Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas, including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
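The forward-backward algorithm mentioned above can be illustrated concretely for a tiny HMM. This is a minimal sketch under invented parameters, not an implementation from the paper:

```python
# Hypothetical sketch: the forward-backward algorithm for a 2-state HMM,
# computing smoothed state posteriors p(state_t | all observations).
# All model parameters and observations below are illustrative.

init = [0.6, 0.4]                   # p(s_0)
trans = [[0.7, 0.3], [0.2, 0.8]]    # p(s_t | s_{t-1})
emit = [[0.9, 0.1], [0.3, 0.7]]     # p(obs | state)
obs = [0, 1, 0]                     # observed symbol sequence

T, S = len(obs), 2

# Forward pass: alpha[t][s] = p(obs_0..t, s_t = s)
alpha = [[init[s] * emit[s][obs[0]] for s in range(S)]]
for t in range(1, T):
    alpha.append([emit[s][obs[t]] *
                  sum(alpha[t - 1][r] * trans[r][s] for r in range(S))
                  for s in range(S)])

# Backward pass: beta[t][s] = p(obs_{t+1}..obs_{T-1} | s_t = s)
beta = [[1.0] * S for _ in range(T)]
for t in range(T - 2, -1, -1):
    beta[t] = [sum(trans[s][r] * emit[r][obs[t + 1]] * beta[t + 1][r]
                   for r in range(S)) for s in range(S)]

# Smoothed posteriors: gamma[t][s] proportional to alpha[t][s] * beta[t][s]
gamma = []
for t in range(T):
    unnorm = [alpha[t][s] * beta[t][s] for s in range(S)]
    z = sum(unnorm)
    gamma.append([u / z for u in unnorm])

print(gamma)
```

In the PIN view described by the paper, the alpha and beta recursions are exactly the two sweeps of a general tree-inference algorithm specialized to a chain-structured graph.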
Iterative Decoding of Compound Codes by Probability Propagation in Graphical Models
IEEE Journal on Selected Areas in Communications, 1998
Cited by 108 (12 self)

Abstract:
We present a unified graphical model framework for describing compound codes and deriving iterative decoding algorithms. After reviewing a variety of graphical models (Markov random fields, Tanner graphs, and Bayesian networks), we derive a general distributed marginalization algorithm for functions described by factor graphs. From this general algorithm, Pearl’s belief propagation algorithm is easily derived as a special case. We point out that recently developed iterative decoding algorithms for various codes, including “turbo decoding” of parallel-concatenated convolutional codes, may be viewed as probability propagation in a graphical model of the code. We focus on Bayesian network descriptions of codes, which give a natural input/state/output/channel description of a code and channel, and we indicate how iterative decoders can be developed for parallel- and serially concatenated coding systems, product codes, and low-density parity-check codes.

Index Terms: Concatenated coding, decoding, graph theory, iterative methods, product codes.
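The elementary building block of the probability-propagation decoders this abstract discusses is the message from a single parity-check node to one of its bits. The sketch below shows that step for one check over three bits; the channel likelihoods are invented for illustration and this is not the paper's decoder:

```python
# Hypothetical sketch: one check-node step of probability propagation,
# for a single parity constraint x1 ^ x2 ^ x3 == 0 with per-bit channel
# likelihoods. Likelihood values are illustrative only.

lik = [  # lik[i][b] = p(received symbol i | x_i = b)
    [0.2, 0.8],
    [0.9, 0.1],
    [0.7, 0.3],
]

# Check-to-variable message for bit 0: sum over the other bits'
# likelihoods, restricted to configurations satisfying the check.
msg = [0.0, 0.0]
for x1 in (0, 1):
    for x2 in (0, 1):
        for x3 in (0, 1):
            if x1 ^ x2 ^ x3 == 0:
                msg[x1] += lik[1][x2] * lik[2][x3]

# The bit posterior combines its own likelihood with the check message.
unnorm = [lik[0][b] * msg[b] for b in (0, 1)]
z = sum(unnorm)
posterior = [u / z for u in unnorm]
print(posterior)
```

An iterative decoder repeats this exchange between check nodes and variable nodes of the code's graph until the bit posteriors stabilize or a bit decision satisfies all checks.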
Factor graphs and algorithms
In Proc. 35th Allerton Conf. Communications, Control, and Computing
Cited by 28 (7 self)

Abstract:
A factor graph is a bipartite graph that expresses how a global function of several variables factors into a product of local functions. Factor graphs subsume many other graphical models, including Bayesian networks, Markov random fields, and Tanner graphs. We describe a general algorithm for computing "marginals" of the global function by distributed message-passing in the corresponding factor graph. A wide variety of algorithms developed in the artificial intelligence, statistics, signal processing, and digital communications communities can be derived as specific instances of this general algorithm, including Pearl's "belief propagation" and "belief revision" algorithms, the fast Fourier transform, the Viterbi algorithm, the forward/backward algorithm, and the iterative "turbo" decoding algorithm.
Lectures on Contingency Tables
2002
Cited by 18 (0 self)

Abstract:
The present set of lecture notes was prepared for the course “Statistik 2” at the University of Copenhagen. It is a revised version of notes prepared in connection with a series of lectures at the Swedish summer school in Särö, June 11–17, 1979. The notes by no means give a complete account of the theory of contingency tables. They are based on the idea that the graph-theoretic methods in Darroch, Lauritzen and Speed (1978) can be used directly to develop this theory and, hopefully, with some pedagogical advantages. My thanks are due to the audience at the Swedish summer school for patiently listening to the first version of these lectures, to Joseph Verducci, Stanford, who read the manuscript and suggested many improvements and corrections, and to Ursula Hansen, who typed the manuscript.
Efficient Markov network structure discovery using independence tests
In Proc. SIAM Data Mining, 2006
Cited by 17 (2 self)

Abstract:
We present two algorithms for learning the structure of a Markov network from discrete data: GSMN and GSIMN. Both algorithms use statistical conditional independence tests on data to infer the structure by successively constraining the set of structures consistent with the results of these tests. GSMN is a natural adaptation of the Grow-Shrink algorithm of Margaritis and Thrun for learning the structure of Bayesian networks. GSIMN extends GSMN by additionally exploiting Pearl’s well-known properties of conditional independence relations to infer novel independencies from known independencies, thus avoiding the need to perform these tests. Experiments on artificial and real data sets show GSIMN can yield savings of up to 70% with respect to GSMN, while generating a Markov network of comparable or, in several cases, considerably improved quality. In addition ...
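The kind of statistical independence test that GSMN and GSIMN rely on can be sketched with a plain chi-square test on a contingency table. This is a minimal illustration with invented counts, not the paper's procedure (which also conditions on further variables):

```python
# Hypothetical sketch: a chi-square test of independence between two
# binary variables, the elementary decision that structure-learning
# algorithms like GSMN repeat many times. Counts are invented.

table = [[30, 10],
         [15, 45]]  # counts of (X=i, Y=j) observed in the data

row = [sum(r) for r in table]
col = [sum(table[i][j] for i in range(2)) for j in range(2)]
n = sum(row)

# Chi-square statistic against the hypothesis that X and Y are independent.
chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row[i] * col[j] / n
        chi2 += (table[i][j] - expected) ** 2 / expected

# Critical value for 1 degree of freedom at significance level 0.05.
dependent = chi2 > 3.841
print(chi2, dependent)
```

When the test rejects independence, the corresponding edge is kept in the candidate Markov network; GSIMN's contribution is to deduce some of these answers from earlier ones instead of running the test.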
Towards Perceptual Intelligence: Statistical Modeling of Human Individual and Interactive Behaviors
Prediction of Human Behavior, IEEE Intelligent Vehicles, 1995
Cited by 12 (5 self)

Abstract:
This thesis presents a computational framework for the automatic recognition and prediction of different kinds of human behaviors from video cameras and other sensors, via perceptually intelligent systems that automatically sense and correctly classify human behaviors by means of Machine Perception and Machine Learning techniques. In the thesis I develop the statistical machine learning algorithms (dynamic graphical models) necessary for detecting and recognizing individual and interactive behaviors. In the case of interactions, two Hidden Markov Models (HMMs) are coupled in a novel architecture called Coupled Hidden Markov Models (CHMMs) that explicitly captures the interactions between them. The algorithms for learning the parameters from data, as well as for doing inference with those models, are developed and described. Four systems that experimentally evaluate the proposed paradigm are presented: (1) LAFTER, an automatic face detection and tracking system with facial expression recognition; (2) a Tai Chi gesture recognition system; (3) a pedestrian surveillance system that recognizes typical human-to-human interactions; and (4) a SmartCar for driver maneuver recognition. These systems capture human behaviors of different nature and increasing complexity: first, isolated, single-user facial expressions; then, two-hand gestures and human-to-human interactions, ...
Graphical Models
In Proceedings of the International School for the Synthesis of Expert Knowledge (ISSEK’98), 2002
Cited by 11 (0 self)

Abstract:
Graphical modeling is an important method to efficiently represent and analyze uncertain information in knowledge-based systems. Its most prominent representatives are Bayesian networks and Markov networks for probabilistic reasoning, which have been well known for over ten years now. However, they suffer from certain deficiencies if imprecise information has to be taken into account. Therefore possibilistic graphical modeling has recently emerged as a promising new area of research. Possibilistic networks are a noteworthy alternative to probabilistic networks whenever it is necessary to model both uncertainty and imprecision. Imprecision, understood as set-valued data, often has to be considered in situations in which information is obtained from human observers or imprecise measuring instruments. In this paper we provide an overview of the state of the art of possibilistic networks with respect to propagation and learning algorithms.
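The operational difference between possibilistic and probabilistic networks can be shown in a few lines: possibility degrees combine with min and marginalize with max, where probabilities use product and sum. The degrees and variable names below are invented for illustration:

```python
# Hypothetical sketch of possibilistic marginalisation: joint possibility
# degrees are built by min-combination of local distributions, and a
# variable is eliminated by taking max (not sum). Degrees are invented.

prior = {'a': 1.0, 'b': 0.7}                 # possibility of X
cond = {('a', 'x'): 1.0, ('a', 'y'): 0.4,    # possibility of Y given X
        ('b', 'x'): 0.6, ('b', 'y'): 1.0}

# Joint distribution: min-combination of the two local distributions.
joint = {(u, v): min(prior[u], p) for (u, v), p in cond.items()}

# Marginal of Y: max over the eliminated variable X.
marg_y = {}
for (u, v), p in joint.items():
    marg_y[v] = max(marg_y.get(v, 0.0), p)

print(marg_y)
```

Because min/max replace product/sum, set-valued (imprecise) observations fit naturally: an observation that only restricts a variable to a set simply caps the possibility of the excluded values at zero.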
Multiscale Graphical Modeling in Space: Applications to Command and Control
2000
Cited by 9 (1 self)

Abstract:
Recently, a class of multiscale tree-structured models was introduced in terms of scale-recursive dynamics defined on trees. The main advantage of these models is their association with a fast, recursive, Kalman-filter prediction algorithm. In this article, we propose a more general class of multiscale graphical models over acyclic directed graphs, for use in command and control problems. Moreover, we derive the generalized Kalman-filter algorithm for graphical Markov models, which can be used to obtain the optimal predictors and prediction variances for multiscale graphical models.

1 Introduction

Almost every aspect of command and control (C2) involves dealing with information in the presence of uncertainty. Since information in a battlefield is never precise, its status is rarely known exactly. In the face of this uncertainty, commanders must make decisions, issue orders, and monitor the consequences. The uncertainty may come from noisy data or, indeed, regions of the battle space whe...
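The scalar Kalman-filter recursion that the multiscale tree algorithm generalizes can be sketched along a single root-to-leaf path of such a tree. All parameters and measurements below are invented; this is an illustration of the underlying recursion, not the article's generalized algorithm:

```python
# Hypothetical sketch: scalar Kalman-filter updates down one path of a
# scale-recursive tree model,  x_child = a * x_parent + w,  y = x + v.
# Parameters and measurements are illustrative only.

a, q, r = 0.9, 0.5, 1.0         # dynamics gain, process var, measurement var
mean, var = 0.0, 2.0            # prior on the root node
observations = [1.2, 0.7, 1.5]  # one measurement per node on the path

for y in observations:
    # Measurement update at the current node.
    k = var / (var + r)             # Kalman gain
    mean = mean + k * (y - mean)
    var = (1 - k) * var
    # Prediction for the child node one scale down the tree.
    mean, var = a * mean, a * a * var + q

print(mean, var)
```

On a full tree the same two steps run over every parent-child edge, with an extra merge step where a node has several children, which is what makes the overall predictor fast and recursive.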
Data Mining with Graphical Models
Proc. Computer Science for Environmental Protection (12th Int. Symp. “Informatik für den Umweltschutz”), Bremen, 2000
Cited by 7 (6 self)

Abstract:
The explosion of data stored in commercial or administrative databases calls for intelligent techniques to discover the patterns hidden in them and thus to exploit all available information. Therefore a new line of research has recently been established, which became known under the names “Data Mining” and “Knowledge Discovery in Databases”. In this paper we study a popular technique from its arsenal of methods for dependency analysis, namely learning inference networks (also called “graphical models”) from data. We review the already well-known probabilistic networks and provide an introduction to the recently developed and closely related possibilistic networks.