## Dynamic Bayesian Networks: Representation, Inference and Learning (2002)

### Download Links

- [www.cs.ubc.ca]
- [www.ai.mit.edu]
- [cdn.preterhuman.net]
- [ibug.doc.ic.ac.uk]

### Citations

610 (3 self)

### BibTeX

```bibtex
@MISC{Murphy02dynamicbayesian,
  author = {Kevin Patrick Murphy},
  title  = {Dynamic Bayesian Networks: Representation, Inference and Learning},
  year   = {2002}
}
```

### Abstract

Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and bio-sequence analysis, and KFMs have been used for problems ranging from tracking planes and missiles to predicting the economy. However, HMMs and KFMs are limited in their “expressive power”. Dynamic Bayesian Networks (DBNs) generalize HMMs by allowing the state space to be represented in factored form, instead of as a single discrete random variable. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian. In this thesis, I will discuss how to represent many different kinds of models as DBNs, how to perform exact and approximate inference in DBNs, and how to learn DBN models from sequential data. In particular, the main novel technical contributions of this thesis are as follows:

- a way of representing Hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T³), where T is the length of the sequence;
- an exact smoothing algorithm that takes O(log T) space instead of O(T);
- a simple way of using the junction tree algorithm for online inference in DBNs;
- new complexity bounds on exact online inference in DBNs;
- a new deterministic approximate inference algorithm called factored frontier;
- an analysis of the relationship between the BK algorithm and loopy belief propagation;
- a way of applying Rao-Blackwellised particle filtering to DBNs in general, and the SLAM (simultaneous localization and mapping) problem in particular;
- a way of extending the structural EM algorithm to DBNs;
- and a variety of different applications of DBNs.

However, perhaps the main value of the thesis is its catholic presentation of the field of sequential data modelling.
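As a concrete illustration of the HMM filtering that DBN inference generalizes, here is a minimal forward-algorithm sketch. It is not code from the thesis; the two-state model and all numbers are hypothetical, chosen only to show the O(T·K²) recursion over a length-T sequence with K discrete states.

```python
import numpy as np

def hmm_forward(pi, A, B, obs):
    """Forward-algorithm filtering for a discrete HMM.

    pi  : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = P(z_t = j | z_{t-1} = i)
    B   : (K, M) emission matrix,   B[i, m] = P(x_t = m | z_t = i)
    obs : sequence of observation indices

    Returns filtered posteriors, row t = P(z_t | x_{1:t}).
    Cost is O(T K^2): one K x K matrix-vector product per step.
    """
    alpha = pi * B[:, obs[0]]          # incorporate first observation
    alpha /= alpha.sum()               # normalize to a distribution
    out = [alpha]
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]  # predict, then weight by evidence
        alpha /= alpha.sum()
        out.append(alpha)
    return np.array(out)

# Toy 2-state model (hypothetical numbers, for illustration only).
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
posteriors = hmm_forward(pi, A, B, [0, 0, 1])
```

A DBN replaces the single discrete state z_t with several factored state variables, so the same filtering problem no longer admits this simple K-vector recursion; that is where the junction-tree, factored-frontier, and Rao-Blackwellised particle-filtering methods listed above come in.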