## Tractable inference for complex stochastic processes (1998)

Venue: Proc. UAI

Citations: 263 (13 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Boyen98tractableinference,
  author    = {Xavier Boyen and Daphne Koller},
  title     = {Tractable inference for complex stochastic processes},
  booktitle = {Proc. UAI},
  year      = {1998},
  pages     = {33--42}
}
```

### Abstract

The monitoring and control of any dynamic system depends crucially on the ability to reason about its current status and its future trajectory. In the case of a stochastic system, these tasks typically involve the use of a belief state—a probability distribution over the state of the process at a given point in time. Unfortunately, the state spaces of complex processes are very large, making an explicit representation of a belief state intractable. Even in dynamic Bayesian networks (DBNs), where the process itself can be represented compactly, the representation of the belief state is intractable. We investigate the idea of maintaining a compact approximation to the true belief state, and analyze the conditions under which the errors due to the approximations taken over the lifetime of the process do not accumulate to make our answers completely irrelevant. We show that the error in a belief state contracts exponentially as the process evolves. Thus, even with multiple approximations, the error in our process remains bounded indefinitely. We show how the additional structure of a DBN can be used to design our approximation scheme, improving its performance significantly. We demonstrate the applicability of our ideas in the context of a monitoring task, showing that orders of magnitude faster inference can be achieved with only a small degradation in accuracy.
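The core idea of the abstract — propagating the belief state exactly for one step, then projecting it back onto a compact factored form — can be sketched on a toy model with two binary subprocesses. Everything here (the random joint transition matrix, the indexing convention) is invented for illustration; it is a minimal sketch of the projection scheme, not the paper's actual algorithm.

```python
import numpy as np

# Toy joint process over two binary subprocesses.
# Joint states are indexed as (x1, x2) -> 2*x1 + x2.
rng = np.random.default_rng(0)
T = rng.random((4, 4))
T /= T.sum(axis=1, keepdims=True)  # row-stochastic joint transition matrix

def propagate(belief, T):
    """Exact one-step propagation of a belief state (row vector)."""
    return belief @ T

def project_to_marginals(belief):
    """Projection step: replace the joint belief by the product
    of its per-subprocess marginals (the factored approximation)."""
    b = belief.reshape(2, 2)
    m1 = b.sum(axis=1)  # marginal over subprocess 1
    m2 = b.sum(axis=0)  # marginal over subprocess 2
    return np.outer(m1, m2).reshape(4)

belief = np.array([1.0, 0.0, 0.0, 0.0])  # start in a known joint state
for _ in range(5):
    # Exact propagation, then collapse back to the compact representation.
    belief = project_to_marginals(propagate(belief, T))
```

The projection introduces an error at every step; the paper's contribution is showing that the transition model's stochasticity contracts this error, so it stays bounded over time.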

### Citations

8563 | Elements of Information Theory
- Cover, Thomas
- 1991

Citation Context: ...r from growing unboundedly. More specifically, we have proved the important (and, to our knowledge, new) result that stochastic processes (under certain assumptions) are a contraction for KL-divergence [2]: propagation of two distributions through a stochastic transition model results in a constant factor reduction of the KL-divergence between them. We believe that this result will have significant app...
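The contraction result quoted in this context can be checked numerically: pushing two distributions through the same stochastic transition matrix never increases their KL-divergence (the data-processing inequality), and under a mixing (strictly positive) transition model the decrease is strict. The matrix and distributions below are arbitrary, chosen only to illustrate the effect:

```python
import numpy as np

def kl(p, q):
    """KL-divergence D(p || q) for strictly positive distributions."""
    return float(np.sum(p * np.log(p / q)))

# A mixing (all entries positive), row-stochastic transition matrix.
T = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.6, 0.3],
              [0.2, 0.3, 0.5]])

p = np.array([0.8, 0.1, 0.1])
q = np.array([0.1, 0.1, 0.8])

before = kl(p, q)        # divergence between the two belief states
after = kl(p @ T, q @ T)  # divergence after one step of propagation
# after < before: one transition step contracts the divergence
```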

2112 | A New Approach to Linear Filtering and Prediction Problems
- Kalman
- 1960

Citation Context: ... processes is fundamental to many applications [6, 8, 3, 14]. A number of formal models have been developed for describing situations of this type, including Hidden Markov Models [15], Kalman Filters [9], and Dynamic Bayesian Networks [4]. These very different models all share the same underlying Markov assumption, the fact that the future is conditionally independent of the past given the current st...

1284 | Local Computations with Probabilities on Graphical Structures and Their Application to Expert Systems (with Discussion)
- Lauritzen, Spiegelhalter
- 1988

Citation Context: ...^(t+1)[X_l], and the entire distribution as a product of these factors. In the case of DBNs, we can actually accomplish this update procedure quite efficiently. We first generate a clique tree [13] in which, for every l, some clique contains X_l^(t) and some clique contains X_l^(t+1). A standard clique tree propagation algorithm can then be used to compute the posterior distribution over e...

834 | A tutorial on hidden Markov models
- Rabiner, Juang
- 1989

Citation Context: ...ason about stochastic processes is fundamental to many applications [6, 8, 3, 14]. A number of formal models have been developed for describing situations of this type, including Hidden Markov Models [15], Kalman Filters [9], and Dynamic Bayesian Networks [4]. These very different models all share the same underlying Markov assumption, the fact that the future is conditionally independent of the past ...

489 | Factorial Hidden Markov Models
- Ghahramani, Jordan
- 1998

Citation Context: ... these ideas can be viewed as falling into the framework described in this paper. However, neither contains any analysis nor an explicit connection to the structure of the process. The recent work of [7] and [16] utilize mean field approximation in the context of various types of structured HMMs. Of these approaches, [7] is the closest to our work (to the part of it dealing with structured processes)...

457 | A model for reasoning about persistence and causation
- Dean, Kanazawa
- 1989

Citation Context: ...pplications [6, 8, 3, 14]. A number of formal models have been developed for describing situations of this type, including Hidden Markov Models [15], Kalman Filters [9], and Dynamic Bayesian Networks [4]. These very different models all share the same underlying Markov assumption, the fact that the future is conditionally independent of the past given the current state. Since the domain is stochastic...

167 | Probabilistic independence networks for hidden markov probability models
- Smyth, Heckerman, et al.
- 1997

Citation Context: ...posability properties on the approximate belief state, whereas we have no control over the true belief state. Formally, it is most convenient to describe our results in the framework of factored HMMs [17]; in the next section, we discuss how they can be applied to dynamic Bayesian networks. We assume that our system is composed of several subprocesses T_l. Each subprocess has a state with a Markovian...

149 | Inference in belief networks: A procedural guide
- Huang, Darwiche
- 1996

148 | Stochastic simulation algorithms for dynamic probabilistic networks
- Kanazawa, Koller, et al.
- 1995

Citation Context: ...complex temporal models. 4 The early work of [14] considers a simple approach of using domain knowledge to simply eliminate some of the variables from each time slice. The random sampling approach of [10] can also be viewed as maintaining an approximate belief state, albeit one represented very naively as a set of weighted samples. The recent work of [12] extends this idea, using the random samples at...
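The weighted-sample belief state mentioned in this context can be sketched as a single propagate-and-reweight step: sample each particle's next state from the transition model, then reweight by the observation likelihood. The two-state model and its matrices are invented for illustration; this is a generic sketch, not the cited algorithm's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])  # T[x, x'] = P(next state x' | state x)
O = np.array([[0.8, 0.2],
              [0.3, 0.7]])  # O[x, e] = P(evidence e | state x)

n = 1000
particles = rng.integers(0, 2, size=n)  # samples from a uniform prior
weights = np.ones(n) / n

def step(particles, weights, evidence):
    """One monitoring step over the sample-based belief state."""
    # Sample each particle's successor from the transition model.
    nxt = np.array([rng.choice(2, p=T[x]) for x in particles])
    # Reweight by the likelihood of the new evidence, then renormalize.
    w = weights * O[nxt, evidence]
    return nxt, w / w.sum()

particles, weights = step(particles, weights, evidence=0)
# Marginal belief over the two states, estimated from the samples.
belief = np.array([weights[particles == x].sum() for x in (0, 1)])
```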

122 | Optimal Control of Markov Decision Processes with Incomplete State Estimation
- Astrom
- 1965

Citation Context: ...e of the process is rarely known with certainty. However, most reasoning tasks can be performed by using a belief state, which is a probability distribution over the state of a system at a given time [1]. It follows from the Markov assumption that the belief state at time t completely captures all of our information about the past. In particular, it suffices both for predicting the probabilities of f...
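The belief-state recursion described in this context — prediction through the transition model, then conditioning on the new evidence — is the standard forward (filtering) update. A minimal sketch on a hypothetical two-state HMM (all numbers are made up):

```python
import numpy as np

T = np.array([[0.7, 0.3],
              [0.4, 0.6]])  # T[x, x'] = P(next state x' | state x)
O = np.array([[0.9, 0.1],
              [0.2, 0.8]])  # O[x, e] = P(evidence e | state x)

def update(belief, evidence):
    """One monitoring step: predict with the transition model,
    then condition on the observed evidence and renormalize."""
    predicted = belief @ T              # sum_x b(x) * T[x, x']
    posterior = predicted * O[:, evidence]
    return posterior / posterior.sum()

belief = np.array([0.5, 0.5])  # uniform initial belief
for e in [0, 0, 1]:            # a short evidence sequence
    belief = update(belief, e)
```

Because the belief state summarizes everything the past implies about the present, this single vector is all the monitor needs to carry forward between time steps.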

98 | The batmobile: Towards a bayesian automated taxi
- Forbes, Huang, et al.
- 1995

Citation Context: ... of magnitude faster inference can be achieved with only a small degradation in accuracy. 1 Introduction The ability to model and reason about stochastic processes is fundamental to many applications [6, 8, 3, 14]. A number of formal models have been developed for describing situations of this type, including Hidden Markov Models [15], Kalman Filters [9], and Dynamic Bayesian Networks [4]. These very different...

98 | Exploiting tractable substructures in intractable networks
- Saul, Jordan
- 1995

Citation Context: ...deas can be viewed as falling into the framework described in this paper. However, neither contains any analysis nor an explicit connection to the structure of the process. The recent work of [7] and [16] utilize mean field approximation in the context of various types of structured HMMs. Of these approaches, [7] is the closest to our work (to the part of it dealing with structured processes). In thei...

64 | Dynamic network models for forecasting
- Galper, Horvitz
- 1992

Citation Context: ... of magnitude faster inference can be achieved with only a small degradation in accuracy. 1 Introduction The ability to model and reason about stochastic processes is fundamental to many applications [6, 8, 3, 14]. A number of formal models have been developed for describing situations of this type, including Hidden Markov Models [15], Kalman Filters [9], and Dynamic Bayesian Networks [4]. These very different...

62 | A scheme for approximating probabilistic inference
- Dechter, Rish
- 1997

Citation Context: ...licit bounds (in expectation) on the total error. 4 Some approximate inference algorithms for nontemporal Bayesian networks can also be applied to this task. Specifically, the mini-bucket approach of [5] is somewhat related to ours, as it also uses a (different) form of factoring decomposition during the course of inference. However, none of the potentially relevant algorithms are associated with any...

54 | Using learning for approximation in stochastic processes
- Koller, Fratkina
- 1998

Citation Context: ... complex, we use a compactly represented approximate belief state. For example, in the context of a DBN, we might choose to represent an approximate belief state using a factored representation. (See [12] for some discussion of possible belief state representations for DBNs.) In the context of a hybrid process, we might choose to restrict the number of components in our Gaussian mixture. This idea imm...

23 | A computational scheme for reasoning in dynamic probabilistic networks
- Kjærulff
- 1992

Citation Context: ...nce to the monitoring task. Exact inference for this task is infeasible in complex stochastic processes with a large number of states. Even if the process is highly structured, exact inference (e.g., [11]) is forced into intractability by the full correlation of the belief state that invariably occurs. We propose an approach whereby the algorithm maintains an approximate belief state with compact repr...

18 | Tradeoffs in constructing and evaluating temporal influence diagrams
- Provan
- 1993

Citation Context: ... of magnitude faster inference can be achieved with only a small degradation in accuracy. 1 Introduction The ability to model and reason about stochastic processes is fundamental to many applications [6, 8, 3, 14]. A number of formal models have been developed for describing situations of this type, including Hidden Markov Models [15], Kalman Filters [9], and Dynamic Bayesian Networks [4]. These very different...

8 | An expert system for control of waste water treatment — a pilot project
- Jensen, Kjaerulff, et al.
- 1989
