Results 1 – 2 of 2
Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks
Abstract

Cited by 256 (10 self)
Particle filters (PFs) are powerful sampling-based inference/learning algorithms for dynamic Bayesian networks (DBNs). They allow us to treat, in a principled way, any type of probability distribution, nonlinearity and nonstationarity. They have appeared in several fields under such names as “condensation”, “sequential Monte Carlo” and “survival of the fittest”. In this paper, we show how we can exploit the structure of the DBN to increase the efficiency of particle filtering, using a technique known as Rao-Blackwellisation. Essentially, this samples some of the variables, and marginalizes out the rest exactly, using the Kalman filter, HMM filter, junction tree algorithm, or any other finite dimensional optimal filter. We show that Rao-Blackwellised particle filters (RBPFs) lead to more accurate estimates than standard PFs. We demonstrate RBPFs on two problems, namely nonstationary online regression with radial basis function networks and robot localization and map building. We also discuss other potential application areas and provide references to some finite dimensional optimal filters.
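The idea in the abstract can be sketched in code. The toy model below is an assumption for illustration, not the paper's: a two-state Markov regime s_t selects the drift of a scalar linear-Gaussian state x_t, and y_t observes x_t with Gaussian noise. Each particle samples only the discrete regime and runs an exact Kalman filter for x_t, which is the Rao-Blackwellised split described above (the function name `rbpf` and all parameters are hypothetical).

```python
import random, math

TRANS = [[0.9, 0.1], [0.1, 0.9]]   # assumed regime transition matrix P(s_t | s_{t-1})
DRIFT = [0.0, 1.0]                 # regime-dependent drift of x_t
Q, R = 0.1, 0.5                    # process / observation noise variances

def rbpf(ys, n_particles=200, seed=0):
    rng = random.Random(seed)
    # particle = (regime s, Kalman mean m, Kalman variance P)
    parts = [(rng.randrange(2), 0.0, 1.0) for _ in range(n_particles)]
    for y in ys:
        new, weights = [], []
        for s, m, P in parts:
            # sample the discrete regime (the "sampled" variables)
            s2 = 0 if rng.random() < TRANS[s][0] else 1
            # exact Kalman predict/update for x (the marginalised variables)
            m_pred, P_pred = m + DRIFT[s2], P + Q
            S = P_pred + R                      # innovation variance
            K = P_pred / S                      # Kalman gain
            m2 = m_pred + K * (y - m_pred)
            P2 = (1 - K) * P_pred
            # weight by the predictive likelihood p(y | s_{1:t}, y_{1:t-1})
            w = math.exp(-0.5 * (y - m_pred) ** 2 / S) / math.sqrt(2 * math.pi * S)
            new.append((s2, m2, P2)); weights.append(w)
        # multinomial resampling
        total = sum(weights)
        parts = rng.choices(new, weights=[w / total for w in weights], k=n_particles)
    # posterior mean of x_t, averaged over particles
    return sum(m for _, m, _ in parts) / n_particles

est = rbpf([0.1, 1.2, 2.0, 3.1, 4.0])
```

Because x_t is integrated out exactly, the particles live in the low-dimensional discrete space only, which is why RBPFs give lower-variance estimates than a standard PF sampling (s_t, x_t) jointly.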
Logical particle filtering
In Proceedings of the Dagstuhl Seminar on Probabilistic, Logical, and Relational Learning, 2007
Abstract

Cited by 7 (2 self)
In this paper, we consider the problem of filtering in relational hidden Markov models. We present a compact representation for such models and an associated logical particle filtering algorithm. Each particle contains a logical formula that describes a set of states. The algorithm updates the formulae as new observations are received. Since a single particle tracks many states, this filter can be more accurate than a traditional particle filter in high dimensional state spaces, as we demonstrate in experiments. Consider an agent operating in a complex environment, made up of an unknown, possibly infinite, number of objects. The agent can take actions and make observations of the state of the world, and it knows a probabilistic model of how the state changes over time as a result of its actions and of how the observations are generated from the states. How can it efficiently estimate the underlying state of the environment? Filtering is the problem of predicting a distribution over the underlying environment state given a history of the agent’s …
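The key point of the abstract — one particle tracking many states at once — can be illustrated with a deliberately simplified sketch. The toy domain and the set-based encoding below are assumptions for illustration, not the paper's representation: each "logical particle" is a set of possible world states, standing in for a logical formula, and a (noise-free) observation update shrinks it to the consistent states.

```python
from itertools import product

# Assumed toy world: 3 binary doors; a state is a tuple of door positions
# (0 = closed, 1 = open). A particle is the SET of states it considers
# possible, i.e. the extension of its logical formula.
ALL_STATES = set(product([0, 1], repeat=3))

def observe(particle, door, value):
    """Condition a particle (a state set) on 'door has position value'."""
    return {s for s in particle if s[door] == value}

p = set(ALL_STATES)      # initially the formula "true": all 8 states
p = observe(p, 0, 1)     # observe door 0 open   -> 4 states remain
p = observe(p, 2, 0)     # observe door 2 closed -> 2 states remain
```

A single such particle covers what would take many point-state particles in a standard filter, which is the intuition behind the accuracy gains the abstract reports in high-dimensional state spaces.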