## Inference in Hybrid Networks: Theoretical Limits and Practical Algorithms (2001)

Venue: UAI

Citations: 62 (3 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Lerner01inferencein,
  author    = {Uri Lerner and Ronald Parr},
  title     = {Inference in Hybrid Networks: Theoretical Limits and Practical Algorithms},
  booktitle = {Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI)},
  year      = {2001},
  pages     = {310--318}
}
```



### Abstract

An important subclass of hybrid Bayesian networks …

### Citations

740 | Statistical methods for speech recognition - Jelinek - 1997

714 | Tracking and Data Association - Bar-Shalom, Fortmann - 1988

Citation Context: ...us variables given the discrete ones is a multivariate Gaussian. CLG models are popular in a variety of applications, in both static and dynamic settings. Example applications include target tracking [1], where the continuous variables represent the state of one or more targets and the discrete variables might model the maneuver type; visual tracking (e.g., [13]), where the continuous variables repre...
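The context above characterizes CLG models: conditioned on any assignment to the discrete variables, the continuous variables are jointly Gaussian, with a mean that is linear in the continuous parents. A minimal numeric sketch of a single CLG node (all parameters hypothetical, chosen only for illustration, not taken from the paper):

```python
import numpy as np

# One CLG node: given its discrete parent d and continuous parent u,
# x is Gaussian with a mean linear in u. Two hypothetical regimes,
# e.g. a "maneuver type" switch in a tracking model.
params = {
    0: {"a": 0.0, "b": 1.0, "var": 0.1},  # regime 0: x ~ N(u, 0.1)
    1: {"a": 2.0, "b": 0.5, "var": 1.0},  # regime 1: x ~ N(2 + 0.5u, 1.0)
}

def sample_x(d, u, rng):
    """Draw x from the conditional linear Gaussian p(x | d, u)."""
    p = params[d]
    return rng.normal(p["a"] + p["b"] * u, np.sqrt(p["var"]))

rng = np.random.default_rng(0)
xs = [sample_x(1, 1.0, rng) for _ in range(10_000)]
print(np.mean(xs))  # empirical mean, close to 2 + 0.5 * 1.0 = 2.5
```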

667 | On sequential Monte Carlo sampling methods for Bayesian filtering - Doucet, Andrieu, et al.

Citation Context: ... variables, set the evidence d and then sample from the tree, ignoring anything which is not in Δ1. One can view this method as a static version of Rao-Blackwellized Particle Filtering, or RBPF [4]: we sample the discrete variables and solve the remaining continuous problem analytically. The sampling method runs into problems when the prior probability mass of a very small number of hypothes...
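The static Rao-Blackwellized scheme described in this context can be sketched in a few lines: sample only the discrete variable, and handle the continuous part in closed form. A toy version with one binary discrete variable, one continuous variable, and a noisy observation (all parameters hypothetical):

```python
import numpy as np

# Toy CLG: d ~ Bernoulli(0.5); x | d ~ N(mu[d], s2[d]); y = x + N(0, r2).
mu = {0: -1.0, 1: 2.0}
s2 = {0: 0.5, 1: 0.5}
r2 = 0.25
y = 1.5  # observed value

rng = np.random.default_rng(1)
weights, cond_means = [], []
for _ in range(5_000):
    d = rng.integers(0, 2)       # sample only the discrete variable
    var_y = s2[d] + r2           # conditional on d, everything is Gaussian:
    # weight by p(y | d) (unnormalized) and take E[x | d, y] in closed form
    w = np.exp(-0.5 * (y - mu[d]) ** 2 / var_y) / np.sqrt(var_y)
    k = s2[d] / var_y            # Kalman-style gain
    weights.append(w)
    cond_means.append(mu[d] + k * (y - mu[d]))

est = np.average(cond_means, weights=weights)
print(est)  # Rao-Blackwellized estimate of E[x | y]
```

Because the continuous posterior is computed analytically for each sampled discrete assignment, the only Monte Carlo variance left is over the discrete variable, which is the point of the Rao-Blackwellization.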

639 | Diagnosing Multiple Faults - de Kleer, Williams

Citation Context: ...(hypotheses with a small number of failures, or failures which tend to happen together). This idea of concentrating on likely hypotheses with a small number of faults is very natural, and was used in [3], although the probabilistic model there was discrete and some strong independence assumptions were made. 4.2 Monte Carlo Methods We begin by presenting a naive algorithm that enumerates all the hypot...

564 | Probabilistic inference using Markov chain Monte Carlo methods - Neal - 1993

263 | Tractable inference for complex stochastic processes - Boyen, Koller - 1998

Citation Context: ... large, keeping even one Gaussian for every possible combination of discrete variables in the belief state may be too expensive. Our algorithm can also be integrated effectively with the BK algorithm [2], which was adapted for hybrid systems in [10]. The key idea is to exploit the fact that large systems are often composed of subsystems, and while the subsystems are correlated, the interaction betwee...

144 | Variational learning for switching state-space models - Ghahramani, Hinton

Citation Context: ... prove that unless P=NP there does not exist a polynomial-time approximate inference algorithm with absolute error smaller than 0.5. The class of networks we consider includes Switching Kalman Filters [1, 6] as a special case; thus, we provide the first formal complexity results for this important class of models. The second part of the paper addresses the question of how to perform inference in CLG mode...

140 | Propagation of probabilities, means and variances in mixed graphical association models - Lauritzen - 1992

Citation Context: ...part of the paper addresses the question of how to perform inference in CLG models in light of our complexity results. The commonly used approach for CLG models is the algorithm proposed by Lauritzen [8, 9], which is an extension of the standard clique tree algorithm. Not surprisingly, since inference in CLGs is NP-hard even for simple networks, the size of the resulting clique tree is often exponential...

111 | A dynamic Bayesian network approach to figure tracking using learned dynamic models - Pavlovic, Rehg, et al. - 1999

Citation Context: ...e applications include target tracking [1], where the continuous variables represent the state of one or more targets and the discrete variables might model the maneuver type; visual tracking (e.g., [13]), where the continuous variables represent the head, legs, and torso position of a person and the discrete variables the type of movement; fault diagnosis [10], where discrete events can affect a cont...

83 | Bayesian fault detection and diagnosis in dynamic systems - Lerner, Parr, et al. - 2000

Citation Context: ...e maneuver type; visual tracking (e.g., [13]) where the continuous variables represent the head, legs, and torso position of a person and the discrete variables the type of movement; fault diagnosis [10], where discrete events can affect a continuous process; and speech recognition [7, ch.9], where a discrete phoneme determines a distribution over the acoustic signal. The first part of the paper deal...

66 | An efficient algorithm for finding the M most probable configurations in probabilistic expert systems - Nilsson - 1998

Citation Context: ...s variables in the BN for the purposes of the enumeration: we only care about the discrete problem of enumerating from P(Δ1 | d). The key subroutine of this method is an algorithm presented in [12] for enumerating the K most likely configurations of a discrete BN given some evidence. The algorithm generates a clique tree and uses it to generate the configurations in an anytime fashion. The comp...
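The enumeration subroutine described here can be illustrated on a trivial case. The sketch below brute-forces the K most probable assignments to a few independent binary fault variables (hypothetical failure probabilities); the algorithm in [12] instead generates configurations lazily from a clique tree in an anytime fashion, which this toy version does not attempt.

```python
import heapq
import itertools

# Hypothetical independent failure probabilities for three components.
p_fail = [0.1, 0.05, 0.2]

def k_best(p_fail, k):
    """Return the k most probable fault assignments as (prob, bits) pairs,
    bits[i] = 1 meaning component i failed. Brute force over 2^n assignments."""
    scored = []
    for bits in itertools.product([0, 1], repeat=len(p_fail)):
        prob = 1.0
        for b, p in zip(bits, p_fail):
            prob *= p if b else (1 - p)
        scored.append((prob, bits))
    return heapq.nlargest(k, scored)

for prob, bits in k_best(p_fail, 3):
    print(bits, round(prob, 4))
```

As the citation context notes, the most probable hypotheses here are the all-healthy assignment followed by single-fault ones, which is exactly why concentrating enumeration on few-fault hypotheses is effective.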

4 | Stable local computation with conditional Gaussian distributions - Lauritzen, Jensen - 1999

Citation Context: ...part of the paper addresses the question of how to perform inference in CLG models in light of our complexity results. The commonly used approach for CLG models is the algorithm proposed by Lauritzen [8, 9], which is an extension of the standard clique tree algorithm. Not surprisingly, since inference in CLGs is NP-hard even for simple networks, the size of the resulting clique tree is often exponential...