Results 1–10 of 12
Dynamic Bayesian Networks: Representation, Inference and Learning
, 2002
"... Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have bee ..."
Abstract

Cited by 563 (3 self)
 Add to MetaCart
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have been used for problems ranging from tracking planes and missiles to predicting the economy. However, HMMs
and KFMs are limited in their “expressive power”. Dynamic Bayesian Networks (DBNs) generalize HMMs by allowing the state space to be represented in factored form, instead of as a single discrete random variable. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian. In this thesis, I will discuss how to represent many different kinds of models as DBNs, how to perform exact and approximate inference in DBNs, and how to learn DBN models from sequential data.
In particular, the main novel technical contributions of this thesis are as follows: a way of representing
Hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T^3), where T is the length of the sequence; an exact smoothing algorithm that takes O(log T) space instead of O(T); a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored frontier; an analysis of the relationship between the BK algorithm and loopy belief propagation; a way of
applying Rao-Blackwellised particle filtering to DBNs in general, and the SLAM (simultaneous localization
and mapping) problem in particular; a way of extending the structural EM algorithm to DBNs; and a variety of different applications of DBNs. However, perhaps the main value of the thesis is its catholic presentation of the field of sequential data modelling.
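The O(T) filtering claim can be illustrated with the simplest DBN of all: an HMM with one discrete hidden variable per time slice, where forward filtering computes P(state_t | observations up to t) in a single left-to-right pass. A minimal sketch, with hypothetical two-state parameters not taken from the thesis:

```python
def forward_filter(prior, trans, emit, observations):
    """prior[i] = P(X_1 = i); trans[i][j] = P(X_{t+1} = j | X_t = i);
    emit[i][o] = P(O_t = o | X_t = i). Returns P(X_T | O_1..O_T)."""
    n = len(prior)
    # Initialise with the first observation and normalise.
    alpha = [prior[i] * emit[i][observations[0]] for i in range(n)]
    z = sum(alpha)
    alpha = [a / z for a in alpha]
    for obs in observations[1:]:
        # Predict: push the belief through the transition model.
        pred = [sum(alpha[i] * trans[i][j] for i in range(n)) for j in range(n)]
        # Update: weight by the likelihood of the new observation.
        alpha = [pred[j] * emit[j][obs] for j in range(n)]
        z = sum(alpha)
        alpha = [a / z for a in alpha]
    return alpha

# Hypothetical two-state model with binary observations.
prior = [0.5, 0.5]
trans = [[0.7, 0.3], [0.3, 0.7]]
emit = [[0.9, 0.1], [0.2, 0.8]]
belief = forward_filter(prior, trans, emit, [0, 0, 1])
```

Each time step costs O(n^2) regardless of T, which is the sense in which the whole pass is O(T).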
Context-Specific Independence in Bayesian Networks
, 1996
"... Bayesiannetworks provide a languagefor qualitatively representing the conditional independence properties of a distribution. This allows a natural and compact representation of the distribution, eases knowledge acquisition, and supports effective inference algorithms. ..."
Abstract

Cited by 288 (29 self)
 Add to MetaCart
Bayesian networks provide a language for qualitatively representing the conditional independence properties of a distribution. This allows a natural and compact representation of the distribution, eases knowledge acquisition, and supports effective inference algorithms.
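Context-specific independence, the paper's subject, can be made concrete with a tiny hypothetical tree-structured conditional distribution: in the context A = True, X is independent of B, so the CPT need not enumerate every parent combination.

```python
def p_x_given(a, b):
    # Tree-structured CPT (hypothetical numbers): branch on A first;
    # B is consulted only in the context A = False.
    if a:
        return 0.9            # P(X=1 | A=True), regardless of B
    return 0.7 if b else 0.2  # P(X=1 | A=False, B)

# Context-specific independence: given A = True, B is irrelevant.
csi_holds = p_x_given(True, True) == p_x_given(True, False)
```

Storing the CPT as a tree with three leaves instead of a four-row table is exactly the kind of compactness the abstract refers to.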
Inference in belief networks: A procedural guide
 International Journal of Approximate Reasoning
, 1996
"... Belief networks are popular tools for encoding uncertainty in expert systems. These networks rely on inference algorithms to compute beliefs in the context of observed evidence. One established method for exact inference onbelief networks is the Probability Propagation in Trees of Clusters (PPTC) al ..."
Abstract

Cited by 149 (6 self)
 Add to MetaCart
Belief networks are popular tools for encoding uncertainty in expert systems. These networks rely on inference algorithms to compute beliefs in the context of observed evidence. One established method for exact inference on belief networks is the Probability Propagation in Trees of Clusters (PPTC) algorithm, as developed by Lauritzen and Spiegelhalter and refined by Jensen et al. [1, 2, 3] PPTC converts the belief network into a secondary structure, then computes probabilities by manipulating the secondary structure. In this document, we provide a self-contained, procedural guide to understanding and implementing PPTC. We synthesize various optimizations to PPTC that are scattered throughout the literature. We articulate undocumented "open secrets" that are vital to producing a robust and efficient implementation of PPTC. We hope that this document makes probabilistic inference more accessible and affordable to those without extensive prior exposure.
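For scale, the quantity PPTC computes, a posterior P(query | evidence), can be shown on a toy network by brute-force enumeration. This is a sketch of the target computation only, not of PPTC's clique-tree machinery; the Rain -> WetGrass <- Sprinkler network and its numbers are hypothetical.

```python
from itertools import product

P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
# P(WetGrass=True | Rain, Sprinkler)
P_wet = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.8, (False, False): 0.0}

def joint(r, s, w):
    # One factor per node, conditioned on its parents.
    pw = P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * (pw if w else 1.0 - pw)

def query_rain_given_wet():
    # P(Rain=True | WetGrass=True): sum the joint over Sprinkler, normalise.
    num = sum(joint(True, s, True) for s in (True, False))
    den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
    return num / den

posterior = query_rain_given_wet()
```

Enumeration is exponential in the number of variables; PPTC's point is to get the same answers by local message passing on the secondary structure instead.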
Topological Parameters for Time-Space Tradeoff
 ARTIFICIAL INTELLIGENCE
, 1996
"... In this paper we propose a family of algorithms combining treeclustering with conditioning that trade space for time. Such algorithms are useful for reasoning in probabilistic and deterministic networks as well as for accomplishing optimization tasks. By analyzing the problem structure it will be p ..."
Abstract

Cited by 55 (12 self)
 Add to MetaCart
In this paper we propose a family of algorithms combining tree-clustering with conditioning that trade space for time. Such algorithms are useful for reasoning in probabilistic and deterministic networks as well as for accomplishing optimization tasks. By analyzing the problem structure it will be possible to select from a spectrum the algorithm that best meets a given time-space specification.
Local Conditioning in Bayesian Networks
 Artificial Intelligence
, 1996
"... Local conditioning (LC) is an exact algorithm for computing probability in Bayesian networks, developed as an extension of Kim and Pearl's algorithm for singlyconnected networks. A list of variables associated to each node guarantees that only the nodes inside a loop are conditioned on the variable ..."
Abstract

Cited by 27 (6 self)
 Add to MetaCart
Local conditioning (LC) is an exact algorithm for computing probability in Bayesian networks, developed as an extension of Kim and Pearl's algorithm for singly-connected networks. A list of variables associated with each node guarantees that only the nodes inside a loop are conditioned on the variable which breaks it. The main advantage of this algorithm is that it computes the probability directly on the original network instead of building a cluster tree, and this can save time when debugging a model and when the sparsity of evidence allows a pruning of the network. The algorithm is also advantageous when some families in the network interact through AND/OR gates. A parallel implementation of the algorithm with a processor for each node is possible even in the case of multiply-connected networks.

1 Introduction

A Bayesian network is an acyclic directed graph in which every node represents a random variable, together with a probability distribution such that P(x_1, ..., x_n) = ...
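The formula cut off at the end of the snippet is the standard Bayesian network factorization, P(x_1, ..., x_n) = prod_i P(x_i | parents(x_i)). A minimal sketch with hypothetical CPTs for a chain A -> B -> C:

```python
# Hypothetical CPTs for binary variables in the chain A -> B -> C.
P_a = {0: 0.6, 1: 0.4}
P_b_given_a = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
P_c_given_b = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.1, 1: 0.9}}

def joint(a, b, c):
    # One factor per node, each conditioned on its parents only.
    return P_a[a] * P_b_given_a[a][b] * P_c_given_b[b][c]

# The factored joint is a proper distribution: it sums to 1.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
```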
Exploiting System Structure in Model-Based Diagnosis of Discrete-Event Systems
, 1996
"... We describe an implemented system for modelbased diagnosis of discreteevent systems, which are continuous systems governed by discrete controllers. Our approach contributes to modelbased diagnosis by taking advantage of the system structure (a directed graph depicting component interconnectivity) ..."
Abstract

Cited by 23 (6 self)
 Add to MetaCart
We describe an implemented system for model-based diagnosis of discrete-event systems, which are continuous systems governed by discrete controllers. Our approach contributes to model-based diagnosis by taking advantage of the system structure (a directed graph depicting component interconnectivity) to efficiently diagnose dynamic systems, and by providing computational guarantees based on such structure. Specifically, we use a propositional temporal logic to describe discrete-event systems, but constrain the temporal sentences defining a system using the topology of its structure. We describe a computational machinery for computing and focusing consistency-based diagnoses based on a structured set of temporal sentences, the complexity of which is dependent on the topology of the given structure. We explain how to engineer the system structure to ensure that our algorithms are efficient. We also explain why our approach has proven effective for diagnosing discrete-event systems by iden...
A logical notion of conditional independence: Properties and applications
 ARTIFICIAL INTELLIGENCE
, 1997
"... We propose a notion of conditional independence with respect to propositional logic and study some of its key properties. We present several equivalent formulations of the proposed notion, each oriented towards a specific application of logical reasoning such as abduction and diagnosis. We suggest a ..."
Abstract

Cited by 17 (3 self)
 Add to MetaCart
We propose a notion of conditional independence with respect to propositional logic and study some of its key properties. We present several equivalent formulations of the proposed notion, each oriented towards a specific application of logical reasoning such as abduction and diagnosis. We suggest a framework for utilizing logical independence computationally by structuring a propositional logic database around a directed acyclic graph. This structuring explicates many of the independences satisfied by the underlying database. Based on these structural independences, we develop an algorithm for a class of structured databases that is not necessarily Horn. The algorithm is linear in the size of a database structure and can be used for deciding entailment, computing abductions and diagnoses. The presented results are motivated by similar results in the literature on probabilistic and constraint-based reasoning.
DIAVAL, a Bayesian expert system for echocardiography
 ARTIFICIAL INTELLIGENCE IN MEDICINE 10
, 1997
"... DIAVAL is an expert system for the diagnosis of heart diseases, based on several kinds of data, mainly from echocardiography. The first part of this paper is devoted to the causal probabilistic model which constitutes the knowledge base of the expert system in the form of a Bayesian network, emphasi ..."
Abstract

Cited by 11 (2 self)
 Add to MetaCart
DIAVAL is an expert system for the diagnosis of heart diseases, based on several kinds of data, mainly from echocardiography. The first part of this paper is devoted to the causal probabilistic model which constitutes the knowledge base of the expert system in the form of a Bayesian network, emphasizing the importance of the OR gate. The second part deals with the process of diagnosis, which consists of computing the a posteriori probabilities, selecting the most probable and most relevant diagnoses, and generating a written report. It also describes the results of the evaluation of the program.
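The abstract's emphasis on the OR gate refers to the noisy-OR model for combining several independent causes of a finding. A minimal sketch, with cause names and inhibition probabilities that are purely illustrative, not DIAVAL's:

```python
def noisy_or(present_causes, inhibition):
    """P(finding | present causes) under a noisy-OR gate: each present
    cause independently fails to produce the finding with its own
    inhibition probability q_i, so the finding is absent only if
    every present cause is inhibited."""
    p_all_inhibited = 1.0
    for cause in present_causes:
        p_all_inhibited *= inhibition[cause]
    return 1.0 - p_all_inhibited

# Hypothetical inhibition probabilities for two cardiac causes.
q = {"stenosis": 0.3, "regurgitation": 0.5}
p_one = noisy_or(["stenosis"], q)
p_both = noisy_or(["stenosis", "regurgitation"], q)
```

The gate needs only one parameter per cause instead of a full table over all cause combinations, which is why it matters for knowledge acquisition in large medical networks.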
Information Fusion, Causal Probabilistic Network And Probanet II: Inference Algorithms and Probanet System
 Proc. 1st Intl. Workshop on Image Analysis and Information Fusion
, 1997
"... As an extension of an overview paper [Pan and McMichael, 1997] on information fusion and Causal Probabilistic Networks (CPN), this paper formalizes kernel algorithms for probabilistic inferences upon CPNs. Information fusion is realized through updating joint probabilities of the variables upon the ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
As an extension of an overview paper [Pan and McMichael, 1997] on information fusion and Causal Probabilistic Networks (CPNs), this paper formalizes kernel algorithms for probabilistic inference on CPNs. Information fusion is realized through updating joint probabilities of the variables upon the arrival of new evidence or new hypotheses. Kernel algorithms for some dominant methods of inference are formalized from discontiguous, mathematics-oriented literature, with gaps filled in with regard to computability and completeness. In particular, possible optimizations of the causal tree algorithm, graph triangulation and the junction tree algorithm are discussed. Probanet has been designed and developed as a generic shell, a mother system for CPN construction and application. The design aspects and current status of Probanet are described. A few directions for research and system development are pointed out, including hierarchical structuring of networks, structure decomposition and adaptive inference algorithms. This paper thus integrates a literature review, algorithm formalization and a future perspective.
Alarms for Monitoring: A Decision-Theoretic Framework
, 1997
"... Alarms are intended to aid in the monitoring of processes over time. We present a general framework for designing such alarms. Our framework is founded on decision theory, and it incorporates notions of time: process states that change over time, decisions that recur over time, and decision crit ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Alarms are intended to aid in the monitoring of processes over time. We present a general framework for designing such alarms. Our framework is founded on decision theory, and it incorporates notions of time: process states that change over time, decisions that recur over time, and decision criteria that consider the future effects of present actions. To address the need to prioritize alerts, we model the notion of alert urgency within our framework: we define this urgency and provide a quantitative measure of it.
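The decision-theoretic core of such a framework can be sketched as an expected-utility comparison; the cost names and numbers below are hypothetical, not the paper's model.

```python
def should_alert(p_fault, cost_false_alarm, cost_missed_fault):
    # Alert exactly when the expected cost of staying silent
    # (missing a real fault) exceeds the expected cost of alerting
    # (a false alarm when no fault is present).
    return p_fault * cost_missed_fault > (1.0 - p_fault) * cost_false_alarm

urgent = should_alert(0.5, 1.0, 10.0)   # expected miss cost 5.0 vs 0.5
quiet = should_alert(0.01, 1.0, 10.0)   # expected miss cost 0.1 vs 0.99
```

The same comparison, evaluated with costs that look ahead to the future effects of the action, yields a natural quantitative notion of how urgent an alert is.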