Results 1–10 of 64
Exploiting Causal Independence in Bayesian Network Inference
 Journal of Artificial Intelligence Research
, 1996
Abstract

Cited by 157 (9 self)
A new method is proposed for exploiting causal independencies in exact Bayesian network inference.
Learning dynamic Bayesian networks
 Adaptive Processing of Sequences and Data Structures
, 1998
Abstract

Cited by 124 (0 self)
Bayesian networks are directed acyclic graphs that represent dependencies between variables in a probabilistic model. Many time series models, including the hidden Markov models (HMMs) used in speech recognition and Kalman filter models used in filtering and control applications, can be viewed as examples of dynamic Bayesian networks. We first provide a brief tutorial on learning and Bayesian networks. We then present some dynamic Bayesian networks that can capture much richer structure than HMMs and Kalman filters, including spatial and temporal multiresolution structure, distributed hidden state representations, and multiple switching linear regimes. While exact probabilistic inference is intractable in these networks, one can obtain tractable variational approximations which call as subroutines the forward-backward and Kalman filter recursions. These approximations can be used to learn the model parameters...
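As a concrete illustration of one of the subroutines mentioned in the abstract, here is a minimal forward recursion (half of the forward-backward algorithm) for a discrete HMM. This is a sketch of mine, not code from the paper; the two-state model and all its numbers are invented for the example.

```python
def forward(pi, A, B, obs):
    """alpha[t][i] = P(o_1..o_t, state_t = i) for a discrete HMM.
    pi: initial state distribution; A[j][i] = P(state_t=i | state_{t-1}=j);
    B[i][o] = P(obs=o | state=i)."""
    N = len(pi)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
    for o in obs[1:]:
        prev = alpha[-1]
        alpha.append([
            sum(prev[j] * A[j][i] for j in range(N)) * B[i][o]
            for i in range(N)
        ])
    return alpha

# Toy two-state HMM with made-up parameters.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
alpha = forward(pi, A, B, [0, 1, 0])
likelihood = sum(alpha[-1])   # P(o_1..o_T), summing out the final state
```

The backward pass is symmetric; together they yield the posteriors over hidden states that the variational approximations reuse.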
Decision Theory in Expert Systems and Artificial Intelligence
 International Journal of Approximate Reasoning
, 1988
Abstract

Cited by 89 (18 self)
Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision-theoretic framework. Recent analyses of the restrictions of several traditional AI reasoning techniques, coupled with the development of more tractable and expressive decision-theoretic representation and inference strategies, have stimulated renewed interest in decision theory and decision analysis. We describe early experience with simple probabilistic schemes for automated reasoning, review the dominant expert-system paradigm, and survey some recent research at the crossroads of AI and decision science. In particular, we present the belief network and influence diagram representations. Finally, we discuss issues that have not been studied in detail within the expert-systems sett...
Causal independence for probability assessment and inference using Bayesian networks
 IEEE Trans. on Systems, Man and Cybernetics
, 1994
Abstract

Cited by 65 (2 self)
A Bayesian network is a probabilistic representation for uncertain relationships, which has proven to be useful for modeling real-world problems. When there are many potential causes of a given effect, however, both probability assessment and inference using a Bayesian network can be difficult. In this paper, we describe causal independence, a collection of conditional independence assertions and functional relationships that are often appropriate to apply to the representation of the uncertain interactions between causes and effect. We show how the use of causal independence in a Bayesian network can greatly simplify probability assessment as well as probabilistic inference.
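The pattern this abstract describes can be illustrated with the noisy-OR model, the best-known instance of causal independence. The sketch below is mine, not the paper's, and all probabilities are made up.

```python
def noisy_or(p, present):
    """P(effect | causes) under causal independence: each active cause i
    produces the effect independently with probability p[i], so the effect
    is absent only if every active cause independently fails."""
    q = 1.0
    for p_i, active in zip(p, present):
        if active:
            q *= 1.0 - p_i
    return 1.0 - q

# Three potential causes with made-up per-cause probabilities: only n
# numbers need assessing, instead of the 2**n rows of a full table.
p = [0.8, 0.5, 0.3]
result = noisy_or(p, [True, True, False])   # 1 - (0.2 * 0.5) = 0.9
```

This is why assessment simplifies: the expert supplies one number per cause, and the full conditional distribution is reconstructed from the functional form.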
Efficient Reasoning in Qualitative Probabilistic Networks
 In Proceedings of the 11th National Conference on Artificial Intelligence (AAAI-93)
, 1993
Abstract

Cited by 50 (7 self)
Qualitative Probabilistic Networks (QPNs) are an abstraction of Bayesian belief networks, replacing numerical relations by qualitative influences and synergies [Wellman, 1990b]. To reason in a QPN is to find the effect of new evidence on each node in terms of the sign of the change in belief (increase or decrease). We introduce a polynomial-time algorithm for reasoning in QPNs, based on local sign propagation. It extends our previous scheme from singly connected to general multiply connected networks. Unlike existing graph-reduction algorithms, it preserves the network structure and determines the effect of evidence on all nodes in the network. This aids meta-level reasoning about the model and automatic generation of intuitive explanations of probabilistic reasoning.
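Sign propagation rests on a small sign algebra. The operator tables below follow Wellman's qualitative calculus, but the code is a sketch of mine, not the authors' propagation algorithm.

```python
# Signs of qualitative influence: '+', '-', '0' (no influence), '?' (unknown).

def sign_product(a, b):
    """Sign of an influence chained through two links (Wellman's ⊗):
    '0' dominates, then '?', otherwise like signs give '+'."""
    if '0' in (a, b):
        return '0'
    if '?' in (a, b):
        return '?'
    return '+' if a == b else '-'

def sign_sum(a, b):
    """Combine two parallel influences on the same node (Wellman's ⊕):
    conflicting non-zero signs are ambiguous."""
    if a == '0':
        return b
    if b == '0':
        return a
    return a if a == b else '?'

chained = sign_product('+', '-')   # a positive link followed by a negative one
parallel = sign_sum('+', '-')      # two opposing paths into one node
```

Local sign propagation repeatedly applies these two operators along the arcs, which is what keeps the algorithm polynomial-time.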
Soft Computing: the Convergence of Emerging Reasoning Technologies
 Soft Computing
, 1997
Abstract

Cited by 50 (8 self)
The term Soft Computing (SC) represents the combination of emerging problem-solving technologies such as Fuzzy Logic (FL), Probabilistic Reasoning (PR), Neural Networks (NNs), and Genetic Algorithms (GAs). Each of these technologies provides us with complementary reasoning and searching methods to solve complex, real-world problems. After a brief description of each of these technologies, we will analyze some of their most useful combinations, such as the use of FL to control GA and NN parameters; the application of GAs to evolve NNs (topologies or weights) or to tune FL controllers; and the implementation of FL controllers as NNs tuned by backpropagation-type algorithms.
Global Conditioning for Probabilistic Inference in Belief Networks
 In Proc. Tenth Conference on Uncertainty in AI
, 1994
Abstract

Cited by 46 (0 self)
In this paper we propose a new approach to probabilistic inference on belief networks, global conditioning, which is a simple generalization of Pearl's (1986b) method of loop-cutset conditioning. We show that global conditioning, as well as loop-cutset conditioning, can be thought of as a special case of the method of Lauritzen and Spiegelhalter (1988) as refined by Jensen et al. (1990a; 1990b).
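The conditioning identity behind both methods can be shown on a toy loopy network: fix each value of a cutset variable, solve the resulting singly connected network, and weight the answers by the cutset's prior. The network and every number below are mine, not the paper's.

```python
from itertools import product

# Toy network A -> B, A -> C, B -> D, C -> D: one undirected loop through A and D.
# Conditioning on A cuts the loop. All CPT entries are made up.
P_A = [0.7, 0.3]                                # P(A = a)
P_B_A = [[0.9, 0.1], [0.4, 0.6]]                # P(B = b | A = a)
P_C_A = [[0.8, 0.2], [0.5, 0.5]]                # P(C = c | A = a)
P_D1_BC = {(0, 0): 0.05, (0, 1): 0.6,
           (1, 0): 0.7, (1, 1): 0.95}           # P(D = 1 | b, c)

def p_d1_given_a(a):
    """With A fixed, the remaining network is singly connected,
    so B and C can be summed out directly."""
    return sum(P_B_A[a][b] * P_C_A[a][c] * P_D1_BC[(b, c)]
               for b, c in product((0, 1), repeat=2))

# Conditioning: P(D=1) = sum_a P(A=a) * P(D=1 | A=a).
p_d1 = sum(P_A[a] * p_d1_given_a(a) for a in (0, 1))
```

Loop-cutset conditioning applies this per cutset instantiation; global conditioning, as the abstract notes, generalizes the same idea.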
Causal independence for knowledge acquisition and inference
, 1993
Abstract

Cited by 43 (4 self)
I introduce a temporal belief-network representation of causal independence that a knowledge engineer can use to elicit probabilistic models. Like the current, atemporal belief-network representation of causal independence, the new representation makes knowledge acquisition tractable. Unlike the atemporal representation, however, the temporal representation can simplify inference, and does not require the use of unobservable variables. The representation is less general than is the atemporal representation, but appears to be useful for many practical applications.
Dynamic construction of belief networks
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 1990
Abstract

Cited by 42 (1 self)
We describe a method for incrementally constructing belief networks, which are directed acyclic graph representations for probability distributions. We have developed a network-construction language (FRAIW), which is similar to a forward-chaining language using data dependencies but has additional features for specifying distributions. A particularly important feature of this language is that it allows the user to conveniently specify conditional probability matrices using stereotyped models of intercausal interaction. Using FRAIW, one can define parameterized classes of probabilistic models. These parameterized models make it possible to apply probabilistic reasoning to problems for which it is impractical to have a single large, static model.