Results 1–10 of 27
Learning Bayesian belief networks: An approach based on the MDL principle
 Computational Intelligence
, 1994
Abstract

Cited by 188 (8 self)
A new approach for learning Bayesian belief networks from raw data is presented. The approach is based on Rissanen's Minimal Description Length (MDL) principle, which is particularly well suited for this task. Our approach does not require any prior assumptions about the distribution being learned. In particular, our method can learn unrestricted multiply-connected belief networks. Furthermore, unlike other approaches, our method allows us to trade off accuracy and complexity in the learned model. This is important since if the learned model is very complex (highly connected) it can be conceptually and computationally intractable. In such a case it would be preferable to use a simpler model even if it is less accurate. The MDL principle offers a reasoned method for making this trade-off. We also show that our method generalizes previous approaches based on Kullback cross-entropy. Experiments have been conducted to demonstrate the feasibility of the approach. Keywords: Knowledge Acquisition; Bayes Nets; Uncertainty Reasoning.
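The accuracy/complexity trade-off the abstract describes can be made concrete with a small scoring function. The sketch below is a generic two-part MDL score for a candidate network structure (empirical data log-likelihood plus a (log N)/2 penalty per free parameter); it is an illustrative assumption in the spirit of the abstract, not the authors' exact formulation, and the `data`/`structure`/`arities` encoding is invented for the example.

```python
# Hypothetical two-part MDL score for a candidate Bayesian-network structure.
# Lower score = better balance of fit and complexity. Illustrative only.
import math
from collections import Counter

def mdl_score(data, structure, arities):
    """data: list of dicts var -> value; structure: var -> tuple of parent vars;
    arities: var -> number of discrete states."""
    n = len(data)
    total = 0.0
    for var, parents in structure.items():
        # Data-encoding term: N * H(var | parents), from empirical counts.
        joint = Counter((tuple(row[p] for p in parents), row[var]) for row in data)
        marg = Counter(tuple(row[p] for p in parents) for row in data)
        ll = sum(c * math.log2(c / marg[pa]) for (pa, _), c in joint.items())
        total += -ll
        # Model-encoding term: (log2 N)/2 bits per free parameter of the CPT.
        n_params = (arities[var] - 1) * math.prod(arities[p] for p in parents)
        total += 0.5 * math.log2(n) * n_params
    return total
```

On perfectly correlated data, the structure with the X → Y edge attains a strictly lower (better) score than the empty structure, despite paying a larger parameter penalty: this is the trade-off the MDL principle arbitrates.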
The Computational Complexity of Abduction
, 1991
Abstract

Cited by 108 (3 self)
The problem of abduction can be characterized as finding the best explanation of a set of data. In this paper we focus on one type of abduction in which the best explanation is the most plausible combination of hypotheses that explains all the data. We then present several computational complexity results demonstrating that this type of abduction is intractable (NP-hard) in general. In particular, choosing between incompatible hypotheses, reasoning about cancellation effects among hypotheses, and satisfying the maximum plausibility requirement are major factors leading to intractability. We also identify a tractable, but restricted, class of abduction problems. Thanks to B. Chandrasekaran, Ashok Goel, Jack Smith, and Jon Sticklen for their comments on the numerous versions of this paper. The referees have also made a substantial contribution. Any remaining errors are our responsibility, of course. This research has been supported in part by the National Library of Medicine, grant LM...
Clustering Intrusion Detection Alarms to Support Root Cause Analysis
 ACM Transactions on Information and System Security
, 2003
Abstract

Cited by 58 (0 self)
It is a well-known problem that intrusion detection systems overload their human operators by triggering thousands of alarms per day. This paper presents a new approach for handling intrusion detection alarms more efficiently. Central to this approach is the notion that each alarm occurs for a reason, which is referred to as the alarm's root cause. This paper observes that a few dozen rather persistent root causes generally account for over 90% of the alarms that an intrusion detection system triggers. Therefore, we argue that alarms should be handled by identifying and removing the most predominant and persistent root causes. To make this paradigm practicable, we propose a novel alarm-clustering method that supports the human analyst in identifying root causes. We present experiments with real-world intrusion detection alarms to show how alarm clustering helped us identify root causes. Moreover, we show that the alarm load decreases quite substantially if the identified root causes are eliminated so that they can no longer trigger alarms in the future.
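The clustering idea can be illustrated with a toy grouping function. In the sketch below, the alarm fields (`type`, `src_ip`) and the generalization rule (collapsing source addresses to /24 subnets) are invented stand-ins for the paper's attribute hierarchies; the point is only that large, persistent clusters flag candidate root causes for the analyst.

```python
# Toy attribute-oriented alarm grouping, loosely inspired by the root-cause
# idea in the abstract. Field names and the /24 generalization are
# illustrative assumptions, not the paper's actual method.
from collections import Counter

def generalize(alarm):
    """Map an alarm to a coarser signature: alarm type plus source /24 subnet."""
    subnet = '.'.join(alarm['src_ip'].split('.')[:3]) + '.0/24'
    return (alarm['type'], subnet)

def dominant_clusters(alarms, top=3):
    """Return the `top` largest alarm clusters; big persistent clusters are
    candidate root causes worth inspecting and eliminating."""
    return Counter(generalize(a) for a in alarms).most_common(top)
```

Fifty port-scan alarms from one misconfigured subnet collapse into a single cluster, which mirrors the paper's observation that a handful of root causes dominate the alarm load.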
Variational Probabilistic Inference and the QMR-DT Network
 JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
, 1999
Abstract

Cited by 57 (3 self)
We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search. We describe a variational approach to the problem of diagnostic inference in the "Quick Medical Reference" (QMR) network. The QMR network is a large-scale probabilistic graphical model built on statistical and expert knowledge. Exact probabilistic inference is infeasible in this model for all but a small set of cases. We evaluate our variational inference algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method.
An Optimal Approximation Algorithm For Bayesian Inference
 Artificial Intelligence
, 1997
Abstract

Cited by 48 (2 self)
Approximating the inference probability Pr[X = x | E = e] in any sense, even for a single evidence node E, is NP-hard. This result holds for belief networks that are allowed to contain extreme conditional probabilities, that is, conditional probabilities arbitrarily close to 0. Nevertheless, all previous approximation algorithms have failed to approximate efficiently many inferences, even for belief networks without extreme conditional probabilities. We prove that we can approximate efficiently probabilistic inference in belief networks without extreme conditional probabilities. We construct a randomized approximation algorithm, the bounded-variance algorithm, that is a variant of the known likelihood-weighting algorithm. The bounded-variance algorithm is the first algorithm with provably fast inference approximation on all belief networks without extreme conditional probabilities. From the bounded-variance algorithm, we construct a deterministic approximation algorithm u...
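For context, plain likelihood weighting (the baseline that the bounded-variance algorithm modifies) can be sketched on a two-node chain A → B: sample non-evidence nodes from their conditionals and weight each sample by the likelihood of the observed evidence. The tiny network and its parameters below are illustrative assumptions; the bounded-variance refinement itself is not reproduced.

```python
# Sketch of plain likelihood weighting on the chain A -> B, where B is
# observed. Illustrative network; not the paper's bounded-variance variant.
import random

def likelihood_weighting(p_a, p_b_given_a, b_obs, n_samples=100_000, seed=0):
    """Estimate Pr(A=1 | B=b_obs). p_a: Pr(A=1);
    p_b_given_a: dict a -> Pr(B=1 | A=a)."""
    rng = random.Random(seed)
    weights = {0: 0.0, 1: 0.0}
    for _ in range(n_samples):
        a = 1 if rng.random() < p_a else 0       # sample the non-evidence node A
        p_b1 = p_b_given_a[a]
        w = p_b1 if b_obs == 1 else 1.0 - p_b1   # weight by evidence likelihood
        weights[a] += w
    return weights[1] / (weights[0] + weights[1])
```

With Pr(A=1) = 0.5 and Pr(B=1 | A) = 0.8 vs. 0.1, the exact posterior Pr(A=1 | B=1) is 0.4/0.45 ≈ 0.889, and the estimate converges to it; the paper's point is that the variance of such estimators blows up exactly when conditional probabilities approach 0.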
A theory of diagnosis for incomplete causal models
 In Proc. 11th IJCAI
, 1989
Abstract

Cited by 46 (4 self)
One of the problems of the recent approaches to problem solving based on deep knowledge is the lack of a formal treatment of incomplete knowledge. However, dealing with incomplete models is fundamental to many real-world domains. In this paper we propose a formal theory of causal diagnostic reasoning, dealing with different forms of incompleteness both in the general causal knowledge (missing or abstracted knowledge) and in the data describing a specific case under examination. Different forms of non-monotonic reasoning (hypothetical and circumscriptive reasoning) are used in order to draw and confirm conclusions from incomplete knowledge. Multiple fault solutions are treated in a natural way and parsimony criteria are used to rank alternative solutions.
A Survey of Algorithms for Real-Time Bayesian Network Inference
 In the joint AAAI-02/KDD-02/UAI-02 workshop on Real-Time Decision Support and Diagnosis Systems
, 2002
Abstract

Cited by 32 (2 self)
As Bayesian networks are applied to more complex and realistic real-world applications, the development of more efficient inference algorithms working under real-time constraints is becoming more and more important. This paper presents a survey of various exact and approximate Bayesian network inference algorithms. In particular, previous research on real-time inference is reviewed. It provides a framework for understanding these algorithms and the relationships between them. Some important issues in real-time Bayesian network inference are also discussed.
Partial abductive inference in Bayesian belief networks using a genetic algorithm
 Pattern Recognit. Lett
, 1999
Abstract

Cited by 24 (2 self)
Abductive inference in Bayesian belief networks (BBNs) is intended as the process of generating the most probable configurations given observed evidence. When we are interested only in a subset of the network's variables, this problem is called partial abductive inference. Both problems are NP-hard and so exact computation is not always possible. In this paper, a genetic algorithm is used to perform partial abductive inference in BBNs. The main contribution is the introduction of new genetic operators designed specifically for this problem. By using these genetic operators, we try to take advantage of the calculations previously carried out, when a new individual is evaluated. The algorithm is tested using a widely used Bayesian network and a randomly generated one and then compared with a previous genetic algorithm based on classical genetic operators. From the experimental results, we conclude that the new genetic operators preserve the accuracy of the previous algorithm, and also reduce the number of operations performed during the evaluation of individuals. The performance of the genetic algorithm is, thus, improved. Index Terms—Abductive inference, Bayesian belief networks, evolutionary computation, genetic operators, most probable explanation, probabilistic reasoning.
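A classical-operator genetic search of the kind the paper improves upon can be sketched in a few lines: binary configurations as individuals, truncation selection, one-point crossover, bit-flip mutation. The fitness function and operators below are the standard textbook versions, not the specialized operators introduced in the paper, and the setup is an invented illustration.

```python
# Toy genetic search for a high-fitness binary configuration, using classical
# one-point crossover and bit-flip mutation (the baseline the paper refines).
import random

def genetic_search(fitness, n_vars, pop_size=30, generations=60,
                   p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_vars)             # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

In the abductive-inference setting, `fitness` would be the (unnormalized) posterior probability of the configuration given the evidence; evaluating it is the expensive step that the paper's specialized operators are designed to amortize.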
Variational probabilistic inference and the QMR-DT database
 Journal of Artificial Intelligence Research
, 1999
Abstract

Cited by 16 (3 self)
We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search. We describe a variational approach to the problem of diagnostic inference in the "Quick Medical Reference" (QMR) database. The QMR database is a large-scale probabilistic graphical model built on statistical and expert knowledge. Exact probabilistic inference is infeasible in this model for all but a small set of cases. We evaluate our variational inference algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method. 1 Introduction Probabilistic models have become increasingly prevalent in AI in recent years. Beyond the significant representational advantages of probability theory, inclu...
A Recurrence Local Computation Approach Towards Ordering Composite Beliefs in Bayesian Belief Networks
, 1993
Abstract

Cited by 9 (5 self)
Finding the l Most Probable Explanations (MPE) of a given evidence, S_e, in a Bayesian belief network can be formulated as identifying and ordering a set of composite hypotheses, H_i's, of which the posterior probabilities are the l largest; i.e., Pr(H_1 | S_e) ≥ … ≥ Pr(H_l | S_e). When an order includes all the composite hypotheses in the network in order to find all the probable explanations, it becomes a total order and the derivation of such an order has an exponential complexity. The focus of this paper is on the derivation of a partial order, with length l, for finding the l most probable composite hypotheses, where l typically is much smaller than the total number of composite hypotheses in a network. Previously, only the partial order of length two (i.e., l = 2) in a singly connected Bayesian network could be efficiently derived without further restriction on network topologies and without an increase in spatial complexity. This paper discusses an efficient algorithm for the deri...
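For very small networks, the ordering problem can be solved by brute force: enumerate every composite hypothesis and sort by its (unnormalized) posterior score. The sketch below does exactly that over a factorized score; the two-variable example in the test is an invented illustration, and the enumeration is exponential in the number of variables, which is precisely the cost the paper's partial-order derivation avoids.

```python
# Brute-force counterpart of the top-l MPE problem: enumerate all composite
# hypotheses of a tiny factorized model and keep the l most probable, ordered.
# Feasible only for small networks; illustrative, not the paper's algorithm.
from itertools import product

def top_l_explanations(factors, arities, l):
    """factors: list of functions mapping a full configuration tuple to a
    probability factor; arities: number of states per variable. Returns the
    l configurations with the largest product of factors, best first."""
    scored = []
    for config in product(*(range(k) for k in arities)):
        p = 1.0
        for f in factors:
            p *= f(config)
        scored.append((config, p))
    scored.sort(key=lambda cp: cp[1], reverse=True)
    return scored[:l]
```

Even this toy version makes the abstract's point visible: the full ranking touches every one of the exponentially many configurations, whereas a partial order of length l only needs the top of that ranking.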