Results 1–10 of 13
Model Checking vs. Theorem Proving: A Manifesto
, 1991
Abstract

Cited by 117 (5 self)
We argue that rather than representing an agent's knowledge as a collection of formulas, and then doing theorem proving to see if a given formula follows from an agent's knowledge base, it may be more useful to represent this knowledge by a semantic model, and then do model checking to see if the given formula is true in that model. We discuss how to construct a model that represents an agent's knowledge in a number of different contexts, and then consider how to approach the model-checking problem.
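The contrast the abstract draws can be illustrated with a minimal sketch (not the paper's construction; all propositions and names here are hypothetical): represent the agent's knowledge as the set of worlds it considers possible, and "model check" a formula by evaluating it in every such world, rather than deriving it from axioms.

```python
def knows(worlds, prop):
    """The agent knows `prop` iff it holds in every world it considers possible."""
    return all(prop in w for w in worlds)

# Hypothetical scenario: the agent has observed that the light is on,
# but is unsure whether the door is open.
worlds = [
    {"light_on", "door_open"},
    {"light_on"},
]

print(knows(worlds, "light_on"))   # True: holds in every possible world
print(knows(worlds, "door_open"))  # False: fails in the second world
```

Checking truth in a concrete model like this is a simple enumeration, which is the tractability argument the abstract makes against general theorem proving.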
CAUSAL NETWORKS: SEMANTICS AND EXPRESSIVENESS
, 1990
Abstract

Cited by 96 (8 self)
Dependency knowledge of the form "x is independent of y once z is known" invariably obeys the four graphoid axioms; examples include probabilistic and database dependencies. Often, such knowledge can be represented efficiently with graphical structures such as undirected graphs and directed acyclic graphs (DAGs). In this paper we show that the graphical criterion called d-separation is a sound rule for reading independencies from any DAG based on a causal input list drawn from a graphoid. The rule may be extended to cover DAGs that represent functional dependencies as well as conditional dependencies.
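The d-separation criterion the abstract refers to can be decided by the classic reduction to an undirected reachability test: restrict the DAG to the ancestors of X ∪ Y ∪ Z, moralize (marry co-parents and drop arrow heads), delete Z, and check whether X and Y are disconnected. A self-contained sketch:

```python
from itertools import combinations

def ancestors(dag, nodes):
    """All nodes with a directed path into `nodes`, plus `nodes` themselves."""
    parents = {v: set() for v in dag}
    for u in dag:
        for v in dag[u]:
            parents[v].add(u)
    result, stack = set(nodes), list(nodes)
    while stack:
        for p in parents[stack.pop()]:
            if p not in result:
                result.add(p)
                stack.append(p)
    return result

def d_separated(dag, xs, ys, zs):
    """Test X _||_ Y | Z in a DAG given as {node: set of children}."""
    keep = ancestors(dag, set(xs) | set(ys) | set(zs))
    adj = {v: set() for v in keep}
    parents = {v: set() for v in keep}
    for u in keep:
        for v in dag[u]:
            if v in keep:
                adj[u].add(v)
                adj[v].add(u)
                parents[v].add(u)
    for v in keep:
        for a, b in combinations(parents[v], 2):  # marry co-parents
            adj[a].add(b)
            adj[b].add(a)
    # delete Z and test undirected reachability from X to Y
    seen = set(xs)
    stack = [x for x in xs if x not in set(zs)]
    while stack:
        u = stack.pop()
        if u in set(ys):
            return False
        for v in adj[u] - set(zs):
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return True

chain = {"A": {"B"}, "B": {"C"}, "C": set()}
print(d_separated(chain, {"A"}, {"C"}, {"B"}))     # True: B blocks the chain
print(d_separated(chain, {"A"}, {"C"}, set()))     # False

collider = {"A": {"B"}, "C": {"B"}, "B": set()}
print(d_separated(collider, {"A"}, {"C"}, set()))  # True
print(d_separated(collider, {"A"}, {"C"}, {"B"}))  # False: conditioning on B opens the path
```

The collider case shows why d-separation is directional: conditioning on a common effect induces dependence between its causes.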
Decision Theory in Expert Systems and Artificial Intelligence
 International Journal of Approximate Reasoning
, 1988
Abstract

Cited by 91 (18 self)
Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision-theoretic framework. Recent analyses of the restrictions of several traditional AI reasoning techniques, coupled with the development of more tractable and expressive decision-theoretic representation and inference strategies, have stimulated renewed interest in decision theory and decision analysis. We describe early experience with simple probabilistic schemes for automated reasoning, review the dominant expert-system paradigm, and survey some recent research at the crossroads of AI and decision science. In particular, we present the belief network and influence diagram representations. Finally, we discuss issues that have not been studied in detail within the expert-systems sett...
Improved learning of Bayesian networks
 Proc. of the Conf. on Uncertainty in Artificial Intelligence
, 2001
Abstract

Cited by 37 (6 self)
Two or more Bayesian network structures are Markov equivalent when the corresponding acyclic digraphs encode the same set of conditional independencies. Therefore, the search space of Bayesian network structures may be organized in equivalence classes, each of which represents a different set of conditional independencies. The collection of sets of conditional independencies obeys a partial order, the so-called “inclusion order.” This paper discusses in depth the role that the inclusion order plays in learning the structure of Bayesian networks. In particular, this role involves the way a learning algorithm traverses the search space. We introduce a condition for traversal operators, the inclusion boundary condition, which, when it is satisfied, guarantees that the search strategy can avoid local maxima. This is proved under the assumptions that the data is sampled from a probability distribution which is faithful to an acyclic digraph, and the length of the sample is unbounded. The previous discussion leads to the design of a new traversal operator and two new learning algorithms in the context of heuristic search and the Markov Chain Monte Carlo method. We carry out a set of experiments with synthetic and real-world data that show empirically the benefit of striving for the inclusion order when learning Bayesian networks from data.
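Markov equivalence, the relation that induces the equivalence classes discussed above, has a well-known graphical characterization (due to Verma and Pearl): two DAGs are Markov equivalent iff they have the same skeleton and the same v-structures. A small sketch of that test, with hypothetical three-node examples:

```python
def skeleton(dag):
    """Undirected edge set of a DAG given as {node: set of children}."""
    return {frozenset((u, v)) for u in dag for v in dag[u]}

def v_structures(dag):
    """Unshielded colliders a -> c <- b with a and b non-adjacent."""
    skel = skeleton(dag)
    parents = {v: {u for u in dag if v in dag[u]} for v in dag}
    return {(a, c, b)
            for c, ps in parents.items()
            for a in ps for b in ps
            if a < b and frozenset((a, b)) not in skel}

def markov_equivalent(g1, g2):
    """Verma-Pearl criterion: same skeleton and same v-structures."""
    return skeleton(g1) == skeleton(g2) and v_structures(g1) == v_structures(g2)

# X -> Y -> Z and X <- Y <- Z both encode exactly {X _||_ Z | Y} ...
chain  = {"X": {"Y"}, "Y": {"Z"}, "Z": set()}
rchain = {"X": set(), "Y": {"X"}, "Z": {"Y"}}
# ... while the collider X -> Y <- Z encodes {X _||_ Z} instead.
collider = {"X": {"Y"}, "Z": {"Y"}, "Y": set()}

print(markov_equivalent(chain, rchain))    # True
print(markov_equivalent(chain, collider))  # False
```

Because score-based learners can only distinguish structures up to this relation, searching over equivalence classes rather than individual DAGs is the natural move the abstract builds on.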
Lp, A Logic for Representing and Reasoning with Statistical Knowledge
, 1990
Abstract

Cited by 11 (0 self)
This paper presents a logical formalism for representing and reasoning with statistical knowledge. One of the key features of the formalism is its ability to deal with qualitative statistical information. It is argued that statistical knowledge, especially that of a qualitative nature, is an important component of our world knowledge and that such knowledge is used in many different reasoning tasks. The work is further motivated by the observation that previous formalisms for representing probabilistic information are inadequate for representing statistical knowledge. The representation mechanism takes the form of a logic that is capable of representing a wide variety of statistical knowledge, and that possesses an intuitive formal semantics based on the simple notions of sets of objects and probabilities defined over those sets. Furthermore, a proof theory is developed and is shown to be sound and complete. The formalism offers a perspicuous and powerful representational tool for stat...
Distributed inference in Bayesian networks
 CYBERNETICS AND SYSTEMS
, 1994
Abstract

Cited by 7 (3 self)
Bayesian networks originated as a framework for distributed reasoning. In singly-connected networks, there exists an elegant inference algorithm that can be implemented in parallel, with a processor for every node. It can be extended to take advantage of the OR-gate, a model of interaction among causes which simplifies knowledge acquisition and evidence propagation. We also discuss two exact methods and one approximate method for dealing with general networks. All these algorithms admit distributed implementations.
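The OR-gate interaction model mentioned above (commonly known as the noisy OR) can be sketched in a few lines: each present cause independently fails to produce the effect with probability 1 − p_i, so P(effect | causes) = 1 − ∏(1 − p_i) over the present causes. The parameters below are hypothetical, purely for illustration:

```python
def noisy_or(link_probs, present):
    """Noisy OR-gate: link_probs maps each cause to P(effect | only that
    cause present); `present` is the set of active causes."""
    fail = 1.0
    for cause in present:
        fail *= 1.0 - link_probs[cause]  # each cause independently fails
    return 1.0 - fail

probs = {"flu": 0.8, "cold": 0.4}        # hypothetical link probabilities
print(noisy_or(probs, {"flu"}))          # 0.8
print(noisy_or(probs, {"flu", "cold"}))  # 1 - 0.2 * 0.6 = 0.88
```

The knowledge-acquisition benefit is that a node with n parents needs only n parameters instead of a full 2^n-entry conditional probability table.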
Automated Database Schema Design Using Mined Data Dependencies
 J. Amer. Soc. Inform. Sci
, 1998
Abstract

Cited by 6 (0 self)
Data dependencies are used in database schema design to enforce the correctness of a database as well as to reduce redundant data. These dependencies are usually determined from the semantics of the attributes and are then enforced upon the relations. This paper describes a bottom-up procedure for discovering multivalued dependencies (MVDs) in observed data without knowing a priori the relationships amongst the attributes. The proposed algorithm is an application of the technique we designed for learning conditional independencies in probabilistic reasoning. A prototype system for automated database schema design has been implemented. Experiments were carried out to demonstrate both the effectiveness and efficiency of our method.
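A multivalued dependency X →→ Y holds exactly when, within each X-value, the Y-part and the Z-part (Z being the remaining attributes) vary independently, i.e. the relation equals the join of its projections on XY and XZ. A minimal sketch of that check, on a hypothetical course/teacher/book relation (this is the definition being tested, not the paper's mining algorithm):

```python
def mvd_holds(rows, x, y):
    """Check the MVD X ->-> Y on `rows` (a relation as a list of dicts):
    for each X-value, every observed Y-part must co-occur with every
    observed Z-part, where Z is the set of remaining attributes."""
    if not rows:
        return True
    attrs = set(rows[0])
    z = attrs - set(x) - set(y)
    proj = lambda r, cols: tuple(sorted((c, r[c]) for c in cols))
    seen = {proj(r, attrs) for r in rows}
    ys, zs = {}, {}
    for r in rows:
        key = proj(r, x)
        ys.setdefault(key, set()).add(proj(r, y))
        zs.setdefault(key, set()).add(proj(r, z))
    return all(tuple(sorted(key + yv + zv)) in seen
               for key in ys
               for yv in ys[key]
               for zv in zs[key])

rows = [  # hypothetical relation: teachers and books per course
    {"course": "db", "teacher": "ann", "book": "b1"},
    {"course": "db", "teacher": "ann", "book": "b2"},
    {"course": "db", "teacher": "bob", "book": "b1"},
    {"course": "db", "teacher": "bob", "book": "b2"},
]
print(mvd_holds(rows, ["course"], ["teacher"]))      # True: full cross product
print(mvd_holds(rows[:3], ["course"], ["teacher"]))  # False: (bob, b2) missing
```

This is the same independence pattern ("Y and Z are independent given X") that the abstract's probabilistic-reasoning technique exploits.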
Entropy and MDL Discretization of Continuous Variables for Bayesian Belief Networks
 Int’l J. Intelligent Systems
, 2000
Abstract

Cited by 4 (0 self)
An efficient algorithm for partitioning the range of a continuous variable into a discrete number of intervals, for use in the construction of Bayesian belief networks (BBNs), is presented here. The partitioning minimizes the information loss, relative to the number of intervals used to represent the variable. Partitioning can be done prior to BBN construction or extended for repartitioning during construction. Prior partitioning allows either Bayesian or minimum descriptive length (MDL) metrics to be used to guide BBN construction. Dynamic repartitioning, during BBN construction, is done with an MDL metric to guide construction. The methods are demonstrated with data from two epidemiological studies, and the results are compared for all of the methods. The use of the partitioning algorithm resulted in more sparsely connected BBNs than with binary partitioning, with little information loss from mapping continuous variables into discrete ones.
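One step of entropy-based discretization in this spirit can be sketched as follows (an illustrative single binary split with hypothetical data, not the paper's full algorithm): choose the cut point that minimizes the weighted class entropy of the two induced intervals, then recurse or stop according to an MDL criterion.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a class-label sequence, in bits."""
    n = len(labels)
    return sum(-c / n * log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Return (threshold, cost): the cut on `values` that minimizes the
    weighted entropy of the class labels in the two induced intervals."""
    pairs = sorted(zip(values, labels))
    xs = [v for v, _ in pairs]
    ys = [l for _, l in pairs]
    n = len(pairs)
    best = None
    for i in range(1, n):
        if xs[i] == xs[i - 1]:
            continue  # only cut between distinct values
        cut = (xs[i] + xs[i - 1]) / 2
        cost = i / n * entropy(ys[:i]) + (n - i) / n * entropy(ys[i:])
        if best is None or cost < best[1]:
            best = (cut, cost)
    return best

values = [1, 2, 3, 10, 11, 12]               # hypothetical continuous variable
labels = ["low", "low", "low", "high", "high", "high"]
print(best_cut(values, labels))              # (6.5, 0.0) -- a perfect split
```

Repeating this split within each interval, with an MDL stopping rule, yields the variable number of intervals the abstract contrasts with fixed binary partitioning.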
Towards an inclusion driven learning of Bayesian Networks
 Institute for Computing and Information Sciences, University of Utrecht, The Netherlands
, 2002
Abstract

Cited by 3 (1 self)
Two or more Bayesian Networks are Markov equivalent when their corresponding acyclic digraphs encode the same set of conditional independence (CI) restrictions. Therefore, the search space of Bayesian Networks may be organized in classes of equivalence, where each of them consists of a particular set of CI restrictions. The collection of sets of CI restrictions obeys a partial order, the graphical Markov model inclusion partial order, or inclusion order for short.
A Characterization of Moral Transitive Directed Acyclic Graph Markov models as trees and its properties
, 2000
Abstract

Cited by 3 (1 self)
It follows from the known relationships among the different classes of graphical Markov models for conditional independence that the intersection of the classes of moral directed acyclic graph models (or decomposable (DEC) models) and transitive directed acyclic graph (TDAG) models (or lattice conditional independence (LCI) models) is nonempty. This paper shows that the conditional independence models in the intersection can be characterized as labeled trees, where every vertex on the tree corresponds to a single random variable. This fact leads to the definition of a specific Markov property for trees and therefore to the introduction of trees as part of the family of graphical Markov models.