Results 1–8 of 8
P-CLASSIC: A tractable probabilistic description logic
In Proceedings of AAAI-97, 1997
Cited by 106 (4 self)
Abstract:
Knowledge representation languages invariably reflect a tradeoff between expressivity and tractability. Evidence suggests that the compromise chosen by description logics is a particularly successful one. However, description logic (as for all variants of first-order logic) is severely limited in its ability to express uncertainty. In this paper, we present P-CLASSIC, a probabilistic version of the description logic CLASSIC. In addition to terminological knowledge, the language utilizes Bayesian networks to express uncertainty about the basic properties of an individual, the number of fillers for its roles, and the properties of these fillers. We provide a semantics for P-CLASSIC and an effective inference procedure for probabilistic subsumption: computing the probability that a random individual in class C is also in class D. The effectiveness of the algorithm relies on independence assumptions and on our ability to execute lifted inference: reasoning about similar individuals as a gr...
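The notion of probabilistic subsumption described above, P(individual is in D | individual is in C), can be illustrated with a toy two-variable network (the variables and numbers here are purely illustrative, not the paper's lifted algorithm):

```python
from itertools import product

# Toy Bayesian network over two binary properties of a random individual:
# P(bird) and P(flies | bird). Parameters are made up for illustration.
p_bird = 0.3
p_flies_given = {True: 0.9, False: 0.05}

def joint(bird, flies):
    # Joint probability of one full property assignment.
    pb = p_bird if bird else 1 - p_bird
    pf = p_flies_given[bird] if flies else 1 - p_flies_given[bird]
    return pb * pf

def prob(pred):
    # Probability of the class of individuals satisfying pred.
    return sum(joint(b, f)
               for b, f in product([True, False], repeat=2)
               if pred(b, f))

# Probabilistic subsumption: P(D | C) = P(C and D) / P(C),
# here with C = "flies" and D = "bird".
p_c = prob(lambda b, f: f)
p_cd = prob(lambda b, f: f and b)
print(round(p_cd / p_c, 3))
```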
Generating New Beliefs From Old
1994
Cited by 13 (1 self)
Abstract:
In previous work [BGHK92, BGHK93], we have studied the random-worlds approach, a particular (and quite powerful) method for generating degrees of belief (i.e., subjective probabilities) from a knowledge base consisting of objective (first-order, statistical, and default) information. But allowing a knowledge base to contain only objective information is sometimes limiting. We occasionally wish to include information about degrees of belief in the knowledge base as well, because there are contexts in which old beliefs represent important information that should influence new beliefs. In this paper, we describe three quite general techniques for extending a method that generates degrees of belief from objective information to one that can make use of degrees of belief as well. All of our techniques are based on well-known approaches, such as cross-entropy. We discuss general connections between the techniques and in particular show that, although conceptually and techn...
Representation Independence of Nonmonotonic Inference Relations
1996
Cited by 9 (1 self)
Abstract:
A logical concept of representation independence is developed for nonmonotonic logics, including probabilistic inference systems. The general framework is then applied to several nonmonotonic logics, particularly propositional probabilistic logics. For these logics our investigation leads us to modified inference rules with greater representation independence.

1 INTRODUCTION

Entropy maximization is a rule for probabilistic inference for whose application to problems in artificial intelligence there exist several independent and very strong arguments (Grove, Halpern & Koller 1992), (Paris & Vencovská 1990). Unfortunately, though, there is a major drawback for which the maximum entropy inference rule has often been criticized: the result of the inference depends on how the given information is represented. Probably the best-known example used to illustrate this point is the "Life on Mars" example, a rendition of which may be given as follows: the belief that the probability for the existen...
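The representation-dependence criticism can be made concrete with a minimal version of the "Life on Mars" example (the encoding below is one illustrative rendition, not the paper's formal treatment):

```python
def max_entropy(n):
    # With no constraints at all, the maximum-entropy distribution over
    # n mutually exclusive and exhaustive alternatives is uniform.
    return [1.0 / n] * n

# Representation 1: {life on Mars, no life on Mars}
p1 = max_entropy(2)
p_life_coarse = p1[0]             # P(life) = 1/2

# Representation 2: the same proposition, but "life" is split into
# {animal life, plant life only}, giving three alternatives.
p2 = max_entropy(3)
p_life_fine = p2[0] + p2[1]       # P(life) = 2/3

# Same knowledge (none), different encodings, different conclusions:
print(round(p_life_coarse, 3), round(p_life_fine, 3))
```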
Market Analysis Using a Combination of Bayesian Networks and Description Logics
1999
Cited by 5 (0 self)
Abstract:
The work described in this paper was inspired by a problem increasingly vexatious to many businesses confronting the ever-diminishing life cycles of modern products, viz., that of predicting characteristics (such as overall demand, segmentation, etc.) of markets facing new product introductions. A framework is proposed that allows the market parameters of new products to be derived by analogy with those of old ones. To do so, the framework combines the capabilities of Bayesian networks [14] and description logics [16]. The paper commences with an exposition of the problem that motivates the work. There follow brief descriptions of Bayesian networks and description logics in their unalloyed state, and a discussion of issues surrounding their combination. The combined system is presented, along with an account of its formal details and an inference procedure for entailment. A sample application of the framework is given. The paper concludes by comparing the proposed framework with related existing systems and by suggesting possible courses for future development.
Minimum Cross-Entropy Reasoning: A Statistical Justification
In Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence (IJCAI-95), 1995
Cited by 4 (4 self)
Abstract:
Degrees of belief are formed using observed evidence and statistical background information. In this paper we examine the process by which prior degrees of belief derived from the evidence are combined with statistical data to form more specific degrees of belief. A statistical model for this process is then shown to vindicate the cross-entropy minimization principle as a rule for probabilistic default inference.

1 Introduction

A knowledge-based system incorporating reasoning with uncertain information gives rise to quantitative statements of two different kinds: statements expressing statistical information and statements of degrees of belief. "10% of applicants seeking employment at company X who are invited to an interview will get a job there" is a statistical statement. "The likelihood that I will be invited for an interview if I apply for a job at company X is about 0.6" expresses a degree of belief. In this paper, both of these kinds of statements are regarded as probabilistic, i...
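A generic sketch of cross-entropy minimization for a single expectation constraint may help fix ideas (the numbers are made up; the exponential form of the minimizer is standard, and bisection on the multiplier is just one way to find it):

```python
import math

def mce_update(p, f, target):
    """Minimize KL(q || p) subject to E_q[f] = target, over a finite
    outcome space. The minimizer has the exponential-family form
    q(w) proportional to p(w) * exp(lam * f(w)); since E_q[f] is
    monotone in lam, we find lam by bisection. Assumes target lies
    strictly between min(f) and max(f)."""
    def expectation(lam):
        w = [pi * math.exp(lam * fi) for pi, fi in zip(p, f)]
        z = sum(w)
        return sum(wi * fi for wi, fi in zip(w, f)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if expectation(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [pi * math.exp(lam * fi) for pi, fi in zip(p, f)]
    z = sum(w)
    return [wi / z for wi in w]

# Illustrative use: a prior degree of belief of 0.6 in getting an
# interview is revised after statistical data pins the value at 0.5.
prior = [0.6, 0.4]            # (interview, no interview)
indicator = [1.0, 0.0]        # f = indicator of "interview"
posterior = mce_update(prior, indicator, 0.5)
print([round(x, 3) for x in posterior])   # → [0.5, 0.5]
```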
Measure selection: Notions of rationality and representation independence
In Proceedings of the 14th Conference on Uncertainty in Artificial Intelligence, 1998
Cited by 3 (1 self)
Abstract:
We take another look at the general problem of selecting a preferred probability measure among those that comply with some given constraints. The dominant role that entropy maximization has obtained in this context is questioned by arguing that the minimum information principle on which it is based could be supplanted by an at least as plausible "likelihood of evidence" principle. We then review a method for turning given selection functions into representation-independent variants, and discuss the tradeoffs involved in this transformation.
A logic for inductive probabilistic reasoning
In Synthese, 2005
Cited by 1 (0 self)
Abstract:
Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e. by inductive probabilistic reasoning. In this paper a formal framework for inductive probabilistic reasoning is developed: syntactically it consists of an extension of the language of first-order predicate logic that allows one to express statements about both statistical and subjective probabilities. Semantics for this representation language are developed that give rise to two distinct entailment relations: a relation |= that models strict, probabilistically valid inferences, and a relation |≈ that models inductive probabilistic inferences. The inductive entailment relation is obtained by implementing cross-entropy minimization in a preferred model semantics. A main objective of our approach is to ensure that complete proof systems exist for both entailment relations. This is achieved by allowing probability distributions in our semantic models that use nonstandard probability values. A number of results are presented showing that in several important aspects the resulting logic behaves just like a logic based on real-valued probabilities alone.
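The direct-inference pattern quoted in this abstract, and its generalization by Jeffrey's rule, can be sketched as follows (the worlds and numbers below are illustrative, not taken from the paper):

```python
def direct_inference(stat_prob):
    # Direct inference: from "70% of As are Bs" and "a is an A",
    # adopt stat_prob as the degree of belief that a is a B.
    return stat_prob

def jeffrey_update(prior, cell_of, new_cell_probs):
    """Jeffrey's rule, which generalizes direct inference: when new
    evidence fixes the probabilities of the cells of a partition,
    rescale the prior within each cell so the cells receive the
    stipulated probabilities."""
    cell_mass = {}
    for outcome, p in prior.items():
        cell = cell_of(outcome)
        cell_mass[cell] = cell_mass.get(cell, 0.0) + p
    return {o: p * new_cell_probs[cell_of(o)] / cell_mass[cell_of(o)]
            for o, p in prior.items()}

print(direct_inference(0.7))   # → 0.7

# Hypothetical prior over (A?, B?) worlds; evidence then fixes P(A) = 0.9.
prior = {("A", "B"): 0.56, ("A", "notB"): 0.24,
         ("notA", "B"): 0.10, ("notA", "notB"): 0.10}
posterior = jeffrey_update(prior, lambda w: w[0], {"A": 0.9, "notA": 0.1})
```

For the special case where the prior is concentrated on a single cell, Jeffrey's rule reduces to the direct-inference assignment above.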