Results 1–10 of 43
Common Sense and Maximum Entropy
 Synthese
, 2000
Abstract

Cited by 24 (3 self)
This paper concerns the question of how to draw inferences common-sensically from uncertain knowledge. Since the early work of Shore and Johnson [10], Paris and Vencovská [6], and Csiszár [1], it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that, within the rather narrow context in which we work, this complete and consistent mode of uncertain reasoning is actually characterised by the observance of just a single common sense principle (or slogan).
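As a minimal illustration of the inference process this abstract concerns: the maximum entropy distribution subject to a linear (expected-value) constraint has exponential-family form, and its Lagrange multiplier can be found by bisection. The function name and the toy three-outcome example below are illustrative assumptions, not taken from the paper.

```python
import math

def maxent_dist(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `values` subject to a mean
    constraint: p_i is proportional to exp(lam * v_i), with lam found by
    bisection so that the expected value equals target_mean."""
    def mean_for(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# With a symmetric constraint (mean 2 over {1, 2, 3}) MaxEnt returns the
# uniform distribution, i.e. it adds no unwarranted information.
p = maxent_dist([1, 2, 3], 2.0)
print([round(pi, 3) for pi in p])  # [0.333, 0.333, 0.333]
```

The bisection works because the constrained mean is monotone increasing in the multiplier, so a single scalar search recovers the whole distribution.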
Representation Dependence in Probabilistic Inference
 JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
, 2004
Abstract

Cited by 20 (1 self)
Nondeductive reasoning systems are often representation dependent: representing the same situation in two different ways may cause such a system to return two different answers. Some have viewed ...
Probabilistic Logic Programming under Maximum Entropy
 In Proc. ECSQARU99, LNCS 1638
, 1999
Abstract

Cited by 18 (5 self)
In this paper, we focus on the combination of probabilistic logic programming with the principle of maximum entropy. We start by defining probabilistic queries to probabilistic logic programs and their answer substitutions under maximum entropy. We then present an efficient linear programming characterization for the problem of deciding whether a probabilistic logic program is satisfiable. Finally, and as a central contribution of this paper, we introduce an efficient technique for approximative probabilistic logic programming under maximum entropy. This technique reduces the original entropy maximization task to solving a modified and relatively small optimization problem.

1 Introduction
Probabilistic propositional logics and their various dialects are thoroughly studied in the literature (see especially [19] and [5]; see also [15] and [16]). Their extensions to probabilistic first-order logics can be classified into first-order logics in which probabilities are defined over the do...
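The linear-programming satisfiability test mentioned in this abstract can be illustrated in a hand-solvable special case: when two atoms a and b have point-valued probabilities, the feasible values of P(a ∧ b) are exactly the Fréchet bounds, so the LP collapses to an interval intersection. This is a sketch of the idea under that simplifying assumption, not the paper's algorithm, and the function name is made up.

```python
def satisfiable(p_a, p_b, p_ab_low, p_ab_high):
    """Special case of the LP satisfiability test: with point probabilities
    for atoms a and b, a distribution over the four truth assignments exists
    iff the interval constraint on P(a & b) intersects the Frechet bounds
    [max(0, P(a) + P(b) - 1), min(P(a), P(b))]."""
    lo = max(0.0, p_a + p_b - 1.0)
    hi = min(p_a, p_b)
    return max(lo, p_ab_low) <= min(hi, p_ab_high)

# Feasible: with P(a)=0.7, P(b)=0.6, P(a&b) may range over [0.3, 0.6].
print(satisfiable(0.7, 0.6, 0.5, 0.6))  # True
# Infeasible: P(a&b) cannot drop below 0.3 here.
print(satisfiable(0.7, 0.6, 0.0, 0.2))  # False
```

In the general case the same feasibility question becomes a linear program over the probabilities of all possible worlds, which is what the paper's characterization addresses.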
A Logic for Default Reasoning About Probabilities
, 1998
Abstract

Cited by 12 (4 self)
A logic is defined that allows one to express information about statistical probabilities and about degrees of belief in specific propositions. By interpreting the two types of probabilities in one common probability space, the semantics given are well suited to model the influence of statistical information on the formation of subjective beliefs. Cross-entropy minimization is a key element in these semantics, the use of which is justified by showing that the resulting logic exhibits some very reasonable properties.
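A minimal sketch of the cross-entropy (KL) minimization step these semantics rely on: given a subjective prior and a statistical constraint on an expectation, the minimizer is an exponential tilt of the prior, with the multiplier again found by bisection. The names and the two-outcome example are illustrative assumptions, not the paper's construction.

```python
import math

def min_cross_entropy(prior, values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Minimize KL(p || prior) subject to E_p[v] = target_mean. The
    solution is an exponential tilt: p_i proportional to prior_i * exp(lam * v_i)."""
    def dist(lam):
        w = [q * math.exp(lam * v) for q, v in zip(prior, values)]
        z = sum(w)
        return [wi / z for wi in w]

    def mean(lam):
        p = dist(lam)
        return sum(pi * v for pi, v in zip(p, values))

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    return dist((lo + hi) / 2)

# Statistical information says E[v] = 0.8; tilt an indifferent subjective
# prior (0.5, 0.5) over v in {0, 1} to respect it.
p = min_cross_entropy([0.5, 0.5], [0, 1], 0.8)
print([round(x, 3) for x in p])  # [0.2, 0.8]
```

When the prior is uniform, this reduces to plain entropy maximization, which is the sense in which cross-entropy minimization generalizes MaxEnt.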
Combining probabilistic logic programming with the power of maximum entropy
 ARTIF. INTELL
, 2004
Abstract

Cited by 11 (3 self)
This paper is on the combination of two powerful approaches to uncertain reasoning: logic programming in a probabilistic setting, on the one hand, and the information-theoretical principle of maximum entropy, on the other hand. More precisely, we present two approaches to probabilistic logic programming under maximum entropy. The first one is based on the usual notion of entailment under maximum entropy, and is defined for the very general case of probabilistic logic programs over Boolean events. The second one is based on a new notion of entailment under maximum entropy, where the principle of maximum entropy is coupled with the closed world assumption (CWA) from classical logic programming. It is only defined for the more restricted case of probabilistic logic programs over conjunctive events. We then analyze the nonmonotonic behavior of both approaches along benchmark examples and along general properties for default reasoning from conditional knowledge bases. It turns out that both approaches have very nice nonmonotonic features. In particular, they realize some inheritance of probabilistic knowledge along subclass relationships, without suffering from the problem of inheritance blocking and from the drowning problem. They both also satisfy the property of rational monotonicity and several irrelevance properties. We finally present algorithms for both approaches, which are based on generalizations of techniques from probabilistic ...
In Defence of the Maximum Entropy Inference Process
, 1997
Abstract

Cited by 11 (3 self)
This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the Maximum Entropy Inference Process, ME, is the only inference process respecting 'common sense'. This result was criticised on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. In a final section we however provide a criticism of our own of ME, and of inference processes in general, namely that they fail to satisfy compactness.

Introduction and Notation
In [1] we gave a justification of the Maximum Entropy Inference Process, ME, by characterising it as the unique probabilistic inference process satisfying a certain collection of common sense principles. In the years following that publication a number of criticisms of these principl...
Representation Independence of Nonmonotonic Inference Relations
, 1996
Abstract

Cited by 9 (1 self)
A logical concept of representation independence is developed for nonmonotonic logics, including probabilistic inference systems. The general framework is then applied to several nonmonotonic logics, particularly propositional probabilistic logics. For these logics our investigation leads us to modified inference rules with greater representation independence.

1 INTRODUCTION
Entropy maximization is a rule for probabilistic inference for whose application to problems in artificial intelligence there exist several independent and very strong arguments (Grove, Halpern & Koller 1992), (Paris & Vencovská 1990). Unfortunately, though, there is a major drawback for which the maximum entropy inference rule has often been criticized: the result of the inference depends on how given information is represented. Probably the best-known example used to illustrate this point is the "Life on Mars" example, a rendition of which may be given as follows: the belief that the probability for the existen...
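The representation-dependence phenomenon behind the "Life on Mars" example can be made concrete with a toy computation: with no constraints, maximum entropy assigns the uniform distribution, so the probability assigned to "there is life" depends on how the space of alternatives is carved up. The particular three-way refinement below is a hypothetical rendition of the example, not necessarily the one the paper uses.

```python
# Total ignorance, coarse representation: two alternatives get 1/2 each.
coarse = {"life": 1 / 2, "no_life": 1 / 2}

# The same ignorance under a finer (hypothetical) carving: three alternatives.
fine = {"plant_life_only": 1 / 3, "animal_life": 1 / 3, "no_life": 1 / 3}

p_life_coarse = coarse["life"]
p_life_fine = fine["plant_life_only"] + fine["animal_life"]

# Same epistemic state, different answers: 0.5 vs. roughly 0.667.
print(p_life_coarse, round(p_life_fine, 3))
```

It is exactly this sensitivity to the choice of vocabulary that motivates the paper's modified, more representation-independent inference rules.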
Reasoning with Probabilities and Maximum Entropy: The System PIT and its Application in LEXMED
 In Proceedings of the Symposium on Operations Research (SOR '99
, 1999
Abstract

Cited by 8 (1 self)
We present a theory, a system and an application for common sense reasoning based on propositional logic, the probability calculus and the concept of maximum entropy. The task of the system Pit (Probability Induction Tool) is to provide decisions under incomplete knowledge, while keeping the necessary additional assumptions as minimal and clear as possible. We therefore enrich the probability calculus by two principles which have their common source in the concept of model quantification ([8, 17]) and find their dense representation in the well-known principle of Maximum Entropy (MaxEnt [6]). As model quantification delivers a precise semantics to MaxEnt, the corresponding decisions make sense not only in our current project of medical diagnosis in Lexmed.