Results 1-10 of 15
Random Worlds and Maximum Entropy
 In Proc. 7th IEEE Symp. on Logic in Computer Science
, 1994
"... Given a knowledge base KB containing firstorder and statistical facts, we consider a principled method, called the randomworlds method, for computing a degree of belief that some formula ' holds given KB . If we are reasoning about a world or system consisting of N individuals, then we can conside ..."
Abstract

Cited by 49 (12 self)
Given a knowledge base KB containing first-order and statistical facts, we consider a principled method, called the random-worlds method, for computing a degree of belief that some formula φ holds given KB. If we are reasoning about a world or system consisting of N individuals, then we can consider all possible worlds, or first-order models, with domain {1, …, N} that satisfy KB, and compute the fraction of them in which φ is true. We define the degree of belief to be the asymptotic value of this fraction as N grows large. We show that when the vocabulary underlying φ and KB uses constants and unary predicates only, we can naturally associate an entropy with each world. As N grows larger, there are many more worlds with higher entropy. Therefore, we can use a maximum-entropy computation to compute the degree of belief. This result is in a similar spirit to previous work in physics and artificial intelligence, but is far more general. Of equal interest to the result itself are...
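As a toy illustration of the random-worlds idea described in this abstract (not the paper's algorithm), the sketch below fixes a vocabulary with a single unary predicate P, an invented KB asserting that at least half the domain satisfies P, and an invented query P(1). It exhaustively enumerates the models with domain {1, …, N} and computes the fraction satisfying the query:

```python
from itertools import combinations

def degree_of_belief(n):
    """Fraction of worlds (interpretations of unary P over {1..n})
    satisfying the KB '|P| >= n/2' in which the query 'P(1)' holds."""
    satisfying = holds = 0
    domain = range(1, n + 1)
    # A world is fully determined by the extension of P: any subset
    # of the domain.
    for k in range(n + 1):
        for ext in combinations(domain, k):
            if len(ext) * 2 >= n:      # KB: at least half satisfy P
                satisfying += 1
                if 1 in ext:           # query: P(1)
                    holds += 1
    return holds / satisfying

# The degree of belief is the limit of this fraction as n grows.
# Here it drifts toward 1/2, because worlds in which the proportion
# |P|/n is near 1/2 (the maximum-entropy proportion consistent with
# the KB) vastly outnumber the rest for large n.
for n in (2, 6, 10):
    print(n, round(degree_of_belief(n), 3))
```

The exhaustive count is exponential in N; the point of the paper's maximum-entropy result is precisely to avoid this enumeration in the unary case.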
From Statistics to Beliefs
, 1992
"... An intelligent agent uses known facts, including statistical knowledge, to assign degrees of belief to assertions it is uncertain about. We investigate three principled techniques for doing this. All three are applications of the principle of indifference, because they assign equal degree of belief ..."
Abstract

Cited by 43 (12 self)
An intelligent agent uses known facts, including statistical knowledge, to assign degrees of belief to assertions it is uncertain about. We investigate three principled techniques for doing this. All three are applications of the principle of indifference, because they assign equal degree of belief to all basic "situations" consistent with the knowledge base. They differ because there are competing intuitions about what the basic situations are. Various natural patterns of reasoning, such as the preference for the most specific statistical data available, turn out to follow from some or all of the techniques. This is an improvement over earlier theories, such as work on direct inference and reference classes, which arbitrarily postulate these patterns without offering any deeper explanations or guarantees of consistency. The three methods we investigate have surprising characterizations: there are connections to the principle of maximum entropy, a principle of maximal independence, an...
Probabilistic Logic Programming under Maximum Entropy
 In Proc. ECSQARU99, LNCS 1638
, 1999
"... . In this paper, we focus on the combination of probabilistic logic programming with the principle of maximum entropy. We start by defining probabilistic queries to probabilistic logic programs and their answer substitutions under maximum entropy. We then present an efficient linear programming char ..."
Abstract

Cited by 18 (5 self)
In this paper, we focus on the combination of probabilistic logic programming with the principle of maximum entropy. We start by defining probabilistic queries to probabilistic logic programs and their answer substitutions under maximum entropy. We then present an efficient linear programming characterization for the problem of deciding whether a probabilistic logic program is satisfiable. Finally, and as a central contribution of this paper, we introduce an efficient technique for approximate probabilistic logic programming under maximum entropy. This technique reduces the original entropy maximization task to solving a modified and relatively small optimization problem.
1 Introduction
Probabilistic propositional logics and their various dialects are thoroughly studied in the literature (see especially [19] and [5]; see also [15] and [16]). Their extensions to probabilistic first-order logics can be classified into first-order logics in which probabilities are defined over the do...
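The paper's linear-programming machinery is not reproduced here, but the core task, picking the maximum-entropy distribution among those satisfying probabilistic constraints, can be sketched in miniature. The snippet below runs iterative proportional fitting over the four truth assignments to two propositional atoms a and b, with invented marginal constraints P(a) = 0.8 and P(b) = 0.3:

```python
from itertools import product

def maxent_ipf(p_a, p_b, iters=50):
    """Maximum-entropy distribution over the four worlds (a, b),
    subject to the marginal constraints P(a) = p_a and P(b) = p_b,
    found by iterative proportional fitting from a uniform start."""
    worlds = list(product([True, False], repeat=2))
    p = {w: 0.25 for w in worlds}  # start uniform (indifference)
    for _ in range(iters):
        for idx, target in ((0, p_a), (1, p_b)):
            # Rescale so the constrained marginal hits its target
            # while preserving ratios within each half.
            mass = sum(q for w, q in p.items() if w[idx])
            for w in p:
                p[w] *= (target / mass) if w[idx] else ((1 - target) / (1 - mass))
    return p

p = maxent_ipf(0.8, 0.3)
# With only marginal constraints, maximum entropy renders a and b
# independent: P(a & b) = 0.8 * 0.3 = 0.24.
print(round(p[(True, True)], 6))
```

This brute-force fixed-point iteration stands in for the paper's optimization; the paper's contribution is making such computations feasible for full probabilistic logic programs rather than two atoms.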
Asymptotic Conditional Probabilities for First-Order Logic
 In Proc. 24th ACM Symp. on Theory of Computing
, 1992
"... Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for firstorder formulas. That is, given firstorder formulas ' and `, we consider the number of structures with domain f1; : : : ; Ng that satisfy `, and comput ..."
Abstract

Cited by 13 (7 self)
Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for first-order formulas. That is, given first-order formulas φ and ψ, we consider the number of structures with domain {1, …, N} that satisfy ψ, and compute the fraction of them in which φ is true. We then consider what happens to this probability as N gets large. This is closely connected to the work on 0-1 laws that considers the limiting probability of first-order formulas, except that now we are considering asymptotic conditional probabilities. Although work has been done on special cases of asymptotic conditional probabilities, no general theory has been developed. This is probably due in part to the fact that it has been known that, if there is a binary predicate symbol in the vocabulary, asymptotic conditional probabilities do not always exist. We show that in this general case, almost all the questions one might want to ask (such as d...
Generating New Beliefs From Old
, 1994
"... In previous work [BGHK92, BGHK93], we have studied the randomworlds approacha particular (and quite powerful) method for generating degrees of belief (i.e., subjective probabilities) from a knowledge base consisting of objective (firstorder, statistical, and default) information. But allow ..."
Abstract

Cited by 13 (1 self)
In previous work [BGHK92, BGHK93], we have studied the random-worlds approach, a particular (and quite powerful) method for generating degrees of belief (i.e., subjective probabilities) from a knowledge base consisting of objective (first-order, statistical, and default) information. But allowing a knowledge base to contain only objective information is sometimes limiting. We occasionally wish to include information about degrees of belief in the knowledge base as well, because there are contexts in which old beliefs represent important information that should influence new beliefs. In this paper, we describe three quite general techniques for extending a method that generates degrees of belief from objective information to one that can make use of degrees of belief as well. All of our techniques are based on well-known approaches, such as cross-entropy. We discuss general connections between the techniques and in particular show that, although conceptually and techn...
A Logic for Default Reasoning About Probabilities
, 1998
"... A logic is defined that allows to express information about statistical probabilities and about degrees of belief in specific propositions. By interpreting the two types of probabilities in one common probability space, the semantics given are well suited to model the in uence of statistical informa ..."
Abstract

Cited by 12 (4 self)
A logic is defined that allows one to express information about statistical probabilities and about degrees of belief in specific propositions. By interpreting the two types of probabilities in one common probability space, the semantics given are well suited to model the influence of statistical information on the formation of subjective beliefs. Cross-entropy minimization is a key element in these semantics, the use of which is justified by showing that the resulting logic exhibits some very reasonable properties.
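For a single constraint of the form P(E) = q, cross-entropy minimization has a simple closed form that can serve as a sketch of how statistical information reshapes prior beliefs: probabilities are rescaled uniformly inside and outside the event, which preserves the relative odds within each region. The worlds and numbers below are invented for illustration:

```python
import math

def min_cross_entropy_update(prior, event, target):
    """Distribution minimizing cross entropy (KL divergence) to `prior`
    subject to the single constraint P(event) = target: probabilities
    are rescaled uniformly inside and outside the event."""
    mass = sum(prior[w] for w in event)
    return {w: prior[w] * (target / mass if w in event else
                           (1 - target) / (1 - mass))
            for w in prior}

# Invented prior over four worlds; the statistical information to be
# absorbed is P({w1, w2}) = 0.9.
prior = {"w1": 0.4, "w2": 0.2, "w3": 0.3, "w4": 0.1}
post = min_cross_entropy_update(prior, {"w1", "w2"}, 0.9)
print({w: round(q, 4) for w, q in post.items()})
# -> {'w1': 0.6, 'w2': 0.3, 'w3': 0.075, 'w4': 0.025}

# Odds within the event are preserved, a hallmark of this update:
assert math.isclose(post["w1"] / post["w2"], prior["w1"] / prior["w2"])
```

With several interacting constraints no closed form exists in general, which is where the iterative machinery discussed in this literature comes in.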
Asymptotic Conditional Probabilities: The Unary Case
, 1993
"... Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for firstorder sentences. Given firstorder sentences ' and `, we consider the structures with domain f1; : : : ; Ng that satisfy `, and compute the fraction of ..."
Abstract

Cited by 11 (3 self)
Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for first-order sentences. Given first-order sentences φ and ψ, we consider the structures with domain {1, …, N} that satisfy ψ, and compute the fraction of them in which φ is true. We then consider what happens to this fraction as N gets large. This extends the work on 0-1 laws that considers the limiting probability of first-order sentences, by considering asymptotic conditional probabilities. As shown by Liogon'kii [31] and Grove, Halpern, and Koller [22], in the general case, asymptotic conditional probabilities do not always exist, and most questions relating to this issue are highly undecidable. These results, however, all depend on the assumption that ψ can use a non-unary predicate symbol. Liogon'kii [31] shows that if we condition on formulas ψ involving unary predicate symbols only (but no equality or constant symbols), then the asymptotic conditional probability does exist and can be effectively computed. This is the case even if we place no corresponding restrictions on φ. We extend this result here to the case where ψ involves equality and constants. We show that the complexity of computing the limit depends on various factors, such as the depth of quantifier nesting, or whether the vocabulary is finite or infinite. We completely characterize the complexity of the problem in the different cases, and show related results for the associated approximation problem.
Non-Monotonic Reasoning on Probability Models: Indifference, Independence & MaxEnt, Part I: Overview
"... Through completing an underspecified probability model, Maximum Entropy (MaxEnt) supports nonmonotonic inferences. Some major aspects of how this is done by MaxEnt can be understood from the background of two principles of rational decision: the concept of Indifference and the concept of Indepen ..."
Abstract

Cited by 3 (0 self)
Through completing an underspecified probability model, Maximum Entropy (MaxEnt) supports non-monotonic inferences. Some major aspects of how this is done by MaxEnt can be understood against the background of two principles of rational decision: the concept of Indifference and the concept of Independence. In a formal specification, MaxEnt can be viewed as a (conservative) extension of these principles; so these principles shed light on the "magical" decisions of MaxEnt. But the other direction is true as well: since MaxEnt is a "correct" representation of the set of models (Concentration Theorem), it elucidates these two principles (e.g., it can be shown that the knowledge of independences can be of very different information-theoretic value). These principles and their calculi are not just arbitrary ideas: when extended to work with qualitative constraints which are modelled by probability intervals, each calculus can be successfully applied to V. Lifschitz's Benchmarks of Non-M...
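The interplay between Independence and MaxEnt mentioned in the abstract can be glimpsed numerically. Among all joint distributions over two atoms with fixed marginals, the independent one uniquely maximizes entropy, so asserting any other dependence is informative (it costs entropy). The marginals below are invented, and the scan is a brute-force stand-in for an analytic argument:

```python
import math

def entropy(ps):
    """Shannon entropy in nats, skipping zero-probability worlds."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Joints over (a, b) with fixed marginals P(a) = 0.7, P(b) = 0.4 form
# a one-parameter family indexed by t = P(a & b), with t in [0.1, 0.4]
# so that all four world probabilities stay non-negative.
def joint(t):
    return (t, 0.7 - t, 0.4 - t, t - 0.1)   # (ab, a~b, ~ab, ~a~b)

best_t = max((i / 1000 for i in range(100, 401)),
             key=lambda t: entropy(joint(t)))
# Entropy peaks exactly at the independent joint, t = 0.7 * 0.4 = 0.28;
# fixing any other value of t (an explicit dependence) lowers entropy.
print(best_t)
```

In this sense "knowing an independence" between a and b adds nothing beyond the marginals, whereas knowing a dependence does, which is one face of the differing information-theoretic value of independence knowledge the abstract alludes to.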
A Probabilistic Extension of Terminological Logics
, 1994
"... In this report we define a probabilistic extension for a basic terminological knowledge representation languages. Two kinds of probabilistic statements are introduced: statements about conditional probabilities between concepts and statements expressing uncertain knowledge about a specific object. T ..."
Abstract

Cited by 1 (0 self)
In this report we define a probabilistic extension for a basic terminological knowledge representation language. Two kinds of probabilistic statements are introduced: statements about conditional probabilities between concepts, and statements expressing uncertain knowledge about a specific object. The usual model-theoretic semantics for terminological logics are extended to define interpretations for the resulting probabilistic language. It is our main objective to find an adequate modeling of the way the two kinds of probabilistic knowledge are combined in what we call default reasoning about probabilities. Cross-entropy minimization is a technique that turns out to be a very promising tool towards achieving this end.
Maximum Entropy in Nilsson's Probabilistic Logic
 in: Proceedings of IJCAI 1989
, 1989
"... Nilsson's Probabilistic Logic is a set theoretic mechanism for reasoning with uncertainty. We propose a new way of looking at the probability constraints enforced by the framework, which allows the expert to include conditional probabilities in the semantic tree, thus making Probabilistic Logic more ..."
Abstract

Cited by 1 (0 self)
Nilsson's Probabilistic Logic is a set-theoretic mechanism for reasoning with uncertainty. We propose a new way of looking at the probability constraints enforced by the framework, which allows the expert to include conditional probabilities in the semantic tree, thus making Probabilistic Logic more expressive. An algorithm is presented which will find the maximum-entropy point probability for a rule of entailment without resorting to solution by iterative approximation. The algorithm works for both propositional and predicate logic. Also presented are a number of methods for employing the conditional
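The maximum-entropy point probability the abstract refers to can be illustrated on a tiny instance (the numbers are invented, and the grid scan below is a brute-force substitute for the paper's direct algorithm). Given P(a) = 0.7 and P(a -> b) = 0.9 over the four truth assignments to a and b, the constraints pin down two world probabilities and leave one degree of freedom, which maximum entropy resolves:

```python
import math

def entropy(ps):
    """Shannon entropy in nats, skipping zero-probability worlds."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Worlds over atoms a, b: (ab, a~b, ~ab, ~a~b) with probabilities
# (p1, p2, p3, p4).  Invented constraints in Nilsson's style:
#   P(a)      = p1 + p2      = 0.7
#   P(a -> b) = p1 + p3 + p4 = 0.9
# Together they force p2 = 0.1 and p1 = 0.6, leaving p3 + p4 = 0.3
# as the only free mass.
def dist(p3):
    return (0.6, 0.1, p3, 0.3 - p3)

p3 = max((i / 1000 for i in range(0, 301)),
         key=lambda t: entropy(dist(t)))
# Maximum entropy splits the remaining mass evenly (p3 = p4 = 0.15),
# giving the point probability P(b) = p1 + p3 = 0.75.
print(p3, round(0.6 + p3, 6))
```

On this instance the even split is also derivable by hand; the algorithm in the paper is about obtaining such point probabilities for rules of entailment in general, without iterative approximation.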