Results 1–10 of 17
System Z: a natural ordering of defaults with tractable applications to default reasoning
, 1990
"... Recent progress towards unifying the probabilistic and preferential models semantics for nonmonotonic reasoning has led to a remarkable observation: Any consistent system of default rules imposes an unambiguous and natural ordering on these rules which, to emphasize its simple and basic character, ..."
Abstract

Cited by 186 (0 self)
Recent progress towards unifying the probabilistic and preferential-models semantics for nonmonotonic reasoning has led to a remarkable observation: any consistent system of default rules imposes an unambiguous and natural ordering on these rules which, to emphasize its simple and basic character, we term "Z-ordering." This ordering can be used, with various levels of refinement, to prioritize conflicting arguments, to rank the degree of abnormality of states of the world, and to define plausible consequence relationships. This paper defines the Z-ordering, briefly mentions its semantical origins, and illustrates two simple entailment relationships induced by the ordering. Two extensions are then described, maximum-entropy and conditional entailment, which trade computational simplicity for semantic refinements. 1. Description. We begin with a set of rules R = {r_i : α_i → β_i}, where α_i and β_i are propositional formulas over a finite alphabet of literals, and → denotes a new connective to be given default interpretations later on. A truth valuation of the literals in the language will be called a model. A model M is said to verify a rule α → β if M ⊨ α ∧ β (i.e., α and β are both true in M), and to falsify α → β if M ⊨ α ∧ ¬β. Given a set R of such rules, we first define the relation of toleration.
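The toleration test and the resulting ranking can be sketched in a few lines. This is a minimal illustration over a toy propositional KB (the bird/penguin rules and all names here are illustrative assumptions, not taken from the paper): a rule is tolerated by a rule set if some model verifies it while falsifying no rule in the set, and the Z-ordering is built by repeatedly peeling off the tolerated layer.

```python
from itertools import product

# Hypothetical mini-KB: "birds fly", "penguins are birds", "penguins don't fly".
# Each rule (a, b) reads "a -> b"; a and b are predicates over a world (model).
ATOMS = ["bird", "penguin", "flies"]
RULES = [
    (lambda w: w["bird"], lambda w: w["flies"]),          # bird -> flies
    (lambda w: w["penguin"], lambda w: w["bird"]),        # penguin -> bird
    (lambda w: w["penguin"], lambda w: not w["flies"]),   # penguin -> ~flies
]

def worlds():
    """Enumerate all truth valuations (models) of the atoms."""
    for vals in product([False, True], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, vals))

def verifies(w, rule):
    a, b = rule
    return a(w) and b(w)        # M |= a ∧ b

def falsifies(w, rule):
    a, b = rule
    return a(w) and not b(w)    # M |= a ∧ ¬b

def tolerated(rule, rules):
    """rule is tolerated by `rules` if some world verifies it and falsifies none of `rules`."""
    return any(verifies(w, rule) and not any(falsifies(w, r) for r in rules)
               for w in worlds())

def z_ordering(rules):
    """Assign rank Z=0 to rules tolerated by the whole set, remove them, iterate."""
    rank, remaining, z = {}, list(rules), 0
    while remaining:
        layer = [r for r in remaining if tolerated(r, remaining)]
        if not layer:
            raise ValueError("rule set is inconsistent")
        for r in layer:
            rank[rules.index(r)] = z
        remaining = [r for r in remaining if r not in layer]
        z += 1
    return rank

print(z_ordering(RULES))  # bird->flies gets Z=0; both penguin rules get Z=1
```

The more specific penguin rules land at a higher rank, which is exactly the priority the Z-ordering uses to resolve the conflict with "birds fly."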
Statistical Foundations for Default Reasoning
, 1993
"... We describe a new approach to default reasoning, based on a principle of indifference among possible worlds. We interpret default rules as extreme statistical statements, thus obtaining a knowledge base KB comprised of statistical and firstorder statements. We then assign equal probability to all w ..."
Abstract

Cited by 48 (8 self)
We describe a new approach to default reasoning, based on a principle of indifference among possible worlds. We interpret default rules as extreme statistical statements, thus obtaining a knowledge base KB comprised of statistical and first-order statements. We then assign equal probability to all worlds consistent with KB in order to assign a degree of belief to a statement φ. The degree of belief can be used to decide whether to defeasibly conclude φ. Various natural patterns of reasoning, such as a preference for more specific defaults, indifference to irrelevant information, and the ability to combine independent pieces of evidence, turn out to follow naturally from this technique. Furthermore, our approach is not restricted to default reasoning; it supports a spectrum of reasoning, from quantitative to qualitative. It is also related to other systems for default reasoning. In particular, we show that the work of [Goldszmidt et al., 1990], which applies maximum entropy ideas t...
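The indifference principle described above is easy to sketch in the finite propositional case: count the worlds consistent with KB and take the fraction in which φ holds. A minimal toy sketch (the atoms and the example KB are illustrative assumptions, not from the paper):

```python
from itertools import product

ATOMS = ["bird", "flies", "winged"]

def worlds():
    """All truth valuations of the atoms."""
    for vals in product([False, True], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, vals))

# Hypothetical KB: the individual is a bird, and birds are winged
# (encoded here as hard constraints for simplicity).
KB = [
    lambda w: w["bird"],
    lambda w: (not w["bird"]) or w["winged"],
]

def degree_of_belief(phi, kb):
    """Equal probability over KB-consistent worlds: belief in phi is the
    fraction of those worlds where phi holds."""
    consistent = [w for w in worlds() if all(c(w) for c in kb)]
    return sum(phi(w) for w in consistent) / len(consistent)

print(degree_of_belief(lambda w: w["winged"], KB))  # 1.0 (entailed by KB)
print(degree_of_belief(lambda w: w["flies"], KB))   # 0.5 (KB is silent on flying)
```

Note how irrelevance falls out for free: atoms the KB does not mention split the consistent worlds evenly, so they get degree of belief 0.5.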
Modeling a Dynamic and Uncertain World I: Symbolic and Probabilistic Reasoning about Change
 Artificial Intelligence
, 1993
"... Intelligent agency requires some ability to predict the future. An agent must ask itself what is presently its best course of action given what it now knows about what the world will be like when it intends to act. This paper presents a system that uses a probabilistic model to reason about the effe ..."
Abstract

Cited by 47 (7 self)
Intelligent agency requires some ability to predict the future. An agent must ask itself what is presently its best course of action given what it now knows about what the world will be like when it intends to act. This paper presents a system that uses a probabilistic model to reason about the effects of an agent's proposed actions on a dynamic and uncertain world, computing the probability that relevant propositions will hold at a specified point in time. The model allows for incomplete information about the world, the occurrence of exogenous (unplanned) events, unreliable sensors, and the possibility of an imperfect causal theory. The system provides an application program with answers to questions of the form "is the probability that φ will hold in the world at time t greater than θ?" It is unique among algorithms for probabilistic temporal reasoning in that it tries to limit its inference according to the proposition, time, and probability threshold provided by the application. T...
Using First-Order Probability Logic for the Construction of Bayesian Networks
, 1993
"... We present a mechanism for constructing graphical models, specifically Bayesian networks, from a knowledge base of general probabilistic information. The unique feature of our approach is that it uses a powerful firstorder probabilistic logic for expressing the general knowledge base. This logic al ..."
Abstract

Cited by 20 (0 self)
We present a mechanism for constructing graphical models, specifically Bayesian networks, from a knowledge base of general probabilistic information. The unique feature of our approach is that it uses a powerful first-order probabilistic logic for expressing the general knowledge base. This logic allows for the representation of a wide range of logical and probabilistic information. The model construction procedure we propose uses notions from direct inference to identify pieces of local statistical information from the knowledge base that are most appropriate to the particular event we want to reason about. These pieces are composed to generate a joint probability distribution specified as a Bayesian network. Although there are fundamental difficulties in dealing with fully general knowledge, our procedure is practical for quite rich knowledge bases and it supports the construction of a far wider range of networks than allowed for by current template technology. 1 Introduction The de...
Philosophies of probability: objective Bayesianism and its challenges
 Handbook of the Philosophy of Mathematics (Handbook of the Philosophy of Science). Elsevier, Amsterdam
, 2004
"... This chapter presents an overview of the major interpretations of probability followed by an outline of the objective Bayesian interpretation and a discussion of the key challenges it faces. ..."
Abstract

Cited by 10 (5 self)
This chapter presents an overview of the major interpretations of probability followed by an outline of the objective Bayesian interpretation and a discussion of the key challenges it faces.
Credal Networks under Maximum Entropy
, 2000
"... We apply the principle of maximum entropy to select a unique joint probability distribution from the set of all joint probability distributions specified by a credal network. In detail, we start by showing that the unique joint distribution of a Bayesian tree coincides with the maximum entropy m ..."
Abstract

Cited by 10 (4 self)
We apply the principle of maximum entropy to select a unique joint probability distribution from the set of all joint probability distributions specified by a credal network. In detail, we start by showing that the unique joint distribution of a Bayesian tree coincides with the maximum entropy model of its conditional distributions. This result, however, no longer holds for general Bayesian networks. We thus present a new kind of maximum-entropy model, which is computed sequentially. We then show that for all general Bayesian networks, the sequential maximum entropy model coincides with the unique joint distribution. Moreover, we apply the new principle of sequential maximum entropy to interval Bayesian networks and more generally to credal networks. We especially show that this application is equivalent to a number of small local entropy maximizations.
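The "small local entropy maximizations" mentioned at the end can be illustrated on the simplest credal constraint: a single binary variable whose probability is only known to lie in an interval. This sketch is an illustrative assumption on my part, not the paper's algorithm; it relies only on the fact that binary entropy peaks at 0.5, so the local maximization reduces to clipping 0.5 into the interval.

```python
import math

def entropy(p):
    """Binary entropy (in nats) of a distribution (p, 1-p)."""
    return -sum(q * math.log(q) for q in (p, 1 - p) if q > 0)

def maxent_interval(lo, hi):
    """Max-entropy choice of P(X=1) within a credal interval [lo, hi]:
    binary entropy is concave with its peak at 0.5, so the maximizer
    is the interval point closest to 0.5."""
    return min(max(0.5, lo), hi)

# Hypothetical interval constraint P(X=1) in [0.1, 0.3]
p = maxent_interval(0.1, 0.3)
print(p)  # 0.3, the endpoint nearest 0.5
```

For intervals that contain 0.5 the answer is 0.5 itself, which is the usual uniform, maximally noncommittal choice.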
Forming Beliefs About a Changing World
, 1994
"... The situation calculus is a popular technique for reasoning about action and change. However, its restriction to a firstorder syntax and pure deductive reasoning makes it unsuitable in many contexts. In particular, we often face uncertainty, due either to lack of knowledge or to some probabilistic a ..."
Abstract

Cited by 9 (3 self)
The situation calculus is a popular technique for reasoning about action and change. However, its restriction to a first-order syntax and pure deductive reasoning makes it unsuitable in many contexts. In particular, we often face uncertainty, due either to lack of knowledge or to some probabilistic aspects of the world. While attempts have been made to address aspects of this problem, most notably using nonmonotonic reasoning formalisms, the general problem of uncertainty in reasoning about action has not been fully dealt with in a logical framework. In this paper we present a theory of action that extends the situation calculus to deal with uncertainty. Our framework is based on applying the random-worlds approach of [BGHK94] to a situation calculus ontology, enriched to allow the expression of probabilistic action effects. Our approach is able to solve many of the problems imposed by incomplete and probabilistic knowledge within a unified framework. In particular, we obtain a default ...
Maximum entropy probabilistic logic
, 2002
"... Recent research has shown there are two types of uncertainty that can be expressed in firstorder logic— propositional and statistical uncertainty—and that both types can be represented in terms of probability spaces. However, these efforts have fallen short of providing a general account of how to ..."
Abstract

Cited by 8 (0 self)
Recent research has shown there are two types of uncertainty that can be expressed in first-order logic — propositional and statistical uncertainty — and that both types can be represented in terms of probability spaces. However, these efforts have fallen short of providing a general account of how to design probability measures for these spaces; as a result, we lack a crucial component of any system that reasons under these types of uncertainty. In this paper, we describe an automatic procedure for defining such measures in terms of a probabilistic knowledge base. In particular, we employ the principle of maximum entropy to select measures that are consistent with our knowledge and that make the fewest assumptions in doing so. This approach yields models of first-order uncertainty that are principled, intuitive, and economical in their representation.
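Selecting the maximum-entropy measure consistent with a constraint is a concrete optimization. A standard worked example, sketched here under my own illustrative assumptions (a three-outcome variable and a hypothetical mean constraint, not from the paper): the maxent distribution under an expectation constraint has exponential-family form p_i ∝ exp(λ·x_i), and the multiplier λ can be found by bisection since the mean is monotone in λ.

```python
import math

XS = [0, 1, 2]          # outcomes of a toy discrete variable
TARGET_MEAN = 1.3       # hypothetical knowledge: E[X] = 1.3

def gibbs(lam):
    """Exponential-family distribution p_i ∝ exp(lam * x_i): the known
    form of the maxent solution under a single mean constraint."""
    ws = [math.exp(lam * x) for x in XS]
    z = sum(ws)
    return [w / z for w in ws]

def mean(p):
    return sum(q * x for q, x in zip(p, XS))

def solve(target, lo=-50.0, hi=50.0):
    """Bisect on the multiplier until the distribution's mean hits the target
    (the mean is strictly increasing in lam)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(gibbs(mid)) < target:
            lo = mid
        else:
            hi = mid
    return gibbs((lo + hi) / 2)

p = solve(TARGET_MEAN)
print([round(q, 3) for q in p])  # satisfies the constraint, otherwise maximally spread out
```

Any constraint the knowledge base does not impose is left "as uniform as possible," which is exactly the fewest-assumptions behavior the abstract describes.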
Philosophies of probability
 Handbook of the Philosophy of Mathematics, Volume 4 of the Handbook of the Philosophy of Science
"... This chapter presents an overview of the major interpretations of probability followed by an outline of the objective Bayesian interpretation and a discussion of the key challenges it faces. I discuss the ramifications of interpretations of probability and objective Bayesianism for the philosophy of ..."
Abstract

Cited by 6 (2 self)
This chapter presents an overview of the major interpretations of probability followed by an outline of the objective Bayesian interpretation and a discussion of the key challenges it faces. I discuss the ramifications of interpretations of probability and objective Bayesianism for the philosophy of mathematics in general.
Default Reasoning Using Maximum Entropy and Variable Strength Defaults
, 1999
"... The thesis presents a computational model for reasoning with partial information which uses default rules or information about what normally happens. The idea is to provide a means of filling the gaps in an incomplete world view with the most plausible assumptions while allowing for the retraction o ..."
Abstract

Cited by 5 (2 self)
The thesis presents a computational model for reasoning with partial information which uses default rules or information about what normally happens. The idea is to provide a means of filling the gaps in an incomplete world view with the most plausible assumptions while allowing for the retraction of conclusions should they subsequently turn out to be incorrect. The model can be used both to reason from a given knowledge base of default rules, and to aid in the construction of such knowledge bases by allowing their designer to compare the consequences of his design with his own default assumptions. The conclusions supported by the proposed model are justified by the use of a probabilistic semantics for default rules in conjunction with the application of a rational means of inference from incomplete knowledge: the principle of maximum entropy (ME). The thesis develops both the theory and algorithms for the ME approach and argues that it should be considered as a general theory of default reasoning. The argument supporting the thesis has two main threads. Firstly, the ME approach is tested on the benchmark examples required of nonmonotonic behaviour, and it is found to handle them appropriately. Moreover, these patterns of commonsense reasoning emerge as consequences of the chosen semantics rather than being design features. It is argued that this makes the ME approach more objective, and its conclusions more justifiable, than other default systems. Secondly, the ME approach is compared with two existing systems: the lexicographic approach (LEX) and System Z+. It is shown that the former can be equated with ME under suitable conditions, making it strictly less expressive, while the latter is too crude to perform the subtle resolution of default conflict which the ME...