Results 1–9 of 9
Soft Evidential Update for Probabilistic Multiagent Systems
International Journal of Approximate Reasoning
, 2000
"... We address the problem of updating a probability distribution represented by a Bayesian network upon presentation of soft evidence. Our motivation ..."
Abstract

Cited by 27 (5 self)
 Add to MetaCart
We address the problem of updating a probability distribution represented by a Bayesian network upon presentation of soft evidence. Our motivation
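The abstract is truncated, but updating a distribution on soft (uncertain) evidence is classically done with Jeffrey's rule. The following sketch is illustrative only, not the paper's algorithm, and all variable names and numbers are made up: a joint distribution over (Rain, WetGrass) is revised so that the marginal of Rain matches the soft evidence while the conditionals P(WetGrass | Rain) stay fixed.

```python
# Illustrative sketch of soft-evidence updating via Jeffrey's rule.
# The joint P(Rain, WetGrass) is a dict over the four worlds; soft evidence
# pins the Rain marginal to (0.8, 0.2) without touching P(WetGrass | Rain).

prior = {
    ("rain", "wet"): 0.27,
    ("rain", "dry"): 0.03,
    ("no_rain", "wet"): 0.14,
    ("no_rain", "dry"): 0.56,
}

soft_evidence = {"rain": 0.8, "no_rain": 0.2}  # new marginal for Rain

def jeffrey_update(joint, new_marginal):
    """Rescale each cell so the Rain marginal matches the soft evidence;
    the conditionals P(WetGrass | Rain) are left unchanged."""
    old_marginal = {}
    for (r, _), p in joint.items():
        old_marginal[r] = old_marginal.get(r, 0.0) + p
    return {
        (r, g): p * new_marginal[r] / old_marginal[r]
        for (r, g), p in joint.items()
    }

posterior = jeffrey_update(prior, soft_evidence)
```

The update is the unique one with the new marginal that preserves the prior conditionals, which is also the cross-entropy-minimizing choice.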
A Logic for Default Reasoning About Probabilities
, 1998
"... A logic is defined that allows to express information about statistical probabilities and about degrees of belief in specific propositions. By interpreting the two types of probabilities in one common probability space, the semantics given are well suited to model the in uence of statistical informa ..."
Abstract

Cited by 12 (4 self)
 Add to MetaCart
A logic is defined that allows one to express information about statistical probabilities and about degrees of belief in specific propositions. By interpreting the two types of probabilities in one common probability space, the semantics given are well suited to model the influence of statistical information on the formation of subjective beliefs. Cross-entropy minimization is a key element in these semantics; its use is justified by showing that the resulting logic exhibits some very reasonable properties.
Representation Independence of Nonmonotonic Inference Relations
, 1996
"... A logical concept of representation independence is developed for nonmonotonic logics, including probabilistic inference systems. The general framework then is applied to several nonmonotonic logics, particularly propositional probabilistic logics. For these logics our investigation leads us to modi ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
A logical concept of representation independence is developed for nonmonotonic logics, including probabilistic inference systems. The general framework is then applied to several nonmonotonic logics, particularly propositional probabilistic logics. For these logics our investigation leads us to modified inference rules with greater representation independence.
1 INTRODUCTION Entropy maximization is a rule for probabilistic inference whose application to problems in artificial intelligence is supported by several independent and very strong arguments (Grove, Halpern & Koller 1992), (Paris & Vencovská 1990). Unfortunately, though, there is a major drawback for which the maximum entropy inference rule has often been criticized: the result of the inference depends on how the given information is represented. Probably the best-known example used to illustrate this point is the "Life on Mars" example, a rendition of which may be given as follows: the belief that the probability for the existen...
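The representation dependence the abstract refers to can be made concrete with a toy rendition of the "Life on Mars" example (the numbers below are the standard textbook illustration, not taken from the paper): with no constraints, maximum entropy spreads probability uniformly over the atoms of the chosen language, so the answer changes when the vocabulary is refined.

```python
# Toy illustration of maximum-entropy representation dependence.

# Representation 1: a single proposition "life". Atoms: {life, no_life}.
# The uniform (maximum-entropy) distribution over 2 atoms gives:
p_life_coarse = 1 / 2

# Representation 2: "life" is split into two propositions, "animal_life"
# and "plant_life", with "life" meaning that at least one of them holds.
# There are now 4 atoms; 3 of them satisfy "animal_life or plant_life",
# so the uniform distribution gives:
p_life_fine = 3 / 4

# Same epistemic state, two representations, two different answers:
assert p_life_coarse != p_life_fine
```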
Modelling conditional knowledge discovery and belief revision by Abstract State Machines
 In International Workshop on Abstract State Machines (ASM 2003), Lecture Notes in Computer Science
, 2003
"... Abstract. We develop a highlevel ASM specification for the Condor system that provides powerful methods and tools for managing knowledge represented by conditionals. Thereby, we are able to elaborate crucial interdependencies between different aspects of knowledge representation, knowledge discover ..."
Abstract

Cited by 6 (6 self)
 Add to MetaCart
We develop a high-level ASM specification for the Condor system, which provides powerful methods and tools for managing knowledge represented by conditionals. Thereby we are able to elaborate crucial interdependencies between different aspects of knowledge representation, knowledge discovery, and belief revision. Moreover, this specification provides the basis for a stepwise refinement development process of the Condor system based on the ASM methodology.
Minimum Cross-Entropy Reasoning: A Statistical Justification
 In Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence (IJCAI-95)
, 1995
"... Degrees of belief are formed using observed evidence and statistical background information. In this paper we examine the process of how prior degrees of belief derived from the evidence are combined with statistical data to form more specific degrees of belief. A statistical model for this process ..."
Abstract

Cited by 4 (4 self)
 Add to MetaCart
Degrees of belief are formed using observed evidence and statistical background information. In this paper we examine the process by which prior degrees of belief derived from the evidence are combined with statistical data to form more specific degrees of belief. A statistical model for this process is then shown to vindicate the cross-entropy minimization principle as a rule for probabilistic default inference.
1 Introduction A knowledge-based system incorporating reasoning with uncertain information gives rise to quantitative statements of two different kinds: statements expressing statistical information and statements of degrees of belief. "10% of applicants seeking employment at company X who are invited to an interview will get a job there" is a statistical statement. "The likelihood that I will be invited for an interview if I apply for a job at company X is about 0.6" expresses a degree of belief. In this paper, both of these kinds of statements are regarded as probabilistic, i...
On the Emergence of Reasons in Inductive Logic
, 2001
"... We apply methods of abduction derived from propositional probabilistic reasoning to predicate probabilistic reasoning, in particular inductive logic, by treating finite predicate knowledge bases as potentially infinite propositional knowledge bases. It is shown that for a range of predicate knowled ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
We apply methods of abduction derived from propositional probabilistic reasoning to predicate probabilistic reasoning, in particular inductive logic, by treating finite predicate knowledge bases as potentially infinite propositional knowledge bases. It is shown that for a range of predicate knowledge bases (such as those typically associated with inductive reasoning) and several key propositional inference processes (in particular the Maximum Entropy Inference Process) this procedure is well defined, and furthermore yields an explanation for the validity of the induction in terms of 'reasons'.
A logic for inductive probabilistic reasoning
 Synthese
, 2005
"... Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from “70 % of As are Bs ” and “a is an A ” infer that a ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from “70% of As are Bs” and “a is an A” infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey’s rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e. by inductive probabilistic reasoning. In this paper a formal framework for inductive probabilistic reasoning is developed: syntactically it consists of an extension of the language of first-order predicate logic that allows one to express statements about both statistical and subjective probabilities. Semantics for this representation language are developed that give rise to two distinct entailment relations: a relation |= that models strict, probabilistically valid inferences, and a relation |≈ that models inductive probabilistic inferences. The inductive entailment relation is obtained by implementing cross-entropy minimization in a preferred model semantics. A main objective of our approach is to ensure that complete proof systems exist for both entailment relations. This is achieved by allowing probability distributions in our semantic models that use nonstandard probability values. A number of results are presented showing that in several important aspects the resulting logic behaves just like a logic based on real-valued probabilities alone.
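The claim that cross-entropy minimization generalizes Jeffrey's rule admits a small numeric check. The sketch below is illustrative only (the prior and constraint values are invented, and grid search stands in for the analytic argument): among all joint distributions over two propositions (A, B) whose A-marginal is pinned to a given value, the one minimizing KL divergence from the prior keeps the prior conditionals P(B | A) intact, exactly as Jeffrey's rule prescribes.

```python
import math

# Prior over the four truth assignments of (A, B):
# P(A) = 0.4, P(B | A) = 0.7, P(B | not A) = 0.3.
prior = {(1, 1): 0.28, (1, 0): 0.12, (0, 1): 0.18, (0, 0): 0.42}
target_pa = 0.6  # the constraint: updated marginal P(A = 1) must be 0.6

def kl(q, p):
    """KL divergence D(q || p) over the four worlds."""
    return sum(q[w] * math.log(q[w] / p[w]) for w in q if q[w] > 0)

def joint(pa, b_given_a, b_given_not_a):
    """Build a joint over (A, B) from a marginal and two conditionals."""
    return {
        (1, 1): pa * b_given_a,
        (1, 0): pa * (1 - b_given_a),
        (0, 1): (1 - pa) * b_given_not_a,
        (0, 0): (1 - pa) * (1 - b_given_not_a),
    }

# Grid-search the free conditionals for the KL minimizer under the constraint.
best = min(
    (joint(target_pa, x / 100, y / 100)
     for x in range(1, 100) for y in range(1, 100)),
    key=lambda q: kl(q, prior),
)
```

The minimizer found on the grid keeps P(B | A) = 0.7 and P(B | not A) = 0.3, i.e. it is the Jeffrey update of the prior to the new marginal.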
A Probabilistic Extension of Terminological Logics
, 1994
"... In this report we define a probabilistic extension for a basic terminological knowledge representation languages. Two kinds of probabilistic statements are introduced: statements about conditional probabilities between concepts and statements expressing uncertain knowledge about a specific object. T ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
In this report we define a probabilistic extension of a basic terminological knowledge representation language. Two kinds of probabilistic statements are introduced: statements about conditional probabilities between concepts and statements expressing uncertain knowledge about a specific object. The usual model-theoretic semantics for terminological logics are extended to define interpretations for the resulting probabilistic language. Our main objective is to find an adequate modelling of the way the two kinds of probabilistic knowledge are combined in what we call default reasoning about probabilities. Cross-entropy minimization turns out to be a very promising tool towards achieving this end.
Minimum Cross-Entropy Reasoning: A Statistical Justification
"... Degrees of belief are formed using observed evidence and statistical background information. In this paper we examine the process of how prior degrees of belief derived from the evidence are combined with statistical data to form more specific degrees of belief. A statistical model for this process ..."
Abstract
 Add to MetaCart
Degrees of belief are formed using observed evidence and statistical background information. In this paper we examine the process by which prior degrees of belief derived from the evidence are combined with statistical data to form more specific degrees of belief. A statistical model for this process is then shown to vindicate the cross-entropy minimization principle as a rule for probabilistic default inference.