Results 1–10 of 89
A Complete Many-Valued Logic With Product-Conjunction
, 1996
Abstract

Cited by 42 (10 self)
In this paper we investigate some logics whose set of truth values is the real interval [0, 1] and we concentrate our attention to logics having a conjunction whose truth function t(x, y) is a t-norm, and having a corresponding residuated implication (or, as Pavelka [14] observes, the conjunction and the implication form an adjoint couple); i.e., if i(x, y) is the truth function of the implication then …
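The adjoint couple mentioned in the abstract can be made concrete for the product t-norm, whose residuum is the Goguen implication. A minimal sketch (the function names are mine, not the paper's):

```python
def t_prod(x, y):
    """Product t-norm: truth function of the conjunction."""
    return x * y

def i_prod(x, y):
    """Residuated (Goguen) implication: i(x, y) = sup{z : t(x, z) <= y}."""
    return 1.0 if x <= y else y / x

# Adjointness: t(x, z) <= y holds exactly when z <= i(x, y).
```

The defining property is that implication is the residuum of conjunction, so the pair forms an adjoint couple on [0, 1].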
Approaches to measuring inconsistent information
 Inconsistency Tolerance. Volume 3300 of Lecture Notes in Computer Science
, 2005
Abstract

Cited by 33 (9 self)
Abstract. Measures of quantity of information have been studied extensively for more than fifty years. The seminal work on information theory is by Shannon [67]. This work, based on probability theory, can be used in a logical setting when the worlds are the possible events. This work is also the basis of Lozinskii’s work [48] for defining the quantity of information of a formula (or knowledgebase) in propositional logic. But this definition is not suitable when the knowledgebase is inconsistent. In this case, it has no classical model, so we have no “event” to count. This is a shortcoming since in practical applications (e.g. databases) it often happens that the knowledgebase is not consistent. And it is definitely not true that all inconsistent knowledgebases contain the same (null) amount of information, as given by the “classical information theory”. As explored for several years in the paraconsistent logic community, two inconsistent knowledgebases can lead to very different conclusions, showing that they do not convey the same information. There has been some …
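For a consistent formula, the model-counting idea attributed to Lozinskii can be illustrated by brute force: a formula over n atoms with m classical models narrows 2^n worlds down to m, yielding n − log2(m) bits. This is a hedged sketch in that spirit, not the paper's exact definition, and as the abstract notes it breaks down when there are no models at all:

```python
import math
from itertools import product

def info_content(formula, n_vars):
    """Bits of information of a consistent propositional formula, taken here
    as n - log2(#models): how far it narrows down the 2^n possible worlds.
    (A sketch in the spirit of Lozinskii's model-counting measure.)"""
    models = sum(1 for w in product([False, True], repeat=n_vars) if formula(w))
    if models == 0:
        # Inconsistent formula: no classical model, so no "event" to count.
        raise ValueError("inconsistent formula: measure undefined")
    return n_vars - math.log2(models)

# Over 3 atoms, 'p and q' has 2 models: info = 3 - log2(2) = 2 bits.
# A tautology has all 2^n models: info = 0 bits.
```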
Common Sense and Maximum Entropy
 Synthese
, 2000
Abstract

Cited by 26 (4 self)
This paper concerns the question of how to draw inferences common-sensically from uncertain knowledge. Since the early work of Shore and Johnson, [10], Paris and Vencovská, [6], and Csiszár, [1], it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and consistent mode of uncertain reasoning is actually characterised by the observance of just a single common sense principle (or slogan).
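A small worked instance of the inference process in question: subject to a single constraint P(A) = 0.7 over worlds built from atoms A and B, the maximum-entropy distribution has Gibbs form p(w) ∝ exp(λ·1[w ⊨ A]), with λ fixed by the constraint. The sketch below (names and setup are mine, purely illustrative) solves for λ by bisection and recovers the "common sense" outcome that the unconstrained atom B comes out independent:

```python
import math

def maxent_indicator(worlds, A, target):
    """Maximum-entropy distribution over `worlds` subject to P(A) = target,
    where A is the set of worlds satisfying the constrained event.
    The solution has Gibbs form p(w) ∝ exp(lam * 1[w in A])."""
    def p_of(lam):
        weights = {w: math.exp(lam if w in A else 0.0) for w in worlds}
        Z = sum(weights.values())
        return {w: v / Z for w, v in weights.items()}
    lo, hi = -50.0, 50.0          # P(A) is increasing in lam: bisect.
    for _ in range(200):
        mid = (lo + hi) / 2
        if sum(p_of(mid)[w] for w in A) < target:
            lo = mid
        else:
            hi = mid
    return p_of((lo + hi) / 2)

# Worlds over atoms A, B ("Ab" = A true, B false); constrain only P(A) = 0.7.
worlds = ["AB", "Ab", "aB", "ab"]
p = maxent_indicator(worlds, {"AB", "Ab"}, 0.7)
# Max-ent fills in the unconstrained dependence as independence:
# P(A & B) = 0.7 * 0.5 = 0.35, P(~A & B) = 0.3 * 0.5 = 0.15.
```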
High-Level Primitives for Recursive Maximum Likelihood Estimation
 IEEE Trans. Automatic Control
, 1995
Abstract

Cited by 17 (4 self)
This paper proposes a high-level language consisting of a small number of primitives and macros for describing recursive maximum likelihood (ML) estimation algorithms.
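The simplest example of recursive ML estimation, and the kind of update such primitives would describe, is the running ML estimate of a Gaussian mean: each observation refines the previous estimate without storing the batch. This is a minimal illustration of recursive ML updating, not the paper's primitives:

```python
def recursive_mean(samples):
    """Recursive ML estimate of a Gaussian mean. After observation x_n:
        mu_n = mu_{n-1} + (x_n - mu_{n-1}) / n
    which equals the batch sample mean but needs O(1) memory."""
    mu, n = 0.0, 0
    for x in samples:
        n += 1
        mu += (x - mu) / n
        yield mu

# list(recursive_mean([2.0, 4.0, 6.0]))  # running estimates: 2.0, 3.0, 4.0
```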
Combining probabilistic logic programming with the power of maximum entropy
 Artificial Intelligence
, 2004
Abstract

Cited by 17 (3 self)
This paper is on the combination of two powerful approaches to uncertain reasoning: logic programming in a probabilistic setting, on the one hand, and the information-theoretical principle of maximum entropy, on the other hand. More precisely, we present two approaches to probabilistic logic programming under maximum entropy. The first one is based on the usual notion of entailment under maximum entropy, and is defined for the very general case of probabilistic logic programs over Boolean events. The second one is based on a new notion of entailment under maximum entropy, where the principle of maximum entropy is coupled with the closed world assumption (CWA) from classical logic programming. It is only defined for the more restricted case of probabilistic logic programs over conjunctive events. We then analyze the nonmonotonic behavior of both approaches along benchmark examples and along general properties for default reasoning from conditional knowledge bases. It turns out that both approaches have very nice nonmonotonic features. In particular, they realize some inheritance of probabilistic knowledge along subclass relationships, without suffering from the problem of inheritance blocking and from the drowning problem. They both also satisfy the property of rational monotonicity and several irrelevance properties. We finally present algorithms for both approaches, which are based on generalizations of techniques from probabilistic …
On the measure of conflicts: Shapley inconsistency values
 Artificial Intelligence
Abstract

Cited by 14 (3 self)
There are relatively few proposals for inconsistency measures for propositional belief bases. However inconsistency measures are potentially as important as information measures for artificial intelligence, and more generally for computer science. In particular, they can be useful to define various operators for belief revision, belief merging, and negotiation. The measures that have been proposed so far can be split into two classes. The first class of measures takes into account the number of formulae required to produce an inconsistency: the more formulae required to produce an inconsistency, the less inconsistent the base. The second class takes into account the proportion of the language that is affected by the inconsistency: the more propositional variables affected, the more inconsistent the base. Both approaches are sensible, but there is no proposal for combining them. We address this need in this paper: our proposal takes into account both the number of variables affected by the inconsistency and the distribution of the inconsistency among the formulae of the base. Our idea is to use existing inconsistency measures in order to define a game in coalitional form, and then to use the Shapley value to obtain an inconsistency measure that indicates the responsibility/contribution of each formula to the overall inconsistency in the base. This allows us to provide a more reliable image of the belief base and of the inconsistency in it. * This paper is a revised and extended version of the paper “Shapley Inconsistency Values” presented at KR’06.
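The construction described in the abstract can be sketched end to end with the simplest possible inconsistency measure: the drastic 0/1 measure (1 iff a subset has no model), which here is a toy stand-in for the more refined measures the paper actually uses. The Shapley value of each formula in the resulting coalitional game is its share of the blame:

```python
from itertools import product, combinations
from math import factorial

def inconsistent(formulas, n_vars):
    """Drastic inconsistency measure: 1 if the set has no model, else 0.
    Models are found by brute force over all valuations."""
    return 0 if any(all(f(w) for f in formulas)
                    for w in product([False, True], repeat=n_vars)) else 1

def shapley_inconsistency(base, n_vars):
    """Shapley value of each formula in the coalitional game whose payoff
    is the inconsistency measure: each formula's marginal contribution,
    averaged over all orders of arrival."""
    n = len(base)
    values = []
    for i, f in enumerate(base):
        rest = base[:i] + base[i + 1:]
        v = 0.0
        for k in range(len(rest) + 1):
            for coal in combinations(rest, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                v += weight * (inconsistent(list(coal) + [f], n_vars)
                               - inconsistent(list(coal), n_vars))
        values.append(v)
    return values

# Base {p, ~p, q} over atoms (p, q): the conflict lies in p and ~p alone,
# so they split the blame and q gets none.
base = [lambda w: w[0], lambda w: not w[0], lambda w: w[1]]
```

By the efficiency property of the Shapley value, the individual values sum to the inconsistency of the whole base.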
Constructing a Logic of Plausible Inference: a Guide To Cox's Theorem
 International Journal of Approximate Reasoning
, 2003
Abstract

Cited by 11 (0 self)
Cox's Theorem provides a theoretical basis for using probability theory as a general logic of plausible inference. The theorem states that any system for plausible reasoning that satisfies certain qualitative requirements intended to ensure consistency with classical deductive logic and correspondence with commonsense reasoning is isomorphic to probability theory. However, the requirements used to obtain this result have been the subject of much debate. We review Cox's Theorem, discussing its requirements, the intuition and reasoning behind these, and the most important objections, and finish with an abbreviated proof of the theorem.
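The core of the argument can be stated compactly. The following is the standard textbook form of Cox's functional-equation step, not the paper's exact axiomatization:

```latex
% The plausibility of a conjunction is assumed to depend only on the
% plausibilities indicated:
\[
  (A \wedge B \mid C) \;=\; F\big( (A \mid C),\, (B \mid A \wedge C) \big),
\]
% and associativity of conjunction forces $F$ to satisfy
\[
  F\big(F(x, y),\, z\big) \;=\; F\big(x,\, F(y, z)\big),
\]
% whose regular solutions are, up to a monotone rescaling $w$,
\[
  w\big(F(x, y)\big) = w(x)\, w(y)
  \;\Longrightarrow\;
  w(A \wedge B \mid C) = w(A \mid C)\, w(B \mid A \wedge C),
\]
% i.e. the product rule of probability; a parallel argument for negation
% yields the sum rule.
```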
In Defence of the Maximum Entropy Inference Process
, 1997
Abstract

Cited by 11 (3 self)
This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the Maximum Entropy Inference Process, ME, is the only inference process respecting 'common sense'. This result was criticised on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. In a final section we however provide a criticism of our own of ME, and of inference processes in general, namely that they fail to satisfy compactness.

Introduction and Notation

In [1] we gave a justification of the Maximum Entropy Inference Process, ME, by characterising it as the unique probabilistic inference process satisfying a certain collection of common sense principles. In the years following that publication a number of criticisms of these principles …
A New Criterion for Comparing Fuzzy Logics for Uncertain Reasoning
, 1996
Abstract

Cited by 9 (1 self)
A new criterion is introduced for judging the suitability of various `fuzzy logics' for practical uncertain reasoning in a probabilistic world and the relationship of this criterion to several established criteria, and its consequences for truth functional belief, are investigated.

Introduction

It is a rather widespread assumption in uncertain reasoning, and one that we shall make for the purpose of this paper, that a piece of uncertain knowledge can be adequately captured by attaching a real number (signifying the degree of uncertainty) on some scale to some unequivocal statement or conditional, and that an intelligent agent's knowledge base consists of a large, but nevertheless finite, set K of such expressions. Whether or not this is the correct picture for animate intelligent agents such as ourselves is, perhaps, questionable, but it is certainly the case that many expert systems (which one might feel should be included under the vague title of `intelligent agent') have, by design …
A Decision Procedure for Probability Calculus with Applications
Abstract

Cited by 8 (2 self)
Abstract. A decision procedure (PrSAT) for classical (Kolmogorov) probability calculus is presented. This decision procedure is based on an existing decision procedure for the theory of real closed fields, which has recently been implemented in Mathematica. A Mathematica implementation of PrSAT is also described, along with several applications to various nontrivial problems in the probability calculus.
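The flavor of such decision problems can be seen in a two-event toy case, where satisfiability in a Kolmogorov probability space reduces to the non-negativity of the four atomic-state probabilities. This is a hand-rolled special case for illustration, not PrSAT itself, which decides the general problem via real closed fields:

```python
from fractions import Fraction as F

def feasible(pA, pB, pAB):
    """Is the constraint set P(A)=pA, P(B)=pB, P(A&B)=pAB satisfiable in a
    probability space over two events? The four atomic-state probabilities
    are determined by inclusion-exclusion, so satisfiability reduces to
    checking that all of them lie in [0, 1]."""
    s_ab   = pAB                   # P(A & B)
    s_a    = pA - pAB              # P(A & ~B)
    s_b    = pB - pAB              # P(~A & B)
    s_none = 1 - pA - pB + pAB     # P(~A & ~B)
    return all(0 <= s <= 1 for s in (s_ab, s_a, s_b, s_none))

# feasible(F(3,10), F(2,5), F(1,10))  -> True
# feasible(F(3,10), F(2,5), F(7,20))  -> False: P(A&B) cannot exceed P(A)
```

Exact rational arithmetic (`fractions.Fraction`) avoids the false verdicts floating-point rounding could produce at the boundary.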