Results 1-10 of 452
Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment
 Psychological Review
, 1983
Abstract

Cited by 239 (2 self)
Perhaps the simplest and the most basic qualitative law of probability is the conjunction rule: The probability of a conjunction, P(A&B), cannot exceed the probabilities of its constituents, P(A) and P(B), because the extension (or the possibility set) of the conjunction is included in the extension of its constituents. Judgments under uncertainty, however, are often mediated by intuitive heuristics that are not bound by the conjunction rule. A conjunction can be more representative than one of its constituents, and instances of a specific category can be easier to imagine or to retrieve than instances of a more inclusive category. The representativeness and availability heuristics therefore can make a conjunction appear more probable than one of its constituents. This phenomenon is demonstrated in a variety of contexts including estimation of word frequency, personality judgment, medical prognosis, decision under risk, suspicion of criminal acts, and political forecasting. Systematic violations of the conjunction rule are observed in judgments of lay people and of experts in both between-subjects and within-subjects comparisons. Alternative interpretations of the conjunction fallacy are discussed and attempts to combat it are explored. Uncertainty is an unavoidable aspect of the human condition. Many significant choices must be based on beliefs about the likelihood ...
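The conjunction rule the abstract states is easy to check mechanically. A minimal simulation (the events A and B are illustrative choices of ours, not from the paper): the estimated frequency of A-and-B can never exceed the frequency of A or of B.

```python
import random

# Monte Carlo check of the conjunction rule: the estimate of P(A & B)
# never exceeds the estimates of P(A) or P(B), because every sample
# counted for the conjunction is also counted for each constituent.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]

a = [r % 2 == 0 for r in rolls]   # event A: the roll is even
b = [r >= 3 for r in rolls]       # event B: the roll is at least 3

p_a = sum(a) / len(rolls)
p_b = sum(b) / len(rolls)
p_ab = sum(x and y for x, y in zip(a, b)) / len(rolls)

print(p_ab <= min(p_a, p_b))      # True: the conjunction rule holds
```

The fallacy the paper documents is that intuitive judgment violates this inequality even though, as above, it holds by construction for any actual events.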
Perspectives on the Theory and Practice of Belief Functions
 International Journal of Approximate Reasoning
, 1990
Abstract

Cited by 86 (7 self)
The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answers to that question. The theory of belief functions is more flexible; it allows us to derive degrees of belief for a question from probabilities for a related question. These degrees of belief may or may not have the mathematical properties of probabilities; how much they differ from probabilities will depend on how closely the two questions are related. Examples of what we would now call belief-function reasoning can be found in the late seventeenth and early eighteenth centuries, well before Bayesian ideas were developed. In 1689, George Hooper gave rules for combining testimony that can be recognized as special cases of Dempster's rule for combining belief functions (Shafer 1986a). Similar rules were formulated by Jakob Bernoulli in his Ars Conjectandi, published posthumously in 1713, and by Johann Heinrich Lambert in his Neues Organon, published in 1764 (Shafer 1978). Examples of belief-function reasoning can also be found in more recent work, by authors
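Dempster's rule, mentioned in the abstract, combines two mass functions over a finite frame by multiplying masses of intersecting focal elements and renormalizing away the conflicting mass. A generic sketch with hypothetical mass values (our illustration, not an example from the paper):

```python
# Dempster's rule of combination for two mass functions over a finite
# frame of discernment. Focal elements are frozensets; masses sum to 1.
def combine(m1, m2):
    combined, conflict = {}, 0.0
    for s1, w1 in m1.items():
        for s2, w2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2     # mass assigned to the empty set
    k = 1.0 - conflict                  # Dempster's normalization constant
    return {s: w / k for s, w in combined.items()}

# Two pieces of evidence over the frame {rain, sun}:
frame = frozenset({"rain", "sun"})
m1 = {frozenset({"rain"}): 0.6, frame: 0.4}
m2 = {frozenset({"rain"}): 0.5, frozenset({"sun"}): 0.3, frame: 0.2}
m12 = combine(m1, m2)   # combined masses again sum to 1
```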
Toward normative expert systems: Part I. The pathfinder project
 Methods Inf. Med
, 1992
Abstract

Cited by 83 (15 self)
Pathfinder is an expert system that assists surgical pathologists with the diagnosis of lymph-node diseases. The program is one of a growing number of normative expert systems that use probability and decision theory to acquire, represent, manipulate, and explain uncertain medical knowledge. In this article, we describe Pathfinder and our research in uncertain-reasoning paradigms that was stimulated by the development of the program. We discuss limitations of early decision-theoretic methods for reasoning under uncertainty and our initial attempts to use non-decision-theoretic methods. Then, we describe experimental and theoretical results that directed us to return to reasoning methods based in probability and decision theory.
Two views of belief: Belief as generalized probability and belief as evidence
, 1992
Abstract

Cited by 72 (12 self)
Belief functions are mathematical objects defined to satisfy three axioms that look somewhat similar to the Kolmogorov axioms defining probability functions. We argue that there are (at least) two useful and quite different ways of understanding belief functions. The first is as a generalized probability function (which technically corresponds to the inner measure induced by a probability function). The second is as a way of representing evidence. Evidence, in turn, can be understood as a mapping from probability functions to probability functions. It makes sense to think of updating a belief if we think of it as a generalized probability. On the other hand, it makes sense to combine two beliefs (using, say, Dempster's rule of combination) only if we think of the belief functions as representing evidence. Many previous papers have pointed out problems with the belief function approach; the claim of this paper is that these problems can be explained as a consequence of confounding the...
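The "generalized probability" reading can be made concrete: given a mass function, belief in a set A totals the mass of focal elements contained in A (an inner-measure-style lower bound), while plausibility totals the mass of focal elements that intersect A (an upper bound). A small sketch with hypothetical masses:

```python
# Belief and plausibility induced by a mass function m whose focal
# elements are frozensets: Bel(A) sums the mass of focal elements
# contained in A; Pl(A) sums the mass of those intersecting A.
def bel(m, a):
    return sum(w for s, w in m.items() if s <= a)

def pl(m, a):
    return sum(w for s, w in m.items() if s & a)

frame = frozenset({"x", "y", "z"})
m = {frozenset({"x"}): 0.5, frozenset({"x", "y"}): 0.3, frame: 0.2}
a = frozenset({"x", "y"})
# Bel(A) = 0.5 + 0.3 = 0.8 lower-bounds the probability of A;
# Pl(A) = 0.5 + 0.3 + 0.2 = 1.0 upper-bounds it.
```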
A UNIFYING FIELD IN LOGICS: NEUTROSOPHIC LOGIC. NEUTROSOPHY, NEUTROSOPHIC SET, NEUTROSOPHIC PROBABILITY AND STATISTICS (fourth edition)
, 2005
Soft Computing: the Convergence of Emerging Reasoning Technologies
 Soft Computing
, 1997
Abstract

Cited by 50 (8 self)
The term Soft Computing (SC) represents the combination of emerging problem-solving technologies such as Fuzzy Logic (FL), Probabilistic Reasoning (PR), Neural Networks (NNs), and Genetic Algorithms (GAs). Each of these technologies provides us with complementary reasoning and searching methods to solve complex, real-world problems. After a brief description of each of these technologies, we will analyze some of their most useful combinations, such as the use of FL to control GA and NN parameters; the application of GAs to evolve NNs (topologies or weights) or to tune FL controllers; and the implementation of FL controllers as NNs tuned by backpropagation-type algorithms.
A Review of Rough Set Models
, 1997
Abstract

Cited by 48 (16 self)
Since the introduction of the theory of rough sets in the early eighties, considerable work has been done on the development and application of this new theory. The paper provides a review of the Pawlak rough set model and its extensions, with emphasis on the formulation, characterization, and interpretation of various rough set models.
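The Pawlak model the review covers can be sketched in a few lines: given a partition of the universe into indiscernibility classes, the lower approximation of a set X collects the classes wholly inside X, and the upper approximation collects the classes that intersect X. A minimal illustration with a hypothetical partition:

```python
# Pawlak rough set approximations. The partition plays the role of the
# indiscernibility (equivalence) classes; x is the target concept.
def lower_upper(partition, x):
    lower, upper = set(), set()
    for block in partition:
        if block <= x:        # block entirely inside x
            lower |= block
        if block & x:         # block overlaps x
            upper |= block
    return lower, upper

partition = [{1, 2}, {3, 4}, {5}]
x = {1, 2, 3}
lo, up = lower_upper(partition, x)
# lower = {1, 2}; upper = {1, 2, 3, 4}; boundary region = {3, 4}
```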
Robust Learning with Missing Data
, 1996
Abstract

Cited by 48 (5 self)
Bayesian methods are becoming increasingly popular in the development of intelligent machines. Bayesian Belief Networks (bbns) are nowadays a prominent reasoning method and, during the past few years, several efforts have been made to develop methods able to learn bbns directly from databases. However, all these methods assume that the database is complete or, at least, that unreported data are missing at random. Unfortunately, real-world databases are rarely complete and the "Missing at Random" assumption is often unrealistic. This paper shows that this assumption can dramatically affect the reliability of the learned bbn and introduces a robust method to learn conditional probabilities in a bbn which does not rely on this assumption. In order to drop this assumption, we have to change the overall learning strategy used by traditional Bayesian methods: our method bounds the set of all posterior probabilities consistent with the database and proceeds by refining this set as more i...
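The bounding idea can be illustrated in its simplest form (a sketch of the general principle, not the paper's actual learning algorithm): when some entries are missing and we refuse the missing-at-random assumption, a conditional frequency is only identified up to the interval obtained by treating every missing value first as a failure, then as a success.

```python
# Interval bound on a relative frequency under non-ignorable missing
# data: the missing entries are allocated worst-case in each direction.
# This sketches the bounding principle only, not the paper's method.
def probability_bounds(successes, failures, missing):
    n = successes + failures + missing
    low = successes / n                  # all missing were failures
    high = (successes + missing) / n     # all missing were successes
    return low, high

lo, hi = probability_bounds(successes=30, failures=50, missing=20)
# (0.3, 0.5): any point estimate inside this interval is consistent
# with the incomplete database.
```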
Approximation Algorithms and Decision Making in the Dempster-Shafer Theory of Evidence: An Empirical Study
 International Journal of Approximate Reasoning
, 1996
Abstract

Cited by 47 (0 self)
The computational complexity of reasoning within the Dempster-Shafer theory of evidence is one of the major points of criticism this formalism has to face. To overcome this difficulty various approximation algorithms have been suggested that aim at reducing the number of focal elements in the belief functions involved. This article reviews a number of algorithms based on this method and introduces a new one, the D1 algorithm, that was designed to bring about minimal deviations in those values that are relevant to decision making. It describes an empirical study that examines the appropriateness of these approximation procedures in decision-making situations. It presents and interprets the empirical findings along several dimensions and discusses the various trade-offs that have to be taken into account when actually applying one of these methods. The complexity of the computations that have to be carried out in the Dempster-Shafer theory of evidence (DST) [3, 10] i...
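The general shape of the reduction strategies reviewed here can be sketched generically (this is an illustrative scheme, not the D1 algorithm itself): keep only the k focal elements carrying the most mass and renormalize, trading accuracy for a smaller representation.

```python
# A generic focal-element reduction in the spirit of the approximation
# algorithms the article reviews (not the D1 algorithm): retain the k
# highest-mass focal elements and rescale the surviving masses to 1.
def reduce_focal_elements(m, k):
    kept = dict(sorted(m.items(), key=lambda kv: kv[1], reverse=True)[:k])
    total = sum(kept.values())
    return {s: w / total for s, w in kept.items()}

m = {frozenset({"a"}): 0.4, frozenset({"b"}): 0.3,
     frozenset({"a", "b"}): 0.2, frozenset({"c"}): 0.1}
m2 = reduce_focal_elements(m, 2)
# two focal elements remain; their masses are rescaled to sum to 1
```

Combining belief functions costs time proportional to the product of their focal-element counts, which is why shrinking that count is the natural target for approximation.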
A New Approach to Updating Beliefs
 Uncertainty in Artificial Intelligence
, 1991
Abstract

Cited by 47 (6 self)
We define a new notion of conditional belief, which plays the same role for Dempster-Shafer belief functions as conditional probability does for probability functions. Our definition is different from the standard definition given by Dempster, and avoids many of the well-known problems of that definition. Just as the conditional probability Pr(·|B) is a probability function which is the result of conditioning on B being true, so too our conditional belief function Bel(·|B) is a belief function which is the result of conditioning on B being true. We define the conditional belief as the lower envelope (that is, the inf) of a family of conditional probability functions, and provide a closed-form expression for it. An alternate way of understanding our definition of conditional belief is provided by considering ideas from an earlier paper [FH91], where we connect belief functions with inner measures. In particular, we show here how to extend the definition of conditional pro...
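If we take the closed form of this lower-envelope conditioning to be Bel(A|B) = Bel(A&B) / (Bel(A&B) + Pl(not-A & B)), it can be computed directly from a mass function. A sketch with hypothetical masses (our illustration, and our reading of the closed form, not the paper's example):

```python
# Lower-envelope conditioning of a belief function, assuming the
# closed form Bel(A|B) = Bel(A&B) / (Bel(A&B) + Pl(comp(A)&B)).
def bel(m, a):
    return sum(w for s, w in m.items() if s <= a)

def pl(m, a):
    return sum(w for s, w in m.items() if s & a)

def cond_bel(m, frame, a, b):
    num = bel(m, a & b)
    return num / (num + pl(m, (frame - a) & b))

frame = frozenset({"1", "2", "3"})
m = {frozenset({"1"}): 0.4, frozenset({"1", "2"}): 0.3, frame: 0.3}
p = cond_bel(m, frame, frozenset({"1"}), frozenset({"1", "2"}))
# Bel({1}&B) = 0.4; Pl({2,3}&B) = Pl({2}) = 0.6; so Bel(A|B) = 0.4
```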