Results 1–10 of 192
The Transferable Belief Model
 ARTIFICIAL INTELLIGENCE
, 1994
Abstract

Cited by 371 (13 self)
We describe the transferable belief model, a model for representing quantified beliefs based on belief functions. Beliefs can be held at two levels: (1) a credal level where beliefs are entertained and quantified by belief functions, (2) a pignistic level where beliefs can be used to make decisions and are quantified by probability functions. The relation between the belief function and the probability function when decisions must be made is derived and justified. Four paradigms are analyzed in order to compare Bayesian, upper and lower probability, and the transferable belief approaches.
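The credal-to-pignistic step the abstract describes can be sketched numerically. This is a minimal illustration of the standard pignistic transformation, BetP(x) = Σ over subsets A containing x of m(A)/|A| (assuming no mass on the empty set); the frame and mass values below are invented, not taken from the paper.

```python
# Minimal sketch of the pignistic transformation: a mass function over
# subsets of a frame is turned into a probability function for decisions.
# Frame elements and mass values are invented for illustration.

def pignistic(mass):
    """mass: dict mapping frozenset (subset of the frame) -> mass, summing to 1."""
    betp = {}
    for subset, m in mass.items():
        share = m / len(subset)  # spread the subset's mass evenly over its elements
        for x in subset:
            betp[x] = betp.get(x, 0.0) + share
    return betp

mass = {
    frozenset({"a"}): 0.5,           # belief committed exactly to {a}
    frozenset({"a", "b"}): 0.3,      # belief we cannot split between a and b
    frozenset({"a", "b", "c"}): 0.2, # total ignorance over the frame
}
print(pignistic(mass))  # BetP(a) ≈ 0.717, BetP(b) ≈ 0.217, BetP(c) ≈ 0.067
```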
Toward a Logic for Qualitative Decision Theory
 In Proceedings of KR'94
, 1992
Abstract

Cited by 196 (4 self)
We present a logic for representing and reasoning with qualitative statements of preference and normality and describe how these may interact in decision making under uncertainty. Our aim is to develop a logical calculus that employs the basic elements of classical decision theory, namely probabilities, utilities and actions, but exploits qualitative information about these elements directly for the derivation of goals. Preferences and judgements of normality are captured in a modal/conditional logic, and a simple model of action is incorporated. Without quantitative information, decision criteria other than maximum expected utility are pursued. We describe how techniques for conditional default reasoning can be used to complete information about both preferences and normality judgements, and we show how maximin and maximax strategies can be expressed in our logic.
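The maximin and maximax criteria mentioned at the end of the abstract can be illustrated with a toy decision problem. This is not the paper's logical calculus, just the underlying decision rules; the actions, worlds, and utilities are invented.

```python
# Toy illustration of the maximin and maximax decision criteria
# (actions, possible worlds, and utilities are invented).

def maximin(actions):
    """Choose the action whose worst-case utility is highest."""
    return max(actions, key=lambda a: min(actions[a]))

def maximax(actions):
    """Choose the action whose best-case utility is highest."""
    return max(actions, key=lambda a: max(actions[a]))

# Utilities of each action across two possible worlds (rain, sun).
actions = {
    "umbrella": [5, 5],
    "no_umbrella": [0, 10],
}
print(maximin(actions))  # umbrella: its worst case (5) beats 0
print(maximax(actions))  # no_umbrella: its best case (10) beats 5
```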
Rationality and its Roles in Reasoning
 Computational Intelligence
, 1994
Abstract

Cited by 109 (4 self)
The economic theory of rationality promises to equal mathematical logic in its importance for the mechanization of reasoning. We survey the growing literature on how the basic notions of probability, utility, and rational choice, coupled with practical limitations on information and resources, influence the design and analysis of reasoning and representation systems.

1 Introduction

People make judgments of rationality all the time, usually in criticizing someone else's thoughts or deeds as irrational, or in defending their own as rational. Artificial intelligence researchers construct systems and theories to perform or describe rational thought and action, criticizing and defending these systems and theories in terms similar to but more formal than those of the man or woman on the street. Judgments of human rationality commonly involve several different conceptions of rationality, including a logical conception used to judge thoughts, and an economic one used to judge actions or...
Toward normative expert systems: Part I. The Pathfinder project
 Methods Inf. Med
, 1992
Abstract

Cited by 83 (15 self)
Pathfinder is an expert system that assists surgical pathologists with the diagnosis of lymph-node diseases. The program is one of a growing number of normative expert systems that use probability and decision theory to acquire, represent, manipulate, and explain uncertain medical knowledge. In this article, we describe Pathfinder and our research in uncertain-reasoning paradigms that was stimulated by the development of the program. We discuss limitations with early decision-theoretic methods for reasoning under uncertainty and our initial attempts to use non-decision-theoretic methods. Then, we describe experimental and theoretical results that directed us to return to reasoning methods based in probability and decision theory.
Provably Bounded-Optimal Agents
 Journal of Artificial Intelligence Research
, 1995
Abstract

Cited by 79 (1 self)
Since its inception, artificial intelligence has relied upon a theoretical foundation centred around perfect rationality as the desired property of intelligent systems. We argue, as others have done, that this foundation is inadequate because it imposes fundamentally unsatisfiable requirements. As a result, there has arisen a wide gap between theory and practice in AI, hindering progress in the field. We propose instead a property called bounded optimality. Roughly speaking, an agent is bounded-optimal if its program is a solution to the constrained optimization problem presented by its architecture and the task environment. We show how to construct agents with this property for a simple class of machine architectures in a broad class of real-time environments. We illustrate these results using a simple model of an automated mail sorting facility. We also define a weaker property, asymptotic bounded optimality (ABO), that generalizes the notion of optimality in classical complexity th...
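As a rough illustration of the definition in the abstract, bounded optimality picks the best program among those the architecture can actually run, rather than the best program in the abstract. A toy sketch, with all program names, feasibility flags, and utility values invented:

```python
# Bounded optimality as constrained optimization (all values invented):
# the agent's program must be feasible on the given architecture, and
# among feasible programs we maximize expected utility in the environment.
programs = {
    "exhaustive_search": {"fits_architecture": False, "expected_utility": 10.0},
    "greedy": {"fits_architecture": True, "expected_utility": 6.0},
    "anytime_search": {"fits_architecture": True, "expected_utility": 8.0},
}

# Restrict to programs the architecture can run, then take the best one.
feasible = {p: v for p, v in programs.items() if v["fits_architecture"]}
best = max(feasible, key=lambda p: feasible[p]["expected_utility"])
print(best)  # anytime_search: best among feasible, even though a perfectly
             # rational (but infeasible) program would score higher
```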
Rationality For Economists?
 JOURNAL OF RISK AND UNCERTAINTY
, 1998
Abstract

Cited by 73 (5 self)
Rationality is a complex behavioral theory that can be parsed into statements about preferences, perceptions, and process. This paper looks at the evidence on rationality that is provided by behavioral experiments, and argues that most cognitive anomalies operate through errors in perception that arise from the way information is stored, retrieved, and processed, or through errors in process that lead to formulation of choice problems as cognitive tasks that are inconsistent at least with rationality narrowly defined. The paper discusses how these cognitive anomalies influence economic behavior and measurement, and their implications for economic analysis.
Betting on Theories
, 1993
Abstract

Cited by 70 (4 self)
Predictions about the future and unrestricted universal generalizations are never logically implied by our observational evidence, which is limited to particular facts in the present and past. Nevertheless, propositions of these and other kinds are often said to be confirmed by observational evidence. A natural place to begin the study of confirmation theory is to consider what it means to say that some evidence E confirms a hypothesis H.

Incremental and absolute confirmation

Let us say that E raises the probability of H if the probability of H given E is higher than the probability of H not given E. According to many confirmation theorists, “E confirms H” means that E raises the probability of H. This conception of confirmation will be called incremental confirmation. Let us say that H is probable given E if the probability of H given E is above some threshold. (This threshold remains to be specified but is assumed to be at least one half.) According to some confirmation theorists, “E confirms H” means that H is probable given E. This conception of confirmation will be called absolute confirmation.

Confirmation theorists have sometimes failed to distinguish these two concepts. For example, Carl Hempel in his classic “Studies in the Logic of Confirmation” endorsed the following principles:

(1) A generalization of the form “All F are G” is confirmed by the evidence that there is an individual that is both F and G.
(2) A generalization of that form is also confirmed by the evidence that there is an individual that is neither F nor G.
(3) The hypotheses confirmed by a piece of evidence are consistent with one another.
(4) If E confirms H then E confirms every logical consequence of H.

Principles (1) and (2) are not true of absolute confirmation. Observation of a single thing that is F and G cannot in general make it probable that all F are G; likewise for an individual that is neither
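The two conceptions distinguished in the abstract can be checked numerically. A small sketch with an invented probability assignment: E incrementally confirms H iff P(H|E) > P(H), while E absolutely confirms H iff P(H|E) exceeds a threshold (taken here to be 1/2):

```python
# Numeric sketch of incremental vs. absolute confirmation
# (the probability values are invented for illustration).

p_h = 0.2         # prior probability of hypothesis H
p_e = 0.3         # probability of evidence E
p_h_and_e = 0.12  # joint probability of H and E

p_h_given_e = p_h_and_e / p_e  # P(H | E) = 0.4

print(p_h_given_e > p_h)  # True: E raises H's probability (incremental)
print(p_h_given_e > 0.5)  # False: H is still not probable given E (absolute)
```

The example shows the two notions coming apart: the same evidence confirms H in the incremental sense without confirming it in the absolute sense.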
Uncertainty, Belief, and Probability
 Computational Intelligence
, 1989
Abstract

Cited by 46 (2 self)
We introduce a new probabilistic approach to dealing with uncertainty, based on the observation that probability theory does not require that every event be assigned a probability. For a nonmeasurable event (one to which we do not assign a probability), we can talk about only the inner measure and outer measure of the event. In addition to removing the requirement that every event be assigned a probability, our approach circumvents other criticisms of probability-based approaches to uncertainty. For example, the measure of belief in an event turns out to be represented by an interval (defined by the inner and outer measure), rather than by a single number. Further, this approach allows us to assign a belief (inner measure) to an event E without committing to a belief about its negation ¬E (since the inner measure of an event plus the inner measure of its negation is not necessarily one). Interestingly enough, inner measures induced by probability measures turn out to correspond in a ...
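The belief interval the abstract describes can be sketched on a finite space. Assuming probability is assigned only to the blocks of a partition (the blocks and masses below are invented), the inner measure of an event sums the blocks contained in it, and the outer measure sums the blocks that intersect it:

```python
# Sketch of inner and outer measures on a finite space (partition and
# masses invented). Probability is defined only on blocks of a partition;
# an arbitrary event gets an interval [inner, outer] rather than a number.

def inner(blocks, event):
    return sum(p for b, p in blocks if b <= event)  # blocks inside the event

def outer(blocks, event):
    return sum(p for b, p in blocks if b & event)   # blocks meeting the event

blocks = [
    (frozenset({1, 2}), 0.5),
    (frozenset({3}), 0.3),
    (frozenset({4, 5}), 0.2),
]
E = frozenset({1, 2, 4})
not_E = frozenset({3, 5})

print(inner(blocks, E), outer(blocks, E))  # 0.5 0.7
# As the abstract notes, inner(E) + inner(not E) need not be 1:
print(inner(blocks, E) + inner(blocks, not_E))  # 0.8
```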
Bayesian Epistemology
, 2003
Abstract

Cited by 39 (6 self)
Bayesian epistemology addresses epistemological problems with the help of the mathematical theory of probability. It turns out that the probability calculus is especially suited to represent degrees of belief (credences) and to deal with questions of belief change, confirmation, evidence, justification, and coherence.
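The core machinery the abstract alludes to, representing degrees of belief and changing them by conditionalization, can be sketched as follows (hypothesis names and all numbers are invented):

```python
# Minimal sketch of credences and belief change by conditionalization.

def conditionalize(prior, likelihood):
    """Bayes' rule: posterior(H) is proportional to prior(H) * P(E | H)."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

prior = {"H1": 0.5, "H2": 0.5}       # initial degrees of belief (credences)
likelihood = {"H1": 0.8, "H2": 0.2}  # P(E | H) for each hypothesis

posterior = conditionalize(prior, likelihood)
print(posterior)  # H1's credence rises from 0.5 to 0.8 on learning E
```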