Results 1–10 of 32
Fundamental Concepts of Qualitative Probabilistic Networks
 Artificial Intelligence
, 1990
Abstract

Cited by 119 (6 self)
Graphical representations for probabilistic relationships have recently received considerable attention in AI. Qualitative probabilistic networks abstract from the usual numeric representations by encoding only qualitative relationships, which are inequality constraints on the joint probability distribution over the variables. Although these constraints are insufficient to determine probabilities uniquely, they are designed to justify the deduction of a class of relative likelihood conclusions that imply useful decision-making properties. Two types of qualitative relationship are defined, each a probabilistic form of monotonicity constraint over a group of variables. Qualitative influences describe the direction of the relationship between two variables. Qualitative synergies describe interactions among influences. The probabilistic definitions chosen justify sound and efficient inference procedures based on graphical manipulations of the network. These procedures answer queries about qualitative relationships among variables separated in the network and determine structural properties of optimal assignments to decision variables.
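The qualitative inference the abstract describes rests on a small sign calculus. The sketch below follows the usual conventions for chaining influences along a path (often written ⊗) and combining parallel paths (often written ⊕); the tables are standard, but the example network and all names are invented for illustration:

```python
# Sign algebra for qualitative probabilistic networks (illustrative sketch).
# Signs: '+' positive influence, '-' negative, '0' none, '?' unknown.

def sign_product(a, b):
    """Chain two influences along a path (the usual 'x' / tensor rule)."""
    if a == '0' or b == '0':
        return '0'
    if a == '?' or b == '?':
        return '?'
    return '+' if a == b else '-'

def sign_sum(a, b):
    """Combine influences arriving via parallel paths (the usual '+' rule)."""
    if a == '0':
        return b
    if b == '0':
        return a
    if a == '?' or b == '?':
        return '?'
    return a if a == b else '?'

# Hypothetical network: A influences D via A->B->D (+,+) and A->C->D (+,-).
path1 = sign_product('+', '+')   # '+'
path2 = sign_product('+', '-')   # '-'
net = sign_sum(path1, path2)     # '?': the qualitative constraints alone
                                 # cannot resolve the trade-off
```

The `'?'` result illustrates the abstract's point: inequality constraints justify sound relative-likelihood conclusions where the signs agree, and honestly report ambiguity where they conflict.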
Uncertainty, Belief, and Probability
 Computational Intelligence
, 1989
Abstract

Cited by 46 (2 self)
We introduce a new probabilistic approach to dealing with uncertainty, based on the observation that probability theory does not require that every event be assigned a probability. For a nonmeasurable event (one to which we do not assign a probability), we can talk about only the inner measure and outer measure of the event. In addition to removing the requirement that every event be assigned a probability, our approach circumvents other criticisms of probability-based approaches to uncertainty. For example, the measure of belief in an event turns out to be represented by an interval (defined by the inner and outer measure), rather than by a single number. Further, this approach allows us to assign a belief (inner measure) to an event E without committing to a belief about its negation ¬E (since the inner measure of an event plus the inner measure of its negation is not necessarily one). Interestingly enough, inner measures induced by probability measures turn out to correspond in a ...
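The interval reading of belief can be made concrete with a toy space; the partition, probabilities, and event below are invented for illustration, not taken from the paper:

```python
# Toy probability space: measurable sets are unions of the partition
# blocks {1,2} and {3,4}; the event E = {2,3} is nonmeasurable.
blocks = [frozenset({1, 2}), frozenset({3, 4})]
prob = {frozenset({1, 2}): 0.6, frozenset({3, 4}): 0.4}

def inner_measure(event):
    """Probability of the largest measurable set contained in the event."""
    return sum(prob[b] for b in blocks if b <= event)

def outer_measure(event):
    """Probability of the smallest measurable set containing the event."""
    return sum(prob[b] for b in blocks if b & event)

E = {2, 3}
not_E = {1, 4}
inner_measure(E)                         # 0: no block fits inside E
outer_measure(E)                         # 1.0: both blocks meet E
inner_measure(E) + inner_measure(not_E)  # 0, strictly less than 1
```

Belief in E is the interval [0, 1] here, and the inner measures of E and ¬E sum to less than one, exactly the non-commitment the abstract describes.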
Interval-Valued Probabilities
, 1998
Abstract

Cited by 21 (1 self)
a′/h′ in the diagram. The sawtooth line reflects the fact that even when the principle of indifference can be applied, there may be arguments whose strength can be bounded no more precisely than by an adjacent pair of indifference arguments. Note that a/h in the diagram is bounded numerically only by 0.0 and the strength of a″/h″. Keynes' ideas were taken up by B. O. Koopman [14, 15, 16], who provided an axiomatization for Keynes' probability values. The axioms are qualitative, and reflect what Keynes said about probability judgment. (It should be remembered that for Keynes probability judgment was intended to be objective in the sense that logic is objective. Although different people may accept different premises, whether or not a conclusion follows logically from a given set of premises is objective. Though Ramsey [26] attacked this aspect of Keynes' theory, it can be argued ...
Elementary Non-Archimedean Representations of Probability for Decision Theory and Games
 Suppes: Scientific Philosopher, Vol. I: Probability and Probabilistic Causality
, 1994
Abstract

Cited by 20 (5 self)
1992 version is intended as a contribution to a two-volume collection honouring Patrick Suppes, to be edited by Paul Humphreys and published by Kluwer Academic Publishers. ABSTRACT. In an extensive form game, whether a player has a better strategy than in a presumed equilibrium depends on the other players’ equilibrium reactions to a counterfactual deviation. To allow conditioning on counterfactual events with prior probability zero, extended probabilities are proposed and given the four equivalent characterizations: (i) complete conditional probability systems; (ii) lexicographic hierarchies of probabilities; (iii) extended logarithmic likelihood ratios; and (iv) certain ‘canonical rational probability functions’ representing ‘trembles’ directly as infinitesimal probabilities. However, having joint probability distributions be uniquely determined by independent marginal probability distributions requires general probabilities taking values in a space no smaller than the non-Archimedean ordered field whose members are rational functions of a particular infinitesimal. Elinor now found the difference between the expectation of an unpleasant event, however certain the mind may be told to consider it, and certainty itself. — Jane Austen, Sense and Sensibility, ch. 48. ... a more attractive and manageable theory may result from a non-Archimedean representation. ... One must keep in mind the fact that the refutability of axioms depends both on their mathematical form and their empirical interpretation. — Krantz, Luce, Suppes and Tversky (1971, p. 29).
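Representation (ii), a lexicographic hierarchy of probabilities, can be sketched in a few lines: conditioning on a zero-probability event falls through to the next level of the hierarchy instead of being undefined. The state names and the two-level hierarchy below are invented for illustration:

```python
# A lexicographic probability hierarchy over three states (illustrative).
# Level 0 gives state 'c' probability zero; level 1 encodes beliefs
# conditional on that "tremble" actually occurring.
levels = [
    {'a': 0.5, 'b': 0.5, 'c': 0.0},   # primary beliefs
    {'a': 0.0, 'b': 0.0, 'c': 1.0},   # secondary (tremble) beliefs
]

def lex_condition(levels, event):
    """Condition on an event using the first level that assigns it
    positive probability (names and data are hypothetical)."""
    for p in levels:
        total = sum(p[s] for s in event)
        if total > 0:
            return {s: p[s] / total for s in event}
    raise ValueError("event has probability zero at every level")

lex_condition(levels, {'c'})          # {'c': 1.0}: well-defined despite
                                      # P(c) = 0 at the primary level
lex_condition(levels, {'a', 'c'})     # {'a': 1.0, 'c': 0.0}
```

This is the sense in which the hierarchy extends ordinary conditional probability: every nonempty event gets a conditional distribution, as a complete conditional probability system requires.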
Belief Maintenance with Probabilistic Logic
 In Proceedings of the AAAI Fall Symposium on Automated Deduction in Nonstandard Logics
, 1993
Abstract

Cited by 13 (9 self)
Belief maintenance systems are natural extensions of truth maintenance systems that use probabilities rather than Boolean truth-values. This paper introduces a general method for belief maintenance, based on (the propositional fragment of) probabilistic logic, that extends the Boolean Constraint Propagation method used by the logic-based truth maintenance systems. From the concept of probabilistic entailment, we derive a set of constraints on the (probabilistic) truth-values of propositions and we prove their soundness. These constraints are complete with respect to a well-defined set of clauses, and their partial incompleteness is compensated by a gain in computational efficiency.

1 Introduction

Truth maintenance systems (TMSs) are independent reasoning modules which incrementally maintain the beliefs of a general problem-solving system, enabling it to reason with temporary assumptions in the presence of incomplete information. The concept of truth maintenance system is due to Doyle ...
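The flavour of constraint such a system propagates can be illustrated with the standard Fréchet bounds on conjunctions and disjunctions of propositions with interval truth-values. This is a generic probabilistic-logic sketch, not the paper's specific constraint set; all names and intervals are illustrative:

```python
# Frechet bounds: tightest intervals for compound propositions that are
# consistent with the given intervals, with no independence assumption.

def and_bounds(a, b):
    """Interval for P(A and B) given intervals for P(A) and P(B)."""
    (la, ua), (lb, ub) = a, b
    return (max(0.0, la + lb - 1.0), min(ua, ub))

def or_bounds(a, b):
    """Interval for P(A or B) given intervals for P(A) and P(B)."""
    (la, ua), (lb, ub) = a, b
    return (max(la, lb), min(1.0, ua + ub))

pa = (0.7, 0.9)          # hypothetical truth-value interval for A
pb = (0.6, 0.8)          # hypothetical truth-value interval for B
and_bounds(pa, pb)       # approx (0.3, 0.8)
or_bounds(pa, pb)        # (0.7, 1.0)
```

Propagating such interval constraints through clauses is cheaper than full probabilistic entailment, which is the soundness-for-efficiency trade-off the abstract mentions.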
Lp, A Logic for Representing and Reasoning with Statistical Knowledge
, 1990
Abstract

Cited by 11 (0 self)
This paper presents a logical formalism for representing and reasoning with statistical knowledge. One of the key features of the formalism is its ability to deal with qualitative statistical information. It is argued that statistical knowledge, especially that of a qualitative nature, is an important component of our world knowledge and that such knowledge is used in many different reasoning tasks. The work is further motivated by the observation that previous formalisms for representing probabilistic information are inadequate for representing statistical knowledge. The representation mechanism takes the form of a logic that is capable of representing a wide variety of statistical knowledge, and that possesses an intuitive formal semantics based on the simple notions of sets of objects and probabilities defined over those sets. Furthermore, a proof theory is developed and is shown to be sound and complete. The formalism offers a perspicuous and powerful representational tool for stat...
Some Varieties of Qualitative Probability
 Proceedings of the 5th International Conference on Information Processing and the Management of Uncertainty
, 1994
Abstract

Cited by 9 (1 self)
In this essay I present a general characterization of qualitative probability, including a partial taxonomy of possible approaches. I discuss some of these in further depth, identify central issues, and suggest some general comparisons.

1. Introduction

In the standard theory of probability, degrees of belief for events or propositions take values in the real interval [0,1]. From degrees of belief on the primitive propositions, the theory dictates degrees of belief for various compound and conditional propositions, and vice versa. Computational schemes for probabilistic reasoning apply this theory to the automated derivation of degrees of belief for designated propositions of interest given prespecified degrees of belief over some other propositions and some particular conditioning propositions observed or hypothesized. This approach has, among other advantages, those accruing to a well-understood and powerful underlying theory. Despite these virtues, many have objected to the straig...
Significance Tests, Belief Calculi, and Burden of Proof in Legal and Scientific Discourse
 Laptec’03, Frontiers in Artificial Intelligence and its Applications
, 2003
Abstract

Cited by 7 (7 self)
We review the definition of the Full Bayesian Significance Test (FBST), and summarize its main statistical and epistemological characteristics.
Arbitrage, rationality, and equilibrium
 Theory and Decision
, 1991
Abstract

Cited by 6 (0 self)
ABSTRACT. No-arbitrage is the fundamental principle of economic rationality which unifies normative decision theory, game theory, and market theory. In economic environments where money is available as a medium of measurement and exchange, no-arbitrage is synonymous with subjective expected utility maximization in personal decisions, competitive equilibria in capital markets and exchange economies, and correlated equilibria in noncooperative games. The arbitrage principle directly characterizes rationality at the market level; the appearance of deliberate optimization by individual agents is a consequence of their adaptation to the market. Concepts of equilibrium behavior in games and markets can thus be reconciled with the ideas that individual rationality is bounded, that agents use evolutionarily shaped decision rules rather than numerical optimization algorithms, and that personal probabilities and utilities are inseparable and to some extent indeterminate. Risk-neutral probability distributions, interpretable as products of probabilities and marginal utilities, play a central role as observable quantities in economic systems.
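The link between no-arbitrage and risk-neutral probabilities can be illustrated with the textbook two-state market; this sketch is not taken from the paper, and all numbers are invented:

```python
# Two-state market: an asset costing `price` pays `up` in one state and
# `down` in the other; r is the riskless rate. Absence of arbitrage is
# equivalent to the existence of a risk-neutral probability q in (0, 1).

def risk_neutral_prob(price, up, down, r=0.0):
    """q such that the discounted expected payoff equals the price:
    price = (q * up + (1 - q) * down) / (1 + r)."""
    return ((1 + r) * price - down) / (up - down)

q = risk_neutral_prob(price=100.0, up=120.0, down=90.0)  # approx 0.333
no_arbitrage = 0.0 < q < 1.0                             # True here
```

If q fell outside (0, 1), the asset would be dominated by (or would dominate) riskless lending, yielding an arbitrage; the observable q plays the role of the probability-times-marginal-utility product the abstract describes.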