Results 1–10 of 81
Evolutionary Game Theory
, 1995
"... Abstract. Experimentalists frequently claim that human subjects in the laboratory violate gametheoretic predictions. It is here argued that this claim is usually premature. The paper elaborates on this theme by way of raising some conceptual and methodological issues in connection with the very def ..."
Abstract

Cited by 663 (8 self)
 Add to MetaCart
Abstract. Experimentalists frequently claim that human subjects in the laboratory violate game-theoretic predictions. It is here argued that this claim is usually premature. The paper elaborates on this theme by raising some conceptual and methodological issues in connection with the very definition of a game and of players' preferences, in particular with respect to potential context dependence, interpersonal preference dependence, backward induction and incomplete information.
Probabilistic Default Reasoning with Conditional Constraints
 Ann. Math. Artif. Intell.
, 2000
"... We present an approach to reasoning from statistical and subjective knowledge, which is based on a combination of probabilistic reasoning from conditional constraints with approaches to default reasoning from conditional knowledge bases. More precisely, we introduce the notions of , lexicographic, ..."
Abstract

Cited by 35 (20 self)
 Add to MetaCart
We present an approach to reasoning from statistical and subjective knowledge, based on a combination of probabilistic reasoning from conditional constraints with approaches to default reasoning from conditional knowledge bases. More precisely, we introduce the notions of z-, lexicographic, and conditional entailment for conditional constraints, which are probabilistic generalizations of Pearl's entailment in system Z, Lehmann's lexicographic entailment, and Geffner's conditional entailment, respectively. We show that the new formalisms have nice properties. In particular, they behave similarly to reference-class reasoning in a number of uncontroversial examples. The new formalisms, however, also avoid many drawbacks of reference-class reasoning. More precisely, they can handle complex scenarios and even purely probabilistic subjective knowledge as input. Moreover, conclusions are drawn in a global way from all the available knowledge as a whole. We then show that the new formalisms also have nice general nonmonotonic properties. In detail, the new notions of z-, lexicographic, and conditional entailment have properties similar to their classical counterparts. In particular, they all satisfy the rationality postulates proposed by Kraus, Lehmann, and Magidor, and they have some general irrelevance and direct inference properties. Moreover, the new notions of z- and lexicographic entailment satisfy the property of rational monotonicity. Furthermore, the new notions of z-, lexicographic, and conditional entailment are proper generalizations of both their classical counterparts and the classical notion of logical entailment for conditional constraints. Finally, we provide algorithms for reasoning under the new formalisms, and we analyze their computational complexity.
Generalized Qualitative Probability: Savage revisited
 In: Proceedings of the 12th Conference on Uncertainty in Artificial Intelligence, UAI’96
, 1996
"... Preferences among acts are analyzed in the style of L. Savage, but as partially ordered. The rationality postulates considered are weaker than Savage's on three counts. The Sure Thing Principle is derived in this setting. The postulates are shown to lead to a characterization of generaliz ..."
Abstract

Cited by 32 (2 self)
 Add to MetaCart
Preferences among acts are analyzed in the style of L. Savage, but as partially ordered. The rationality postulates considered are weaker than Savage's on three counts. The Sure Thing Principle is derived in this setting. The postulates are shown to lead to a characterization of generalized qualitative probability that includes and blends both traditional qualitative probability and the ranked structures used in logical approaches.
The power of paradox: some recent developments in interactive epistemology
 International Journal of Game Theory
, 2007
"... Abstract Paradoxes of gametheoretic reasoning have played an important role in spurring developments in interactive epistemology, the area in game theory that studies the role of the players ’ beliefs, knowledge, etc. This paper describes two such paradoxes – one concerning backward induction, the ..."
Abstract

Cited by 26 (2 self)
 Add to MetaCart
Paradoxes of game-theoretic reasoning have played an important role in spurring developments in interactive epistemology, the area in game theory that studies the role of the players' beliefs, knowledge, etc. This paper describes two such paradoxes: one concerning backward induction, the other iterated weak dominance. We start with the basic epistemic condition of “rationality and common belief of rationality” in a game, describe various ‘refinements’ of this condition that have been proposed, and explain how these refinements resolve the two paradoxes. We will see that a unified epistemic picture of game theory emerges. We end with some new foundational questions uncovered by the epistemic program.
Nonadditive beliefs and strategic equilibria
 Games and Economic Behavior
, 2000
"... This paper studies nplayer games where players ’ beliefs about their opponents’ behaviour are modelled as nonadditive probabilities. The concept of an “equilibrium under uncertainty ” which is introduced in this paper extends the equilibrium notion of Dow and Werlang (1994, J. Econom. Theory 64, ..."
Abstract

Cited by 25 (6 self)
 Add to MetaCart
This paper studies n-player games where players' beliefs about their opponents' behaviour are modelled as non-additive probabilities. The concept of an “equilibrium under uncertainty” introduced in this paper extends the equilibrium notion of Dow and Werlang (1994, J. Econom. Theory 64, 305–324) to n-player games in strategic form. Existence of such an equilibrium is demonstrated under usual conditions. For low degrees of ambiguity, equilibria under uncertainty approximate Nash equilibria. At the other extreme, with a low degree of confidence, maximin equilibria appear. Finally, robustness against a lack of confidence may be viewed as a refinement for Nash equilibria.
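The non-additive beliefs in this entry are capacities: set functions that need not be additive across disjoint events. As a rough illustration only (not the paper's equilibrium construction), the sketch below evaluates an act by its Choquet expected utility under a made-up two-state capacity; all names and numbers are invented for the example.

```python
def choquet(utility, capacity):
    """Choquet expected utility of an act with respect to a capacity.

    utility: dict mapping state -> payoff (assumed non-negative here).
    capacity: dict mapping frozenset of states -> weight in [0, 1],
    monotone, with capacity(all states) = 1; it need not be additive.
    """
    # Rank states from best to worst outcome for this act.
    states = sorted(utility, key=utility.get, reverse=True)
    total = 0.0
    for i, s in enumerate(states):
        u_i = utility[s]
        u_next = utility[states[i + 1]] if i + 1 < len(states) else 0.0
        top_i = frozenset(states[: i + 1])  # the i+1 best states
        total += (u_i - u_next) * capacity[top_i]
    return total

# A capacity expressing ambiguity aversion: each singleton gets weight 0.3,
# so the two weights sum to less than capacity({"a", "b"}) = 1.
utility = {"a": 10.0, "b": 4.0}
capacity = {frozenset({"a"}): 0.3, frozenset({"b"}): 0.3,
            frozenset({"a", "b"}): 1.0}
print(choquet(utility, capacity))
```

Under an additive fifty-fifty belief the expectation would be 7.0; the capacity's "missing" probability mass depresses the value of the ambiguous act (here to 5.8), which is the kind of effect driving the paper's low-confidence equilibria.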
Qualitative decision under uncertainty: back to expected utility
 In Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI’03
, 2003
"... Different qualitative models have been proposed for decision under uncertainty in Artificial Intelligence, but they generally fail to satisfy the principle of strict Pareto dominance or principle of "efficiency", in contrast to the classical numerical criterion — expected utility. ..."
Abstract

Cited by 20 (3 self)
 Add to MetaCart
Different qualitative models have been proposed for decision under uncertainty in Artificial Intelligence, but they generally fail to satisfy the principle of strict Pareto dominance, or "efficiency", in contrast to the classical numerical criterion of expected utility. In [Dubois and Prade, 1995] qualitative criteria based on possibility theory have been proposed that are appealing but inefficient in the above sense. The question is whether it is possible to reconcile possibilistic criteria and efficiency. The present paper shows that the answer is yes, and that it leads to special kinds of expected utilities. It is also shown that although numerical, these expected utilities remain qualitative: they lead to two different decision procedures based on min, max and reverse operators only, generalizing the leximin and leximax orderings of vectors.
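The leximin and leximax orderings mentioned at the end of this abstract can be sketched directly; the vectors below are made-up examples, and the paper's possibilistic refinements go well beyond this plain comparison.

```python
def leximin_key(v):
    """Leximin compares vectors by their worst components first:
    sort ascending, then compare lexicographically (larger is better)."""
    return tuple(sorted(v))

def leximax_key(v):
    """Leximax compares best components first: sort descending,
    then compare lexicographically."""
    return tuple(sorted(v, reverse=True))

# Both vectors share the same minimum, 1; leximin breaks the tie on the
# next-worst component (3 vs 2), so u is leximin-preferred to v.
u, v = (1, 3, 3), (1, 2, 9)
print(leximin_key(u) > leximin_key(v))  # → True
print(leximax_key(u) > leximax_key(v))  # → False: v's best component, 9, wins
```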
Admissibility in Games
"... Suppose that each player in a game is rational, each player thinks the other players are rational, and so on. Also, suppose that rationality is taken to incorporate an admissibility requirement–i.e., the avoidance of weakly dominated strategies. Which strategies can be played? We provide an epistemi ..."
Abstract

Cited by 13 (2 self)
 Add to MetaCart
Suppose that each player in a game is rational, each player thinks the other players are rational, and so on. Also, suppose that rationality is taken to incorporate an admissibility requirement, i.e., the avoidance of weakly dominated strategies. Which strategies can be played? We provide an epistemic framework in which to address this question. Specifically, we formulate conditions of “rationality and mth-order assumption of rationality” (RmAR) and “rationality and common assumption of rationality” (RCAR). We show: (i) RCAR is characterized by a solution concept called a “self-admissible set”; (ii) in a “complete” type structure, RmAR is characterized by the set of strategies that survive m + 1 rounds of elimination of inadmissible strategies; (iii) under a nontriviality condition, RCAR is impossible in a complete structure.
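The round-by-round elimination of inadmissible (weakly dominated) strategies referred to in condition (ii) can be sketched for a two-player strategic form; the 2x2 payoff matrix and all names below are invented for illustration, not taken from the paper.

```python
def weakly_dominated(s, own, opponent, value):
    """s is weakly dominated if some alternative does at least as well
    against every opponent strategy and strictly better against one."""
    for t in own:
        if t == s:
            continue
        diffs = [value(t, o) - value(s, o) for o in opponent]
        if all(d >= 0 for d in diffs) and any(d > 0 for d in diffs):
            return True
    return False

def iterated_admissibility(rows, cols, payoffs):
    """Delete inadmissible strategies for both players until a fixed point."""
    while True:
        new_rows = [r for r in rows if not weakly_dominated(
            r, rows, cols, lambda s, o: payoffs[(s, o)][0])]
        new_cols = [c for c in cols if not weakly_dominated(
            c, cols, rows, lambda s, o: payoffs[(o, s)][1])]
        if (new_rows, new_cols) == (rows, cols):
            return rows, cols
        rows, cols = new_rows, new_cols

# payoffs[(row, col)] = (row player's payoff, column player's payoff)
payoffs = {("U", "L"): (1, 1), ("U", "R"): (1, 0),
           ("D", "L"): (0, 0), ("D", "R"): (0, 1)}
print(iterated_admissibility(["U", "D"], ["L", "R"], payoffs))  # → (['U'], ['L'])
```

In this example D is eliminated in the first round (U weakly dominates it), after which R becomes dominated in the second; the count of rounds needed is exactly what the m + 1 in the characterization tracks.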
Lexicographic probability, conditional probability, and nonstandard probability
 In Theoretical Aspects of Rationality and Knowledge: Proc. Eighth Conference (TARK 2001
, 2001
"... The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS’s) [Blume, Brandenburger, and Dekel 1991a; Blume, Brandenburger, and Dekel 1991b], and nonstandard probability spaces (NPS’s) is considered. If coun ..."
Abstract

Cited by 11 (2 self)
 Add to MetaCart
The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS’s) [Blume, Brandenburger, and Dekel 1991a; Blume, Brandenburger, and Dekel 1991b], and nonstandard probability spaces (NPS’s) is considered. If countable additivity is assumed, Popper spaces and a subclass of LPS’s are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, LPS’s are equivalent to NPS’s. However, if the state space is infinite, NPS’s are shown to be more general than LPS’s. JEL classification numbers: C70; D80; D81.
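A lexicographic probability system is a finite sequence of probability measures consulted in order: a later measure matters only where all earlier ones are tied. A minimal generic sketch of that comparison rule (not the paper's equivalence constructions; states and numbers are made up):

```python
def lps_compare(lps, event_a, event_b):
    """Compare two events under an LPS, given as a list of probability
    dicts over states: return +1, -1, or 0 according to the first
    measure in the sequence at which the events' probabilities differ."""
    prob = lambda p, event: sum(p[s] for s in event)
    for p in lps:
        pa, pb = prob(p, event_a), prob(p, event_b)
        if pa != pb:
            return 1 if pa > pb else -1
    return 0

# The primary measure gives state "c" probability zero; the secondary
# measure still distinguishes events that differ only on "c".
lps = [{"a": 0.5, "b": 0.5, "c": 0.0},
       {"a": 0.0, "b": 0.0, "c": 1.0}]
print(lps_compare(lps, {"a", "c"}, {"a"}))  # → 1
```

This is the sense in which an LPS can treat an event as "infinitely less likely" without assigning it probability zero outright, the same role played by conditioning in Popper spaces and by infinitesimals in NPS’s.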
Iterated Regret Minimization: A New Solution Concept
"... For some wellknown games, such as the Traveler’s Dilemma or the Centipede Game, traditional gametheoretic solution concepts—most notably Nash equilibrium—predict outcomes that are not consistent with empirical observations. We introduce a new solution concept, iterated regret minimization, which ex ..."
Abstract

Cited by 11 (1 self)
 Add to MetaCart
For some well-known games, such as the Traveler’s Dilemma or the Centipede Game, traditional game-theoretic solution concepts—most notably Nash equilibrium—predict outcomes that are not consistent with empirical observations. We introduce a new solution concept, iterated regret minimization, which exhibits the same qualitative behavior as that observed in experiments in many games of interest, including the Traveler’s Dilemma, the Centipede Game, Nash bargaining, and Bertrand competition. As the name suggests, iterated regret minimization involves the iterated deletion of strategies that do not minimize regret.
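The deletion procedure named in the last sentence can be sketched on the Traveler's Dilemma itself, under the standard parameters (claims 2 through 100, penalty/reward 2). This is a reconstruction, not code from the paper; with these parameters the iteration converges to the single claim 97, far from the Nash equilibrium claim of 2.

```python
def td_payoff(s, t, penalty=2):
    """Traveler's Dilemma: both receive the lower claim; the lower
    claimant gets a bonus of `penalty`, the higher one pays it."""
    if s < t:
        return s + penalty
    if s > t:
        return t - penalty
    return s

def max_regret(s, own, opp, payoff):
    """Worst-case regret of s: its shortfall against the best reply,
    maximized over the opponent's surviving strategies."""
    return max(max(payoff(a, t) for a in own) - payoff(s, t) for t in opp)

def iterated_regret_minimization(strategies, payoff):
    """Keep only strategies of minimal worst-case regret, then iterate on
    the surviving set until a fixed point (symmetric game: one set)."""
    current = list(strategies)
    while True:
        regrets = {s: max_regret(s, current, current, payoff) for s in current}
        least = min(regrets.values())
        survivors = [s for s in current if regrets[s] == least]
        if survivors == current:
            return survivors
        current = survivors

print(iterated_regret_minimization(range(2, 101), td_payoff))  # → [97]
```

The first round keeps the claims 96–100 (each with worst-case regret 3); restricted to those survivors, only 97 attains the new minimal regret of 2, so the second round ends the iteration.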
Admissibility and common belief
, 2003
"... The concept of ‘fully permissible sets’ is defined by an algorithm that eliminates strategy subsets. It is characterized as choice sets when there is common certain belief of the event that each player prefer one strategy to another if and only if the former weakly dominates the latter on the set of ..."
Abstract

Cited by 10 (6 self)
 Add to MetaCart
The concept of ‘fully permissible sets’ is defined by an algorithm that eliminates strategy subsets. It is characterized as choice sets when there is common certain belief of the event that each player prefers one strategy to another if and only if the former weakly dominates the latter on the set of all opponent strategies or on the union of the choice sets that are deemed possible for the opponent. The concept refines the Dekel–Fudenberg procedure and captures aspects of forward induction.