Results 11–20 of 410
Optimal income tax in the presence of status effects
Journal of Public Economics, 2001
Abstract

Cited by 37 (0 self)
The classical optimal income tax problem does not reveal many general properties except for the well-known tendency for marginal tax rates to fall for high-ability types, and in fact to become zero for the top type. The existence of distortions from individuals competing to attain social status by using consumption signals justifies some measure of income tax. The question posed here is whether it also constitutes a reason for a more progressive income tax schedule. The answer is found to be broadly negative if progressivity is interpreted as increasing marginal tax rates. On the other hand, status-seeking makes the optimal tax schedule steeper, so that redistribution is increased. Broadly, the analysis of status-seeking based on a signalling approach confirms and strengthens the existing view of an optimal tax schedule.
Generalized Qualitative Probability: Savage revisited
In: Proceedings of the 12th Conference on Uncertainty in Artificial Intelligence (UAI’96), 1996
Abstract

Cited by 34 (3 self)
Preferences among acts are analyzed in the style of L. Savage, but as partially ordered. The rationality postulates considered are weaker than Savage's on three counts. The Sure Thing Principle is derived in this setting. The postulates are shown to lead to a characterization of generalized qualitative probability that includes and blends both traditional qualitative probability and the ranked structures used in logical approaches.
Ambiguity and Nonparticipation: The Role of Regulation
The Review of Financial Studies, 2009
Abstract

Cited by 34 (1 self)
We investigate the implications of ambiguity aversion for performance and regulation of markets. In our model, agents’ decision making may incorporate both risk and ambiguity, and we demonstrate that nonparticipation arises from the rational decision by some traders to avoid ambiguity. In equilibrium, these participation decisions affect the equilibrium risk premium, and distort market performance when viewed from the perspective of traditional asset pricing models. We demonstrate how regulation, particularly regulation of unlikely events, can moderate the effects of ambiguity, thereby increasing participation and generating welfare gains. Our analysis demonstrates how legal systems affect participation in financial markets through their influence on ambiguity. (JEL G1, G2, D8) One of the more puzzling aspects of financial markets is the surprisingly large fraction of individuals who do not participate. For the period 1982–1995, the U.S. Consumer Expenditure Survey found that more than two-thirds of households held neither stocks nor bonds (cited in Paiella 2006). Campbell (2006), using data from the 2001 Survey of Consumer Finances, found that even at the eightieth percentile of wealth, almost 20% of households had no public equity even when their retirement savings are considered. Looking across asset classes reveals similar nonparticipation issues, as does looking internationally, where participation rates are shown to vary substantially across countries. Because such behavior is inconsistent with standard models of optimal portfolio …
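As a purely illustrative sketch (not the paper's equilibrium model), maxmin evaluation over a set of priors shows one way nonparticipation can arise from ambiguity: a trader who values every position at its worst-case prior may find both long and short positions unattractive, while a single-prior SEU trader would trade. All numbers below are hypothetical.

```python
# Hypothetical illustration: worst-case (maxmin) evaluation over a set of
# priors can make both long and short positions unattractive, so an
# ambiguity-averse trader stays out of the market entirely.
# Asset: excess payoff +1 with probability p, -1 with probability 1 - p;
# the trader is risk-neutral here, so utility equals payoff.

def expected_excess(p: float) -> float:
    """Expected excess payoff of a unit long position under belief p."""
    return p * 1.0 + (1.0 - p) * (-1.0)  # = 2p - 1

# Single-prior (SEU) trader with p = 0.55: positive value, so she trades.
seu_value = expected_excess(0.55)

# Ambiguity-averse trader entertains every p in [0.40, 0.70].  Because the
# value is linear in p, the worst case is attained at an endpoint, so
# checking the two endpoints suffices.
priors = (0.40, 0.70)
long_value = min(expected_excess(p) for p in priors)    # worst case, long
short_value = min(-expected_excess(p) for p in priors)  # worst case, short

# Both worst-case values are negative: the maxmin trader does not trade.
print(f"SEU: {seu_value:+.2f}  long: {long_value:+.2f}  short: {short_value:+.2f}")
```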
Coherent acceptability measures in multiperiod models
2002
Abstract

Cited by 33 (4 self)
The coherent risk framework was introduced by Artzner et al. (1999) in a single-period setting. Here we investigate a similar framework in a multi-period context. Acceptability measures are introduced as a function not only of a given position (payoff in each possible state of nature) but also of available information. A notion of time-consistency for acceptability measures is introduced, and conditions are given for this property to hold if the acceptability measure is expressed in terms of a family of test measures. We present sufficient conditions for the “no strictly acceptable opportunities” condition of Carr et al. (2001) to hold in the dynamic context. We show that the effect of hedging can be represented by a change in the set of test measures. Concerning the problem of computing hedges that optimize the degree of acceptability of a given position, we provide sufficient conditions under which an algorithm of dynamic programming type can be applied. For the special case of a derivative on a single underlying with convex payoff, and for a particular class of acceptability measures, we show that this algorithm simplifies considerably, and we give explicit formulas for hedges that maximize the degree of acceptability.
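For orientation, in the single-period setting of Artzner et al. (1999), an acceptability measure defined by a family of test measures $\mathcal{Q}$ can be sketched as follows (standard notation, not necessarily the paper's own multi-period formulation):

```latex
% A position X is evaluated at its worst expectation over the test measures:
\phi(X) \;=\; \inf_{Q \in \mathcal{Q}} \mathbb{E}_Q[X],
\qquad X \text{ is acceptable} \;\Longleftrightarrow\; \phi(X) \ge 0 .
```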
A behavioral characterization of plausible priors
 Journal of Economic Theory
Abstract

Cited by 33 (6 self)
Recent theories of choice under uncertainty represent ambiguity via multiple priors, informally interpreted as alternative probabilistic models of the uncertainty that the decision-maker considers equally plausible. This paper provides a robust behavioral foundation for this interpretation. A prior P is deemed “plausible” if (i) preferences over a subset C of acts are consistent with subjective expected utility (SEU), and (ii) jointly with an appropriate utility function, P provides the unique SEU representation of preferences over C. Under appropriate axioms, plausible priors can be elicited from preferences; moreover, if these axioms hold, (i) preferences are probabilistically sophisticated if and only if they are SEU, and (ii) under suitable consequentialism and dynamic consistency axioms, “plausible posteriors” can be derived from plausible priors via Bayes’ rule. Several well-known decision models are consistent with the axioms proposed here. This paper has an Online Appendix.
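The best-known multiple-priors model behind this interpretation is the maxmin expected utility of Gilboa and Schmeidler (1989), in which an act f is evaluated at the worst prior in a set C (standard notation, added here for orientation):

```latex
V(f) \;=\; \min_{P \in C} \int_{\Omega} u\bigl(f(\omega)\bigr)\,\mathrm{d}P(\omega).
```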
Axiomatic foundations of multiplier preferences
2007
Abstract

Cited by 33 (3 self)
This paper axiomatizes the robust control criterion of multiplier preferences introduced by Hansen and Sargent (2001). The axiomatization relates multiplier preferences to other classes of preferences studied in decision theory. Some properties of multiplier preferences are generalized to the broader class of variational preferences, recently introduced by Maccheroni, Marinacci and Rustichini (2006). The paper also establishes a link between the parameters of the multiplier criterion and the observable behavior of the agent. This link enables measurement of the parameters on the basis of observable choice data and provides a useful tool for applications.
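For orientation, the Hansen–Sargent multiplier criterion being axiomatized is usually written with a robustness multiplier θ > 0 penalizing deviations from a reference model P by relative entropy R(·‖·):

```latex
V(f) \;=\; \min_{Q \ll P} \Bigl\{ \mathbb{E}_Q\bigl[u(f)\bigr] \;+\; \theta\, R(Q \,\|\, P) \Bigr\}.
```

Variational preferences, the broader class mentioned in the abstract, replace the entropy penalty θR(Q‖P) with a general cost function c(Q).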
On the axiomatization of qualitative decision theory
In: Proceedings of the Fourteenth National Conference on Artificial Intelligence (AAAI-97), 1997
Abstract

Cited by 33 (4 self)
This paper investigates the foundation of maximin, one of the central qualitative decision criteria, using the approach taken by Savage (1972) to investigate the foundation and rationality of classical decision theory. This approach asks “which behaviors could result from the use of a particular decision procedure?” The answer to this question provides two important insights: (1) under what conditions can we employ a particular agent model, and (2) how rational is a particular decision procedure. Our main result is a constructive representation theorem in the spirit of Savage’s result for expected utility maximization, which uses two choice axioms to characterize the maximin criterion. These axioms characterize agent behaviors that can be modeled compactly using the maximin model and, with some reservations, indicate that maximin is a reasonable decision criterion.
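For orientation, the maximin criterion ranks acts by their worst-case utility (Wald's rule, in standard notation over a state space S):

```latex
f \succsim g
\quad\Longleftrightarrow\quad
\min_{s \in S} u\bigl(f(s)\bigr) \;\ge\; \min_{s \in S} u\bigl(g(s)\bigr).
```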
Vector Expected Utility and Attitudes toward Variation
2007
Abstract

Cited by 30 (5 self)
This paper analyzes a model of decision under ambiguity, deemed vector expected utility or VEU. According to the proposed model, an act f: Ω → X is evaluated via the functional V(f) = …
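The snippet truncates the VEU functional; in the published version of the paper (Siniscalchi, Econometrica 2009), if recollection serves, it takes the form

```latex
V(f) \;=\; \mathbb{E}_p\bigl[u \circ f\bigr]
\;+\; A\Bigl(\bigl(\mathbb{E}_p\bigl[\zeta_i \cdot (u \circ f)\bigr]\bigr)_{0 \le i < n}\Bigr),
```

where p is a baseline prior, the ζᵢ are zero-mean adjustment factors, and A is an adjustment function with A(0) = 0; ambiguity attitudes are captured by A.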
Continuous Subjective Expected Utility with Nonadditive Probabilities
Journal of Mathematical Economics, 1989
Abstract

Cited by 29 (2 self)
A well-known theorem of Debreu about additive representations of preferences is applied in a nonadditive context, to characterize continuous subjective expected utility maximization for the case where the probability measures may be nonadditive. The approach of this paper does not need the assumption that lotteries with known (objective) probability distributions over consequences are available.
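Expected utility with respect to a nonadditive probability (a capacity) is computed with the Choquet integral. A minimal sketch for a nonnegative utility profile on a finite state space, with a hypothetical two-state example (all names and numbers below are illustrative, not from the paper):

```python
def choquet_integral(utilities, capacity):
    """Choquet integral of a nonnegative utility profile w.r.t. a capacity.

    utilities: dict mapping each state to u(f(state)), assumed >= 0 here
    capacity:  function from frozenset of states to [0, 1]; monotone, with
               capacity(empty set) = 0 and capacity(all states) = 1
    """
    states = sorted(utilities, key=utilities.get, reverse=True)
    value, upper = 0.0, set()
    for i, s in enumerate(states):
        upper.add(s)
        next_u = utilities[states[i + 1]] if i + 1 < len(states) else 0.0
        # "Layer cake" form: each utility decrement is weighted by the
        # capacity of the upper-level set of states achieving at least it.
        value += (utilities[s] - next_u) * capacity(frozenset(upper))
    return value

# Hypothetical two-state example with a convex (ambiguity-averse) capacity:
# nu({G}) + nu({B}) = 0.6 < 1, so nu is not an additive probability.
def nu(event):
    table = {frozenset(): 0.0, frozenset({"G"}): 0.3,
             frozenset({"B"}): 0.3, frozenset({"G", "B"}): 1.0}
    return table[event]

v = choquet_integral({"G": 10.0, "B": 4.0}, nu)
# v = (10 - 4) * nu({G}) + 4 * nu({G, B}) = 6 * 0.3 + 4 * 1.0 = 5.8
```

Because nu is convex, the result equals the minimum of the expected utility over all additive probabilities dominating nu, which is the ambiguity-averse reading of nonadditive beliefs.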