Results 1–10 of 10
Coherent Behavior in Noncooperative Games
Journal of Economic Theory, 1990
Abstract

Cited by 46 (5 self)
A new concept of mutually expected rationality in noncooperative games is proposed: joint coherence. This is an extension of the “no arbitrage opportunities” axiom that underlies subjective probability theory and a variety of economic models. It sheds light on the controversy over the strategies that can reasonably be recommended to or expected to arise among Bayesian rational players. Joint coherence is shown to support Aumann’s position in favor of objective correlated equilibrium, although the common prior assumption is weakened and viewed as a theorem rather than an axiom. An elementary proof of the existence of correlated equilibria is given, and relationships with other solution concepts (Nash equilibrium, independent and correlated rationalizability) are also discussed.
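The incentive constraints that define a correlated equilibrium can be checked directly, which also illustrates the "no profitable deviation" structure behind the no-arbitrage view. The game (Chicken) and the candidate distribution below are standard textbook choices, not taken from the paper; this is a minimal verification sketch, not the paper's construction:

```python
# Verifying a correlated equilibrium by checking the incentive constraints:
# for each player, each recommended action, and each possible deviation, the
# expected gain from deviating (conditional on the recommendation) must be <= 0.
from fractions import Fraction

# Payoffs for the game of Chicken; actions: 0 = Dare, 1 = Chicken.
u_row = {(0, 0): 0, (0, 1): 7, (1, 0): 2, (1, 1): 6}
u_col = {(0, 0): 0, (0, 1): 2, (1, 0): 7, (1, 1): 6}

# Candidate correlated distribution: 1/3 each on (D,C), (C,D), (C,C).
third = Fraction(1, 3)
p = {(0, 0): Fraction(0), (0, 1): third, (1, 0): third, (1, 1): third}

def is_correlated_eq(p, u_row, u_col, actions=(0, 1)):
    """True iff no player gains by deviating from any recommended action."""
    for a in actions:                      # row player's recommendation
        for a2 in actions:                 # candidate deviation
            gain = sum(p[(a, b)] * (u_row[(a2, b)] - u_row[(a, b)])
                       for b in actions)
            if gain > 0:
                return False
    for b in actions:                      # column player's recommendation
        for b2 in actions:
            gain = sum(p[(a, b)] * (u_col[(a, b2)] - u_col[(a, b)])
                       for a in actions)
            if gain > 0:
                return False
    return True

print(is_correlated_eq(p, u_row, u_col))   # True
```

Finding, rather than verifying, a correlated equilibrium is a linear feasibility problem over these same constraints, which is what makes an elementary existence proof of the kind the abstract mentions possible.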
Minimax Analysis of Stochastic Problems
Optimization Methods and Software, 2002
Abstract

Cited by 21 (0 self)
In practical applications of stochastic programming the involved probability distributions are never known exactly. One can try to hedge against the worst expected value resulting from a considered set of permissible distributions. This leads to a min-max formulation of the corresponding stochastic programming problem. We show that, under mild regularity conditions, such a min-max problem generates a probability distribution on the set of permissible distributions, with the min-max problem being equivalent to the expected value problem with respect to the corresponding weighted distribution. We consider examples of the news vendor problem, the problem of moments and problems involving unimodal distributions. Finally, we discuss the Monte Carlo sample average approach to solving such min-max problems.
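The min-max recipe in this abstract can be sketched in miniature for the news vendor example it mentions: choose an order quantity maximizing the worst-case expected profit over a set of permissible demand distributions. All numbers (prices and the three demand distributions) are illustrative assumptions, not from the paper:

```python
# Min-max news vendor: order quantity q, unit cost c, selling price r;
# profit is r*min(q, demand) - c*q. The demand distribution is known only
# to lie in a small finite set, and we hedge against the worst one.
c, r = 1.0, 3.0          # unit cost and selling price (assumptions)

# Each permissible distribution maps demand value -> probability.
dists = [
    {5: 0.5, 10: 0.5},
    {5: 0.2, 10: 0.6, 15: 0.2},
    {10: 1.0},
]

def expected_profit(q, dist):
    return sum(p * (r * min(q, d) - c * q) for d, p in dist.items())

def minimax_order(quantities, dists):
    """Pick q maximizing the minimum (worst-case) expected profit."""
    return max(quantities,
               key=lambda q: min(expected_profit(q, d) for d in dists))

q_star = minimax_order(range(0, 16), dists)
print(q_star)            # 10: the first distribution is binding here
```

A brute-force grid suffices at this scale; the paper's result says the worst case is itself attained by a weighted (mixture) distribution over the permissible set, so the min-max solution is an ordinary expected-value solution for that mixture.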
The Return of the Prodigal: Bayesian Inference For Astrophysics
Abstract

Cited by 2 (1 self)
Astronomers are skeptical of statistical analyses, the more so the more sophisticated they are. This has been true especially of Bayesian methods, despite the fact that such methods largely originated in the astronomical analyses of Laplace and his contemporaries in the early 1800s. I argue here that astronomers hold statistics in low regard because many astronomers are poor statisticians. Further, I argue that astronomers are poor statisticians because the frequentist methods they use have characteristics that invite statistical sloppiness when they are used by nonexperts. The Bayesian approach to statistical inference does not share these characteristics; adoption of Bayesian methods by astronomers thus promises to improve statistical practice in astronomy. I present a simplified discussion of some of the issues arising in the recent analysis of an important astrophysical data set—that provided by the Cosmic Background Explorer satellite—to illustrate some of the practical advantages of a Bayesian outlook. I offer some advice on how to educate astronomers about Bayesian methods. I conclude with a brief survey of recent applications of Bayesian methods to the analysis of astrophysical data. The breadth and number of these applications may well indicate that the time for Bayesian methods to return to the field of their origin has arrived.
Some Results on Posterior Regret Γ-Minimax Estimation
, 1993
Abstract
In this paper we study and compute posterior regret Γ-minimax actions in several estimation problems. We show that under general conditions, posterior regret Γ-minimax actions are Bayes for some prior in the class Γ. We also study some important special cases such as bounded normal means, Poisson means, and regression coefficients estimation.

1 Introduction

Robust Bayesian analysis is typically concerned with the effects of changing a prior within a class Γ on the results of a Bayesian analysis, for example, the posterior expected value. If conclusions do not differ widely, robustness holds and the imprecision in the prior does not matter very much. On the other hand, if conclusions differ widely we should aim at eliciting additional information about the prior, in the hope of reducing the class and mitigating the differences in the conclusions. However, it is conceivable that the expert may not be willing to provide more information. Prescriptions in this case a...
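Under squared-error loss the posterior regret of an estimate t is (t − posterior mean)², so the posterior regret Γ-minimax action is the midpoint of the range of posterior means over Γ — which, for a suitably rich class, is itself a Bayes action for some prior in the class, consistent with the paper's general result. A stdlib-only sketch for a binomial proportion, with an assumed (illustrative) class of Beta priors and assumed data:

```python
# Posterior regret Gamma-minimax estimation of a binomial proportion under
# squared-error loss: compute the range of posterior means over the prior
# class and take its midpoint, which minimizes the worst-case regret
# sup over the class of (t - posterior_mean)^2.
import itertools

x, n = 7, 10                        # observed successes / trials (assumed data)
alphas = [0.5, 1.0, 2.0]            # class: Beta(a, b) priors with these
betas = [0.5, 1.0, 2.0]             # parameter values (an assumed class)

# Posterior mean under a Beta(a, b) prior and Binomial(n, p) data.
post_means = [(a + x) / (a + b + n)
              for a, b in itertools.product(alphas, betas)]

lo, hi = min(post_means), max(post_means)
prgm_action = (lo + hi) / 2         # minimizes the sup of (t - mean)^2
```

The width hi − lo is the usual robust-Bayes sensitivity measure from the abstract's introduction; the midpoint rule is what "posterior regret Γ-minimax" adds when the expert cannot narrow the class further.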
DRAFT SUBMISSION DETC2007-35158 Updating Uncertainty Assessments: A Comparison of Statistical Approaches
Abstract
The performance of a product that is being designed is affected by variations in material, manufacturing process, use, and environmental variables. As a consequence of uncertainties in these factors, some items may fail. Failure is taken very generally, but we assume that it is a random event that occurs at most once in the lifetime of an item. The designer wants the probability of failure to be less than a given threshold. This paper considers three approaches for modeling the uncertainty in whether or not the failure probability meets this threshold: a classical approach, a precise Bayesian approach, and a robust Bayesian (or imprecise probability) approach. In some scenarios, the designer may have some initial beliefs about the failure probability. The designer also has the opportunity to obtain more information about product performance (e.g. from either experiments with actual items or runs of a simulation program that provide the actual performance of interest or an acceptable surrogate for actual performance). This paper considers different approaches for forming and updating the designer's beliefs about the failure probability. The goal is to gain insight into the relative strengths and weaknesses of the approaches. Examples are presented to illustrate the conclusions.
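Two of the updating approaches compared in this abstract can be sketched side by side: a precise Bayesian update with a single Beta prior versus a robust Bayesian update with a set of Beta priors (an imprecise beta model), each reporting the posterior probability that the failure probability meets the threshold. All priors and data below are illustrative assumptions, not the paper's:

```python
# Precise vs. robust Bayesian updating of a failure probability p after
# observing k failures in n trials, judged against a threshold on p.
from math import exp, lgamma, log

def beta_cdf(x, a, b, steps=20000):
    """P(p <= x) for p ~ Beta(a, b), by midpoint-rule integration.
    (Stdlib-only stand-in; scipy.stats.beta.cdf would normally be used.)"""
    lognorm = lgamma(a + b) - lgamma(a) - lgamma(b)
    dx = x / steps
    return sum(exp(lognorm + (a - 1) * log((i + 0.5) * dx)
                   + (b - 1) * log(1 - (i + 0.5) * dx)) * dx
               for i in range(steps))

k, n, threshold = 1, 20, 0.10        # 1 failure in 20 trials (assumed data)

# Precise Bayes: a single Beta(1, 9) prior (prior mean 0.1) updates to
# Beta(1 + k, 9 + n - k); one number answers "is p <= threshold likely?".
precise = beta_cdf(threshold, 1 + k, 9 + n - k)

# Robust Bayes (imprecise beta model): prior Beta(s*t, s*(1-t)) with fixed
# strength s = 2 but prior mean t anywhere in [0.05, 0.20]; the posterior
# probability becomes an interval rather than a single number.
s = 2.0
cdfs = [beta_cdf(threshold, s * t + k, s * (1 - t) + n - k)
        for t in (0.05, 0.10, 0.15, 0.20)]
lower, upper = min(cdfs), max(cdfs)
```

The width of [lower, upper] shows how much the threshold judgment still depends on the choice of prior; more data shrinks the interval, which is the kind of comparison between the approaches the paper examines.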
Robust Analysis of Stochastic Problems
Abstract
In practical applications of stochastic programming the involved probability distributions are never known exactly. One can try to hedge against the worst expected value resulting from a considered set of permissible distributions. This leads to a min-max formulation of the corresponding stochastic programming problem. We show that, under mild regularity conditions, such a min-max problem generates a probability distribution on the set of permissible distributions, with the min-max problem being equivalent to the expected value problem with respect to the corresponding weighted distribution. We consider examples of the news vendor problem, the problem of moments and problems involving unimodal distributions. Finally, we discuss the Monte Carlo sample average approach to solving such min-max problems.
A Robust Bayesian Approach to Modelling Epistemic Uncertainty in Common-Cause Failure Models (submitted)