Results 1–10 of 92
Quantum mechanics as quantum information (and only a little more), Quantum Theory: Reconsideration of Foundations
, 2002
Abstract

Cited by 113 (8 self)
In this paper, I try once again to cause some good-natured trouble. The issue remains, when will we ever stop burdening the taxpayer with conferences devoted to the quantum foundations? The suspicion is expressed that no end will be in sight until a means is found to reduce quantum theory to two or three statements of crisp physical (rather than abstract, axiomatic) significance. In this regard, no tool appears better calibrated for a direct assault than quantum information theory. Far from a strained application of the latest fad to a time-honored problem, this method holds promise precisely because a large part—but not all—of the structure of quantum theory has always concerned information. It is just that the physics community needs reminding. This paper, though taking quant-ph/0106166 as its core, corrects one mistake and offers several observations beyond the previous version. In particular, I identify one element of quantum mechanics that I would not label a subjective term in the theory—it is the integer parameter D traditionally ascribed to a quantum system via its Hilbert-space dimension.
Decision Theory in Expert Systems and Artificial Intelligence
 International Journal of Approximate Reasoning
, 1988
Abstract

Cited by 105 (19 self)
Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision-theoretic framework. Recent analyses of the restrictions of several traditional AI reasoning techniques, coupled with the development of more tractable and expressive decision-theoretic representation and inference strategies, have stimulated renewed interest in decision theory and decision analysis. We describe early experience with simple probabilistic schemes for automated reasoning, review the dominant expert-system paradigm, and survey some recent research at the crossroads of AI and decision science. In particular, we present the belief network and influence diagram representations. Finally, we discuss issues that have not been studied in detail within the expert-systems sett...
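The belief-network representation mentioned in this abstract can be illustrated with a minimal sketch: a two-node Boolean network (Rain influences WetGrass) with posterior inference by enumeration. The network and its probabilities are invented for illustration, not taken from the paper.

```python
# A minimal belief-network sketch: two Boolean nodes, Rain -> WetGrass,
# with posterior inference via Bayes' rule. All numbers are illustrative.

def posterior_rain_given_wet(p_rain, p_wet_given_rain, p_wet_given_dry):
    """P(Rain | WetGrass = true) by enumerating the two-node network."""
    joint_rain = p_rain * p_wet_given_rain        # P(Rain, Wet)
    joint_dry = (1 - p_rain) * p_wet_given_dry    # P(~Rain, Wet)
    return joint_rain / (joint_rain + joint_dry)

p = posterior_rain_given_wet(p_rain=0.2, p_wet_given_rain=0.9, p_wet_given_dry=0.1)
print(round(p, 4))  # -> 0.6923
```

Real expert systems chain many such nodes, but every exact inference ultimately reduces to sums of products like the two computed here.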
Bayes in the sky: Bayesian inference and model selection in cosmology
 Contemp. Phys
Abstract

Cited by 58 (6 self)
The application of Bayesian methods in cosmology and astrophysics has flourished over the past decade, spurred by data sets of increasing size and complexity. In many respects, Bayesian methods have proven to be vastly superior to more traditional statistical tools, offering the advantage of higher efficiency and of a consistent conceptual basis for dealing with the problem of induction in the presence of uncertainty. This trend is likely to continue in the future, when the way we collect, manipulate and analyse observations and compare them with theoretical models will assume an even more central role in cosmology. This review is an introduction to Bayesian methods in cosmology and astrophysics and recent results in the field. I first present Bayesian probability theory and its conceptual underpinnings, Bayes' Theorem and the role of priors. I discuss the problem of parameter inference and its general solution, along with numerical techniques such as Markov chain Monte Carlo methods. I then review the theory and application of Bayesian model comparison, discussing the notions of Bayesian evidence and effective model complexity, and how to compute and interpret those quantities. Recent developments in cosmological parameter extraction and Bayesian cosmological model building are summarized, highlighting the challenges that lie ahead.
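The parameter-inference machinery the review describes can be sketched in miniature: a random-walk Metropolis chain sampling the posterior of a Gaussian mean under a flat prior. The data and tuning constants below are invented for the sketch; real cosmological pipelines differ only in the likelihood being far more expensive.

```python
# A toy illustration of Bayes' theorem plus Markov chain Monte Carlo:
# sample the posterior of a Gaussian mean mu (known sigma = 1, flat prior)
# with a random-walk Metropolis chain. Data and step size are illustrative.
import math
import random

def log_posterior(mu, data, sigma=1.0):
    # Flat prior: the log posterior equals the log likelihood up to a constant.
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

def metropolis(data, n_steps=20000, step=0.5, seed=0):
    rng = random.Random(seed)
    mu, samples = 0.0, []
    lp = log_posterior(mu, data)
    for _ in range(n_steps):
        prop = mu + rng.gauss(0, step)
        lp_prop = log_posterior(prop, data)
        if math.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            mu, lp = prop, lp_prop
        samples.append(mu)
    return samples[n_steps // 2:]                   # discard burn-in

data = [1.8, 2.1, 2.4, 1.9, 2.3]
samples = metropolis(data)
print(round(sum(samples) / len(samples), 2))        # close to mean(data) = 2.1
```

With a flat prior the posterior mean should track the sample mean, which is exactly what the chain recovers.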
Coherent Behavior in Noncooperative Games
 JOURNAL OF ECONOMIC THEORY
, 1990
Abstract

Cited by 46 (5 self)
A new concept of mutually expected rationality in noncooperative games is proposed: joint coherence. This is an extension of the “no arbitrage opportunities” axiom that underlies subjective probability theory and a variety of economic models. It sheds light on the controversy over the strategies that can reasonably be recommended to or expected to arise among Bayesian rational players. Joint coherence is shown to support Aumann’s position in favor of objective correlated equilibrium, although the common prior assumption is weakened and viewed as a theorem rather than an axiom. An elementary proof of the existence of correlated equilibria is given, and relationships with other solution concepts (Nash equilibrium, independent and correlated rationalizability) are also discussed.
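The objective correlated equilibria discussed here are easy to verify numerically. The sketch below checks the textbook correlated equilibrium of the game of Chicken, where a mediator draws one of three joint recommendations uniformly and no player profits by deviating; the payoff matrix is the standard illustrative one, not taken from this paper.

```python
# Verify a correlated equilibrium of Chicken: the mediator plays
# (D,C), (C,D), (C,C) each with probability 1/3, where D = Dare, C = Chicken.
# A distribution is a correlated equilibrium if, conditional on any
# recommendation, no unilateral deviation raises a player's expected payoff.
from itertools import product

A = ["D", "C"]
payoff = {("D", "D"): (0, 0), ("D", "C"): (7, 2),
          ("C", "D"): (2, 7), ("C", "C"): (6, 6)}
dist = {("D", "C"): 1/3, ("C", "D"): 1/3, ("C", "C"): 1/3, ("D", "D"): 0.0}

def is_correlated_equilibrium(dist, eps=1e-9):
    for player in (0, 1):
        for rec, dev in product(A, A):        # recommended vs deviation action
            follow = deviate = 0.0
            for prof, p in dist.items():
                if prof[player] != rec:
                    continue
                alt = list(prof)
                alt[player] = dev
                follow += p * payoff[prof][player]
                deviate += p * payoff[tuple(alt)][player]
            if deviate > follow + eps:        # profitable deviation found
                return False
    return True

print(is_correlated_equilibrium(dist))  # -> True
```

The paper's existence proof guarantees such a distribution exists for every finite game; the check above is only the verification step, the easy direction.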
Updating Beliefs with Incomplete Observations
Abstract

Cited by 41 (13 self)
Currently, there is renewed interest in the problem, raised by Shafer in 1985, of updating probabilities when observations are incomplete (or set-valued). This is a fundamental problem in general, and of particular interest for Bayesian networks. Recently, Grünwald and Halpern have shown that commonly used updating strategies fail in this case, except under very special assumptions. In this paper we propose a new method for updating probabilities with incomplete observations. Our approach is deliberately conservative: we make no assumptions about the so-called incompleteness mechanism that associates complete with incomplete observations. We model our ignorance about this mechanism by a vacuous lower prevision, a tool from the theory of imprecise probabilities, and we use only coherence arguments to turn prior into posterior (updated) probabilities. In general, this new approach to updating produces lower and upper posterior probabilities and previsions (expectations), as well as partially determinate decisions. This is a logical consequence of the existing ignorance about the incompleteness mechanism. As an example, we use the new updating method to properly address the apparent paradox in the 'Monty Hall' puzzle. More importantly, we apply it to the problem of classification of new evidence in probabilistic expert systems, where it leads to a new, so-called conservative updating rule.
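The Monty Hall treatment mentioned in the abstract can be reproduced numerically. The unknown incompleteness mechanism is the host's probability q of opening door 3 when the car is behind your chosen door 1 (he could open either goat door); making no assumption about q and sweeping it over [0, 1] yields lower and upper posterior probabilities for "switching wins". The sketch follows the general idea of the paper's analysis, not its exact formalism.

```python
# Conservative updating in Monty Hall: you pick door 1, the host opens
# door 3 and reveals a goat. Prior 1/3 per door; likelihoods of "host
# opens 3" are q (car behind 1), 1 (car behind 2), 0 (car behind 3).
# Sweeping the unknown q gives the lower/upper posterior for switching.

def posterior_switch_wins(q):
    num = (1/3) * 1.0                  # car behind door 2, host must open 3
    den = (1/3) * q + (1/3) * 1.0      # normalising constant
    return num / den

posts = [posterior_switch_wins(q / 100) for q in range(101)]
print(round(min(posts), 2), round(max(posts), 2))  # -> 0.5 1.0
```

The lower posterior for switching is 1/2 and the upper is 1, so switching is never worse than staying whatever the host's mechanism, which is the imprecise-probability resolution of the puzzle.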
Quantum Foundations in the Light of Quantum Information
 PROCEEDINGS OF THE NATO ADVANCED RESEARCH WORKSHOP, MYKONOS GREECE
, 2001
Abstract

Cited by 20 (3 self)
In this paper, I try to cause some good-natured trouble. The issue at stake is when will we ever stop burdening the taxpayer with conferences and workshops devoted— explicitly or implicitly—to the quantum foundations? The suspicion is expressed that no end will be in sight until a means is found to reduce quantum theory to two or three statements of crisp physical (rather than abstract, axiomatic) significance. In this regard, no tool appears to be better calibrated for a direct assault than quantum information theory. Far from being a strained application of the latest fad to a deep-seated problem, this method holds promise precisely because a large part (but not all) of the structure of quantum theory has always concerned information. It is just that the physics community has somehow forgotten this.
A Random Set Description of a Possibility Measure and Its Natural Extension
 IEEE Transactions on Systems, Man and Cybernetics
, 1997
Abstract

Cited by 19 (7 self)
The relationship is studied between possibility and necessity measures defined on arbitrary spaces, the theory of imprecise probabilities, and elementary random set theory. It is shown how special random sets can be used to generate normal possibility and necessity measures, as well as their natural extensions. This leads to interesting alternative formulas for the calculation of these natural extensions. Keywords: upper probability, upper prevision, coherence, natural extension, possibility measure, random sets. I. Introduction. Possibility measures were introduced by Zadeh [1] in 1978. In his view, these supremum-preserving set functions are a mathematical representation of the information conveyed by typical affirmative statements in natural language. For recent discussions of this interpretation within the behavioural framework of the theory of imprecise probabilities, we refer to [2], [3], [4]. Supremum-preserving set functions can also be found in the literature under a number o...
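The connection between random sets and possibility measures described here can be sketched on a finite space: a nested (consonant) random set, built from the alpha-cuts of a possibility distribution, has a plausibility function that coincides with the possibility measure. The three-element example and its numbers are invented for illustration.

```python
# Sketch: a consonant random set generates a possibility measure.
# Build nested focal sets from the alpha-cuts of a normal possibility
# distribution pi, then check that plausibility equals sup of pi on events.

def consonant_random_set(pi):
    """Nested focal sets with masses equal to the drop between cut levels."""
    levels = sorted(set(pi.values()), reverse=True) + [0.0]
    focal = []
    for hi, lo in zip(levels, levels[1:]):
        cut = frozenset(x for x, v in pi.items() if v >= hi)  # alpha-cut at hi
        focal.append((cut, hi - lo))                          # mass = level drop
    return focal

def plausibility(focal, event):
    return sum(m for s, m in focal if s & event)  # focal sets hitting the event

pi = {"a": 1.0, "b": 0.7, "c": 0.3}   # normal: the maximum value is 1
focal = consonant_random_set(pi)
for event in [{"b", "c"}, {"c"}, {"a", "c"}]:
    assert abs(plausibility(focal, frozenset(event)) - max(pi[x] for x in event)) < 1e-9
print("plausibility matches possibility on all tested events")
```

The nesting of the focal sets is exactly what makes the resulting upper probability supremum-preserving; a non-nested random set would generate a general plausibility function instead.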
SYMMETRY OF MODELS VERSUS MODELS OF SYMMETRY
, 2008
Abstract

Cited by 17 (8 self)
A model for a subject’s beliefs about a phenomenon may exhibit symmetry, in the sense that it is invariant under certain transformations. On the other hand, such a belief model may be intended to represent that the subject believes or knows that the phenomenon under study exhibits symmetry. We defend the view that these are fundamentally different things, even though the difference cannot be captured by Bayesian belief models. In fact, the failure to distinguish between the two situations leads to Laplace’s so-called Principle of Insufficient Reason, which has been criticised extensively in the literature. We show that there are belief models (imprecise probability models, coherent lower previsions) that generalise and include the Bayesian belief models, but where this fundamental difference can be captured. This leads to two notions of symmetry for such belief models: weak invariance (representing symmetry of beliefs) and strong invariance (modelling beliefs of symmetry). We discuss various mathematical as well as more philosophical aspects of these notions. We also discuss a few examples to show the relevance of our findings both to probabilistic modelling and to statistical inference, and to the notion of exchangeability in particular.