Results 1–10 of 10
The philosophical significance of Cox’s theorem
International Journal of Approximate Reasoning, 2004
Abstract

Cited by 4 (2 self)
Cox’s theorem states that, under certain assumptions, any measure of belief is isomorphic to a probability measure. This theorem, although intended as a justification of the subjectivist interpretation of probability theory, is sometimes presented as an argument for more controversial theses. Of particular interest is the thesis that the only coherent means of representing uncertainty is via the probability calculus. In this paper I examine the logical assumptions of Cox’s theorem and I show how these impinge on the philosophical conclusions thought to be supported by the theorem. I show that the more controversial thesis is not supported by Cox’s theorem.
Belief Augmented Frames, 2003
Abstract

Cited by 4 (0 self)
I would like to express my sincere appreciation to A/P Lua Kim Teng, who patiently guided me through not only my PhD degree, but earlier on through my Honors and Master degrees. Without his help, guidance and counseling this thesis would definitely not have become a reality. My sincere gratitude as well to good friends like “Tat”, Hong I, Michelle and the “girls next door”, who not only kept me sane and centered through the ordeal of putting this thesis together, but also kept me well fed with cookies, “liang teh” and instant cereal through all those long hours of work. To them I owe all the weight that I’ve put on. To my students, who thoughtfully organized themselves when seeking help from me, to minimize the amount of time that I need to spend with them. To my family, who put up with my terrible tantrums, acid tongue and general crabbiness. Most of all to my beloved wife Catherine, who slaved for hours over stove and oven to bake me the cakes and cookies that kept me going through the night, and who was always willing to go to an empty bed as I spent night after night working on this …
Is probability the only coherent approach to uncertainty?, Risk Analysis, forthcoming
Abstract

Cited by 3 (1 self)
In this paper I discuss an argument that purports to prove that probability theory is the only sensible means of dealing with uncertainty. I show that this argument can succeed only if some rather controversial assumptions about the nature of uncertainty are accepted. I discuss these assumptions and provide reasons for rejecting them. I also present examples of what I take to be non-probabilistic uncertainty.
Key Words: Cox’s Theorem, Non-Classical Logic, Probability, Uncertainty, Vagueness
Uncertainties are ubiquitous in risk analysis, and on the face of it we must contend with a number of quite distinct sorts of uncertainty. There are, of course, many methods on hand to deal with uncertainty, so it is important to select the method best suited to the uncertainty in question. There is, however, a growing push towards dealing with all uncertainty in one fell swoop. That is, it is thought to be desirable to employ a single method capable of quantifying all sources of uncertainty. One candidate for this task is probability theory. For such a program to succeed, a demonstration that all uncertainty is …
A Predictive Theory of Games, 2006
Abstract

Cited by 2 (1 self)
Conventional noncooperative game theory hypothesizes that the joint (mixed) strategy of a set of reasoning players in a game will necessarily satisfy an “equilibrium concept”. All other joint strategies are considered impossible. Moreover, often the set of joint strategies satisfying that equilibrium concept has measure zero. (Indeed, this is often considered a desirable property of an equilibrium concept.) Under this hypothesis the only issue is what equilibrium concept is “correct”. This hypothesis violates the first-principles arguments underlying probability theory. Indeed, probability theory renders moot the controversy over what equilibrium concept is correct — while in general there are joint (mixed) strategies with zero probability, in general the set {strategies with nonzero probability} has measure greater than zero. Rather than a first-principles derivation of an equilibrium concept, game theory requires a first-principles derivation of a distribution over joint strategies.
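The contrast drawn in this abstract (a distribution over joint strategies with full support, rather than a measure-zero equilibrium set) can be illustrated with a toy construction. This is not the paper's own derivation; the 2x2 payoff table and the Boltzmann weighting below are invented purely for illustration:

```python
import math

# A hypothetical 2x2 game; payoffs[(row_move, col_move)] = (row_payoff, col_payoff).
# The numbers are invented.
payoffs = {
    (0, 0): (3, 3), (0, 1): (0, 5),
    (1, 0): (5, 0), (1, 1): (1, 1),
}

def boltzmann_over_joint(payoffs, beta=1.0):
    """Give every joint strategy nonzero probability, weighted by
    exp(beta * total payoff): a distribution over joint strategies
    rather than a single equilibrium point."""
    weights = {s: math.exp(beta * sum(payoffs[s])) for s in payoffs}
    total = sum(weights.values())
    return {s: w / total for s, w in weights.items()}

dist = boltzmann_over_joint(payoffs)
for strategy in sorted(dist):
    print(strategy, round(dist[strategy], 3))
```

Every joint strategy keeps nonzero probability, so no outcome is declared impossible a priori; `beta` controls how sharply the distribution concentrates on high-payoff outcomes.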
Extending Heuer’s analysis of competing hypotheses method to support complex decision analysis, in Proceedings of the 2005 International Conference on Intelligence Analysis (IA05) (CD-ROM), 2005
Abstract

Cited by 1 (1 self)
In this paper, we evaluate the Analysis of Competing Hypotheses (ACH) method using a normative Bayesian probabilistic framework. We describe the ACH method and present an example of how to use it to structure an analytic problem. We then show how to represent the same analytic problem using Bayesian networks and compare the result with that using the ACH method. We discuss how Bayesian networks generalize ACH tables and why the added generality might be important to the analyst for hypothesis management. Finally, we propose an approach for acquiring analytic models that interpret situations and for evaluating hypotheses, thereby combining the strengths of ACH and Bayesian networks.
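One way to see how a Bayesian treatment generalizes an ACH table is that each row of evidence becomes a likelihood vector over the competing hypotheses, folded in by Bayes' rule. The hypotheses and likelihood numbers below are invented for illustration; the paper's own models are richer Bayesian networks:

```python
def update(priors, likelihoods):
    """Posterior over competing hypotheses after one piece of evidence.

    priors: dict mapping hypothesis -> prior probability
    likelihoods: dict mapping hypothesis -> P(evidence | hypothesis)
    """
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnorm.values())
    return {h: w / total for h, w in unnorm.items()}

# A hypothetical three-hypothesis table: each evidence row is scored as
# P(evidence | hypothesis) and folded in one row at a time.
posterior = {"H1": 1 / 3, "H2": 1 / 3, "H3": 1 / 3}
evidence_rows = [
    {"H1": 0.9, "H2": 0.5, "H3": 0.1},  # strongly consistent with H1
    {"H1": 0.7, "H2": 0.2, "H3": 0.2},
]
for row in evidence_rows:
    posterior = update(posterior, row)
print(posterior)  # H1 dominates after both rows
```

Unlike a raw ACH consistency tally, the posterior depends on how diagnostic each piece of evidence is, not just on how many rows favor a hypothesis.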
Predictive Game Theory, 2005
Abstract

Cited by 1 (1 self)
Conventional noncooperative game theory hypothesizes that the joint (mixed) strategy of a set of reasoning players in a game will necessarily satisfy an “equilibrium concept”. The set of joint strategies satisfying that equilibrium concept has measure zero, and all other joint strategies are considered impossible. Under this hypothesis the only issue is what equilibrium concept is “correct”. This hypothesis violates the first-principles arguments underlying probability theory. Indeed, probability theory renders moot the controversy over what equilibrium concept is correct — in general all joint (mixed) strategies in a set with nonzero measure can arise with nonzero probability. Rather than a first-principles derivation of an equilibrium concept, game theory requires a first-principles derivation of a distribution over joint strategies. However, say one wishes to predict a single joint strategy from that …
Combining Facts and Expert Opinion in Analytical Models via Logical and Probabilistic Reasoning Summary
Abstract
This report describes a proposal to develop computationally feasible technology to support optimal management of uncertainty, inconsistency and disagreement in collaborative intelligence analysis by importing semantically guided proof search techniques in development at HNC into Bayesian network techniques in development at USC. Uncertainty is pervasive in intelligence analysis. Support systems for intelligence must be able to quantify and track uncertainties in evidence findings, in data used by inferential processes, in the imperfect theories that emerge from the individual and collective experience of analysts, and from other sources. Bayesian probability theory defines the unique paradox-free method for reasoning with uncertainty, a proven result [Van Horn 2003] that is less widely known than it deserves to be. Although they enjoy certain advantages in versatility and computational complexity, logical knowledge bases are ill-suited to represent uncertainty and then reason about it correctly, because knowledge representation languages based on classical logic do not provide facilities for representing and reasoning about uncertainty expressed in a probabilistic form. Recent work shows that, in principle, such facilities can be provided by extending the logical framework to support such representations as multiple-entity Bayesian networks and probabilistic relational models.
Bayesian Statistical Analysis, 2005
Abstract
There are a great many books on Bayesian statistics available these days. This tutorial is being written since many of our papers and future work in gene arrays rely upon repeated use of these methods. Rather than continually have our readers refer to a variety of textbooks, we thought it best to have a …
On thinking probabilistically
Abstract
Abstract. This is about a personal journey starting from my lifelong skepticism about statistical significance tests — perhaps the most misapplied of all mathematical theories, especially as regards extreme events — toward a new clarity and power in the use of probability theory and a clear resolution of old dilemmas about subjectivity versus objectivity. There is little original thought here. Rather, the idea is to pull together some rudimentary threads, often seen as unrelated, from mathematics, biology, experimental psychology, and information theory.
Significance Testing as Perverse Probabilistic Reasoning: Supplemental Material
Abstract
Fundamental ‘forward’ and ‘backward’ probabilities in diagnostic testing
Interpreting diagnostic tests
When a patient ‘tests positive’ for a disease, what is the probability that this patient actually has the disease? For concreteness, consider using the serum level of brain natriuretic peptide (BNP) as a test for congestive heart failure, defining a positive result as BNP > 100, which reportedly has a sensitivity of 97% and specificity of 84% [1]. The interpretation of a positive result depends on the clinical context (e.g. the presence or absence of dyspnea on exertion, jugular venous distention, third heart sound, or edema), that is, on the pretest probability. The nomogram in Figure 1 allows one to precisely quantify the amount by which one’s clinical suspicion (i.e. pretest probability) should be swayed by the BNP value [2].
Positive predictive values
The diagnostic testing problem can be formulated in general terms as follows. Suppose we must decide whether or not patients have a particular disease. We consider two mutually exclusive hypotheses: H1 = ‘the disease is present’, or H0 = ‘the disease is absent’, as determined by some gold standard (e.g. a biopsy, or observation and further testing). Suppose a test for the disease (other than the gold standard) gives results as either negative D0 or positive D1. Patients can be divided into four groups: true positives (H1, D1), false positives (H0, D1), true negatives (H0, D0), …
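The Bayes-rule calculation behind such a nomogram can be sketched directly. Only the sensitivity (97%) and specificity (84%) come from the text; the pretest probabilities below are illustrative:

```python
def positive_predictive_value(pretest_p, sensitivity, specificity):
    """P(disease | positive test) by Bayes' rule."""
    true_pos = sensitivity * pretest_p                   # P(test+, disease present)
    false_pos = (1.0 - specificity) * (1.0 - pretest_p)  # P(test+, disease absent)
    return true_pos / (true_pos + false_pos)

# BNP > 100 as a test for congestive heart failure:
# sensitivity 97%, specificity 84% (figures quoted in the text).
for pretest in (0.05, 0.20, 0.50):
    ppv = positive_predictive_value(pretest, 0.97, 0.84)
    print(f"pretest {pretest:.2f} -> post-test {ppv:.3f}")
```

Even with a fairly sensitive and specific test, a low pretest probability keeps the post-test probability modest, which is exactly the point the nomogram makes graphically.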