Results 1–10 of 17
Severe Testing as a Basic Concept in a Neyman–Pearson Philosophy of Induction
 British Journal for the Philosophy of Science, 2006
Abstract

Cited by 50 (21 self)
Despite the widespread use of key concepts of the Neyman–Pearson (N–P) statistical paradigm—type I and II errors, significance levels, power, confidence levels—they have been the subject of philosophical controversy and debate for over 60 years. Both current and longstanding problems of N–P tests stem from unclarity and confusion, even among N–P adherents, as to how a test’s (pre-data) error probabilities are to be used for (post-data) inductive inference as opposed to inductive behavior. We argue that the relevance of error probabilities is to ensure that only statistical hypotheses that have passed severe or probative tests are inferred from the data. The severity criterion supplies a metastatistical principle for evaluating proposed statistical inferences, avoiding classic fallacies from tests that are overly sensitive, as well as those not sensitive enough to particular errors and discrepancies.
From association to causation: Some remarks on the history of statistics
 Statist. Sci., 1999
Abstract

Cited by 36 (7 self)
The “numerical method” in medicine goes back to Pierre Louis’ study of pneumonia (1835) and John Snow’s book on the epidemiology of cholera (1855). Snow took advantage of natural experiments and used convergent lines of evidence to demonstrate that cholera is a waterborne infectious disease. More recently, investigators in the social and life sciences have used statistical models and significance tests to deduce cause-and-effect relationships from patterns of association; an early example is Yule’s study on the causes of poverty (1899). In my view, this modeling enterprise has not been successful. Investigators tend to neglect the difficulties in establishing causal relations, and the mathematical complexities obscure rather than clarify the assumptions on which the analysis is based. Formal statistical inference is, by its nature, conditional. If maintained hypotheses A, B, C,... hold, then H can be tested against the data. However, if A, B, C,... remain in doubt, so must inferences about H. Careful scrutiny of maintained hypotheses should therefore be a critical part of empirical work—a principle honored more often in the breach than the observance. Snow’s work on cholera will be contrasted with modern studies that depend on statistical models and tests of significance. The examples may help to clarify the limits of current statistical techniques for making causal inferences from patterns of association.
On specifying graphical models for causation, and the identification problem
 Evaluation Review, 2004
Abstract

Cited by 29 (2 self)
This paper (which is mainly expository) sets up graphical models for causation, having a bit less than the usual complement of hypothetical counterfactuals. Assuming the invariance of error distributions may be essential for causal inference, but the errors themselves need not be invariant. Graphs can be interpreted using conditional distributions, so that we can better address connections between the mathematical framework and causality in the world. The identification problem is posed in terms of conditionals. As will be seen, causal relationships cannot be inferred from a data set by running regressions unless there is substantial prior knowledge about the mechanisms that generated the data. There are few successful applications of graphical models, mainly because few causal pathways can be excluded on a priori grounds. The invariance conditions themselves remain to be assessed.
What Is The Chance Of An Earthquake?
Abstract

Cited by 4 (2 self)
In this paper, we try to interpret such probabilities. Making sense of earthquake forecasts is surprisingly difficult. In part, this is because the forecasts are based on a complicated mixture of geological maps, empirical rules of thumb, expert opinion, physical models, stochastic models, and numerical simulations, as well as geodetic, seismic, and paleoseismic data. Even the concept of probability is hard to define in this context. We examine the problems in applying standard definitions of probability to earthquakes, taking the USGS forecast (the product of a particularly careful and ambitious study) as our lead example. The issues are general, and concern the interpretation more than the numerical values. Despite the work involved in the USGS forecast, the probability estimate is shaky, as is the estimate of its uncertainty.
Supplement to “Parametric or nonparametric? A parametricness index for model selection.” DOI: 10.1214/11-AOS899SUPP
 , 2011
Dutch Book against some ‘Objective’ Priors
Abstract

Cited by 3 (2 self)
‘Dutch book’ and ‘strong inconsistency’ are generally equivalent: there is a system of bets that makes money for the gambler, whatever the state of nature may be. As de Finetti showed, an oddsmaker who is not a Bayesian is subject to a Dutch book, under certain highly stylized rules of play, a fact often used as an argument against frequentists. However, so-called ‘objective’ or ‘uninformative’ priors may also be subject to a Dutch book. This note explains, in a relatively simple and self-contained way, how to make a Dutch book against a frequently recommended uninformative prior for covariance matrices.
Introductory Remarks on Metastatistics for The Practically Minded Non-Bayesian Regression Runner
, 2008
Mapping of Probabilities: Theory for the Interpretation of Uncertain Physical Measurements
, 2007
Abstract

Cited by 1 (0 self)
In this book, I attempt to reach two goals. The first is purely mathematical: to clarify some of the basic concepts of probability theory. The second goal is physical: to clarify the methods to be used when handling the information brought by measurements, in order to understand how accurate the inferences they allow are. Probability theory is solidly based on the Kolmogorov axioms, but the basic inference tool provided by Kolmogorov’s theory is the definition of conditional probability. While some simple problems can be solved through this notion of conditional probability, more elaborate problems, in particular, most of the inference problems that use inaccurate observations, require a more advanced probability theory. When considering sets, there are some well known notions, for instance, the intersection of two sets, or, when a mapping is considered between two sets, the notion of image of a set, or of reciprocal image of a set. I develop in this book the theory that generalizes these notions when, instead of sets,
Salt and blood pressure: Conventional wisdom reconsidered
 Evaluation Review, 2001
Abstract

Cited by 1 (1 self)
The “salt hypothesis” is that higher levels of salt in the diet lead to higher levels of blood pressure, increasing the risk of cardiovascular disease. Intersalt, a cross-sectional study of salt levels and blood pressures in 52 populations, is often cited to support the salt hypothesis, but the data are somewhat contradictory. Four of the populations (Kenya, Papua, and two Indian tribes in Brazil) do have low levels of salt and blood pressure. Across the other 48 populations, however, blood pressures go down as salt levels go up—contradicting the hypothesis. Experimental evidence suggests that the effect of a large reduction in salt intake on blood pressure is modest, and health consequences remain to be determined. Funding agencies and medical journals have taken a stronger position favoring the salt hypothesis than is warranted, raising questions about the interaction between the policy process and science.
ADVISOR
, 2010
Abstract
and support of my adviser Dr. Richard Brundage. His guidance encouraged me to fearlessly pursue novel research methods. A special thanks goes to his editorial input during the writing of this thesis. I am indebted to Brian Corrigan and Raymond Miller for letting me use the data from PD 0220390 as part of this thesis and providing suggestions during the course of this research. A special thanks goes to Dr. Timothy Hanson for guiding me on the usage of Bayesian methodologies in this research.