Results 1-10 of 36
Bayesian hypothesis testing: A reference approach
 Internat. Statist. Rev
, 2002
"... For any probability model M ≡{p(x  θ, ω), θ ∈ Θ, ω ∈ Ω} assumed to describe the probabilistic behaviour of data x ∈ X, it is argued that testing whether or not the available data are compatible with the hypothesis H0 ≡{θ = θ0} is best considered as a formal decision problem on whether to use (a0), ..."
Abstract

Cited by 17 (5 self)
For any probability model M ≡ {p(x | θ, ω), θ ∈ Θ, ω ∈ Ω} assumed to describe the probabilistic behaviour of data x ∈ X, it is argued that testing whether or not the available data are compatible with the hypothesis H0 ≡ {θ = θ0} is best considered as a formal decision problem on whether to use (a0), or not to use (a1), the simpler probability model (or null model) M0 ≡ {p(x | θ0, ω), ω ∈ Ω}, where the loss difference L(a0, θ, ω) − L(a1, θ, ω) is proportional to the amount of information δ(θ0, θ, ω) which would be lost if the simplified model M0 were used as a proxy for the assumed model M. For any prior distribution π(θ, ω), the appropriate normative solution is obtained by rejecting the null model M0 whenever the corresponding posterior expectation ∫∫ δ(θ0, θ, ω) π(θ, ω | x) dθ dω is sufficiently large. Specification of a subjective prior is always difficult, and often polemical, in scientific communication. Information theory may be used to specify a prior, the reference prior, which only depends on the assumed model M, and mathematically describes a situation where no prior information is available about the quantity of interest. The reference posterior expectation, d(θ0, x) = ∫ δ π(δ | x) dδ, of the amount of information δ(θ0, θ, ω) which could be lost if the null model were used, provides an attractive nonnegative test function, the intrinsic statistic, which is
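The decision rule described in this abstract can be illustrated with a minimal Monte Carlo sketch. The setting below is hypothetical (a normal mean model with known variance and a flat reference prior, for which δ has a simple closed form); it is an illustration of the general recipe, not the paper's worked example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: x_i ~ N(theta, 1); test H0: theta = theta0.
theta0 = 0.0
x = rng.normal(0.4, 1.0, size=50)
n, xbar = len(x), x.mean()

# Reference posterior for theta under a flat prior: N(xbar, 1/n).
theta_post = rng.normal(xbar, 1.0 / np.sqrt(n), size=200_000)

# delta(theta0, theta): information lost by using the null model, here
# the KL divergence of n i.i.d. N(theta, 1) observations from the null
# N(theta0, 1), which equals n * (theta - theta0)**2 / 2.
delta = n * (theta_post - theta0) ** 2 / 2

# Intrinsic statistic: posterior expectation of delta; reject H0 when large.
d = delta.mean()
print(f"intrinsic statistic d(theta0, x) = {d:.2f}")
```

For this toy model the statistic has the closed form n(x̄ − θ0)²/2 + 1/2, so the Monte Carlo estimate can be checked directly.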
A Geometric Formulation of Occam’s Razor For Inference of Parametric Distributions
, 1996
"... I define a natural measure of the complexity of a parametric distribution relative to a given true distribution called the razor of a model family. The Minimum Description Length principle (MDL) and Bayesian inference are shown to give empirical approximations of the razor via an analysis that signi ..."
Abstract

Cited by 16 (4 self)
I define a natural measure of the complexity of a parametric distribution relative to a given true distribution called the razor of a model family. The Minimum Description Length principle (MDL) and Bayesian inference are shown to give empirical approximations of the razor via an analysis that significantly extends existing results on the asymptotics of Bayesian model selection. I treat parametric families as manifolds embedded in the space of distributions and derive a canonical metric and a measure on the parameter manifold by appealing to the classical theory of hypothesis testing. I find that the Fisher information is the natural measure of distance, and give a novel justification for a choice of Jeffreys prior for Bayesian inference. The results of this paper suggest corrections to MDL that can be important for model selection with a small amount of data. These corrections are interpreted as natural measures of the simplicity of a model family. I show that in a certain sense the logarithm of the Bayesian posterior converges to the logarithm of the razor of a model family as defined here. Close connections with known results on density estimation and “information geometry” are discussed as they arise.
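The abstract's claim that the Fisher information is the natural measure of distance, with the Jeffreys prior proportional to its square root, can be checked in a one-parameter toy case. This Bernoulli example is a generic illustration of that recipe, not a computation from the paper.

```python
import numpy as np

# For a Bernoulli(p) model, the score d/dp log p(x|p) is 1/p when x = 1
# and -1/(1-p) when x = 0. The Fisher information is the expected
# squared score, and the (unnormalized) Jeffreys prior is sqrt(I(p)).
def fisher_info_bernoulli(p):
    return p * (1.0 / p) ** 2 + (1.0 - p) * (1.0 / (1.0 - p)) ** 2

ps = np.linspace(0.05, 0.95, 19)
jeffreys = np.sqrt([fisher_info_bernoulli(p) for p in ps])

# I(p) simplifies to 1/(p(1-p)), so sqrt(I(p)) = p**-0.5 * (1-p)**-0.5,
# i.e. the Beta(1/2, 1/2) density up to normalization.
```

The square-root-of-Fisher-information prior is exactly the Beta(1/2, 1/2) shape, which is the standard Jeffreys prior for a binomial proportion.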
Computing Normalizing Constants for Finite Mixture Models via Incremental Mixture Importance Sampling (IMIS)
, 2003
"... We propose a method for approximating integrated likelihoods in finite mixture models. We formulate the model in terms of the unobserved group memberships, z, and make them the variables of integration. The integral is then evaluated using importance sampling over the z. We propose an adaptive imp ..."
Abstract

Cited by 14 (5 self)
We propose a method for approximating integrated likelihoods in finite mixture models. We formulate the model in terms of the unobserved group memberships, z, and make them the variables of integration. The integral is then evaluated using importance sampling over the z. We propose an adaptive importance sampling function which is itself a mixture, with two types of component distributions, one concentrated and one diffuse. The more concentrated type of component serves the usual purpose of an importance sampling function, sampling mostly group assignments of high posterior probability. The less concentrated type of component allows for the importance sampling function to explore the space in a controlled way to find other, unvisited assignments with high posterior probability. Components are added adaptively, one at a time, to cover areas of high posterior probability not well covered by the current importance sampling function. The method is called Incremental Mixture Importance Sampling (IMIS). IMIS is easy to implement and to monitor for convergence. It scales easily for higher dimensional
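A stripped-down continuous analogue of the idea (a diffuse starting proposal, with a concentrated component added each round where the importance weights are largest) might look as follows. The bimodal target and all numerical choices here are hypothetical; the paper's actual method operates on discrete group memberships z.

```python
import numpy as np

rng = np.random.default_rng(1)

def npdf(x, m, s):  # normal density
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Unnormalized bimodal target; its true normalizing constant is 1.0
# because it is an equal mixture of two normal densities.
def target(x):
    return 0.5 * npdf(x, -3.0, 0.5) + 0.5 * npdf(x, 3.0, 0.5)

# Start with one diffuse component; each round, add a concentrated
# component centred where the current importance weights are largest.
locs, scales = [0.0], [5.0]
for _ in range(5):
    probs = np.full(len(locs), 1.0 / len(locs))
    comp = rng.choice(len(locs), size=4000, p=probs)
    xs = rng.normal(np.array(locs)[comp], np.array(scales)[comp])
    q = sum(p * npdf(xs, l, s) for p, l, s in zip(probs, locs, scales))
    w = target(xs) / q
    locs.append(float(xs[np.argmax(w)]))  # new concentrated component
    scales.append(0.5)

Z_hat = w.mean()  # importance-sampling estimate of the normalizing constant
print(f"estimated normalizing constant: {Z_hat:.3f}")
```

The diffuse component guarantees the proposal has heavier coverage than the target everywhere, while the adaptively placed concentrated components drive the weight variance down around the modes.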
Default priors for Bayesian and frequentist inference
 J. Royal Statist. Soc. B
, 2010
"... We investigate the choice of default prior for use with likelihood to facilitate Bayesian and frequentist inference. Such a prior is a density or relative density that weights an observed likelihood function leading to the elimination of parameters not of interest and accordingly providing a density ..."
Abstract

Cited by 12 (4 self)
We investigate the choice of default prior for use with likelihood to facilitate Bayesian and frequentist inference. Such a prior is a density or relative density that weights an observed likelihood function leading to the elimination of parameters not of interest and accordingly providing a density type assessment for a parameter of interest. For regular models with independent coordinates we develop a second-order prior for the full parameter based on an approximate location relation from near a parameter value to near the observed data point; this derives directly from the coordinate distribution functions and is closely linked to the original Bayes approach. We then develop a modified prior that is targeted on a component parameter of interest and avoids the marginalization paradoxes of Dawid, Stone and Zidek (1973); this uses some extensions of Welch-Peers theory that modify the Jeffreys prior and builds more generally on the approximate location property. A third type of prior is then developed that targets a vector interest parameter in the presence of a vector nuisance parameter and is based more directly on the original Jeffreys approach. Examples are given to clarify the computation of the priors and the flexibility of the approach.
Posterior Distributions in Limited Information Analysis of the Simultaneous Equations Model Using the Jeffreys Prior
 Journal of Econometrics
, 1998
"... Posterior distributions in limited information analysis of the simultaneous equations model using the ..."
Abstract

Cited by 12 (2 self)
Probabilities are single-case, or nothing
 Optics and Spectroscopy
, 2005
"... Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian conception, in which probabilities are conceived as logical constructs rather th ..."
Abstract

Cited by 3 (0 self)
Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference, what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.
Some Bayesian perspectives on statistical modelling
, 1988
"... I would like to thank my supervisor, Professor A. F. M. Smith, for all his advice and encourage ..."
Abstract

Cited by 3 (2 self)
I would like to thank my supervisor, Professor A. F. M. Smith, for all his advice and encourage
Justification as Truth-Finding Efficiency: How Ockham’s Razor Works
, 2004
"... I propose that empirical procedures, like computational procedures, are justified in terms of truthfinding efficiency. I contrast the idea with more standard philosophies of science and illustrate it by deriving Ockham’s razor from the aim of minimizing dramatic changes of opinion en route to the ..."
Abstract

Cited by 3 (0 self)
I propose that empirical procedures, like computational procedures, are justified in terms of truth-finding efficiency. I contrast the idea with more standard philosophies of science and illustrate it by deriving Ockham’s razor from the aim of minimizing dramatic changes of opinion en route to the truth.
CONCERNING DICE AND DIVINITY
, 2006
"... Einstein initially objected to the probabilistic aspect of quantum mechanics— the idea that God is playing at dice. Later he changed his ground, and focussed instead on the point that the Copenhagen Interpretation leads to what Einstein saw as the abandonment of physical realism. We argue here that ..."
Abstract

Cited by 2 (0 self)
Einstein initially objected to the probabilistic aspect of quantum mechanics— the idea that God is playing at dice. Later he changed his ground, and focussed instead on the point that the Copenhagen Interpretation leads to what Einstein saw as the abandonment of physical realism. We argue here that Einstein’s initial intuition was perfectly sound, and that it is precisely the fact that quantum mechanics is a fundamentally probabilistic theory which is at the root of all the controversies regarding its interpretation. Probability is an intrinsically logical concept. This means that the quantum state has an essentially logical significance. It is extremely difficult to reconcile that fact with Einstein’s belief, that it is the task of physics to give us a vision of the world apprehended sub specie aeternitatis. Quantum mechanics thus presents us with a simple choice: either to follow Einstein in looking for a theory which is not probabilistic at the fundamental level, or else to accept that physics does not in fact put us in the position of God looking down on things from above. There is a widespread fear that the latter alternative must inevitably lead to a greatly impoverished,
In Search of a Sulphur Dioxide Environmental Kuznets Curve: A Bayesian Model Averaging Approach, working paper
, 2006
"... The exact specification and motivation of the Environmental Kuznets Curve (EKC) is the subject of a vast literature in environmental economics. A remarkably diverse set of econometric approaches has been employed to support or reject a specific relationship between environmental quality and pollutio ..."
Abstract

Cited by 1 (1 self)
The exact specification and motivation of the Environmental Kuznets Curve (EKC) is the subject of a vast literature in environmental economics. A remarkably diverse set of econometric approaches has been employed to support or reject a specific relationship between environmental quality and pollution. Nevertheless, methods employed to date have not addressed the issue of model uncertainty, given that a sizable number of competing theories exist that can explain the income/pollution relationship. We introduce Bayesian Model Averaging to the EKC analysis to examine a) whether a sulphur dioxide EKC exists, and if so, b) which income/pollution specification is most strongly supported by the data. We find only weak support for an EKC, which disappears altogether when we address oversampling issues in the data. In contrast, our results highlight the relative importance of political economy and site-specific variables in explaining pollution outcomes. Trade is also shown to play an important indirect role. It moderates the influence of the composition effect on pollution. Our findings run contrary to the deterministic view of the income/pollution relationship that is persistent in the literature.
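As an illustration of the model-averaging machinery invoked here (not the paper's data or model space), BIC-approximated posterior model probabilities for a few hypothetical income/pollution specifications can be computed like this:

```python
import numpy as np

# Hypothetical data: pollution generated from a quadratic (EKC-shaped)
# function of income plus noise; none of this is the paper's SO2 panel.
rng = np.random.default_rng(0)
n = 200
income = rng.uniform(0, 10, n)
pollution = 2 + 1.5 * income - 0.12 * income ** 2 + rng.normal(0, 1, n)

def bic(X, y):
    # n*log(RSS/n) + k*log(n): the standard BIC for a Gaussian regression
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return len(y) * np.log(resid @ resid / len(y)) + X.shape[1] * np.log(len(y))

ones = np.ones(n)
models = {
    "linear": np.column_stack([ones, income]),
    "quadratic (EKC)": np.column_stack([ones, income, income ** 2]),
    "cubic": np.column_stack([ones, income, income ** 2, income ** 3]),
}
bics = np.array([bic(X, pollution) for X in models.values()])

# Posterior model probabilities ~ exp(-BIC/2), normalized over the space.
w = np.exp(-(bics - bics.min()) / 2)
w /= w.sum()
for name, wi in zip(models, w):
    print(f"{name}: {wi:.3f}")
```

With the model weights in hand, any quantity of interest (e.g. the income coefficient) is averaged across specifications instead of being conditioned on a single chosen model, which is how BMA accounts for model uncertainty.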