Results 1 – 10 of 25
Prior Probabilities
IEEE Transactions on Systems Science and Cybernetics, 1968
Cited by 166 (3 self)
Abstract: … the case of location and scale parameters, rate constants, and in Bernoulli trials with unknown probability of success. In realistic problems, both the transformation group analysis and the principle of maximum entropy are needed to determine the prior. The distributions thus found are uniquely determined by the prior information, independently of the choice of parameters. In a certain class of problems, therefore, the prior distributions may now be claimed to be fully as "objective" as the sampling distributions.
I. Background of the problem. Since the time of Laplace, applications of probability theory have been hampered by difficulties in the treatment of prior information. In realistic problems of decision or inference, we often have prior information which is highly relevant to the question being asked; to fail to take it into account is to commit the most obvious inconsistency of reasoning and may lead to absurd or dangerously misleading results. As an extreme example …
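The maximum-entropy construction this abstract refers to can be sketched numerically. Below is a minimal illustration on Jaynes' standard finite-support dice setup (a die whose long-run mean is constrained to 4.5); the function name and the bisection solver for the Lagrange multiplier are ours, not from the paper:

```python
import numpy as np

def maxent_pmf(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on a finite support subject to a mean
    constraint.  The solution has the exponential-family form
    p_k proportional to exp(lam * x_k); the multiplier lam is found
    by bisection, since the tilted mean is increasing in lam."""
    x = np.asarray(list(support), dtype=float)

    def mean_for(lam):
        w = np.exp(lam * (x - x.mean()))  # center exponent for stability
        p = w / w.sum()
        return p @ x, p

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        m, _ = mean_for(mid)
        if m < target_mean:
            lo = mid
        else:
            hi = mid
    return mean_for(0.5 * (lo + hi))[1]

# Jaynes' dice example: faces 1..6 with constrained mean 4.5
p = maxent_pmf(range(1, 7), 4.5)
```

The resulting probabilities increase geometrically across the faces, which is the qualitative shape the maximum-entropy principle predicts under a single mean constraint.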
The case for objective Bayesian analysis
Bayesian Analysis, 2006
Cited by 35 (3 self)
Abstract: Bayesian statistical practice makes extensive use of versions of objective Bayesian analysis. We discuss why this is so, and address some of the criticisms that have been raised concerning objective Bayesian analysis. The dangers of treating the issue too casually are also considered. In particular, we suggest that the statistical community should accept formal objective Bayesian techniques with confidence, but should be more cautious about casual objective Bayesian techniques.
Asymptotics and the theory of inference
2003
Cited by 16 (7 self)
Abstract: Asymptotic analysis has always been very useful for deriving distributions in statistics in cases where the exact distribution is unavailable. More importantly, asymptotic analysis can also provide insight into the inference process itself, suggesting what information is available and how this information may be extracted. The development of likelihood inference over the past twenty-some years provides an illustration of the interplay between techniques of approximation and statistical theory.
When did Bayesian inference become “Bayesian”?
BAYESIAN ANALYSIS, 2006
Cited by 10 (1 self)
Abstract: While Bayes’ theorem has a 250-year history, and the method of inverse probability that flowed from it dominated statistical thinking into the twentieth century, the adjective “Bayesian” was not part of the statistical lexicon until relatively recently. This paper provides an overview of key Bayesian developments, beginning with Bayes’ posthumously published 1763 paper and continuing up through approximately 1970, including the period of time when “Bayesian” emerged as the label of choice for those who advocated Bayesian methods.
Dempster-Shafer theory and statistical inference with weak beliefs
2008
Cited by 10 (10 self)
Abstract: Dempster-Shafer (DS) theory is a powerful tool for probabilistic reasoning and decision-making based on a formal calculus for combining statistical and non-statistical evidence, as represented by a system of belief functions. DS theory has been widely used in computer science and engineering applications, but has yet to reach the statistical mainstream, perhaps because the DS belief functions do not satisfy the familiar long-run frequency properties statisticians are used to. Recently, two of the authors proposed an extension of DS, called the weak belief (WB) approach, which has the ability to incorporate desirable frequency properties into the DS framework. The present paper reviews and extends this WB approach. We present a general description of the WB approach, its interplay with the DS calculus, the resulting maximal belief solution, some simple illustrative examples, and some new perspectives, namely frequency properties/interpretations and the potential of WB for situation-specific inference. New applications of the WB method in two interesting statistical problems (large-scale simultaneous testing and nonparametrics) are given. Simulations show that the WB procedures, suitably calibrated, perform well compared to popular classical methods. Most importantly, the WB approach combines the probabilistic reasoning of DS with the desirable frequency properties of classical statistics, making it possible to solve challenging inference problems in a situation-specific way.
Monte Carlo conditioning on a sufficient statistic
2000
Cited by 6 (3 self)
Abstract: We derive general formulae for computation of conditional expectations of functions φ(X) given T = t, when (X, T) is a pair of random vectors such that T is sufficient compared to X for a parameter θ. The basic assumption is that there is a random vector U with a known distribution, and functions χ, τ, such that (X, T) under the parameter value θ has the same distribution as the pair (χ(U, θ), τ(U, θ)). The conditional expectations are then expressed in terms of ordinary expectations of functions of U, and are thus well suited for computation by Monte Carlo simulation. The clue is to equip the parameter space with a suitably chosen σ-finite measure. The problem of direct sampling from conditional distributions of X given T = t is also considered. It is shown in particular that when the model satisfies a certain pivotal condition, then this can be done by sampling χ(U, θ), where the value of θ is adjusted for each U in such a manner that τ(U, θ) = t is kept fixed. Several examples are given in order to demonstrate different cases which may occur in practice.
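The pivotal-condition sampler this abstract describes can be illustrated concretely for i.i.d. exponential lifetimes, where the sum T = ΣXᵢ is sufficient for the rate θ. Taking U as n standard exponentials, χ(U, θ) = U/θ and τ(U, θ) = ΣUᵢ/θ; solving τ(U, θ̂) = t for each U gives θ̂ = ΣUᵢ/t. This worked example and the function name are ours (a minimal sketch under a rate parameterization, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_x_given_t(n, t, size=10_000):
    """Sample X | T = t for X_1..X_n iid Exponential(theta) with
    T = sum(X), via the pivotal construction: X = chi(U, theta) = U/theta
    for U ~ Exp(1)^n and tau(U, theta) = sum(U)/theta.  For each draw of
    U, theta_hat = sum(U)/t is the value that keeps tau(U, theta_hat) = t
    fixed, so the returned rows all lie on the surface {sum(x) = t}."""
    u = rng.exponential(1.0, size=(size, n))
    theta_hat = u.sum(axis=1) / t          # enforces tau(U, theta_hat) = t
    return u / theta_hat[:, None]          # chi(U, theta_hat)

xs = sample_x_given_t(n=5, t=1.0)
```

Every simulated vector sums to t exactly, and each scaled coordinate Xᵢ/t has the Beta(1, n − 1) marginal expected for the conditional law of exponentials given their sum.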
Some Bayesian perspectives on statistical modelling
1988
Cited by 3 (2 self)
Abstract: I would like to thank my supervisor, Professor A. F. M. Smith, for all his advice and encouragement …
SITUATION-SPECIFIC INFERENCE USING THE DEMPSTER-SHAFER THEORY
Cited by 2 (1 self)
Abstract: R. A. Fisher questioned the sampling-based approach to statistical inference on the grounds that it often cannot really answer the scientific question of interest. Fisher’s fiducial argument and the Dempster-Shafer (DS) theory are inferential methods that strive towards answering these situation-specific questions. For some important problems, such as testing of a sharp null hypothesis, these alternative theories suffer from the same drawbacks as their sampling-based counterparts. The Weak Belief (WB) extension of DS is applied in such cases to achieve the best of both worlds: the desirable personal probability-based inference of DS with the additional flexibility of WB. We formulate a general framework for situation-specific inference, which we call the WB-DS method. Applications of the WB-DS method are illustrated in two important statistical problems, namely large-scale simultaneous hypothesis testing and nonparametrics. We show in simulations that the WB-DS procedures, suitably calibrated, …
Fiducial inference for discrete and continuous distributions
Cited by 1 (1 self)
Abstract: This paper describes the general principles and methods of fiducial inference. A brief survey of its competing inferential theories, as well as a comparison with them, is also provided. Arguments in favour of applying the fiducial method to the parameters of discrete random variables are given, and, as an application, the fiducial distribution associated with the binomial proportion is shown to be of the beta family.
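The beta-family result can be put to work numerically. In one common convention for discrete data (the Clopper-Pearson style bounds), the lower and upper fiducial limits for a binomial proportion after x successes in n trials are quantiles of Beta(x, n − x + 1) and Beta(x + 1, n − x) respectively. The sketch below estimates those quantiles by Monte Carlo rather than an exact inverse-beta routine; the function name and the simulation shortcut are ours, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def fiducial_interval(x, n, level=0.95, draws=200_000):
    """Approximate equal-tailed fiducial limits for a binomial proportion
    p after x successes in n trials, bounding the fiducial distribution
    between Beta(x, n - x + 1) (lower) and Beta(x + 1, n - x) (upper).
    Quantiles are estimated from Monte Carlo draws of each beta law."""
    alpha = 1.0 - level
    lower = float(np.quantile(rng.beta(x, n - x + 1, draws), alpha / 2))
    upper = float(np.quantile(rng.beta(x + 1, n - x, draws), 1 - alpha / 2))
    return lower, upper

# Illustrative data: 7 successes in 20 trials
lo95, hi95 = fiducial_interval(x=7, n=20)
```

The interval straddles the observed proportion 7/20 and reproduces, up to Monte Carlo error, the exact Clopper-Pearson limits that an inverse-beta computation would give.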
Practical Small Sample Asymptotics for Distributions Used in Life-Data Analysis
Cited by 1 (0 self)
Abstract: Fraser (1968) and Lawless (1982, Appendix G) discussed exact conditional intervals for the parameters and quantiles of the location-scale model when complete data are used. Moreover, Lawless (1982) extended the exact method to failure-censored (Type II censored) data. Nevertheless, the exact intervals are difficult to obtain in practice and are unavailable under time censoring (Type I censoring). As a consequence, approximate large-sample intervals are widely used. In this article, a likelihood-based third-order procedure is developed. The method does not require explicit nuisance parameterization and can be easily implemented in algebraic computational packages. Numerical examples are presented to show the accuracy of the method even when the sample size is small.
Keywords: Ancillary; Bartlett correction; Likelihood ratio statistic; Mean and variance correction.
1. INTRODUCTION
The location-scale model has density of the form f(y; μ, σ) = (1/σ) g((y − μ)/σ), …
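The location-scale density quoted from the introduction, f(y; μ, σ) = (1/σ) g((y − μ)/σ), is easy to check numerically. A minimal sketch, taking g to be the standard normal density purely for illustration (the model itself leaves the standard member g general):

```python
import numpy as np

def location_scale_pdf(y, mu, sigma, g):
    """f(y; mu, sigma) = (1/sigma) * g((y - mu) / sigma) for a
    location-scale family with standard density g."""
    return g((y - mu) / sigma) / sigma

# g = standard normal density, chosen only as an illustration
g = lambda z: np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)

# Riemann-sum check that the density integrates to 1 on a wide grid
ys = np.linspace(-40.0, 60.0, 200_001)
dy = ys[1] - ys[0]
total = float(location_scale_pdf(ys, mu=10.0, sigma=5.0, g=g).sum() * dy)
```

The same check works for any valid standard density g (extreme-value, logistic, ...), which is what makes the family convenient in life-data analysis.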