Results 1–7 of 7
Bayesian Interpolation
Neural Computation, 1991
Abstract

Cited by 522 (18 self)
Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularisation and model comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other problems. Regularising constants are set by examining their posterior probability distribution. Alternative regularisers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. `Occam's razor' is automatically embodied by this framework. The way in which Bayes infers the values of regularising constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set. This framework is due to Gull and Skilling. 1 Data modelling and Occam's razor: In science, a central task is to develop and compare models to a...
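The evidence-based model comparison this abstract describes can be summarized compactly. The following sketch uses standard notation from the evidence framework (the symbols w, D, and H_i are chosen here for illustration, not taken from the listing): the evidence for a model integrates out its parameters, and approximately factors into a best-fit likelihood times an Occam factor that penalizes the collapse of the prior parameter range onto the posterior range.

```latex
P(D \mid \mathcal{H}_i)
  = \int P(D \mid \mathbf{w}, \mathcal{H}_i)\, P(\mathbf{w} \mid \mathcal{H}_i)\, d\mathbf{w}
  \;\simeq\;
  \underbrace{P(D \mid \mathbf{w}_{\mathrm{MP}}, \mathcal{H}_i)}_{\text{best-fit likelihood}}
  \times
  \underbrace{\frac{\sigma_{\mathbf{w} \mid D}}{\sigma_{\mathbf{w}}}}_{\text{Occam factor}}
```

A model that fits the data with a narrow posterior relative to a broad prior pays a small Occam factor, which is how the framework embodies Occam's razor automatically.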
Information-theoretic asymptotics of Bayes methods
IEEE Transactions on Information Theory, 1990
Abstract

Cited by 107 (10 self)
In the absence of knowledge of the true density function, Bayesian models take the joint density function for a sequence of n random variables to be an average of densities with respect to a prior. We examine the relative entropy distance D_n between the true density and the Bayesian density and show that the asymptotic distance is (d/2) log n + c, where d is the dimension of the parameter vector. Therefore, the relative entropy rate D_n/n converges to zero at rate (log n)/n. The constant c, which we explicitly identify, depends only on the prior density function and the Fisher information matrix evaluated at the true parameter value. Consequences are given for density estimation, universal data compression, composite hypothesis testing, and stock-market portfolio selection.
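Written out, the asymptotic expansion stated in this abstract is (using D_n for the relative entropy distance after n observations, d for the parameter dimension, and c for the constant the paper identifies from the prior and the Fisher information at the true parameter):

```latex
D_n = \frac{d}{2}\log n + c + o(1),
\qquad
\frac{D_n}{n} = O\!\left(\frac{\log n}{n}\right) \longrightarrow 0 .
```

The leading (d/2) log n term grows with the parameter dimension, so the per-sample redundancy D_n/n vanishes, but more complex models converge more slowly.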
Dutch book in simple multivariate normal prediction: Another look
Abstract

Cited by 1 (0 self)
Abstract: In this expository paper we describe a relatively elementary method of establishing the existence of a Dutch book in a simple multivariate normal prediction setting. The method involves deriving a nonstandard predictive distribution that is motivated by invariance. This predictive distribution satisfies an interesting identity which in turn yields an elementary demonstration of the existence of a Dutch book for a variety of possible predictive distributions.
Expectations and Variances of Nonpositive Functions
Poincaré’s Odds
2013
Abstract
This paper is devoted to Poincaré's work in probability. Though the subject does not represent a large part of the mathematician's achievements, it provides significant insight into the evolution of Poincaré's thought on several important matters, such as the changes in physics implied by statistical mechanics and molecular theories. After sketching the general historical context of this evolution, I focus on several important steps in Poincaré's texts dealing with probability theory, and finally consider how his legacy was developed by the next generation.