Results 1–10 of 316,815
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
In Bayesian Statistics, 1992
"... Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accurac ..."
Cited by 583 (14 self)
Bayes Factors, 1995
"... In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null ..."
Cited by 1766 (74 self)
Bayesian Interpolation
Neural Computation, 1991
"... Although Bayesian analysis has been in use since Laplace, the Bayesian method of model-comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularisation and model-comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other problems. Regularising constants are set by examining their posterior probability distribution. Alternative regularisers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them ..."
Cited by 721 (17 self)
Markov Random Field Models in Computer Vision, 1994
"... A variety of computer vision problems can be optimally posed as Bayesian labeling in which the solution of a problem is defined as the maximum a posteriori (MAP) probability estimate of the true labeling. The posterior probability is usually derived from a prior model and a likelihood model. The l ..."
Cited by 515 (18 self)
Premotor cortex and the recognition of motor actions
Cognitive Brain Research, 1996
"... In area F5 of the monkey premotor cortex there are neurons that discharge both when the monkey performs an action and when he observes a similar action made by another monkey or by the experimenter. We report here some of the properties of these 'mirror' neurons and we propose that their a ..."
Cited by 702 (44 self)
On Sequential Monte Carlo Sampling Methods for Bayesian Filtering
Statistics and Computing, 2000
"... In this article, we present an overview of methods for sequential simulation from posterior distributions. These methods are of particular interest in Bayesian filtering for discrete time dynamic models that are typically nonlinear and non-Gaussian. A general importance sampling framework is develop ..."
Cited by 1032 (76 self)
Robust Monte Carlo Localization for Mobile Robots, 2001
"... Mobile robot localization is the problem of determining a robot's pose from sensor data. This article presents a family of probabilistic localization algorithms known as Monte Carlo Localization (MCL). MCL algorithms represent a robot's belief by a set of weighted hypotheses (samples), which approximate the posterior under a common Bayesian formulation of the localization problem. Building on the basic MCL algorithm, this article develops a more robust algorithm called Mixture-MCL, which integrates two complementary ways of generating samples in the estimation. To apply this algorithm ..."
Cited by 826 (88 self)
On Bayesian analysis of mixtures with an unknown number of components, 1997