Results 1–10 of 16
Bayes Factors
, 1995
Abstract

Cited by 981 (70 self)
In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null is one-half. Although there has been much discussion of Bayesian hypothesis testing in the context of criticism of P-values, less attention has been given to the Bayes factor as a practical tool of applied statistics. In this paper we review and discuss the uses of Bayes factors in the context of five scientific applications in genetics, sports, ecology, sociology and psychology.
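The conjugate case is easy to sketch. As a minimal illustration (our own construction, not from the paper), the Bayes factor for H0: θ = 1/2 against a uniform prior on θ in a binomial model reduces to a ratio of two marginal likelihoods, both available in closed form; the function name is ours:

```python
from math import comb

def bayes_factor_binomial(y, n):
    """BF01 for H0: theta = 0.5 vs H1: theta ~ Uniform(0, 1)
    in a binomial(n, theta) model (hypothetical helper)."""
    m0 = comb(n, y) * 0.5 ** n   # marginal likelihood under H0
    m1 = 1.0 / (n + 1)           # integral of comb(n,y)*theta^y*(1-theta)^(n-y) over [0,1]
    return m0 / m1

# 60 heads in 100 tosses: the data barely discriminate between the hypotheses
bf = bayes_factor_binomial(60, 100)
```

A Bayes factor near 1 means the data favor neither hypothesis, even though a two-sided P-value for the same data is about 0.057, which illustrates the tension with P-values the abstract alludes to.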
Bayes factors and model uncertainty
 DEPARTMENT OF STATISTICS, UNIVERSITY OF WASHINGTON
, 1993
Abstract

Cited by 89 (6 self)
In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null is one-half. Although there has been much discussion of Bayesian hypothesis testing in the context of criticism of P-values, less attention has been given to the Bayes factor as a practical tool of applied statistics. In this paper we review and discuss the uses of Bayes factors in the context of five scientific applications. The points we emphasize are: from Jeffreys's Bayesian point of view, the purpose of hypothesis testing is to evaluate the evidence in favor of a scientific theory; Bayes factors offer a way of evaluating evidence in favor of a null hypothesis; Bayes factors provide a way of incorporating external information into the evaluation of evidence about a hypothesis; Bayes factors are very general, and do not require alternative models to be nested; several techniques are available for computing Bayes factors, including asymptotic approximations which are easy to compute using the output from standard packages that maximize likelihoods; in "nonstandard" statistical models that do not satisfy common regularity conditions, it can be technically simpler to calculate Bayes factors than to derive non-Bayesian significance ...
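The remark about asymptotic approximations computable from standard maximum-likelihood output can be sketched with the Schwarz (BIC) approximation to the log Bayes factor, which needs only maximized log-likelihoods, parameter counts, and the sample size. The coin-flip example below is our own toy, not one of the paper's five applications:

```python
import math

def bic_approx_log_bf10(loglik1, loglik0, d1, d0, n):
    """Schwarz (BIC) approximation to log BF10 for model 1 vs model 0:
    the likelihood-ratio term minus a dimension penalty (sketch)."""
    return (loglik1 - loglik0) - 0.5 * (d1 - d0) * math.log(n)

# 60 successes in 100 Bernoulli trials
y, n = 60, 100
phat = y / n
l1 = y * math.log(phat) + (n - y) * math.log(1 - phat)  # H1: free p (1 parameter)
l0 = n * math.log(0.5)                                  # H0: p = 0.5 (0 parameters)
log_bf10 = bic_approx_log_bf10(l1, l0, 1, 0, n)
```

Here the approximation comes out slightly negative (mild evidence toward the null), roughly agreeing with the exact conjugate calculation for the same data.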
Methods for Approximating Integrals in Statistics with Special Emphasis on Bayesian Integration Problems
 Statistical Science
Abstract

Cited by 32 (4 self)
This paper is a survey of the major techniques and approaches available for the numerical approximation of integrals in statistics. We classify these into five broad categories, namely asymptotic methods, importance sampling, adaptive importance sampling, multiple quadrature and Markov chain methods. Each method is discussed, giving an outline of the basic supporting theory and particular features of the technique. Conclusions are drawn concerning the relative merits of the methods based on the discussion and their application to three examples. The following broad recommendations are made. Asymptotic methods should only be considered in contexts where the integrand has a dominant peak with approximate ellipsoidal symmetry. Importance sampling, and preferably adaptive importance sampling, based on a multivariate Student distribution should be used instead of asymptotic methods in such a context. Multiple quadrature, and in particular subregion adaptive integration, are the algorithms of choice for...
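The recommendation to prefer importance sampling with a multivariate Student proposal can be made concrete with a toy example (ours, not from the survey): estimating a two-dimensional Gaussian integral, whose true value is 2π, by weighting draws from a heavy-tailed Student-t proposal:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
d, nu, n = 2, 4, 100_000

# target: unnormalized N(0, I_2) density; its integral is (2*pi)^(d/2) = 2*pi
def target(x):
    return np.exp(-0.5 * np.sum(x**2, axis=1))

# draw from a multivariate Student-t(nu, 0, I): z / sqrt(g / nu)
z = rng.standard_normal((n, d))
g = rng.chisquare(nu, size=n)
x = z / np.sqrt(g / nu)[:, None]

# Student-t density with identity scale matrix
logc = math.lgamma((nu + d) / 2) - math.lgamma(nu / 2) - (d / 2) * math.log(nu * math.pi)
q = np.exp(logc) * (1 + np.sum(x**2, axis=1) / nu) ** (-(nu + d) / 2)

est = np.mean(target(x) / q)   # importance-sampling estimate of the integral
```

The Student proposal has heavier tails than the Gaussian target, so the importance weights are bounded and the estimator's variance is finite, which is exactly why the survey prefers it to a Gaussian proposal.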
Dynamic Analyses of Information Encoding in Neural Ensembles
 Neural Computation
, 2004
Abstract

Cited by 22 (1 self)
Neural spike train decoding algorithms and techniques to compute Shannon mutual information are important methods for analyzing how neural systems represent biological signals. Decoding algorithms are also one of several strategies being used to design controls for brain-machine interfaces. Developing optimal strategies to design decoding algorithms and compute mutual information is therefore an important problem in computational neuroscience. We present a general recursive filter decoding algorithm based on a point process model of individual neuron spiking activity and a linear stochastic state-space model of the biological signal. We derive from the algorithm new instantaneous estimates of the entropy, entropy rate, and the mutual information between the signal and the ensemble spiking activity. We assess the accuracy of the algorithm by computing, along with the decoding error, the true coverage probability of the approximate 0.95 confidence regions for the individual signal estimates. We illustrate the new algorithm by reanalyzing the position and ensemble neural spiking activity of CA1 hippocampal neurons from two rats foraging in an open circular environment. We compare the performance of this algorithm with a linear filter constructed by the widely used reverse correlation method. The median decoding error for Animal 1 (2) during 10 minutes of open foraging was 5.9 (5.5) cm, the median entropy was 6.9 (7.0) bits, the median information was 9.4 (9.4) bits, and the true coverage probability for 0.95 confidence regions was 0.67 (0.75) using 34 (32) neurons. These findings improve significantly on our previous results and suggest an integrated approach to dynamically reading neural codes, measuring their properties, and quantifying the accuracy with which encoded information is extracted.
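The paper's recursive point-process filter is beyond a short sketch, but the underlying predict/update idea, combining a state-space prior with a Poisson spiking likelihood, can be illustrated with a discretized one-dimensional filter. Everything below (Gaussian tuning curves, rate levels, grid and step sizes) is a made-up toy, not the paper's model or data:

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(0, 1, 201)             # discretized 1-D position
dt, T = 0.01, 500                         # 10 ms bins, 5 s of data
centers = np.linspace(0, 1, 15)           # hypothetical place-field centers

def rates(pos):                           # Gaussian tuning curves (Hz)
    return 5 + 45 * np.exp(-0.5 * ((np.asarray(pos)[..., None] - centers) / 0.08) ** 2)

# simulate a slow random walk and the resulting Poisson spike counts
x = np.empty(T); x[0] = 0.5
for t in range(1, T):
    x[t] = np.clip(x[t - 1] + 0.01 * rng.standard_normal(), 0, 1)
counts = rng.poisson(rates(x) * dt)       # shape (T, cells)

# grid filter: predict with the random-walk kernel, update with the
# Poisson observation likelihood, bin by bin
lam = rates(grid) * dt                    # expected counts, shape (grid, cells)
kern = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 0.01) ** 2)
post = np.ones_like(grid) / grid.size
est = np.empty(T)
for t in range(T):
    pred = kern @ post                                    # one-step prediction
    loglik = counts[t] @ np.log(lam).T - lam.sum(axis=1)  # Poisson log-likelihood
    post = pred * np.exp(loglik - loglik.max())
    post /= post.sum()
    est[t] = grid @ post                                  # posterior-mean decode

median_err = np.median(np.abs(est - x))
```

The paper replaces this brute-force grid with a Gaussian recursive approximation, which is what makes the approach practical for multi-dimensional signals and large ensembles.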
Use of ParInt for Parallel Computation of Statistics Integrals
 Computing Science and Statistics
, 1996
Abstract

Cited by 4 (4 self)
We present applications of ParInt, a package of Parallel and distributed multivariate Integration algorithms. The integrals arise in Bayesian statistical computation. The integrals are transformed according to the methods of Genz and Kass [10]. The parallel integration algorithms are based on subregion partitioning. An overview of ParInt and its graphical user interface is given and the parallel speedups are discussed for the problems under consideration.
1 Introduction
An investigation of fast techniques for numerical integration is motivated by the need to compute computationally intensive multiple integrals arising in various areas of science and engineering, for example in statistics and in finite element applications. The work reported in this paper is part of a project (ParInt) involving the design, analysis and development of a set of coarse grain parallel and distributed algorithms for multivariate numerical integration. We discuss an automatic integration algorithm which uses...
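Subregion partitioning is straightforward to sketch serially: repeatedly bisect the subregion with the largest estimated error. The one-dimensional toy below is our own sketch (ParInt itself is parallel and multivariate); it uses the difference between a one-panel and a two-panel Simpson rule as the per-region error estimate:

```python
import heapq
import math

def adaptive_quad(f, a, b, tol=1e-8, max_splits=10_000):
    """Globally adaptive quadrature by subregion bisection: keep a heap of
    regions keyed on estimated error and always split the worst one
    (ParInt distributes the worst regions across processors instead)."""
    def simpson(lo, hi):
        mid = 0.5 * (lo + hi)
        return (hi - lo) / 6 * (f(lo) + 4 * f(mid) + f(hi))

    def region(lo, hi):
        whole = simpson(lo, hi)
        mid = 0.5 * (lo + hi)
        halves = simpson(lo, mid) + simpson(mid, hi)
        return (-abs(halves - whole), lo, hi, halves)   # max-heap on error

    heap = [region(a, b)]
    for _ in range(max_splits):
        if sum(-r[0] for r in heap) < tol:              # total estimated error
            break
        _, lo, hi, _ = heapq.heappop(heap)
        mid = 0.5 * (lo + hi)
        heapq.heappush(heap, region(lo, mid))
        heapq.heappush(heap, region(mid, hi))
    return sum(r[3] for r in heap)

val = adaptive_quad(lambda t: 1 / (1 + t * t), 0.0, 1.0)   # arctan(1) = pi/4
```

The same structure parallelizes naturally because regions are independent; only the worst-region selection needs coordination.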
Numerical Integration in S-PLUS or R: A Survey
 JOURNAL OF STATISTICAL SOFTWARE
Abstract

Cited by 2 (0 self)
This paper reviews current quadrature methods for approximate calculation of integrals within S-PLUS or R. Starting with the general framework, Gaussian quadrature will be discussed first, followed by adaptive rules and Monte Carlo methods. Finally, a comparison of the methods presented is given. The aim of this survey paper is to help readers, not expert in computing, to apply numerical integration methods and to realize that numerical analysis is an art, not a science.
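For consistency with the other examples we sketch the survey's starting point, Gaussian quadrature, in Python rather than S-PLUS or R; `numpy.polynomial.legendre.leggauss` supplies the Gauss-Legendre nodes and weights, and the helper name is ours:

```python
import numpy as np

def gauss_legendre(f, a, b, n=20):
    """n-point Gauss-Legendre quadrature on [a, b]: exact for
    polynomials up to degree 2n - 1 on [-1, 1], rescaled to [a, b]."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * nodes + 0.5 * (a + b)
    return 0.5 * (b - a) * np.sum(weights * f(x))

val = gauss_legendre(np.exp, 0.0, 1.0)   # integral of e^x on [0, 1] is e - 1
```

For smooth integrands like this one, 20 nodes already reach near machine precision, which is the baseline against which the survey compares adaptive and Monte Carlo rules.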
Monte Carlo integration with acceptance-rejection
 J. Comput. Graphical Statist
Abstract

Cited by 2 (0 self)
This article considers Monte Carlo integration under rejection sampling or Metropolis-Hastings sampling. Each algorithm involves accepting or rejecting observations from proposal distributions other than a target distribution. While taking a likelihood approach, we basically treat the sampling scheme as a random design, and define a stratified estimator of the baseline measure. We establish that the likelihood estimator has no greater asymptotic variance than the crude Monte Carlo estimator under rejection sampling or independence Metropolis-Hastings sampling. We employ a subsampling technique to reduce the computational cost, and illustrate with three examples the computational effectiveness of the likelihood method under general Metropolis-Hastings sampling.
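A minimal rejection-sampling setup makes the ingredients concrete; the target, proposal, and envelope constant below are our own toy choices, and only the crude Monte Carlo estimator is shown (the article's likelihood estimator, which also exploits the rejected draws, is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)

# target: Beta(2, 2) with density p(x) = 6 x (1 - x) on [0, 1]
# proposal: Uniform(0, 1); envelope M = 1.5 since max p = 1.5, so p <= M * q
def rejection_sample(n):
    out = []
    while len(out) < n:
        x = rng.uniform(size=n)
        u = rng.uniform(size=n)
        out.extend(x[u < 6 * x * (1 - x) / 1.5].tolist())  # accept w.p. p/(M q)
    return np.array(out[:n])

draws = rejection_sample(100_000)
crude = draws.mean()   # crude MC estimate of E[X] = 0.5 under Beta(2, 2)
```

The crude estimator discards the rejected candidates entirely; the article's point is that those candidates still carry information about the target, which its likelihood estimator recovers.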
Towards Dependable Perception: Guaranteed Inference for Global Localization
Abstract

Cited by 2 (0 self)
Reliable state estimation is an important enabler for robot operation in human environments. Uncertainty and unpredictability of these environments require global uncertainty problems to be solved for dependable operation. Relative sensors — such as vision, laser and tactile — are common in these applications, leading to challenging perceptual problems for which modern inference methods fail to guarantee accurate estimates. Further, as we show, the reliability of these estimates degrades quickly as initial uncertainty increases. In this paper, our aim is to maximize the amount of information extracted from sensory data, allowing the robot to make the most of its sensors. We present an inference algorithm which guarantees that all optimal solutions will be found and provides provable error bounds on the approximation of the underlying probability distribution. The approach is based on insight into the sensor model, which is used to guide the refinement process in an adaptive grid algorithm. The approach is applicable to a variety of pose estimation problems with relative sensors. We demonstrate the generality of the approach on the examples of indoor robot localization and tactile manipulation, where it dramatically outperforms the state of the art. Empirically, our method increased the safety of decision making to 100%. The proposed algorithm also demonstrated logarithmic dependence on the desired precision, allowing for efficient high-accuracy estimation. In indoor localization experiments, the approach led to 1 mm accuracy of pose estimation based on the commonly used laser range finders. This high accuracy is useful for accurate maneuvering in tight spaces and is sufficient for reliable manipulation of stationary objects of interest within the environment (e.g. door handles, elevator buttons, etc.). It also opens up new potential applications during building construction, inspection and maintenance. In the tactile manipulation setting, the method results in efficient, accurate and reliable 6-DOF object pose estimation from tactile data, allowing for reliable manipulation.
Variance Reduction
Abstract
correlation among $x_{i1}, \ldots, x_{iM}$ in such a way that $s^{*2} < M^{-1} \operatorname{var}_p[g(x_{ij})]$. If in addition the cost of generating the $M$-tuple is insignificantly greater than the cost of generating $M$ independent variables from $p(x)$, then $I_N^M$ provides a computationally more efficient approximation of $I$ than does $I_N$. There are numerous variants on this technique. This section takes up four that account for most use of the method: antithetic variables, systematic sampling, conditional expectations, and control variables. The scope for combining these variance reduction techniques with the methods of Section 4 or Section 6 is enormous. Rather than list all the pos...
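Antithetic variables, the first technique listed, can be sketched directly: pair each uniform draw $U$ with $1 - U$, so the two evaluations of a monotone integrand are negatively correlated and their average has smaller variance than two independent draws. The integrand below is a standard textbook choice, not from this chapter:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# crude Monte Carlo for I = integral of e^x on [0, 1] (true value e - 1)
u = rng.uniform(size=n)
crude = np.exp(u).mean()

# antithetic variables: average g(U) and g(1 - U); the negative correlation
# within each pair shrinks the variance of the pair mean
pairs = 0.5 * (np.exp(u) + np.exp(1 - u))
anti = pairs.mean()

var_crude = np.exp(u).var() / n
var_anti = pairs.var() / n   # note: each pair costs two evaluations of g
```

Even after accounting for the doubled cost per pair, the antithetic estimator here has far smaller variance, because $e^x$ is monotone and the pair correlation is strongly negative.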
Markov Chain Monte Carlo in Practice: A Roundtable Discussion
Abstract
This article is an edited recreation of that discussion. Its purpose is to offer advice and guidance to novice users of MCMC, and to not-so-novice users as well. Topics include building confidence in simulation results, methods for speeding and assessing convergence, estimating standard errors, identification of models for which good MCMC algorithms exist, and the current state of software development.
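Two of the listed topics, estimating standard errors and building confidence in simulation results, can be sketched together: a random-walk Metropolis chain with a batch-means standard error. The standard-normal target, step size, burn-in fraction, and square-root batch rule below are arbitrary toy choices of ours:

```python
import numpy as np

rng = np.random.default_rng(4)
n, step = 100_000, 1.0

# random-walk Metropolis targeting a standard normal
log_p = lambda v: -0.5 * v * v
x = np.empty(n); x[0] = 0.0
for t in range(1, n):
    prop = x[t - 1] + step * rng.standard_normal()
    x[t] = prop if np.log(rng.uniform()) < log_p(prop) - log_p(x[t - 1]) else x[t - 1]

chain = x[n // 10:]          # discard the first 10% as burn-in
mean = chain.mean()

# batch means: an MCMC standard error that respects autocorrelation,
# unlike the naive iid formula std / sqrt(n)
b = int(np.sqrt(chain.size))
batches = chain[: (chain.size // b) * b].reshape(-1, b).mean(axis=1)
se = batches.std(ddof=1) / np.sqrt(batches.size)
```

Because successive MCMC draws are correlated, the batch-means `se` is larger than the naive iid standard error; reporting the former is one of the practices this kind of roundtable advice emphasizes.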