Results 1–10 of 26
An Improved Particle Filter for Nonlinear Problems
2004
Cited by 156 (8 self)
Abstract: The Kalman filter provides an effective solution to the linear-Gaussian filtering problem. However, where there is nonlinearity, either in the model specification or the observation process, other methods are required. We consider methods known generically as particle filters, which include the condensation algorithm and the Bayesian bootstrap or sampling importance resampling (SIR) filter. These filters ...
Bayes factors and model uncertainty
DEPARTMENT OF STATISTICS, UNIVERSITY OF WASHINGTON
1993
Cited by 89 (6 self)
Abstract: In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null is one-half. Although there has been much discussion of Bayesian hypothesis testing in the context of criticism of P-values, less attention has been given to the Bayes factor as a practical tool of applied statistics. In this paper we review and discuss the uses of Bayes factors in the context of five scientific applications. The points we emphasize are: from Jeffreys's Bayesian point of view, the purpose of hypothesis testing is to evaluate the evidence in favor of a scientific theory; Bayes factors offer a way of evaluating evidence in favor of a null hypothesis; Bayes factors provide a way of incorporating external information into the evaluation of evidence about a hypothesis; Bayes factors are very general, and do not require alternative models to be nested; several techniques are available for computing Bayes factors, including asymptotic approximations which are easy to compute using the output from standard packages that maximize likelihoods; in "nonstandard" statistical models that do not satisfy common regularity conditions, it can be technically simpler to calculate Bayes factors than to derive non-Bayesian significance ...
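The "asymptotic approximations which are easy to compute using the output from standard packages that maximize likelihoods" mentioned above include the Schwarz (BIC) approximation to twice the log Bayes factor. A minimal sketch, with purely illustrative maximized log-likelihoods, parameter counts and sample size (none taken from the paper):

```python
import math

# Schwarz (BIC) approximation: 2*log(B10) ~ 2*(l1 - l0) - (d1 - d0)*log(n),
# where l1, l0 are maximized log-likelihoods, d1, d0 the parameter counts,
# and n the sample size. All numbers below are illustrative assumptions.
def twice_log_bayes_factor(loglik1, loglik0, d1, d0, n):
    return 2.0 * (loglik1 - loglik0) - (d1 - d0) * math.log(n)

# Model 1 (3 parameters) vs model 0 (1 parameter), n = 100 observations:
print(twice_log_bayes_factor(-120.0, -130.0, 3, 1, 100))  # about 10.79
```

On conventional scales for interpreting Bayes factors, a value of 2·log B10 this large counts as strong evidence for the larger model.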
An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants
Biometrika
2006
Cited by 49 (2 self)
Abstract: Maximum likelihood parameter estimation and sampling from Bayesian posterior distributions are problematic when the probability density for the parameter of interest involves an intractable normalising constant which is also a function of that parameter. In this paper, an auxiliary variable method is presented which requires only that independent samples can be drawn from the unnormalised density at any particular parameter value. The proposal distribution is constructed so that the normalising constant cancels from the Metropolis–Hastings ratio. The method is illustrated by producing posterior samples for parameters of the Ising model given a particular lattice realisation.
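The cancellation idea can be sketched on a toy problem. What follows is a simplified exchange-style variant of the auxiliary variable construction, not the paper's exact algorithm, and the unnormalised density, prior and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: unnormalised density q(y|theta) = exp(-theta*y^2), whose
# normalising constant Z(theta) = sqrt(pi/theta) we pretend is intractable.
# The move only requires exact simulation from q(.|theta), here N(0, 1/(2*theta)).
def log_q(y, theta):                  # unnormalised log density
    return -theta * y * y

def draw_aux(theta):                  # exact sampler from q(.|theta)
    return rng.normal(0.0, np.sqrt(0.5 / theta))

def log_prior(theta, a=2.0, b=1.0):   # Gamma(a, b) prior on theta
    return (a - 1.0) * np.log(theta) - b * theta

def exchange_chain(y, n_iter=50_000, step=0.5):
    theta, out = 1.0, []
    for _ in range(n_iter):
        prop = theta * np.exp(step * rng.normal())  # log-normal random walk
        x = draw_aux(prop)                          # auxiliary draw at prop
        # Z(theta) and Z(prop) cancel from this acceptance ratio; the final
        # log(prop/theta) term is the Hastings correction for the proposal.
        log_alpha = (log_q(y, prop) + log_prior(prop) + log_q(x, theta)
                     - log_q(y, theta) - log_prior(theta) - log_q(x, prop)
                     + np.log(prop / theta))
        if np.log(rng.random()) < log_alpha:
            theta = prop
        out.append(theta)
    return np.array(out)

# With q as above and a Gamma(2, 1) prior, the exact posterior given one
# observation y is Gamma(2.5, 1 + y^2); for y = 1 its mean is 1.25.
chain = exchange_chain(y=1.0)
print(chain[10_000:].mean())
```

The conjugate check at the end makes the sketch testable: the chain mean after burn-in should sit near the known Gamma posterior mean.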
Building Robust Simulation-based Filters for Evolving Data Sets
1999
Cited by 26 (0 self)
Abstract: In this paper we will focus on an alternative class of filters in which theoretical distributions on the state space are approximated by simulated random measures. The first goal in filter design is to produce a compact description of the posterior distribution of the state given all the observations available so far. A basic requirement is that this description should be readily updated as new data become available. A mechanism has therefore to be devised which enables the approximating random measure to evolve and adapt. Simulation-based filters have a long history in the engineering literature, dating back to the work of Handschin and Mayne (1969); Handschin (1970); Akashi and Kumamoto (1977). Doucet (1998) provides a comprehensive review of the material. Since the Kalman filter is essentially a Bayesian update formula, the theory of Bayesian time series analysis is directly relevant (West and Harrison, 1997). We take as our starting point the filter developed by Gordon (1993); Gordon et al. (1993). The essence of the method is contained in a paper by Rubin (1988), who proposed the Sampling Importance Resampling (SIR) algorithm for obtaining samples from a complex posterior distribution without recourse to MCMC. In the simple non-dynamic case described by Rubin (1988), the method consists of sampling n observations from the prior distribution, attaching weights to the sampled points according to their likelihood, and then sampling with replacement from this weighted discrete distribution. As n → ∞, the resulting set of values then approximates a sample from the required posterior (Smith and Gelfand, 1992). In the dynamic version, proposed by Gordon et al. (1993), the SIR algorithm is applied repeatedly as new data are acquired. One can think of ...
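The non-dynamic SIR procedure described in the excerpt (sample from the prior, weight by likelihood, resample with replacement) can be sketched as follows; the Gaussian prior, likelihood and datum are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sir(n, prior_sample, log_likelihood):
    theta = prior_sample(n)                          # 1. sample from the prior
    logw = log_likelihood(theta)                     # 2. weight by likelihood
    w = np.exp(logw - logw.max())                    #    (stabilised in log space)
    w /= w.sum()
    idx = rng.choice(n, size=n, replace=True, p=w)   # 3. resample with replacement
    return theta[idx]

# Conjugate-normal check: prior N(0, 1) and one datum y = 2 observed with
# unit noise give the exact posterior N(1, 1/2), so for large n the
# resampled mean and variance should be close to 1 and 0.5.
y = 2.0
posterior = sir(
    100_000,
    prior_sample=lambda n: rng.normal(0.0, 1.0, n),
    log_likelihood=lambda th: -0.5 * (y - th) ** 2,
)
print(posterior.mean(), posterior.var())
```

Subtracting `logw.max()` before exponentiating is the standard guard against underflow when likelihoods are tiny; it leaves the normalised weights unchanged.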
PROBABILISTIC PROJECTIONS OF HIV PREVALENCE USING BAYESIAN MELDING
Cited by 21 (11 self)
Abstract: ... the Estimation and Projection Package (EPP) for making national estimates and short-term projections of HIV prevalence based on observed prevalence trends at antenatal clinics. Assessing the uncertainty about its estimates and projections is important for informed policy decision making, and we propose the use of Bayesian melding for this purpose. Prevalence data and other information about the EPP model's input parameters are used to derive a probabilistic HIV prevalence projection, namely a probability distribution over a set of future prevalence trajectories. We relate antenatal clinic prevalence to population prevalence and account for variability between clinics using a random effects model. Predictive intervals for clinic prevalence are derived for checking the model. We discuss predictions given by the EPP model and the results of the Bayesian melding procedure for Uganda, where prevalence peaked at around 28% in 1990; the 95% prediction interval for 2010 ranges from 2% to 7%.
Accurate and efficient curve detection in images: the importance sampling Hough transform
PATTERN RECOGNITION
2002
Studies in Solution Sampling
Cited by 4 (0 self)
Abstract: We introduce novel algorithms for generating random solutions from a uniform distribution over the solutions of a Boolean satisfiability problem. Our algorithms operate in two phases. In the first phase, we use a recently introduced SampleSearch scheme to generate biased samples, while in the second phase we correct the bias by using either Sampling/Importance Resampling or the Metropolis-Hastings method. Unlike state-of-the-art algorithms, our algorithms guarantee convergence in the limit. Our empirical results demonstrate the superior performance of our new algorithms over several competing schemes.
Estimating and Adjusting for Publication Bias Using Data Augmentation in Bayesian Meta-Analysis
1995
Cited by 4 (1 self)
Abstract: We introduce a Bayesian approach which estimates and adjusts for selection bias in a set of studies used in a meta-analysis. We use a hierarchical model for study outcome, and propose an additional model component to account for publication bias, which is the possibility that studies of interest are not equally likely to be published and hence observed studies are not a random sample. Estimation is based on the data augmentation principle, and the number and outcomes of unobserved studies are simulated using Gibbs sampling methods. After examining simulation performance, we apply our techniques to a meta-analysis of 35 studies of the relationship between lung cancer and spousal exposure to environmental tobacco smoke. We find that the 95% posterior probability interval for relative risk is shifted downward after allowing for publication bias. These results are consistent with earlier, ad hoc, approaches to this problem. Keywords and phrases: meta-analysis, publication bias, missing studies, Markov ...
Estimating and Projecting Trends in HIV/AIDS Generalized Epidemics Using Incremental Mixture Importance Sampling
Cited by 3 (2 self)
Abstract: The Joint United Nations Programme on HIV/AIDS (UNAIDS) has decided to use Bayesian melding as the basis for its probabilistic projections of HIV prevalence in countries with generalized epidemics. This combines a mechanistic epidemiological model, prevalence data and expert opinion. Initially, the posterior distribution was approximated by sampling-importance-resampling, which is simple to implement, easy to interpret, transparent to users and gave acceptable results for most countries. For some countries, however, this is not computationally efficient because the posterior distribution tends to be concentrated around nonlinear ridges and can also be multimodal. We propose instead Incremental Mixture Importance Sampling (IMIS), which iteratively builds up a better importance sampling function. This retains the simplicity and transparency of sampling importance resampling, but is much more efficient computationally. It also leads to a simple estimator of the integrated likelihood that is the basis for Bayesian model comparison and model averaging. In simulation experiments and on real data it outperformed both sampling importance resampling and three publicly available generic Markov chain Monte Carlo algorithms for this ...
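The "iteratively builds up a better importance sampling function" step can be illustrated with a simplified one-dimensional sketch: at each stage a Gaussian component is centred at the current highest-weight point and added to the mixture used as the importance density. This is only a caricature of IMIS under an assumed Gaussian prior and likelihood, not the procedure used for the EPP model:

```python
import numpy as np

rng = np.random.default_rng(2)

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def imis(n0=1000, b=500, iters=10, prior_sd=3.0, lik_sd=1.0, y=2.0):
    theta = rng.normal(0.0, prior_sd, n0)   # initial draws from the prior
    comps = []                              # (mean, sd) of added components
    while True:
        prior = norm_pdf(theta, 0.0, prior_sd)
        lik = norm_pdf(y, theta, lik_sd)
        # mixture importance density: prior plus all components added so far
        q = n0 * prior + sum(b * norm_pdf(theta, m, s) for m, s in comps)
        q /= n0 + b * len(comps)
        w = lik * prior / q
        w /= w.sum()
        if len(comps) == iters:
            break
        # centre a new Gaussian at the current highest-weight point, with a
        # weighted spread around it, then draw b fresh points from it
        m_star = theta[np.argmax(w)]
        s_star = np.sqrt(np.average((theta - m_star) ** 2, weights=w))
        comps.append((m_star, s_star))
        theta = np.concatenate([theta, rng.normal(m_star, s_star, b)])
    # final resampling step, as in sampling importance resampling
    return theta[rng.choice(theta.size, size=5000, p=w)]

# With prior N(0, 3^2) and likelihood N(y | theta, 1), the exact posterior
# has mean 1.8 and variance 0.9, so the resampled mean should land near 1.8.
samples = imis()
print(samples.mean())
```

The full method uses multivariate components with covariances estimated from the nearest neighbours of the high-weight point; the one-dimensional weighted spread above stands in for that.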
Approximate Solution Sampling (and Counting) on AND/OR spaces
Cited by 3 (1 self)
Abstract: In this paper, we describe a new algorithm for sampling solutions from a uniform distribution over the solutions of a constraint network. Our new algorithm improves upon the Sampling/Importance Resampling (SIR) component of our previous scheme of SampleSearch-SIR by taking advantage of the decomposition implied by the network's AND/OR search space. We also describe how our new scheme can approximately count and lower bound the number of solutions of a constraint network. We demonstrate both theoretically and empirically that our new algorithm yields far better performance than competing approaches.