Results 1–10 of 20
The Wang-Landau algorithm in general state spaces: applications and convergence analysis
 Statistica Sinica
Cited by 10 (2 self)
Abstract: The Wang-Landau algorithm ([21]) is a recent Monte Carlo method that has generated much interest in the physics literature due to some spectacular simulation performances. The objective of this paper is twofold. First, we show that the algorithm can be naturally extended to more general state spaces and used to improve on Markov chain Monte Carlo schemes of more interest in statistics. Second, we study the asymptotic behavior of the algorithm. We show that, with an appropriate choice of step size, the algorithm is consistent and a strong law of large numbers holds under fairly mild conditions. We also show by simulation the potential advantage of the WL algorithm for problems in Bayesian inference.
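The flat-histogram mechanics behind the Wang-Landau algorithm can be sketched as follows. This is a minimal illustration on a finite state space with a uniform symmetric proposal, not the paper's general-state-space construction; the function names and the simple `gamma0 / t` step-size schedule are illustrative assumptions.

```python
import math
import random

def wang_landau(states, bin_of, n_bins, n_iter=20000, gamma0=1.0):
    """Minimal Wang-Landau sketch on a finite state space.

    bin_of(x) maps a state to a bin index in [0, n_bins). The algorithm
    maintains log-weights theta and samples from a working distribution
    proportional to exp(-theta[bin_of(x)]), penalizing each visited bin
    so that, as theta converges, all bins are visited evenly.
    """
    theta = [0.0] * n_bins
    x = random.choice(states)
    for t in range(1, n_iter + 1):
        y = random.choice(states)                  # symmetric proposal
        log_ratio = theta[bin_of(x)] - theta[bin_of(y)]
        if math.log(random.random()) < log_ratio:  # Metropolis accept step
            x = y
        theta[bin_of(x)] += gamma0 / t             # decaying step size
    return theta
```

The decaying step size reflects the paper's point that consistency hinges on the step-size choice; the classical physics variant instead halves a fixed modification factor when the histogram flattens.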
A double Metropolis-Hastings sampler for spatial models with intractable normalizing constants
 Journal of Statistical Computation and Simulation
Cited by 8 (2 self)
The problem of simulating from distributions with intractable normalizing constants has received much attention in the recent literature. In this paper, we propose an asymptotic algorithm, the so-called double Metropolis-Hastings (MH) sampler, for tackling this problem. Unlike other auxiliary variable algorithms, the double MH sampler removes the need for exact sampling, the auxiliary variables being generated using MH kernels, and thus can be applied to a wide range of problems for which exact sampling is not available. For problems where exact sampling is available, it typically produces results as accurate as those of the exchange algorithm, but using much less CPU time. The new method is illustrated on various spatial models.
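The exchange-type acceptance step described in this abstract can be sketched as a single parameter update. The interface below (`log_f`, `propose_theta`, `mh_kernel`) is assumed for illustration; the key point is that the intractable normalizing constants cancel in the acceptance ratio, and the auxiliary draw is only an approximate sample produced by m MH steps rather than an exact one.

```python
import math
import random

def double_mh_step(theta, x, log_f, propose_theta, mh_kernel, m=10):
    """One sketch iteration of a double MH-style sampler.

    log_f(x, theta): unnormalized log-likelihood f(x; theta).
    propose_theta(theta): symmetric proposal for the parameter.
    mh_kernel(x, theta, m): m MH steps targeting f(.; theta), started at x.
    """
    theta_new = propose_theta(theta)
    # Auxiliary variable: approximate draw from f(.; theta_new) via m MH steps.
    y = mh_kernel(x, theta_new, m)
    # Exchange-type acceptance ratio; the normalizing constants Z(theta)
    # and Z(theta_new) cancel term by term.
    log_r = (log_f(x, theta_new) + log_f(y, theta)
             - log_f(x, theta) - log_f(y, theta_new))
    if math.log(random.random()) < log_r:
        return theta_new
    return theta
```

Starting the auxiliary chain at the observed data x (rather than an arbitrary point) is what lets the sampler dispense with exact sampling at the cost of an asymptotic, rather than exact, justification.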
On the use of stochastic approximation Monte Carlo for Monte Carlo integration
 Statistics and Probability Letters
, 2009
Cited by 5 (2 self)
The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed in the literature as a dynamic optimization algorithm. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator, by invoking some results from the literature on nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for problems in which the energy landscape is rugged.
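The dynamically weighted estimator mentioned here can be sketched as a self-normalized importance-sampling average. This is an illustrative reading, assuming SAMC draws each x_t from a working density proportional to pi(x)/exp(theta_t[J(x)]), so each draw is reweighted by exp(theta) to recover expectations under pi; the function names are not from the paper.

```python
import math

def samc_estimate(samples, thetas, bin_of, h):
    """Dynamically weighted Monte Carlo estimate of E_pi[h(X)] (sketch).

    samples: states x_t produced by a SAMC run.
    thetas:  thetas[t] is the log-weight vector in force when x_t was drawn.
    bin_of:  maps a state to its energy-bin index.
    Each draw is reweighted by w_t = exp(theta_t[bin_of(x_t)]) to undo the
    flattening that SAMC applies to its working distribution.
    """
    num = den = 0.0
    for x, theta in zip(samples, thetas):
        w = math.exp(theta[bin_of(x)])
        num += w * h(x)
        den += w
    return num / den
```

Because the weights vary over the run (the chain is nonhomogeneous), the self-normalized form is essential; an unnormalized average would be biased by the evolving theta.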
Improving SAMC using smoothing methods: theory and applications to Bayesian model selection problems
 The Annals of Statistics
, 2008
Cited by 4 (3 self)
Stochastic approximation Monte Carlo (SAMC) has recently been proposed by Liang, Liu and Carroll [J. Amer. Statist. Assoc. 102 (2007) 305–320] as a general simulation and optimization algorithm. In this paper, we propose to improve its convergence using smoothing methods and discuss the application of the new algorithm to Bayesian model selection problems. The new algorithm is tested on a change-point identification example. The numerical results indicate that the new algorithm can significantly outperform SAMC and reversible jump MCMC for model selection problems. The new algorithm represents a general form of the stochastic approximation Markov chain Monte Carlo algorithm. It allows multiple samples to be generated at each iteration, and a bias term to be included in the parameter updating step. A rigorous proof of the convergence of the general algorithm is established under verifiable conditions. This paper also provides a framework on how to improve the efficiency of Monte ...
Bayesian computation for statistical models with intractable normalizing constants
, 2008
A Stochastic approximation method for inference in probabilistic graphical models
Cited by 2 (0 self)
We describe a new algorithmic framework for inference in probabilistic models, and apply it to inference for latent Dirichlet allocation (LDA). Our framework adopts the methodology of variational inference, but unlike existing variational methods such as mean field and expectation propagation it is not restricted to tractable classes of approximating distributions. Our approach can also be viewed as a “population-based” sequential Monte Carlo (SMC) method, but unlike existing SMC methods there is no need to design the artificial sequence of distributions. Significantly, our framework offers a principled means to exchange the variance of an importance sampling estimate for the bias incurred through variational approximation. We conduct experiments on a difficult inference problem in population genetics, a problem that is related to inference for LDA. The results of these experiments suggest that our method can offer improvements in stability and accuracy over existing methods, at a comparable cost.
Learning Bayesian Networks for Discrete Data
Cited by 1 (1 self)
Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learning Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. First, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Second, it falls into the class of dynamic importance sampling algorithms: the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single-model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than conventional MCMC simulation-based approaches.
Bayesian Phylogeny Analysis via Stochastic Approximation Monte Carlo
, 2008
Cited by 1 (1 self)
Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local energy minimum when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results favor our method, which tends to produce better consensus trees and more accurate estimates of the parameters of the sequence evolution model, while using less CPU time than the methods under comparison.
Trajectory Averaging for Stochastic Approximation MCMC Algorithms
, 2008
Cited by 1 (1 self)
In this paper, we first show that the trajectory averaging estimator is asymptotically efficient for the stochastic approximation MCMC algorithm under the stability conditions specified in Andrieu et al. (2005), and then apply this result to the stochastic approximation Monte Carlo algorithm (Liang et al., 2007). The theoretical result is illustrated by a numerical example, which indicates that for the stochastic approximation Monte Carlo algorithm, the trajectory averaging estimator is generally superior to the conventional estimator in terms of bias and variance.
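The trajectory averaging estimator itself is simple to state: report the average of the stochastic-approximation iterates rather than the final iterate. The sketch below is a minimal Polyak-Ruppert-style illustration; the `burn_in` parameter is an illustrative assumption, not notation from the paper.

```python
def trajectory_average(thetas, burn_in=0):
    """Trajectory averaging sketch: average the stochastic-approximation
    iterates theta_t (after an optional burn-in) componentwise, instead
    of using only the final iterate.

    thetas: list of iterates, each a list of floats of equal length.
    """
    tail = thetas[burn_in:]
    dim = len(tail[0])
    return [sum(theta[j] for theta in tail) / len(tail) for j in range(dim)]
```

The asymptotic-efficiency result cited above is what justifies this averaging for SAMC-type algorithms: under the stability conditions, the averaged iterate attains the optimal limiting covariance regardless of the step-size constant.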
Reconstructing the Energy Landscape of a Distribution from Monte Carlo Samples
Cited by 1 (0 self)
Defining the energy function as the negative logarithm of the density, we explore the energy landscape of a distribution via the tree of sublevel sets of its energy. This tree represents the hierarchy among the connected components of the sublevel sets. We propose ways to annotate the tree so that it provides information on both topological and statistical aspects of the distribution, such as the local energy minima (local modes), their local domains and volumes, and the barriers between them. We develop a computational method to estimate the tree and reconstruct the energy landscape from Monte Carlo samples simulated over a wide energy range of a distribution. This method can be applied to any distribution on a space with defined connectedness. We test the method on multimodal distributions and posterior distributions to show that the estimated trees are accurate compared to theoretical values. When used to perform Bayesian inference of DNA sequence segmentation, this approach reveals much more information than the standard approach based on marginal posterior distributions. Key words and phrases: Monte Carlo, cluster tree, sublevel set, connected component, disconnectivity graph, posterior distribution, sequence segmentation, change point.
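The hierarchy of sublevel-set components can be sketched on a discretized state space as a union-find sweep: sort states by energy, add them one at a time, and record the level at which previously separate components merge (these levels are the energy barriers). This is a minimal graph-based illustration under that discretization assumption, not the paper's estimation method from Monte Carlo samples.

```python
def sublevel_tree(energies, edges):
    """Sketch: track connected components of the sublevel sets
    {x : E(x) <= c} on a finite graph as the level c sweeps upward.

    energies: energies[i] is E at node i.
    edges:    list of (i, j) pairs defining adjacency (connectedness).
    Returns a list of (level, root_a, root_b) merge records; each merge
    level is an energy barrier between two basins, and the records
    together encode the cluster tree / disconnectivity graph.
    """
    n = len(energies)
    adj = [[] for _ in range(n)]
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    parent = list(range(n))

    def find(i):  # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    merges, active = [], set()
    for i in sorted(range(n), key=energies.__getitem__):
        active.add(i)                      # node enters the sublevel set
        for j in adj[i]:
            if j in active:
                ri, rj = find(i), find(j)
                if ri != rj:               # two basins meet at this level
                    merges.append((energies[i], ri, rj))
                    parent[rj] = ri
    return merges
```

On a three-node path with a high-energy middle node, for example, the two low-energy endpoints form separate components until the sweep reaches the middle node's energy, which is recorded as the barrier between the two modes.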