Results 1–10 of 14
Transdimensional Markov chain Monte Carlo
in Highly Structured Stochastic Systems, 2003
"... In the context of samplebased computation of Bayesian posterior distributions in complex stochastic systems, this chapter discusses some of the uses for a Markov chain with a prescribed invariant distribution whose support is a union of euclidean spaces of differing dimensions. This leads into a re ..."
Abstract
Cited by 91 (0 self)
In the context of sample-based computation of Bayesian posterior distributions in complex stochastic systems, this chapter discusses some of the uses for a Markov chain with a prescribed invariant distribution whose support is a union of Euclidean spaces of differing dimensions. This leads into a reformulation of the reversible jump MCMC framework for constructing such 'transdimensional' Markov chains. This framework is compared to alternative approaches for the same task, including methods that involve separate sampling within different fixed-dimension models. We consider some of the difficulties researchers have encountered in obtaining adequate performance with some of these methods, attributing some of these to misunderstandings, and offer tentative recommendations about algorithm choice for various classes of problem. The chapter concludes with a look towards desirable future developments.
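As a concrete illustration of the trans-dimensional idea described in this abstract (the sketch below is not taken from the chapter itself), a reversible jump sampler can move between a zero-parameter model M0: y ~ N(0, 1) and a one-parameter model M1: y ~ N(mu, 1). The birth proposal draws mu from its prior, so the prior cancels in the acceptance ratio and the dimension-matching Jacobian is 1. The prior scale tau, proposal step size, and simulated data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: choose between M0: y ~ N(0, 1) and M1: y ~ N(mu, 1), mu ~ N(0, tau^2)
y = rng.normal(0.8, 1.0, size=60)
tau = 2.0  # prior standard deviation of mu under M1 (an assumed value)

def log_lik(mu):
    # Gaussian log-likelihood up to an additive constant (constants cancel in ratios)
    return -0.5 * np.sum((y - mu) ** 2)

def log_prior_mu(mu):
    return -0.5 * mu ** 2 / tau ** 2 - np.log(tau * np.sqrt(2 * np.pi))

n_iter = 20000
model, mu, count_m1 = 0, 0.0, 0
for _ in range(n_iter):
    # Between-model (trans-dimensional) move
    if model == 0:
        # Birth: propose mu from its prior, so the prior cancels in the
        # acceptance ratio; the dimension-matching Jacobian is 1 here
        mu_new = rng.normal(0.0, tau)
        if np.log(rng.uniform()) < log_lik(mu_new) - log_lik(0.0):
            model, mu = 1, mu_new
    else:
        # Death: drop mu (the exact reverse of the birth move)
        if np.log(rng.uniform()) < log_lik(0.0) - log_lik(mu):
            model, mu = 0, 0.0
    # Within-model move: random-walk Metropolis on mu when M1 is current
    if model == 1:
        mu_prop = mu + rng.normal(0.0, 0.3)
        if np.log(rng.uniform()) < (log_lik(mu_prop) + log_prior_mu(mu_prop)
                                    - log_lik(mu) - log_prior_mu(mu)):
            mu = mu_prop
    count_m1 += model

print("estimated posterior probability of M1:", count_m1 / n_iter)
```

The fraction of iterations spent in M1 estimates its posterior probability, which is the quantity a separate fixed-dimension sampler per model cannot deliver directly.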
Bayesian Curve Fitting Using MCMC With Applications to Signal Segmentation
IEEE Transactions on Signal Processing, 2002
"... We propose some Bayesian methods to address the problem of fitting a signal modeled by a sequence of piecewise constant linear (in the parameters) regression models, for example, autoregressive or Volterra models. A joint prior distribution is set up over the number of the changepoints/knots, their ..."
Abstract
Cited by 76 (1 self)
We propose some Bayesian methods to address the problem of fitting a signal modeled by a sequence of piecewise constant linear (in the parameters) regression models, for example, autoregressive or Volterra models. A joint prior distribution is set up over the number of changepoints/knots, their positions, and the orders of the linear regression models within each segment, if these are unknown. Hierarchical priors are developed and, as the resulting posterior probability distributions and Bayesian estimators do not admit closed-form analytical expressions, reversible jump Markov chain Monte Carlo (MCMC) methods are derived to estimate these quantities. Results are obtained for standard denoising and segmentation problems for speech data that have already been examined in the literature. These results demonstrate the performance of our methods.
MCMC Methods for Computing Bayes Factors: A Comparative Review
Journal of the American Statistical Association, 2000
"... this paper we review several of these methods, and subsequently compare them in the context of two examples, the first a simple regression example, and the second a much more challenging hierarchical longitudinal model of the kind often encountered in biostatistical practice. We find that the joint ..."
Abstract
Cited by 38 (1 self)
In this paper we review several of these methods and subsequently compare them in the context of two examples: the first a simple regression example, and the second a much more challenging hierarchical longitudinal model of the kind often encountered in biostatistical practice. We find that the joint model-parameter space search methods perform adequately but can be difficult to program and tune, while the marginal likelihood methods are often less troublesome and require less additional coding. Our results suggest that the latter methods may be most appropriate for practitioners working in many standard model choice settings, while the former remain important for comparing large numbers of models, or models whose parameters cannot be easily updated in relatively few blocks. We caution, however, that all of the methods we compare require significant human and computer effort, suggesting that less formal Bayesian model choice methods may offer a more realistic alternative in many cases.
Bayesian Meta-Analysis for Longitudinal Data Models using Multivariate Mixture Priors, 2002
"... ..."
Reversible jump Markov chain Monte Carlo strategies for Bayesian model selection in autoregressive processes
Journal of Time Series Analysis, 2004
"... Abstract. This paper addresses the problem of Bayesian inference in autoregressive (AR) processes in the case where the correct model order is unknown. Original hierarchical prior models that allow the stationarity of the model to be enforced are proposed. Obtaining the quantities of interest, such ..."
Abstract
Cited by 7 (1 self)
This paper addresses the problem of Bayesian inference in autoregressive (AR) processes in the case where the correct model order is unknown. Original hierarchical prior models that allow the stationarity of the model to be enforced are proposed. Obtaining the quantities of interest, such as parameter estimates, predictions of future values of the time series, and posterior model-order probabilities, requires integration with respect to the full posterior distribution, an operation which is analytically intractable. Reversible jump Markov chain Monte Carlo (MCMC) algorithms are developed to perform the required integration implicitly by simulating from the posterior distribution. The methods developed are evaluated in simulation studies on a number of synthetic and real data sets. Keywords: autoregressive process; Bayesian estimation; Markov chain Monte Carlo; model selection.
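The central output here is the set of posterior model-order probabilities. A much simpler fixed-dimension alternative to the reversible jump samplers this abstract describes is to enumerate a small set of candidate AR orders and weight them with a BIC approximation to the marginal likelihood. The sketch below does exactly that on a simulated AR(2) series; the coefficients, series length, and maximum order are illustrative assumptions, and BIC is a swapped-in approximation rather than the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a stationary AR(2) series (coefficient values are illustrative)
T = 500
phi = np.array([0.6, -0.3])
x = np.zeros(T)
for t in range(2, T):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + rng.normal()

def bic_ar(x, p, p_max):
    """Least-squares AR(p) fit on a common effective sample; BIC ~ -2 log marginal lik."""
    y = x[p_max:]
    X = np.column_stack([x[p_max - k: len(x) - k] for k in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n = len(y)
    sigma2 = resid @ resid / n
    return n * np.log(sigma2) + (p + 1) * np.log(n)

p_max = 4
bics = np.array([bic_ar(x, p, p_max) for p in range(1, p_max + 1)])
w = np.exp(-0.5 * (bics - bics.min()))  # uniform prior over the candidate orders
probs = w / w.sum()                     # approximate posterior model-order probabilities
print({p: float(np.round(probs[p - 1], 3)) for p in range(1, p_max + 1)})
```

Enumeration like this is only feasible when the candidate set is small; reversible jump MCMC is what makes the same computation practical when the model space is large or the within-model posteriors are themselves intractable.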
Spike and Slab Prior Distributions for Simultaneous Bayesian Hypothesis Testing, Model Selection, and Prediction of Nonlinear Outcomes
"... A small body of literature has used the spike and slab prior specification for model selection with strictly linear outcomes. In this setup a twocomponent mixture distribution is stipulated for coefficients of interest with one part centered at zero with very high precision (the spike) and the oth ..."
Abstract
Cited by 2 (0 self)
A small body of literature has used the spike and slab prior specification for model selection with strictly linear outcomes. In this setup a two-component mixture distribution is stipulated for coefficients of interest, with one part centered at zero with very high precision (the spike) and the other a distribution diffusely centered at the research hypothesis (the slab). Through this selective shrinkage, the setup incorporates the zero-coefficient contingency directly into the modeling process to produce posterior probabilities for hypothesized outcomes. We extend the model to qualitative responses by designing a hierarchy of forms over both the parameter and model spaces to achieve variable selection, model averaging, and individual coefficient hypothesis testing. To overcome the technical challenges in estimating the marginal posterior distributions, possibly with a dramatic ratio of density heights of the spike to the slab, we develop a hybrid Gibbs sampling algorithm using an adaptive rejection approach for various discrete outcome models, including dichotomous, polychotomous, and count responses. The performance of the models and methods is assessed with both Monte Carlo experiments and empirical applications in political science.
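For the strictly linear, Gaussian-outcome case that this paper builds on (not its discrete-outcome extension, which is what requires the adaptive rejection machinery), a spike and slab Gibbs sampler can be sketched in a few lines: coefficients are drawn jointly from their conjugate normal conditional, and each mixture indicator is drawn from a Bernoulli whose odds are the slab-to-spike density ratio at the current coefficient value. The spike/slab variances, sample sizes, and known-noise simplification are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic linear regression: only the first two of five coefficients are nonzero
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(size=n)

v0, v1, sigma2 = 1e-4, 10.0, 1.0  # spike/slab variances, known noise variance (assumed)

def log_norm(b, v):
    # log density of N(0, v) evaluated at b
    return -0.5 * (np.log(2 * np.pi * v) + b * b / v)

gamma = np.ones(p, dtype=int)
keep = []
for it in range(3000):
    # beta | gamma, y: conjugate multivariate normal update
    prior_prec = np.diag(1.0 / np.where(gamma == 1, v1, v0))
    cov = np.linalg.inv(X.T @ X / sigma2 + prior_prec)
    mean = cov @ (X.T @ y) / sigma2
    beta = rng.multivariate_normal(mean, cov)
    # gamma_j | beta_j: Bernoulli with odds = slab density / spike density at beta_j
    for j in range(p):
        log_odds = log_norm(beta[j], v1) - log_norm(beta[j], v0)
        gamma[j] = rng.uniform() < 1.0 / (1.0 + np.exp(-log_odds))
    if it >= 1000:  # discard burn-in
        keep.append(gamma.copy())

incl = np.mean(keep, axis=0)  # posterior inclusion probabilities
print(np.round(incl, 2))
```

The "dramatic ratio of density heights" the abstract mentions is visible here: the spike density at zero exceeds the slab's by orders of magnitude, which is exactly what makes naive samplers sticky in harder settings.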
A Gabor regression scheme for audio signal analysis
 in Proceedings of the IEEE Workshop on Applications of Signal Processing to Audio and Acoustics
"... Here we describe novel Bayesian models for timefrequency analysis of nonstationary audio waveforms. These models are based on the idea of a Gabor regression, in which a time series is represented as a superposition of timefrequency shifted versions of a simple window function. Prior distributions ..."
Abstract
Cited by 1 (1 self)
Here we describe novel Bayesian models for time-frequency analysis of non-stationary audio waveforms. These models are based on the idea of a Gabor regression, in which a time series is represented as a superposition of time-frequency shifted versions of a simple window function. Prior distributions over the corresponding time-frequency coefficients are constructed in a manner which favours both smoothness of the estimated function and sparseness of the coefficient representation (either indirectly through scale mixtures of normals, or directly through prior probability mass at zero). In this way prior regularisation may induce a parsimonious, meaningful representation of the underlying audio time series. 1. THE DISCRETE-TIME GABOR EXPANSION. The discrete-time Gabor expansion of a periodic sequence f ∈ ℓ²(Z) having period L is given by f = Σ_{m=0}^{M−1} Σ_{n=0}^{N−1} …
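The expansion quoted above can be demonstrated numerically. The sketch below builds a dictionary of time-frequency shifted copies of a Gaussian window, using one common convention g_{m,n}[l] = g[(l − ma) mod L] · exp(2πinl/N), and recovers minimum-norm coefficients by least squares so that a test signal is reconstructed as a superposition of the atoms. The window width, lattice parameters, and pseudoinverse-based coefficients are illustrative choices, not the paper's prior-based estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

L = 64             # period of the sequence
a = 8              # hop between window positions, so M = L // a time shifts
M, N = L // a, 16  # N modulations; M * N = 128 > L gives an oversampled dictionary

l = np.arange(L)
window = np.exp(-0.5 * ((l - L / 2) / 6.0) ** 2)  # Gaussian window (width is assumed)

# Atom g_{m,n}[l] = window[(l - m*a) mod L] * exp(2j*pi*n*l/N)
atoms = np.stack(
    [np.roll(window, m * a) * np.exp(2j * np.pi * n * l / N)
     for m in range(M) for n in range(N)],
    axis=1,
)  # shape (L, M*N)

f = rng.normal(size=L)         # an arbitrary real test signal of period L
c = np.linalg.pinv(atoms) @ f  # minimum-norm time-frequency coefficients
f_hat = (atoms @ c).real       # superposition of shifted/modulated windows

print("reconstruction error:", np.linalg.norm(f - f_hat))
```

Because the dictionary is oversampled, the coefficients are not unique; the paper's contribution is to resolve that redundancy with smoothness- and sparsity-inducing priors rather than the minimum-norm solution used here.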
A Bayesian Algorithm for Functional Mapping of Dynamic Complex Traits, 2009
"... algorithms ..."
(Show Context)
SUMMARY, 2000
"... Several MCMC methods have been proposed for estimating probabilities of models and associated `modelaveraged ' posterior distributions in the presence of model uncertainty. We discuss, compare, develop and illustrate several of these methods, focussing on connections between them. ..."
Abstract
Several MCMC methods have been proposed for estimating probabilities of models and associated 'model-averaged' posterior distributions in the presence of model uncertainty. We discuss, compare, develop and illustrate several of these methods, focusing on connections between them.