Results 1–7 of 7
Transdimensional Markov chain Monte Carlo
 in Highly Structured Stochastic Systems
, 2003
Abstract

Cited by 58 (0 self)
In the context of sample-based computation of Bayesian posterior distributions in complex stochastic systems, this chapter discusses some of the uses for a Markov chain with a prescribed invariant distribution whose support is a union of Euclidean spaces of differing dimensions. This leads into a reformulation of the reversible jump MCMC framework for constructing such ‘transdimensional’ Markov chains. This framework is compared to alternative approaches for the same task, including methods that involve separate sampling within different fixed-dimension models. We consider some of the difficulties researchers have encountered with obtaining adequate performance with some of these methods, attributing some of these to misunderstandings, and offer tentative recommendations about algorithm choice for various classes of problem. The chapter concludes with a look towards desirable future developments.
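The transdimensional idea described above can be made concrete with a toy sketch (not from the chapter): a reversible jump sampler moving between a one-dimensional model (k=1) and a two-dimensional model (k=2), each with standard-normal coordinates and assumed prior model probabilities (0.3, 0.7). Drawing the new coordinate from its own target density makes the proposal density cancel and the dimension-matching Jacobian equal to 1, so the between-model acceptance ratio reduces to a ratio of model probabilities. All names and the target here are illustrative assumptions.

```python
import math
import random

def log_phi(x):
    """Log density of the standard normal."""
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

def rjmcmc(n_iter, p2=0.7, seed=0):
    """Toy reversible jump sampler over models k=1 (theta in R) and
    k=2 (theta in R^2); prior model probabilities are (1 - p2, p2)."""
    rng = random.Random(seed)
    p1 = 1.0 - p2
    theta = [0.0]               # start in model k=1
    k_counts = {1: 0, 2: 0}
    for _ in range(n_iter):
        if rng.random() < 0.5:
            # within-model random-walk Metropolis update on one coordinate
            i = rng.randrange(len(theta))
            prop = theta[i] + rng.gauss(0.0, 1.0)
            if math.log(rng.random()) < log_phi(prop) - log_phi(theta[i]):
                theta[i] = prop
        elif len(theta) == 1:
            # birth move (1 -> 2): draw the new coordinate from its own
            # target density, so the proposal cancels and the Jacobian is 1;
            # the acceptance ratio reduces to p2 / p1
            u = rng.gauss(0.0, 1.0)
            if rng.random() < min(1.0, p2 / p1):
                theta.append(u)
        else:
            # death move (2 -> 1): drop the second coordinate; ratio is p1 / p2
            if rng.random() < min(1.0, p1 / p2):
                theta = theta[:1]
        k_counts[len(theta)] += 1
    return k_counts

counts = rjmcmc(50_000)
freq2 = counts[2] / sum(counts.values())
```

By detailed balance on the model indicator, the chain should visit k=2 a fraction of the time close to p2 = 0.7, which gives a quick sanity check on the implementation.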
Bayesian Curve Fitting Using MCMC With Applications to Signal Segmentation
 IEEE Transactions on Signal Processing
, 2002
Abstract

Cited by 54 (0 self)
We propose some Bayesian methods to address the problem of fitting a signal modeled by a sequence of piecewise constant linear (in the parameters) regression models, for example, autoregressive or Volterra models. A joint prior distribution is set up over the number of changepoints/knots, their positions, and over the orders of the linear regression models within each segment if these are unknown. Hierarchical priors are developed and, as the resulting posterior probability distributions and Bayesian estimators do not admit closed-form analytical expressions, reversible jump Markov chain Monte Carlo (MCMC) methods are derived to estimate these quantities. Results are obtained for standard denoising and segmentation problems for speech data that have already been examined in the literature. These results demonstrate the performance of our methods.
MCMC Methods for Computing Bayes Factors: A Comparative Review
 Journal of the American Statistical Association
, 2000
Abstract

Cited by 30 (1 self)
In this paper we review several of these methods, and subsequently compare them in the context of two examples: the first a simple regression example, and the second a much more challenging hierarchical longitudinal model of the kind often encountered in biostatistical practice. We find that the joint model-parameter space search methods perform adequately but can be difficult to program and tune, while the marginal likelihood methods are often less troublesome and require less in the way of additional coding. Our results suggest that the latter methods may be most appropriate for practitioners working in many standard model choice settings, while the former remain important for comparing large numbers of models, or models whose parameters cannot be easily updated in relatively few blocks. We caution, however, that all of the methods we compare require significant human and computer effort, suggesting that less formal Bayesian model choice methods may offer a more realistic alternative in many cases.
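The simplest member of the marginal likelihood family the abstract alludes to can be sketched as follows (an illustrative toy, not the paper's own comparison): estimate m(y) = ∫ p(y | θ) p(θ) dθ by averaging the likelihood over prior draws, then form the Bayes factor as a ratio of marginal likelihoods. The specific models below (a normal-mean model versus a point null) are assumptions chosen so the answer has a closed form to check against.

```python
import math
import random

def loglik(y, theta):
    # Likelihood: y ~ N(theta, 1)
    return -0.5 * (y - theta) ** 2 - 0.5 * math.log(2 * math.pi)

def marginal_likelihood(y, prior_draw, n=200_000, seed=0):
    """Plain Monte Carlo estimate of m(y) = E_prior[ p(y | theta) ]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += math.exp(loglik(y, prior_draw(rng)))
    return total / n

y = 1.0
# Model 1: theta ~ N(0, 1), so analytically m1(y) = N(y; 0, 2)
m1 = marginal_likelihood(y, lambda rng: rng.gauss(0.0, 1.0))
# Model 2: point null theta = 0, so m2(y) = N(y; 0, 1) exactly
m2 = math.exp(loglik(y, 0.0))
bayes_factor = m1 / m2
```

Averaging over prior draws is easy to code but degrades quickly when the prior and likelihood disagree, which is one reason the paper's more refined marginal likelihood estimators exist.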
Bayesian Meta-Analysis for Longitudinal Data Models using Multivariate Mixture Priors
, 2002
A Gabor regression scheme for audio signal analysis
 in Proceedings of the IEEE Workshop on Applications of Signal Processing to Audio and Acoustics
Abstract

Cited by 1 (1 self)
Here we describe novel Bayesian models for time-frequency analysis of nonstationary audio waveforms. These models are based on the idea of a Gabor regression, in which a time series is represented as a superposition of time-frequency shifted versions of a simple window function. Prior distributions over the corresponding time-frequency coefficients are constructed in a manner which favours both smoothness of the estimated function and sparseness of the coefficient representation (either indirectly through scale mixtures of normals, or directly through prior probability mass at zero). In this way prior regularisation may induce a parsimonious, meaningful representation of the underlying audio time series.

1. THE DISCRETE-TIME GABOR EXPANSION

The discrete-time Gabor expansion of a periodic sequence f ∈ ℓ²(Z) having period L is given by f = ∑_{m=0}^{M−1} ∑_{n=0}^{N−1} …
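The truncated expansion above can be illustrated with a minimal synthesis sketch. The convention below is an assumption, not necessarily the paper's: atoms are circular time shifts of a window g by multiples of a hop a, modulated at normalised frequencies n/N, and f is the coefficient-weighted sum of atoms. The window, sizes, and function names are all hypothetical.

```python
import cmath
import math

def gabor_atom(g, m, n, a, N):
    """Time-frequency shifted window: shift g circularly by m*a samples,
    then modulate at normalised frequency n/N (assumed convention)."""
    L = len(g)
    return [g[(l - m * a) % L] * cmath.exp(2j * math.pi * n * l / N)
            for l in range(L)]

def gabor_synthesis(c, g, a, N):
    """f[l] = sum_{m=0}^{M-1} sum_{n=0}^{N-1} c[m][n] * g_{m,n}[l]."""
    L = len(g)
    f = [0j] * L
    for m, row in enumerate(c):
        for n, coeff in enumerate(row):
            if coeff:
                atom = gabor_atom(g, m, n, a, N)
                for l in range(L):
                    f[l] += coeff * atom[l]
    return f

# Gaussian window of length L = 16, hop a = 4 (M = 4 shifts), N = 4 bins
L, a, M, N = 16, 4, 4, 4
g = [math.exp(-0.5 * ((l - L / 2) / 2.0) ** 2) for l in range(L)]
c = [[0.0] * N for _ in range(M)]
c[1][2] = 1.0        # a single active time-frequency coefficient
f = gabor_synthesis(c, g, a, N)
```

With a single unit coefficient, the synthesis returns exactly the corresponding atom, which is a convenient correctness check; the paper's sparsity priors act on the array c in such a representation.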
© 2002 Kluwer Academic Publishers. Manufactured in The Netherlands.
On Bayesian model and variable selection using MCMC
, 1998
Abstract
Several MCMC methods have been proposed for estimating probabilities of models and associated ‘model-averaged’ posterior distributions in the presence of model uncertainty. We discuss, compare, develop and illustrate several of these methods, focussing on connections between them. Keywords: Gibbs sampler, independence sampler, Metropolis–Hastings, reversible jump