Results 1-10 of 12
Monte Carlo Statistical Methods
, 1998
Abstract

Cited by 1475 (29 self)
This paper is also the originator of the Markov Chain Monte Carlo methods developed in the following chapters. The potential of these two simultaneous innovations was discovered much later by statisticians (Hastings 1970; Geman and Geman 1984) than by physicists (see also Kirkpatrick et al. 1983).
On the Relationship Between Markov Chain Monte Carlo Methods for Model Uncertainty
 JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS
, 2001
Abstract

Cited by 52 (4 self)
This article considers Markov chain computational methods for incorporating uncertainty about the dimension of a parameter when performing inference within a Bayesian setting. A general class of methods is proposed for performing such computations, based upon a product space representation of the problem which is similar to that of Carlin and Chib. It is shown that all of the existing algorithms for incorporation of model uncertainty into Markov chain Monte Carlo (MCMC) can be derived as special cases of this general class of methods. In particular, we show that the popular reversible jump method is obtained when a special form of Metropolis-Hastings (MH) algorithm is applied to the product space. Furthermore, the Gibbs sampling method and the variable selection method are shown to derive straightforwardly from the general framework. We believe that these new relationships between methods, which were until now seen as diverse procedures, are an important aid to the understanding of MCMC model selection procedures and may assist in the future development of improved procedures. Our discussion also sheds some light upon the important issues of "pseudo-prior" selection in the case of the Carlin and Chib sampler and choice of proposal distribution in the case of reversible jump. Finally, we propose efficient reversible jump proposal schemes that take advantage of any analytic structure that may be present in the model. These proposal schemes are compared with a standard reversible jump scheme for the problem of model order uncertainty in autoregressive time series, demonstrating the improvements which can be achieved through careful choice of proposals.
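As a minimal, hypothetical sketch of the reversible-jump idea discussed in this abstract (not the authors' algorithm), the following Python toy jumps between a point-null model and a one-parameter model. The model, priors, and proposal choices are all illustrative assumptions: proposing theta from its prior makes the prior density cancel in the acceptance ratio, and the Jacobian of the dimension-matching map is 1.

```python
import math
import random

def rj_toy(y, iters=40000, seed=0):
    """Toy reversible jump sampler for one observation y.

    M0: y ~ N(0, 1).  M1: y ~ N(theta, 1) with prior theta ~ N(0, 1).
    Model priors are taken equal, so the birth/death acceptance ratio
    reduces to a pure likelihood ratio (prior cancels, Jacobian = 1).
    """
    rng = random.Random(seed)

    def loglik(th):
        return -0.5 * (y - th) ** 2  # N(y; th, 1) up to a constant

    k, theta, visits_m1 = 0, 0.0, 0
    for _ in range(iters):
        if k == 0:
            # birth move M0 -> M1: propose theta from its N(0, 1) prior
            prop = rng.gauss(0.0, 1.0)
            if math.log(rng.random()) < loglik(prop) - loglik(0.0):
                k, theta = 1, prop
        else:
            # death move M1 -> M0: drop theta
            if math.log(rng.random()) < loglik(0.0) - loglik(theta):
                k, theta = 0, 0.0
        if k == 1:
            # within-model random-walk update (posterior ~ likelihood * prior)
            prop = theta + rng.gauss(0.0, 0.8)
            logr = (loglik(prop) - 0.5 * prop ** 2) \
                 - (loglik(theta) - 0.5 * theta ** 2)
            if math.log(rng.random()) < logr:
                theta = prop
        visits_m1 += k
    return visits_m1 / iters

p_m1 = rj_toy(2.0)  # analytic posterior probability of M1 is about 0.66 here
```

Because both marginal likelihoods are Gaussian, the exact posterior model probability is available for this toy, which makes it a useful correctness check before attempting harder model spaces.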
A Reversible Jump Sampler for Autoregressive Time Series, Employing Full Conditionals to Achieve Efficient Model Space Moves
, 1998
Abstract

Cited by 37 (10 self)
When fitting an autoregressive model to Gaussian time series data, often the correct order of the model is unknown. The model order cannot be estimated analytically by conventional Bayesian techniques when the excitation variance is unknown. We present MCMC methods for drawing samples from the joint posterior of all the unknowns, from which Monte Carlo estimates of the quantities of interest can be made, with the possibility of model mixing, if required, for tasks such as prediction, interpolation, smoothing or noise reduction. Previous work on MCMC autoregressive model selection has parameterised the model using partial correlation coefficients (Barnett, Kohn & Sheather 1996, Barbieri & O'Hagan 1996) or pole positions (Huerta & West 1997). These have a simple physical interpretation for certain types of signal, and allow stationarity to be enforced in a straightforward manner. We use the AR parameters, a, directly. This allows ...
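One ingredient mentioned above can be sketched directly: working with the AR parameters a, their full conditional given the noise variance is Gaussian under an (assumed) flat prior, so they can be drawn in one block. The data-generating settings below are illustrative, not taken from the paper.

```python
import numpy as np

def sample_ar_coeffs(y, p, sigma2, rng):
    """Draw the AR(p) coefficient vector a from its Gaussian full
    conditional N((X'X)^-1 X'y, sigma2 (X'X)^-1), which is the
    posterior under a flat prior given the noise variance sigma2."""
    n = len(y)
    # lagged design matrix: row for time t holds (y[t-1], ..., y[t-p])
    X = np.column_stack([y[p - 1 - j : n - 1 - j] for j in range(p)])
    z = y[p:]
    XtX = X.T @ X
    mean = np.linalg.solve(XtX, X.T @ z)
    cov = sigma2 * np.linalg.inv(XtX)
    return rng.multivariate_normal(mean, cov)

# illustrative check: simulate a stationary AR(2) series, then draw
# posterior samples of a and compare their mean to the true coefficients
rng = np.random.default_rng(1)
true_a = np.array([0.6, 0.2])
y = np.zeros(600)
for t in range(2, 600):
    y[t] = true_a[0] * y[t - 1] + true_a[1] * y[t - 2] + rng.normal()
draws = np.array([sample_ar_coeffs(y, 2, 1.0, rng) for _ in range(200)])
post_mean = draws.mean(axis=0)
```

Sampling a in one Gaussian block like this is what makes joint samplers over (a, order, variance) practical; the order and variance would be updated in separate MCMC steps.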
Time-Varying Autoregressions With Model Order Uncertainty
 Journal of Time Series Analysis
, 1999
Abstract

Cited by 13 (1 self)
We explore some aspects of the analysis of latent component structure in nonstationary time series based on time-varying autoregressive (TVAR) models that incorporate uncertainty on model order. Our modelling approach assumes that the AR coefficients evolve in time according to a random walk and that the model order may also change in time following a discrete random walk. In addition, we use a conjugate prior structure on the autoregressive coefficients and a discrete uniform prior on model order. Simulation from the posterior distribution of the model parameters can be obtained via standard Forward Filtering Backward Simulation algorithms. Aspects of implementation and inference on decompositions, latent structure and model order are discussed for a synthetic series and for an electroencephalogram (EEG) trace previously analysed using fixed-order TVAR models. Keywords: Dynamic linear models; Time-varying autoregressions; Model uncertainty; Time series decompositions; Markov chain Monte Carlo.
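The Forward Filtering Backward Simulation step referenced above can be illustrated on the simplest possible case: a single random-walk coefficient observed in noise (a local-level model). This is a generic sketch under assumed variances, not the paper's full TVAR sampler.

```python
import numpy as np

def ffbs(y, V, W, m0=0.0, C0=1.0, rng=None):
    """Forward filter, backward sample for the local-level model
    y_t = theta_t + v_t, v_t ~ N(0, V); theta_t = theta_{t-1} + w_t, w_t ~ N(0, W)."""
    rng = rng or np.random.default_rng()
    n = len(y)
    m = np.zeros(n)  # filtered means
    C = np.zeros(n)  # filtered variances
    a, R = m0, C0 + W
    for t in range(n):
        if t > 0:
            a, R = m[t - 1], C[t - 1] + W
        K = R / (R + V)                      # Kalman gain
        m[t] = a + K * (y[t] - a)
        C[t] = (1.0 - K) * R
    # backward sampling: draw theta_T, then theta_t | theta_{t+1}, y_{1:t}
    theta = np.zeros(n)
    theta[-1] = rng.normal(m[-1], np.sqrt(C[-1]))
    for t in range(n - 2, -1, -1):
        B = C[t] / (C[t] + W)                # backward gain for a random walk
        mean = m[t] + B * (theta[t + 1] - m[t])
        var = C[t] * W / (C[t] + W)
        theta[t] = rng.normal(mean, np.sqrt(var))
    return theta

rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, 0.1, 200))   # slowly varying latent state
y = truth + rng.normal(0.0, 0.05, 200)         # low observation noise
draw = ffbs(y, V=0.05 ** 2, W=0.1 ** 2, rng=rng)
```

In a TVAR sampler the same two-pass structure applies with a vector state of AR coefficients; the scalar version above shows the recursion without the matrix bookkeeping.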
Reversible jump Markov chain Monte Carlo strategies for Bayesian model selection in autoregressive processes
Journal of Time Series Analysis
, 2004
Abstract

Cited by 7 (1 self)
This paper addresses the problem of Bayesian inference in autoregressive (AR) processes in the case where the correct model order is unknown. Original hierarchical prior models that allow the stationarity of the model to be enforced are proposed. Obtaining the quantities of interest, such as parameter estimates, predictions of future values of the time series, posterior model-order probabilities, etc., requires integration with respect to the full posterior distribution, an operation which is analytically intractable. Reversible jump Markov chain Monte Carlo (MCMC) algorithms are developed to perform the required integration implicitly by simulating from the posterior distribution. The methods developed are evaluated in simulation studies on a number of synthetic and real data sets. Keywords: Autoregressive process; Bayesian estimation; Markov chain Monte Carlo; model selection.
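Once a sampler of this kind has produced draws of the model order, the posterior model-order probabilities mentioned above are just visit frequencies of the chain. A minimal helper, shown with a short hypothetical chain of sampled orders:

```python
from collections import Counter

def order_probabilities(order_draws):
    """Estimate posterior model-order probabilities as the fraction of
    MCMC iterations the chain spent in each order."""
    counts = Counter(order_draws)
    total = sum(counts.values())
    return {p: c / total for p, c in sorted(counts.items())}

# hypothetical chain of sampled AR orders from a reversible jump run
probs = order_probabilities([2, 2, 3, 2, 1, 2, 3, 2, 2, 4])
# probs maps each visited order to its relative frequency, e.g. order 2 -> 0.6
```

In practice the chain would be much longer and burn-in iterations would be discarded before counting.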
Bayesian Time-Varying Autoregressions: Theory, Methods and Applications
 UNIVERSITY OF SAO PAOLO
, 2000
Abstract

Cited by 6 (0 self)
We review the class of time-varying autoregressive (TVAR) models and a range of related recent developments of Bayesian time series modelling. Beginning with TVAR models in a Bayesian dynamic linear modelling framework, we review aspects of latent structure analysis, including time-domain decomposition methods that provide inferences on the structure underlying nonstationary time series, and that are now central tools in the time series analyst's toolkit. Recent model extensions that deal with model order uncertainty, and are enabled using efficient Markov Chain Monte Carlo simulation methods, are discussed, as are novel approaches to sequential filtering and smoothing using particle filtering methods. We emphasize the relevance of TVAR modelling in a range of applied contexts, including biomedical signal processing and communications, and highlight some of the central developments via examples arising in studies of multiple electroencephalographic (EEG) traces in neurophysiology. We conclude with comments about current research frontiers.
Bayesian Analysis of Order Uncertainty in ARIMA Models
Abstract

Cited by 5 (0 self)
... efficient proposal schemes for reversible jump MCMC in the context of autoregressive moving average models. In particular, the full conditional distribution is not available for the added parameters, and approximations to it are provided by an adaptive updating scheme which automatically selects proposal parameter values to improve the efficiency of between-model moves. The performance of the proposed algorithms is assessed by simulation studies and the methodology is illustrated by applying it to a real data set.
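The abstract does not spell out its adaptive updating scheme. As a generic illustration of automatically tuning proposal parameters (an assumption, not the authors' method), here is a standard Robbins-Monro adaptation of a random-walk Metropolis scale toward a target acceptance rate:

```python
import math
import random

def adaptive_rw_mh(logpost, x0, iters=5000, target=0.44, seed=0):
    """Random-walk Metropolis whose log proposal scale is nudged toward a
    target acceptance rate by diminishing (1/sqrt(i)) Robbins-Monro steps,
    so the adaptation fades and the chain settles on a well-tuned kernel."""
    rng = random.Random(seed)
    x, log_s, accepts = x0, 0.0, 0
    for i in range(1, iters + 1):
        prop = x + rng.gauss(0.0, math.exp(log_s))
        acc = math.exp(min(0.0, logpost(prop) - logpost(x)))
        if rng.random() < acc:
            x, accepts = prop, accepts + 1
        # raise the scale when accepting too often, lower it otherwise
        log_s += (acc - target) / math.sqrt(i)
    return math.exp(log_s), accepts / iters

# tune against a standard normal log-density
scale, rate = adaptive_rw_mh(lambda x: -0.5 * x * x, 0.0)
```

The 0.44 target is the classical rule of thumb for one-dimensional random-walk proposals; between-model proposal tuning in reversible jump follows the same adapt-toward-efficiency logic but on the jump parameters.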
Bayesian Model Selection of Autoregressive Processes
Journal of Time Series Analysis
, 2000
Abstract

Cited by 4 (4 self)
This paper poses the problem of model order determination of an autoregressive (AR) process within a Bayesian framework. Several original hierarchical prior models are proposed that allow for the stability of the model to be enforced and account for a possible unknown initial state. Obtaining the posterior model order probabilities requires integration of the resulting posterior distribution, an operation which is analytically intractable. Here stochastic reversible jump Markov chain Monte Carlo (MCMC) algorithms are developed to perform the required integration by simulating from the posterior distribution. The methods developed are evaluated in simulation studies on a number of synthetic and real data sets, and compared to standard model selection criteria.
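Enforcing stability of an AR(p) model, as the priors described above do, amounts to requiring that the roots of the AR characteristic polynomial lie inside the unit circle. One standard check uses the eigenvalues of the companion matrix; this is a generic utility (not taken from the paper) that a sampler could use to reject unstable proposals:

```python
import numpy as np

def ar_is_stable(a):
    """Return True if the AR(p) model y_t = a[0] y_{t-1} + ... + a[p-1] y_{t-p} + e_t
    is stable (stationary): all companion-matrix eigenvalues have modulus < 1,
    i.e. all roots of z^p - a_1 z^{p-1} - ... - a_p lie inside the unit circle."""
    a = np.asarray(a, dtype=float)
    p = len(a)
    companion = np.zeros((p, p))
    companion[0, :] = a                  # first row holds the coefficients
    companion[1:, :-1] = np.eye(p - 1)   # shifted identity below
    return bool(np.all(np.abs(np.linalg.eigvals(companion)) < 1.0))
```

For example, `ar_is_stable([0.5, 0.4])` is True while `ar_is_stable([0.9, 0.2])` is False; inside a reversible jump sampler such a predicate can implement a stationarity-constrained prior by giving unstable coefficient vectors zero prior mass.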
A New Strategy for Simulating From Mixture Distributions With Applications to Bayesian Model Selection
Abstract

Cited by 1 (1 self)
We present a method of generating random vectors from a distribution having an absolutely continuous component and a discrete component. The method is then extended to more general mixture distributions that arise quite naturally when dealing with nested models within a Bayesian framework. The main idea is to transform the mixture distribution of interest into an absolutely continuous one, in a way that does not require the explicit calculation of the relative weights of the various components of the mixture. For nested models, the proposed method represents a simple alternative to Reversible Jump MCMC schemes. Its distinguishing features are the absence of a proposal step to reduce/increase the dimension of the current space and the fact that in order to assess the convergence of the chain, one can use all the standard tools available for MCMC on a space of fixed dimension.
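For concreteness, the kind of target described above, a mixture of an absolutely continuous component and a point mass, can be sampled directly when the component weights are known; the spike-and-slab parameters below are illustrative. (The paper's contribution is precisely to avoid needing these explicit weights; this sketch only sets up the problem.)

```python
import numpy as np

def sample_spike_and_slab(n, w, mu, sigma, rng):
    """Draw n samples from the mixture w * delta_0 + (1 - w) * N(mu, sigma^2):
    a point mass at zero (discrete component) plus a Gaussian slab."""
    is_zero = rng.random(n) < w           # pick the component per draw
    slab = rng.normal(mu, sigma, n)
    return np.where(is_zero, 0.0, slab)

rng = np.random.default_rng(0)
x = sample_spike_and_slab(100000, 0.3, 1.0, 0.5, rng)
frac_zero = float(np.mean(x == 0.0))      # should be close to w = 0.3
```

In the nested-model setting, the point mass corresponds to a parameter being excluded from the model, which is why such mixtures arise naturally in Bayesian model selection.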