Results 1–10 of 40
Monte Carlo Statistical Methods
, 1998
Abstract

Cited by 900 (23 self)
This paper is also the origin of the Markov chain Monte Carlo methods developed in the following chapters. The potential of these two simultaneous innovations was discovered much later by statisticians (Hastings 1970; Geman and Geman 1984) than by physicists (see also Kirkpatrick et al. 1983).
Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
 Biometrika
, 1995
Abstract

Cited by 827 (19 self)
Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed standard underlying measure. They have therefore not been available for application to Bayesian model determination, where the dimensionality of the parameter vector is typically not fixed. This article proposes a new framework for the construction of reversible Markov chain samplers that jump between parameter subspaces of differing dimensionality, which is flexible and entirely constructive. It should therefore have wide applicability in model determination problems. The methodology is illustrated with applications to multiple changepoint analysis in one and two dimensions, and to a Bayesian comparison of binomial experiments.
The Equity Premium and Structural Breaks
, 2000
Abstract

Cited by 43 (4 self)
A long return history is useful in estimating the current equity premium even if the historical distribution has experienced structural breaks. The long series helps not only if the timing of breaks is uncertain but also if one believes that large shifts in the premium are unlikely or that the premium is associated, in part, with volatility. Our framework incorporates these features along with a belief that prices are likely to move opposite to contemporaneous shifts in the premium. The estimated premium since 1834 fluctuates between four and six percent and exhibits its sharpest drop in the last decade.
Modeling changing dependency structure in multivariate time series
 In International Conference in Machine Learning
, 2007
Abstract

Cited by 31 (0 self)
We show how to apply the efficient Bayesian changepoint detection techniques of Fearnhead in the multivariate setting. We model the joint density of vector-valued observations using undirected Gaussian graphical models, whose structure we estimate. We show how we can exactly compute the MAP segmentation, as well as how to draw perfect samples from the posterior over segmentations, simultaneously accounting for uncertainty about the number and location of changepoints, as well as uncertainty about the covariance structure. We illustrate the technique by applying it to financial data and to bee tracking data.
Bayesian Online Changepoint Detection
Abstract

Cited by 26 (0 self)
Changepoints are abrupt variations in the generative parameters of a data sequence. Online detection of changepoints is useful in modelling and prediction of time series in application areas such as finance, biometrics, and robotics. While frequentist methods have yielded online filtering and prediction techniques, most Bayesian papers have focused on the retrospective segmentation problem. Here we examine the case where the model parameters before and after the changepoint are independent, and we derive an online algorithm for exact inference of the most recent changepoint. We compute the probability distribution of the length of the current “run,” or time since the last changepoint, using a simple message-passing algorithm. Our implementation is highly modular so that the algorithm may be applied to a variety of types of data. We illustrate this modularity by demonstrating the algorithm on three different real-world data sets.
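The run-length recursion this abstract describes is compact enough to sketch directly. The following is a minimal illustrative implementation, assuming a constant hazard rate and a conjugate Normal-Gamma observation model (so the predictive is Student-t); the function and parameter names are hypothetical, not taken from the paper.

```python
import numpy as np
from math import lgamma

def student_t_pdf(x, df):
    # Standard Student-t density, evaluated elementwise on arrays.
    lg = np.vectorize(lgamma)
    c = np.exp(lg((df + 1) / 2) - lg(df / 2)) / np.sqrt(df * np.pi)
    return c * (1 + x ** 2 / df) ** (-(df + 1) / 2)

def bocpd(data, hazard=0.01, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    """Filter the run-length posterior P(r_t | x_1:t) online.

    R[t, r] holds the posterior probability that the current run has
    length r after observing t data points.
    """
    T = len(data)
    R = np.zeros((T + 1, T + 1))
    R[0, 0] = 1.0
    mu = np.array([mu0]); kappa = np.array([kappa0])
    alpha = np.array([alpha0]); beta = np.array([beta0])
    for t, x in enumerate(data):
        # Predictive probability of x under each candidate run length.
        scale = np.sqrt(beta * (kappa + 1) / (alpha * kappa))
        pred = student_t_pdf((x - mu) / scale, 2 * alpha) / scale
        R[t + 1, 1:t + 2] = R[t, :t + 1] * pred * (1 - hazard)  # run grows
        R[t + 1, 0] = np.sum(R[t, :t + 1] * pred * hazard)      # run resets
        R[t + 1] /= R[t + 1].sum()
        # Conjugate updates; prepend a fresh prior for run length 0.
        mu_new = (kappa * mu + x) / (kappa + 1)
        beta_new = beta + kappa * (x - mu) ** 2 / (2 * (kappa + 1))
        mu = np.concatenate(([mu0], mu_new))
        kappa = np.concatenate(([kappa0], kappa + 1))
        alpha = np.concatenate(([alpha0], alpha + 0.5))
        beta = np.concatenate(([beta0], beta_new))
    return R
```

On a series with a single level shift, the MAP run length grows linearly with t until the shift and then collapses toward zero, which is the behaviour the abstract's message-passing scheme exploits.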
Nonlinearity, Structural Breaks Or Outliers In Economic Time Series?
 Nonlinear Econometric Modeling in Time Series Analysis
, 2000
Abstract

Cited by 16 (4 self)
This paper has its motivation from discussions at the EC
Computational Methods for Complex Stochastic Systems: A Review of Some Alternatives to MCMC
Abstract

Cited by 14 (2 self)
We consider analysis of complex stochastic models based upon partial information. MCMC and reversible jump MCMC are often the methods of choice for such problems, but in some situations they can be difficult to implement and can suffer from problems such as poor mixing and difficulty in diagnosing convergence. Here we review three alternatives to MCMC methods: importance sampling, the forward-backward algorithm, and sequential Monte Carlo (SMC). We discuss how to design good proposal densities for importance sampling, show some of the range of models for which the forward-backward algorithm can be applied, and show how resampling ideas from SMC can be used to improve the efficiency of the other two methods. We demonstrate these methods on a range of examples, including estimating the transition density of a diffusion and of a discrete-state continuous-time Markov chain; inferring structure in population genetics; and segmenting genetic divergence data.
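As a concrete illustration of two of the ideas this review covers, the snippet below sketches self-normalized importance sampling with a multinomial resampling step of the kind used in SMC, on a toy one-dimensional problem. The target, proposal, and all names are illustrative choices, not taken from the paper.

```python
import numpy as np

# Self-normalized importance sampling for a toy unnormalized target
# p(x) ∝ exp(-x^4 / 4), with a N(0, 1) proposal. Designing a good
# proposal is the hard part in practice, as the review emphasizes.
rng = np.random.default_rng(1)
N = 100_000
x = rng.normal(0.0, 1.0, N)              # draws from the proposal q
log_w = (-x ** 4 / 4) - (-x ** 2 / 2)    # log p(x) - log q(x), unnormalized
w = np.exp(log_w - log_w.max())          # subtract the max for stability
w /= w.sum()                             # self-normalized weights

est = np.sum(w * x ** 2)                 # estimate of E[X^2] under p

# Effective sample size diagnoses weight degeneracy; resampling, borrowed
# from SMC, trades the weighted particles for an equally weighted set.
ess = 1.0 / np.sum(w ** 2)
resampled = x[rng.choice(N, size=N, p=w)]
```

For this toy target the true value of E[X²] works out to 2·Γ(3/4)/Γ(1/4) ≈ 0.676, and because the log weight is maximized at x = ±1 the weights are bounded, so the effective sample size stays a large fraction of N; with a poorly matched proposal it collapses, which is exactly the failure mode resampling is meant to address.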
Forecasting and Estimating Multiple Changepoint Models with an Unknown Number of Changepoints
, 2006
Abstract

Cited by 12 (1 self)
This paper develops a new approach to changepoint modeling that allows the number of changepoints in the observed sample to be unknown. The model we develop assumes regime durations have a Poisson distribution. It approximately nests the two most common approaches: the time varying parameter model with a changepoint every period and the changepoint model with a small number of regimes. We focus considerable attention on the construction of reasonable hierarchical priors both for regime durations and for the parameters which characterize each regime. A Markov chain Monte Carlo posterior sampler is constructed to estimate a version of our model which allows for change in conditional means and variances. We show how real-time forecasting can be done in an efficient manner using sequential importance sampling. Our techniques are found to work well in an empirical exercise involving US GDP growth and inflation. Empirical results suggest that the number of changepoints is larger than previously estimated in these series and the implied model is similar to a time varying parameter (with stochastic volatility) model.
Bayesian Partitioning for Classification and Regression
, 1999
Abstract

Cited by 11 (3 self)
In this paper we propose a new Bayesian approach to data modelling. The Bayesian partition model constructs arbitrarily complex regression and classification surfaces by splitting the design space into an unknown number of disjoint regions. Within each region the data is assumed to be exchangeable and to come from some simple distribution. Using conjugate priors, the marginal likelihoods of the models can be obtained analytically for any proposed partitioning of the space, where the number and location of the regions is assumed unknown a priori. Markov chain Monte Carlo simulation techniques are used to obtain distributions on partition structures, and by averaging across samples, smooth prediction surfaces are formed.
An algorithm for optimal partitioning of data on an interval
 IEEE Signal Processing Letters
, 2005
Abstract

Cited by 9 (3 self)
Many signal processing problems can be solved by maximizing the fitness of a segmented model over all possible partitions of the data interval. This letter describes a simple but powerful algorithm that searches the exponentially large space of partitions of N data points in time O(N²). The algorithm is guaranteed to find the exact global optimum, automatically determines the model order (the number of segments), has a convenient real-time mode, can be extended to higher-dimensional data spaces, and solves a surprising variety of problems in signal detection and characterization, density estimation, cluster analysis, and classification. Index Terms — signal detection, density estimation, optimization, Bayesian modeling, histograms, cluster analysis
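The recursion behind such an O(N²) optimal-partitioning search is short enough to sketch. The version below uses a least-squares segment fitness with a fixed per-segment penalty rather than the letter's Bayesian fitness function, so it is an illustrative stand-in for the general scheme, not the published method; all names are hypothetical.

```python
import numpy as np

def optimal_partition(x, penalty=5.0):
    """O(N^2) dynamic program over all partitions of x into segments.

    best[j] is the score of the optimal partition of x[:j]; last[j]
    records where the final segment of that partition starts.
    """
    n = len(x)
    s = np.concatenate(([0.0], np.cumsum(x)))         # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(x ** 2)))   # prefix sums of squares

    def fitness(i, j):
        # Negative within-segment sum of squared deviations for x[i:j],
        # computed in O(1) from the prefix sums.
        m = j - i
        return -(s2[j] - s2[i] - (s[j] - s[i]) ** 2 / m)

    best = np.full(n + 1, -np.inf)
    best[0] = 0.0
    last = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):          # end of the final segment
        for i in range(j):             # start of the final segment
            cand = best[i] + fitness(i, j) - penalty
            if cand > best[j]:
                best[j], last[j] = cand, i
    cps, j = [], n                     # backtrack the changepoints
    while j > 0:
        cps.append(last[j])
        j = last[j]
    return sorted(cps)[1:]             # drop the leading 0
```

Because each candidate final segment is scored in O(1), the double loop gives the O(N²) total cost quoted in the abstract, and the penalty term plays the role of the model-order control: more segments are accepted only when the fitness gain exceeds the per-segment cost.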