Filtering Via Simulation: Auxiliary Particle Filters
, 1997
"... This paper analyses the recently suggested particle approach to filtering time series. We suggest that the algorithm is not robust to outliers for two reasons: the design of the simulators and the use of the discrete support to represent the sequentially updating prior distribution. Both problems ar ..."
Abstract

Cited by 514 (15 self)
This paper analyses the recently suggested particle approach to filtering time series. We suggest that the algorithm is not robust to outliers for two reasons: the design of the simulators and the use of the discrete support to represent the sequentially updating prior distribution. Both problems are tackled in this paper. We believe we have largely solved the first problem and have reduced the order of magnitude of the second. In addition we introduce the idea of stratification into the particle filter, which allows us to perform online Bayesian calculations about the parameters which index the models and maximum likelihood estimation. The new methods are illustrated using a stochastic volatility model and a time series model of angles. Some key words: Filtering, Markov chain Monte Carlo, Particle filter, Simulation, SIR, State space.
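The propagate / weight / resample cycle underlying this line of work can be sketched in a few lines. The sketch below is not the authors' auxiliary particle filter; it is a plain bootstrap (SIR) filter applied to a hypothetical linear-Gaussian state-space model, with all parameter values chosen for illustration.

```python
import numpy as np

def bootstrap_filter(y, n_particles=2000, phi=0.9, sig_w=1.0, sig_v=0.5, rng=None):
    """Bootstrap (SIR) particle filter for x_t = phi*x_{t-1} + w_t, y_t = x_t + v_t.

    A toy model for illustration only -- not the auxiliary particle filter.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.normal(0.0, 1.0, n_particles)                    # initial particle cloud
    means = []
    for obs in y:
        x = phi * x + rng.normal(0.0, sig_w, n_particles)    # propagate through the transition
        logw = -0.5 * ((obs - x) / sig_v) ** 2               # Gaussian measurement log-weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                          # filtered mean E[x_t | y_{1:t}]
        idx = rng.choice(n_particles, n_particles, p=w)      # multinomial resampling
        x = x[idx]
    return np.array(means)

# Simulate ground truth from the same model and run the filter on the observations.
rng = np.random.default_rng(42)
T, phi, sig_w, sig_v = 200, 0.9, 1.0, 0.5
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + rng.normal(0.0, sig_w)
y = x_true + rng.normal(0.0, sig_v, T)

est = bootstrap_filter(y, rng=np.random.default_rng(7))
rmse = float(np.sqrt(np.mean((est - x_true) ** 2)))
```

The discrete support the abstract criticizes is visible here: after resampling, the particle cloud `x` contains repeated values, which is precisely what degrades performance when an outlying observation arrives.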
Bayesian Analysis of Stochastic Volatility Models
, 1994
"... this article is to develop new methods for inference and prediction in a simple class of stochastic volatility models in which logarithm of conditional volatility follows an autoregressive (AR) times series model. Unlike the autoregressive conditional heteroscedasticity (ARCH) and gener alized ARCH ..."
Abstract

Cited by 363 (20 self)
The aim of this article is to develop new methods for inference and prediction in a simple class of stochastic volatility models in which the logarithm of conditional volatility follows an autoregressive (AR) time series model. Unlike the autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH (GARCH) models [see Bollerslev, Chou, and Kroner (1992) for a survey of ARCH modeling], both the mean and log-volatility equations have separate error terms. The ease of evaluating the ARCH likelihood function and the ability of the ARCH specification to accommodate the time-varying volatility found in many economic time series have fostered an explosion in the use of ARCH models. On the other hand, the likelihood function for stochastic volatility models is difficult to evaluate, and hence these models have had limited empirical application.
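The model class described here is easy to simulate even though its likelihood is hard to evaluate. The following is a minimal sketch of the standard discrete-time SV parameterization with illustrative parameter values (mu, phi, sigma_eta are assumptions, not values from the paper):

```python
import numpy as np

# Discrete-time stochastic volatility model:
#   h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t   (log-volatility, AR(1))
#   y_t = exp(h_t / 2) * eps_t                            (returns)
# eta_t and eps_t are independent standard normals -- the "separate error terms"
# that distinguish SV from ARCH/GARCH, where volatility is a deterministic
# function of past data.
rng = np.random.default_rng(0)
T, mu, phi, sigma_eta = 1000, -1.0, 0.95, 0.3   # illustrative values

h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()

y = np.exp(h / 2) * rng.normal(size=T)

# Volatility clustering shows up as autocorrelation in the squared returns.
y2 = y ** 2
acf1 = np.corrcoef(y2[:-1], y2[1:])[0, 1]
```

Because `h` is latent, the likelihood of `y` integrates over all T log-volatility paths, which is exactly the intractability the abstract points to.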
Stochastic Volatility: Likelihood Inference And Comparison With ARCH Models
, 1994
"... this paper we exploit Gibbs sampling to provide a likelihood framework for the analysis of stochastic volatility models, demonstrating how to perform either maximum likelihood or Bayesian estimation. The paper includes an extensive Monte Carlo experiment which compares the efficiency of the maximum ..."
Abstract

Cited by 350 (36 self)
In this paper we exploit Gibbs sampling to provide a likelihood framework for the analysis of stochastic volatility models, demonstrating how to perform either maximum likelihood or Bayesian estimation. The paper includes an extensive Monte Carlo experiment which compares the efficiency of the maximum likelihood estimator with that of quasi-likelihood and Bayesian estimators proposed in the literature. We also compare the fit of the stochastic volatility model to that of ARCH models using the likelihood criterion to illustrate the flexibility of the framework presented. Some key words: ARCH, Bayes estimation, Gibbs sampler, Heteroscedasticity, Maximum likelihood, Quasi-maximum likelihood, Simulation, Stochastic EM algorithm, Stochastic volatility, Stock returns.
Bayesian Forecasting
, 1996
"... rapolation techniques, especially exponential smoothing and exponentially weighted moving average methods ([20, 71]). Developments of smoothing and discounting techniques in stock control and production planning areas led to formalisms in terms of linear, statespace models for time series with time ..."
Abstract

Cited by 58 (2 self)
Extrapolation techniques, especially exponential smoothing and exponentially weighted moving average methods ([20, 71]). Developments of smoothing and discounting techniques in stock control and production planning areas led to formalisms in terms of linear, state-space models for time series with time-varying trends and seasonal patterns, and eventually to the associated Bayesian formalism of methods of inference and prediction. From the early 1960s, practical Bayesian forecasting systems in this context involved the combination of formal time series models and historical data analysis together with methods for subjective intervention and forecast monitoring, so that complete forecasting systems, rather than just routine and automatic data analysis and extrapolation, were in use at that time ([19, 22]). Methods developed in those early days are still in use now in some companies in sales forecasting and stock control areas. There have been major developments in models and methods since then.
Forecasting Time Series Subject to Multiple Structural Breaks
, 2004
"... This paper provides a novel approach to forecasting time series subject to discrete structural breaks. We propose a Bayesian estimation and prediction procedure that allows for the possibility of new breaks over the forecast horizon, taking account of the size and duration of past breaks (if any) by ..."
Abstract

Cited by 50 (8 self)
This paper provides a novel approach to forecasting time series subject to discrete structural breaks. We propose a Bayesian estimation and prediction procedure that allows for the possibility of new breaks over the forecast horizon, taking account of the size and duration of past breaks (if any) by means of a hierarchical hidden Markov chain model. Predictions are formed by integrating over the hyperparameters from the meta-distributions that characterize the stochastic break point process. In an application to US Treasury bill rates, we find that the method leads to better out-of-sample forecasts than alternative methods that ignore breaks, particularly at long horizons.
Statistical Reconstruction And Analysis Of Autoregressive Signals In Impulsive Noise
, 1998
"... Modelling and reconstruction methods are presented for noise reduction of autocorrelated signals in nonGaussian, impulsive noise environments. A Bayesian probabilistic framework is adopted and Markov chain Monte Carlo methods are developed for detection and correction of impulses. Individual noise ..."
Abstract

Cited by 47 (16 self)
Modelling and reconstruction methods are presented for noise reduction of autocorrelated signals in non-Gaussian, impulsive noise environments. A Bayesian probabilistic framework is adopted and Markov chain Monte Carlo methods are developed for detection and correction of impulses. Individual noise sources are modelled as Gaussian with unknown scale (variance), allowing for robustness to 'heavy-tailed' impulse distributions, while the underlying signal is modelled as autoregressive (AR). Results are presented for both artificial and real data from voice and music recordings and comparisons are made with existing techniques. The new techniques are found to give improved detection and elimination of impulses in adverse noise conditions at the expense of some extra computational complexity.
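The setup described (AR signal plus occasional large-variance Gaussian impulses) can be illustrated with a crude residual-thresholding sketch. This is not the paper's MCMC scheme, just a least-squares stand-in on a hypothetical AR(2) signal, with all parameter values invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500

# Underlying AR(2) signal (coefficients chosen for illustration).
a1, a2 = 1.5, -0.7
x = np.zeros(T)
for t in range(2, T):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.normal(0.0, 1.0)

# Impulsive noise: a few samples are hit by large disturbances, a crude
# stand-in for Gaussian noise sources with a large unknown scale.
impulse_idx = rng.choice(np.arange(2, T), size=15, replace=False)
y = x.copy()
y[impulse_idx] += rng.choice([-1.0, 1.0], size=15) * 15.0

# Fit AR(2) to the corrupted data by least squares and flag samples whose
# one-step prediction residual is an outlier under a robust scale estimate.
X = np.column_stack([y[1:-1], y[:-2]])
coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
resid = y[2:] - X @ coef
sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # MAD-based scale
flags = np.zeros(T, dtype=bool)
flags[2:] = np.abs(resid) > 4.0 * sigma

recall = float(np.mean(flags[impulse_idx]))
```

The Bayesian treatment in the paper improves on this kind of thresholding by jointly inferring the impulse locations, their scales, and the reconstructed signal, rather than committing to a single AR fit contaminated by the impulses.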
Joint segmentation of piecewise constant autoregressive processes by using a hierarchical model and a Bayesian sampling approach
 IEEE Transactions on Signal Processing
, 2007
"... We propose a joint segmentation algorithm for piecewise constant AR processes recorded by several independent sensors. The algorithm is based on a hierarchical Bayesian model. Appropriate priors allow to introduce correlations between the change locations of the observed signals. Numerical problems ..."
Abstract

Cited by 28 (16 self)
We propose a joint segmentation algorithm for piecewise constant AR processes recorded by several independent sensors. The algorithm is based on a hierarchical Bayesian model. Appropriate priors allow us to introduce correlations between the change locations of the observed signals. Numerical problems inherent to Bayesian inference are solved by a Gibbs sampling strategy. The proposed joint segmentation methodology provides interesting results compared to a signal-by-signal segmentation.
Efficient Bayesian Inference for Multiple Change-Point and Mixture Innovation Models
 Journal of Business and Economic Statistics (forthcoming)
, 2007
"... Time series subject to parameter shifts of random magnitude and timing are commonly modeled with a changepoint approach using Chib’s (1998) algorithm to draw the break dates. We outline some advantages of an alternative approach in which breaks come through mixture distributions in state innovation ..."
Abstract

Cited by 17 (1 self)
Time series subject to parameter shifts of random magnitude and timing are commonly modeled with a change-point approach using Chib’s (1998) algorithm to draw the break dates. We outline some advantages of an alternative approach in which breaks come through mixture distributions in state innovations, and for which the sampler of Gerlach, Carter and Kohn (2000) allows reliable and efficient inference. We show how the same sampler can be used to (i) model shifts in variance that occur independently of shifts in other parameters and (ii) draw the break dates in O(n) rather than O(n^3) operations in the change-point model of Koop and Potter (2004b), the most general to date. Finally, we introduce to the time series literature the concept of adaptive Metropolis-Hastings sampling for discrete latent variable models. We develop an easily implemented adaptive algorithm that improves on Gerlach et al. (2000) and promises to significantly reduce computing time in a variety of problems including mixture innovation, change-point, regime-switching, and outlier detection. The efficiency gains on two models for U.S. inflation and real interest rates are 257% and 341%.
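The mixture-innovation idea, with breaks entering through a mixture distribution on the state innovations rather than through explicit break dates, can be sketched as a simulation of a hypothetical local-level model (all parameter values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
T, p_break = 300, 0.02          # illustrative sample size and break probability
sig_obs, sig_break = 0.5, 3.0   # illustrative noise scales

# Mixture innovation local-level model: the latent level only moves when the
# indicator K_t = 1, which happens with small probability p_break.  With
# K_t = 0 the state innovation variance is zero; with K_t = 1 it is large.
K = rng.random(T) < p_break
mu = np.cumsum(K * rng.normal(0.0, sig_break, T))   # piecewise-constant level
y = mu + rng.normal(0.0, sig_obs, T)

n_breaks = int(K.sum())
```

Conditional on the indicator sequence `K`, the model is linear-Gaussian, which is what lets samplers in the Gerlach, Carter and Kohn (2000) family draw the indicators efficiently without conditioning on the states.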
Forecasting and Estimating Multiple Changepoint Models with an Unknown Number of Changepoints
, 2006
"... This paper develops a new approach to changepoint modeling that allows the number of changepoints in the observed sample to be unknown. The model we develop assumes regime durations have a Poisson distribution. It approximately nests the two most common approaches: the time varying parameter model ..."
Abstract

Cited by 12 (1 self)
This paper develops a new approach to changepoint modeling that allows the number of changepoints in the observed sample to be unknown. The model we develop assumes regime durations have a Poisson distribution. It approximately nests the two most common approaches: the time-varying parameter model with a changepoint every period and the changepoint model with a small number of regimes. We focus considerable attention on the construction of reasonable hierarchical priors both for regime durations and for the parameters which characterize each regime. A Markov Chain Monte Carlo posterior sampler is constructed to estimate a version of our model which allows for change in conditional means and variances. We show how real-time forecasting can be done in an efficient manner using sequential importance sampling. Our techniques are found to work well in an empirical exercise involving US GDP growth and inflation. Empirical results suggest that the number of changepoints is larger than previously estimated in these series and the implied model is similar to a time-varying parameter (with stochastic volatility) model.
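The Poisson regime-duration assumption is simple to simulate. The sketch below generates a series whose regimes have (shifted) Poisson durations, with hypothetical hierarchical draws for each regime's mean and standard deviation; none of the numbers come from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Regime durations drawn from a shifted Poisson so every regime lasts at
# least one period; each regime gets its own mean and standard deviation
# drawn from (illustrative) hierarchical distributions.
lam, T = 40, 250
durations, series = [], []
total = 0
while total < T:
    d = 1 + rng.poisson(lam)
    m = rng.normal(0.0, 2.0)         # hierarchical draw for the regime mean
    s = abs(rng.normal(1.0, 0.3))    # and for the regime standard deviation
    durations.append(d)
    series.append(rng.normal(m, s, d))
    total += d

y = np.concatenate(series)[:T]
n_regimes = len(durations)
```

Varying `lam` shows the nesting the abstract describes: a small expected duration pushes the process toward a changepoint every period (the time-varying parameter limit), while a large one yields a few long regimes.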
A Parallel CuttingPlane Algorithm for the Vehicle Routing Problem With Time Windows
, 1999
"... In the vehicle routing problem with time windows a number of identical vehicles must be routed to and from a depot to cover a given set of customers, each of whom has a specified time interval indicating when they are available for service. Each customer also has a known demand, and a vehicle may on ..."
Abstract

Cited by 11 (1 self)
In the vehicle routing problem with time windows a number of identical vehicles must be routed to and from a depot to cover a given set of customers, each of whom has a specified time interval indicating when they are available for service. Each customer also has a known demand, and a vehicle may only serve the customers on a route if the total demand does not exceed the capacity of the vehicle. The most effective solution method proposed to date for this problem is due to Kohl, Desrosiers, Madsen, Solomon, and Soumis. Their algorithm uses a cutting-plane approach followed by a branch-and-bound search with column generation, where the columns of the LP relaxation represent routes of individual vehicles. We describe a new implementation of their method, using Karger's randomized minimum-cut algorithm to generate cutting planes. The standard benchmark in this area is a set of 87 problem instances generated in 1984 by M. Solomon; making use of parallel processing in both the cutting-pla...
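The feasibility constraints described above (vehicle capacity plus per-customer time windows, with waiting allowed before a window opens) can be captured in a short route checker. This is only the feasibility test, not the cutting-plane algorithm; the instance data, node names, and `service` parameter are hypothetical:

```python
def route_feasible(route, demand, capacity, travel, windows, service=0.0):
    """Check capacity and time-window feasibility of a single vehicle route.

    route    : node sequence, starting and ending at depot node 0
    demand   : demand per node (depot demand 0)
    travel   : travel[i][j] = travel time between nodes i and j
    windows  : windows[i] = (earliest, latest) allowed service start at node i
    service  : service time spent at each customer before departing
    """
    if sum(demand[c] for c in route) > capacity:
        return False
    t = 0.0
    for i, j in zip(route, route[1:]):
        t += travel[i][j] if i == 0 else service + travel[i][j]
        early, late = windows[j]
        t = max(t, early)          # waiting until the window opens is allowed
        if t > late:               # arriving after the window closes is not
            return False
    return True

# Hypothetical 3-customer instance; node 0 is the depot.
travel = [[0, 2, 4, 3],
          [2, 0, 3, 5],
          [4, 3, 0, 2],
          [3, 5, 2, 0]]
demand = [0, 4, 3, 5]
windows = [(0, 100), (1, 6), (5, 12), (8, 20)]

ok = route_feasible([0, 1, 2, 3, 0], demand, capacity=15, travel=travel, windows=windows)
too_tight = route_feasible([0, 3, 2, 1, 0], demand, capacity=15, travel=travel, windows=windows)
# The reversed route fails: customer 1's window (1, 6) has closed by arrival.
```

In the column-generation scheme the abstract describes, each LP column is one such feasible route, so a check of this kind sits inside the pricing subproblem that generates new columns.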