Posterior Predictive Assessment of Model Fitness Via Realized Discrepancies
Statistica Sinica, 1996
Cited by 177 (29 self)
Abstract: This paper considers Bayesian counterparts of the classical tests for goodness of fit and their use in judging the fit of a single Bayesian model to the observed data. We focus on posterior predictive assessment, in a framework that also includes conditioning on auxiliary statistics. The Bayesian formulation facilitates the construction and calculation of a meaningful reference distribution not only for any (classical) statistic, but also for any parameter-dependent “statistic” or discrepancy. The latter allows us to propose the realized discrepancy assessment of model fitness, which directly measures the true discrepancy between data and the posited model, for any aspect of the model which we want to explore. The computation required for the realized discrepancy assessment is a straightforward by-product of the posterior simulation used for the original Bayesian analysis. We illustrate with three applied examples. The first example, which serves mainly to motivate the work, illustrates the difficulty of classical tests in assessing the fitness of a Poisson model to a positron emission tomography image that is constrained to be nonnegative. The second and third examples illustrate the details of the posterior predictive approach in two problems: estimation in a model with inequality constraints on the parameters, and estimation in a mixture model. In all three examples, standard test statistics (either a χ² or a likelihood ratio) are not pivotal: the difficulty is not just how to compute the reference distribution for the test, but that in the classical framework no such distribution exists, independent of the unknown model parameters. Key words and phrases: Bayesian p-value, χ² test, discrepancy, graphical assessment, mixture model, model criticism, posterior predictive p-value, prior predictive
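The posterior predictive p-value described in this abstract is a direct by-product of posterior simulation. A minimal illustrative sketch (not taken from the paper), assuming a conjugate Gamma–Poisson model and a χ²-style realized discrepancy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: counts modeled as i.i.d. Poisson(theta).
y = np.array([3, 0, 7, 2, 9, 1, 4, 0, 8, 2])

# Conjugate Gamma(a0, b0) prior gives a Gamma posterior for theta.
a0, b0 = 1.0, 1.0
a_post, b_post = a0 + y.sum(), b0 + len(y)

def discrepancy(data, theta):
    # Chi-squared-style realized discrepancy D(data, theta).
    return np.sum((data - theta) ** 2 / theta)

S = 5000
exceed = 0
for _ in range(S):
    theta = rng.gamma(a_post, 1.0 / b_post)    # posterior draw
    y_rep = rng.poisson(theta, size=len(y))    # replicated data set
    if discrepancy(y_rep, theta) >= discrepancy(y, theta):
        exceed += 1

p_value = exceed / S   # posterior predictive p-value
print(p_value)
```

Values far from 0.5 (near 0 or 1) suggest the discrepancy for the observed data is atypical under the posited model.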
A Monte Carlo Approach to Nonnormal and Nonlinear State-Space Modeling
Journal of the American Statistical Association, 1992
Bayesian Forecasting, 1996
Abstract

Cited by 59 (2 self)
... extrapolation techniques, especially exponential smoothing and exponentially weighted moving average methods ([20, 71]). Developments of smoothing and discounting techniques in stock control and production planning areas led to formalisms in terms of linear, state-space models for time series with time-varying trends and seasonal patterns, and eventually to the associated Bayesian formalism of methods of inference and prediction. From the early 1960s, practical Bayesian forecasting systems in this context involved the combination of formal time series models and historical data analysis together with methods for subjective intervention and forecast monitoring, so that complete forecasting systems, rather than just routine and automatic data analysis and extrapolation, were in use at that time ([19, 22]). Methods developed in those early days are still in use now in some companies in sales forecasting and stock control areas. There have been major developments in models and methods since t...
Particle learning and smoothing
Statistical Science, 2010
Abstract

Cited by 18 (7 self)
In this paper we develop particle learning (PL) methods for state filtering, sequential parameter learning and smoothing in a general class of nonlinear state space models. The approach extends existing particle methods by incorporating static parameters and utilizing sufficient statistics for the parameters and/or the states as particles. State smoothing with parameter uncertainty is also solved as a by-product of particle learning. In a number of applications, we show that our algorithms outperform existing particle filtering algorithms as well as MCMC.
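For context on the particle methods this abstract extends, here is a minimal bootstrap particle filter on a toy linear-Gaussian state space model. The PL algorithm additionally carries sufficient statistics for static parameters; this sketch is only the plain filtering baseline, with all model values (AR coefficient 0.9, unit variances, N = 1000 particles) chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a toy linear-Gaussian state space model:
#   x_t = 0.9 x_{t-1} + w_t,  w_t ~ N(0, 1)
#   y_t = x_t + v_t,          v_t ~ N(0, 1)
T, N = 50, 1000
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)

# Bootstrap particle filter.
particles = rng.normal(size=N)
means = []
for t in range(T):
    particles = 0.9 * particles + rng.normal(size=N)  # propagate
    logw = -0.5 * (y[t] - particles) ** 2             # Gaussian obs. density
    w = np.exp(logw - logw.max())
    w /= w.sum()
    means.append(np.sum(w * particles))               # filtered mean estimate
    idx = rng.choice(N, size=N, p=w)                  # multinomial resample
    particles = particles[idx]
```

The filtered means track the latent states; sufficient-statistic-based parameter learning would augment each particle with summary statistics updated recursively.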
A State Space Model for Multivariate Longitudinal Count Data, 1998
Abstract

Cited by 9 (0 self)
A state space model for multivariate longitudinal count data driven by a latent gamma Markov process is proposed, the observed counts being conditionally independent and Poisson distributed given the latent process. We consider regression analysis for this model with time-varying covariates entering either via the Poisson model or via the latent gamma process. We develop the Kalman filter and smoother and investigate estimation based on the EM algorithm with the E-step approximated by the Kalman smoother. We also consider analysis of residuals from both the Poisson model and the gamma process. Key words: EM algorithm; Estimation function; Generalized linear model; Kalman filter; Kalman smoother; Latent process; Mixed Poisson distribution; Overdispersion; Random effects; Regression model; Residual analysis; Time-varying covariates.

1 Introduction

An important problem in longitudinal data analysis is the development of models for non-normal response variables. We consider a k-dimensiona...
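The conditional structure described here (counts conditionally independent and Poisson given a shared latent process) is easy to simulate. The sketch below substitutes a log-AR(1) latent intensity for the paper's gamma Markov process, whose exact transition law is not given in the abstract; the dimensions and coefficients are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in latent process: log-AR(1) intensity, NOT the
# paper's gamma Markov process, used only to show the structure.
T, k = 100, 3                     # time points, number of count series
phi, sigma = 0.8, 0.3
z = np.zeros(T)
for t in range(1, T):
    z[t] = phi * z[t - 1] + rng.normal(scale=sigma)

lam = np.exp(z)                   # shared latent intensity
beta = np.array([1.0, 0.5, 2.0])  # illustrative series-specific multipliers

# Counts are conditionally independent Poisson given the latent path.
counts = rng.poisson(np.outer(lam, beta))   # shape (T, k)
```

Common latent shocks induce overdispersion and cross-series correlation in the marginal counts, which is what motivates the mixed Poisson treatment.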
Combined parameter and state estimation, in Gordon (Eds.), Sequential Monte Carlo Methods in Practice, 2001
"... simulation-based filtering ..."
Modeling outliers, bursts and flat stretches in time series using Mixture Transition Distribution (MTD) models, 1990
Abstract

Cited by 1 (1 self)
The class of Mixture Transition Distribution (MTD) time series models is introduced. In these models, the conditional distribution of the current observation given the past is a mixture of conditional distributions given each one of the last p observations. They can capture non-Gaussian and nonlinear features such as outliers, bursts of activity and flat stretches, in a single unified model class. They can also represent time series defined on arbitrary state spaces, which need not even be Euclidean. They perform well in the usual case of Gaussian time series without obvious nonstandard behaviors. The models are simple, analytically tractable, easy to simulate and readily estimated. The stationarity and autocorrelation properties of the models are derived. A simple EM algorithm is given and shown to work well for estimation. The models are applied to several real and simulated data sets with satisfactory results. They appear to capture the features of the data better than the best competing ARIMA models.
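The MTD mechanism above is indeed easy to simulate: at each step a lag g in {1, ..., p} is drawn with mixture weight lambda_g, and the new observation is drawn from a conditional law given x_{t-g}. A minimal Gaussian sketch, with weights and AR coefficient chosen for illustration rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Gaussian MTD(p): mix conditional laws over the last p observations.
p = 3
lam = np.array([0.5, 0.3, 0.2])   # illustrative mixture weights for lags 1..p
phi, sigma = 0.9, 1.0             # illustrative conditional AR coefficient

T = 500
x = list(rng.normal(size=p))      # initial values
for t in range(p, T):
    g = rng.choice(p, p=lam) + 1  # lag selected at this step
    x.append(phi * x[t - g] + rng.normal(scale=sigma))
x = np.asarray(x)
```

Occasionally conditioning on an older observation is what lets the process produce flat stretches and bursts that a single-lag AR model cannot.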
Bayesian Time Series: Analysis Methods Using Simulation-Based Computation, 2000
Abstract

Cited by 1 (0 self)
This dissertation introduces new simulation-based analysis approaches, including both sequential and offline learning algorithms, for various Bayesian time series models. We provide a Markov Chain Monte Carlo (MCMC) method for an autoregressive (AR) model with innovations following exponential power distributions using the fact that an exponential power distribution is a scale mixture of normals. This model has application in signal processing, specifically image processing, with orthogonal wave...
BAYESIAN DYNAMIC MODELLING
Abstract
Bayesian time series and forecasting is a very broad field and any attempt at
Designing Bayesian EWMA Monitors
Abstract
We derive an exponentially weighted moving average (EWMA) as the Bayesian prior mean for a random walk observed with normal error. This derivation shows how the weight on the last observation varies over time. This weight depends on the migration rate of the random walk and the noise variance, which can be estimated from reliability data and studies of gage repeatability and reproducibility, respectively. The variations in the weight on the last observation provide a solution to the “fast initial response” problem that we believe is intuitively more satisfying than the current standard solution.
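The time-varying weight described in this abstract falls out of the standard Kalman recursion for a random walk observed with normal error: with migration variance W and noise variance V, the weight on the latest observation is k_t = R_t / (R_t + V), which starts near 1 under a diffuse prior (the fast initial response) and settles to a steady state determined by W/V. A minimal sketch with illustrative values of W and V:

```python
import numpy as np

W, V = 0.1, 1.0          # migration (random-walk) and observation variances
m, C = 0.0, 1e6          # diffuse prior on the level
rng = np.random.default_rng(4)
y = rng.normal(size=30)  # any stream of observations

weights = []
for obs in y:
    R = C + W                 # one-step-ahead prior variance
    k = R / (R + V)           # weight on the latest observation
    m = m + k * (obs - m)     # EWMA-style update of the level estimate
    C = (1 - k) * R           # posterior variance
    weights.append(k)

# Weight starts near 1, then decays to its steady-state value.
print(weights[0], weights[-1])
```

The steady-state weight solves the scalar Riccati fixed point R = (1 - R/(R+V)) R + W; early observations get much larger weight, which is the fast-initial-response behavior the abstract refers to.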