Posterior Predictive Assessment of Model Fitness Via Realized Discrepancies
Statistica Sinica, 1996
"... Abstract: This paper considers Bayesian counterparts of the classical tests for goodness of fit and their use in judging the fit of a single Bayesian model to the observed data. We focus on posterior predictive assessment, in a framework that also includes conditioning on auxiliary statistics. The B ..."
Abstract

Cited by 253 (35 self)
 Add to MetaCart
Abstract: This paper considers Bayesian counterparts of the classical tests for goodness of fit and their use in judging the fit of a single Bayesian model to the observed data. We focus on posterior predictive assessment, in a framework that also includes conditioning on auxiliary statistics. The Bayesian formulation facilitates the construction and calculation of a meaningful reference distribution not only for any (classical) statistic, but also for any parameter-dependent “statistic” or discrepancy. The latter allows us to propose the realized discrepancy assessment of model fitness, which directly measures the true discrepancy between data and the posited model, for any aspect of the model which we want to explore. The computation required for the realized discrepancy assessment is a straightforward by-product of the posterior simulation used for the original Bayesian analysis. We illustrate with three applied examples. The first example, which serves mainly to motivate the work, illustrates the difficulty of classical tests in assessing the fitness of a Poisson model to a positron emission tomography image that is constrained to be nonnegative. The second and third examples illustrate the details of the posterior predictive approach in two problems: estimation in a model with inequality constraints on the parameters, and estimation in a mixture model. In all three examples, standard test statistics (either a χ2 or a likelihood ratio) are not pivotal: the difficulty is not just how to compute the reference distribution for the test, but that in the classical framework no such distribution exists, independent of the unknown model parameters. Key words and phrases: Bayesian p-value, χ2 test, discrepancy, graphical assessment, mixture model, model criticism, posterior predictive p-value, prior predictive
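The realized-discrepancy assessment described in this abstract reduces to a short simulation loop: draw a parameter from the posterior, evaluate a discrepancy at both the observed data and a replicated data set, and report the exceedance frequency as a posterior predictive p-value. A minimal sketch for a conjugate Gamma–Poisson model with a χ²-style discrepancy (the model, prior values, and discrepancy choice are illustrative assumptions, not the paper's PET example):

```python
import math
import random

def rpois(lam, rng):
    """Poisson draw via Knuth's multiplication method (fine for modest lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def posterior_predictive_pvalue(y, a0=1.0, b0=1.0, ndraws=2000, seed=0):
    """Realized-discrepancy posterior predictive p-value for y ~ Poisson(theta),
    theta ~ Gamma(a0, b0), using D(y, theta) = sum (y_i - theta)^2 / theta."""
    rng = random.Random(seed)
    n = len(y)
    a, b = a0 + sum(y), b0 + n              # conjugate Gamma posterior
    exceed = 0
    for _ in range(ndraws):
        theta = rng.gammavariate(a, 1.0 / b)                      # posterior draw
        d_obs = sum((yi - theta) ** 2 / theta for yi in y)        # realized discrepancy
        yrep = [rpois(theta, rng) for _ in range(n)]              # replicated data
        d_rep = sum((yr - theta) ** 2 / theta for yr in yrep)
        exceed += d_rep >= d_obs
    return exceed / ndraws
```

A p-value near 0 or 1 flags tension between the data and the posited model in the direction measured by the chosen discrepancy; the computation is, as the abstract notes, a by-product of the posterior draws already in hand.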
A Monte Carlo Approach to Non-normal and Nonlinear State-Space Modeling
Journal of the American Statistical Association, 1992
"... ..."
Bayesian Forecasting
1996
"... rapolation techniques, especially exponential smoothing and exponentially weighted moving average methods ([20, 71]). Developments of smoothing and discounting techniques in stock control and production planning areas led to formalisms in terms of linear, statespace models for time series with time ..."
Abstract

Cited by 83 (2 self)
 Add to MetaCart
... extrapolation techniques, especially exponential smoothing and exponentially weighted moving average methods ([20, 71]). Developments of smoothing and discounting techniques in stock control and production planning areas led to formalisms in terms of linear, state-space models for time series with time-varying trends and seasonal patterns, and eventually to the associated Bayesian formalism of methods of inference and prediction. From the early 1960s, practical Bayesian forecasting systems in this context involved the combination of formal time series models and historical data analysis together with methods for subjective intervention and forecast monitoring, so that complete forecasting systems, rather than just routine and automatic data analysis and extrapolation, were in use at that time ([19, 22]). Methods developed in those early days are still in use now in some companies in sales forecasting and stock control areas. There have been major developments in models and methods since t ...
Particle learning and smoothing
Statistical Science, 2010
"... In this paper we develop particle learning (PL) methods for state filtering, sequential parameter learning and smoothing in a general class of nonlinear state space models. The approach extends existing particle methods by incorporating static parameters and utilizing sufficient statistics for the p ..."
Abstract

Cited by 25 (9 self)
 Add to MetaCart
(Show Context)
In this paper we develop particle learning (PL) methods for state filtering, sequential parameter learning and smoothing in a general class of nonlinear state space models. The approach extends existing particle methods by incorporating static parameters and utilizing sufficient statistics for the parameters and/or the states as particles. State smoothing with parameter uncertainty is also solved as a by-product of particle learning. In a number of applications, we show that our algorithms outperform existing particle filtering algorithms as well as MCMC.
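The baseline that methods like PL improve upon is the plain bootstrap particle filter: propagate particles through the state equation, weight them by the observation likelihood, and resample. A minimal sketch for a linear-Gaussian AR(1)-plus-noise model (a generic baseline, not the PL algorithm itself; all model settings are illustrative):

```python
import math
import random

def bootstrap_filter(y, n_particles=500, phi=0.9, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0, q),
    y_t = x_t + N(0, r). Returns the filtered means E[x_t | y_{1:t}]."""
    rng = random.Random(seed)
    sd0 = math.sqrt(q / (1.0 - phi * phi))          # stationary initial spread
    parts = [rng.gauss(0.0, sd0) for _ in range(n_particles)]
    means = []
    for yt in y:
        # propagate through the state equation
        parts = [phi * x + rng.gauss(0.0, math.sqrt(q)) for x in parts]
        # weight by the observation likelihood N(yt; x, r)
        w = [math.exp(-(yt - x) ** 2 / (2.0 * r)) for x in parts]
        s = sum(w)
        if s == 0.0:                                 # guard against weight collapse
            w, s = [1.0] * n_particles, float(n_particles)
        means.append(sum(wi * xi for wi, xi in zip(w, parts)) / s)
        # multinomial resampling
        parts = rng.choices(parts, weights=w, k=n_particles)
    return means
```

Because static parameters have no dynamics, naively appending them to the state in this filter degenerates over time; that is the gap the sufficient-statistic approach in the abstract addresses.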
A State Space Model for Multivariate Longitudinal Count Data
1998
"... A state space model for multivariate longitudinal count data driven by a latent gamma Markov process is proposed, the observed counts being conditionally independent and Poisson distributed given the latent process. We consider regression analysis for this model with timevarying covariates entering ..."
Abstract

Cited by 16 (0 self)
 Add to MetaCart
A state space model for multivariate longitudinal count data driven by a latent gamma Markov process is proposed, the observed counts being conditionally independent and Poisson distributed given the latent process. We consider regression analysis for this model with time-varying covariates entering either via the Poisson model or via the latent gamma process. We develop the Kalman filter and smoother and investigate estimation based on the EM algorithm with the E-step approximated by the Kalman smoother. We also consider analysis of residuals from both the Poisson model and the gamma process. Key words: EM algorithm; Estimation function; Generalized linear model; Kalman filter; Kalman smoother; Latent process; Mixed Poisson distribution; Overdispersion; Random effects; Regression model; Residual analysis; Time-varying covariates. 1 Introduction: An important problem in longitudinal data analysis is the development of models for non-normal response variables. We consider a k-dimensiona...
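The Kalman filter/smoother pair mentioned in the abstract has a compact form in the basic case. The paper develops a filter for a latent gamma process; the sketch below is instead the standard univariate linear-Gaussian recursion with a Rauch–Tung–Striebel smoothing pass, with illustrative parameters:

```python
def kalman_filter_smoother(y, phi, q, r, m0=0.0, P0=1.0):
    """Kalman filter + RTS smoother for x_t = phi*x_{t-1} + N(0, q),
    y_t = x_t + N(0, r). Returns filtered/smoothed means and variances."""
    ms, Ps, mp, Pp = [], [], [], []
    m, P = m0, P0
    for yt in y:
        # predict
        m_pred, P_pred = phi * m, phi * phi * P + q
        # update with the Kalman gain
        K = P_pred / (P_pred + r)
        m = m_pred + K * (yt - m_pred)
        P = (1.0 - K) * P_pred
        mp.append(m_pred); Pp.append(P_pred)
        ms.append(m); Ps.append(P)
    # RTS backward smoothing pass
    sm, sP = ms[:], Ps[:]
    for t in range(len(y) - 2, -1, -1):
        J = Ps[t] * phi / Pp[t + 1]
        sm[t] = ms[t] + J * (sm[t + 1] - mp[t + 1])
        sP[t] = Ps[t] + J * J * (sP[t + 1] - Pp[t + 1])
    return ms, Ps, sm, sP
```

In an EM setting like the one described, the smoothed moments from the backward pass are exactly what the (approximate) E-step consumes.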
Combined parameter and state estimation
in Gordon (Eds.), Sequential Monte Carlo Methods in Practice, 2001
"... simulationbased ltering ..."
Bayesian Time Series: Analysis Methods Using Simulation-Based Computation
2000
"... This dissertation introduces new simulationbased analysis approaches, including both sequential and offline learning algorithms, for various Bayesian time series models. We provide a Markov Chain Monte Carlo (MCMC) method for an autoregressive (AR) model with innovations following exponential powe ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
This dissertation introduces new simulation-based analysis approaches, including both sequential and offline learning algorithms, for various Bayesian time series models. We provide a Markov Chain Monte Carlo (MCMC) method for an autoregressive (AR) model with innovations following exponential power distributions using the fact that an exponential power distribution is a scale mixture of normals. This model has application in signal processing, specifically image processing, with orthogonal wave...
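The scale-mixture fact used here can be checked empirically in its simplest instance: a Laplace distribution (an exponential power with shape 1) arises as a normal whose variance is exponentially distributed. A small Monte Carlo sketch (the unit mixing rate and sample size are arbitrary choices, not values from the dissertation):

```python
import math
import random

def laplace_via_scale_mixture(n, seed=0):
    """Draw X = sqrt(V) * Z with V ~ Exp(1) and Z ~ N(0, 1).
    Integrating the normal over the exponential variance gives
    X ~ Laplace(0, 1/sqrt(2)), i.e. density (1/sqrt(2)) * exp(-sqrt(2)|x|)."""
    rng = random.Random(seed)
    return [math.sqrt(rng.expovariate(1.0)) * rng.gauss(0.0, 1.0)
            for _ in range(n)]
```

For a Laplace with scale b, E|X| = b, so the mean absolute value of these draws should settle near 1/√2 ≈ 0.707; in MCMC, this representation lets the heavy-tailed innovation be handled with conditionally Gaussian updates.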
Modeling outliers, bursts and flat stretches in time series using Mixture Transition Distribution (MTD) models
1990
"... The class of Mixture Transition Distribution (MTD) time series models is introduced. In these models, the conditional distribution of the current observation given the past is a mixture of conditional distributions given each one of the last p observations. They can capture nonGaussian and nonline ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
(Show Context)
The class of Mixture Transition Distribution (MTD) time series models is introduced. In these models, the conditional distribution of the current observation given the past is a mixture of conditional distributions given each one of the last p observations. They can capture non-Gaussian and nonlinear features such as outliers, bursts of activity and flat stretches, in a single unified model class. They can also represent time series defined on arbitrary state spaces, which need not even be Euclidean. They perform well in the usual case of Gaussian time series without obvious nonstandard behaviors. The models are simple, analytically tractable, easy to simulate and readily estimated. The stationarity and autocorrelation properties of the models are derived. A simple EM algorithm is given and shown to work well for estimation. The models are applied to several real and simulated data sets with satisfactory results. They appear to capture the features of the data better than the best competing ARIMA models.
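The defining mechanism, a mixture over which of the last p observations drives the current one, is indeed easy to simulate. A sketch for a Gaussian version (the mixture weights, AR coefficients, and noise scale below are illustrative choices, not values from the paper):

```python
import random

def simulate_mtd(T, weights=(0.6, 0.3, 0.1), phis=(0.8, 0.5, 0.3),
                 sigma=1.0, seed=1):
    """Gaussian MTD: p(x_t | past) = sum_g w_g * N(x_t; phi_g * x_{t-g}, sigma^2).
    Each step randomly selects the lag whose conditional distribution is used."""
    rng = random.Random(seed)
    p = len(weights)
    x = [rng.gauss(0.0, 1.0) for _ in range(p)]   # arbitrary start-up values
    for t in range(p, T):
        g = rng.choices(range(1, p + 1), weights=weights)[0]  # active lag
        x.append(phis[g - 1] * x[t - g] + rng.gauss(0.0, sigma))
    return x
```

Occasionally conditioning on an older observation is what lets a single model produce bursts and flat stretches that a fixed-lag AR model cannot.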
Quantile Filtering and Learning
2009
"... Quantile and leastabsolute deviations (LAD) methods are popular robust statistical methods but have not generally been applied to state filtering and sequential parameter learning. This paper introduces robust state space models whose error structure coincides with quantile estimation criterion, wi ..."
Abstract
 Add to MetaCart
Quantile and least-absolute deviations (LAD) methods are popular robust statistical methods but have not generally been applied to state filtering and sequential parameter learning. This paper introduces robust state space models whose error structure coincides with the quantile estimation criterion, with LAD as a special case. We develop an efficient particle-based method for sequential state and parameter inference. Existing approaches focus solely on the problem of state filtering, conditional on parameter values. Our approach allows for sequential hypothesis testing and model monitoring by computing marginal likelihoods and Bayes factors sequentially through time. We illustrate our approach with a number of applications with real and simulated data. In all cases we compare our results with existing algorithms where possible and document the efficiency of our methodology.
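The link between the quantile criterion and LAD is the check (pinball) loss: minimizing it over a constant recovers the sample τ-quantile, and τ = 0.5 reduces it to absolute deviations, i.e. the median. A small sketch (function names are ours, for illustration only):

```python
def check_loss(theta, y, tau):
    """Pinball loss sum_i rho_tau(y_i - theta), with
    rho_tau(u) = u * tau for u >= 0 and u * (tau - 1) for u < 0."""
    return sum(u * (tau if u >= 0 else tau - 1.0)
               for u in (yi - theta for yi in y))

def quantile_by_loss(y, tau):
    """A tau-quantile of y: some minimizer of the check loss
    always lies at a data point, so a search over y suffices."""
    return min(y, key=lambda c: check_loss(c, y, tau))
```

In the state-space reading used by the paper, this same criterion corresponds to an asymmetric error distribution in the observation equation, which is what makes particle-based filtering and marginal-likelihood monitoring applicable.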
Particle Learning and Smoothing
© Institute of Mathematical Statistics, 2010
"... Abstract. Particle learning (PL) provides state filtering, sequential parameter learning and smoothing in a general class of state space models. Our approach extends existing particle methods by incorporating the estimation of static parameters via a fullyadapted filter that utilizes conditional su ..."
Abstract
 Add to MetaCart
Abstract. Particle learning (PL) provides state filtering, sequential parameter learning and smoothing in a general class of state space models. Our approach extends existing particle methods by incorporating the estimation of static parameters via a fully-adapted filter that utilizes conditional sufficient statistics for parameters and/or states as particles. State smoothing in the presence of parameter uncertainty is also solved as a by-product of PL. In a number of examples, we show that PL outperforms existing particle filtering alternatives and proves to be a competitor to MCMC. Key words and phrases: Mixture Kalman filter, parameter learning, particle learning, sequential inference, smoothing, state filtering, state space models.
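The sufficient-statistic idea can be sketched in a simplified, Storvik-style form (not the fully-adapted filter of the paper): each particle carries the state together with the conjugate hyperparameters of an unknown observation variance, and both are resampled and updated at every step. All model settings below are illustrative assumptions:

```python
import math
import random

def particle_learn_obs_variance(y, phi=0.9, q=1.0, n_particles=1000,
                                a0=2.0, b0=2.0, seed=0):
    """Sequential learning of an unknown observation variance r in
    x_t = phi*x_{t-1} + N(0, q), y_t = x_t + N(0, r), r ~ InvGamma(a0, b0).
    Each particle stores (x, a, b); (a, b) are r's conjugate sufficient
    statistics. Returns the posterior mean of r after the last observation."""
    rng = random.Random(seed)
    sd0 = math.sqrt(q / (1.0 - phi * phi))
    particles = [{"x": rng.gauss(0.0, sd0), "a": a0, "b": b0}
                 for _ in range(n_particles)]
    for yt in y:
        prop, logw = [], []
        for p in particles:
            x = phi * p["x"] + rng.gauss(0.0, math.sqrt(q))      # propagate state
            r = p["b"] / rng.gammavariate(p["a"], 1.0)           # r ~ InvGamma(a, b)
            prop.append((x, p["a"], p["b"]))
            logw.append(-(yt - x) ** 2 / (2.0 * r) - 0.5 * math.log(r))
        mlw = max(logw)
        w = [math.exp(l - mlw) for l in logw]                    # stabilized weights
        # resample (state, sufficient statistics) jointly, then update the stats
        idx = rng.choices(range(n_particles), weights=w, k=n_particles)
        particles = [{"x": prop[i][0],
                      "a": prop[i][1] + 0.5,
                      "b": prop[i][2] + 0.5 * (yt - prop[i][0]) ** 2}
                     for i in idx]
    return sum(p["b"] / (p["a"] - 1.0) for p in particles) / n_particles
```

Tracking low-dimensional sufficient statistics instead of raw parameter values is what keeps parameter particles from degenerating the way naive state augmentation does.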