A SIMPLE BAYESIAN APPROACH TO MULTIPLE CHANGEPOINTS
Cited by 2 (0 self)
Abstract: After a brief review of previous frequentist and Bayesian approaches to multiple changepoints, we describe a Bayesian model for multiple parameter changes in a multiparameter exponential family. This model has attractive statistical and computational properties and yields explicit recursive formulas for the Bayes estimates of the piecewise constant parameters. Efficient estimators of the hyperparameters of the Bayesian model for the parameter jumps can be used in conjunction, yielding empirical Bayes estimates. The empirical Bayes approach is also applied to solve longstanding frequentist problems such as significance testing of the null hypothesis of no changepoints versus multiple changepoint alternatives, and inference on the number and locations of changepoints that partition the unknown parameter sequence into segments of equal values. Simulation studies of performance and an illustrative application to the British coal mine data are also given. Extensions from the exponential family to general parametric families and from independent observations to generalized linear time series models are then provided. Key words and phrases: Empirical Bayes, exponential families, generalized linear autoregressive models, multiple changepoints, segmentation.
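The paper's recursions are stated for a general multiparameter exponential family; as a minimal sketch of the conjugate building block they rest on, here is the posterior-mean estimate of a single segment's rate under an assumed Gamma-Poisson pair (the prior parameters `a`, `b` are illustrative, not taken from the paper):

```python
def segment_rate_estimate(y, a=1.0, b=1.0):
    """Posterior mean of a Poisson rate for one segment under a
    conjugate Gamma(a, b) prior: (a + sum(y)) / (b + len(y)).
    A minimal exponential-family instance; the paper's recursive
    formulas combine such segment estimates across segmentations."""
    return (a + sum(y)) / (b + len(y))

# Four counts totalling 14: posterior mean (1 + 14) / (1 + 4) = 3.0
print(segment_rate_estimate([3, 4, 2, 5]))
```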
Approximate Bayesian . . . Weighted Likelihood Bootstrap
, 1991
Cited by 1 (0 self)
We introduce the weighted likelihood bootstrap (WLB) as a simple way of approximately simulating from a posterior distribution. This is easy to implement, requiring only an algorithm for calculating the maximum likelihood estimator, such as the EM algorithm or iteratively reweighted least squares; it does not necessarily require actual calculation of the likelihood itself. The method is exact up to an effective prior which is generally unknown but can be identified exactly for unconstrained discrete-data models and approximately for other models. Accuracy of the WLB relies on the chosen distribution of weights. In the generic scheme, the WLB is at least first-order correct under quite general conditions. We have also been able to prove higher-order correctness in some classes of models. The method, which generalizes Rubin's Bayesian bootstrap, provides approximate posterior distributions for prediction, calibration, dependent data and partial likelihood problems, as well as more standard models. The calculation of approximate Bayes factors for model comparison is also considered. We note that, given a sample simulated from the posterior distribution, the required marginal likelihood may be simulation-consistently estimated by the harmonic mean of the associated likelihood values; a modification of this estimator that avoids instability is also noted. An alternative, prediction-based, estimator of the marginal likelihood using the WLB is also described. These methods provide simple ways of calculating approximate Bayes factors and posterior model probabilities for a very wide class of models.
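The generic WLB scheme can be sketched for a normal-mean model, where the weighted MLE has a closed form (the choice of Exp(1) weights, equivalent to a uniform Dirichlet up to scale, is one standard option; the data and sample sizes below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=200)  # observed data

def wlb_samples(x, n_draws=2000, rng=rng):
    """Weighted likelihood bootstrap for a normal-mean model.

    Each draw: sample random observation weights, then maximize the
    weighted log-likelihood. For the normal mean, the weighted MLE
    is simply the weighted average of the data."""
    n = len(x)
    out = np.empty(n_draws)
    for i in range(n_draws):
        w = rng.exponential(size=n)          # iid Exp(1) weights
        out[i] = np.sum(w * x) / np.sum(w)   # weighted MLE of the mean
    return out

post = wlb_samples(x)
# The spread of the WLB draws approximates the posterior spread,
# close to the sampling standard error sigma / sqrt(n) here.
print(post.mean(), post.std())
```

For models without a closed-form weighted MLE, the inner step would instead call an EM or iteratively reweighted least squares routine with the sampled weights.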
Simulation-based Bayesian analysis for multiple changepoints, available at http://arxiv.org/abs/1011.2932
, 2010
Cited by 1 (1 self)
Abstract: This paper presents a Markov chain Monte Carlo method to generate approximate posterior samples in retrospective multiple changepoint problems where the number of changes is not known in advance. The method uses conjugate models whereby the marginal likelihood for the data between consecutive changepoints is tractable. Inclusion of hyperpriors gives a near-automatic algorithm that provides a robust alternative to popular filtering-recursion approaches in cases that may be sensitive to prior information. Three real examples are used to demonstrate the proposed approach.
USE OF CUMULATIVE SUMS FOR DETECTION OF CHANGEPOINTS IN THE RATE PARAMETER OF A POISSON PROCESS
, 2004
Cited by 1 (0 self)
This paper studies the problem of multiple changepoints in the rate parameter of a Poisson process. We propose a binary segmentation algorithm in conjunction with a cumulative sums statistic for the detection of changepoints, so that at each step we need only test for the presence of a single changepoint. We derive the asymptotic distribution of the proposed statistic, prove its consistency and obtain the limiting distribution of the estimate of the changepoint. A Monte Carlo analysis shows the good performance of the proposed procedure, which is illustrated with a real data example.
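The combination of a CUSUM test with binary segmentation can be sketched as follows. The statistic below is a generic standardized CUSUM for a mean shift in count data, and the fixed threshold stands in for the paper's asymptotic calibration; both are assumptions, not the paper's exact procedure:

```python
import math

def cusum_stat(y):
    """Generic standardized CUSUM for a mean shift in counts.
    Returns the maximum statistic and its arg-max location."""
    n, s = len(y), sum(y)
    if n < 2 or s == 0:
        return 0.0, 0
    best, khat, c = 0.0, 0, 0
    for k in range(1, n):
        c += y[k - 1]
        stat = abs(c - k * s / n) / math.sqrt(s * (k / n) * (1 - k / n))
        if stat > best:
            best, khat = stat, k
    return best, khat

def binary_segmentation(y, threshold=3.0, offset=0, found=None):
    """Recursively split wherever the CUSUM statistic exceeds the
    threshold, so each step tests only one candidate changepoint."""
    if found is None:
        found = []
    stat, k = cusum_stat(y)
    if stat > threshold and 0 < k < len(y):
        found.append(offset + k)
        binary_segmentation(y[:k], threshold, offset, found)
        binary_segmentation(y[k:], threshold, offset + k, found)
    return sorted(found)

y = [1] * 30 + [6] * 30 + [2] * 30   # two rate changes
print(binary_segmentation(y))        # recovers both changepoints
```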
Direct Simulation Methods for Multiple Changepoints Problems
, 2007
Cited by 1 (1 self)
The multiple changepoint model has been used in a wide range of statistical modelling, as it adds flexibility to simple statistical applications. The main purpose of this thesis is to enable Bayesian inference for such models using the idea of particle filters. Compared to existing methodology such as the RJMCMC of Green (1995), the attraction of our particle filter is its simplicity and efficiency. We propose an online algorithm for exact filtering for a class of multiple changepoint problems. This class of models satisfies an important conditional independence property. The algorithm enables simulation from the true joint posterior distribution of the number and positions of the changepoints for a class of changepoint models. The computational cost of this exact algorithm is quadratic in the number of observations. We further show how resampling ideas from particle filters can be used to reduce the computational cost to linear in the number of observations, at the expense of introducing small errors, and propose two new, optimal resampling algorithms for this problem. In practice, large computational ...
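The quadratic-cost structure of exact changepoint recursions can be illustrated with a dynamic program over the last changepoint. For simplicity this sketch computes the MAP segmentation rather than simulating from the posterior, and an assumed Gamma-Poisson segment marginal plus a fixed per-segment log-penalty stand in for the thesis's model and changepoint prior:

```python
import math

def log_marg(y, a=1.0, b=1.0):
    # Gamma(a, b)-Poisson segment marginal (assumed conjugate model)
    n, s = len(y), sum(y)
    return (a * math.log(b) - math.lgamma(a) + math.lgamma(a + s)
            - (a + s) * math.log(b + n)
            - sum(math.lgamma(v + 1) for v in y))

def map_segmentation(y, log_pen=-3.0):
    """Exact MAP changepoints by dynamic programming in O(n^2):
    F[t] = best score of y[:t]; each new segment pays log_pen."""
    n = len(y)
    F = [0.0] + [-math.inf] * n
    back = [0] * (n + 1)
    for t in range(1, n + 1):
        for j in range(t):                     # last changepoint at j
            score = F[j] + log_marg(y[j:t]) + log_pen
            if score > F[t]:
                F[t], back[t] = score, j
    cps, t = [], n                             # backtrack the optimum
    while t > 0:
        t = back[t]
        if t > 0:
            cps.append(t)
    return sorted(cps)

y = [0, 1, 0, 1, 0, 7, 8, 6, 9, 7]   # rate jumps after the 5th count
print(map_segmentation(y))
```

The inner loop over all possible last-changepoint positions is what makes the cost quadratic; the resampling ideas described in the abstract prune this set of positions to reach linear cost.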
Bayesian Inference from Continuously Arriving Informant Reports, With . . .
, 2004
Effective decision-making for crisis response depends upon the rapid integration of limited information from (possibly unreliable) human sources. Here, a Bayesian modeling framework is developed for inference from informant reports. Reports are assumed to arrive via a Poisson-like process, whose rates depend upon the (unknown) state of the world in addition to assorted covariates. A hierarchical modeling structure is used to represent error processes which vary based on informants' group memberships, with the possibility of multiple, overlapping memberships for each informant. Procedures are shown for sampling from the joint posterior distribution of the parameters, and for obtaining posterior predictive quantities.
TABLE OF CONTENTS
, 2010
"... © 2010 Jonathan Hutchins. The dissertation of Jonathan Hutchins ..."
Doctoral Committee:
, 2009
Zhaohui Qin for their invaluable guidance in the completion of this thesis. I am also grateful to Dr. Sinae Kim for her helpful comments on the thesis. Special thanks are due to Dr. Alexey Nesvizhskii, who has shown me the merit of rigorous reasoning and the importance of solid scientific principles in an elegant manner. The work presented in this thesis would not have been possible if it were not for Dr. Jun Li, who introduced me to contemporary molecular biology and has taught me the discipline since I worked in his lab at Stanford University. Drs. Johan Lim and Kyu Hahn have always supported me through critical moments during my graduate studies since we met in Palo Alto. Mike Tyers' lab at the Samuel Lunenfeld Research Institute generously allowed me to use their data in the development of protein-protein interactome analysis. Timothy Green at the Gayle Morris Sweetland Writing Center deserves a mention for his fabulous job of proofreading this thesis. This thesis is dedicated to my inspiring parents, lovely wife Yoon Kyung and her ...
Statistical Methods in Applied Computer Science: Lecture Notes (preliminary) for course 2D5342, Data Mining, Jan–April 2006.
, 2006
We overview fundamental inference principles for hypothesis and decision choice, parameter (state) estimation and tracking methods. We also explore methods for finding dependencies and graphical models, latent variables and robust decision trees in a joint Bayesian framework. We then consider the most important methods for performing Bayesian inference in various settings: analytic integration using conjugate families of distributions and likelihoods, discretization, Monte Carlo, Markov chain Monte Carlo and particle filters. Finally, we overview related probabilistic methods such as robust Bayesian analysis, evidence theory and PAC learning.
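As a small worked instance of the first of those settings, analytic integration with a conjugate family, here is the standard Beta-Binomial update (a textbook example, not taken from the notes themselves; the counts are illustrative):

```python
def beta_binomial_posterior(k, n, a=1.0, b=1.0):
    """Analytic conjugate update: a Beta(a, b) prior on a success
    probability combined with a Binomial likelihood of k successes
    in n trials yields a Beta(a + k, b + n - k) posterior, with no
    numerical integration required."""
    return a + k, b + n - k

a_post, b_post = beta_binomial_posterior(k=7, n=10)
post_mean = a_post / (a_post + b_post)   # Beta mean: a / (a + b)
print(a_post, b_post, post_mean)
```

When no conjugate pair is available, the notes' remaining methods (discretization, Monte Carlo, MCMC, particle filters) take over.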