Results 1–10 of 18
Studies in astronomical time series analysis. V. Bayesian blocks, a new method to analyze structure in photon counting data
 Astrophysical Journal
, 1998
"... Subject headings: numerical methods – data analysis — models – Xray astronomy — γray astronomy Received; accepted Astrophysical Journal2 I describe a new timedomain algorithm for detecting localized structures (bursts), revealing pulse shapes, and generally characterizing intensity variations. Th ..."
Abstract

Cited by 28 (9 self)
Subject headings: numerical methods – data analysis – models – X-ray astronomy – γ-ray astronomy. I describe a new time-domain algorithm for detecting localized structures (bursts), revealing pulse shapes, and generally characterizing intensity variations. The input is raw counting data, in any of three forms: time-tagged photon events (TTE), binned counts, or time-to-spill (TTS) data. The output is the most likely segmentation of the observation into time intervals during which the photon arrival rate is perceptibly constant – i.e. has a fixed intensity without statistically significant variations. Since the analysis is based on Bayesian statistics, I call the resulting structures Bayesian Blocks. Unlike most methods, this one does not stipulate time bins – instead the data themselves…
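The segmentation the abstract describes can be sketched as a dynamic program over binned counts. This is a simplified illustration, not the paper's method: the Poisson block fitness and the fixed per-block `penalty` below stand in for the paper's calibrated Bayesian prior, and only the binned-counts input mode is handled.

```python
import math

def segment_counts(counts, widths, penalty=4.0):
    """Best partition of binned counts into constant-rate blocks.

    counts[i]: events in bin i; widths[i]: bin duration.
    Block fitness is the Poisson maximum log-likelihood N*log(N/T)
    (the constant -N term cancels across partitions and is dropped).
    O(n^2) dynamic program; returns the block start indices.
    """
    n = len(counts)
    # prefix sums give O(1) block statistics
    cs = [0.0] * (n + 1)
    ws = [0.0] * (n + 1)
    for i in range(n):
        cs[i + 1] = cs[i] + counts[i]
        ws[i + 1] = ws[i] + widths[i]
    best = [0.0] * (n + 1)   # best[i]: optimal score of the first i bins
    last = [0] * (n + 1)     # last[i]: start of the final block in that optimum
    for i in range(1, n + 1):
        cand_best, cand_j = -math.inf, 0
        for j in range(i):
            nblk = cs[i] - cs[j]          # counts in block [j, i)
            tblk = ws[i] - ws[j]          # duration of block [j, i)
            fit = nblk * math.log(nblk / tblk) if nblk > 0 else 0.0
            score = best[j] + fit - penalty
            if score > cand_best:
                cand_best, cand_j = score, j
        best[i], last[i] = cand_best, cand_j
    edges, i = [], n                      # backtrack the block starts
    while i > 0:
        edges.append(last[i])
        i = last[i]
    return sorted(edges)
```

On data with a clear rate jump, e.g. `segment_counts([2]*20 + [10]*20, [1.0]*40)`, the program recovers the two blocks starting at bins 0 and 20; splitting a constant-rate stretch further never pays the extra penalty.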
Bayesian Online Changepoint Detection
"... Changepoints are abrupt variations in the generative parameters of a data sequence. Online detection of changepoints is useful in modelling and prediction of time series in application areas such as finance, biometrics, and robotics. While frequentist methods have yielded online filtering and predic ..."
Abstract

Cited by 26 (0 self)
Changepoints are abrupt variations in the generative parameters of a data sequence. Online detection of changepoints is useful in modelling and prediction of time series in application areas such as finance, biometrics, and robotics. While frequentist methods have yielded online filtering and prediction techniques, most Bayesian papers have focused on the retrospective segmentation problem. Here we examine the case where the model parameters before and after the changepoint are independent and we derive an online algorithm for exact inference of the most recent changepoint. We compute the probability distribution of the length of the current “run,” or time since the last changepoint, using a simple message-passing algorithm. Our implementation is highly modular so that the algorithm may be applied to a variety of types of data. We illustrate this modularity by demonstrating the algorithm on three different real-world data sets.
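The run-length recursion the abstract outlines can be written compactly. The sketch below is one concrete instantiation under assumptions not taken from the abstract: Gaussian observations with known variance, a conjugate Normal prior on the mean, and a constant hazard rate; the parameter values are illustrative only.

```python
import math

def bocpd_gaussian(xs, hazard=0.02, mu0=0.0, v0=1.0, sigma2=1.0):
    """Online run-length posterior for Gaussian data, known variance.

    Maintains P(run length r | data), where r is the time since the
    last changepoint, via grow/changepoint message passing.
    Returns the MAP run length after each observation.
    """
    r = [1.0]                      # start: run length 0 with certainty
    mus, vs = [mu0], [v0]          # posterior mean/var per run length
    map_runs = []
    for x in xs:
        # predictive density of x under each current run-length hypothesis
        pred = []
        for mu, v in zip(mus, vs):
            pv = v + sigma2
            pred.append(math.exp(-(x - mu) ** 2 / (2 * pv))
                        / math.sqrt(2 * math.pi * pv))
        # grow each run (no change) or collapse all mass to run length 0
        growth = [r[i] * pred[i] * (1 - hazard) for i in range(len(r))]
        cp = sum(r[i] * pred[i] * hazard for i in range(len(r)))
        r = [cp] + growth
        z = sum(r)
        r = [p / z for p in r]
        # conjugate update: run length 0 restarts from the prior
        new_mus, new_vs = [mu0], [v0]
        for mu, v in zip(mus, vs):
            prec = 1 / v + 1 / sigma2
            new_vs.append(1 / prec)
            new_mus.append((mu / v + x / sigma2) / prec)
        mus, vs = new_mus, new_vs
        map_runs.append(max(range(len(r)), key=r.__getitem__))
    return map_runs
```

On a sequence whose mean jumps sharply, the MAP run length grows steadily and then resets to a small value a step or two after the jump, which is the behaviour the abstract's "most recent changepoint" inference refers to.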
Nonparametric changepoint estimation
 Annals of Statistics
, 1988
"... Consider a sequence of independent random variables {X.: I < i < n} having 1.cdf F for i < en and cdf G otherwise. A strongly consistent esti!.Jlator of the changepoint e E (0.1) is proposed. The estimator requires no knowledge of the ~ ~ functional forms or parametric families of F and G. Furthe ..."
Abstract

Cited by 12 (2 self)
Consider a sequence of independent random variables {X_i : 1 ≤ i ≤ n} having cdf F for i ≤ θn and cdf G otherwise. A strongly consistent estimator of the change-point θ ∈ (0, 1) is proposed. The estimator requires no knowledge of the functional forms or parametric families of F and G. Furthermore, F and G need not differ in their means (or any other measure of location); the only requirement is that F and G differ on a set of positive probability. The proof of consistency provides rates of convergence and bounds on the error probability for the estimator. The estimator is applied to two well-known data sets; in both cases it yields results in close agreement with previous parametric analyses. A simulation study…
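An estimator in this spirit compares the empirical cdfs of the two candidate segments at every split. The sketch below uses a weighted Kolmogorov-Smirnov-type sup-distance; the k(n−k)/n² weight is one standard choice and is not necessarily the paper's exact construction.

```python
def changepoint_estimate(xs):
    """Nonparametric change-point estimate from empirical cdfs.

    For each split k, compares the empirical cdf of xs[:k] with that
    of xs[k:] via the sup-distance over observed values, weighted by
    k*(n-k)/n**2 to discount splits near the ends. Returns the k
    maximizing the weighted statistic. O(n^2 log n) as written.
    """
    n = len(xs)
    grid = sorted(set(xs))
    best_k, best_stat = 1, -1.0
    for k in range(1, n):
        left, right = xs[:k], xs[k:]
        # sup_t |F_hat_left(t) - F_hat_right(t)|
        d = max(abs(sum(v <= t for v in left) / k
                    - sum(v <= t for v in right) / (n - k))
                for t in grid)
        stat = (k * (n - k) / n ** 2) * d
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k
```

Because only the empirical cdfs enter the statistic, the estimator needs no knowledge of F or G and works even when the two distributions share the same mean.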
Change Point and Change Curve Modeling in Stochastic Processes and Spatial Statistics
 Journal of Applied Statistical Science
, 1993
"... In simple onedimensional stochastic processes it is feasible to model change points explicitly and to make inference about them. I have found that the Bayesian approach produces results more easily than nonBayesian approaches. It has the advantages of relative technical simplicity, theoretical opt ..."
Abstract

Cited by 9 (4 self)
In simple one-dimensional stochastic processes it is feasible to model change points explicitly and to make inference about them. I have found that the Bayesian approach produces results more easily than non-Bayesian approaches. It has the advantages of relative technical simplicity, theoretical optimality, and of allowing a formal comparison between abrupt and gradual descriptions of change. When it can be assumed that there is at most one change point, this is especially simple. This is illustrated in the context of Poisson point processes. A simple approximation is introduced that is applicable to a wide range of problems in which the change point model can be written as a regression or generalized linear model. When the number of change points is unknown, the Bayesian approach proceeds most naturally by state-space modeling or "hidden Markov chains". The general ideas of this are briefly reviewed, particularly the multi-process Kalman filter. I then describe the application of these…
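The at-most-one-change-point Poisson case the abstract calls "especially simple" admits a closed form once the rates are integrated out. The sketch below assumes binned counts, independent Gamma(a, b) priors on the two rates, and a uniform prior on the change-point location; these specific prior choices are illustrative, not taken from the paper.

```python
import math

def changepoint_posterior(counts, a=1.0, b=1.0):
    """Posterior over a single change-point location in Poisson counts.

    Model: counts[t] ~ Poisson(lam1) for t < tau and Poisson(lam2) for
    t >= tau, with Gamma(a, b) priors on both rates (integrated out in
    closed form) and a uniform prior on tau.
    Returns p where p[k] is the posterior probability of tau = k + 1,
    i.e. a change between bins k and k + 1.
    """
    n = len(counts)
    total = sum(counts)
    logpost = []
    s1 = 0
    for tau in range(1, n):
        s1 += counts[tau - 1]          # events before the change
        s2 = total - s1                # events after the change
        # log marginal likelihood: Gamma-Poisson conjugacy per segment
        lp = (math.lgamma(a + s1) - (a + s1) * math.log(b + tau)
              + math.lgamma(a + s2) - (a + s2) * math.log(b + n - tau))
        logpost.append(lp)
    m = max(logpost)                   # normalize in log space
    w = [math.exp(lp - m) for lp in logpost]
    z = sum(w)
    return [x / z for x in w]
```

For counts with a clear rate jump the posterior concentrates sharply on the true split, which is what makes the single-change-point case so much more tractable than the unknown-number case handled by state-space methods.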
Prediction and Decision Making Using Bayesian Hierarchical Models
 Statistics in Medicine
, 1995
"... This paper uses Bayesian hierarchical models to analyze multicenter clinical trial data where the outcome variable of interest is continuous, but not normally distributed, and where censoring has occurred. The goal of such an analysis is the same as for any subgroup analysis, to provide survival es ..."
Abstract

Cited by 5 (0 self)
This paper uses Bayesian hierarchical models to analyze multicenter clinical trial data where the outcome variable of interest is continuous, but not normally distributed, and where censoring has occurred. The goal of such an analysis is the same as for any subgroup analysis: to provide survival estimates for specific subgroups as well as for the population, and to provide estimates of the degree of heterogeneity between subgroups. An analysis of the Collaborative Study of Long-Term Maintenance Drug Therapy in Recurrent Affective Illness, a multicenter clinical trial funded by the National Institute of Mental Health's Pharmacologic Research Branch, serves to illustrate the proposed methodology. A feature of this data set is that one treatment group was withdrawn from medication at the time of randomization. The paper contains a comparison of models, one of which accounts for the drug washout period through the use of a change-point model, as well as a comparison of results across several choi…
Two Statistical Methods for the Detection of Environmental Thresholds
, 2001
"... A nonparametric method and a Bayesian hierarchical modelling method are proposed in this paper for the detection of environmental thresholds. The nonparametric method is based on the reduction of deviance, while the Bayesian method is based on the change in the response variable distribution para ..."
Abstract

Cited by 3 (0 self)
A nonparametric method and a Bayesian hierarchical modelling method are proposed in this paper for the detection of environmental thresholds. The nonparametric method is based on the reduction of deviance, while the Bayesian method is based on the change in the response variable distribution parameters.
Segmentation of Signals Using Piecewise Constant Linear Regression Models
, 1994
"... The signal segmentation approach described herein assumes that the signal can be accurately modelled by a linear regression with piecewise constant parameters. A simultaneous estimate of the change times is considered. The maximum likelihood and maximum a posteriori probability estimates are derive ..."
Abstract

Cited by 2 (0 self)
The signal segmentation approach described herein assumes that the signal can be accurately modelled by a linear regression with piecewise constant parameters. A simultaneous estimate of the change times is considered. The maximum likelihood and maximum a posteriori probability estimates are derived after marginalization of the linear regression parameters and the measurement noise variance, which are treated as nuisance parameters. A well-known problem is that the complexity of segmentation increases exponentially in the number of data. Therefore, two inequalities are derived enabling the exact estimate to be computed with quadratic complexity. A recursive approximation with linear time complexity is proposed as well, based on these inequalities. The method is evaluated on a speech signal previously analyzed in the literature, showing that a comparable result is obtained directly without the usual tuning effort. It is also detailed how it has successfully been applied in a car for onlin…
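The flavor of this kind of quadratic-complexity segmentation can be shown with a deliberately simplified stand-in: instead of the paper's marginalized Bayesian criterion, each segment below is scored by its residual sum of squares around a constant mean plus a fixed per-segment penalty. The criterion is swapped, but the O(n²) dynamic-programming structure is the same.

```python
def segment_piecewise_constant(y, penalty):
    """Penalized least-squares segmentation into constant-mean pieces.

    Minimizes sum of per-segment residual sums of squares plus
    `penalty` per segment, via an O(n^2) dynamic program.
    Returns the segment start indices.
    """
    n = len(y)
    # prefix sums of y and y^2 give O(1) segment RSS
    s = [0.0] * (n + 1)
    q = [0.0] * (n + 1)
    for i, v in enumerate(y):
        s[i + 1] = s[i] + v
        q[i + 1] = q[i] + v * v

    def rss(j, i):
        # residual sum of squares of y[j:i] around its own mean
        m = (s[i] - s[j]) / (i - j)
        return (q[i] - q[j]) - (i - j) * m * m

    cost = [0.0] + [float("inf")] * n   # cost[i]: best cost of y[:i]
    last = [0] * (n + 1)
    for i in range(1, n + 1):
        for j in range(i):
            c = cost[j] + rss(j, i) + penalty
            if c < cost[i]:
                cost[i], last[i] = c, j
    starts, i = [], n                   # backtrack segment starts
    while i > 0:
        starts.append(last[i])
        i = last[i]
    return sorted(starts)
```

The marginalized estimate in the paper replaces `rss` with a segment score obtained by integrating out the regression parameters and noise variance; the inequalities it derives then prune the inner loop further.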
Assessing Placebo Response Using Bayesian Hierarchical Survival Models
, 1995
"... The National Institute of Mental Health (NIMH) Collaborative Study of LongTerm Maintenance Drug Therapy in Recurrent Affective Illness was a multicenter randomized controlled clinical trial designed to determine the efficacy of a pharmacotherapy for the prevention of the recurrence of unipolar affe ..."
Abstract

Cited by 1 (0 self)
The National Institute of Mental Health (NIMH) Collaborative Study of Long-Term Maintenance Drug Therapy in Recurrent Affective Illness was a multicenter randomized controlled clinical trial designed to determine the efficacy of a pharmacotherapy for the prevention of the recurrence of unipolar affective disorders. The outcome of interest in this study was the time until the recurrence of a depressive episode. The data show much heterogeneity between centers for the placebo group. The aim of this paper is to use Bayesian hierarchical survival models to investigate the heterogeneity of placebo effects among centers in the NIMH study. This heterogeneity is explored in terms of the marginal posterior distributions of parameters of interest and predictive distributions of future observations. The Gibbs sampling algorithm is used to approximate posterior and predictive distributions. Sensitivity of results to the assumption of a constant hazard survival distribution at the first stage of th…
The Juvenile Homicide Epidemic: Spatio-Temporal Dynamics of Homicide in American Cities
, 1999
"... Introduction 1.1. Homicide in the 1980's. Today, criminologists are working to explain a welcome trend in nationwide violence: 1997 marked the third year of decrease in the homicide arrest rate among juveniles. The 1997 juvenile homicide rate represented a drop of 16 percent from 1996 (U.S. Federal ..."
Abstract
Introduction 1.1. Homicide in the 1980's. Today, criminologists are working to explain a welcome trend in nationwide violence: 1997 marked the third year of decrease in the homicide arrest rate among juveniles. The 1997 juvenile homicide rate represented a drop of 16 percent from 1996 (U.S. Federal Bureau of Investigation, 1997). But the recent attention to explaining this decline only heightens the importance of answering the still unresolved question of why the juvenile homicide rate increased during the late 1980's and early 1990's, rising by 169% between 1984 and 1993. As has been well documented in Blumstein (1995) and elsewhere, this late-1980's upsurge in homicide was driven largely by juveniles using firearms; the homicide rate for older offenders remained largely flat, and there was no corresponding growth in juvenile non-gun violence. One striking aspect of these recent trends in juvenile violence is the abruptness of the growth following a changepoint…
Linear Model Selection in the Presence of Outliers and Break Points
, 1999
"... The presence of outliers and break points are important questions in any applications of econometric time series analysis. This paper shows how Bayes tests for different type and complexity can be constructed using the concept of marginal likelihood. The results can be viewed as an extension of Pola ..."
Abstract
The presence of outliers and break points is an important question in any application of econometric time series analysis. This paper shows how Bayes tests of different type and complexity can be constructed using the concept of marginal likelihood. The results can be viewed as an extension of Polasek and Ren (1997). If the location of the break point is not known, the lag length is uncertain, and possibly heteroskedastic errors are present, one can calculate the Bayes factors in a normal-gamma regression model quite easily. The search for outliers and break points can be extended to the multivariate normal-Wishart regression model. The approach is demonstrated with the Swiss macroeconomic time series GNP and consumption. Keywords: Bayes tests, marginal likelihood, outliers, break points, heteroskedasticity, variable selection, autoregressive processes. 1 Introduction The selection of models for practical applications remains, despite many discussions and many new developments, an unsolved problem. Aut…