## Bayesian Curve Fitting Using MCMC With Applications to Signal Segmentation (2002)

Venue: IEEE Transactions on Signal Processing

Citations: 54 (0 self)

### BibTeX

@ARTICLE{Punskaya02bayesiancurve,
  author  = {Elena Punskaya and Christophe Andrieu and Arnaud Doucet and William J. Fitzgerald},
  title   = {Bayesian Curve Fitting Using {MCMC} With Applications to Signal Segmentation},
  journal = {IEEE Transactions on Signal Processing},
  year    = {2002},
  volume  = {50},
  pages   = {747--758}
}

### Abstract

We propose Bayesian methods to address the problem of fitting a signal modeled by a sequence of piecewise constant linear-in-the-parameters regression models, for example, autoregressive or Volterra models. A joint prior distribution is set up over the number of changepoints/knots, their positions, and, if these are unknown, the orders of the linear regression models within each segment. Hierarchical priors are developed and, as the resulting posterior probability distributions and Bayesian estimators do not admit closed-form analytical expressions, reversible jump Markov chain Monte Carlo (MCMC) methods are derived to estimate these quantities. Results are obtained for standard problems of denoising and speech-data segmentation that have already been examined in the literature. These results demonstrate the performance of our methods.
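
As a hedged illustration of the signal class the abstract describes, the following sketch simulates a piecewise constant AR process, i.e., a signal whose AR coefficients switch at changepoints. The function name and interface are hypothetical helpers for illustration, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def piecewise_ar_signal(changepoints, ar_coeffs, sigma, n):
    """Simulate a piecewise constant AR process: within segment k the signal
    follows y_t = sum_i a_i * y_{t-i} + e_t with e_t ~ N(0, sigma^2) and
    coefficient vector ar_coeffs[k]. `changepoints` lists segment start
    indices (hypothetical interface, not the paper's notation)."""
    y = np.zeros(n)
    bounds = list(changepoints) + [n]
    seg = 0
    for t in range(1, n):
        if t >= bounds[seg + 1]:
            seg += 1  # crossed into the next segment
        a = ar_coeffs[seg]
        past = y[max(0, t - len(a)):t][::-1]  # most recent samples first
        y[t] = a[:len(past)] @ past + rng.normal(0.0, sigma)
    return y

# Two segments: AR(1) with coefficient 0.9, then AR(1) with coefficient -0.5.
y = piecewise_ar_signal(changepoints=[0, 250],
                        ar_coeffs=[np.array([0.9]), np.array([-0.5])],
                        sigma=1.0, n=500)
```

Data of this form is the input to the segmentation problem: given only `y`, the number and positions of the changepoints (and possibly the AR orders) are unknown and inferred jointly.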

### Citations

971 | Monte Carlo Statistical Methods
- Robert, Casella
- 1999
Citation Context: …form a local exploration of the space. 3) Update of the Hyperparameters: The algorithm developed requires the simulation of the hyperparameters. This can be done according to standard Gibbs moves [15], so that they are sampled from Inverse-Gamma and Gamma distributions, respectively; see (12) and (13). The probability distribution allowing this update requires the simulation of the nuisance parameters…
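
The Gibbs move for a variance-type hyperparameter mentioned in this excerpt can be sketched as follows. The Inverse-Gamma full conditional shown is the standard conjugate update for Gaussian residuals, used here as an illustrative stand-in for the paper's exact conditionals (12)-(13); the function name and prior parameters `a0`, `b0` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_update_variance(residuals, a0=1.0, b0=1.0):
    """One Gibbs move for a noise-variance hyperparameter with a conjugate
    Inverse-Gamma(a0, b0) prior: given Gaussian residuals r_1..r_n, the full
    conditional is Inverse-Gamma(a0 + n/2, b0 + sum(r^2)/2).
    (Illustrative conditional, not the paper's exact update.)"""
    shape = a0 + 0.5 * residuals.size
    scale = b0 + 0.5 * np.sum(residuals ** 2)
    # Sample an Inverse-Gamma draw as the reciprocal of a Gamma draw.
    return 1.0 / rng.gamma(shape, 1.0 / scale)

sigma2 = gibbs_update_variance(rng.normal(size=200))
```

Because the conditional is available in closed form, this move is always accepted, which is what distinguishes it from the Metropolis-Hastings moves used elsewhere in the sampler.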

883 | Ideal spatial adaptation by wavelet shrinkage
- Donoho, Johnstone
- 1994
Citation Context: …(12). Similarly, we sample as in (13) and according to (29) and (30). IV. SIMULATIONS: In the first set of simulations, we address the standard problem of denoising smooth and unsmooth test functions [1], [17]. To compare our results with [1], we have used a fixed model order. Subsequently, we apply our algorithm with unknown model orders to the segmentation of signals modeled as piecewise constant AR pro…

465 | On Bayesian Analysis of Mixtures with an Unknown Number of Components
- Richardson, Green
- 1997
Citation Context: …face a “double” model selection problem. We also adopt hierarchical prior distributions where the hyperparameters are assumed random with a vague prior distribution; a similar approach was adopted in [10]. This has the effect of increasing the robustness of the Bayesian models in comparison with the standard approach, where these parameters are fixed [1], [2], which was also demonstrated by a simulation study.

435 | Detection of Abrupt Changes: Theory and Applications - Basseville, Nikiforov - 1993

167 | Adaptive filtering and change detection
- Gustafsson
- 2000
Citation Context: …hical Bayesian model. 1) Bayesian Hierarchical Model: In our case, it is natural to introduce a binomial distribution as a prior distribution for the number of changepoints and their positions (as in [2]), where … is an indicator function of the set (1 if …, 0 otherwise). We assign a normal distribution to the parameters of the models with the same hyperparameter for all segments…
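
One common reading of such a binomial prior is that each interior time index is a changepoint independently with some probability, so the number of changepoints is Binomial(n-1, lam). The sketch below samples a configuration under that reading; the function name and parameter `lam` are assumptions, and the paper's exact constraints may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_changepoint_prior(n, lam):
    """Draw a changepoint configuration from a binomial prior: each interior
    index 1..n-1 is independently a changepoint with probability lam, so the
    changepoint count is Binomial(n-1, lam). (Illustrative reading of the
    prior in [2]; not necessarily the paper's exact parameterization.)"""
    mask = rng.random(n - 1) < lam
    return np.flatnonzero(mask) + 1  # positions of the sampled changepoints

positions = sample_changepoint_prior(n=1000, lam=0.01)
```

With `lam` small, most prior draws have few changepoints, which acts as the penalty against overfitting that the hierarchical model relies on.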

128 | Reversible jump MCMC computation and Bayesian model determination
- Green
- 1995

Citation Context: …ach, where these parameters are fixed [1], [2], which was also demonstrated by a simulation study. We propose efficient algorithms in order to sample from the posteriors based on reversible jump MCMC [11]. D. Plan: The rest of the paper is organized as follows. For the sake of clarity, as the “double” selection problem is quite complex, we have chosen to begin in Section II with the case where the orde…

92 | Numerical Bayesian Methods Applied to Signal Processing
- Ruanaidh, Fitzgerald
- 1996

Citation Context: …oint detection and signal segmentation [2]. For example, the general piecewise linear model and its extension to study multiple changepoints in non-Gaussian impulsive noise environments is studied in [3]. In [4] and [5], it is shown that piecewise constant autoregressive (AR) processes excited by white Gaussian noise have proved useful for processing real signals, such as speech data. In general, thi…

77 | Automatic Bayesian curve fitting
- Denison, Mallick, et al.
- 1998

Citation Context: …ber of changepoints and the associated parameters are unknown. Given the observations, our aim is to estimate them. B. Background: This model allows for a wide range of applications, from curve fitting of noisy data [1] to changepoint detection and signal segmentation [2]. For example, the general piecewise linear model and its extension to study multiple changepoints in non-Gaussian impulsive noise environments is…

48 | The Bayesian Choice: A Decision-Theoretic Motivation
- Robert
- 1989

Citation Context: …wn prior that reflects our degree of belief in the different values of these quantities. In order to increase the robustness of the prior, the hyperparameters are assumed random with a vague distribution [12], that is, we adopt a hierarchical Bayesian model. 1) Bayesian Hierarchical Model: In our case, it is natural to introduce a binomial distribution as a prior distribution for the number of changepoint…

45 | A new statistical approach for the automatic segmentation of continuous speech signals
- Andre-Obrecht
- 1988

Citation Context: …ection and signal segmentation [2]. For example, the general piecewise linear model and its extension to study multiple changepoints in non-Gaussian impulsive noise environments is studied in [3]. In [4] and [5], it is shown that piecewise constant autoregressive (AR) processes excited by white Gaussian noise have proved useful for processing real signals, such as speech data. In general, this class…

30 | Adaptive sequential segmentation of piecewise stationary time series - Appel, Brandt - 1983

27 | Bayesian retrospective multiple-changepoint identification
- Stephens
- 1994

Citation Context: …tribution, and any posterior feature of interest is estimated using MCMC. Bayesian approaches for multiple changepoint detection based on MCMC for different models are proposed, for example, in [7] or [8]. The closest work to the one presented here is the technique followed in [1]; see also [9]. Our methodology is, however, different in many respects. Our model is more general, as it allows not only fo…

17 | Design and comparative study of some sequential jump detection algorithms for digital signals - Basseville, Benveniste - 1983

9 | A MAP solution to off-line segmentation of signals
- Djurić
- 1994

Citation Context: …orks as a penalty against overfitting) on all the unknown parameters. Bayesian curve fitting/signal segmentation for related models has been studied by several authors recently, including [1]–[3] and [6]. Gustafsson [2] and Djurić [6] have proposed to perform MAP (maximum a posteriori) changepoint estimation using deterministic algorithms. Although these methods are fast and can give good results, on…

9 | Bayesian curve estimation by polynomials of random order
- Mallick
- 1997
Citation Context: …s for multiple changepoint detection based on MCMC for different models are proposed, for example, in [7] or [8]. The closest work to the one presented here is the technique followed in [1]; see also [9]. Our methodology is, however, different in many respects. Our model is more general, as it allows not only for an unknown number of segments [1] but for an unknown model order within each segment as w…

9 | On the relationship between MCMC model uncertainty methods
- Godsill
- 2000
Citation Context: …probability is given by … Here, the proposal is made directly in the new parameter space rather than via “dimensional” matching random variables [11], and the Jacobian term is equal to 1; see [13] and [14] for a detailed introduction. In fact, a particular choice of the moves will only affect the convergence rate of the algorithm. To ensure a low level of rejection, we want the proposed “jumps” to be s…

3 | A Bayesian analysis for changepoint problems
- Barry, Hartigan
- 1993

Citation Context: …ior distribution, and any posterior feature of interest is estimated using MCMC. Bayesian approaches for multiple changepoint detection based on MCMC for different models are proposed, for example, in [7] or [8]. The closest work to the one presented here is the technique followed in [1]; see also [9]. Our methodology is, however, different in many respects. Our model is more general as it allows not…

3 | Bayesian segmentation of piecewise constant autoregressive processes using MCMC
- Punskaya, Andrieu, et al.
- 1999

Citation Context: …l distribution for … (20), where the means of the distributions are given by (14) and (15) but with matrices corresponding to the value of the hyperparameter [the mean of the distribution] (see [16] for details) (21). The acceptance probabilities for the birth and death moves are as in (19), where from (18), for the birth of the changepoint, we obtain (22), (23), (24)…
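
The birth/death acceptance step described in these excerpts reduces, when the proposal is made directly in the new parameter space and the Jacobian is 1, to a Metropolis-Hastings ratio of posterior and proposal terms. The sketch below shows that accept/reject step in log space; the function and its arguments are a hypothetical interface, not the paper's formulas (18)-(24).

```python
import numpy as np

rng = np.random.default_rng(3)

def accept_birth(log_post_new, log_post_old, log_q_death, log_q_birth):
    """Metropolis-Hastings acceptance for a reversible-jump birth move with
    unit Jacobian: accept with probability min(1, posterior ratio * proposal
    ratio), where the reverse move is a death. All arguments are log values
    (hypothetical interface, for illustration only)."""
    log_alpha = (log_post_new - log_post_old) + (log_q_death - log_q_birth)
    # Accept iff u < alpha for u ~ Uniform(0, 1), computed in log space.
    return np.log(rng.random()) < min(0.0, log_alpha)

# A birth proposal that greatly increases the posterior is always accepted.
accepted = accept_birth(log_post_new=0.0, log_post_old=-100.0,
                        log_q_death=0.0, log_q_birth=0.0)
```

Working in log space avoids the underflow that the raw posterior ratio would cause for long signals, which matters because these ratios involve products over whole segments.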

1 | MCMC computation for Bayesian model selection
- Andrieu, Djurić, et al.
- 2001

Citation Context: …cceptance probability is given by … Here, the proposal is made directly in the new parameter space rather than via “dimensional” matching random variables [11], and the Jacobian term is equal to 1; see [13] and [14] for a detailed introduction. In fact, a particular choice of the moves will only affect the convergence rate of the algorithm. To ensure a low level of rejection, we want the proposed “jumps…