## Stochastic Sampling Algorithms for State Estimation of Jump Markov Linear Systems (2000)

Venue: IEEE Transactions on Automatic Control

Citations: 25 (2 self)

### BibTeX

```bibtex
@ARTICLE{Doucet00stochasticsampling,
  author  = {Arnaud Doucet and Andrew Logothetis and Vikram Krishnamurthy},
  title   = {Stochastic Sampling Algorithms for State Estimation of Jump Markov Linear Systems},
  journal = {IEEE Transactions on Automatic Control},
  year    = {2000},
  volume  = {45},
  number  = {1},
  pages   = {188--202}
}
```

### Abstract

Jump Markov linear systems are linear systems whose parameters evolve with time according to a finite-state Markov chain. Given a set of observations, our aim is to estimate the states of the finite-state Markov chain and the continuous (in space) states of the linear system. The computational cost of computing conditional mean or maximum a posteriori (MAP) state estimates of the Markov chain or of the state of the jump Markov linear system grows exponentially in the number of observations.
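For concreteness, the model class can be sketched with a short simulation. All parameter values below (the transition matrix, the mode-dependent AR coefficients, and the noise variances) are illustrative assumptions, not the paper's example:

```python
import random

def simulate_jmls(T, P, a, q, r, seed=0):
    """Simulate a scalar jump Markov linear system:
    z_t is a finite-state Markov chain with transition matrix P;
    x_t = a[z_t] * x_{t-1} + v_t,  v_t ~ N(0, q[z_t]);
    y_t = x_t + w_t,               w_t ~ N(0, r).
    Returns the mode, state, and observation sequences."""
    rng = random.Random(seed)
    z, x, y = [], [], []
    z_t, x_t = 0, 0.0
    for _ in range(T):
        # sample the next mode from the row of P for the current mode
        u, acc = rng.random(), 0.0
        for j, p in enumerate(P[z_t]):
            acc += p
            if u < acc:
                z_t = j
                break
        x_t = a[z_t] * x_t + rng.gauss(0.0, q[z_t] ** 0.5)
        y_t = x_t + rng.gauss(0.0, r ** 0.5)
        z.append(z_t); x.append(x_t); y.append(y_t)
    return z, x, y

P = [[0.95, 0.05], [0.10, 0.90]]   # mode transition probabilities (assumed values)
z, x, y = simulate_jmls(200, P, a=[0.9, 0.3], q=[0.1, 1.0], r=0.5)
```

The estimation problem the paper addresses is the inverse of this simulation: recover `z` and `x` given only `y`.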

### Citations

4598 | A tutorial on hidden Markov models and selected applications in speech processing
- Rabiner
- 1989
Citation Context: ...roof: The proof can be found in [11, Th. 4.1]. In our case, and can be easily computed using, respectively, a Kalman smoother [1] and the forward–backward recursions of a hidden Markov model smoother [17]. IV. MAXIMUM A POSTERIORI STATE SEQUENCE ESTIMATION OF AND A. Simulated Annealing Data Augmentation Scheme SA is a numerical optimization technique that allows us to solve combinatorial optimization p...
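The forward–backward recursions mentioned in this context can be sketched for a discrete-observation HMM; the transition, emission, and initial probabilities below are illustrative, not taken from the paper:

```python
def forward_backward(P, B, pi, obs):
    """Forward-backward smoothing for a discrete-observation HMM.
    P: transition matrix, B[state][symbol]: emission probs, pi: initial dist.
    Returns the smoothed state marginals P(z_t | y_{1:T}) for each t."""
    T, K = len(obs), len(pi)
    # forward pass, scaled at each step to avoid numerical underflow
    alpha = [[pi[k] * B[k][obs[0]] for k in range(K)]]
    c = [sum(alpha[0])]
    alpha[0] = [a / c[0] for a in alpha[0]]
    for t in range(1, T):
        a_t = [sum(alpha[t - 1][j] * P[j][k] for j in range(K)) * B[k][obs[t]]
               for k in range(K)]
        c.append(sum(a_t))
        alpha.append([a / c[t] for a in a_t])
    # backward pass, using the same scaling constants
    beta = [[1.0] * K for _ in range(T)]
    for t in range(T - 2, -1, -1):
        beta[t] = [sum(P[k][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                       for j in range(K)) / c[t + 1] for k in range(K)]
    # combine into normalized smoothed marginals
    gamma = []
    for t in range(T):
        g = [alpha[t][k] * beta[t][k] for k in range(K)]
        s = sum(g)
        gamma.append([gi / s for gi in g])
    return gamma

P = [[0.9, 0.1], [0.2, 0.8]]
B = [[0.8, 0.2], [0.3, 0.7]]   # rows: states, columns: observation symbols
gamma = forward_backward(P, B, [0.5, 0.5], [0, 0, 1, 1, 0])
```

Each row of `gamma` is a probability distribution over the hidden states at that time step.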

4060 |
Stochastic Relaxation, Gibbs Distribution and the Bayesian Restoration of Images
- Geman, Geman
- 1984
Citation Context: ...stochastic algorithms used to sample from complex multivariate probability distributions. These methods are well known in image processing because they have been introduced by Geman and Geman in 1984 [6] to simulate from the Gibbs distribution of a Markov random field. Their introduction in the early 1990’s has revolutionized the field of applied statistics. The key idea behind MCMC methodology consi...
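The Gibbs-sampling idea behind this MCMC machinery can be shown on the simplest non-trivial target: a zero-mean bivariate normal with correlation `rho`, where each full conditional is itself Gaussian. This is a generic textbook sketch, not the paper's sampler:

```python
import random

def gibbs_bivariate_normal(rho, n_iter, seed=0):
    """Gibbs sampler for a zero-mean, unit-variance bivariate normal with
    correlation rho. Each full conditional is N(rho * other, 1 - rho**2)."""
    rng = random.Random(seed)
    s = (1.0 - rho * rho) ** 0.5
    x, y, samples = 0.0, 0.0, []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, s)   # draw x | y
        y = rng.gauss(rho * x, s)   # draw y | x
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(0.8, 5000)
# discard an initial burn-in before averaging
mean_x = sum(x for x, _ in samples[500:]) / len(samples[500:])
```

Alternating draws from the full conditionals produces a Markov chain whose stationary distribution is the joint target, which is exactly the mechanism the data augmentation algorithms in the paper exploit.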

849 | Markov chains for exploring posterior distributions (with discussion
- Tierney
- 1994
Citation Context: ...on . Hence, it is uniformly ergodic. From [19, pp. 401–402], satisfies [and if ], where satisfies Other bounds exist in the literature. Applying the duality principle of Robert and Diebolt [4], [18], [23], we now show that (65) Thus, the property of uniform geometric convergence of the Markov chain is “transferred” to the continuous state-space Markov chain . To prove (65), first note that Thus, (66) ...

821 | Stochastic Processes and Filtering Theory - Jazwinski - 1970 |

785 |
Optimal Filtering
- Anderson, Moore
- 1979
Citation Context: ...stem . and converge a.s. to and , respectively, and satisfy (18) (19) Proof: The proof can be found in [11, Th. 4.1]. In our case, and can be easily computed using, respectively, a Kalman smoother [1] and the forward–backward recursions of a hidden Markov model smoother [17]. IV. MAXIMUM A POSTERIORI STATE SEQUENCE ESTIMATION OF AND A. Simulated Annealing Data Augmentation Scheme SA is a numerical ...

724 |
Statistical analysis of finite mixture distributions
- Titterington, Smith, et al.
- 1985
Citation Context: ...ons of the finite Markov chain [25]. Thus, it is necessary to consider in practice suboptimal estimation algorithms. A variety of such suboptimal algorithms have been proposed; see, for example, [7], [24], and [25]. In particular, [25] presents a truncated (approximate) maximum likelihood procedure for parameter estimation and a truncated approximation of the conditional mean state estimates. The esti...

646 |
The Calculation of Posterior Distributions by Data Augmentation
- Tanner, Wong
- 1987
Citation Context: ...es and MAP estimates of the Markov state and the state of the jump Markov linear system in (1) and (2). These algorithms are based on the data augmentation (DA) algorithm (proposed by Tanner and Wong [21], [22]) and two originally proposed hybrid DA/stochastic annealing (SA) algorithms. The algorithms have a computational cost of per it...

412 |
Simulated Annealing: Theory and Applications. D
- Laarhoven, Aarts
- 1989
Citation Context: ...the following four lemmas. Lemma 5.1: For any , admits as its invariant distribution, where (44) Proof: See Appendix II-G. We obtain straightforwardly the following lemmas; see, for example, [14] and [27]. Lemma 5.2: The sequence of invariant distributions [defined in (44)] converges, as goes to infinity, toward the set of global maxima of ; that is, (45) where if and otherwise. denotes the cardinalit...

386 |
On gibbs sampling for state space models
- Carter, Kohn
- 1994
Citation Context: ...hemes: The DA algorithm presented in Fig. 1 requires us to compute samples from and . One possible scheme is the efficient forward filtering–backward sampling recursions introduced by Carter and Kohn [2] and independently by Frühwirth-Schnatter [5]. These recursions are given in the Appendix. An alternative scheme, not investigated here, for sampling from posterior densities of Gaussian state space sy...

327 | Bayesian Forecasting and Dynamic Models - West, Harrison - 1989 |

249 | A Tutorial on Hidden Markov Models and Selected Applications - Rabiner - 1989 |

211 |
Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes. Biometrika 81:27–40
- Liu, Wong, et al.
- 1994
Citation Context: ...timates: Corollary 3.1 can straightforwardly be applied to compute the conditional mean estimates of and by the ergodic averages and 2 : (14) (15) where and are known as the empirical estimators (see [11]). Conditional mean estimates may also be computed via the mixture estimators (see [11]), i.e., (16) (17) The empirical estimator and the mixture estimator will almost surely converge to the true cond...
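The distinction between the empirical and mixture estimators can be sketched on a toy target where both are computable in closed form. Here `x` and `y` are bivariate normal with correlation `rho` (an assumed example, not the paper's model); the empirical estimator averages the raw draws of `x`, while the mixture (Rao-Blackwellized) estimator averages the conditional means `E[x | y] = rho * y`:

```python
import random

rng = random.Random(1)
rho = 0.8
s = (1 - rho * rho) ** 0.5
xs, cond_means = [], []
for _ in range(10000):
    y = rng.gauss(0.0, 1.0)        # draw y from its marginal
    x = rng.gauss(rho * y, s)      # draw x | y
    xs.append(x)                   # used by the empirical estimator
    cond_means.append(rho * y)     # E[x | y], used by the mixture estimator

empirical = sum(xs) / len(xs)
mixture = sum(cond_means) / len(cond_means)
```

Both estimators converge to the true mean (zero here), but the mixture estimator averages conditional expectations rather than raw draws, so its variance is never larger, which is the Rao-Blackwell argument behind preferring it.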

167 |
Estimation of Finite Mixture Distributions through Bayesian Sampling
- Diebolt, Robert
- 1994
Citation Context: ...r via the data augmentation algorithm is summarized in Fig. 1. Remark 3.1: Theoretically speaking, the DA algorithm does not have a stopping criterion. However, a reasonable choice (see, for example, [4]) is to terminate the algorithm when is less than some specified tolerance limit. Sampling Schemes: The DA algorithm presented in Fig. 1 requires us to compute samples from and . One possible scheme i...

161 |
Tools for Statistical Inference – Methods for the Exploration of Posterior Distributions and Likelihood Functions
- Tanner
- 1993
Citation Context: ... MAP estimates of the Markov state and the state of the jump Markov linear system in (1) and (2). These algorithms are based on the data augmentation (DA) algorithm (proposed by Tanner and Wong [21], [22]) and two originally proposed hybrid DA/stochastic annealing (SA) algorithms. The algorithms have a computational cost of per iteratio...

147 |
The simulation smoother for time series models
- Jong, Shephard
- 1995
Citation Context: ...sions are given in the Appendix. An alternative scheme, not investigated here, for sampling from posterior densities of Gaussian state space systems is the simulation smoother of De Jong and Shephard [3]. B. Convergence of Data Augmentation The DA algorithm described in Section III-A has been used in [2] for identification of linear state-space models with errors that are a mixture of normals and coe...

91 | Markov chains theory and applications - Isaacson, Madsen - 1976 |

67 | Convergence rates for Markov chains - Rosenthal - 1995 |

52 |
Convergence and Finite-Time Behavior of Simulated Annealing
- Mitra, Romeo, et al.
- 1986
Citation Context: ...XIMUM A POSTERIORI STATE SEQUENCE ESTIMATION OF AND A. Simulated Annealing Data Augmentation Scheme SA is a numerical optimization technique that allows us to solve combinatorial optimization problems [14]. It is a stochastic algorithm, which will converge to globally optimal solutions, by randomly generating a sequence of possible solutions. In this section, we introduce an algorithm for obtaining opt...
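A generic simulated annealing loop of the kind this context describes can be sketched as follows; the toy cost function (Hamming distance to a target bit string), the geometric cooling schedule, and all tuning constants are illustrative assumptions:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, T0=1.0, alpha=0.99, n_iter=2000, seed=0):
    """Generic simulated annealing: propose a random neighbor, accept it with
    probability min(1, exp(-(cost increase) / T)), and cool T geometrically."""
    rng = random.Random(seed)
    x, c, T = x0, cost(x0), T0
    best, best_c = x, c
    for _ in range(n_iter):
        x_new = neighbor(x, rng)
        c_new = cost(x_new)
        # always accept improvements; accept worsening moves with prob exp(-dc/T)
        if c_new <= c or rng.random() < math.exp(-(c_new - c) / T):
            x, c = x_new, c_new
            if c < best_c:
                best, best_c = x, c
        T *= alpha
    return best, best_c

target = [1, 0, 1, 1, 0, 0, 1, 0]
cost = lambda x: sum(a != b for a, b in zip(x, target))

def flip_one(x, rng):
    """Neighbor move: flip one randomly chosen bit."""
    i = rng.randrange(len(x))
    y = list(x)
    y[i] = 1 - y[i]
    return y

best, best_c = simulated_annealing(cost, flip_one, [0] * 8)
```

At high temperature the chain explores freely; as `T` shrinks it increasingly behaves like greedy descent, which is the mechanism behind the convergence-to-global-optima claims cited here.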

49 |
Interference Rejection techniques in Spread Spectrum Communications
- Milstein
- 1988
Citation Context: ...same radio frequency (RF) bandwidth. It is well known that system performance is greatly enhanced if the receiver employs some means of suppressing narrowband interference before signal “despreading” [28]. Numerous recent papers study the problem of narrowband interference suppression in CDMA systems; see [15], [16], and [29], and the references therein. Our aim here is to examine the use of the itera...

34 |
Adaptive estimation and identification for discrete systems with Markov jump parameters
- Tugnait
- 1982
Citation Context: ...ct computation of these estimates involves a prohibitive computational cost of order , where denotes the number of measurements and corresponds to all possible realizations of the finite Markov chain [25]. Thus, it is necessary to consider in practice suboptimal estimation algorithms. A variety of such suboptimal algorithms have been proposed; see, for example, [7], [24], and [25]. In particular, [25]...

32 | Code-Aided Interference Suppression for DS/CDMA Communications—Part II: Parallel Blind Adaptive Implementations - Poor, Wang - 1997 |

19 |
Fixed-interval smoothing for Markovian switching systems
- Helmick, Blair, et al.
- 1995
Citation Context: ...izations of the finite Markov chain [25]. Thus, it is necessary to consider in practice suboptimal estimation algorithms. A variety of such suboptimal algorithms have been proposed; see, for example, [7], [24], and [25]. In particular, [25] presents a truncated (approximate) maximum likelihood procedure for parameter estimation and a truncated approximation of the conditional mean state estimates. Th...

17 | Narrowband interference suppression in CDMA spread spectrum communications
- Rusch, Poor
Citation Context: ...ome means of suppressing narrowband interference before signal “despreading” [28]. Numerous recent papers study the problem of narrowband interference suppression in CDMA systems; see [15], [16], and [29], and the references therein. Our aim here is to examine the use of the iterative stochastic sampling algorithms proposed in the previous sections for narrowband interference suppression. Note, howeve...

14 |
Estimation maximization algorithms for MAP estimation of jump Markov linear systems
- Logothetis, Krishnamurthy
- 1999
Citation Context: ...rements, the states of the jump Markov linear system, and the states of the finite-state Markov chain, respectively. Jump Markov linear systems appear in several fields in electrical engineering (see [12] and references therein), including control (e.g., hybrid systems, target tracking), signal processing (e.g., blind channel equalization), communications (e.g., interference suppression in mobile tele...

10 |
Maximum-likelihood deconvolution
- Kormylo, Mendel
- 1983
Citation Context: ...to geophysics, nuclear science, or speech processing, the signal of interest can be modeled as an autoregressive process excited by a noise that admits as marginal distribution a mixture of Gaussians [13]. We consider the following model: where is the dynamic (mixture) noise process (46) (47) is often assumed to be a white noise sequence, but it could be also modeled as a first-order Markov sequence t...
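An autoregressive process excited by Gaussian-mixture noise, as described in this context, can be sketched with a two-component (Bernoulli-Gaussian) mixture; the AR coefficient, spike probability, and variances below are assumed for illustration:

```python
import random

def simulate_ar_mixture(T, a, p_spike, var0, var1, seed=0):
    """AR(1) process driven by a two-component Gaussian mixture noise:
    with probability p_spike the innovation has the large variance var1,
    otherwise the small variance var0 (a Bernoulli-Gaussian model)."""
    rng = random.Random(seed)
    x, xs, spikes = 0.0, [], []
    for _ in range(T):
        spike = rng.random() < p_spike
        v = rng.gauss(0.0, (var1 if spike else var0) ** 0.5)
        x = a * x + v
        xs.append(x)
        spikes.append(spike)
    return xs, spikes

xs, spikes = simulate_ar_mixture(1000, a=0.8, p_spike=0.05, var0=0.01, var1=1.0)
```

The indicator sequence `spikes` plays the role of the finite-state chain in the jump Markov linear system formulation, which is how deconvolution problems of this kind map onto the paper's framework.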

10 |
Bayesian estimation of hidden Markov models: a stochastic implementation
- Robert, Celeux, et al.
- 1993
Citation Context: ...n and is ergodic. Ergodicity implies convergence of ergodic (sample) averages [23, Th. 3, p. 1717]. Uniform ergodicity implies that the Law of Large Numbers and a central limit theorem also hold [4], [18], and [23, Th. 5, p. 1717]. Corollary 3.1—Convergence of Ergodic Averages: For every real-valued function , let us consider the time average of the first outputs of the Markov chain .If , then, for a...

7 | Adaptive nonlinear filters for narrowband interference suppression in spread spectrum CDMA - Krishnamurthy, Logothetis - 1999 |

4 |
Code-aided interference suppression for DS/CDMA communications—Part II: Parallel blind adaptive implementations

- Poor, Wang

- 1997
Citation Context: ... employs some means of suppressing narrowband interference before signal “despreading” [28]. Numerous recent papers study the problem of narrowband interference suppression in CDMA systems; see [15], [16], and [29], and the references therein. Our aim here is to examine the use of the iterative stochastic sampling algorithms proposed in the previous sections for narrowband interference suppression. No...

3 |
Data augmentation and dynamic linear models
- Früwirth-Schnatter
- 1994
Citation Context: ... as in numerous applications when is singular. The jump Markov linear system can be transformed to a new system where the noise covariance matrix is positive definite. See [10, Sec. 3.9] or [5] for details. ...chain . We assume and let and be mutually independent for all . Assumption 2.1: The model parameters are assumed ...

1 |
Bayesian Forecasting and Dynamic Models, ser
- West, Harrison
- 1996
Citation Context: ...the model parameters are not known, several algorithms for estimating these parameters are available; see, for example, [25]. Such models are also considered under a dynamic linear model framework in [26]. It is also possible in a Bayesian framework to use algorithms presented in this paper to jointly compute state and parameter estimates. An important issue beyond the scope of this paper is the ident...

1 | A Bayesian Expectation-Maximization Algorithm for Estimating Jump Markov Linear Systems", accepted to - Logothetis, Krishnamurthy - 1998 |