Results 1–10 of 36
Pairwise Markov chains
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2003
Cited by 51 (25 self)
Abstract. The restoration of a hidden process X from an observed process Y is often performed in the framework of hidden Markov chains (HMC). HMC have recently been generalized to triplet Markov chains (TMC). In the TMC model one introduces a third random chain U and assumes that the triplet T = (X, U, Y) is a Markov chain (MC). TMC generalize HMC but still enable the development of efficient Bayesian algorithms for restoring X from Y. This paper lists some recent results concerning TMC; in particular, we recall how TMC can be used to model hidden semi-Markov chains or to deal with nonstationary HMC.
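As a concrete illustration of the restoration step (the model below is a toy example, not taken from the paper): when X and U take finitely many values, the pair V = (X, U) is itself a Markov chain, so X can be restored from Y by running forward-backward on V and marginalizing out U.

```python
import numpy as np

# Toy TMC sketch: X and U binary, so the pair V = (X, U) has 4 states
# (index v = x * nU + u); Y is a noisy observation whose law depends on X only.
# All transition/emission numbers are illustrative.
nX, nU = 2, 2
nV = nX * nU
A = np.array([[0.6, 0.2, 0.1, 0.1],
              [0.1, 0.6, 0.1, 0.2],
              [0.2, 0.1, 0.6, 0.1],
              [0.1, 0.1, 0.2, 0.6]])   # p(v_{n+1} | v_n)
pi = np.full(nV, 0.25)                  # initial law of V
B = np.array([[0.8, 0.2],
              [0.3, 0.7]])              # p(y | x)

def mpm_restore_x(y):
    """Posterior marginals p(x_n | y_1..N) via normalized forward-backward on V."""
    N = len(y)
    x_of_v = np.arange(nV) // nU        # recover x from the product index v
    lik = B[x_of_v][:, y].T             # lik[n, v] = p(y_n | x(v))
    alpha = np.zeros((N, nV))
    beta = np.ones((N, nV))
    alpha[0] = pi * lik[0]
    alpha[0] /= alpha[0].sum()
    for n in range(1, N):               # forward pass
        alpha[n] = (alpha[n - 1] @ A) * lik[n]
        alpha[n] /= alpha[n].sum()
    for n in range(N - 2, -1, -1):      # backward pass
        beta[n] = A @ (lik[n + 1] * beta[n + 1])
        beta[n] /= beta[n].sum()
    post_v = alpha * beta
    post_v /= post_v.sum(axis=1, keepdims=True)
    # marginalize out the auxiliary chain U: p(x_n | y) = sum_u p((x, u) | y)
    return post_v.reshape(N, nX, nU).sum(axis=2)

y = [0, 0, 1, 1, 0]
post = mpm_restore_x(y)
x_hat = post.argmax(axis=1)             # MPM estimate of X
```

This is exactly the HMC machinery applied to the product chain V, which is why TMC keep the computational cost of classical HMC restoration.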
Multisensor triplet Markov chains and theory of evidence
International Journal of Approximate Reasoning, 2006
Cited by 24 (10 self)
Hidden Markov chains (HMC) are widely applied to problems occurring in areas as diverse as biosciences, climatology, communications, ecology, econometrics and finance, and image or signal processing. In such models, the hidden process of interest X is a Markov chain, which must be estimated from an observable Y, interpretable as a noisy version of X. The success of HMC is mainly due to the fact that the conditional distribution of the hidden process given the observed process remains Markov, which makes possible different processing strategies such as Bayesian restoration. HMC have recently been generalized to "Pairwise" Markov chains (PMC) and "Triplet" Markov chains (TMC), which offer similar processing advantages and superior modeling capabilities. In PMC, one directly assumes the Markovianity of the pair (X, Y), while in TMC the distribution of the pair (X, Y) is the marginal distribution of a Markov process (X, U, Y), where U is an auxiliary, possibly contrived, process. Furthermore, Dempster–Shafer fusion can offer interesting extensions of the calculation of the "a posteriori" distribution of the hidden data. The aim of this paper is to present different possibilities for using Dempster–Shafer fusion in the context of different multisensor Markov models. We show that the posterior distribution remains computable in different general situations and present some examples of their application in the remote sensing area.
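The Dempster–Shafer fusion step mentioned above can be sketched as follows (the sensors, frame {a, b}, and mass values are invented for illustration): Dempster's rule multiplies the masses of focal sets with non-empty intersection and renormalizes by the total non-conflicting mass.

```python
from itertools import product

def dempster_fuse(m1, m2):
    """Combine two mass functions, given as {frozenset: mass} dicts,
    by Dempster's rule of combination."""
    fused, conflict = {}, 0.0
    for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:                               # compatible evidence
            fused[inter] = fused.get(inter, 0.0) + w1 * w2
        else:                                   # conflicting evidence
            conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: fusion undefined")
    # renormalize by the non-conflicting mass
    return {s: w / (1.0 - conflict) for s, w in fused.items()}

A, B = frozenset({"a"}), frozenset({"b"})
AB = A | B                                      # ignorance: mass on the whole frame
m1 = {A: 0.6, AB: 0.4}                          # hypothetical sensor 1
m2 = {A: 0.5, B: 0.3, AB: 0.2}                  # hypothetical sensor 2
m = dempster_fuse(m1, m2)
```

Allowing mass on the whole frame (the `AB` entry) is what lets a partially informative or unreliable sensor be fused without forcing a probabilistic commitment.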
Kalman filtering for triplet Markov chains: Applications and extensions
In Proceedings of the International Conference on Acoustics, Speech and Signal Processing (ICASSP 05), 2005
Cited by 10 (4 self)
Abstract—Let x = {xn}n∈IN be a hidden process, y = {yn}n∈IN an observed process, and r = {rn}n∈IN some additional process. We assume that t = (x, r, y) is a (so-called "Triplet") vector Markov chain (TMC). We first show that the linear TMC model encompasses and generalizes, among other models, the classical state-space systems with colored process and/or measurement noise(s). We next propose restoration Kalman-like filters for arbitrary linear Gaussian (LG) TMC. Index Terms—Bayesian signal restoration, hidden Markov chains, Kalman filtering, Markovian models, triplet Markov chains.
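A minimal sketch of the filtering idea (all matrices and dynamics below are invented, not from the paper): a state-space model with colored measurement noise r can be stacked into the augmented state z = (x, r), on which an ordinary Kalman filter then restores x.

```python
import numpy as np

# Hypothetical linear Gaussian model: x and the colored noise r are each AR(1),
# and the observation is y_n = x_n + r_n + v_n.
phi, rho = 0.9, 0.7
F = np.array([[phi, 0.0],
              [0.0, rho]])              # z_{n+1} = F z_n + w_n, z = (x, r)
Q = np.diag([1.0, 0.5])                 # cov of the process noise w_n
H = np.array([[1.0, 1.0]])              # y_n = x_n + r_n + v_n
R = np.array([[0.1]])                   # cov of the measurement noise v_n

def kalman_filter_x(y):
    """Filtered estimates of the hidden component x from observations y."""
    z, P = np.zeros(2), np.eye(2) * 10.0    # diffuse-ish prior
    x_hat = []
    for y_n in y:
        # predict
        z = F @ z
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        z = z + K @ (np.atleast_1d(y_n) - H @ z)
        P = (np.eye(2) - K @ H) @ P
        x_hat.append(z[0])                  # first component of z is x
    return np.array(x_hat)

x_hat = kalman_filter_x([1.0, 0.5, -0.2])
```

The general linear Gaussian TMC replaces the pair z = (x, r) by the full triplet state, but the predict/update recursions keep this same shape.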
Unsupervised Statistical Segmentation of Nonstationary Images Using Triplet Markov Fields
Cited by 8 (2 self)
Abstract—Recent developments in statistical theory and associated computational techniques have opened new avenues for image modeling as well as for image segmentation. Thus, a host of models have been proposed, and the ones which have probably received the most attention are the hidden Markov field (HMF) models. This is due to their ease of handling and their potential for providing improved image quality. Although these models provide satisfying results in the stationary case, they can fail in the nonstationary one. In this paper, we tackle the problem of modeling a nonstationary hidden random field and its effect on unsupervised statistical image segmentation. We propose an original approach, based on the recent triplet Markov field (TMF) model, which enables one to deal with nonstationary class fields. Moreover, the noise can be correlated and possibly non-Gaussian. An original parameter estimation method, which uses the Pearson system to find the natures of the noise margins, which can vary with the class, is also proposed and used to perform unsupervised segmentation of such images. Experiments indicate that the new model and related processing algorithm can improve on the results obtained with the classical ones. Index Terms—Triplet Markov fields, statistical image segmentation, parameter estimation, Pearson system, iterative conditional estimation, nonstationary images, texture classification.
Unsupervised Segmentation of Hidden Semi-Markov Nonstationary Chains
Cited by 7 (1 self)
Abstract. In the classical hidden Markov chain (HMC) model we have a hidden chain X, which is Markov, and an observed chain Y. HMC are widely used; however, in some situations they have to be replaced by the more general "hidden semi-Markov chains" (HSMC), which are particular "triplet Markov chains" (TMC) T = (X, U, Y), where the auxiliary chain U models the semi-Markovianity of X. In addition, nonstationary classical HMC can also be modeled by a stationary triplet Markov chain with, as a consequence, the possibility of parameter estimation. The aim of this paper is to use both properties simultaneously.
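The role of the auxiliary chain U can be illustrated with a toy simulation (transition and duration laws are invented): representing U as the remaining sojourn time makes the pair (X, U) a Markov chain, while X alone follows non-geometric sojourn distributions, i.e. is semi-Markov.

```python
import random

# Toy semi-Markov chain on classes {0, 1}: on expiry of the sojourn, X jumps
# according to P, and the counter U is redrawn from a class-dependent
# (non-geometric) duration law.
random.seed(0)
P = {0: {1: 1.0}, 1: {0: 1.0}}                  # jump transitions (no self-loop)

def draw_duration(x):
    """Sojourn length for class x; uniform over a small set, hence non-geometric."""
    return random.choice([2, 3, 4]) if x == 0 else random.choice([1, 2])

def simulate_semi_markov(n):
    """Simulate X via the Markov pair (X, U), U = remaining sojourn time."""
    x = 0
    u = draw_duration(x) - 1
    path = []
    for _ in range(n):
        path.append(x)
        if u == 0:                              # sojourn over: jump and redraw U
            targets = list(P[x])
            x = random.choices(targets, weights=list(P[x].values()))[0]
            u = draw_duration(x) - 1
        else:
            u -= 1
    return path

path = simulate_semi_markov(20)
```

In an HMC the sojourn times are forced to be geometric; the counter U is precisely what frees the duration law while keeping the triplet Markov, so the usual Bayesian restoration machinery still applies.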
Statistical image segmentation using Triplet Markov fields
SPIE’s International Symposium on Remote Sensing, 2002
Cited by 7 (4 self)
Hidden Markov fields (HMF) are widely used in image processing. In such models, the hidden random field of interest X = (X_s)_{s∈S} is a Markov field, and the distribution of the observed random field Y = (Y_s)_{s∈S} conditional on X is given by p(y|x) = ∏_{s∈S} p(y_s|x_s). The posterior distribution p(x|y) is then a Markov distribution, which affords different Bayesian processing. However, when dealing with the segmentation of images containing numerous classes with different textures, the simple form of the distribution p(y|x) above is insufficient and has to be replaced by a Markov field distribution. This poses problems, because taking p(y|x) Markovian implies that the posterior distribution p(x|y), whose Markovianity is needed to use Bayesian techniques, may no longer be a Markov distribution, and so different model approximations must be made to remedy this. This drawback disappears when considering directly the Markovianity of (X, Y); in these recent "Pairwise Markov Fields" (PMF) models, both p(y|x) and p(x|y) are then Markovian, the first allowing us to model textures, and the second allowing us to use Bayesian restoration without model approximations.
Modeling and Unsupervised Classification of Multivariate Hidden Markov Chains With Copulas
Cited by 4 (0 self)
Abstract—Parametric modeling and estimation of non-Gaussian multidimensional probability density functions is a difficult problem whose solution is required by many applications in signal and image processing. Much effort has been devoted to escaping the usual Gaussian assumption by developing perturbed Gaussian models such as Spherically Invariant Random Vectors (SIRVs). In this work, we introduce an alternative solution based on copulas, which can theoretically represent any multivariate distribution. Estimation procedures are proposed for some mixtures of copula-based densities and are compared in the hidden Markov chain setting, in order to perform statistical unsupervised classification of signals or images. Useful copulas and SIRVs for multivariate signal classification are studied in detail through experiments. Index Terms—Copulas, EM algorithm, hidden Markov chains, hidden Markov models, inference for margins, maximum likelihood, multivariate modeling, spherically invariant random vector (SIRV), statistical classification.
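The copula construction can be sketched as follows (a hypothetical bivariate example with a Gaussian copula and invented margins, not the paper's models): Sklar's factorization writes the joint density as the copula density evaluated at the marginal CDFs, multiplied by the marginal densities, f(y1, y2) = c(F1(y1), F2(y2)) f1(y1) f2(y2).

```python
import math
from statistics import NormalDist

N01 = NormalDist()   # standard normal: pdf, cdf, inverse cdf

def gaussian_copula_density(u1, u2, rho):
    """Density of the Gaussian copula with correlation rho, on (0,1)^2."""
    z1, z2 = N01.inv_cdf(u1), N01.inv_cdf(u2)
    s = 1.0 - rho * rho
    return math.exp(-(rho * rho * (z1 * z1 + z2 * z2) - 2.0 * rho * z1 * z2)
                    / (2.0 * s)) / math.sqrt(s)

def joint_density(y1, y2, rho=0.5, lam=1.0):
    """Bivariate density: Gaussian copula, normal margin on y1,
    exponential(lam) margin on y2 (margins chosen for illustration)."""
    f1, F1 = N01.pdf(y1), N01.cdf(y1)
    f2 = lam * math.exp(-lam * y2)
    F2 = 1.0 - math.exp(-lam * y2)
    return gaussian_copula_density(F1, F2, rho) * f1 * f2

val = joint_density(0.3, 0.8)
```

Setting rho = 0 makes the copula density identically 1, so the joint density factorizes into the product of the margins, which is a quick sanity check on the construction.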
Exact Bayesian smoothing in triplet switching Markov chains
Complex Data Modeling and Computationally Intensive Statistical Methods for Estimation and Prediction, 2009
Cited by 4 (4 self)
Bayesian smoothing in conditionally linear Gaussian models, also called jump-Markov state-space systems, is an NP-hard problem. As a result, a number of approximate methods, either deterministic or Monte Carlo based, have been developed. In this paper we address the Bayesian smoothing problem in another triplet Markov chain model, in which the switching process R is not necessarily Markovian and the additive noises do not need to be Gaussian. We show that in this model the smoothing posterior mean and covariance matrix can be computed exactly with complexity linear in time.
Bayesian smoothing algorithms in pairwise and triplet Markov chains
In Proceedings of the 2005 IEEE Workshop on Statistical Signal Processing (SSP 05), 2005
Cited by 3 (3 self)
An important problem in signal processing consists of estimating an unobservable process x = {xn}n∈IN from an observed process y = {yn}n∈IN. In Linear Gaussian Hidden Markov Chains (LGHMC), recursive solutions are given by Kalman-like Bayesian restoration algorithms. In this paper, we consider the more general framework of Linear Gaussian Triplet Markov Chains (LGTMC), i.e., of models in which the triplet (x, r, y) (where r = {rn}n∈IN is some additional process) is Markovian and Gaussian. We address fixed-interval smoothing algorithms, and we extend to LGTMC the RTS algorithm by Rauch, Tung and Striebel, as well as the Two-Filter algorithm by Mayne and by Fraser and Potter.
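A sketch of RTS fixed-interval smoothing on a stacked linear Gaussian chain (toy matrices, state means only; the LGTMC extension in the paper replaces this pair state by the triplet state):

```python
import numpy as np

# Hypothetical two-dimensional chain z = (x, r): z_{k+1} = F z_k + w_k,
# y_k = H z_k + v_k. All parameter values are illustrative.
F = np.array([[0.8, 0.1],
              [0.0, 0.9]])
Q = np.diag([0.2, 0.1])
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])

def rts_smooth(y):
    """Fixed-interval smoothed state means via a forward Kalman pass
    followed by the backward Rauch-Tung-Striebel recursion."""
    n = len(y)
    zf = np.zeros((n, 2)); Pf = np.zeros((n, 2, 2))   # filtered
    zp = np.zeros((n, 2)); Pp = np.zeros((n, 2, 2))   # predicted
    z, P = np.zeros(2), np.eye(2)
    for k, yk in enumerate(y):                        # forward filtering pass
        zp[k], Pp[k] = F @ z, F @ P @ F.T + Q
        S = H @ Pp[k] @ H.T + R
        K = Pp[k] @ H.T @ np.linalg.inv(S)
        z = zp[k] + K @ (np.atleast_1d(yk) - H @ zp[k])
        P = (np.eye(2) - K @ H) @ Pp[k]
        zf[k], Pf[k] = z, P
    zs = zf.copy()
    for k in range(n - 2, -1, -1):                    # backward RTS pass
        G = Pf[k] @ F.T @ np.linalg.inv(Pp[k + 1])    # smoother gain
        zs[k] = zf[k] + G @ (zs[k + 1] - zp[k + 1])
    return zs

zs = rts_smooth([0.5, 0.2, -0.1, 0.3])
```

The backward pass corrects each filtered mean using information from the future, which is what distinguishes fixed-interval smoothing from pure filtering.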
Exact Smoothing in Hidden Conditionally Markov . . .
2009
Cited by 3 (3 self)
The problem considered in this paper is that of exact smoothing in hidden switching state-space systems. There is a hidden state-space chain X, a switching Markov chain R, and an observed chain Y. In the classical, widely used "conditionally Gaussian state-space linear model" (CGSSLM), exact calculation with complexity linear in time is not feasible, and different approximations have to be made. Different alternative models, in which the exact calculations are feasible, have recently been proposed (2008). The core difference between these models and the classical ones is that the couple (R, Y) is Markov in the recent models, while it is not in the classical ones. Here we propose a further extension of these recent models by relaxing the hypothesis of the Markovianity of X conditionally on (R, Y). In fact, in all the classical models, as well as in the recent ones, the hidden chain X is always Markov conditionally on (R, Y). In the proposed model it can be of any form; in particular, different "long memory" processes can be considered. In spite of this greater generality, we show that the smoothing formulae are still exactly computable with complexity polynomial in time.