Results 1 – 8 of 8
The consistency of the BIC Markov order estimator.
Abstract

Cited by 55 (3 self)
The Bayesian Information Criterion (BIC) estimates the order of a Markov chain (with finite alphabet A) from observation of a sample path x_1, x_2, …, x_n, as the value k = k̂ that minimizes the sum of the negative logarithm of the k-th order maximum likelihood and the penalty term (|A|^k (|A|−1)/2) log n. We show that k̂ equals the correct order of the chain, eventually almost surely as n → ∞, thereby strengthening earlier consistency results that assumed an a priori bound on the order. A key tool is a strong ratio-typicality result for Markov sample paths. We also show that the Bayesian estimator, or minimum description length estimator, of which the BIC estimator is an approximation, fails to be consistent for the uniformly distributed i.i.d. process. AMS 1991 subject classification: Primary 62F12, 62M05; Secondary 62F13, 60J10. Key words and phrases: Bayesian Information Criterion, order estimation, ratio-typicality, Markov chains.
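The criterion in the abstract above can be made concrete with a short sketch: minimize the negative log of the k-th order maximum likelihood plus the penalty (|A|^k (|A|−1)/2) log n over candidate orders k. The function name and the brute-force counting below are illustrative assumptions, not code from the paper:

```python
from collections import defaultdict
from math import log

def bic_order(x, A, k_max):
    """Estimate the order of a finite-alphabet Markov chain via BIC:
    minimize  -log ML_k + (|A|**k * (|A|-1) / 2) * log(n)  over k."""
    n = len(x)
    best_k, best_score = 0, float("inf")
    for k in range(k_max + 1):
        # empirical counts of (length-k context, next symbol)
        ctx_counts = defaultdict(lambda: defaultdict(int))
        for i in range(k, n):
            ctx_counts[tuple(x[i - k:i])][x[i]] += 1
        # negative logarithm of the k-th order maximum likelihood
        neg_log_ml = 0.0
        for nxt in ctx_counts.values():
            total = sum(nxt.values())
            for c in nxt.values():
                neg_log_ml -= c * log(c / total)
        penalty = (len(A) ** k) * (len(A) - 1) / 2 * log(n)
        score = neg_log_ml + penalty
        if score < best_score:
            best_k, best_score = k, score
    return best_k
```

For example, a period-2 path 0,1,0,1,… is a deterministic order-1 chain, and the sketch recovers order 1, while a constant path yields order 0. Note that the sketch still sweeps k up to a user-supplied k_max; the paper's point is that consistency holds with no a priori bound on the true order.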
Context tree estimation for not necessarily finite memory processes, via BIC and MDL
 IEEE Trans. Inf. Theory
, 2006
Abstract

Cited by 25 (1 self)
The concept of context tree, usually defined for finite memory processes, is extended to arbitrary stationary ergodic processes (with finite alphabet). These context trees are not necessarily complete, and may be of infinite depth. The familiar BIC and MDL principles are shown to provide strongly consistent estimators of the context tree, via optimization of a criterion for hypothetical context trees of finite depth, allowed to grow with the sample size n as o(log n). Algorithms are provided to compute these estimators in O(n) time, and to compute them online for all i ≤ n in o(n log n) time.
Order Estimation for a Special Class of Hidden Markov Sources and Binary Renewal Processes
 IEEE Trans. Inf. Theory
, 2002
Abstract

Cited by 5 (0 self)
We consider the estimation of the order, i.e., the number of hidden states, of a special class of discrete-time finite-alphabet hidden Markov sources. This class can be characterized in terms of equivalent renewal processes. No a priori bound is assumed on the maximum permissible order. An order estimator based on renewal types is constructed, and is shown to be strongly consistent by computing the precise asymptotics of the probability of estimation error. The probability of underestimation of the true order decays exponentially in the number of observations, while the probability of overestimation goes to zero sufficiently fast. It is further shown that this estimator has the best possible error exponent in a large class of estimators. Our results are also valid for the general class of binary independent renewal processes with finite mean renewal times.
Two new Markov order estimators
 arXiv preprint math/0506080
, 2005
Abstract

Cited by 4 (0 self)
We present two new methods for estimating the order (memory depth) of a finite-alphabet Markov chain from observation of a sample path. One method is based on entropy estimation via recurrence times of patterns, and the other relies on a comparison of empirical conditional probabilities. The key to both methods is a qualitative change that occurs when a parameter (a candidate for the order) passes the true order. We also present extensions to order estimation for Markov random fields.
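The qualitative change behind the second method above can be illustrated with a toy sketch (my own illustration under assumed definitions, not the paper's estimator): once the candidate context length k reaches the true order, extending the context by one more symbol no longer changes the empirical conditional probabilities, so the largest discrepancy between the two drops toward zero.

```python
from collections import defaultdict

def max_conditional_change(x, k):
    """For each (length-(k+1) context, symbol), compare the empirical
    conditional probability with the one given only the length-k suffix
    of that context; return the largest absolute discrepancy.
    For k at or above the true order, the extra symbol carries no
    information, so this discrepancy vanishes as the path grows."""
    def cond_probs(order):
        counts = defaultdict(lambda: defaultdict(int))
        for i in range(order, len(x)):
            counts[tuple(x[i - order:i])][x[i]] += 1
        return {ctx: {a: c / sum(nxt.values()) for a, c in nxt.items()}
                for ctx, nxt in counts.items()}
    short, longer = cond_probs(k), cond_probs(k + 1)
    worst = 0.0
    for ctx, probs in longer.items():
        suffix = ctx[1:]  # drop the oldest symbol of the longer context
        for a, p in probs.items():
            worst = max(worst, abs(p - short.get(suffix, {}).get(a, 0.0)))
    return worst
```

On the period-2 path 0,1,0,1,… (true order 1), the discrepancy is large at k = 0 but exactly zero at k = 1, which is the kind of threshold behavior an order estimator can detect.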
CONSISTENT ESTIMATION OF THE BASIC NEIGHBORHOOD OF MARKOV RANDOM FIELDS
, 2006
Abstract

Cited by 4 (0 self)
For Markov random fields on Z^d with finite state space, we address the statistical estimation of the basic neighborhood, the smallest region that determines the conditional distribution at a site on the condition that the values at all other sites are given. A modification of the Bayesian Information Criterion, replacing likelihood by pseudo-likelihood, is proved to provide strongly consistent estimation from observing a realization of the field on increasing finite regions: the estimated basic neighborhood equals the true one eventually almost surely, not assuming any prior bound on the size of the latter. Stationarity of the Markov field is not required, and phase transition does not affect the results.
Twice-Universal Simulation of Markov Sources and Individual Sequences
Abstract

Cited by 1 (1 self)
The problem of universal simulation given a training sequence is studied both in a stochastic setting and for individual sequences. In the stochastic setting, the training sequence is assumed to be emitted by a Markov source of unknown order, extending previous work where the order is assumed known and leading to the notion of twice-universal simulation. A simulation scheme, which partitions the set of sequences of a given length into classes, is proposed for this setting and shown to be asymptotically optimal. This partition extends the notion of type classes to the twice-universal setting. In the individual sequence scenario, the same simulation scheme is shown to generate sequences which are statistically similar, in a strong sense, to the training sequence, for statistics of any order, while essentially maximizing the uncertainty on the output.
Consistency of the BIC Order Estimator
, 1999
Abstract
We announce two results on the problem of estimating the order of a Markov chain from observation of a sample path. First is that the Bayesian Information Criterion (BIC) leads to an almost surely consistent estimator. Second is that the Bayesian minimum description length estimator, of which the BIC estimator is an approximation, fails to be consistent for the uniformly distributed i.i.d. process. A key tool is a strong ratio-typicality result for empirical k-block distributions. Complete proofs are given in the authors' article to appear in The Annals of Statistics. 1. Introduction. Let M_k denote the class of Markov chains of order at most k, with values drawn from a finite set A, and let M = ⋃_{k=0}^∞ M_k. An important problem is to estimate the order of a Markov chain from observation of a finite sample path. A popular method is the so-called Bayesian Information Criterion (BIC), first introduced by Schwarz [12], which gives the estimator defined by k̂_BIC = k̂_BIC(x_1^n) ...
Estimation in autoregressive models with Markov regime
, 2008
Abstract
In this paper we derive the consistency of the penalized likelihood method for the number of states of the hidden Markov chain in autoregressive models with Markov regime. A SAEM-type algorithm is used to estimate the model parameters. We test the null hypothesis of a hidden Markov model against an autoregressive process with Markov regime.