## From Finite-System Entropy to Entropy Rate for a Hidden Markov Process (2006)

Venue: | Signal Processing Letters, IEEE, Volume 13, Issue 9, Sept. 2006, Page(s): 517-520 |

Citations: | 10 - 0 self |

### BibTeX

@ARTICLE{Zuk06fromfinite-system,

  author = {Or Zuk and Eytan Domany and Ido Kanter and Michael Aizenman},

  title = {From Finite-System Entropy to Entropy Rate for a Hidden Markov Process},

  journal = {IEEE Signal Processing Letters},

  volume = {13},

  number = {9},

  year = {2006},

  pages = {517--520}

}

### Abstract

A recent result presented the expansion for the entropy rate of a hidden Markov process (HMP) as a power series in the noise variable. The coefficients of the expansion around the noiseless limit were calculated up to 11th order, using a conjecture that relates the entropy rate of an HMP to the entropy of a process of finite length (which is calculated analytically). In this letter, we generalize and prove the conjecture and discuss its theoretical and practical consequences.
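The setting of the conjecture can be illustrated with a small brute-force sketch (this is not the letter's analytic calculation; the binary symmetric model and the parameter values `p` and `eps` below are illustrative assumptions): for a binary HMP, the finite-system conditional entropies H([Y]_1^N) − H([Y]_1^{N-1}) form a decreasing sequence of upper bounds on the entropy rate.

```python
from itertools import product
from math import log2

def hmp_seq_prob(y, p, eps):
    """P(Y = y): sum over all hidden binary paths x of the Markov-chain
    prior times the memoryless binary-symmetric channel likelihood."""
    total = 0.0
    n = len(y)
    for x in product((0, 1), repeat=n):
        prob = 0.5  # symmetric chain: the stationary distribution is uniform
        for i in range(1, n):
            prob *= (1 - p) if x[i] == x[i - 1] else p
        for xi, yi in zip(x, y):
            prob *= (1 - eps) if yi == xi else eps
        total += prob
    return total

def finite_entropy(n, p, eps):
    """Finite-system entropy H([Y]_1^N) in bits, by enumerating
    all 2^n observation vectors."""
    return -sum(q * log2(q)
                for y in product((0, 1), repeat=n)
                if (q := hmp_seq_prob(y, p, eps)) > 0)

p, eps = 0.2, 0.1  # illustrative values, not taken from the letter
for n in range(2, 7):
    h_cond = finite_entropy(n, p, eps) - finite_entropy(n - 1, p, eps)
    print(n, round(h_cond, 6))  # decreasing upper bounds on the entropy rate
```

The exponential cost of the enumeration (2^n observation vectors times 2^n hidden paths) is exactly why a closed-form relation between finite-system entropy and the entropy rate, as proved in the letter, is useful.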

### Citations

8564 |
Elements of Information Theory
- Cover, Thomas
- 2003
Citation Context: ...sometimes we omit the realization of the variable. For a finite-entropy stationary process, the limit (1) exists, and can also be computed via the conditional entropy [5]. Here, H(U|V) represents the conditional entropy, which for random variables U and V is the average uncertainty of the conditional distribution of U given V. By the entropy chain rule, it is also gi... |

6041 |
A mathematical theory of communication
- Shannon
- 1948
Citation Context: ...results can be easily generalized to more cases (e.g., continuous observations). An important quantity for a stochastic process is the Shannon entropy rate, which measures its 'uncertainty per symbol' ([4]). More formally, for i ≤ j let [Y]_i^j denote the vector (Y_i, ..., Y_j). The entropy rate of Y is defined as H̄(Y) = lim_{N→∞} H([Y]_1^N)/N, where H(Y) = −Σ_Y P(Y) log P(Y); sometimes we omit t... |
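The defining limit quoted above can be checked numerically. The sketch below (a binary symmetric Markov chain with flip probability p = 0.2, an illustrative choice not taken from the letter) computes the per-symbol block entropy H([Y]_1^N)/N, which decreases toward the entropy rate, here the conditional entropy h(p).

```python
from itertools import product
from math import log2

def block_entropy(n, p):
    """H([Y]_1^N) in bits for a binary symmetric Markov chain with
    flip probability p, by enumerating all 2^n paths."""
    h = 0.0
    for y in product((0, 1), repeat=n):
        prob = 0.5  # uniform stationary distribution of the symmetric chain
        for i in range(1, n):
            prob *= (1 - p) if y[i] == y[i - 1] else p
        h -= prob * log2(prob)
    return h

p = 0.2
rate = -(p * log2(p) + (1 - p) * log2(1 - p))  # entropy rate h(p)
for n in (1, 2, 4, 8):
    print(n, round(block_entropy(n, p) / n, 4))  # decreases toward h(p) ≈ 0.7219
```

For a stationary Markov chain the block entropy is exactly H_N = H_1 + (N−1)·h(p), so H_N/N approaches the rate only as O(1/N); this slow convergence is part of what makes direct numerical estimation of HMP entropy rates awkward.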

4269 | A tutorial on hidden Markov models and selected applications in speech recognition
- Rabiner
- 1989
Citation Context: ...observation of the chain through a noisy channel. It is a hidden Markov process (HMP), governed by its parameters. HMPs have a rich theory, with applications in various fields, such as speech recognition [1], information theory [2], and signal processing [3]. While we concentrate on a finite-state first-order HMP, our results can be easily generalized to more cases (e.g., continuous observations). An imp... |

424 |
A mathematical theory of communication, Bell Syst. Tech. J.
- Shannon
Citation Context: ...Michael Aizenman is with the Departments of Physics and Mathematics, Princeton University, Princeton, NJ 08544-0708 USA. Digital Object Identifier 10.1109/LSP.2006.874466 [4]. More formally, for i ≤ j, let [Y]_i^j denote the vector (Y_i, ..., Y_j). The entropy rate of Y is defined as H̄(Y) = lim_{N→∞} H([Y]_1^N)/N; sometimes we omit the realization of the variable. For a finite-entropy stationary... |

324 | Wavelet-based statistical signal processing using hidden Markov models
- Crouse, Nowak, et al.
- 1998
Citation Context: ...Markov process (HMP), governed by its parameters. HMPs have a rich theory, with applications in various fields, such as speech recognition [1], information theory [2], and signal processing [3]. While we concentrate on a finite-state first-order HMP, our results can be easily generalized to more cases (e.g., continuous observations). An important quantity for a stochastic process is the Sha... |

172 | Hidden Markov processes
- Ephraim, Merhav
Citation Context: ...noisy channel. It is a hidden Markov process (HMP), governed by its parameters. HMPs have a rich theory, with applications in various fields, such as speech recognition [1], information theory [2], and signal processing [3]. While we concentrate on a finite-state first-order HMP, our results can be easily generalized to more cases (e.g., continuous observations). An important quantity for a st... |

49 |
Completely analytical interactions: constructive description
- Dobrushin, Shlosman
- 1987
Citation Context: ...showed that the law of the process is Gibbsian, together with the complete analyticity results for Gibbsian measures of [11], to deduce analyticity of the entropy rate. The finite-system conditional entropy is in fact an upper bound [5] for the entropy rate. The behavior stated in Theorem 1 was discovered using symbolic computations but was proven only in the binary symmetric case [8]... |

16 |
On the entropy of a hidden Markov process
- Jacquet, Seroussi, et al.
- 2004
Citation Context: ...By the entropy chain rule, it is also given as a difference of entropies, H(U|V) = H(U,V) − H(V). This relation will be used below. There is at present no explicit expression for the entropy rate of an HMP [2], [6]. Few recent works [6]-[8] have studied the asymptotic behavior of H̄ in several regimes, albeit giving rigorously only bounds or at most second-order [8] behavior. Here, we generalize and prove a relati... |

14 |
On the entropy of a hidden Markov process
- Jacquet, Seroussi, et al.
Citation Context: ...entropy chain rule, it is also given as a difference of entropies, H(U|V) = H(U,V) − H(V). This relation will be used below. There is at present no explicit expression for the entropy rate of an HMP ([2, 6]). Few recent works ([6, 7, 8]) have studied the asymptotic behavior of H̄ in several regimes, albeit giving rigorously only bounds or at most second-order ([8]) behavior. Here we generalize and prov... |
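The chain-rule identity H(U|V) = H(U,V) − H(V) quoted in this context is easy to verify numerically; the joint distribution below is an arbitrary illustrative choice, not data from the paper.

```python
from math import log2

# Arbitrary joint distribution P(U, V) over {0,1} x {0,1} (illustrative).
P = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy in bits of a dict of probabilities."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Marginal P(V), and H(U|V) computed directly from its definition:
# H(U|V) = -sum_{u,v} P(u,v) log2 P(u|v).
PV = {v: sum(q for (u, w), q in P.items() if w == v) for v in (0, 1)}
H_U_given_V = -sum(q * log2(q / PV[v]) for (u, v), q in P.items() if q > 0)

# Chain rule: the direct computation matches H(U,V) - H(V).
print(abs(H_U_given_V - (H(P) - H(PV))) < 1e-12)  # True
```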

11 | Asymptotics of the entropy rate for a hidden Markov process
- Zuk, Kanter, et al.
- 2005
Citation Context: ...chain rule, it is also given as a difference of entropies, H(U|V) = H(U,V) − H(V). This relation will be used below. There is at present no explicit expression for the entropy rate of an HMP [2], [6]. Few recent works [6]-[8] have studied the asymptotic behavior of H̄ in several regimes, albeit giving rigorously only bounds or at most second-order [8] behavior. Here, we generalize and prove a relationship, first posed in [8]... |

10 | New bounds on the entropy rate of hidden Markov process - Ordentlich, Weissman - 2004 |

10 |
Transformations of Gibbs measures
- Lörinczi, Maes, Vande Velde
- 1998
Citation Context: ...the functions given there are analytic, with Taylor coefficients that are functions of the parameters. From now on, we omit this dependence. Analyticity around the noiseless limit was recently shown in [9]. One may also use [10], which showed that the law of the process is Gibbsian, together with the complete analyticity results for Gibbsian measu... |

7 |
New bounds on the entropy rate of hidden Markov processes
- Ordentlich, Weissman
- 2004
Citation Context: ...given as a difference of entropies, H(U|V) = H(U,V) − H(V). This relation will be used below. There is at present no explicit expression for the entropy rate of an HMP ([2, 6]). Few recent works ([6, 7, 8]) have studied the asymptotic behavior of H̄ in several regimes, albeit giving rigorously only bounds or at most second-order ([8]) behavior. Here we generalize and prove a relationship, first posed ... |

4 | Analyticity of entropy rate in families of hidden Markov chains
- Han, Marcus
- 2006
Citation Context: ...the functions given there are analytic, with Taylor coefficients that are functions of the parameters. From now on, we omit this dependence. Analyticity around the noiseless limit was recently shown in [9]. One may also use [10], which showed that the law of the process is Gibbsian, together with the complete analyticity res... |

1 |
Taylor series expansions for the entropy rate of hidden Markov processes
- Zuk, Kanter, et al.
- 2006
Citation Context: ...Using Lemma 1, we get (11) and (12). Let y' denote the vector we get from y by changing one coordinate (while keeping the other coordinates). Differentiating gives (13) (see [12] for more details). By Bayes' rule, we get (14). This gives (15), and therefore (16). The latter equality comes from using (7), which "block... |

1 |
Asymptotics of the entropy rate for a hidden Markov process
- Zuk, Kanter, et al.
Citation Context: ...given as a difference of entropies, H(U|V) = H(U,V) − H(V). This relation will be used below. There is at present no explicit expression for the entropy rate of an HMP ([2, 6]). Few recent works ([6, 7, 8]) have studied the asymptotic behavior of H̄ in several regimes, albeit giving rigorously only bounds or at most second-order ([8]) behavior. Here we generalize and prove a relationship, first posed ... |