Results 1–10 of 10
Analyticity of entropy rate of a hidden Markov chain
 In Proc. of IEEE International Symposium on Information Theory, Adelaide, Australia, September 4–9, 2005
, 2005
Abstract

Cited by 19 (8 self)
We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate for this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
From Finite-System Entropy to Entropy Rate for a Hidden Markov Process
 IEEE Signal Processing Letters, Volume 13, Issue 9, Sept. 2006, Page(s): 517
, 2006
Abstract

Cited by 10 (0 self)
Abstract—A recent result presented the expansion for the entropy rate of a hidden Markov process (HMP) as a power series in the noise variable. The coefficients of the expansion around the noiseless limit were calculated up to 11th order, using a conjecture that relates the entropy rate of an HMP to the entropy of a process of finite length (which is calculated analytically). In this letter, we generalize and prove the conjecture and discuss its theoretical and practical consequences.
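The finite-length quantities these papers work with are easy to compute directly for small block lengths. A minimal sketch, assuming a two-state Markov chain observed through a binary symmetric channel with crossover probability eps (all names and parameters are illustrative, not taken from the letter):

```python
import itertools
import math

def block_entropy(P, pi, eps, n):
    """H(Y_1..Y_n) in bits for a 2-state Markov chain with transition matrix P
    and initial distribution pi, observed through a BSC with crossover eps."""
    H = 0.0
    for y in itertools.product((0, 1), repeat=n):
        # Forward recursion: alpha[s] = P(y_1..y_t, X_t = s).
        alpha = [pi[s] * (1 - eps if y[0] == s else eps) for s in (0, 1)]
        for t in range(1, n):
            alpha = [sum(alpha[r] * P[r][s] for r in (0, 1))
                     * (1 - eps if y[t] == s else eps) for s in (0, 1)]
        p = sum(alpha)
        if p > 0:
            H -= p * math.log2(p)
    return H

def cond_entropy_rate_estimate(P, pi, eps, n):
    """H(Y_n | Y_1..Y_{n-1}) = H(Y^n) - H(Y^{n-1}), for n >= 2.
    For a stationary pi this is a standard upper approximation that
    decreases monotonically to the entropy rate of the output process."""
    return block_entropy(P, pi, eps, n) - block_entropy(P, pi, eps, n - 1)
```

For eps = 0 the output is the Markov chain itself and the estimate already equals the Markov entropy rate at n = 2; for eps > 0 the estimates decrease toward the hidden-Markov entropy rate as n grows (at the cost of summing over 2^n output sequences).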
Derivatives of Entropy Rate in Special Families of Hidden Markov Chains
 IEEE Transactions on Information Theory, Volume 53, Issue 7, July 2007, Page(s): 2642
Abstract

Cited by 6 (2 self)
Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13, 14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called “Black Holes.” We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell.
Asymptotics of the input-constrained binary symmetric channel capacity
 Annals of Applied Probability
, 2009
Abstract

Cited by 6 (2 self)
We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117–122], we derive an asymptotic formula (when the noise parameter is small) for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a BSC. Using this result, we establish an asymptotic formula for the capacity of a BSC with input process supported on an irreducible finite-type constraint, as the noise parameter tends to zero. 1. Introduction and background. Let X, Y be discrete random variables with alphabets X, Y and joint probability mass function p_{X,Y}(x,y) ≜ P(X = x, Y = y), x ∈ X, y ∈ Y [for notational simplicity, we will write p(x,y) rather than p_{X,Y}(x,y), and similarly p(x), p(y) rather than p_X(x), p_Y(y), resp., when it ...
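For context, the noiseless capacity of an irreducible finite-type constraint is the base-2 log of the Perron eigenvalue of the adjacency matrix of a graph presenting the constraint. A small sketch using power iteration (the function name and example constraint are illustrative):

```python
import math

def constraint_capacity(A, iters=200):
    """log2 of the spectral radius (Perron eigenvalue) of a nonnegative,
    primitive adjacency matrix A, computed by power iteration with
    max-norm scaling; this is the noiseless capacity of the finite-type
    constraint that A presents."""
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)              # converges to the Perron eigenvalue
        v = [x / lam for x in w]  # rescale to keep the iterate bounded
    return math.log2(lam)

# (1, inf)-RLL constraint: no two consecutive 1s.
A_rll = [[1, 1], [1, 0]]
```

For the (1, ∞)-RLL constraint the Perron eigenvalue is the golden ratio, so the capacity is log2((1 + √5)/2) ≈ 0.694 bits per symbol; the papers above study how this degrades when the constrained input is passed through a BSC with small crossover probability.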
Analyticity of Entropy Rate in Families of Hidden Markov Chains, submitted to
 IEEE Trans. Inf. Theory
, 2005
Abstract

Cited by 4 (0 self)
We prove that under a mild positivity assumption the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. We give examples to show how this can fail in some cases. And we study two natural special classes of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol and binary Markov chains corrupted by binary symmetric noise. Finally, we show that under the positivity assumption the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
Limited-Rate Channel State Feedback for Multicarrier Block Fading Channels
, 2008
Abstract

Cited by 2 (0 self)
The capacity of a fading channel can be substantially increased by feeding back channel state information from the receiver to the transmitter. With limited-rate feedback, what state information to feed back and how to encode it are important open questions. This paper studies power loading in a multicarrier system using no more than one bit of feedback per subchannel. The subchannels can be correlated and full channel state information is assumed at the receiver. First, a simple model with N parallel two-state (good/bad) memoryless subchannels is considered, where the channel state feedback is used to select a fixed number of subchannels to activate. The optimal feedback scheme is the solution to a vector quantization problem, and the associated performance for large N is characterized by a rate-distortion function. As N increases, we show that the loss in forward rate from the asymptotic (rate-distortion) value decreases as (log N)/N and √((log N)/N) with optimal variable- and fixed-rate feedback codes, respectively. We subsequently extend these results to parallel Rayleigh block fading subchannels, where the feedback designates a set of subchannels, which are activated with equal power. Rate-distortion feedback codes are proposed for designating subsets of (good) subchannels with Signal-to-Noise Ratios (SNRs) that exceed a threshold. The associated performance is compared with that of a simpler lossless source coding scheme, which designates groups of good subchannels, where both the group size and threshold are optimized. The rate-distortion codes can provide a significant increase in forward rate at low SNRs.
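The good/bad subchannel model lends itself to a quick simulation. The following sketch compares one bit of feedback per subchannel against no feedback under an equal-power split; all parameter values and names are illustrative, not taken from the paper:

```python
import math
import random

def simulate(N=8, q=0.5, snr_good=10.0, snr_bad=0.1, trials=2000, seed=0):
    """Average sum rate with and without one bit of feedback per subchannel.

    Each subchannel is independently 'good' (gain snr_good) with probability q,
    else 'bad' (gain snr_bad). A unit power budget is split equally over the
    active subchannels; the rate of a subchannel with gain g and power p is
    log2(1 + g * p).
    """
    rng = random.Random(seed)
    with_fb = without_fb = 0.0
    for _ in range(trials):
        gains = [snr_good if rng.random() < q else snr_bad for _ in range(N)]
        # No feedback: power 1/N on every subchannel.
        without_fb += sum(math.log2(1.0 + g / N) for g in gains)
        # One-bit feedback: activate only the good subchannels (all of them,
        # if none are good) and split the power budget equally among them.
        active = [g for g in gains if g == snr_good] or gains
        with_fb += sum(math.log2(1.0 + g / len(active)) for g in active)
    return with_fb / trials, without_fb / trials
```

Concentrating the power budget on the subchannels reported as good yields a noticeably higher average forward rate than blind equal-power loading whenever snr_bad is much smaller than snr_good; the paper's contribution is characterizing the optimal use of this feedback budget, which this toy selection rule does not attempt.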
Asymptotics of Entropy Rate in Special Families of Hidden Markov Chains
, 2008
Abstract

Cited by 1 (1 self)
We generalize a result in [8] and derive an asymptotic formula for the entropy rate of a hidden Markov chain around a “weak Black Hole”. We also discuss applications of the asymptotic formula to the asymptotic behaviors of certain channels. Index Terms: entropy, entropy rate, hidden Markov chain, hidden Markov model, hidden Markov process
Concavity of Mutual Information Rate for Input-Restricted Finite-State Memoryless Channels
Abstract

Cited by 1 (1 self)
Abstract—We consider a finite-state memoryless channel with i.i.d. channel state and the input Markov process supported on a mixing finite-type constraint. We discuss the asymptotic behavior of the entropy rate of the output hidden Markov chain and deduce that the mutual information rate of such a channel is concave with respect to the parameters of the input Markov processes at high signal-to-noise ratio. In principle, the concavity result enables good numerical approximation of the maximum mutual information rate and capacity of such a channel. I. CHANNEL MODEL. In this paper, we show that for certain input-restricted finite-state memoryless channels, the mutual information rate, at high SNR, is effectively a concave function of Markov input processes of a given order. While not directly addressed here, the goal is to help estimate the maximum of this function and ...
Limited-Rate Channel State Feedback for Multicarrier Block Fading Channels
, 2009
Abstract
The capacity of a fading channel can be substantially increased by feeding back channel state information from the receiver to the transmitter. With limited-rate feedback, what state information to feed back and how to encode it are important open questions. This paper studies power loading in a multicarrier system using no more than one bit of feedback per subchannel. The subchannels can be correlated and full channel state information is assumed at the receiver. First, a simple model with N parallel two-state (good/bad) memoryless subchannels is considered, where the channel state feedback is used to select a fixed number of subchannels to activate. The optimal feedback scheme is the solution to a vector quantization problem, and the associated performance for large N is characterized by a rate-distortion function. As N increases, we show that the loss in forward rate from the asymptotic (rate-distortion) value decreases as (log N)/N and √((log N)/N) with optimal variable- and fixed-rate feedback codes, respectively. We subsequently extend these results to parallel Rayleigh block fading subchannels, where the feedback designates a set of subchannels, which are activated with equal power. Rate-distortion feedback codes are proposed for designating subsets of (good) subchannels with Signal-to-Noise Ratios (SNRs) that exceed a threshold. The associated performance is compared with that of a simpler lossless ...
Asymptotics of Noisy Constrained Channel Capacity
, 2007
Abstract
In this paper, we generalize a result in [17] and derive an asymptotic formula for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a binary symmetric channel. We also prove an asymptotic formula for the capacity of a binary symmetric channel with input process supported on an irreducible finite-type constraint.