Results 11–20 of 127
From Finite-System Entropy to Entropy Rate for a Hidden Markov Process
IEEE Signal Processing Letters, Volume 13, Issue 9, Sept. 2006, Page(s): 517
, 2006
"... Abstract—A recent result presented the expansion for the entropy rate of a hidden Markov process (HMP) as a power series in the noise variable. The coefficients of the expansion around the noiseless @ aHAlimit were calculated up to 11th order, using a conjecture that relates the entropy rate of an H ..."
Abstract

Cited by 10 (0 self)
Abstract—A recent result presented the expansion for the entropy rate of a hidden Markov process (HMP) as a power series in the noise variable. The coefficients of the expansion around the noiseless limit were calculated up to 11th order, using a conjecture that relates the entropy rate of an HMP to the entropy of a process of finite length (which is calculated analytically). In this letter, we generalize and prove the conjecture and discuss its theoretical and practical consequences.
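The finite-length entropies the conjecture refers to can be computed exactly for small block lengths. Below is a minimal sketch (not the authors' method) for a binary symmetric Markov chain with flip probability p observed through a BSC(ε); the differences H(Z^n) − H(Z^{n−1}) are the conditional entropies that converge to the entropy rate. All parameter values are illustrative.

```python
import itertools, math

def hmp_block_entropy(n, p, eps):
    """Exact entropy H(Z^n) in bits of the first n outputs of a binary
    symmetric Markov chain (flip prob. p) observed through a BSC(eps)."""
    P = [[1 - p, p], [p, 1 - p]]            # Markov transition matrix
    pi = [0.5, 0.5]                         # stationary distribution
    E = [[1 - eps, eps], [eps, 1 - eps]]    # channel emission matrix
    H = 0.0
    for z in itertools.product((0, 1), repeat=n):
        # forward algorithm: alpha[x] = P(z_1..z_k, X_k = x)
        alpha = [pi[x] * E[x][z[0]] for x in (0, 1)]
        for zk in z[1:]:
            alpha = [sum(alpha[xp] * P[xp][x] for xp in (0, 1)) * E[x][zk]
                     for x in (0, 1)]
        pz = sum(alpha)
        H -= pz * math.log2(pz)
    return H

# conditional entropies H(Z_n | Z^{n-1}) = H(Z^n) - H(Z^{n-1}) decrease
# toward the entropy rate as n grows
p, eps = 0.3, 0.1
rates = [hmp_block_entropy(n, p, eps) - hmp_block_entropy(n - 1, p, eps)
         for n in range(2, 8)]
```

Enumerating all 2^n output strings is only feasible for small n, which is exactly why a relation between finite-system entropy and the entropy rate is useful.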
New bounds on the entropy rate of hidden Markov processes
Information Theory Workshop, 2004, IEEE, 24–29 Oct. 2004, Page(s): 117–122
, 2004
"... Abstract — Let {Xt} be a stationary finitealphabet Markov chain and {Zt} denote its noisy version when corrupted by a discrete memoryless channel. Let P (Xt ∈ ·Z t −∞) denote the conditional distribution of Xt given all past and present noisy observations, a simplexvalued random variable. We pres ..."
Abstract

Cited by 10 (0 self)
Abstract — Let {X_t} be a stationary finite-alphabet Markov chain and {Z_t} denote its noisy version when corrupted by a discrete memoryless channel. Let P(X_t ∈ · | Z^t_{−∞}) denote the conditional distribution of X_t given all past and present noisy observations, a simplex-valued random variable. We present a new approach to bounding the entropy rate of {Z_t} by approximating the distribution of this random variable. This approximation is facilitated by the construction and study of a Markov process whose stationary distribution determines the distribution of P(X_t ∈ · | Z^t_{−∞}). To illustrate the efficacy of this approach, we specialize it and derive concrete bounds for the case of a binary Markov chain corrupted by a binary symmetric channel (BSC). These bounds are seen to capture the behavior of the entropy rate in various asymptotic regimes.
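The conditional distribution P(X_t ∈ · | Z^t_{−∞}) evolves as a Markov (filtering) process, which suggests a simple simulation-based estimate of the entropy rate for the binary-chain/BSC special case: run the exact filter along a simulated path and average −log₂ P(Z_t | Z^{t−1}). This is a hedged illustration of the objects in the abstract, not the paper's bounding technique; parameters are illustrative.

```python
import math, random

def entropy_rate_mc(p, eps, T=200000, seed=0):
    """Monte Carlo estimate (bits/symbol) of the entropy rate of {Z_t}:
    a binary symmetric Markov chain (flip prob. p) through a BSC(eps).
    Averages -log2 P(Z_t | Z_1..Z_{t-1}) computed with the exact filter."""
    rng = random.Random(seed)
    x = rng.randint(0, 1)           # stationary draw for the hidden chain
    belief = 0.5                    # predicted P(X_t = 1 | past observations)
    total = 0.0
    for _ in range(T):
        x = 1 - x if rng.random() < p else x          # chain step
        z = 1 - x if rng.random() < eps else x        # noisy observation
        p1 = belief * (1 - eps) + (1 - belief) * eps  # P(Z_t = 1 | past)
        pz = p1 if z == 1 else 1 - p1
        total -= math.log2(pz)
        # Bayes update on z, then one-step Markov prediction
        post = belief * (1 - eps) / p1 if z == 1 else belief * eps / (1 - p1)
        belief = post * (1 - p) + (1 - post) * p
    return total / T

est = entropy_rate_mc(0.3, 0.1)
```

The belief variable is exactly the simplex-valued random variable of the abstract (here a scalar, since the alphabet is binary), and its long-run distribution is what the paper's bounds approximate.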
Universal minimax discrete denoising under channel uncertainty
 IEEE Trans. Inf. Theory
, 2006
"... The goal of a denoising algorithm is to recover a signal from its noisecorrupted observations. Perfect recovery is seldom possible and performance is measured under a given singleletter fidelity criterion. For discrete signals corrupted by a known DMC, a denoising scheme, the DUDE algorithm, was r ..."
Abstract

Cited by 10 (3 self)
The goal of a denoising algorithm is to recover a signal from its noise-corrupted observations. Perfect recovery is seldom possible, and performance is measured under a given single-letter fidelity criterion. For discrete signals corrupted by a known DMC, a denoising scheme, the DUDE algorithm, was recently shown to perform this task practically and asymptotically optimally, with no knowledge of the statistical properties of the signal. In the present work we address the scenario where, in addition to the lack of knowledge of the source statistics, there is also uncertainty in the channel characteristics. We propose a family of discrete denoisers and establish their asymptotic optimality under a minimax criterion that we argue is appropriate for this setting. The proposed schemes can be implemented efficiently.
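For reference, the basic known-channel DUDE on which this work builds admits a compact implementation for the binary symmetric channel. The sketch below follows the standard two-pass count-then-denoise recipe with the generic expected-loss rule (counts weighted by the inverse channel matrix); the context length and demo parameters are arbitrary choices, not values from the paper.

```python
import random
from collections import defaultdict

def dude_bsc(z, delta, k=2):
    """Minimal 1-D DUDE sketch: binary sequence z corrupted by a BSC(delta),
    Hamming loss, two-sided context of length k on each side."""
    n = len(z)
    # pass 1: count how often 0/1 appears inside each two-sided context
    counts = defaultdict(lambda: [0, 0])
    for i in range(k, n - k):
        c = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        counts[c][z[i]] += 1
    d = delta
    Pi = [[1 - d, d], [d, 1 - d]]               # channel matrix
    det = 1 - 2 * d
    PiInv = [[(1 - d) / det, -d / det], [-d / det, (1 - d) / det]]
    lam = [[0, 1], [1, 0]]                      # Hamming loss lam[x][xhat]
    out = list(z)
    # pass 2: per position, pick xhat minimizing the estimated expected loss
    #   cost(xhat) = sum_x (m Pi^{-1})_x * lam[x][xhat] * Pi[x][z_i]
    for i in range(k, n - k):
        c = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        m = counts[c]
        zi = z[i]
        costs = []
        for xhat in (0, 1):
            cost = sum(sum(m[a] * PiInv[a][x] for a in (0, 1))
                       * lam[x][xhat] * Pi[x][zi] for x in (0, 1))
            costs.append(cost)
        out[i] = 0 if costs[0] <= costs[1] else 1
    return out

# demo: denoise a sticky binary Markov signal corrupted by a BSC(0.1)
rng = random.Random(1)
x = [0]
for _ in range(9999):
    x.append(x[-1] if rng.random() > 0.1 else 1 - x[-1])
z = [b if rng.random() > 0.1 else 1 - b for b in x]
xhat = dude_bsc(z, delta=0.1, k=2)
err_noisy = sum(a != b for a, b in zip(x, z))
err_dude = sum(a != b for a, b in zip(x, xhat))
```

The channel-uncertainty setting of this paper replaces the single known Π above with a family of candidate channels and a minimax objective.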
Determining and approaching achievable rates of binary intersymbol interference channels using multistage decoding
 IEEE Trans. Information Theory
, 2007
"... Abstract—By examining the achievable rates of a multistage decoding system on stationary ergodic channels, we derive lower bounds on the mutual information rate corresponding to independent and uniformly distributed (i.u.d.) inputs, also referred to as the i.u.d. information rate. For binary intersy ..."
Abstract

Cited by 9 (4 self)
Abstract—By examining the achievable rates of a multistage decoding system on stationary ergodic channels, we derive lower bounds on the mutual information rate corresponding to independent and uniformly distributed (i.u.d.) inputs, also referred to as the i.u.d. information rate. For binary intersymbol interference (ISI) channels, we show that these bounds become tight as the number of decoding stages increases. Our analysis, which focuses on the marginal conditional output densities at each stage of decoding, provides an information rate corresponding to each stage. These rates underlie the design of multilevel coding schemes, based upon low-density parity-check (LDPC) codes and message passing, that in combination with multistage decoding approach the i.u.d. information rate for binary ISI channels. We give example constructions for channel models that have been commonly used in magnetic recording. These examples demonstrate that the technique is very effective even for a small number of decoding stages. Index Terms—Bahl–Cocke–Jelinek–Raviv (BCJR) algorithm, coset codes, density evolution, finite-state channels, information rates, intersymbol interference (ISI) channels, low-density parity-check (LDPC) codes, magnetic recording, multilevel coding, multistage decoding.
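The i.u.d. information rate of a binary ISI channel can itself be estimated by simulation with a forward (sum-product) recursion over the channel trellis, in the spirit of well-known simulation-based rate estimators. The sketch below uses the dicode channel y_k = x_k − x_{k−1} + n_k with Gaussian noise as an illustrative example; it is not the multistage construction of the paper, and the noise levels are arbitrary.

```python
import math, random

def iud_rate_dicode(sigma, n=60000, seed=0):
    """Estimate the i.u.d. information rate (bits/symbol) of the dicode ISI
    channel y_k = x_k - x_{k-1} + n_k, x_k i.u.d. in {-1,+1},
    n_k ~ N(0, sigma^2), via a Monte Carlo forward recursion."""
    rng = random.Random(seed)

    def pdf(y, mean):
        return math.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / (
            sigma * math.sqrt(2 * math.pi))

    xprev = 1
    alpha = {-1: 0.0, 1: 1.0}      # P(state x_{k-1} | y^{k-1}); x_0 = +1 known
    logp = 0.0                     # accumulates log2 p(y_k | y^{k-1})
    for _ in range(n):
        x = rng.choice((-1, 1))
        y = x - xprev + rng.gauss(0.0, sigma)
        xprev = x
        # forward (sum-product) step over the one-bit channel state
        new = {s: sum(alpha[sp] * 0.5 * pdf(y, s - sp) for sp in (-1, 1))
               for s in (-1, 1)}
        scale = new[-1] + new[1]   # = p(y_k | y^{k-1})
        logp += math.log2(scale)
        alpha = {s: v / scale for s, v in new.items()}
    h_y = -logp / n                                    # ≈ h(Y) per symbol
    h_y_given_x = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)
    return h_y - h_y_given_x       # I(X;Y) rate = h(Y) - h(Y|X)

rate_low_noise = iud_rate_dicode(0.5, seed=1)
rate_high_noise = iud_rate_dicode(1.5, seed=2)
```

The −(1/n)log₂ p(y^n) term converges to the output entropy rate by the Shannon–McMillan–Breiman property, and subtracting the conditional noise entropy gives the rate the multistage bounds of the paper approach from below.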
On a role of predictor in the filtering stability
 Electron. Comm. Probab
"... Abstract. When is a nonlinear filter stable with respect to its initial condition? In spite of the recent progress, this question still lacks a complete answer in general. Currently available results indicate that stability of the filter depends on the signal ergodic properties and the observation p ..."
Abstract

Cited by 8 (1 self)
Abstract. When is a nonlinear filter stable with respect to its initial condition? In spite of recent progress, this question still lacks a complete answer in general. Currently available results indicate that stability of the filter depends on the ergodic properties of the signal and the regularity of the observation process, and may fail if either ingredient is ignored. In this note we address the question of stability in a particular weak sense and show that the estimates of certain functions are always stable. This is verified without dealing directly with the filtering equation and turns out to be inherited from certain one-step predictor estimates.
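The kind of stability discussed here is easy to visualize numerically: run the same filter on one observation path from two different initial conditions and watch the posteriors merge. A toy sketch for a binary hidden Markov chain (parameters illustrative, not from the paper):

```python
import random

def filter_gap(p, eps, T=200, seed=0):
    """Run one binary-HMP filter from two different priors on the same
    observation path; return |posterior difference| over time."""
    rng = random.Random(seed)
    x = rng.randint(0, 1)
    b1, b2 = 0.9, 0.1              # two mismatched initial beliefs P(X = 1)
    gaps = []
    for _ in range(T):
        x = 1 - x if rng.random() < p else x          # signal step
        z = 1 - x if rng.random() < eps else x        # noisy observation

        def step(b):
            # Bayes update on z, then one-step Markov prediction
            num = b * (1 - eps) if z == 1 else b * eps
            den = num + ((1 - b) * eps if z == 1 else (1 - b) * (1 - eps))
            post = num / den
            return post * (1 - p) + (1 - post) * p

        b1, b2 = step(b1), step(b2)
        gaps.append(abs(b1 - b2))
    return gaps

gaps = filter_gap(0.3, 0.1)
```

With an ergodic signal and informative observations the gap decays quickly; the paper's point is about what survives of this behavior in weaker senses when such assumptions are relaxed.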
Selecting Hidden Markov Model State Number with Cross-Validated Likelihood
 Computational Statistics
"... Abstract: The problem of estimating the number of hidden states in a hidden Markov model is considered. Emphasis is placed on crossvalidated likelihood criteria. Using crossvalidation to assess the number of hidden states allows to circumvent the well documented ..."
Abstract

Cited by 8 (1 self)
Abstract: The problem of estimating the number of hidden states in a hidden Markov model is considered. Emphasis is placed on cross-validated likelihood criteria. Using cross-validation to assess the number of hidden states makes it possible to circumvent the well-documented ...
Discrete denoising for channels with memory
 Communications in Information and Systems (CIS)
"... Abstract. We consider the problem of estimating a discrete signal X n = (X1,..., Xn) based on its noisecorrupted observation signal Z n = (Z1,..., Zn). The noisefree, noisy, and reconstruction signals are all assumed to have components taking values in the same finite Mary alphabet {0,..., M − 1 ..."
Abstract

Cited by 8 (3 self)
Abstract. We consider the problem of estimating a discrete signal X^n = (X_1, ..., X_n) based on its noise-corrupted observation signal Z^n = (Z_1, ..., Z_n). The noise-free, noisy, and reconstruction signals are all assumed to have components taking values in the same finite M-ary alphabet {0, ..., M − 1}. For concreteness we focus on the additive noise channel Z_i = X_i + N_i, where addition is modulo-M and {N_i} is the noise process. The cumulative loss is measured by a given loss function. The distribution of the noise is assumed known, and may have memory restricted only by stationarity and a mild mixing condition. We develop a sequence of denoisers (indexed by the block length n) which we show to be asymptotically universal in both a semi-stochastic setting (where the noiseless signal is an individual sequence) and in a fully stochastic setting (where the noiseless signal is emitted from a stationary source). It is detailed how the problem formulation, denoising schemes, and performance guarantees carry over to non-additive channels, as well as to higher-dimensional data arrays. The proposed schemes are shown to be computationally implementable. We also discuss a variation on these schemes that is likely to do well on data of moderate size. We conclude with a report of experimental results for the binary burst noise channel, where the noise is a finite-state hidden Markov process (FSHMP), and a finite-state hidden Markov random field (FSHMRF), in the respective cases of one- and two-dimensional data. These support the theoretical predictions and show that, in practice, there is much to be gained by taking the channel memory into account.
The entropy of a binary hidden Markov process
 J. Stat. Phys.
, 2005
"... The entropy of a binary symmetric Hidden Markov Process is calculated as an expansion in the noise parameter ɛ. We map the problem onto a onedimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in ɛ. Using a conjecture we extend the ca ..."
Abstract

Cited by 7 (1 self)
The entropy of a binary symmetric Hidden Markov Process is calculated as an expansion in the noise parameter ɛ. We map the problem onto a one-dimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in ɛ. Using a conjecture, we extend the calculation to 11th order and discuss the convergence of the resulting series.
The empirical distribution of rate-constrained source codes
 IEEE Trans. Inform. Theory
"... Let X =(X1,...) be a stationary ergodic finitealphabet source, X n denote its first n symbols, and Y n be the codeword assigned to X n by a lossy source code. The empirical kthorder joint distribution ˆ Q k [X n,Y n](x k,y k)is defined as the frequency of appearances of pairs of kstrings (x k,y k ..."
Abstract

Cited by 7 (2 self)
Let X = (X_1, ...) be a stationary ergodic finite-alphabet source, let X^n denote its first n symbols, and let Y^n be the codeword assigned to X^n by a lossy source code. The empirical kth-order joint distribution Q̂_k[X^n, Y^n](x^k, y^k) is defined as the frequency of appearances of pairs of k-strings (x^k, y^k) along the pair (X^n, Y^n). Our main interest is in the sample behavior of this (random) distribution. Letting I(Q_k) denote the mutual information I(X^k; Y^k) when (X^k, Y^k) ∼ Q_k, we show that for any (sequence of) lossy source code(s) of rate ≤ R,

lim sup_{n→∞} (1/k) I(Q̂_k[X^n, Y^n]) ≤ R.
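The empirical kth-order joint distribution and its mutual information I(Q̂_k) are straightforward to compute for concrete sequences. A small sketch, using sliding k-blocks (an illustrative convention, not necessarily the paper's exact windowing):

```python
import math, random
from collections import Counter

def empirical_mi(x, y, k):
    """Mutual information I(Q_hat_k) in bits of the empirical kth-order joint
    distribution of k-blocks taken (sliding) along the pair (x, y)."""
    n = len(x)
    pairs = Counter((tuple(x[i:i + k]), tuple(y[i:i + k]))
                    for i in range(n - k + 1))
    total = sum(pairs.values())
    Q = {key: c / total for key, c in pairs.items()}
    Qx, Qy = Counter(), Counter()           # marginals of the joint
    for (xk, yk), q in Q.items():
        Qx[xk] += q
        Qy[yk] += q
    return sum(q * math.log2(q / (Qx[xk] * Qy[yk]))
               for (xk, yk), q in Q.items())

rng = random.Random(0)
a = [rng.randint(0, 1) for _ in range(10000)]
b = [rng.randint(0, 1) for _ in range(10000)]
mi_same = empirical_mi(a, a, 1)     # near the entropy of a fair coin
mi_indep = empirical_mi(a, b, 1)    # near zero for independent sequences
```

The result quoted above says that for codes of rate ≤ R, this per-letter empirical mutual information cannot asymptotically exceed R.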
An Autoregressive Model with Time-Varying Coefficients for Wind Fields
, 2005
"... In this paper, an original Markovswitching autoregressive model is proposed to describe the spacetime evolution of wind fields. At first, a nonobservable process is introduced in order to model the motion of the meteorological structures. Then, ..."
Abstract

Cited by 7 (5 self)
In this paper, an original Markov-switching autoregressive model is proposed to describe the space-time evolution of wind fields. At first, a non-observable process is introduced in order to model the motion of the meteorological structures. Then, ...
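A Markov-switching autoregressive process is simple to simulate: a hidden Markov regime chooses the AR coefficient and innovation scale at each step. The sketch below is a generic toy version with made-up parameters, not the wind-field model of the paper.

```python
import random

def simulate_msar(T, seed=0):
    """Simulate a toy Markov-switching AR(1): a hidden two-state regime
    chooses the AR coefficient and innovation scale at each step."""
    rng = random.Random(seed)
    P = [[0.95, 0.05], [0.10, 0.90]]   # regime transition matrix
    phi = [0.5, 0.9]                   # per-regime AR coefficients
    sig = [1.0, 0.3]                   # per-regime innovation std devs
    s, y = 0, 0.0
    states, ys = [], []
    for _ in range(T):
        s = s if rng.random() < P[s][s] else 1 - s   # regime step
        y = phi[s] * y + rng.gauss(0.0, sig[s])      # AR(1) step
        states.append(s)
        ys.append(y)
    return states, ys

states, ys = simulate_msar(5000)
```

Because the diagonal of the transition matrix is large, the simulated series shows persistent regimes of calm versus volatile dynamics, the qualitative behavior such models are meant to capture.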