Results 1–10 of 16
Moderate deviations of lossy source coding for discrete and Gaussian sources
In Int. Symp. Inf. Theory, 2012
The Dispersion of Slepian–Wolf Coding
Cited by 4 (0 self)
Abstract—We characterize second-order coding rates (or dispersions) for distributed lossless source coding (the Slepian–Wolf problem). We introduce a fundamental quantity known as the entropy dispersion matrix, which is analogous to scalar dispersion quantities. We show that if this matrix is positive-definite, the optimal rate region under the constraint of a fixed blocklength and non-zero error probability has a curved boundary, compared to the polyhedral Slepian–Wolf region. In addition, the entropy dispersion matrix governs the rate of convergence of the non-asymptotic region to the asymptotic one. As a by-product of our analyses, we develop a general universal achievability procedure for dispersion analysis of some other network information theory problems such as the multiple-access channel. Numerical examples show how the region given by Gaussian approximations compares to the Slepian–Wolf region.
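As a back-of-envelope illustration of the scalar dispersion approximation that this abstract generalizes to the matrix setting — in the one-encoder case the minimal rate at blocklength n and error probability ε is approximately H(X|Y) + sqrt(V/n)·Q⁻¹(ε), with V the conditional varentropy — here is a minimal Python sketch. The joint pmf and all function names below are illustrative choices, not taken from the paper:

```python
# Sketch of the scalar Gaussian (dispersion) approximation for the
# single-encoder Slepian-Wolf rate at blocklength n and error eps:
#   R(n, eps) ~= H(X|Y) + sqrt(V / n) * Qinv(eps)
# The joint pmf below is an illustrative example, not from the paper.
import math
from statistics import NormalDist

def qinv(eps):
    # Inverse of the Gaussian Q-function: Q(x) = 1 - Phi(x).
    return NormalDist().inv_cdf(1.0 - eps)

def dispersion_rate(p_xy, n, eps):
    # p_xy: dict mapping (x, y) -> joint probability.
    p_y = {}
    for (x, y), p in p_xy.items():
        p_y[y] = p_y.get(y, 0.0) + p
    # Conditional information density i(x|y) = -log2 p(x|y).
    h = 0.0   # conditional entropy H(X|Y), in bits
    m2 = 0.0  # second moment of i(x|y)
    for (x, y), p in p_xy.items():
        i_xy = -math.log2(p / p_y[y])
        h += p * i_xy
        m2 += p * i_xy ** 2
    v = m2 - h ** 2  # conditional varentropy (scalar dispersion)
    return h + math.sqrt(v / n) * qinv(eps)

# Doubly symmetric binary source with crossover probability 0.1:
p = 0.1
p_xy = {(0, 0): (1 - p) / 2, (0, 1): p / 2,
        (1, 0): p / 2, (1, 1): (1 - p) / 2}
print(dispersion_rate(p_xy, n=1000, eps=1e-3))
```

For this source, H(X|Y) ≈ 0.469 bits, and the finite-blocklength penalty at n = 1000, ε = 10⁻³ adds roughly 0.09 bits to the required rate.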
Moderate Deviations for Joint Source-Channel Coding of Systems With Markovian Memory
Cited by 2 (1 self)
Abstract—We study the (almost lossless) joint source-channel coding problem from the moderate deviations perspective, where the bandwidth expansion ratio tends towards the ratio of the channel capacity and source entropy at a rate larger than n^{-1/2} (n being the channel blocklength) and the error probability decays sub-exponentially. We consider the stationary ergodic Markov (SEM) source as well as discrete memoryless and additive SEM channels. We also discuss the loss due to separation in the moderate deviations setting.
On the Linear Codebook-Level Duality Between Slepian–Wolf Coding and Channel Coding
Cited by 2 (0 self)
Abstract—In this paper, it is shown that each Slepian–Wolf coding problem is related to a dual channel coding problem in the sense that the sphere packing exponents, random coding exponents, and correct decoding exponents in these two problems are mirror-symmetrical to each other. This mirror symmetry is interpreted as a manifestation of the linear codebook-level duality between Slepian–Wolf coding and channel coding. Furthermore, this duality, in conjunction with a systematic analysis of the expurgated exponents, reveals that non-linear Slepian–Wolf codes can strictly outperform linear Slepian–Wolf codes in terms of rate-error tradeoff at high rates. The linear codebook-level duality is also established for general sources and channels. Index Terms—Channel coding, duality, error exponent, linear code, reliability function, Slepian–Wolf coding.
On Secret Key Generation From Finite Source Observations
Cited by 1 (1 self)
Abstract—All existing secret key generation schemes assume that the users have access to an infinite number of source observations. Motivated by applications in wireless networks, we consider the problem of generating secret keys from a finite number of correlated observations under the source model. We investigate the relationship among the achievable secret key rate, the decoding error probability, and the number of observations. Based on the connection between secret key generation and distributed source coding problems, we present two key generation schemes. In the first scheme, the length of the generated key varies depending on the realization of the source sequence, and the generated key is conditionally perfectly secure. We characterize the penalty, due to the finiteness of the number of observations, in the key rate compared to that of the scheme with an infinite number of observations. In the second scheme, the length of the generated key is fixed, and the key is unconditionally perfectly secure. We characterize the additional penalty associated with these additional features in the achievable key rate. Index Terms—Finite block length, fixed-length, key generation, variable-length.
Near Lossless Source Coding with Side Information at the Decoder: Beyond Conditional Entropy
Abstract—In near lossless source coding with decoder-only side information, i.e., Slepian–Wolf coding (with one encoder), a source X with finite alphabet X is first encoded, and then later decoded subject to a small error probability with the help of side information Y with finite alphabet Y available only to the decoder. The classical result by Slepian and Wolf shows that the minimum average compression rate achievable asymptotically subject to a small error probability constraint for a memoryless pair (X, Y) is given by the conditional entropy H(X|Y). In this paper, we look beyond conditional entropy and investigate the tradeoff between compression rate and decoding error spectrum in Slepian–Wolf coding when the decoding error probability goes to zero exponentially fast. It is shown that when the decoding error probability goes to zero at the speed of 2^{-δn}, where δ is a positive constant and n denotes the source sequences' length, the minimum average compression rate achievable asymptotically is strictly greater than H(X|Y) regardless of how small δ is. More specifically, the minimum average compression rate achievable asymptotically is lower bounded by a quantity called the intrinsic conditional entropy H_in(X|Y, δ), which is strictly greater than H(X|Y), and is also asymptotically achievable for small δ.
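The tension this abstract describes — demanding an exponentially decaying error probability pushes the rate strictly above H(X|Y) — can be seen numerically. For an i.i.d. pair, the dominant error event in random binning at rate R is the conditional information density exceeding nR, and its probability shrinks exponentially in n only for rates above H(X|Y). A minimal Monte Carlo sketch under an assumed doubly symmetric binary model (the model, parameters, and names are illustrative, not from the paper):

```python
# Monte Carlo sketch: at a fixed rate R > H(X|Y), the probability that the
# conditional information density exceeds nR (the dominant error event in
# Slepian-Wolf style random binning) decays with the blocklength n.
# Source model (illustrative): doubly symmetric binary, crossover 0.1.
import math
import random

random.seed(0)
p = 0.1
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # H(X|Y) ~ 0.469 bits

def overflow_prob(rate, n, trials=5000):
    i_good = -math.log2(1 - p)  # density when X agrees with Y
    i_bad = -math.log2(p)       # density when X disagrees with Y
    count = 0
    for _ in range(trials):
        s = sum(i_bad if random.random() < p else i_good for _ in range(n))
        if s > n * rate:
            count += 1
    return count / trials

# Estimated overflow probability drops sharply as n grows,
# since rate 0.6 sits strictly above H(X|Y) ~ 0.469:
for n in (100, 400):
    print(n, overflow_prob(rate=0.6, n=n))
```

Running the sketch shows the overflow probability falling by more than an order of magnitude between n = 100 and n = 400, consistent with exponential decay at a fixed rate gap above H(X|Y).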
Finite Blocklength Slepian–Wolf Coding
Abstract—We characterize the fundamental limits for distributed lossless source coding (Slepian–Wolf) in the finite blocklength regime. We introduce a fundamental quantity known as the entropy dispersion matrix, which is analogous to scalar dispersion quantities. We show that if this matrix is positive-definite, the optimal rate region under the constraint of a fixed blocklength and non-zero error probability has a curved boundary, compared to the polyhedral Slepian–Wolf region. In addition, the entropy dispersion matrix governs the rate of convergence of the non-asymptotic region to the asymptotic one. As a by-product of our analyses, we develop a general universal achievability procedure for finite blocklength analysis of some other network information theory problems such as the multiple-access channel. Numerical examples show how the non-asymptotic region compares to the Slepian–Wolf region. Index Terms—Slepian–Wolf, dispersion, finite blocklength.
Non-Asymptotic and Asymptotic Analyses on Markov Chains in Several Problems
Abstract—In this paper, we derive non-asymptotic achievability and converse bounds on source coding with side information and random number generation with side information. Our bounds are efficiently computable in the sense that the computational complexity does not depend on the block length. We also characterize the asymptotic behaviors of the large deviation regime and the moderate deviation regime by using our bounds, which implies that our bounds are asymptotically tight in those regimes. We also show the second-order rates of those problems, and derive single-letter forms of the variances characterizing the second-order rates.