Results 1 - 6 of 6
The finite-dimensional Witsenhausen counterexample, 2009
Abstract

Cited by 13 (8 self)
Recently, we considered a vector version of Witsenhausen’s counterexample and connected the problem formulation to an information-theoretic problem called “assisted interference suppression” that was itself inspired by work on the so-called “cognitive radio channel.” We used a new lower bound to show that in the limit of infinite vector length, certain quantization-based strategies are provably within a constant factor of the optimal cost for all possible problem parameters. In this paper, finite vector lengths are considered, with the vector length viewed as an additional problem parameter. By applying the “sphere-packing” philosophy, a lower bound on the optimal cost for this finite-length problem is derived that uses appropriate shadows of the infinite-length bounds. We also introduce lattice-based quantization strategies for any finite length. Using the new finite-length lower bound, we show that the lattice-based strategies achieve within a constant factor of the optimal cost uniformly over all possible problem parameters, including the vector length. For Witsenhausen’s original problem (the scalar case), lattice-based strategies attain within a factor of 8 of the optimal cost. Based on observations in the scalar case and the infinite-dimensional case, we also conjecture what the optimal strategies could be for any finite vector length.
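The scalar lattice-based strategy described above can be sketched numerically: the first controller snaps the state to the nearest point of a one-dimensional lattice, and the second controller re-quantizes its noisy observation to the same lattice. The following is a minimal Monte-Carlo sketch under assumed parameter values (`k`, `sigma0`, and lattice spacing `d` are illustrative), not the paper's construction or its optimized spacing.

```python
import random

def nearest_lattice_point(x, d):
    """Snap x to the nearest point of the lattice d*Z."""
    return d * round(x / d)

def lattice_cost(k=0.2, sigma0=5.0, d=4.0, trials=100_000, seed=0):
    """Monte-Carlo estimate of the two-stage Witsenhausen cost
    k^2*E[u1^2] + E[(x1 - u2)^2] under a scalar lattice strategy.
    All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x0 = rng.gauss(0.0, sigma0)        # initial state
        x1 = nearest_lattice_point(x0, d)  # controller 1 quantizes the state
        u1 = x1 - x0                       # first-stage control effort
        y = x1 + rng.gauss(0.0, 1.0)       # unit-variance observation noise
        u2 = nearest_lattice_point(y, d)   # controller 2 re-quantizes
        total += k * k * u1 * u1 + (x1 - u2) ** 2
    return total / trials
```

Widening `d` shrinks the second-stage decoding errors but inflates the first-stage effort, which is the trade-off the lower bound in the abstract quantifies.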
The Dispersion of Slepian-Wolf Coding
Abstract

Cited by 4 (0 self)
Abstract—We characterize second-order coding rates (or dispersions) for distributed lossless source coding (the Slepian-Wolf problem). We introduce a fundamental quantity known as the entropy dispersion matrix, which is analogous to scalar dispersion quantities. We show that if this matrix is positive-definite, the optimal rate region under the constraint of a fixed blocklength and nonzero error probability has a curved boundary, in contrast to the polyhedral boundary of the asymptotic Slepian-Wolf region. In addition, the entropy dispersion matrix governs the rate of convergence of the non-asymptotic region to the asymptotic one. As a byproduct of our analyses, we develop a general universal achievability procedure for dispersion analysis of some other network information theory problems, such as the multiple-access channel. Numerical examples show how the region given by Gaussian approximations compares to the Slepian-Wolf region.
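For intuition, the scalar (single-source) analog of the Gaussian approximation mentioned above can be computed directly: the minimal rate at blocklength n and error probability eps is approximately H + sqrt(V/n) * Q^-1(eps), where V is the variance of the self-information. This sketch covers only that scalar analog; the entropy dispersion matrix of the abstract generalizes V to the multi-terminal setting.

```python
import math
from statistics import NormalDist

def entropy_and_dispersion(p):
    """Entropy H (bits) and scalar dispersion V = Var(-log2 p(X))
    for a distribution given as a list of probabilities."""
    ilogs = [-math.log2(q) for q in p]            # self-informations
    H = sum(q * i for q, i in zip(p, ilogs))
    V = sum(q * (i - H) ** 2 for q, i in zip(p, ilogs))
    return H, V

def gaussian_approx_rate(p, n, eps):
    """Second-order (normal) approximation to the minimal compression
    rate at blocklength n and error probability eps."""
    H, V = entropy_and_dispersion(p)
    q_inv = NormalDist().inv_cdf(1.0 - eps)       # Q^{-1}(eps)
    return H + math.sqrt(V / n) * q_inv
```

As n grows, the approximation converges to the entropy at rate O(1/sqrt(n)), mirroring the convergence of the non-asymptotic region to the asymptotic one described in the abstract.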
Finite Blocklength Slepian-Wolf Coding
Abstract
Abstract—We characterize the fundamental limits of distributed lossless source coding (Slepian-Wolf coding) in the finite blocklength regime. We introduce a fundamental quantity known as the entropy dispersion matrix, which is analogous to scalar dispersion quantities. We show that if this matrix is positive-definite, the optimal rate region under the constraint of a fixed blocklength and nonzero error probability has a curved boundary, in contrast to the polyhedral boundary of the asymptotic Slepian-Wolf region. In addition, the entropy dispersion matrix governs the rate of convergence of the non-asymptotic region to the asymptotic one. As a byproduct of our analyses, we develop a general universal achievability procedure for finite blocklength analysis of some other network information theory problems, such as the multiple-access channel. Numerical examples show how the non-asymptotic region compares to the Slepian-Wolf region. Index Terms—Slepian-Wolf coding, dispersion, finite blocklength.
The finite-dimensional Witsenhausen counterexample
Abstract
Recently, a vector version of Witsenhausen’s counterexample was considered, and it was shown that in the limit of infinite vector length, certain quantization-based control strategies are provably within a constant factor of the optimal cost for all possible problem parameters. In this paper, finite vector lengths are considered, with the dimension viewed as an additional problem parameter. By applying a large-deviation “sphere-packing” philosophy, a lower bound on the optimal cost for the finite-dimensional case is derived that uses appropriate shadows of the infinite-length bound. Using the new lower bound, we show that good lattice-based control strategies achieve within a constant factor of the optimal cost uniformly over all possible problem parameters, including the vector length. For Witsenhausen’s original problem (the scalar case), the gap between regular lattice-based strategies and the lower bound is numerically never more than a factor of 8.
Universal Quadratic Lower Bounds on Source Coding Error Exponents
Abstract
Abstract—We consider the problem of block-size selection to achieve a desired probability of error for universal source coding. While Baron et al. in [1], [9] studied this question for rates in the vicinity of entropy for known distributions using central-limit-theorem techniques, we are interested in all rates for unknown distributions and use error-exponent techniques. By adapting a technique of Gallager from the exercises of [7], we derive a universal lower bound on the source-coding error exponent that depends only on the alphabet size and is quadratic in the gap to entropy.
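As a benchmark for universal bounds of this kind, the exact source-coding error exponent for a known distribution can be computed from Gallager's E0 function, E(R) = max over rho >= 0 of rho*R - E0(rho), with E0(rho) = (1+rho)*log2(sum_x p(x)^(1/(1+rho))); it is zero at R = H and grows roughly quadratically just above entropy, matching the shape of the quadratic lower bound in the abstract. This sketch evaluates that known-distribution exponent on a grid; it is not the universal bound derived in the paper, and the grid parameters are assumptions.

```python
import math

def source_E0(p, rho):
    """Gallager's source-coding function
    E0(rho) = (1+rho) * log2(sum_x p(x)^(1/(1+rho)))."""
    s = sum(q ** (1.0 / (1.0 + rho)) for q in p)
    return (1.0 + rho) * math.log2(s)

def error_exponent(p, R, grid=2000, rho_max=10.0):
    """Source-coding error exponent E(R) = max_{rho>=0} rho*R - E0(rho)
    for a known distribution p, maximized over a grid of rho values."""
    best = 0.0
    for i in range(grid + 1):
        rho = rho_max * i / grid
        best = max(best, rho * R - source_E0(p, rho))
    return best
```

For a Bernoulli(0.1) source (entropy about 0.469 bits), the exponent is zero at rates at or below entropy and strictly positive above it, giving a concrete curve against which a universal quadratic lower bound can be compared.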