Results 1–10 of 171
The capacity of channels with feedback
 IEEE Trans. Information Theory, 2009
Abstract

Cited by 96 (4 self)
We introduce a general framework for treating channels with memory and feedback. First, we generalize Massey’s concept of directed information [23] and use it to characterize the feedback capacity of general channels. Second, we present coding results for Markov channels. This requires determining appropriate sufficient statistics at the encoder and decoder. Third, a dynamic programming framework for computing the capacity of Markov channels is presented. Fourth, it is shown that the average cost optimality equation (ACOE) can be viewed as an implicit single-letter characterization of the capacity. Fifth, scenarios …
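Massey’s directed information, which the framework above generalizes, is a standard quantity; for orientation (notation assumed, not quoted from the paper):

```latex
% Directed information from the input block X^n to the output block Y^n (Massey):
I(X^n \to Y^n) \;=\; \sum_{i=1}^{n} I(X^i;\, Y_i \mid Y^{i-1})
```

Unlike the mutual information I(X^n; Y^n), each summand conditions only on past and present inputs X^i, which is what makes directed information the right quantity when feedback lets past outputs influence future inputs.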
Information Spectrum Approach to Second-Order Coding Rate in Channel Coding
 IEEE Transactions on Information Theory, 2009
Abstract

Cited by 71 (9 self)
The second-order coding rate of channel coding is discussed for a general sequence of channels. The optimum second-order transmission rate with a constant error constraint ε is obtained by using the information spectrum method. We apply this result to the discrete memoryless case, the discrete memoryless case with a cost constraint, the additive Markovian case, and the Gaussian channel case with an energy constraint. We also clarify that the Gallager bound does not give the optimum evaluation in the second-order coding rate.
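For a discrete memoryless channel, the optimum second-order rate takes the well-known Gaussian-approximation form (stated here for orientation; C is the capacity, V the channel dispersion, Φ the standard normal CDF):

```latex
\log M^*(n, \varepsilon) \;=\; nC + \sqrt{nV}\,\Phi^{-1}(\varepsilon) + o(\sqrt{n}),
```

where M*(n, ε) is the maximum code size at block length n under error constraint ε; the √n term is the "second-order" correction to the capacity term nC.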
Fifty Years of Shannon Theory, 1998
Abstract

Cited by 50 (1 self)
A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication.
Coordination Capacity, 2009
Abstract

Cited by 48 (17 self)
We develop elements of a theory of cooperation and coordination in networks. Rather than considering a communication network as a means of distributing information, or of reconstructing random processes at remote nodes, we ask what dependence can be established among the nodes given the communication constraints. Specifically, in a network with communication rates {Ri,j} between the nodes, we ask what is the set of all achievable joint distributions p(x1,..., xm) of actions at the nodes on the network. Several networks are solved, including arbitrarily large cascade networks. Distributed cooperation can be the solution to many problems such as distributed games, distributed control, and establishing mutual information bounds on the influence of one part of a physical system on another.
The Empirical Distribution of Good Codes
Abstract

Cited by 48 (2 self)
Finding the input distribution that maximizes mutual information leads not only to the capacity of the channel, but also to engineering insights that tell the designer what good codes should be like. This is due to the folk theorem: the empirical distribution of any good code (i.e., one approaching capacity with vanishing probability of error) maximizes mutual information. This paper formalizes and proves this statement.
Communication requirements for generating correlated random variables
 In Proc. IEEE Int. Symp. Information Theory (ISIT), 2008
Abstract

Cited by 40 (10 self)
Two familiar notions of correlation are rediscovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner’s “common information” coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon’s mutual information. This work characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description.
An information-spectrum approach to classical and quantum hypothesis testing for simple hypotheses
 IEEE Trans. Inform. Theory, 2006
Abstract

Cited by 35 (17 self)
The information-spectrum analysis made by Han for classical hypothesis testing for simple hypotheses is extended to a unifying framework including both classical and quantum hypothesis testing. The results are also applied to fixed-length source coding when loosening the normalizing condition for probability distributions and for quantum states. We establish general formulas for several quantities relating to the asymptotic optimality of tests/codes in terms of classical and quantum information spectra.
A Hierarchy of Information Quantities for Finite Block Length Analysis of Quantum Tasks, 2013
Abstract

Cited by 26 (15 self)
We consider two fundamental tasks in quantum information theory: data compression with quantum side information, and randomness extraction against quantum side information. We characterize these tasks for general sources using so-called one-shot entropies. These characterizations, in contrast to earlier results, enable us to derive tight second-order asymptotics for these tasks in the i.i.d. limit. More generally, our derivation establishes a hierarchy of information quantities that can be used to investigate information-theoretic tasks in the quantum domain: the one-shot entropies most accurately describe an operational quantity, yet they tend to be difficult to calculate for large systems. We show that they asymptotically agree (up to logarithmic terms) with entropies related to the quantum and classical information spectrum, which are easier to calculate in the i.i.d. limit. Our technique also naturally yields bounds on operational quantities for finite block lengths.
On universal types
 In Proc. ISIT, 2004
Abstract

Cited by 25 (6 self)
We define the universal type class of a sequence x^n, in analogy to the notion used in the classical method of types. Two sequences of the same length are said to be of the same universal (LZ) type if and only if they yield the same set of phrases in the incremental parsing of Ziv and Lempel (1978). We show that the empirical probability distributions of any finite order of two sequences of the same universal type converge, in the variational sense, as the sequence length increases. Consequently, the normalized logarithms of the probabilities assigned by any kth-order probability assignment to two sequences of the same universal type, as well as the kth-order empirical entropies of the sequences, converge for all k. We study the size of a universal type class, and show that its asymptotic behavior parallels that of the conventional counterpart, with the LZ78 code length playing the role of the empirical entropy. We also estimate the number of universal types for sequences of length n, and show that it is of the form exp((1 + o(1)) γn / log n) for a well-characterized constant γ. We describe algorithms for enumerating the sequences in a universal type class, and for drawing a sequence from the class with uniform probability. As an application, we consider the problem of universal simulation of individual sequences. A sequence drawn with uniform probability from the universal type class of x^n is an optimal simulation of x^n in a well-defined mathematical sense.
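The incremental (LZ78) parsing that defines the universal type is simple to sketch in Python (a minimal illustration, not the authors' implementation; the function names are hypothetical):

```python
def lz78_phrases(seq):
    """Incremental (LZ78) parsing: split seq into the shortest
    phrases not seen before; the final phrase may repeat one."""
    seen = set()
    phrases = []
    cur = ""
    for ch in seq:
        cur += ch
        if cur not in seen:        # shortest new phrase found
            seen.add(cur)
            phrases.append(cur)
            cur = ""
    if cur:                        # trailing phrase may duplicate an earlier one
        phrases.append(cur)
    return phrases

def same_universal_type(x, y):
    """Equal-length sequences are of the same universal (LZ) type
    iff their parsings yield the same set of phrases."""
    return len(x) == len(y) and set(lz78_phrases(x)) == set(lz78_phrases(y))
```

For example, `"abab"` parses to phrases `a, b, ab` and `"aabb"` parses to `a, ab, b`; the phrase sets coincide, so the two sequences belong to the same universal type.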