Results 1 - 6 of 6
Fading Channels: Information-Theoretic and Communications Aspects
IEEE Transactions on Information Theory, 1998
"... In this paper we review the most peculiar and interesting informationtheoretic and communications features of fading channels. We first describe the statistical models of fading channels which are frequently used in the analysis and design of communication systems. Next, we focus on the information ..."
Abstract

Cited by 289 (1 self)
In this paper we review the most peculiar and interesting information-theoretic and communications features of fading channels. We first describe the statistical models of fading channels which are frequently used in the analysis and design of communication systems. Next, we focus on the information theory of fading channels, emphasizing capacity as the most important performance measure. Both single-user and multi-user transmission are examined. Further, we describe how the structure of fading channels impacts code design, and finally we give an overview of equalization of fading multipath channels.
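The capacity measure emphasized in this survey can be illustrated with a short Monte Carlo sketch for the simplest case: the ergodic capacity of a flat Rayleigh-fading channel with perfect receiver channel knowledge. This is an illustrative sketch, not code from the paper; the sample size and SNR are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def ergodic_capacity_rayleigh(snr_db, n_samples=200_000):
    """Monte Carlo estimate of the ergodic capacity (bits/channel use) of a
    flat Rayleigh-fading channel with perfect receiver CSI:
    C = E[log2(1 + SNR * |h|^2)], where |h|^2 is exponentially distributed."""
    snr = 10.0 ** (snr_db / 10.0)
    gain = rng.exponential(1.0, n_samples)   # |h|^2 for Rayleigh fading
    return float(np.log2(1.0 + snr * gain).mean())

def awgn_capacity(snr_db):
    """Capacity of the non-fading AWGN channel at the same average SNR."""
    return float(np.log2(1.0 + 10.0 ** (snr_db / 10.0)))

# By Jensen's inequality (log2(1+x) is concave), fading can only reduce
# capacity relative to an AWGN channel with the same average SNR.
```

The comparison with the AWGN capacity makes the cost of fading concrete: at 10 dB average SNR the ergodic capacity falls visibly short of the non-fading value.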
The Method of Types
1998
"... The method of types is one of the key technical tools in Shannon Theory, and this tool is valuable also in other fields. In this paper, some key applications will be presented in sufficient detail enabling an interested nonspecialist to gain a working knowledge of the method, and a wide selection of ..."
Abstract

Cited by 95 (0 self)
The method of types is one of the key technical tools in Shannon Theory, and this tool is valuable also in other fields. In this paper, some key applications will be presented in sufficient detail enabling an interested nonspecialist to gain a working knowledge of the method, and a wide selection of further applications will be surveyed. These range from hypothesis testing and large deviations theory through error exponents for discrete memoryless channels and capacity of arbitrarily varying channels to multi-user problems. While the method of types is suitable primarily for discrete memoryless models, its extensions to certain models with memory will also be discussed. Index Terms—Arbitrarily varying channels, choice of decoder, counting approach, error exponents, extended type concepts, hypothesis testing, large deviations, multi-user problems, universal coding.
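The counting estimate at the heart of the method is that the type class T(P) of sequences with empirical distribution P has size roughly 2^(nH(P)), up to a polynomial factor. The sketch below checks the standard bounds numerically; the example type over a ternary alphabet is an arbitrary illustrative choice.

```python
import math

def type_class_size(counts):
    """Exact size of the type class T(P): the multinomial coefficient
    n! / (n_1! ... n_k!) for a type with symbol counts (n_1, ..., n_k)."""
    n = sum(counts)
    size = math.factorial(n)
    for c in counts:
        size //= math.factorial(c)
    return size

def entropy_bits(counts):
    """Entropy H(P) in bits of the empirical distribution P = counts / n."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

# Check the standard bounds (n+1)^(-k) * 2^(n*H(P)) <= |T(P)| <= 2^(n*H(P))
# for an example type over a ternary alphabet (n = 10, k = 3).
counts = (6, 3, 1)
n, k = sum(counts), len(counts)
size, H = type_class_size(counts), entropy_bits(counts)
assert (n + 1) ** (-k) * 2 ** (n * H) <= size <= 2 ** (n * H)
```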
Mismatched decoding revisited: general alphabets, channels with memory, and the wideband limit
IEEE Trans. Inform. Theory, 2000
"... Abstract—The mismatch capacity of a channel is the highest rate at which reliable communication is possible over the channel with a given (possibly suboptimal) decoding rule. This quantity has been studied extensively for singleletter decoding rules over discrete memoryless channels (DMCs). Here we ..."
Abstract

Cited by 22 (0 self)
Abstract—The mismatch capacity of a channel is the highest rate at which reliable communication is possible over the channel with a given (possibly suboptimal) decoding rule. This quantity has been studied extensively for single-letter decoding rules over discrete memoryless channels (DMCs). Here we extend the study to memoryless channels with general alphabets and to channels with memory with possibly non-single-letter decoding rules. We also study the wideband limit, and, in particular, the mismatch capacity per unit cost, and the achievable rates on an additive-noise spread-spectrum system with single-letter decoding and binary signaling. Index Terms—Capacity per unit cost, channels with memory, general alphabets, mismatched decoding, nearest-neighbor decoding, spread spectrum.
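A standard achievable rate in the mismatched-decoding setting is the generalized mutual information (GMI), which the sketch below evaluates numerically for a binary symmetric channel decoded with a metric tuned to the wrong crossover probability. The channel, metric, and search grid are illustrative choices, not taken from the paper.

```python
import numpy as np

def gmi(px, W, q, s_grid=np.linspace(0.05, 5.0, 200)):
    """Generalized mutual information (bits/use): an achievable rate for
    i.i.d. random coding with decoding metric q over channel W,
    GMI = sup_{s>0} E[ log2( q(x,y)^s / sum_{x'} P(x') q(x',y)^s ) ]."""
    best = -np.inf
    for s in s_grid:
        val = 0.0
        for x in range(len(px)):
            for y in range(W.shape[1]):
                if W[x, y] > 0 and q[x, y] > 0:
                    denom = sum(px[xp] * q[xp, y] ** s for xp in range(len(px)))
                    val += px[x] * W[x, y] * np.log2(q[x, y] ** s / denom)
        best = max(best, val)
    return best

px = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1],      # BSC with crossover 0.1 (the true channel)
              [0.1, 0.9]])
q = np.array([[0.7, 0.3],      # metric tuned to the wrong crossover 0.3
              [0.3, 0.7]])
# With the matched metric q = W, the GMI recovers the mutual information;
# with any metric, the GMI can never exceed it.
```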
An Identity of Chernoff Bounds with an Interpretation in Statistical Physics and Applications in Information Theory
, 2007
"... An identity between two versions of the Chernoff bound on the probability a certain large deviations event, is established. This identity has an interpretation in statistical physics, namely, an isothermal equilibrium of a composite system that consists of multiple subsystems of particles. Several i ..."
Abstract

Cited by 9 (8 self)
An identity between two versions of the Chernoff bound on the probability of a certain large deviations event is established. This identity has an interpretation in statistical physics, namely, an isothermal equilibrium of a composite system that consists of multiple subsystems of particles. Several information-theoretic application examples, where the analysis of this large deviations probability naturally arises, are then described from the viewpoint of this statistical-mechanical interpretation. This results in several relationships between information theory and statistical physics which, we hope, the reader will find insightful.
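As a concrete instance of the kind of large-deviations probability involved, the optimized Chernoff bound on a binomial tail has exponent equal to the binary KL divergence. The Bernoulli example below is purely illustrative and not drawn from the paper.

```python
import math

def chernoff_bound(p, a, n):
    """Optimized Chernoff bound on P(S_n >= n*a) for S_n a sum of n i.i.d.
    Bernoulli(p) variables with a > p.  Optimizing the tilt s gives the
    binary KL divergence:  P(S_n >= n*a) <= exp(-n * D(a || p))."""
    D = a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))
    return math.exp(-n * D)

def exact_tail(p, a, n):
    """Exact binomial tail probability P(S_n >= n*a), for comparison."""
    k0 = math.ceil(n * a)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k0, n + 1))

# The bound dominates the exact tail and is tight on the exponential scale.
p, a, n = 0.3, 0.5, 60
assert exact_tail(p, a, n) <= chernoff_bound(p, a, n) < 1.0
```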
Universal A Posteriori Metrics Game
, 2009
"... Over binary input channels, uniform distribution is a universal prior, in the sense that it allows to maximize the worst case mutual information over all binary input channels, ensuring at least 94.2 % of the capacity. In this paper, we address a similar question, but with respect to a universal ge ..."
Abstract
Over binary-input channels, the uniform distribution is a universal prior, in the sense that it maximizes the worst-case mutual information over all binary-input channels, ensuring at least 94.2% of the capacity. In this paper, we address a similar question, but with respect to a universal generalized linear decoder. We look for the best collection of finitely many a posteriori metrics that maximizes the worst-case mismatched mutual information achieved by decoding with these metrics (instead of an optimal decoder, such as maximum-likelihood (ML) decoding tuned to the true channel). It is shown that for binary input and output channels, two metrics suffice to achieve the same performance as an optimal decoder. In particular, this implies that there exists a decoder which is generalized linear and achieves at least 94.2% of the compound capacity on any compound set, without knowledge of the underlying set.
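The 94.2% guarantee can be probed numerically: compute the mutual information of a binary-input channel under the uniform prior and compare it with the capacity, found here by a simple grid search over input distributions. The Z-channel below is an arbitrary illustrative choice, not an example from the paper.

```python
import numpy as np

def mutual_information(px, W):
    """I(X;Y) in bits for input distribution px and channel matrix W
    (rows indexed by inputs, columns by outputs)."""
    py = px @ W
    mi = 0.0
    for x in range(len(px)):
        for y in range(W.shape[1]):
            if px[x] > 0 and W[x, y] > 0:
                mi += px[x] * W[x, y] * np.log2(W[x, y] / py[y])
    return mi

# Z-channel: input 0 is received noiselessly, input 1 flips with prob. 0.5.
W = np.array([[1.0, 0.0],
              [0.5, 0.5]])
uniform = np.array([0.5, 0.5])
capacity = max(mutual_information(np.array([1.0 - p, p]), W)
               for p in np.linspace(0.01, 0.99, 981))
# For this channel the uniform prior lands well above the 94.2% floor.
ratio = mutual_information(uniform, W) / capacity
```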