## Reliable Communication Under Channel Uncertainty (1998)

Venue: IEEE Trans. Inform. Theory

Citations: 123 (3 self)

### BibTeX

```bibtex
@ARTICLE{Lapidoth98reliablecommunication,
  author  = {Amos Lapidoth and Prakash Narayan},
  title   = {Reliable Communication Under Channel Uncertainty},
  journal = {IEEE Trans. Inform. Theory},
  year    = {1998},
  volume  = {44},
  number  = {6},
  pages   = {2148--2177}
}
```

### Abstract

In many communication situations, the transmitter and the receiver must be designed without a complete knowledge of the probability law governing the channel over which transmission takes place. Various models for such channels and their corresponding capacities are surveyed. Special emphasis is placed on the encoders and decoders which enable reliable communication over these channels.

### Citations

9216 | Elements of Information Theory - Cover, Thomas - 1991
Citation Context: ...on error exponents for the discrete memoryless channel can be found in [32], [44], [64], and in references therein. Example 1 (Continued): The capacity of a BSC with crossover probability is given by [39], [44], [64], where is the binary entropy function. In [114], Shannon considered a different model in which the channel law at time depends on a state rv, with values in a finite set, evolving in a m...
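The BSC capacity formula quoted in this excerpt lost its inline symbols in extraction; it is C = 1 − h_b(p), with h_b the binary entropy function. A minimal illustrative sketch (not code from the survey):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h_b(p) in bits; h_b(0) = h_b(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# A noiseless BSC has capacity 1 bit/use; a fully random one has capacity 0.
print(bsc_capacity(0.0))  # 1.0
print(bsc_capacity(0.5))  # 0.0
```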

735 | Writing on dirty paper - Costa - 1983
Citation Context: ...e receiver has no additional information. This technique also applies to situations where the receiver may have noisy observations of the channel states. A variation of this problem was considered in [37], [67], [78], and in references therein, where state information is available to the transmitter in a noncausal way in that the entire realization of the i.i.d. state sequence is known when transmissi...

383 | Comments on broadcast channels - Cover - 1998
Citation Context: ...decoder [64, pp. 176–177] as well as a merged decoder, obtained by merging the maximum-likelihood decoders of each of the channels in the family [60], can be used to demonstrate achievability. Cover [38] has shown interesting connections between communication over a compound channel and over a broadcast channel. An application of these ideas to communication over slowly varying flat-fading channels u...

318 | Principles and Practice of Information Theory - Blahut - 1987

162 | Common randomness in information theory and cryptography—Part I: Secret sharing - Ahlswede, Csiszár - 1993
Citation Context: ...age probability of error equals its randomized code capacity given by Theorem 2. For more on this result due to Ahlswede and Csiszár, as also implications of “common randomness” for AVC capacity, see [18]. Ahlswede and Cai [17] have examined another situation in which the transmitter and receiver observe the components and , respectively, of a memoryless correlated source (i.e., an i.i.d. process with...

123 | Estimates of Error Rates for Codes on Burst-noise Channels - Elliott - 1963
Citation Context: ...t the model (7) corresponds to a known channel, and the set of states should not be confused with the state space, introduced in (5), in the definition of an AVC. Example 4: The Gilbert–Elliott channel [57], [68], [69], [101] is a finite-state channel with two states, the state corresponding to the “good” state and state corresponding to the “bad” state (see Fig. 1). The channel has input and output al...
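The Gilbert–Elliott channel described in this excerpt is straightforward to simulate: a two-state Markov chain selects a “good” or “bad” crossover probability for each channel use. The sketch below is illustrative only; the parameter names are ours, not the survey's notation:

```python
import random

def gilbert_elliott(bits, p_gb, p_bg, eps_good, eps_bad, rng=None):
    """Pass `bits` (0/1 ints) through a Gilbert-Elliott channel.

    p_gb, p_bg: Markov transition probabilities good->bad and bad->good.
    eps_good, eps_bad: crossover probability in each state.
    """
    rng = rng or random.Random()
    state = "good"
    out = []
    for b in bits:
        eps = eps_good if state == "good" else eps_bad
        out.append(b ^ (rng.random() < eps))  # flip the bit with prob. eps
        # Evolve the Markov state after each channel use.
        if state == "good":
            state = "bad" if rng.random() < p_gb else "good"
        else:
            state = "good" if rng.random() < p_bg else "bad"
    return out
```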

98 | List decoding for noisy channels - Elias - 1957
Citation Context: ...rarely available) and can substantially increase capacity. The inefficacy of feedback in increasing capacity was demonstrated by Shannon in [112]. For some of the results on list decoding, see [44], [55], [56], [62], [115], [120], and references therein. 1) The Compound Discrete Memoryless Channel: We now turn to the compound discrete memoryless channel, which models communication over a memoryless c...

86 | Multiterminal source coding, in The Information Theory Approach to... - Berger - 1978
Citation Context: ...typicality” decoders. These decoders are usually classified as “weak typicality” decoders [39] (which are sometimes referred to as “entropy typicality” decoders [44]), and “joint typicality” decoders [24], [44], [126] (which are sometimes referred to as “strong” typicality decoders). We describe below the joint-type typicality decoder as well as a more stringent version which relies on a notion of typ...
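For orientation, the “weak (entropy) typicality” notion named in this excerpt can be stated as follows (standard textbook definition, not quoted from the survey): a pair of sequences (x^n, y^n) is jointly ε-typical with respect to a pmf p(x, y) if

```latex
\left|-\tfrac{1}{n}\log p(x^{n}) - H(X)\right| < \epsilon, \qquad
\left|-\tfrac{1}{n}\log p(y^{n}) - H(Y)\right| < \epsilon, \qquad
\left|-\tfrac{1}{n}\log p(x^{n}, y^{n}) - H(X, Y)\right| < \epsilon.
```

Strong (joint-type) typicality instead requires the empirical joint type of (x^n, y^n) to be uniformly close to p(x, y).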

51 | Fading channels: Information theoretic and communications aspects - Biglieri, Proakis, Shamai (Shitz) - 1998
Citation Context: ...rch are indicated. We limit our discussion to single-user channels, in which case the receiver for a given user treats all other users’ signals (when present) as noise. (For some multiuser models see [26], [110], and references therein.) We do not, therefore, investigate the benefits of using the multiple-access transmitters and receivers suggested by the work mentioned in Section VI. We remark that t...

30 | Universal decoding for channels with memory - Feder, Lapidoth - 1998
Citation Context: ...ss of code rate alluded to earlier. Channels characterized by slow frequency-selective fading can be described by a compound finite-state channel model (cf. Section III-A1)). The universal decoder in [60] achieves channel capacity and the random coding-error exponent. The high complexity of this decoder, however, renders it impractical if complexity is an overriding concern. In this situation, a train...

18 | Arbitrarily varying channels with constrained inputs and states - Csiszár, Narayan - 1988
Citation Context: ...knowledge of each other with which the codeword and state sequences are selected. For a summary of the work on AVC’s through the late 1980’s, and for basic results, we refer the reader to [6], [44], [47]–[49], and [126]. Before we turn to a presentation of key AVC results, it is useful to revisit the probability of error criteria in (18) and (19). Observe that in the definition of an -achievable rate...

14 | Proof of Shannon’s transmission theorem for finite-state indecomposable channels - Blackwell, Breiman, et al. - 1958
Citation Context: ...nt of AVC theory is provided in [49, Sec. VI]. See also [44, pp. 219–222 and 226–233]. B. Finite-State Channels: The capacity of a finite-state channel (7) has been studied under various conditions in [29], [64], [113], and [126]. Of particular importance is [64], where error exponents for a general finite-state channel are also computed. Before stating the capacity theorem for this channel, we introdu...

14 | Maximal error capacity regions are smaller than average error capacity regions for multi-user channels - Dueck - 1978
Citation Context: ...discussed in [70] and [105]. It is interesting to note that even for a known MAC, the average probability of error and the maximal probability of error criteria can lead to different capacity regions [54]; this is in contrast with the capacity of a known single-user channel. The compound channel capacity region for a finite family of discrete memoryless MAC’s has been computed by Han...

12 | Some information theoretic saddlepoints - Borden, Mason, et al. - 1985
Citation Context: ...ion by noting the role of compound DMC’s and AVC’s in the study of communication situations partially controlled by an adversarial jammer. For dealing with such situations, several authors (cf. e.g., [36], [79], and [97]) have proposed a game-theoretic approach which involves a two-person zero-sum game between the “communicator” and the “jammer” with mutual information as the payoff function. An analy...

12 | Optimum information transmission through a channel with unknown parameters - Dobrushin - 1959
Citation Context: ...n II, (3) and (4)). The parameter space (cf. (3)) now corresponds to the set of distribution functions of real-valued rv’s with . The capacity of this Gaussian compound channel follows from Dobrushin [52], and is given by the formula in (115). Thus ignorance of the true distribution of the i.i.d. interference, other than knowing that it satisfies (118), does not reduce achievable rates any more than i...

7 | Information Theory: Coding Theorems for Discrete Memoryless Systems - Csiszár, Körner - 1981
Citation Context: ...ng error exponent of a channel. Our survey does not address these important notions, for which we direct the reader to [43], [44], [46], [64], [65], [95], [115], [116], and references therein. In the situations considered above, quite often the selection of codes is restricted in that the transmitted codewords must satisfy appr...

6 | Certain Results in Coding Theory for Compound Channels I - Ahlswede - 1967
Citation Context: ...s [30] establishes the converse. A strong converse for the maximum probability of error criterion can be found in [44] and [126]. For the average probability of error, a strong converse need not hold [1], [44]. Proving the direct part requires showing that for any input pmf , any rate , and any , there exists a sequence of encoders parametrized by the blocklength that can be reliably decoded on any c...

6 | Two proofs of Pinsker’s conjecture concerning arbitrarily varying channels - Ahlswede, Cai - 1991
Citation Context: ...which satisfy ), and have shown that equals the randomized code capacity given by Theorem 2. The performance of an AVC (5) using deterministic list codes (cf. (32) and (33)) is examined in [5], [12], [14], [33]–[35], [82], and [83]. The value of this capacity for the maximum probability of error and vanishingly small list rate was determined by Ahlswede [5]. Lower bounds on the sizes of constant lists...

6 | Coding theorem for discrete memoryless channels with given decision rules - Balakirsky - 1991
Citation Context: ...oblem” consists of finding the set of achievable rates for this situation, i.e., the supremum of all rates that can be achieved over the DMC with the decoder . This problem was studied extensively in [21], [22], [43], [51], [84], [87], and [100]. A lower bound on , which can be derived using a random-coding argument, is given by the following...

6 | A new look at the error exponent of discrete memoryless channels, presented at the... - Csiszár, Körner, et al. - 1977

6 | Capacity of the Gaussian arbitrarily varying channel - 1991
Citation Context: ...ichotomy: it either equals the randomized code capacity or else is zero, according to whether or not the transmitter power exceeds the power of the (arbitrary) interference. This result is proved in [50] as Theorem 15: The deterministic code capacity of the Gaussian AVC (111) under input constraint and state constraint , for the average probability of error, is given by Furthermore, if , a strong con...
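The statement of Theorem 15 in this excerpt lost its inline mathematics. In the usual notation (our labels, not necessarily the survey's), with input power constraint Γ, state (interference) power constraint Λ, and ambient Gaussian noise variance σ², the dichotomy reads:

```latex
C \;=\;
\begin{cases}
\dfrac{1}{2}\log\!\left(1 + \dfrac{\Gamma}{\Lambda + \sigma^{2}}\right), & \Gamma > \Lambda,\\[1.5ex]
0, & \Gamma \le \Lambda,
\end{cases}
```

with a strong converse holding when Γ > Λ.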

5 | The Maximal Error Capacity of Arbitrarily Varying Channels for Constant List Sizes - 1993
Citation Context: ...rv’s which satisfy ), and have shown that equals the randomized code capacity given by Theorem 2. The performance of an AVC (5) using deterministic list codes (cf. (32) and (33)) is examined in [5], [12], [14], [33]–[35], [82], and [83]. The value of this capacity for the maximum probability of error and vanishingly small list rate was determined by Ahlswede [5]. Lower bounds on the sizes of constant...

5 | Localized random and arbitrary errors in the light of AV channel theory - Ahlswede, Bassalygo, et al. - 1995
Citation Context: ...ails a combination of the aforementioned “elimination technique” with the “robustification technique” developed in [8] and [9]. The situation considered above in [11] is to be contrasted with that in [13], [67], and [78], where the channel states which are known to the transmitter alone at the commencement of transmission, cons...

5 | Coding theorems for classes of arbitrarily varying discrete memoryless channels - Dobrushin, Stambler - 1975
Citation Context: ...rect part in [48] uses a code with the codewords chosen at random from sequences of a fixed type, and selectively identified by a generalized Chernoff bounding technique due to Dobrushin and Stambler [53]. The linchpin is a subtle decoding rule which decides on the basis of a joint typicality test together with a threshold test using empirical mutual information quantities, similarly as in [45]. A key...

4 | Correlated decoding for channels with arbitrarily varying channel probability functions - Ahlswede, Wolfowitz - 1969
Citation Context: ...let denote the set of all pmfs on . The capacity of the AVC (5) for randomized codes is, of course, the same for the maximum and average probabilities of error, and is given by the following theorem [19], [31], [119]. Theorem 2: The randomized code capacity of the AVC (5) is given by (46). Further, a strong converse holds so that (47). The direct part of Theorem 2 can be proved [19] using a random-codi...
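The capacity expression elided after “is given by” in this excerpt is the classical min-max formula for the randomized code capacity of an AVC; in standard notation (our reconstruction, not a verbatim quote of (46)):

```latex
C_r \;=\; \max_{P \in \mathcal{P}(\mathcal{X})} \; \min_{q \in \mathcal{P}(\mathcal{S})} I\!\left(P, W_q\right),
\qquad \text{where } W_q(y \mid x) \;=\; \sum_{s \in \mathcal{S}} q(s)\, W(y \mid x, s).
```

Here the minimum runs over all pmfs q on the state set, i.e., over the convex hull of the state-conditional channels.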

4 | Capacity of the arbitrarily varying channel under list decoding - Blinovsky, Narayan, et al. - 1995
Citation Context: ...satisfy ), and have shown that equals the randomized code capacity given by Theorem 2. The performance of an AVC (5) using deterministic list codes (cf. (32) and (33)) is examined in [5], [12], [14], [33]–[35], [82], and [83]. The value of this capacity for the maximum probability of error and vanishingly small list rate was determined by Ahlswede [5]. Lower bounds on the sizes of constant lists for a...

4 | Graph decomposition: A new key to coding theorems - 1981
Citation Context: ...m-coding error exponent of a channel. Our survey does not address these important notions, for which we direct the reader to [43], [44], [46], [64], [65], [95], [115], [116], and references therein. In the situations considered above, quite often the selection of codes is restricted in that the transmitted codewords must satisf...

3 | Estimation of the size of the list when decoding over an arbitrarily varying channel - Blinovsky, Pinsker - 1993
Citation Context: ...14]. The fact that the deterministic list code capacity of an AVC (5) for the average probability of error displays a dichotomy similar to that described by (57) was observed by Blinovsky and Pinsker [34], who also determined a threshold for the list size above which said capacity equals the randomized code capacity given by Theorem 2. A complete characterization of the deterministic list code capacity...

3 | The method of types (this issue) - Csiszár
Citation Context: ...ious codeword which, for some , also appears to be jointly typical with in the sense of (104), then can be expected to be only vanishingly dependent on given and , in the sense of (105). As stated in [40], the form of this decoder is, in fact, suggested by the procedure for bounding the maximum probability of error using the “method of types.” An important element of the proof of Theorem 4 in [45] con...

2 | Channel capacities for list codes - 1973
Citation Context: ...neric rv’s which satisfy ), and have shown that equals the randomized code capacity given by Theorem 2. The performance of an AVC (5) using deterministic list codes (cf. (32) and (33)) is examined in [5], [12], [14], [33]–[35], [82], and [83]. The value of this capacity for the maximum probability of error and vanishingly small list rate was determined by Ahlswede [5]. Lower bounds on the sizes of co...

2 | A method of coding and an application to arbitrarily varying channels - 1980
Citation Context: ...bability of receiving , when is transmitted and is the channel state sequence, is given by (5). The standard AVC model introduced in [31], and subsequently studied by several authors (e.g., [2], [6], [10], [20], [45]), assumes that the transmitter and receiver are unaware of the actual state sequence which governs a transmission. In the same vein, the “selector” of the state sequence is ignorant of...

2 | Minimum description length principle in modeling and coding (this issue) - Barron, Rissanen, et al. - 1998
Citation Context: ...of the source. The body of literature on this subject is vast, and we refer the reader to [23], [25], [61], [71], and [128] in this issue. In selecting a model for a communication situation, several factors must be considered. These include the physical and statistical nature of the channel di...

2 | The effect of statistically dependent interference upon channel capacity - Blachman - 1962
Citation Context: ...onverse; see [80]. The results of Theorem 14 can be extended to a “vector” Gaussian AVC [81] (see also [41]). Earlier work on the randomized code capacity of the Gaussian AVC (111) is due to Blachman [27], [28], who provided lower and upper bounds on capacity when the state sequence is allowed to depend on the actual codeword transmitted. Also, the randomized code capacity problem for the Gaussian AVC...

2 | Arbitrarily varying channels with general alphabets and states - 1992
Citation Context: ...well as the deterministic code capacities of the AVC (5) with input constraint and state constraint have been extended by Csiszár to AVC’s with general input and output alphabets and state space; see [41]. It remains to characterize AVC performance using codes with stochastic encoders. For the AVC (5) without input or state constraints, the following result is due to Ahlswede [6]. Theorem 7: For codes...

2 | On the capacity of the arbitrarily varying channel for maximum probability of error - 1981
Citation Context: ...receiving , when is transmitted and is the channel state sequence, is given by (5). The standard AVC model introduced in [31], and subsequently studied by several authors (e.g., [2], [6], [10], [20], [45]), assumes that the transmitter and receiver are unaware of the actual state sequence which governs a transmission. In the same vein, the “selector” of the state sequence is ignorant of the actual m...

2 | Channel capacity for a given decoding metric - 1995
Citation Context: ...(but polynomial in the blocklength) set of DMC’s which is in a sense dense in the class of all DMC’s. The existence of a code is demonstrated using a random-coding argument. It is interesting to note [51], [119], that if the set of stochastic matrices is compact and convex, then the decoder can be chosen as the maximum-likelihood decoder for the DMC with stochastic matrix , where is a saddle point for...

2 | Zero error capacity under list decoding - 1988
Citation Context: ...y available) and can substantially increase capacity. The inefficacy of feedback in increasing capacity was demonstrated by Shannon in [112]. For some of the results on list decoding, see [44], [55], [56], [62], [115], [120], and references therein. 1) The Compound Discrete Memoryless Channel: We now turn to the compound discrete memoryless channel, which models communication over a memoryless channel...

2 | A min-max theorem for antijamming group codes - Ericson - 1984
Citation Context: ...s of practical engineering devices; in fact, commonly used spread-spectrum techniques such as direct sequence and frequency hopping can be interpreted as practical implementations of randomized codes [58], employing synchronized random number generators at the transmitter and receiver. From a practical standpoint, however, a (length- block) randomized code of rate bits per channel use, such as that just...

1 | A note on the existence of the weak capacity for channels with arbitrarily varying channel probability functions and its relation to Shannon’s zero error capacity - 1970
Citation Context: ...d. The probability of receiving , when is transmitted and is the channel state sequence, is given by (5). The standard AVC model introduced in [31], and subsequently studied by several authors (e.g., [2], [6], [10], [20], [45]), assumes that the transmitter and receiver are unaware of the actual state sequence which governs a transmission. In the same vein, the “selector” of the state sequence is i...

1 | Multi-way communication channels - 1971
Citation Context: ...and a stochastic matrix . The rates and for the two users are defined analogously as in (12). The capacity region of the MAC for the average probability of error was derived independently by Ahlswede [4] and Liao [94]. A rate-pair is achievable for the average probability of error iff (121)–(123) hold for some joint pmf of the form where the “time-sharing” random variable with values in the s...
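Conditions (121)–(123) referenced in this excerpt are the standard MAC capacity-region constraints; reconstructed in common notation (ours, not verbatim from the paper):

```latex
R_1 \le I(X_1; Y \mid X_2, U), \qquad
R_2 \le I(X_2; Y \mid X_1, U), \qquad
R_1 + R_2 \le I(X_1, X_2; Y \mid U),
```

for some joint pmf of the form p(u) p(x_1 | u) p(x_2 | u) W(y | x_1, x_2), where U is the “time-sharing” random variable.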

1 | Arbitrarily varying channels with states sequence known to the sender - 1986

1 | Arbitrarily varying multiple-access channels, Part I: Ericson’s symmetrizability is adequate, Gubner’s conjecture is true - 1997
Citation Context: ...nditions for , we refer the reader to [49, Appendix I]. Yet another means of determining the deterministic code capacity of the AVC (5) is derived as a special case of recent work by Ahlswede and Cai [15], which completely resolves the deterministic code capacity problem for a multiple-access AVC for the average probability of error. For the AVC (5), the approach in [15], in effect, consists of element...

1 | Arbitrarily varying multiple-access channels, Part II: Correlated sender’s side information, correlated messages, and ambiguous transmission - 1997
Citation Context: ...d by Ahlswede and Cai [15], thereby completely resolving the problem of characterizing . (It was shown in [72] that under a set of conditions which are sufficient but not necessary.) Ahlswede and Cai [16] have further demonstrated that if the multiple-access AVC is only nonsymmetrizable (but can be symmetrizable- or symmetrizable- ), both users can still reliably transmit information over the channel u...

1 | Correlated sources help transmission over an arbitrarily varying channel - 1997
Citation Context: ...r equals its randomized code capacity given by Theorem 2. For more on this result due to Ahlswede and Csiszár, as also implications of “common randomness” for AVC capacity, see [18]. Ahlswede and Cai [17] have examined another situation in which the transmitter and receiver observe the components and , respectively, of a memoryless correlated source (i.e., an i.i.d. process with generic rv’s which sat...

1 | A converse coding theorem for mismatched decoding at the output of binary-input memoryless channels - 1995
Citation Context: ...consists of finding the set of achievable rates for this situation, i.e., the supremum of all rates that can be achieved over the DMC with the decoder . This problem was studied extensively in [21], [22], [43], [51], [84], [87], and [100]. A lower bound on , which can be derived using a random-coding argument, is given by the following...

1 | Lossy data compression (this issue) - Berger, Gibson
Citation Context: ...of the source. The body of literature on this subject is vast, and we refer the reader to [23], [25], [61], [71], and [128] in this issue. In selecting a model for a communication situation, several factors must be considered. These include the physical and statistical nature of the channel disturba...

1 | On the capacity of a band-limited channel perturbed by statistically dependent interference - 1962
Citation Context: ...e; see [80]. The results of Theorem 14 can be extended to a “vector” Gaussian AVC [81] (see also [41]). Earlier work on the randomized code capacity of the Gaussian AVC (111) is due to Blachman [27], [28], who provided lower and upper bounds on capacity when the state sequence is allowed to depend on the actual codeword transmitted. Also, the randomized code capacity problem for the Gaussian AVC has pr...

1 | Many coding theorems follow from an elementary combinatorial lemma - Csiszár, Körner - 1980
Citation Context: ...r the maximum probability of error, the decoder in [45] defined by (104) and (105) involves the joint types of triples . This decoder, thus, belongs to a more general class of decoders, introduced in [42] under the name of -decoders, which are based on pairwise comparisons of codewords relying on joint types of triples . We turn next to decoders used for achieving the deterministic code capacity of th...

1 | Capacity and decoding rules for classes of arbitrarily varying channels - 1989
Citation Context: ...e joint typicality decoder, the “independence” decoder, the MMI decoder (cf. Section IV-B2)), or the minimum-distance decoder. This issue is briefly addressed below; for a comprehensive treatment, see [49]. Given a set of codewords in as above, the joint typicality decoder in [49] is defined as follows: iff for some (108), where is defined by (45), and is chosen suitably small. If more than one satisfie...

1 | Exponential error bounds for random codes in the arbitrarily varying channel - 1985
Citation Context: ...ion of the prefixes. A necessary and sufficient computable characterization of AVC’s for deciding between the alternatives in (57) was not provided in [6]. This lacuna was partially filled by Ericson [59], who gave a necessary condition for the deterministic code capacity to be positive. By enlarging on an idea in [31], it was shown [59] that if the AVC state “selector” could emulate the channel input...