Results 1 - 10 of 12
Capacity of Fading Channels with Channel Side Information
, 1997
Abstract
Cited by 397 (23 self)
We obtain the Shannon capacity of a fading channel with channel side information at the transmitter and receiver, and at the receiver alone. The optimal power adaptation in the former case is "water-pouring" in time, analogous to water-pouring in frequency for time-invariant frequency-selective fading channels. Inverting the channel results in a large capacity penalty in severe fading.
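The "water-pouring in time" adaptation described in this abstract can be sketched numerically. The fading gains, state probabilities, noise level, and power budget below are illustrative assumptions, not values from the paper; the water level is found by simple bisection:

```python
import numpy as np

def waterfill(gains, probs, p_avg, noise=1.0):
    """Water-pouring power allocation over fading states (a sketch).

    Allocates P_i = max(0, level - noise/g_i), with the water level
    found by bisection so that sum_i probs_i * P_i = p_avg.
    """
    lo, hi = 1e-9, 1e9  # bracket for the water level
    for _ in range(200):
        level = 0.5 * (lo + hi)
        power = np.maximum(0.0, level - noise / gains)
        if np.dot(probs, power) > p_avg:
            hi = level
        else:
            lo = level
    power = np.maximum(0.0, 0.5 * (lo + hi) - noise / gains)
    # ergodic capacity (bits per channel use) under this adaptation
    cap = np.dot(probs, np.log2(1.0 + gains * power / noise))
    return power, cap

# hypothetical three-state fading channel
gains = np.array([0.1, 1.0, 4.0])
probs = np.array([0.2, 0.5, 0.3])
power, cap = waterfill(gains, probs, p_avg=1.0)
```

Note that the allocation spends no power on the weakest state once its inverse gain rises above the water level, which is the qualitative behavior the abstract describes.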
Simulation-based computation of information rates for channels with memory
 IEEE TRANS. INFORM. THEORY
, 2006
Abstract
Cited by 54 (11 self)
The information rate of finite-state source/channel models can be accurately estimated by sampling both a long channel input sequence and the corresponding channel output sequence, followed by a forward sum–product recursion on the joint source/channel trellis. This method is extended to compute upper and lower bounds on the information rate of very general channels with memory by means of finite-state approximations. Further upper and lower bounds can be computed by reduced-state methods.
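The sampling-plus-forward-recursion method summarized above can be illustrated on a toy two-state hidden-Markov noise channel. With i.i.d. uniform binary inputs and additive (mod-2) hidden-Markov noise Z, the information rate is 1 - H(Z), and H(Z) can be estimated by a normalized forward sum-product recursion over one long sampled noise sequence. All parameter values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical two-state hidden-Markov noise: state 0 "good", state 1 "bad"
P = np.array([[0.99, 0.01],
              [0.10, 0.90]])          # state-transition matrix
p_err = np.array([0.01, 0.30])        # bit-flip probability per state

n = 50_000
states = np.empty(n, dtype=int)
states[0] = 0
for t in range(1, n):
    states[t] = rng.choice(2, p=P[states[t - 1]])
z = (rng.random(n) < p_err[states]).astype(int)   # sampled noise sequence

# forward sum-product recursion: accumulate -log2 p(z) through the
# normalization constants of the forward messages over the hidden state
alpha = np.array([0.5, 0.5])
neg_log_p = 0.0
for t in range(n):
    like = np.where(z[t] == 1, p_err, 1.0 - p_err)
    alpha = (alpha @ P) * like if t > 0 else alpha * like
    s = alpha.sum()
    neg_log_p -= np.log2(s)
    alpha /= s

h_z = neg_log_p / n        # entropy-rate estimate of the noise process
info_rate = 1.0 - h_z      # i.i.d. uniform inputs: I = 1 - H(Z)
```

This is the single-sequence special case of the method; the paper's upper and lower bounds for general channels via finite-state approximations are beyond this sketch.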
The capacity of channels with feedback
 IEEE Trans. Information Theory
, 2009
Abstract
Cited by 38 (2 self)
We introduce a general framework for treating channels with memory and feedback. First, we generalize Massey’s concept of directed information [23] and use it to characterize the feedback capacity of general channels. Second, we present coding results for Markov channels. This requires determining appropriate sufficient statistics at the encoder and decoder. Third, a dynamic programming framework for computing the capacity of Markov channels is presented. Fourth, it is shown that the average cost optimality equation (ACOE) can be viewed as an implicit single-letter characterization of the capacity. Fifth, scenarios ...
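Massey's directed information, referenced above, can be computed exactly for tiny examples. The sketch below evaluates I(X^2 -> Y^2) = I(X1; Y1) + I(X1, X2; Y2 | Y1) from a length-2 joint pmf; for the memoryless channel without feedback used in the usage example it reduces to ordinary mutual information. This is a toy illustration of the definition, not the paper's algorithm:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf given as any-shaped array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def directed_info(pxy):
    """Directed information I(X^2 -> Y^2) for a joint pmf
    pxy[x1, x2, y1, y2]:  I(X1; Y1) + I(X1, X2; Y2 | Y1)."""
    p_x1y1 = pxy.sum(axis=(1, 3))
    i1 = (entropy(p_x1y1.sum(axis=0)) + entropy(p_x1y1.sum(axis=1))
          - entropy(p_x1y1))
    # I(X1,X2; Y2 | Y1) = H(Y2|Y1) - H(Y2|X1,X2,Y1)
    p_y1y2 = pxy.sum(axis=(0, 1))
    h_y2_given_y1 = entropy(p_y1y2) - entropy(p_y1y2.sum(axis=1))
    h_y2_given_all = entropy(pxy) - entropy(pxy.sum(axis=3))
    return i1 + h_y2_given_y1 - h_y2_given_all

# usage: two uses of a BSC(eps) with i.i.d. uniform inputs, no feedback
eps = 0.1
pxy = np.zeros((2, 2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        for y1 in range(2):
            for y2 in range(2):
                pxy[x1, x2, y1, y2] = 0.25 \
                    * (eps if y1 != x1 else 1 - eps) \
                    * (eps if y2 != x2 else 1 - eps)
di = directed_info(pxy)   # equals I(X^2; Y^2) = 2*(1 - H2(eps)) here
```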
Turbo Codes For Binary Markov Channels
 In ICC '98
, 1998
Abstract
Cited by 9 (2 self)
We describe parallel concatenated codes for communication over binary-input, binary-output hidden Markov channels. We present encoder design techniques and decoder processing modifications that utilize the a priori statistics of the channel and show that the resulting codes allow reliable communication at rates which are above the capacity of a memoryless channel with the same stationary bit error probability as the Markov channel. These codes outperform systems based on the traditional approach of using a channel interleaver to create a channel which is assumed to be memoryless. In addition, we introduce a joint estimation/decoding method that allows the estimation of the parameters of the Markov model when they are not known a priori.

I. Introduction

Many practical communications channels can be modeled using discrete Markov channels. Such channels are characterized by a set of states S_j, 0 ≤ j ≤ S − 1, the matrix of transition probabilities among states, and the list giving th...
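The memoryless baseline this abstract compares against can be made concrete: for a hypothetical two-state Markov channel (Gilbert-Elliott style, parameters invented here), compute the stationary state distribution, the stationary bit error probability, and the capacity 1 - H2(p) of the BSC with that same error probability:

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# hypothetical two-state Markov channel: transition matrix and
# per-state crossover (bit error) probabilities
P = np.array([[0.95, 0.05],
              [0.20, 0.80]])
p_err = np.array([0.02, 0.25])

# stationary distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

ber = pi @ p_err              # stationary bit error probability
c_memoryless = 1.0 - h2(ber)  # capacity of the BSC with the same BER
```

The paper's claim is that codes exploiting the channel memory achieve rates above `c_memoryless`, which an interleaved, memoryless-assumption design cannot.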
Exploiting Binary Markov Channels With Unknown Parameters In Turbo Decoding
 in Proc. IEEE Glob. Telecommun. Conf
, 1998
Abstract
Cited by 7 (2 self)
We describe parallel concatenated codes for communication over binary-input, binary-output hidden Markov channels when the parameters of the Markov channel are unknown a priori. Specifically, we develop a joint estimation/decoding method that allows the estimation of the parameters of the model without the need for training sequences. This method involves little or no sacrifice in performance relative to the case where the Markov channel parameters are provided to the receiver as a priori information. Furthermore, we show communication at rates which are above the capacity of a memoryless channel with the same stationary bit error probability as the Markov channel, thereby outperforming systems based on the traditional approach of using a channel interleaver to create a channel which is assumed to be memoryless.

I. Introduction

Many practical communications channels can be modeled using discrete Markov channels. Such channels are characterized by a set of states S_j, 0 ≤ j ≤ S − 1 ...
Capacity of Time-Varying Channels with Causal Channel Side Information
 IEEE Trans. Information Theory
, 1999
Abstract
Cited by 6 (1 self)
The capacity of time-varying asymptotically block-memoryless channels with causal channel side information (CSI) at the sender and receiver is considered. We obtain a formal weak coding theorem that achieves the maximum expected mutual information. We also obtain a converse theorem showing that this maximum expected mutual information is the highest attainable rate. We apply the coding theorem to determine the capacity and optimal input distribution of intersymbol interference (ISI) time-varying channels, whose capacity cannot be found through traditional decomposition methods. We also apply our coding theorem to flat fading channels with imperfect i.i.d. channel estimates.
Dynamic Power Control under Energy and Delay Constraints
 IEEE TRANSACTIONS ON INFORMATION THEORY
, 2001
Abstract
Cited by 3 (0 self)
Information-theoretic limits of fading channels have been discussed extensively in the literature under average power constraints. In this paper, we introduce total energy constraints on transmission and study dynamic power control on a time-varying channel. Three types of delay constraints are discussed: no delay constraint, soft delay constraint and strict delay constraint. We consider a block fading channel modeled as a finite-state Markov chain and study binary power control schemes for both variable rate and constant rate systems. For variable rate systems, we search for the optimal transmission policy that maximizes the minimum value of the expected sum-of-rates over the total communication window. Under no delay constraints, the optimal policy is extremely selective in that it stipulates transmission only on the best channel state, and the corresponding energy efficiency (the ratio of average mutual information to energy) serves as an upper bound for all delay constrained cases. Under delay constraints, the optimal policies are less selective and result in threshold rules on both the received signal strength (channel state) and the residual battery energy. Further, the energy efficiency is observed to increase with the Doppler frequency. For constant rate systems, optimal schemes are obtained to minimize the maximum value of the outage probability. It is observed that the outage probability decreases with the Doppler frequency in the strict delay constraint case. In addition, repetition diversity is also considered under a strict delay constraint.
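The selectivity of the unconstrained-delay policy described above can be illustrated with a toy simulation: binary (on/off) power control on a hypothetical three-state Markov channel, comparing the energy efficiency of transmitting whenever energy remains against transmitting only in the best state. The transition matrix, per-state SNRs, and energy budget are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical 3-state block-fading channel (Markov chain over SNRs)
P = np.array([[0.80, 0.15, 0.05],
              [0.20, 0.60, 0.20],
              [0.05, 0.15, 0.80]])
snr = np.array([0.5, 2.0, 8.0])   # per-state SNR when transmitting

def run(threshold_state, energy, n_blocks=10_000):
    """Binary power control: spend one unit of energy per block,
    but only when the channel state index >= threshold_state."""
    s, rate, e = 0, 0.0, energy
    for _ in range(n_blocks):
        if s >= threshold_state and e >= 1:
            rate += np.log2(1.0 + snr[s])
            e -= 1
        s = rng.choice(3, p=P[s])
    return rate / energy           # bits per unit of energy

eff_any  = run(threshold_state=0, energy=2000)  # transmit whenever possible
eff_best = run(threshold_state=2, energy=2000)  # only on the best state
```

With no delay constraint and a long enough window, the best-state-only policy attains higher energy efficiency, matching the selectivity result quoted in the abstract.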
Communication Through Jamming Over a Slotted ALOHA Channel
, 2005
Abstract
Cited by 2 (0 self)
This work derives bounds on the jamming capacity of a slotted ALOHA system. A system with n legitimate users, each with a Bernoulli arrival process, is considered. Packets are temporarily stored at the corresponding user queues, and a slotted ALOHA strategy is used for packet transmissions over the shared channel. The scenario considered is that of a pair of covert users that jam legitimate transmissions in order to communicate over the slotted ALOHA channel. Jamming leads to binary signaling between the covert users, with packet collisions due to legitimate users treated as (multiplicative) noise in this channel. Further, the queueing dynamics at the legitimate users stochastically couple the jamming strategy used by the covert users and the channel evolution. By considering various i.i.d. jamming strategies, achievable jamming rates over the slotted ALOHA channel are derived. Further, an upper bound on the jamming capacity over the class of all ergodic jamming policies is derived. These bounds are shown to be tight in the limit where the offered system load approaches unity.
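A minimal sketch of the underlying slotted ALOHA dynamics, with hypothetical parameters: each of n users transmits independently in a slot with probability q, and a slot succeeds only when exactly one user transmits. The empirical success probability is compared against the binomial prediction n q (1 - q)^(n-1). The queueing and covert-jamming aspects of the paper are not modeled here:

```python
import numpy as np

rng = np.random.default_rng(2)

n_users, q, slots = 10, 0.05, 50_000   # hypothetical load and horizon

# each user transmits in each slot independently with probability q
attempts = rng.random((slots, n_users)) < q
per_slot = attempts.sum(axis=1)

p_idle = np.mean(per_slot == 0)        # no one transmits
p_success = np.mean(per_slot == 1)     # exactly one transmission
p_collision = np.mean(per_slot >= 2)   # collision (the jammers' signal
                                       # rides on events like these)

# closed-form prediction for a successful slot
p_succ_theory = n_users * q * (1 - q) ** (n_users - 1)
```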
Signal Processing for Joint Source-Channel Coding of Digital Images
, 2000
Abstract
This thesis addresses the problems of signal processing for image communication and restoration. Significant attention is devoted to developing novel stochastic models for images, investigating the information-theoretic performance bounds for them, and designing efficient learning and inference methods for the proposed models. Unlike the commonly accepted approach in which the design of communication systems is performed by first compressing the data into binary representation and then channel coding it to recover from transmission errors, this thesis advocates the joint source-channel coding solution to the problem. The joint approach potentially leads to significant performance gains in emerging multi-user communication scenarios like digital audio and video broadcast (DAB and DVB) and multicast over wireless and wireline networks, multimedia communication in heterogeneous environments, and situations with uncertainty and fluctuations in the data source or channel parameters as is typic...