Sum Capacity of a Gaussian Vector Broadcast Channel
 IEEE Trans. Inform. Theory
, 2002
Abstract

Cited by 277 (21 self)
This paper characterizes the sum capacity of a class of non-degraded Gaussian vector broadcast channels in which a single transmitter with multiple transmit terminals sends independent information to multiple receivers. Coordination is allowed among the transmit terminals, but not among the different receivers. The sum capacity is shown to be a saddle point of a Gaussian mutual information game, where a signal player chooses a transmit covariance matrix to maximize the mutual information, and a noise player chooses a fictitious noise correlation to minimize the mutual information. This result holds for the class of Gaussian channels whose saddle point satisfies a full-rank condition. Further, the sum capacity is achieved using a precoding method for Gaussian channels with additive side information non-causally known at the transmitter. The optimal precoding structure is shown to correspond to a decision-feedback equalizer that decomposes the broadcast channel into a series of single-user channels with interference pre-subtraction at the transmitter.
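The saddle point referenced in this abstract is a max-min of the Gaussian mutual information, ½ log det(I + N⁻¹HQHᵀ), over the transmit covariance Q and the noise covariance N. A minimal sketch of the game's payoff function (the 2×2 example channel and all names are illustrative, not taken from the paper):

```python
import numpy as np

def gaussian_mutual_info(H, Q, N):
    """I(x; Hx + z) in nats, with x ~ N(0, Q) and z ~ N(0, N):
    0.5 * log det(I + N^{-1} H Q H^T)."""
    d = H.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(d) + np.linalg.solve(N, H @ Q @ H.T))
    return 0.5 * logdet

# Payoff at one example point of the game: the signal player picks Q,
# the noise player picks N (both identity here, purely illustrative).
H = np.eye(2)
value = gaussian_mutual_info(H, np.eye(2), np.eye(2))  # = log 2 nats
```

The signal player would maximize this quantity over Q subject to a power (trace) constraint while the noise player minimizes it over N with fixed marginals; the abstract's claim is that the resulting saddle value equals the sum capacity.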
Breaking Spectrum Gridlock with Cognitive Radios: An Information Theoretic Perspective
, 2008
Abstract

Cited by 247 (3 self)
Cognitive radios hold tremendous promise for increasing spectral efficiency in wireless systems. This paper surveys the fundamental capacity limits and associated transmission techniques for different wireless network design paradigms based on this promising technology. These paradigms are unified by the definition of a cognitive radio as an intelligent wireless communication device that exploits side information about its environment to improve spectrum utilization. This side information typically comprises knowledge about the activity, channels, codebooks and/or messages of other nodes with which the cognitive node shares the spectrum. Based on the nature of the available side information as well as a priori rules about spectrum usage, cognitive radio systems seek to underlay, overlay or interweave the cognitive radios’ signals with the transmissions of noncognitive nodes. We provide a comprehensive summary of the known capacity characterizations in terms of upper and lower bounds for each of these three approaches. The increase in system degrees of freedom obtained through cognitive radios is also illuminated. This information theoretic survey provides guidelines for the spectral efficiency gains possible through cognitive radios, as well as practical design ideas to mitigate the coexistence challenges in today’s crowded spectrum.
Towards an Information Theory of Large Networks: An Achievable Rate Region
 IEEE Trans. Inform. Theory
, 2003
Abstract

Cited by 202 (12 self)
Abstract — We study communication networks of arbitrary size and topology, communicating over a general vector discrete memoryless channel. We propose an information-theoretic constructive scheme for obtaining an achievable rate region in such networks. Many well-known capacity-defining achievable rate regions can be derived as special cases of the proposed scheme. A few such examples are the physically degraded and reversely degraded relay channels, the Gaussian multiple-access channel, and the Gaussian broadcast channel. The proposed scheme also leads to inner bounds for the multicast and all-cast capacities. Applying the proposed scheme to a specific wireless network of n nodes located in a region of unit area, we show that a transport capacity of Θ(n) bit-meters/sec is feasible in a certain family of networks, as compared to the best possible transport capacity of Θ(√n) bit-meters/sec in [16], where the receiver capabilities were limited. Even though the improvement is shown for a specific class of networks, a clear implication is that designing and employing more sophisticated multiuser coding schemes can provide sizable gains in at least some large wireless networks. Index Terms — Discrete memoryless channels, Gaussian channels, multiuser communications, network information theory.
Capacity and Optimal Resource Allocation for Fading Broadcast Channels: Part I: Ergodic Capacity
Minimum-energy multicast in mobile ad hoc networks using network coding
 IEEE Trans. Communications
, 2005
An extremal inequality motivated by multiterminal information theoretic problems
, 2006
Abstract

Cited by 80 (4 self)
We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel and by the problem of distributed source coding with a single quadratic distortion constraint. As a corollary, this inequality yields a generalization of the classical vector entropy-power inequality (EPI). As another corollary, it sheds light on maximizing the differential entropy of the sum of two jointly distributed random variables.
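The classical EPI referenced here states N(X+Y) ≥ N(X) + N(Y) for independent X and Y, where N(X) = e^{2h(X)}/(2πe); for independent Gaussians it holds with equality, since the entropy power of a Gaussian equals its variance. A minimal numeric check of that Gaussian special case (illustrative only, not the paper's generalization):

```python
import numpy as np

def entropy_power(var):
    """Entropy power N(X) = exp(2*h(X)) / (2*pi*e) for X ~ N(0, var)."""
    h = 0.5 * np.log(2 * np.pi * np.e * var)   # differential entropy in nats
    return np.exp(2 * h) / (2 * np.pi * np.e)  # simplifies to `var`

vx, vy = 1.5, 2.5
# The sum of independent Gaussians is Gaussian with summed variances,
# so the EPI N(X+Y) >= N(X) + N(Y) is tight in this case.
assert abs(entropy_power(vx + vy) - (entropy_power(vx) + entropy_power(vy))) < 1e-9
```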
Broadcast Channels with Cooperating Decoders
, 2006
Abstract

Cited by 59 (4 self)
We consider the problem of communicating over the general discrete memoryless broadcast channel (BC) with partially cooperating receivers. In our setup, receivers are able to exchange messages over noiseless conference links of finite capacities, prior to decoding the messages sent from the transmitter. In this paper we formulate the general problem of broadcast with cooperation. We first find the capacity region for the case where the BC is physically degraded. Then, we give achievability results for the general broadcast channel, for both the two independent messages case and the single common message case.
Reliable physical layer network coding
 Proceedings of the IEEE
, 2011
Abstract

Cited by 55 (6 self)
Abstract—When two or more users in a wireless network transmit simultaneously, their electromagnetic signals are linearly superimposed on the channel. As a result, a receiver that is interested in one of these signals sees the others as unwanted interference. This property of the wireless medium is typically viewed as a hindrance to reliable communication over a network. However, using a recently developed coding strategy, interference can in fact be harnessed for network coding. In a wired network, (linear) network coding refers to each intermediate node taking its received packets, computing a linear combination over a finite field, and forwarding the outcome towards the destinations. Then, given an appropriate set of linear combinations, a destination can solve for its desired packets. For certain topologies, this strategy can attain significantly higher throughputs than routing-based strategies. Reliable physical layer network coding takes this idea one step further: using judiciously chosen linear error-correcting codes, intermediate nodes in a wireless network can directly recover linear combinations of the packets from the observed noisy superpositions of transmitted signals. Starting with some simple examples, this survey explores the core ideas behind this new technique and the possibilities it offers for communication over interference-limited wireless networks. Index Terms—Digital communication, wireless networks, interference, network coding, channel coding, linear code, modulation, physical layer, fading, multiuser channels, multiple access, broadcast.
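The wired network-coding step described in this abstract, an intermediate node forwarding a linear combination over a finite field, reduces to a XOR in the common GF(2) case. A minimal sketch of the two-way relay example often used to introduce the idea (packet contents are made up for illustration):

```python
# Two-way relay over GF(2): nodes A and B each hold a packet and want
# the other's. The relay forwards one linear combination (the XOR)
# instead of two separate packets.
pkt_a = bytes([0x12, 0x34, 0x56])
pkt_b = bytes([0xAB, 0xCD, 0xEF])

coded = bytes(x ^ y for x, y in zip(pkt_a, pkt_b))  # relay broadcasts a XOR b

# Each node cancels its own packet from the combination to solve
# for the other's:
assert bytes(x ^ y for x, y in zip(coded, pkt_a)) == pkt_b
assert bytes(x ^ y for x, y in zip(coded, pkt_b)) == pkt_a
```

Physical layer network coding, as surveyed in the paper, has the relay decode this XOR (or another linear combination) directly from the superimposed received signal rather than decoding each packet separately.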
Estimation in Gaussian Noise: Properties of the minimum meansquare error
 IEEE Trans. Inf. Theory
, 2011
Abstract

Cited by 44 (12 self)
Abstract—Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all positive SNR, and in fact a real analytic function in SNR under mild conditions. The key to these regularity results is that the posterior distribution conditioned on the observation through Gaussian channels always decays at least as quickly as some Gaussian density. Furthermore, simple expressions for the first three derivatives of the MMSE with respect to the SNR are obtained. It is also shown that, as functions of the SNR, the curves for the MMSE of a Gaussian input and that of a non-Gaussian input cross at most once over all SNRs. These properties lead to simple proofs of the facts that Gaussian inputs achieve both the secrecy capacity of scalar Gaussian wiretap channels and the capacity of scalar Gaussian broadcast channels, as well as a simple proof of the entropy power inequality in the special case where one of the variables is Gaussian. Index Terms—Entropy, estimation, Gaussian broadcast channel, Gaussian noise, Gaussian wiretap channel, minimum mean-square error (MMSE), mutual information.
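For the Gaussian special case, estimating X ~ N(0,1) from Y = √snr·X + Z with Z ~ N(0,1), the MMSE is 1/(1+snr) and the mutual information is ½ ln(1+snr); the derivative of the latter in snr is half the MMSE (the I-MMSE relation of Guo, Shamai, and Verdú, which underlies the connection this abstract draws between MMSE and mutual information). A small numerical sketch of that special case (not code from the paper):

```python
import numpy as np

def mmse_gaussian(snr):
    """MMSE of estimating X ~ N(0,1) from sqrt(snr)*X + N(0,1)."""
    return 1.0 / (1.0 + snr)

def mutual_info_gaussian(snr):
    """I(X; sqrt(snr)*X + Z) in nats for the same Gaussian channel."""
    return 0.5 * np.log(1.0 + snr)

# I-MMSE check: dI/dsnr == mmse/2, verified with a central difference.
snr, h = 1.0, 1e-6
deriv = (mutual_info_gaussian(snr + h) - mutual_info_gaussian(snr - h)) / (2 * h)
assert abs(deriv - mmse_gaussian(snr) / 2) < 1e-6
```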