Results 1 – 8 of 8
Sum capacity of the vector Gaussian broadcast channel and uplink-downlink duality
IEEE Trans. on Inform. Theory, 2003
"... We characterize the sum capacity of the vector Gaussian broadcast channel by showing that the existing inner bound of Marton and the existing upper bound of Sato are tight for this channel. We exploit an intimate fourway connection between the vector broadcast channel, the corresponding pointtopo ..."
Abstract

Cited by 300 (2 self)
 Add to MetaCart
(Show Context)
We characterize the sum capacity of the vector Gaussian broadcast channel by showing that the existing inner bound of Marton and the existing upper bound of Sato are tight for this channel. We exploit an intimate four-way connection between the vector broadcast channel, the corresponding point-to-point channel (where the receivers can cooperate), the multiple access channel (where the roles of transmitters and receivers are reversed), and the corresponding point-to-point channel (where the transmitters can cooperate).
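As context for the duality named in this entry's title, the result can be sketched in standard notation (a paraphrase of the well-known statement; the channel matrices H_k and total power P are generic symbols, not notation taken from the paper):

```latex
% Uplink-downlink duality (sketch, generic notation): the sum capacity of
% the Gaussian vector broadcast channel equals that of the dual multiple
% access channel under a sum power constraint.
C^{\mathrm{BC}}_{\mathrm{sum}}(H_1,\dots,H_K;P)
  = \max_{\substack{Q_k \succeq 0 \\ \sum_k \operatorname{tr}(Q_k) \le P}}
    \log\det\!\Big(I + \sum_{k=1}^{K} H_k^{\dagger} Q_k H_k\Big)
```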
On The Capacity Of Wireless Networks: The Relay Case
In Proc. IEEE INFOCOM, 2002
"... In [1], Gupta and Kumar determined the capacity of wireless networks under certain assumptions, among them pointtopoint coding, which excludes for example multiaccess and broadcast codes. In this paper, we consider essentially the same physical model of a wireless network under a different traffi ..."
Abstract

Cited by 224 (11 self)
 Add to MetaCart
(Show Context)
In [1], Gupta and Kumar determined the capacity of wireless networks under certain assumptions, among them point-to-point coding, which excludes, for example, multi-access and broadcast codes. In this paper, we consider essentially the same physical model of a wireless network under a different traffic pattern, namely the relay traffic pattern, but we allow for arbitrarily complex network coding. In our model, there is only one active source/destination pair, while all other nodes assist this transmission. We show code constructions leading to achievable rates and derive upper bounds from the max-flow min-cut theorem. It is shown that the lower and upper bounds meet asymptotically as the number of nodes in the network goes to infinity, thus proving that the capacity of the wireless network with n nodes under the relay traffic pattern behaves like log n bits per second. This also demonstrates that network coding is essential: under the point-to-point coding assumption considered in [1], the achievable rate is constant, independent of the number of nodes.
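The upper bounds in this abstract come from the max-flow min-cut theorem. As a purely illustrative aside (the graph and its capacities below are hypothetical, not taken from the paper), a minimal Edmonds-Karp sketch computes this cut bound for a small relay network:

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp max flow; `capacity` is a dict-of-dicts of edge capacities.
    By the max-flow min-cut theorem, the value returned equals the minimum cut."""
    # Build residual capacities, with reverse edges initialised to 0.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow  # no augmenting path left: flow = min cut
        # Recover the path, find its bottleneck, and push flow along it.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# Hypothetical 4-node relay network: source s, relays a and b, destination d.
caps = {
    's': {'a': 3, 'b': 2},
    'a': {'b': 1, 'd': 2},
    'b': {'d': 3},
    'd': {},
}
print(max_flow(caps, 's', 'd'))  # 5 (the cut {s} already has capacity 3 + 2)
```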
Sum Capacity of the Multiple Antenna Gaussian Broadcast Channel and Uplink-Downlink Duality
IEEE Transactions on Information Theory, 2002
"... We characterize the sum capacity of the multiple antenna Gaussian broadcast channel by showing that the existing inner bound of Marton and the existing upper bound of Sato are tight for this channel. We exploit an intimate fourway connection between the multiple antenna broadcast channel, the corre ..."
Abstract

Cited by 48 (4 self)
 Add to MetaCart
(Show Context)
We characterize the sum capacity of the multiple antenna Gaussian broadcast channel by showing that the existing inner bound of Marton and the existing upper bound of Sato are tight for this channel. We exploit an intimate four-way connection between the multiple antenna broadcast channel, the corresponding point-to-point channel (where the receivers can cooperate), the multiple access channel (where the roles of transmitters and receivers are reversed), and the corresponding point-to-point channel (where the transmitters can cooperate).
On Joint Source-Channel Coding for the Wyner-Ziv Source and the Gel'fand-Pinsker Channel
IEEE Trans. Inform. Theory, 2002
"... We consider the problem of lossy joint sourcechannel coding in a communication system where the encoder has access to channel state information (CSI) and the decoder has access to side information that is correlated to the source. This configuration combines the WynerZiv model of pure lossy source ..."
Abstract

Cited by 30 (3 self)
 Add to MetaCart
We consider the problem of lossy joint source-channel coding in a communication system where the encoder has access to channel state information (CSI) and the decoder has access to side information that is correlated to the source. This configuration combines the Wyner-Ziv model of pure lossy source coding with side information at the decoder and the Shannon/Gel'fand-Pinsker model of pure channel coding with CSI at the encoder. We prove a separation theorem for this communication system, which asserts that there is no loss in asymptotic optimality in first applying an optimal Wyner-Ziv source code and then an optimal Gel'fand-Pinsker channel code. We then derive conditions for the optimality of a symbol-by-symbol (scalar) source-channel code, and demonstrate situations where these conditions are met. Finally, we discuss a few practical applications, including overlaid communication, where the model under discussion is useful.
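The separation theorem this abstract describes can be summarized symbolically (a standard-notation sketch, not the paper's own statement; here V is the source, W the decoder side information, and S the channel state):

```latex
% Wyner-Ziv rate-distortion function (side information W at the decoder),
% minimized subject to E\,d(V,\hat{V}) \le D:
R_{\mathrm{WZ}}(D) = \min_{p(u \mid v),\, \hat{v}(u,w)} \big[ I(V;U) - I(W;U) \big]
% Gel'fand-Pinsker capacity (state S known noncausally at the encoder):
C_{\mathrm{GP}} = \max_{p(u \mid s),\, x(u,s)} \big[ I(U;Y) - I(U;S) \big]
% Separation: distortion D is asymptotically achievable if and only if
R_{\mathrm{WZ}}(D) \le C_{\mathrm{GP}}
```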
Sum Rate of Multiterminal Gaussian Source Coding
DIMACS Series in Discrete Mathematics and Theoretical Computer Science
"... We characterize the sum rate of a class of multiterminal Gaussian source coding problems with quadratic distortion constraints. The key component of the solution is the identification of a multiple antenna broadcast channel that serves as a test channel. ..."
Abstract

Cited by 9 (0 self)
 Add to MetaCart
We characterize the sum rate of a class of multiterminal Gaussian source coding problems with quadratic distortion constraints. The key component of the solution is the identification of a multiple antenna broadcast channel that serves as a test channel.
TO CODE OR NOT TO CODE
2002
"... de nationalité suisse et originaire de Zurich (ZH) et Lucerne (LU) acceptée sur proposition du jury: ..."
Abstract
 Add to MetaCart
(Show Context)
Of Swiss nationality and originating from Zurich (ZH) and Lucerne (LU); accepted on the recommendation of the jury:
On the Capacity of the Multiple Antenna Broadcast Channel
DIMACS Series in Discrete Mathematics and Theoretical Computer Science
"... Abstract. The capacity region of the multiple antenna (transmit and receive) broadcast channel is considered. We propose an outer bound to the capacity region by converting this nondegraded broadcast channel into a degraded one with users privy to the signals of users ordered below them. We extend o ..."
Abstract
 Add to MetaCart
The capacity region of the multiple antenna (transmit and receive) broadcast channel is considered. We propose an outer bound to the capacity region by converting this non-degraded broadcast channel into a degraded one with users privy to the signals of users ordered below them. We extend the proof techniques from our characterization of the sum capacity of the multiple antenna broadcast channel to evaluate this outer bound with Gaussian inputs. Our main result is the observation that if Gaussian inputs are optimal for the constructed degraded channel, then the capacity region of the multiple antenna broadcast channel is characterized.
Broadcasting of a Common Source: Information-Theoretic Results and System Challenges
"... Abstract — We consider the problem of sending a common source to several users simultaneously at possibly different level of distortion. In this paper we review the classical information theoretic results on Broadcast Channels, then we examine the optimality of analog communication (singleletter co ..."
Abstract
 Add to MetaCart
(Show Context)
We consider the problem of sending a common source to several users simultaneously at possibly different levels of distortion. In this paper we review the classical information-theoretic results on broadcast channels, then we examine the optimality of analog communication (single-letter coding) and review the Hybrid Digital Analog (HDA) approach. Finally, we motivate the need for all-digital transmission and outline a joint source-channel coding approach based on layering. Classical Information Theoretic Approach. In 1972 Cover [1] formally introduced the concept of the broadcast channel in coding theory and discussed the basic problem of finding the capacity region C(Γ) = {(R_1, R_2, …, R_k) ∈ R_+^k simultaneously achievable}, where k is the number of
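For the two-user degraded broadcast channel, the region Cover's framework leads to has a classical single-letter form (a standard textbook result, not a statement from this paper; the auxiliary variable U carries the message of the weaker user):

```latex
% Capacity region of the degraded broadcast channel X -> Y_1 -> Y_2:
% the union, over all p(u)\,p(x \mid u), of rate pairs (R_1, R_2) with
R_1 \le I(X; Y_1 \mid U), \qquad R_2 \le I(U; Y_2)
```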