Results 1–10 of 74
Computation over Multiple-Access Channels
 IEEE TRANSACTIONS ON INFORMATION THEORY
, 2007
Cited by 139 (24 self)
Abstract: The problem of reliably reconstructing a function of sources over a multiple-access channel is considered. It is shown that there is no source-channel separation theorem even when the individual sources are independent. Joint source-channel strategies are developed that are optimal when the structure of the channel probability transition matrix and the function are appropriately matched. Even when the channel and function are mismatched, these computation codes often outperform separation-based strategies. Achievable distortions are given for the distributed refinement of the sum of Gaussian sources over a Gaussian multiple-access channel with a joint source-channel lattice code. Finally, computation codes are used to determine the multicast capacity of finite field multiple-access networks, thus linking them to network coding.
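Because a Gaussian MAC superimposes its inputs, uncoded transmission lets the channel itself compute the sum of the sources, which is the kind of channel/function match the abstract describes. A minimal simulation sketch (all parameter values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, P, noise_var = 100_000, 1.0, 0.1

# Two independent, unit-variance Gaussian sources at separate encoders.
s1 = rng.standard_normal(n)
s2 = rng.standard_normal(n)

# Uncoded transmission: each encoder just scales its source to power P.
x1 = np.sqrt(P) * s1
x2 = np.sqrt(P) * s2

# The Gaussian MAC superimposes the inputs: the channel computes the sum.
y = x1 + x2 + np.sqrt(noise_var) * rng.standard_normal(n)

# LMMSE estimate of s1 + s2 from y:
# Cov(y, s1+s2) = 2*sqrt(P), Var(y) = 2P + noise_var.
alpha = 2 * np.sqrt(P) / (2 * P + noise_var)
mse = np.mean((alpha * y - (s1 + s2)) ** 2)

# Analytical LMMSE distortion: Var(s1+s2) - Cov^2 / Var(y).
mse_theory = 2 - 4 * P / (2 * P + noise_var)
print(mse, mse_theory)
```

The point of the sketch is that the distortion depends on a single channel use per source sample, with no digital interface in between.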
Linear Coherent Decentralized Estimation
Cited by 47 (1 self)
Abstract: We consider the distributed estimation of an unknown vector signal in a resource-constrained sensor network with a fusion center. Due to power and bandwidth limitations, each sensor compresses its data in order to minimize the amount of information that needs to be communicated to the fusion center. In this context, we study the linear decentralized estimation of the source vector, where each sensor linearly encodes its observations and the fusion center also applies a linear mapping to estimate the unknown vector signal based on the received messages. We adopt the mean squared error (MSE) as the performance criterion. When the channels between the sensors and the fusion center are orthogonal, it has been shown previously that designing the optimal encoding matrices is NP-hard in general. In this paper, we study optimal linear decentralized estimation when the multiple-access channel (MAC) is coherent. For the case when the source and observations are scalars, we derive the optimal power scheduling via convex optimization and show that it admits a simple distributed implementation. Simulations show that the proposed power scheduling improves the MSE performance by a large margin compared to uniform power scheduling. We also show that under a finite network power budget, the asymptotic MSE performance (when the total number of sensors is large) critically depends on the multiple-access scheme. For the case when the source and observations are vectors, we study optimal linear decentralized estimation under both bandwidth and power constraints. We show that when the MAC between the sensors and the fusion center is noiseless, the resulting problem has a closed-form solution (in sharp contrast to the orthogonal MAC case), while in the noisy MAC case the problem can be efficiently solved by semidefinite programming (SDP).
Index Terms: Distributed estimation, energy efficiency, multiple access channel, linear source-channel coding, convex optimization.
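The claim that the asymptotic MSE depends critically on the multiple-access scheme can be seen in a small analytical sketch comparing a coherent MAC against orthogonal sensor channels under a fixed total power budget. The equal-gain (uniform) power schedule and all parameter values below are assumptions for illustration, not the paper's optimal schedule:

```python
import numpy as np

P_total, obs_var, ch_var = 10.0, 1.0, 1.0   # assumed network parameters

def mse(M, orthogonal):
    """Analytical LMMSE distortion for M equal-gain amplify-and-forward
    sensors estimating a unit-variance scalar under total power P_total."""
    g2 = (P_total / M) / (1 + obs_var)      # per-sensor gain squared
    sig = g2 * M**2                         # coherent signal term (g*M)^2
    # Orthogonal access pays the receiver noise once per sensor;
    # the coherent MAC pays it only once in total.
    noise = g2 * M * obs_var + (M if orthogonal else 1) * ch_var
    return 1 - sig / (sig + noise)

for M in (10, 100, 1000):
    print(M, mse(M, orthogonal=False), mse(M, orthogonal=True))
```

With the coherent MAC the MSE keeps shrinking as sensors are added even though the total power is fixed, while with orthogonal access it saturates at a strictly positive floor.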
Lattices are Everywhere
Cited by 41 (3 self)
Abstract: As bees and crystals (and people selling oranges in the market) have known for many years, lattices provide efficient structures for packing, covering, quantization, and channel coding. In recent years, interesting links were found between lattices and coding schemes for multiterminal networks. This tutorial paper covers close to 20 years of my research in the area; of enjoying the beauty of lattice codes, and discovering their power in dithered quantization, dirty-paper coding, Wyner-Ziv DPCM, modulo-lattice modulation, distributed interference cancellation, and more.
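The dithered-quantization property mentioned above can be checked numerically for the simplest lattice, the scaled integers: with subtractive dither, the reconstruction error is uniform over one lattice cell and uncorrelated with the input. A sketch (the step size and sample count are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n, delta = 200_000, 0.5                       # arbitrary step and sample count
x = rng.standard_normal(n)                    # arbitrary (non-uniform) input
d = rng.uniform(-delta / 2, delta / 2, n)     # dither, uniform over one cell

def quantize(v, step):
    """Nearest point of the scalar lattice step * Z."""
    return step * np.round(v / step)

# Subtractive dither: quantize x + d, then subtract the dither again.
err = quantize(x + d, delta) - d - x

# The error should be uniform on [-delta/2, delta/2] (variance delta^2/12)
# and uncorrelated with the input, regardless of the distribution of x.
print(err.var(), delta**2 / 12, np.corrcoef(err, x)[0, 1])
```

This input-independence of the error is what makes dithered lattice quantizers behave like additive-noise channels in the schemes the tutorial surveys.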
Sending a Bivariate Gaussian over a Gaussian MAC
, 901
Cited by 30 (3 self)
Abstract: We study the power-versus-distortion tradeoff for the distributed transmission of a memoryless bivariate Gaussian source over a two-to-one average-power-limited Gaussian multiple-access channel. In this problem, each of two separate transmitters observes a different component of a memoryless bivariate Gaussian source. The two transmitters then describe their source component to a common receiver via an average-power-constrained Gaussian multiple-access channel. From the output of the multiple-access channel, the receiver wishes to reconstruct each source component with the least possible expected squared-error distortion. Our interest is in characterizing the distortion pairs that are simultaneously achievable on the two source components. We present sufficient conditions and necessary conditions for the achievability of a distortion pair. These conditions are expressed as a function of the channel signal-to-noise ratio (SNR) and of the source correlation. In several cases the necessary conditions and sufficient conditions are shown to agree. In particular, we show that if the channel SNR is below a certain threshold, then an uncoded transmission scheme is optimal. We also derive the precise high-SNR asymptotics of an optimal scheme.
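The uncoded scheme discussed above is easy to simulate: each sender scales its own source component to the power constraint, and the receiver applies LMMSE estimation to the channel output. A hedged sketch (parameter values chosen arbitrarily; the SNR-threshold condition itself is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
n, P, rho, noise_var = 200_000, 1.0, 0.6, 1.0   # assumed parameters

# Memoryless bivariate Gaussian source, unit variances, correlation rho.
cov = np.array([[1.0, rho], [rho, 1.0]])
s = rng.multivariate_normal([0.0, 0.0], cov, size=n).T   # shape (2, n)

# Uncoded transmission: each sender scales its own component to power P.
y = np.sqrt(P) * (s[0] + s[1]) + np.sqrt(noise_var) * rng.standard_normal(n)

# LMMSE reconstruction of component 1 at the receiver.
c = np.sqrt(P) * (1 + rho)                 # Cov(y, s_i), same for both
var_y = 2 * P * (1 + rho) + noise_var
d_uncoded = 1 - c**2 / var_y               # analytical per-component distortion
emp = np.mean(((c / var_y) * y - s[0]) ** 2)
print(emp, d_uncoded)
```

By symmetry the same distortion is achieved on the second component, giving one point of the achievable distortion region.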
Broadcasting Correlated Gaussians
, 710
Cited by 18 (1 self)
Abstract: We consider the transmission of a bivariate Gaussian source over a one-to-two power-limited Gaussian broadcast channel. Receiver 1 observes the transmitted signal corrupted by Gaussian noise and wishes to estimate the first component of the source. Receiver 2 observes the transmitted signal in larger Gaussian noise and wishes to estimate the second component. We seek to characterize the pairs of mean squared-error distortions that are simultaneously achievable at the two receivers. Our result is that below a certain SNR threshold an "uncoded scheme" that sends a linear combination of the source components is optimal. The SNR threshold can be expressed as a function of the source correlation and the distortion at Receiver 1.
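The uncoded scheme for this broadcast setting can likewise be sketched: the sender transmits a single power-constrained linear combination of the two components, and each receiver forms its LMMSE estimate. The mixing weight beta and all parameter values below are illustrative assumptions, not the paper's optimal choices:

```python
import numpy as np

P, rho, n1, n2 = 1.0, 0.5, 0.2, 1.0   # power, correlation, noise at Rx1 < Rx2

def distortions(beta):
    """Distortion pair of the uncoded scheme x = a*(beta*s1 + (1-beta)*s2),
    scaled to power P; each receiver forms its own LMMSE estimate."""
    # Variance of the unscaled linear combination (unit-variance components).
    v = beta**2 + (1 - beta) ** 2 + 2 * rho * beta * (1 - beta)
    a = np.sqrt(P / v)
    c1 = a * (beta + (1 - beta) * rho)        # Cov(x, s1)
    c2 = a * (beta * rho + (1 - beta))        # Cov(x, s2)
    d1 = 1 - c1**2 / (P + n1)                 # Rx1 estimates s1 from x + z1
    d2 = 1 - c2**2 / (P + n2)                 # Rx2 estimates s2 from x + z2
    return d1, d2

# Sweeping beta traces out achievable (D1, D2) pairs of the uncoded scheme.
for beta in (0.0, 0.5, 1.0):
    print(beta, distortions(beta))
```

Varying beta trades distortion at Receiver 1 against distortion at Receiver 2, which is the tradeoff the paper characterizes.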
Energy-Distortion Tradeoffs in Gaussian Joint Source-Channel Coding Problems
Cited by 9 (2 self)
Abstract: The information-theoretic notion of energy efficiency is studied in the context of various joint source-channel coding problems. The minimum transmission energy E(D) required to communicate a source over a noisy channel so that it can be reconstructed within a target distortion D is analyzed. Unlike traditional joint source-channel coding formalisms, no restrictions are imposed on the number of channel uses per source sample. For single-source memoryless point-to-point channels, E(D) is shown to be equal to the product of the minimum energy per bit Eb,min of the channel and the rate-distortion function R(D) of the source, regardless of whether channel output feedback is available at the transmitter. The primary focus is on Gaussian sources and channels affected by additive white Gaussian noise under quadratic distortion criteria, with or without perfect channel output feedback. In particular, for two correlated Gaussian sources communicated over a Gaussian multiple-access channel, inner and outer bounds on the energy-distortion region are obtained, which coincide in special cases. For symmetric channels, the difference between the upper and lower bounds on energy is shown to be at most a constant even when the lower bound goes to infinity as D → 0. It is also shown that simple uncoded transmission schemes perform better than separation-based schemes in many different regimes, both with and without feedback.
Index Terms: Energy efficiency, feedback, information theory, joint source-channel coding, multiple-access channel (MAC), separate source and channel coding, uncoded transmission.
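The product formula E(D) = Eb,min · R(D) for a unit-variance Gaussian source under quadratic distortion is straightforward to evaluate numerically; the value of Eb,min used below is an assumed normalization for illustration, not a figure from the paper:

```python
import numpy as np

def rate_distortion(D, source_var=1.0):
    """Gaussian rate-distortion function R(D) in bits per source sample."""
    return 0.5 * np.log2(source_var / D) if D < source_var else 0.0

def min_energy(D, Eb_min):
    """E(D) = Eb_min * R(D): minimum energy per source sample."""
    return Eb_min * rate_distortion(D)

# Assumed normalization of the channel's minimum energy per bit.
Eb_min = np.log(2)
for D in (0.5, 0.1, 0.01):
    print(D, min_energy(D, Eb_min))   # grows without bound as D -> 0
```

The logarithmic growth of R(D) is what drives the lower bound on energy to infinity as D → 0 in the abstract's symmetric-channel statement.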
Zero-delay joint source-channel coding for a bivariate Gaussian on a Gaussian MAC
 IEEE Trans. Commun
, 2012
Asymptotics and power allocation for state estimation over fading channels
 IEEE TRANS. AEROSP. ELECTRON. SYST
, 2011
Cited by 7 (1 self)
Abstract: State estimation of linear systems using analog amplify-and-forward with multiple sensors is considered, for both multiple-access and orthogonal-access schemes. Optimal state estimation can be achieved at the fusion center using a time-varying Kalman filter. We show that in many situations the estimation error covariance decays at a rate of 1/M when the number of sensors M is large. We consider optimal allocation of transmission powers that 1) minimizes the sum power usage subject to an error covariance constraint, and 2) minimizes the error covariance subject to a sum power constraint. In the case of fading channels with channel state information (CSI), the optimization problems are solved using a greedy approach, while for fading channels without CSI but with channel statistics available, a suboptimal linear estimator is derived.
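The 1/M decay of the estimation error can be seen even in a static toy version of the problem, with amplify-and-forward sensors and a coherent MAC (the fixed per-sensor gain and all noise variances below are assumed values, and the dynamics/Kalman filtering of the paper are deliberately left out):

```python
import numpy as np

rng = np.random.default_rng(4)
obs_var, ch_var, g, trials = 1.0, 0.5, 1.0, 40_000   # assumed values

def mac_mse(M):
    """Empirical LMMSE error for M amplify-and-forward sensors (fixed
    per-sensor gain g) observing a unit-variance scalar over a coherent MAC."""
    theta = rng.standard_normal(trials)
    obs = theta[None, :] + np.sqrt(obs_var) * rng.standard_normal((M, trials))
    y = g * obs.sum(axis=0) + np.sqrt(ch_var) * rng.standard_normal(trials)
    c, vy = g * M, (g * M) ** 2 + g**2 * M * obs_var + ch_var
    return np.mean(((c / vy) * y - theta) ** 2)

# The error should shrink roughly like 1/M for large M.
m10, m100 = mac_mse(10), mac_mse(100)
print(m10, m100, m10 / m100)
```

Increasing M tenfold should cut the empirical MSE by roughly a factor of ten, matching the 1/M rate claimed in the abstract.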
Communicating the difference of correlated Gaussian sources over a MAC
 in Data Compression Conference (DCC 2009), Snowbird, UT
, 2009
Structured random codes and sensor network coding theorems
 in International Zurich Seminar on Communications (IZS 2008
, 2008
Cited by 6 (3 self)
Abstract: In the Shannon-theoretic analysis of joint source-channel coding problems, achievability is usually established via a two-stage approach: the sources are compressed into bits, and these bits are reliably communicated across the noisy channels. Random coding arguments are the backbone of both stages of the proof. This "separation" strategy not only establishes the optimal performance for stationary ergodic point-to-point problems, but also for a number of simple network situations, such as independent sources that are communicated with respect to separate fidelity criteria across a multiple-access channel. Beyond such simple cases, for general networks, separation-based coding is suboptimal. For instance, for a simple Gaussian sensor network, uncoded transmission is exactly optimal and performs exponentially better than a separation-based solution. In this note, we generalize this sensor network strategy by employing a lattice code. The underlying linear structure of our code is crucial to its success.
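The role of the code's linear structure can be illustrated over a binary adder MAC (mod 2): when both senders use the same linear code, the channel output is a noisy codeword of the XOR of the messages, so the receiver decodes the sum directly without recovering the individual messages. The sketch below substitutes a simple repetition code for the paper's lattice code, with all parameters assumed:

```python
import numpy as np

rng = np.random.default_rng(5)
k, n_rep, p, trials = 64, 9, 0.1, 200    # assumed message/code/noise sizes

def encode(bits):
    """Both senders use the SAME linear code: bitwise repetition."""
    return np.repeat(bits, n_rep)

def decode(word):
    """Majority-vote decoder for the repetition code."""
    return (word.reshape(-1, n_rep).sum(axis=1) > n_rep // 2).astype(np.uint8)

errs = 0
for _ in range(trials):
    m1 = rng.integers(0, 2, k, dtype=np.uint8)
    m2 = rng.integers(0, 2, k, dtype=np.uint8)
    noise = (rng.random(k * n_rep) < p).astype(np.uint8)
    # Binary adder MAC (mod 2): linearity gives encode(m1) ^ encode(m2)
    # == encode(m1 ^ m2), so the output is a noisy codeword of the sum.
    y = encode(m1) ^ encode(m2) ^ noise
    errs += int(np.count_nonzero(decode(y) ^ (m1 ^ m2)))

ber = errs / (trials * k)     # bit-error rate of the decoded sum m1 ^ m2
print(ber)
```

An unstructured random code would offer no such guarantee, since the sum of two random codewords is generally not a codeword; that is the sense in which the linear structure is crucial.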