Results 1–10 of 71
The Rate Loss in the Wyner-Ziv Problem
IEEE Trans. Inform. Theory, 1996
Abstract

Cited by 96 (16 self)
The rate-distortion function for source coding with side information at the decoder (the "Wyner-Ziv problem") is given in terms of an auxiliary random variable, which forms a Markov chain with the source and the side information. This Markov chain structure, typical to the solution of multiterminal source coding problems, corresponds to a loss in coding rate with respect to the conditional rate-distortion function, i.e., to the case where the encoder is fully informed. We show that for difference (or balanced) distortion measures, this loss is bounded by a universal constant, which is the minimax capacity of a suitable additive noise channel. Furthermore, in the worst case this loss is equal to the maximin redundancy over the rate-distortion function of the additive noise "test" channel. For example, the loss in the Wyner-Ziv problem is less than 0.5 bit per sample in the squared-error distortion case, and it is less than 0.22 bits for a binary source with Hamming distance. These resul...
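The half-bit bound in the abstract above can be sketched numerically. A minimal sketch, assuming a Gaussian source with side information and squared-error distortion (the variance value is hypothetical); the conditional rate-distortion function is the standard Gaussian formula, and the abstract's worst-case constant sandwiches the Wyner-Ziv rate above it:

```python
import math

def conditional_rd(cond_var: float, d: float) -> float:
    """Conditional rate-distortion function R_{X|Y}(D) of a Gaussian
    source given side information, squared error, in bits per sample."""
    return max(0.0, 0.5 * math.log2(cond_var / d))

# Worst-case Wyner-Ziv rate loss for squared error, per the abstract.
WZ_LOSS_BOUND = 0.5  # bits per sample

cond_var = 1.0  # hypothetical Var(X | Y)
for d in (0.5, 0.1, 0.01):
    r = conditional_rd(cond_var, d)
    print(f"D={d}: {r:.3f} <= R_WZ(D) <= {r + WZ_LOSS_BOUND:.3f} bits")
```

For the quadratic Gaussian case the actual loss is zero; the 0.5-bit constant is the bound over all sources under squared error.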
Data-Hiding Codes
Proc. IEEE, 2005
Abstract

Cited by 56 (4 self)
This tutorial paper reviews the theory and design of codes for hiding or embedding information in signals such as images, video, audio, graphics, and text. Such codes have also been called watermarking codes; they can be used in a variety of applications, including copyright protection for digital media, content authentication, media forensics, data binding, and covert communications. Some of these applications imply the presence of an adversary attempting to disrupt the transmission of information to the receiver; other applications involve a noisy, generally unknown, communication channel. Our focus is on the mathematical models, fundamental principles, and code design techniques that are applicable to data hiding. The approach draws from basic concepts in information theory, coding theory, game theory, and signal processing, and is illustrated with applications to the problem of hiding data in images. Keywords—Coding theory, data hiding, game theory, image processing, information theory, security, signal processing, watermarking.
Multiterminal Source Coding with High Resolution
IEEE Trans. Inform. Theory, 1999
Abstract

Cited by 56 (5 self)
We consider separate encoding and joint decoding of correlated continuous information sources, subject to a difference distortion measure. We first derive a multiterminal extension of the Shannon lower bound for the rate region. Then we show that this Shannon outer bound is asymptotically tight for small distortions. These results imply that the loss in the sum of the coding rates due to the separation of the encoders vanishes in the limit of high resolution. Furthermore, lattice quantizers followed by Slepian-Wolf lossless encoding are asymptotically optimal. We also investigate the high-resolution rate region in the remote coding case, where the encoders observe only noisy versions of the sources. For the quadratic Gaussian case, we establish a separation result to the effect that multiterminal coding aimed at reconstructing the noisy sources subject to the rate constraints, followed by estimation of the remote sources from these reconstructions, is optimal under certain regularity ...
On the asymptotic tightness of the Shannon lower bound
1997
Abstract

Cited by 53 (16 self)
New results are proved on the convergence of the Shannon lower bound (SLB) to the rate-distortion function as the distortion decreases to zero. The key convergence result is proved using a fundamental property of informational divergence. As a corollary, it is shown that the SLB is asymptotically tight for norm-based distortions, when the source vector has a finite differential entropy and a finite αth moment for some α > 0, with respect to the given norm. Moreover, we derive a theorem of Linkov on the asymptotic tightness of the SLB for general difference distortion measures with more relaxed conditions on the source density. We also show that the SLB relative to a stationary source and single-letter difference distortion is asymptotically tight under very weak assumptions on the source distribution. Key words: rate distortion theory, Shannon lower bound, difference distortion measures, stationary sources. T. Linder is with the Coordinated Science Laboratory, University of Illinoi...
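For squared-error distortion the SLB reads R(D) ≥ h(X) − ½ log₂(2πeD). A minimal numerical sketch, using the i.i.d. Gaussian source (where the bound happens to be exact for D below the variance, so the gap is zero at every distortion level, consistent with the asymptotic tightness discussed above):

```python
import math

def gaussian_rd(var: float, d: float) -> float:
    """Exact R(D) of an i.i.d. Gaussian source, squared error (bits)."""
    return max(0.0, 0.5 * math.log2(var / d))

def slb(diff_entropy_bits: float, d: float) -> float:
    """Shannon lower bound for squared error:
    R(D) >= h(X) - 0.5*log2(2*pi*e*D)."""
    return diff_entropy_bits - 0.5 * math.log2(2 * math.pi * math.e * d)

var = 1.0
h = 0.5 * math.log2(2 * math.pi * math.e * var)  # Gaussian differential entropy
for d in (0.5, 0.1, 0.01):
    gap = gaussian_rd(var, d) - slb(h, d)
    print(f"D={d}: R(D) - SLB(D) = {gap:.6f} bits")
```

For non-Gaussian sources the gap is positive but, per the paper's result, vanishes as D → 0 under the stated moment and entropy conditions.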
Lattices for distributed source coding: Jointly Gaussian sources and reconstruction of a linear function
IEEE Trans. Inform. Theory (submitted), 2007
Abstract

Cited by 45 (2 self)
Consider a pair of correlated Gaussian sources (X1, X2). Two separate encoders observe the two components and communicate compressed versions of their observations to a common decoder. The decoder is interested in reconstructing a linear combination of X1 and X2 to within a mean-square distortion of D. We obtain an inner bound to the optimal rate-distortion region for this problem. A portion of this inner bound is achieved by a scheme that reconstructs the linear function directly rather than reconstructing the individual components X1 and X2 first. This results in a better rate region for certain parameter values. Our coding scheme relies on lattice coding techniques in contrast to more prevalent random coding arguments used to demonstrate achievable rate regions in information theory. We then consider the case of linear reconstruction of K sources and provide an inner bound to the optimal rate-distortion region. Some parts of the inner bound are achieved using the following coding structure: lattice vector quantization followed by "correlated" lattice-structured binning.
Lattices are Everywhere
Abstract

Cited by 40 (3 self)
As bees and crystals (and people selling oranges in the market) have known for many years, lattices provide efficient structures for packing, covering, quantization, and channel coding. In recent years, interesting links have been found between lattices and coding schemes for multiterminal networks. This tutorial paper covers close to 20 years of my research in the area: enjoying the beauty of lattice codes, and discovering their power in dithered quantization, dirty-paper coding, Wyner-Ziv DPCM, modulo-lattice modulation, distributed interference cancellation, and more.
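One item on the list above, subtractive dithered quantization, admits a one-line demonstration: when the encoder adds a dither known to the decoder before quantizing to a lattice, the reconstruction error is uniform over the lattice cell regardless of the source. A minimal 1-D sketch on the lattice ΔZ (step size and source statistics are arbitrary choices for illustration):

```python
import random

def dithered_quantize(x: float, step: float, dither: float) -> float:
    """Subtractive dithered uniform quantization: the encoder quantizes
    x + dither to the lattice step*Z; the decoder subtracts the dither."""
    y = step * round((x + dither) / step)  # nearest lattice point
    return y - dither                       # decoder subtracts the shared dither

random.seed(0)
step = 1.0
errors = []
for _ in range(10_000):
    x = random.gauss(0.0, 3.0)               # arbitrary source
    u = random.uniform(-step / 2, step / 2)  # dither shared with the decoder
    errors.append(dithered_quantize(x, step, u) - x)

# The error lies in [-step/2, step/2]; its variance approaches step**2 / 12.
print(max(abs(e) for e in errors))
print(sum(e * e for e in errors) / len(errors))
```

The empirical error variance comes out near 1/12, the variance of a uniform variable on a unit cell, independently of the Gaussian source chosen here.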
Achieving the Gaussian ratedistortion function by prediction
IEEE Trans. Inform. Theory, 2008
Abstract

Cited by 36 (11 self)
The "water-filling" solution for the quadratic rate-distortion function of a stationary Gaussian source is given in terms of its power spectrum. This formula naturally lends itself to a frequency-domain "test-channel" realization. We provide an alternative time-domain realization for the rate-distortion function, based on linear prediction. The predictive test-channel has some interesting implications, including the optimality at all distortion levels of pre/post-filtered vector-quantized differential pulse code modulation (DPCM), and a duality relationship with decision-feedback equalization (DFE) for intersymbol interference (ISI) channels.
High-Resolution Source Coding for Non-Difference Distortion Measures: Multidimensional Companding
IEEE Trans. Inform. Theory, 1999
Abstract

Cited by 35 (3 self)
Entropy-coded vector quantization is studied using high-resolution multidimensional companding over a class of non-difference distortion measures. For distortion measures which are "locally quadratic," a rigorous derivation of the asymptotic distortion and entropy-coded rate of multidimensional companders is given along with conditions for the optimal choice of the compressor function. This optimum compressor, when it exists, depends on the distortion measure but not on the source distribution. The rate-distortion performance of the companding scheme is studied using a recently obtained asymptotic expression for the rate-distortion function which parallels the Shannon lower bound for difference distortion measures. It is proved that the high-resolution performance of the scheme is arbitrarily close to the rate-distortion limit for large quantizer dimensions if the compressor function and the lattice quantizer used in the companding scheme are optimal, extending an analogous statement for...
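The compressor → uniform quantizer → expander structure analyzed above can be illustrated in one dimension. A minimal sketch using the familiar μ-law compander, a conventional stand-in rather than the paper's optimal compressor for a non-difference measure (MU = 255 is the usual telephony choice, not a value from the paper):

```python
import math

MU = 255.0  # conventional mu-law parameter (illustrative choice)

def compress(x: float) -> float:
    """mu-law compressor on [-1, 1]: expands small amplitudes."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def expand(y: float) -> float:
    """Exact inverse of the compressor."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

def compand_quantize(x: float, levels: int) -> float:
    """Compress, quantize uniformly on [-1, 1], then expand back."""
    step = 2.0 / levels
    y = compress(x)
    yq = step * (math.floor(y / step) + 0.5)  # midrise uniform quantizer
    return expand(yq)

for x in (0.01, 0.1, 0.5):
    print(x, compand_quantize(x, 256))
```

The effective quantizer cells in the x-domain shrink where the compressor slope is steep, which is how the compressor choice shapes the distortion profile; the paper's point is to pick that shape optimally for a locally quadratic non-difference measure.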
Multiple Description Quantization Via Gram-Schmidt Orthogonalization
2005
Abstract

Cited by 28 (11 self)
The multiple description (MD) problem has received considerable attention as a model of information transmission over unreliable channels. A general framework for designing efficient multiple description quantization schemes is proposed in this paper. We provide a systematic treatment of the El Gamal-Cover (EGC) achievable MD rate-distortion region, and show that any point in the EGC region can be achieved via a successive quantization scheme along with quantization splitting. For the quadratic Gaussian case, the proposed scheme has an intrinsic connection with Gram-Schmidt orthogonalization, which implies that the whole Gaussian MD rate-distortion region is achievable with a sequential dithered lattice-based quantization scheme as the dimension of the (optimal) lattice quantizers becomes large. Moreover, this scheme is shown to be universal for all i.i.d. smooth sources with performance no worse than that for an i.i.d. Gaussian source with the same variance, and asymptotically optimal at high resolution. A class of low-complexity MD scalar quantizers in the proposed general framework is also constructed and illustrated geometrically; the performance is analyzed in the high-resolution regime, which exhibits a noticeable improvement over existing MD scalar quantization schemes.
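The Gram-Schmidt step the scheme leans on is ordinary orthogonalization, applied in the paper to the covariance structure of the descriptions. A plain vector-space sketch of the procedure itself (illustrative only, not the paper's quantizer): subtract from each vector its projections onto the basis built so far, then normalize.

```python
import math

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: return an orthonormal basis for
    span(vectors), skipping linearly dependent inputs."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            proj = sum(wi * bi for wi, bi in zip(w, b))     # <w, b>
            w = [wi - proj * bi for wi, bi in zip(w, b)]    # remove component
        norm = math.sqrt(sum(wi * wi for wi in w))
        if norm > 1e-12:                                     # keep if independent
            basis.append([wi / norm for wi in w])
    return basis

basis = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
for b in basis:
    print([round(c, 4) for c in b])
```

In the paper's quadratic Gaussian setting, successively orthogonalizing the auxiliary random variables in this way is what turns the EGC region into a sequence of single-stage quantization steps.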