## The Wyner-Ziv Problem with Multiple Sources (2002)

Venue: IEEE Transactions on Information Theory

Citations: 27 (1 self)

### BibTeX

```bibtex
@ARTICLE{Gastpar02thewyner-ziv,
  author  = {Michael Gastpar},
  title   = {The Wyner-Ziv Problem with Multiple Sources},
  journal = {IEEE Transactions on Information Theory},
  year    = {2004},
  volume  = {50},
  pages   = {2762--2768}
}
```


### Abstract

We consider the problem of separately compressing multiple sources in a lossy fashion for a decoder that has access to side information. For the case of a single source, this problem was completely solved by Wyner and Ziv. For the case of two sources, we establish an achievable rate region (an inner bound to the rate region) and a partial converse. The partial converse applies when the sources are conditionally independent given the side information, and it differs significantly from prior art in that it also applies to the symmetric case where all sources are encoded with respect to fidelity criteria. Moreover, we show that in this special case, there is no difference between the minimum rate needed to encode the sources jointly and the minimum sum rate needed for separate encoding.
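The sum-rate claim has a simple lossless analogue that can be checked numerically: when the sources are conditionally independent given the side information, H(X1, X2 | Z) = H(X1 | Z) + H(X2 | Z), so separate encoders pay no sum-rate penalty over joint encoding. A minimal sketch of this identity (the pmf values below are illustrative, not from the paper):

```python
import itertools
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy model: Z uniform on {0, 1}; given Z = z, X1 and X2 are independent
# Bernoulli variables with P(Xi = 1 | Z = z) given below (illustrative values).
p1_given_z = {0: 0.1, 1: 0.4}

# Joint pmf p(z, x1, x2) = p(z) p(x1 | z) p(x2 | z).
joint = {}
for z, x1, x2 in itertools.product((0, 1), repeat=3):
    pa = p1_given_z[z] if x1 == 1 else 1 - p1_given_z[z]
    pb = p1_given_z[z] if x2 == 1 else 1 - p1_given_z[z]
    joint[(z, x1, x2)] = 0.5 * pa * pb

# H(X1, X2 | Z): average over z of the entropy of the conditional pair pmf.
H_pair = sum(
    0.5 * entropy([joint[(z, a, b)] / 0.5 for a in (0, 1) for b in (0, 1)])
    for z in (0, 1)
)
# H(Xi | Z): average over z of the binary entropy of P(Xi = 1 | Z = z).
H_single = sum(0.5 * entropy([p1_given_z[z], 1 - p1_given_z[z]]) for z in (0, 1))

# Conditional independence given Z makes the separate-encoding sum rate optimal.
assert abs(H_pair - 2 * H_single) < 1e-12
```

In the lossy setting studied in the paper, the analogous statement is the main result: under the same conditional-independence assumption, the minimum sum rate for separate encoding matches that of joint encoding.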

### Citations

8556 | Elements of Information Theory - Cover, Thomas - 1991 |

867 | Noiseless Coding of Correlated Information Sources - Slepian, Wolf - 1973 |

Citation Context ...rmation Z permit to lower the necessary rates (and/or the incurred distortions). The first gain (stemming from the fact that the sources are dependent) has been studied and solved by Slepian and Wolf [3] for the case of lossless compression. They considered the system of Figure 1 without the side information Z. Their surprising result is that the total rate needed for separate (lossless) compression ... |

710 | The Rate-Distortion Function for Source Coding with Side Information at the Decoder - Wyner, Ziv - 1976 |

Citation Context ....g. [4]), but no final results are available to date. The second gain, i.e., the one due to side information, has been determined for the case of lossy compression of a single source by Wyner and Ziv [1]; the result is called the Wyner-Ziv rate-distortion function. In this paper, we study the Wyner-Ziv problem with two (and more) discrete memoryless dependent sources X1, X2 and a side information r... |

315 | Rate Distortion Theory: A Mathematical Basis for Data Compression - Berger - 1971 |

Citation Context ...tive source, ignoring both the side information and the other encoder’s presence. For such a scheme, the smallest rates are well known from (standard) single-source rate-distortion theory, see, e.g., [1] or [2, Theorem 2.2.3]. It is well known, however, that both the fact that the two sources are dependent as well as the side information Z permit to lower the necessary rates (or the incurred distortio... |

88 | A Proof of the Data Compression Theorem of Slepian and Wolf for Ergodic Sources - Cover - 1975 |

Citation Context ... III. AN ACHIEVABLE RATE REGION An achievable rate region can be obtained by extending the coding scheme introduced by Slepian and Wolf [3], and elegantly generalized to the concept of “binning” by Cover [16]. In summary, the code leading to the inner bound to the rate-distortion region by Theorem 2 below is the cascade of a suitable vector quantizer with a binning operation for the codeword indices. In p... |
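The binning operation mentioned in this excerpt can be illustrated in a few lines: the encoder transmits only a bin (hash) index of its sequence, and the decoder searches near the side information for a sequence falling in that bin. The sketch below uses a fixed 16-bit sequence and an ordinary hash in place of true random binning; all parameters are illustrative, not from the paper:

```python
from itertools import combinations

n_bins = 4096  # number of bins; sending a bin index costs log2(4096) = 12 bits

def bin_index(seq):
    # Deterministic hash into n_bins bins, standing in for random binning.
    return hash(tuple(seq)) % n_bins

# A fixed 16-bit source sequence, and side information z differing in one bit.
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
z = list(x)
z[5] ^= 1

# Encoder: transmit only the bin index of x (12 bits instead of 16).
idx = bin_index(x)

# Decoder: list every sequence within Hamming distance `radius` of the side
# information whose bin index matches; with enough bins the match is
# typically unique, so the decoder recovers x from 12 bits plus z.
def decode(side_info, idx, radius=2):
    matches = []
    for r in range(radius + 1):
        for flips in combinations(range(len(side_info)), r):
            cand = list(side_info)
            for i in flips:
                cand[i] ^= 1
            if bin_index(cand) == idx:
                matches.append(cand)
    return matches

assert x in decode(z, idx)
```

Decoding finds x because it lies within the search radius of z and, by construction, falls in the transmitted bin; the achievability proofs make this precise with jointly typical sequences in place of Hamming balls.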

86 | The Rate-Distortion Function for the Quadratic Gaussian CEO Problem - Oohama - 1998 |

Citation Context ... coincide for cases where one of the two sources is either not to be reconstructed, or encoded perfectly (see [11], [12], [10]). Moreover, certain conclusive results are available for the CEO problem [13]. The lemma is quoted without proof in [5, Lemma 14.8.1, p. 436], with reference to [4], [2]. In [4], the Markov lemma (Lemma 4.1) has a prominent role as well as a proof. Lemma 1 (Markov Lemma): Con... |

74 | Gaussian Multiterminal Source Coding - Oohama - 1997 |

Citation Context ...he proofs in this paper are limited to discrete alphabets, in line with most treatments of related issues. Notable exceptions to this include Wyner’s extension [5] of [1] to continuous alphabets, and [6]. ... the decoder to satisfy the distortion constraints Ed1(X1, X̂1) ≤ D1 (2) and Ed2(X2, X̂2) ≤ D2 (3), where X̂i is the estimate that the receiver produces of Xi, using all the received codeword indic... |
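The constraints (2) and (3) in this excerpt are expected per-letter distortions; over a block, their empirical counterpart is the average of a per-letter distortion measure. A minimal sketch with Hamming distortion (the sequences below are made up for illustration):

```python
def hamming(a, b):
    """Per-letter Hamming distortion: 0 if the symbols agree, else 1."""
    return 0 if a == b else 1

def avg_distortion(block, block_hat, d=hamming):
    """Empirical counterpart of E d(X, X_hat): the per-letter average."""
    assert len(block) == len(block_hat)
    return sum(d(a, b) for a, b in zip(block, block_hat)) / len(block)

x    = [0, 1, 1, 0, 1, 0, 0, 1]
xhat = [0, 1, 0, 0, 1, 0, 1, 1]  # reconstruction with two symbol errors

# A scheme meets a distortion constraint D if this average does not exceed D.
assert avg_distortion(x, xhat) == 0.25
```

For long blocks the average concentrates around the expectation, which is why the constraints are stated as expectations over the source and reconstruction.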

65 | The Distributed Karhunen–Loève Transform - Gastpar, Dragotti, et al. |

Citation Context ...first such result for a distributed lossy compression problem. The results of this paper have natural applications to distributed compression and signal processing. One such extension is described in [9]. At the same time, they are also relevant to establish certain rate regions for multiple-relay channels. As a matter of fact, Figure 1 can be understood as a relay network: The two boxes labeled “enc... |

51 | Rate-Distortion for Correlated Sources with Partially Separated Encoders - Kaspi, Berger - 1982 |

Citation Context ...ion, in spite of numerous attempts, the rate region of [4] could not be shown to be optimal, i.e., there is no converse to this rate region. Certain partial answers appear in the literature (see e.g. [7, 8, 6]). All those answers apply to cases where one of the two sources is either not to be reconstructed, or encoded perfectly. Consequently, it is not surprising that we cannot show Ra(D1, D2) to be the op... |

43 | Multiterminal Source Encoding with One Distortion Criterion - Berger, Yeung - 1989 |

Citation Context ...ion, in spite of numerous attempts, the rate region of [4] could not be shown to be optimal, i.e., there is no converse to this rate region. Certain partial answers appear in the literature (see e.g. [7, 8, 6]). All those answers apply to cases where one of the two sources is either not to be reconstructed, or encoded perfectly. Consequently, it is not surprising that we cannot show Ra(D1, D2) to be the op... |

27 | Information Theory: Coding Theorems for Discrete Memoryless Systems - Csiszár, Körner - 1981 |

Citation Context ...hat (x1^n, W1^n) ∈ A*(n)ɛ, draw ... With high probability, (z^n, x1^n, W1^n) ∈ A*(n)ɛ. Proof. The lemma is quoted without proof in [11, Lemma 14.8.1, p. 436], with reference to [4, 12]. In [4], the Markov lemma (Lemma 4.1) has a prominent role as well as a proof. To understand the lemma, note that from the fact that a sequence z^n is jointly typical with x1^n, and x1^n is jointly t... |

16 | Multiterminal Source Coding (Lectures presented at the CISM Summer School on the Information Theory Approach to Communications) - Berger - 1977 |

Citation Context ... two sources. 6 Conclusion Distributed lossy compression as shown in Figure 1 (without the side information) is a long-standing open problem. The best known achievable rate region is the one given in [4], and it does not coincide with any converse bound; no final rate-distortion results can be given. In this paper, we investigated distributed lossy compression with side information, and we establi... |

5 | Communicating via a Processing Broadcast Satellite - Willems, Wolf, et al. - 1989 |

Citation Context ...e fact that side information is available at the decoder, has been extensively studied for the case of lossless compression (see e.g., [5, p. 458], with recent extensions to network cases, see, e.g., [6], [7]), and for the case of lossy compression of a single source by Wyner and Ziv [8], [9]. This correspondence investigates the Wyner–Ziv problem with two (and more) discrete memoryless dependent sour... |

3 | Multiterminal Source Coding (presented at the CISM Summer School on the Information Theory Approach to Communications) - Berger - 1977 |

Citation Context ...as the rate needed for joint compression of the two sources (i.e., their joint entropy). When the compression is lossy, the dependence between the sources still permits lowering the rates (see, e.g., [4]), but no conclusive results are available to date. The second gain, stemming from the fact that side information is available at the decoder, has been extensively studied for the case of lossless com... |

2 | The Secret Key Capacity of Multiple Terminals - Csiszár, Narayan - 2004 |

Citation Context ...t that side information is available at the decoder, has been extensively studied for the case of lossless compression (see e.g., [5, p. 458], with recent extensions to network cases, see, e.g., [6], [7]), and for the case of lossy compression of a single source by Wyner and Ziv [8], [9]. This correspondence investigates the Wyner–Ziv problem with two (and more) discrete memoryless dependent sources a... |

2 | Cooperative Strategies and Capacity Theorems for Relay Networks - Kramer, Gastpar, et al. |

Citation Context ...red to Appendix I. The rate region Ra can easily be extended to more than two sources. For brevity and in order to concentrate on the main result (Section V), we omit an explicit statement and refer to [17]. IV. A GENERAL OUTER BOUND In this section, we present a region Rc(D1, D2) which contains the desired rate-distortion region R^WZ_{S1,S2|Z}(D1, D2). The region Rc(D1, D2) follows from standard outer boun... |

1 | Antenna-Clustering Strategies for the Multiple-Relay Channel - Gastpar, Kramer, et al. - 2002 |

Citation Context ...mpress their observations for a final destination that observes Z. Since X1, X2 and Z were all produced by the source of the relay network, they are generally correlated. This is further explained in [10]. An achievable rate region can be obtained by extending the coding scheme introduced by Slepian and Wolf in [3], which is sometimes called “binning.” In summary, the code ... |

1 | Noiseless Coding of Correlated Information Sources - Slepian, Wolf - 1973 |

Citation Context ...de information Z permit to lower the necessary rates (or the incurred distortions). The first gain, stemming from the fact that the sources are dependent, has been studied and solved by Slepian and Wolf [3] for the case of lossless compression. They considered the system of Fig. 1 without the side information Z. Their surprising result is that the total rate needed for separate (lossless) compression of... |

1 | Rate-Distortion for Correlated Sources with Partially Separated Encoders - Kaspi, Berger - 1982 |

Citation Context ...pecial case, namely, when the sources are conditionally independent given the side information, it is shown in Section V that the two regions do coincide. By contrast to earlier work (as reported in [11], [12], [10]), this correspondence provides conclusive rate-distortion results for a symmetric case, i.e., both sources are ... |

1 | The Distributed Karhunen–Loève Transform - Gastpar, Dragotti, et al. - 2002 |

Citation Context ... be reconstructed with respect to a fidelity criterion. The results of this correspondence have natural applications to distributed compression and signal processing. One such extension is described in [14]. At the same time, they are also useful to obtain certain rate regions for multiple-relay channels. As a matter of fact, Fig. 1 can be understood as a relay network: The two boxes labeled “encoder” a... |

1 | The Multiple-Relay Channel: Coding and Antenna-Clustering Capacity - Gupta - 2002 |

Citation Context ...s their observations for a final destination that observes Z. Since S1, S2, and Z were all produced by the source node in the relay network, they are generally correlated. This is further explained in [15]. II. NOTATION AND CONVENTIONS In this correspondence, random variables are denoted by capital letters, such as X, and their realizations by lower case letters, such as x. The (discrete and finite) alp... |