Results 1–10 of 52
Multiple Description Coding: Compression Meets the Network
, 2001
Abstract

Cited by 288 (7 self)
This article focuses on the compressed representations of the pictures
A rate-splitting approach to the Gaussian multiple-access channel
 IEEE Trans. Inform. Theory
, 1996
Abstract

Cited by 81 (2 self)
It is shown that any point in the capacity region of a Gaussian multiple-access channel is achievable by single-user coding without requiring synchronization among users, provided that each user “splits” data and signal into two parts. Based on this result, a new multiple-access technique called rate-splitting multiple accessing (RSMA) is proposed. RSMA is a code-division multiple-access scheme for the M-user Gaussian multiple-access channel for which the effort of finding the codes for the M users, of encoding, and of decoding is that of at most 2M − 1 independent point-to-point Gaussian channels. The effects of bursty sources, multipath fading, and intercell interference are discussed and directions for further research are indicated.
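The single-user achievability claim rests on a telescoping of successive-decoding rates, which can be checked numerically. A minimal two-user sketch in Python, with illustrative powers, noise level, and a 50/50 power split (all values are assumptions for illustration, not taken from the paper):

```python
import math

def C(snr):
    """Point-to-point Gaussian capacity in bits per channel use."""
    return 0.5 * math.log2(1 + snr)

P1, P2, N = 4.0, 2.0, 1.0   # illustrative powers and noise variance

# User 1 splits its power into two virtual users, 1a and 1b.
alpha = 0.5                  # fraction of P1 assigned to virtual user 1a
P1a, P1b = alpha * P1, (1 - alpha) * P1

# Successive single-user decoding order: 1a, then 2, then 1b.
R1a = C(P1a / (N + P1b + P2))  # 1a decoded first; the rest acts as noise
R2  = C(P2  / (N + P1b))       # 2 decoded next; only 1b remains as noise
R1b = C(P1b / N)               # 1b decoded last, over a clean channel

R1 = R1a + R1b
# The three rates telescope, so the point lies on the dominant face:
# R1 + R2 = C((P1 + P2) / N).
assert abs(R1 + R2 - C((P1 + P2) / N)) < 1e-12
print(R1, R2)
```

Sweeping alpha from 0 to 1 traces out points along the dominant face of the capacity region, each reached by single-user codes only.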
Algorithms for Fast Vector Quantization
 Proc. of DCC '93: Data Compression Conference
, 1993
Abstract

Cited by 61 (13 self)
Nearest neighbor searching is an important geometric subproblem in vector quantization.
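The subproblem in question is: given an input vector, find the closest codeword in a codebook under Euclidean distance. A brute-force baseline in Python (the codebook here is made up; the paper's point is that tree-based methods beat this linear scan):

```python
def nearest_codeword(x, codebook):
    """Index of the codeword closest to x in squared Euclidean distance."""
    best_i, best_d = 0, float("inf")
    for i, c in enumerate(codebook):
        d = sum((xj - cj) ** 2 for xj, cj in zip(x, c))
        if d < best_d:
            best_i, best_d = i, d
    return best_i

# A toy 2-D codebook with four codewords.
codebook = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
print(nearest_codeword((0.9, 0.2), codebook))  # closest is (1.0, 0.0) -> 1
```

The scan costs O(kd) per query for k codewords in d dimensions; fast vector quantizers replace it with spatial data structures such as k-d trees.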
Vector Quantization of Image Subbands: A Survey
 IEEE Transactions on Image Processing
, 1996
Abstract

Cited by 53 (4 self)
Subband and wavelet decompositions are powerful tools in image coding, because of their decorrelating effects on image pixels, the concentration of energy in a few coefficients, their multirate/multiresolution framework, and their frequency splitting which allows for efficient coding matched to the statistics of each frequency band and to the characteristics of the human visual system. Vector quantization provides a means of converting the decomposed signal into bits in a manner that takes advantage of remaining inter- and intra-band correlation as well as of the more flexible partitions of higher dimensional vector spaces. Since 1988 a growing body of research has examined the use of vector quantization for subband/wavelet transform coefficients. We present a survey of these methods. 1 Introduction Image compression maps an original image into a bit stream suitable for communication over or storage in a digital medium. The number of bits required to represent the coded image should b...
New trellis codes based on lattices and cosets
 IEEE Trans. Inform. Theory
, 1987
Abstract

Cited by 37 (7 self)
A new technique is proposed for constructing trellis codes, which provides an alternative to Ungerboeck’s method of “set partitioning”. The new codes use a signal constellation consisting of points from an n-dimensional lattice Λ, with an equal number of points from each coset of a sublattice Λ′. One part of the input stream drives a generalized convolutional code whose outputs are cosets of Λ′, while the other part selects points from these cosets. Several of the new codes are better than those previously known.
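A toy illustration of the lattice/sublattice partition the construction relies on (not the paper's codes): the integer lattice Z² split into the four cosets of the sublattice 2Z². Within each coset the minimum distance doubles, which is what the convolutional code exploits when it selects cosets.

```python
import itertools
import math

def coset_label(p):
    """Coset of 2Z^2 containing point p, identified by coordinates mod 2."""
    return (p[0] % 2, p[1] % 2)

# A finite window of the lattice Z^2.
points = list(itertools.product(range(-4, 5), repeat=2))

cosets = {}
for p in points:
    cosets.setdefault(coset_label(p), []).append(p)

def min_dist(ps):
    """Minimum pairwise Euclidean distance in a point set."""
    return min(math.dist(a, b) for a, b in itertools.combinations(ps, 2))

print(len(cosets))               # |Z^2 / 2Z^2| = 4 cosets
print(min_dist(points))          # 1.0 in the full lattice
print(min_dist(cosets[(0, 0)]))  # 2.0 within a single coset
```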
Soft decoding techniques for codes and lattices, including the Golay code and the Leech lattice
 IEEE Trans. Inform. Theory
, 1986
Abstract

Cited by 33 (3 self)
Abstract—Two kinds of algorithms are considered. 1) If C is a binary code of length n, a “soft decision” decoding algorithm for C changes an arbitrary point of R^n into a nearest codeword (nearest in Euclidean distance). 2) Similarly, a decoding algorithm for a lattice Λ in R^n changes an arbitrary point of R^n into a closest lattice point. Some general methods are given for constructing such algorithms, and are used to obtain new and faster decoding algorithms for the Gosset lattice E8, the Golay code and the Leech lattice.
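The simplest instances of the second kind of algorithm are the classic decoders for Z^n and the checkerboard lattice D_n (integer vectors with even coordinate sum), which the algorithms for E8 and the Leech lattice build on. A short Python sketch:

```python
def decode_Zn(x):
    """Closest point of Z^n to x: round each coordinate.
    (Python rounds ties to the even integer; either rounding is a
    closest point at a tie, so the result is still valid.)"""
    return [round(xi) for xi in x]

def decode_Dn(x):
    """Closest point of D_n to x: decode Z^n first; if the coordinate
    sum is odd, re-round the coordinate that had the largest rounding
    error in the other direction."""
    f = decode_Zn(x)
    if sum(f) % 2 == 0:
        return f
    # Coordinate whose rounding cost the most.
    k = max(range(len(x)), key=lambda i: abs(x[i] - f[i]))
    f[k] += 1 if x[k] > f[k] else -1
    return f

print(decode_Zn([0.4, -1.6, 2.2]))  # [0, -2, 2]
print(decode_Dn([0.6, 0.4, 0.4]))   # sum of [1, 0, 0] is odd, so one
                                    # coordinate is re-rounded
```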
Universal Lattice Decoding: Principle and Recent Advances
 WIRELESS COMMUNICATIONS AND MOBILE COMPUTING
, 2003
Scalable Video Coding With Multiscale Motion Compensation And Unequal Error Protection
, 1995
Abstract

Cited by 28 (14 self)
In this paper, we present some of our recent results obtained for a scalable video codec based on a spatiotemporal resolution pyramid combined with E8 lattice vector quantization. We first introduce spatiotemporal pyramids and appropriate coding schemes. We discuss the problem of optimum bit allocation and multiscale motion compensation. In the second part we present simulation results concerning coding performance, software-only decoding, and digital video broadcasting.
Packet Loss Resilient Internet Video Streaming
 in Proceedings of SPIE Visual Communications and Image Processing '99
, 1999
Abstract

Cited by 27 (1 self)
This paper describes a transmission scheme for Internet video streaming that provides an acceptable video quality over a wide range of connection qualities. The proposed system consists of a scalable video coder which uses a fully standard-compatible H.263 coder in its base layer. The scalable video coder is combined with unequal error protection using Reed-Solomon codes applied across packets. We present and verify a two-state Markov model for packet losses over Internet connections. The relation between packet loss and picture quality at the decoder for an unequally protected layered video stream is derived. Experimental results show that, with our approach, the picture quality of a streamed video degrades gracefully as the packet loss probability of an Internet connection increases. Keywords: scalable video coding, unequal error protection, erasure decoding, graceful degradation, Internet model
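The two-state Markov (Gilbert) loss model the abstract mentions is easy to simulate: packets are lost while the chain is in the "bad" state. A minimal sketch in Python; the transition probabilities p (good to bad) and q (bad to good) are illustrative assumptions, not the paper's measured values:

```python
import random

def simulate_losses(n, p=0.05, q=0.5, seed=1):
    """Return a list of booleans, True where a packet is lost."""
    rng = random.Random(seed)
    bad = False
    losses = []
    for _ in range(n):
        if bad:
            bad = rng.random() >= q   # leave the bad state with prob. q
        else:
            bad = rng.random() < p    # enter the bad state with prob. p
        losses.append(bad)
    return losses

losses = simulate_losses(100_000)
# The stationary loss probability of the chain is p / (p + q),
# here 0.05 / 0.55, about 0.091; the empirical rate should be close.
print(sum(losses) / len(losses))
```

Unlike independent losses, the model produces bursts with mean length 1/q, which is what makes cross-packet Reed-Solomon protection relevant.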
Efficient Approximation Algorithms for the Hamming Center Problem
, 1999
Abstract

Cited by 27 (2 self)
The Hamming center problem for a set S of k binary strings, each of length n, asks for a binary string of length n that minimizes the maximum Hamming distance to the strings in S. The decision version of this problem is known to be NP-complete [6]. We provide several approximation algorithms for the Hamming center problem. Our main result is a randomized (4/3 + ε)-approximation algorithm running in polynomial time if the Hamming radius of S is at least superlogarithmic in k. Furthermore, we show how to find in polynomial time a set B of O(log k) strings of length n such that for each string in S there is at least one string in B within Hamming distance not exceeding the radius of S. 1 Introduction Let Z_2^n be the set of all strings of length n over the alphabet {0, 1}. For any string in Z_2^n, [i] refers to the symbol at the i-th position, where i = 1, ..., n, and [i..j] denotes the substring starting at position i and endin...
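For tiny instances the problem can be solved exhaustively, which makes the objective concrete (and shows why approximation is needed: the search space is 2^n). A brute-force baseline in Python on a made-up example set:

```python
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length binary strings."""
    return sum(x != y for x, y in zip(a, b))

def radius(center, S):
    """Maximum Hamming distance from center to any string in S."""
    return max(hamming(center, s) for s in S)

def hamming_center(S):
    """Exhaustively find a length-n string minimizing the max distance."""
    n = len(S[0])
    return min(("".join(bits) for bits in product("01", repeat=n)),
               key=lambda c: radius(c, S))

S = ["0000", "0011", "1100"]
c = hamming_center(S)
print(c, radius(c, S))  # radius 2: "0011" and "1100" differ in 4 positions,
                        # so no string is within distance 1 of both
```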