Asymptotic Performance of Vector Quantizers with a Perceptual Distortion Measure
 in Proc. IEEE Int. Symp. on Information Theory, p. 55
, 1997
Abstract

Cited by 28 (3 self)
Gersho's bounds on the asymptotic performance of vector quantizers are valid for vector distortions which are powers of the Euclidean norm. Yamada, Tazaki and Gray generalized the results to distortion measures that are increasing functions of the norm of their argument. In both cases, the distortion is uniquely determined by the vector quantization error, i.e., the Euclidean difference between the original vector and the codeword into which it is quantized. We generalize these asymptotic bounds to input-weighted quadratic distortion measures, a class of distortion measures often used for perceptually meaningful distortion. The generalization involves a more rigorous derivation of a fixed-rate result of Gardner and Rao and a new result for variable-rate codes. We also consider the problem of source mismatch, where the quantizer is designed using a probability density different from the true source density. The resulting asymptotic performance in terms of distortion increase in dB is shown...
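The input-weighted quadratic distortion class discussed above has the form d(x, x̂) = (x − x̂)ᵀ W(x) (x − x̂), where the weight matrix may depend on the input. A minimal sketch of encoding under such a measure; the weight function `W` and the two-word codebook here are hypothetical, not taken from the paper:

```python
import numpy as np

def weighted_distortion(x, xhat, W):
    """Input-weighted quadratic distortion d(x, xhat) = (x - xhat)^T W(x) (x - xhat)."""
    e = x - xhat
    return float(e @ W(x) @ e)

def nearest_codeword(x, codebook, W):
    """Encode x with the codeword minimizing the input-weighted distortion."""
    return min(codebook, key=lambda c: weighted_distortion(x, c, W))

# Hypothetical perceptual weight: errors in the first coordinate count double.
W = lambda x: np.diag([2.0, 1.0])
codebook = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
x = np.array([0.4, 0.9])
```

Because the weight depends on the input (not on the codeword), the nearest-neighbor search stays a simple minimization over codewords.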
The Optimal Lattice Quantizer in Three Dimensions
, 1983
Abstract

Cited by 25 (6 self)
The body-centered cubic lattice is shown to have the smallest mean squared error of any lattice quantizer in three dimensions, assuming that the input to the quantizer has a uniform distribution.
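The body-centered cubic lattice can be written as Z³ ∪ (Z³ + (½, ½, ½)), so its nearest-neighbor quantizer reduces to comparing two rounded candidates. A small sketch, assuming the unit-scale lattice:

```python
import numpy as np

def nearest_bcc(x):
    """Nearest point of the body-centered cubic lattice Z^3 ∪ (Z^3 + (1/2,1/2,1/2)):
    compare the nearest integer point with the nearest half-integer point."""
    x = np.asarray(x, dtype=float)
    a = np.round(x)              # closest point of Z^3
    b = np.round(x - 0.5) + 0.5  # closest point of the shifted copy
    return a if np.sum((x - a)**2) <= np.sum((x - b)**2) else b
```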
A Lagrangian formulation of Zador's entropy-constrained quantization theorem
 IEEE Trans. Inform. Theory
, 2002
Abstract

Cited by 19 (8 self)
Zador's classic result for the asymptotic high-rate behavior of entropy-constrained vector quantization is recast in a Lagrangian form which better matches the Lloyd algorithm used to optimize such quantizers. The equivalence of the two formulations is shown, and the result is proved for source distributions that are absolutely continuous with respect to the Lebesgue measure and satisfy an entropy condition, thereby generalizing the conditions stated by Zador under which the result holds.
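The Lagrangian form referred to here replaces the entropy constraint with a per-sample cost d(x, c) + λ·ℓ(c), which a Lloyd-style iteration can minimize directly. A toy sketch of such an entropy-constrained Lloyd iteration on scalar data, with codeword lengths taken from empirical probabilities; this is an illustration of the idea, not the paper's construction:

```python
import numpy as np

def ecvq_lloyd(samples, centers, lam, iters=20):
    """Entropy-constrained Lloyd: map each sample to the codeword minimizing
    squared error + lam * codeword length, with lengths -log2(p_i) taken from
    the empirical codeword probabilities, then re-center each codeword."""
    centers = np.array(centers, dtype=float)
    k = len(centers)
    lengths = np.full(k, np.log2(k))  # start from fixed-rate lengths
    for _ in range(iters):
        cost = (samples[:, None] - centers[None, :])**2 + lam * lengths[None, :]
        idx = np.argmin(cost, axis=1)
        probs = np.array([(idx == i).mean() for i in range(k)])
        for i in range(k):
            if probs[i] > 0:
                centers[i] = samples[idx == i].mean()
        lengths = -np.log2(np.maximum(probs, 1e-12))
    return centers, idx

samples = np.concatenate([np.full(50, -1.0), np.full(50, 1.0)])
centers, idx = ecvq_lloyd(samples, [-2.0, 2.0], lam=0.1)
```

Each step decreases the Lagrangian cost, which is why this formulation matches the Lloyd algorithm better than an explicit entropy constraint does.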
High-rate quantization and transform coding with side information at the decoder
 EURASIP Journal on Applied Signal Processing (Special Issue on Distributed Source Coding)
, 2006
Abstract

Cited by 16 (1 self)
We extend high-rate quantization theory to Wyner-Ziv coding, i.e., lossy source coding with side information at the decoder. Ideal Slepian-Wolf coders are assumed; thus rates are conditional entropies of quantization indices given the side information. This theory is applied to the analysis of orthonormal block transforms for Wyner-Ziv coding. A formula for the optimal rate allocation and an approximation to the optimal transform are derived. The case of noisy high-rate quantization and transform coding is included in our study, in which a noisy observation of source data is available at the encoder, but we are interested in estimating the unseen data at the decoder, with the help of side information. We implement a transform-domain Wyner-Ziv video coder that encodes frames independently but decodes them conditionally. Experimental results show that using the discrete cosine transform results in a rate-distortion improvement with respect to the pixel-domain coder. Transform coders of noisy images for different communication constraints are compared. Experimental results show that the noisy Wyner-Ziv transform coder achieves a performance close to the case in which the side information is also available at the encoder. Keywords: high-rate quantization, transform coding, side information, Wyner-Ziv coding, distributed source coding, noisy source coding
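With ideal Slepian-Wolf coding, the rate is the conditional entropy of the quantization indices given the side information, which is smaller than the unconditional index entropy when source and side information are correlated. A Monte Carlo sketch under assumed jointly Gaussian source and side information (the side information is discretized only to make the empirical conditional entropy computable):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)            # source seen at the encoder
y = x + 0.1 * rng.normal(size=n)  # side information seen at the decoder

def entropy_bits(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

step = 0.25
qx = np.floor(x / step).astype(int)  # quantization indices of the source
qy = np.floor(y / step).astype(int)  # discretized side information

hx = entropy_bits(np.unique(qx, return_counts=True)[1])   # H(Q(X))
hy = entropy_bits(np.unique(qy, return_counts=True)[1])   # H(Q(Y))
hxy = entropy_bits(np.unique(np.stack([qx, qy]), axis=1,
                             return_counts=True)[1])      # H(Q(X), Q(Y))
h_cond = hxy - hy  # H(Q(X) | Q(Y)): the Slepian-Wolf rate in this toy model
```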
Mismatch in High Rate Entropy Constrained Vector Quantization
 IEEE Trans. Inform. Theory
, 2002
Abstract

Cited by 11 (4 self)
Bucklew's high-rate vector quantizer mismatch result is extended from fixed-rate coding to variable-rate coding using a Lagrangian formulation. It is shown that if an asymptotically (high-rate) optimal sequence of variable-rate codes is designed for a k-dimensional probability density function (pdf) g and then applied to another pdf f for which f/g is bounded, then the resulting mismatch or loss of performance from the optimal possible is given by the relative entropy or Kullback-Leibler divergence D(f‖g).
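As a numerical illustration of the divergence term only (the Gaussian design and source densities below are hypothetical; with true source f = N(0, 1) and design density g = N(0, 4), the ratio f/g is bounded as the theorem requires):

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    """D(f || g) in nats for f = N(mu1, s1^2) and g = N(mu2, s2^2),
    via the closed form log(s2/s1) + (s1^2 + (mu1-mu2)^2)/(2 s2^2) - 1/2."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Codes designed for g = N(0, 2^2) but applied to the true source f = N(0, 1):
# the asymptotic performance loss is the relative entropy, here in bits.
loss_bits = kl_gauss(0.0, 1.0, 0.0, 2.0) / np.log(2.0)
```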
Vector Quantization and Density Estimation
 In SEQUENCES97
, 1997
Abstract

Cited by 7 (0 self)
The connection between compression and the estimation of probability distributions has long been known for the case of discrete alphabet sources and lossless coding. A universal lossless code which does a good job of compressing must implicitly also do a good job of modeling. In particular, with a collection of codebooks, one for each possible class or model, if codewords are chosen from among the ensemble of codebooks so as to minimize bit rate, then the codebook selected provides an implicit estimate of the underlying class. Less is known about the corresponding connections between lossy compression and continuous sources. Here we consider aspects of estimating conditional and unconditional densities in conjunction with Bayes-risk weighted vector quantization for joint compression and classification.
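The implicit-classification idea above, where the input is coded with whichever codebook does best and the chosen codebook is read as a class estimate, can be sketched as follows (the scalar codebooks are hypothetical stand-ins for per-class trained codebooks):

```python
import numpy as np

# Hypothetical scalar codebooks, one per class (as if trained on class data).
codebooks = {
    "class_a": np.array([-2.0, -1.0]),
    "class_b": np.array([1.0, 2.0]),
}

def encode_and_classify(x):
    """Quantize x with every codebook; the codebook achieving the lowest
    distortion supplies both the reproduction and an implicit class estimate."""
    label, cb = min(codebooks.items(),
                    key=lambda kv: np.min((kv[1] - x)**2))
    return label, float(cb[np.argmin((cb - x)**2)])
```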
Gauss Mixture Vector Quantization
 Proceedings 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
, 2001
Abstract

Cited by 7 (1 self)
Gauss mixtures are a popular class of models in statistics and statistical signal processing because they can provide good fits to smooth densities, because they have a rich theory, and because they can be well estimated by existing algorithms such as the EM algorithm. We here extend an information-theoretic extremal property for source coding from Gaussian sources to Gauss mixtures using high-rate quantization theory, and extend a method originally used for LPC speech vector quantization to provide a Lloyd clustering approach to the design of Gauss mixture models. The theory provides formulas relating minimum discrimination information (MDI) for model selection and the mean squared error resulting when the MDI criterion is used in an optimized robust classified vector quantizer. It also provides motivation for the use of Gauss mixture models for robust compression systems for general random vectors.
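As a minimal sketch of fitting a Gauss mixture with EM, the standard estimation algorithm the abstract mentions (this is plain EM on scalar data, not the paper's Lloyd-clustering design):

```python
import numpy as np

def em_gauss_mixture(x, mu, sigma, w, iters=50):
    """Plain EM for a scalar Gauss mixture: responsibilities in the E-step,
    weight/mean/variance re-estimation in the M-step."""
    mu, sigma, w = (np.array(v, dtype=float) for v in (mu, sigma, w))
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma)**2) / sigma
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu)**2).sum(axis=0) / nk)
    return mu, sigma, w

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 0.5, 1000), rng.normal(3, 0.5, 1000)])
mu, sigma, w = em_gauss_mixture(x, [-1.0, 1.0], [1.0, 1.0], [0.5, 0.5])
```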
Optimality of KLT for High-Rate Transform Coding of Gaussian Vector-Scale Mixtures: Application to Reconstruction, Estimation and Classification
Abstract

Cited by 6 (0 self)
The Karhunen-Loève transform (KLT) is known to be optimal for high-rate transform coding of Gaussian vectors for both fixed-rate and variable-rate encoding. The KLT is also known to be suboptimal for some non-Gaussian models. This paper proves high-rate optimality of the KLT for variable-rate encoding of a broad class of non-Gaussian vectors: Gaussian vector-scale mixtures (GVSM), which extend the Gaussian scale mixture (GSM) model of natural signals. A key concavity property of the scalar GSM (same as the scalar GVSM) is derived to complete the proof. Optimality holds under a broad class of quadratic criteria, which include mean squared error (MSE) as well as generalized f-divergence loss in estimation and binary classification systems. Finally, the theory is illustrated using two applications: signal estimation in multiplicative noise and joint optimization of classification/reconstruction systems.
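The KLT itself is the eigenbasis of the source covariance, so transforming with it decorrelates the coefficients. A small numerical sketch (the 2-D covariance below is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[2.0, 1.2], [1.2, 1.0]])  # arbitrary correlated example
x = rng.multivariate_normal([0.0, 0.0], cov, size=20_000)

# KLT = eigenvectors of the (here empirical) covariance matrix.
evals, evecs = np.linalg.eigh(np.cov(x, rowvar=False))
y = x @ evecs                 # transform coefficients
c = np.cov(y, rowvar=False)   # diagonal up to numerical precision
```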
Vanishing distortion and shrinking cells
 IEEE Trans. Inform. Theory
, 1996
Abstract

Cited by 3 (1 self)
We establish an asymptotic connection between vanishing r-th power distortion and shrinking cell diameters for vector quantizers with convex cells. Appears in IEEE Transactions on Information Theory, vol. 24, no. 3, pp. 1303-1305, 1996.