Results 1-10 of 47,752
Entropy-Constrained Successively Refinable Scalar Quantization
Proc. IEEE Data Compression Conf., 1997
Abstract: "We study the design of entropy-constrained successively refinable scalar quantizers. We propose two algorithms to minimize the average distortion and design such a quantizer. We consider two sets of constraints on the entropy: (i) constraint on the average rate and (ii) constraint on aggregate rate ..."
Cited by 9 (2 self)
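The successive-refinement property this abstract refers to can be illustrated with a toy scalar quantizer. This is only a sketch: the paper's algorithms additionally optimize the codebooks under entropy constraints, which this example omits, and the function name is illustrative.

```python
def refinable_quantize(x, base_step=1.0, layers=3):
    """Toy successively refinable scalar quantizer (illustrative sketch).

    Each layer sends one refinement bit that halves the quantization
    cell, so a decoder may stop after any layer and still obtain a
    valid, coarser reconstruction: the successive-refinement property.
    """
    recons = []
    step = base_step
    lo = (x // step) * step            # left edge of the enclosing cell
    for _ in range(layers):
        recons.append(lo + step / 2)   # midpoint reconstruction, error <= step/2
        step /= 2                      # the refinement bit halves the cell
        if x >= lo + step:             # which half does x fall in?
            lo += step
    return recons
```

For x = 0.8 this yields the reconstructions 0.5, 0.75, 0.875, with the worst-case error bound halving at every layer.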
Quantization Index Modulation: A Class of Provably Good Methods for Digital Watermarking and Information Embedding
IEEE Trans. on Information Theory, 1999
Abstract: "We consider the problem of embedding one signal (e.g., a digital watermark) within another 'host' signal to form a third, 'composite' signal. The embedding is designed to achieve efficient tradeoffs among the three conflicting goals of maximizing information-embedding rate, minimizing distortion between the host signal and composite signal, and maximizing the robustness of the embedding. We introduce new classes of embedding methods, termed quantization index modulation (QIM) and distortion-compensated QIM (DC-QIM), and develop convenient realizations in the form of what we ..."
Cited by 495 (15 self)
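A minimal sketch of the basic QIM idea: embed a bit by quantizing the host sample with one of two uniform quantizers whose lattices are offset by half a step. The step size DELTA and function names are illustrative assumptions, and the paper's DC-QIM adds distortion compensation that this omits.

```python
import numpy as np

DELTA = 1.0  # quantization step; larger -> more robust, more distortion

def qim_embed(host, bit):
    """Embed one bit per sample by quantizing with the bit-indexed
    quantizer: the bit-1 lattice is shifted by DELTA/2."""
    offset = bit * DELTA / 2
    return np.round((host - offset) / DELTA) * DELTA + offset

def qim_decode(received):
    """Decode each sample by finding the nearest reconstruction point
    across the two quantizers; its index is the recovered bit."""
    d0 = np.abs(received - qim_embed(received, 0))
    d1 = np.abs(received - qim_embed(received, 1))
    return (d1 < d0).astype(int)
```

With step DELTA, decoding survives any perturbation smaller than DELTA/4 in magnitude, which is the rate/distortion/robustness tradeoff the abstract describes in miniature.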
Embedded Entropy-Constrained Trellis Coded Quantization
Proc., Conf. on Information Sciences and Systems, 1998
Abstract: "A new variable-rate embedded quantization technique, called Embedded Entropy-Constrained Trellis Coded Quantization (EECTCQ), is developed. Its performance is compared to that of variable-rate embedded scalar quantizers. Simulations indicate that performance depends both upon the number of embedded ..."
Cited by 4 (2 self)
Fast Entropy-Constrained Vector Quantizer Design
Abstract: "Vector quantization is the process of encoding vector data as an index to a dictionary - or codebook - of representative vectors. Entropy-constrained vector quantizers (ECVQ) explicitly control the entropy of the output, and are superior to simple nearest-neighbor vector quantizers in terms of rate ..."
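The codebook-index encoding described above can be sketched in a few lines. This is an illustrative nearest-neighbor encoder, not the paper's fast design algorithm; the optional `code_lengths`/`lam` arguments are an assumption showing how an entropy term turns the rule into an ECVQ-style cost.

```python
import numpy as np

def vq_encode(vectors, codebook, code_lengths=None, lam=0.0):
    """Nearest-neighbor VQ: map each input vector to the index of the
    closest codeword.  Passing per-index code lengths with lam > 0
    minimizes distortion + lam * rate instead of distortion alone,
    which is the entropy-constrained (ECVQ) encoding rule."""
    # pairwise squared Euclidean distances, shape (N, K)
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    if code_lengths is not None:
        d = d + lam * np.asarray(code_lengths)[None, :]
    return d.argmin(axis=1)

def vq_decode(indices, codebook):
    """Decoding is just a table lookup into the codebook."""
    return codebook[indices]
```

Raising `lam` biases the encoder toward short (frequent) codewords, trading extra distortion for a lower output entropy.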
Quantum Gravity
2004
Abstract: "We describe the basic assumptions and key results of loop quantum gravity, which is a background independent approach to quantum gravity. The emphasis is on the basic physical principles and how one deduces predictions from them, at a level suitable for physicists in other areas such as string theory ... integral quantizations, coupling to matter, extensions to supergravity and higher dimensional theories, as well as applications to black holes, cosmology and Planck scale phenomenology. We describe the near-term prospects for observational tests of quantum theories of gravity and the expectations that loop ..."
Cited by 566 (11 self)
Bundle Adjustment - A Modern Synthesis
Vision Algorithms: Theory and Practice, LNCS, 2000
Abstract: "This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics ..."
Cited by 555 (12 self)
Approximate Signal Processing
1997
Abstract: "It is increasingly important to structure signal processing algorithms and systems to allow for trading off between the accuracy of results and the utilization of resources in their implementation. In any particular context, there are typically a variety of heuristic approaches to managing these tradeoffs. One of the objectives of this paper is to suggest that there is the potential for developing a more formal approach, including utilizing current research in Computer Science on Approximate Processing and one of its central concepts, Incremental Refinement. Toward this end, we first summarize a ..."
Cited by 516 (2 self)
QSplat: A Multiresolution Point Rendering System for Large Meshes
2000
Abstract: "Advances in 3D scanning technologies have enabled the practical creation of meshes with hundreds of millions of polygons. Traditional algorithms for display, simplification, and progressive transmission of meshes are impractical for data sets of this size. We describe a system for representing and ... and refines progressively when idle to a high final image quality. We have demonstrated the system on scanned models containing hundreds of millions of samples."
Cited by 500 (8 self)
The Information Bottleneck Method
University of Illinois, 1999
Abstract: "We define the relevant information in a signal x ∈ X as being the information that this signal provides about another signal y ∈ Y. Examples include the information that face images provide about the names of the people portrayed, or the information that speech sounds provide about the words spoken. ... about Y through a 'bottleneck' formed by a limited set of codewords X̃. This constrained optimization problem can be seen as a generalization of rate distortion theory in which the distortion measure d(x, x̃) emerges from the joint statistics of X and Y. This approach yields an exact set of self ..."
Cited by 545 (38 self)
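The trade-off the snippet describes, compressing X through the bottleneck X̃ while preserving information about Y, is captured by the paper's variational principle, where the Lagrange multiplier β sets the balance between compression and preserved relevance:

```latex
\min_{p(\tilde{x}\mid x)} \; I(X;\tilde{X}) \;-\; \beta\, I(\tilde{X};Y)
```

Small β favors heavy compression (low I(X;X̃)); large β favors keeping more of the relevant information I(X̃;Y).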
Coupled Hidden Markov Models for Complex Action Recognition
1996
Abstract: "We present algorithms for coupling and training hidden Markov models (HMMs) to model interacting processes, and demonstrate their superiority to conventional HMMs in a vision task classifying two-handed actions. HMMs are perhaps the most successful framework in perceptual computing for modeling and ..."
Cited by 497 (22 self)