Results 1–10 of 42
The LOCO-I Lossless Image Compression Algorithm: Principles and Standardization into JPEG-LS
 IEEE Transactions on Image Processing
, 2000
Cited by 155 (8 self)
LOCO-I (LOw COmplexity LOssless COmpression for Images) is the algorithm at the core of the new ISO/ITU standard for lossless and near-lossless compression of continuous-tone images, JPEG-LS. It is conceived as a "low complexity projection" of the universal context modeling paradigm, matching its modeling unit to a simple coding unit. By combining simplicity with the compression potential of context models, the algorithm "enjoys the best of both worlds." It is based on a simple fixed context model, which approaches the capability of the more complex universal techniques for capturing high-order dependencies. The model is tuned for efficient performance in conjunction with an extended family of Golomb-type codes, which are adaptively chosen, and an embedded alphabet extension for coding of low-entropy image regions. LOCO-I attains compression ratios similar or superior to those obtained with state-of-the-art schemes based on arithmetic coding. Moreover, it is within a few percentage points of the best available compression ratios, at a much lower complexity level. We discuss the principles underlying the design of LOCO-I, and its standardization into JPEG-LS.
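The adaptively chosen Golomb-type codes mentioned in this abstract can be illustrated with their Rice special case (parameter m = 2**k); a minimal sketch with hypothetical function names, not the actual JPEG-LS bitstream:

```python
def rice_encode(n: int, k: int) -> str:
    """Rice code (Golomb with m = 2**k): unary quotient, then k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

def rice_decode(bits: str, k: int) -> int:
    """Invert rice_encode for a single codeword."""
    q = bits.index("0")                           # length of the unary part
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r
```

In JPEG-LS the parameter k is derived per context from accumulated prediction-error magnitudes; here it would simply be passed in.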
Context-based adaptive binary arithmetic coding in the H.264/AVC video compression standard. IEEE Transactions on Circuits and Systems for Video Technology
Cited by 117 (6 self)
(CABAC) as a normative part of the new ITU-T/ISO/IEC standard H.264/AVC for video compression is presented. By combining an adaptive binary arithmetic coding technique with context modeling, a high degree of adaptation and redundancy reduction is achieved. The CABAC framework also includes a novel low-complexity method for binary arithmetic coding and probability estimation that is well suited for efficient hardware and software implementations. CABAC significantly outperforms the baseline entropy coding method of H.264/AVC for the typical area of envisaged target applications. For a set of test sequences representing typical material used in broadcast applications and for a range of acceptable video quality of about 30 to 38 dB, average bit-rate savings of 9%–14% are achieved. Index Terms: Binary arithmetic coding, CABAC, context modeling, entropy coding, H.264, MPEG-4 AVC.
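The gain from combining context modeling with adaptive probability estimation can be seen even in a toy setting; a sketch (not CABAC's actual finite-state estimator) comparing the ideal code length of an adaptive coder with and without a one-bit context:

```python
import math

def ideal_code_length(bits, context_of):
    """Sum of -log2 p over a binary sequence, with per-context adaptive
    counts (Laplace +1 smoothing) standing in for CABAC's estimator."""
    counts = {}                       # context -> [count of 0s, count of 1s]
    total = 0.0
    for i, b in enumerate(bits):
        c = counts.setdefault(context_of(bits, i), [1, 1])
        total += -math.log2(c[b] / (c[0] + c[1]))
        c[b] += 1
    return total

data = [0, 0, 1] * 100                # strongly patterned toy source
no_ctx = ideal_code_length(data, lambda s, i: 0)
one_bit_ctx = ideal_code_length(data, lambda s, i: s[i - 1] if i else 0)
```

Conditioning on the previous bit exposes the pattern (a 1 is always followed by a 0), so `one_bit_ctx` comes out well below `no_ctx`.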
An efficient indexing technique for full-text database systems
 In Proceedings of the 18th International Conference on Very Large Databases
, 1992
Cited by 74 (10 self)
Full-text database systems require an index to allow fast access to documents based on their content. We propose an inverted file indexing scheme based on compression. This scheme allows users to retrieve documents using words occurring in the documents, sequences of adjacent words, and statistical ranking techniques. The compression methods chosen ensure that the storage requirements are small and that dynamic update is straightforward. The only assumption that we make is that sufficient main memory is available to support an in-memory vocabulary; given this assumption, the method we describe requires at most one disc access per query term to identify answers to queries.
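Compressed inverted lists of the kind this abstract describes are typically stored as gaps between sorted document IDs; a sketch using variable-byte gap coding (one plausible choice of code, not necessarily the one used in the paper):

```python
def vbyte_encode(n: int) -> bytes:
    """Variable-byte code: 7 data bits per byte, high bit marks the last byte."""
    out = bytearray()
    while n >= 128:
        out.append(n & 0x7F)
        n >>= 7
    out.append(n | 0x80)
    return bytes(out)

def compress_postings(doc_ids):
    """Store sorted document IDs as variable-byte-coded gaps."""
    out, prev = bytearray(), 0
    for d in doc_ids:
        out += vbyte_encode(d - prev)
        prev = d
    return bytes(out)

def decompress_postings(data: bytes):
    """Invert compress_postings, accumulating gaps back into IDs."""
    ids, prev, n, shift = [], 0, 0, 0
    for byte in data:
        if byte & 0x80:               # terminator byte of a codeword
            prev += n | ((byte & 0x7F) << shift)
            ids.append(prev)
            n, shift = 0, 0
        else:
            n |= byte << shift
            shift += 7
    return ids
```

Small gaps (frequent words, clustered documents) cost one byte each, which is what makes the in-memory vocabulary plus one disc access per term workable.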
A low-complexity modeling approach for embedded coding of wavelet coefficients
 in Proc. 1998 IEEE Data Compression Conference
, 1998
Cited by 45 (2 self)
Keywords: progressive image compression, Laplacian density, run-length coding, rate distortion. We present a new low-complexity method for modeling and coding the bit planes of a wavelet-transformed image in a fully embedded fashion. The scheme uses a simple ordering model for embedding, based on the principle that coefficient bits that are likely to reduce the distortion the most should be described first in the encoded bitstream. The ordering model is tied to a conditioning model in a way that deinterleaves the conditioned subsequences of coefficient bits, making them amenable to coding with a very simple, adaptive ...
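The ordering principle above (most distortion-reducing bits first) can be sketched as plain bit-plane emission, most significant plane first; magnitudes only, ignoring signs and the conditioning model:

```python
def bitplane_stream(coeffs, planes):
    """Emit magnitude bits plane by plane, most significant plane first."""
    for p in range(planes - 1, -1, -1):
        for c in coeffs:
            yield (abs(c) >> p) & 1

def reconstruct(bits, n, planes):
    """Invert the stream; missing trailing bits default to 0, so any
    truncation point still yields a coarse approximation (embedding)."""
    vals = [0] * n
    it = iter(bits)
    for p in range(planes - 1, -1, -1):
        for i in range(n):
            vals[i] |= next(it, 0) << p
    return vals
```

Truncating the stream after the first plane already bounds each coefficient's error by the weight of the unsent planes, which is why earlier bits reduce distortion more.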
Test Data Compression and Test Resource Partitioning for System-on-a-Chip Using . . .
, 2003
Cited by 30 (5 self)
Test data compression and test resource partitioning (TRP) are necessary to reduce the volume of test data for system-on-a-chip designs. We present a new class of variable-to-variable-length compression codes that are designed using distributions of the runs of 0s in typical test sequences. We refer to these as frequency-directed run-length (FDR) codes. We present experimental results for ISCAS-89 benchmark circuits and two IBM production circuits to show that FDR codes are extremely effective for test data compression and TRP. We derive upper and lower bounds on the compression expected for some generic parameters of the test sequences. These bounds are especially tight when the number of runs is small, thereby showing that FDR codes are robust, i.e., they are insensitive to variations in the input data stream. In order to highlight the inherent superiority of FDR codes, we present a probabilistic analysis of data compression for a memoryless data source. Finally, we derive entropy bounds for the benchmark test sets and show that the compression obtained using FDR codes is close to the entropy bounds.
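The FDR construction groups run lengths geometrically; a sketch of the encoder, assuming the standard grouping in which group A_i covers runs of length 2^i - 2 through 2^(i+1) - 3, with an (i-1)-ones-then-zero prefix and an i-bit tail:

```python
def fdr_encode(run: int) -> str:
    """Frequency-directed run-length code for a run of `run` zeros.
    Group A_i: runs 2**i - 2 .. 2**(i+1) - 3; prefix = (i-1) ones + '0';
    tail = i-bit offset of the run within its group."""
    i = 1
    while run > (1 << (i + 1)) - 3:   # find the group containing this run
        i += 1
    offset = run - ((1 << i) - 2)
    return "1" * (i - 1) + "0" + format(offset, f"0{i}b")
```

Short runs get short codewords, matching the skew toward short 0-runs in typical test sequences.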
Parameterised Compression for Sparse Bitmaps
 Proc. ACM-SIGIR International Conference on Research and Development in Information Retrieval
, 1992
Cited by 29 (8 self)
Full-text retrieval systems typically use either a bitmap or an inverted file to identify which documents contain which words, so that the documents containing any combination of words can be quickly located. Bitmaps of word occurrences are large, but are usually sparse, and thus are amenable to a variety of compression techniques. Here we consider techniques in which the encoding of each bit-vector within the bitmap is parameterised, so that a different code can be used for each bit-vector. Our experimental results show that the new methods yield better compression than previous techniques. Categories and Subject Descriptors: E.4 [Coding and Information Theory]: Data compaction and compression; H.3.2 [Information Storage]: File organisation. Keywords: Full-text retrieval, data compression, document database, Huffman coding, geometric distribution, inverted file. 1 Introduction Full-text retrieval systems are used for storing and accessing document collections such as newspaper a...
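Per-vector parameterisation can be sketched with the Rice subset of Golomb codes: each bit-vector's gaps get the parameter that minimises its own coded length. An illustrative stand-in for the paper's parameterised codes, not its exact method:

```python
def rice_len(gap: int, k: int) -> int:
    """Length in bits of `gap` under a Rice code with parameter 2**k."""
    return (gap >> k) + 1 + k

def best_rice_k(gaps, kmax=20) -> int:
    """Choose, for one bit-vector, the Rice parameter minimising total length."""
    return min(range(kmax), key=lambda k: sum(rice_len(g, k) for g in gaps))

def gaps_of(bitvector):
    """Gaps between successive set bits (first gap measured from position -1)."""
    ones = [i for i, b in enumerate(bitvector) if b]
    return [j - i for i, j in zip([-1] + ones, ones)]
```

A dense vector (small gaps) selects a small k, a sparse one a large k, so each bit-vector is coded near its own geometric distribution.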
A Class of Reversible Variable Length Codes for Robust Image and Video Coding
, 1997
Cited by 28 (3 self)
We describe a class of parameterized reversible variable length codes that have length distributions identical to Golomb-Rice codes and exp-Golomb codes. The pdfs to which these codes correspond are well matched to statistics of image and video data, thus enabling an increase in robustness to channel errors with no penalty in coding efficiency. These codes are applicable to MPEG-4 and other algorithms that aim to use variable length codes in error-prone environments.
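For reference, plain order-0 exp-Golomb coding, whose length distribution the reversible codes above share; this sketch does not construct the symmetric (bidirectionally decodable) codewords themselves:

```python
def exp_golomb_encode(n: int) -> str:
    """Order-0 exp-Golomb: (len-1) leading zeros, then binary of n+1."""
    b = format(n + 1, "b")
    return "0" * (len(b) - 1) + b

def exp_golomb_decode(bits: str) -> int:
    """Decode a single codeword: count leading zeros, read that many more bits."""
    z = bits.index("1")
    return int(bits[z:2 * z + 1], 2) - 1
```

An RVLC with the same lengths replaces each codeword with a palindromic (or otherwise suffix-decodable) pattern, so a decoder hitting a bit error can resynchronise by decoding backwards from the next marker.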
Reversible Variable Length Codes for Efficient and Robust Image and Video Coding
 in Proceedings Data Compression Conference
, 1998
Cited by 28 (2 self)
The International Telecommunications Union (ITU) recently adopted reversible variable length codes (RVLCs) for use in the emerging H.263+ video compression standard. As the name suggests, these codes can be decoded in two directions and can therefore be used by a decoder to enhance robustness in the presence of transmission bit errors. In addition, these RVLCs involve little or no efficiency loss relative to the corresponding non-reversible variable length codes. We present here the ideas behind two general classes of RVLCs and discuss the results of applying these codes in the framework of the H.263+ and MPEG-4 video coding standards.
Compression of Correlated BitVectors
 Information Systems
, 1990
Cited by 25 (2 self)
Bitmaps are data structures that occur often in information retrieval. They are useful; they are also large and expensive to store. For this reason, considerable effort has been devoted to finding techniques for compressing them. These techniques are most effective for sparse bitmaps. We propose a preprocessing stage, in which bitmaps are first clustered and the clusters used to transform their member bitmaps into sparser ones that can be more effectively compressed. The clustering method efficiently generates a graph structure on the bitmaps. In some situations, it is desired to impose restrictions on the graph; finding the optimal graph satisfying these restrictions is shown to be NP-complete. The results of applying our algorithm to the Bible are presented: for some sets of bitmaps, our method almost doubled the compression savings. 1. Introduction Textual Information Retrieval Systems (IRS) are voracious consumers of computer storage resources. Most conspicuous, of course, is the...
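The sparsifying transform can be sketched as XOR-ing each cluster member against a representative, so correlated bitmaps are stored as their (sparse) differences; hypothetical helper names, and the paper's clustering graph is not modelled:

```python
def xor_sparsify(bitmaps):
    """Store the first bitmap of a cluster verbatim and the rest as XOR
    differences against it; correlated bitmaps yield sparse differences."""
    ref = bitmaps[0]
    return [ref] + [[a ^ b for a, b in zip(ref, m)] for m in bitmaps[1:]]

def total_ones(bitmaps):
    """Total set bits, a proxy for compressed size of sparse bitmaps."""
    return sum(sum(m) for m in bitmaps)
```

The transform is its own inverse (XOR-ing a difference with the representative restores the original), so no information is lost.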
Searching Large Lexicons for Partially Specified Terms using Compressed Inverted Files
 Proc. International Conference on Very Large Databases
, 1993
Cited by 18 (5 self)
There are several advantages to be gained by storing the lexicon of a full-text database in main memory. In this paper we describe how to use a compressed inverted file index to search such a lexicon for entries that match a pattern or partially specified term. Our experiments show that this method provides an effective compromise between speed and space, running orders of magnitude faster than brute-force search, but requiring less memory than other pattern-matching data structures; indeed, in some cases requiring less memory than would be consumed by a single pointer to each string. The pattern search method is based on text indexing techniques and is a successful adaptation of inverted files to main memory databases.
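One common way to adapt inverted files to lexicon pattern search is an n-gram index: postings for the pattern's literal n-grams are intersected to get candidates, which are then verified directly. A sketch under that assumption (the paper's exact index layout and compression are not modelled), with a crude substring check for verification:

```python
from collections import defaultdict

def build_index(lexicon, n=2):
    """Inverted file: each n-gram maps to the lexicon entries containing it."""
    index = defaultdict(set)
    for i, word in enumerate(lexicon):
        for j in range(len(word) - n + 1):
            index[word[j:j + n]].add(i)
    return index

def match(lexicon, index, pattern, n=2):
    """Intersect postings for the literal n-grams of a '*'-wildcard pattern,
    then verify the surviving candidates by substring containment."""
    parts = [p for p in pattern.split("*") if p]
    cands = set(range(len(lexicon)))
    for part in parts:
        for j in range(len(part) - n + 1):
            cands &= index.get(part[j:j + n], set())
    return sorted(lexicon[i] for i in cands
                  if all(p in lexicon[i] for p in parts))
```

The intersection step touches only the short postings lists for the pattern's n-grams, which is what makes this orders of magnitude faster than scanning every lexicon entry.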