Results 1–10 of 69
Inverted files for text search engines
 ACM Computing Surveys
, 2006
"... The technology underlying text search engines has advanced dramatically in the past decade. The development of a family of new index representations has led to a wide range of innovations in index storage, index construction, and query evaluation. While some of these developments have been consolida ..."
Abstract

Cited by 192 (5 self)
 Add to MetaCart
The technology underlying text search engines has advanced dramatically in the past decade. The development of a family of new index representations has led to a wide range of innovations in index storage, index construction, and query evaluation. While some of these developments have been consolidated in textbooks, many specific techniques are not widely known or the textbook descriptions are out of date. In this tutorial, we introduce the key techniques in the area, describing both a core implementation and how the core can be enhanced through a range of extensions. We conclude with a comprehensive bibliography of text indexing literature.
The LOCO-I Lossless Image Compression Algorithm: Principles and Standardization into JPEG-LS
 IEEE TRANSACTIONS ON IMAGE PROCESSING
, 2000
"... LOCOI (LOw COmplexity LOssless COmpression for Images) is the algorithm at the core of the new ISO/ITU standard for lossless and nearlossless compression of continuoustone images, JPEGLS. It is conceived as a "low complexity projection" of the universal context modeling paradigm, matching its mo ..."
Abstract

Cited by 152 (10 self)
 Add to MetaCart
LOCO-I (LOw COmplexity LOssless COmpression for Images) is the algorithm at the core of the new ISO/ITU standard for lossless and near-lossless compression of continuous-tone images, JPEG-LS. It is conceived as a "low complexity projection" of the universal context modeling paradigm, matching its modeling unit to a simple coding unit. By combining simplicity with the compression potential of context models, the algorithm "enjoys the best of both worlds." It is based on a simple fixed context model, which approaches the capability of the more complex universal techniques for capturing high-order dependencies. The model is tuned for efficient performance in conjunction with an extended family of Golomb-type codes, which are adaptively chosen, and an embedded alphabet extension for coding of low-entropy image regions. LOCO-I attains compression ratios similar or superior to those obtained with state-of-the-art schemes based on arithmetic coding. Moreover, it is within a few percentage points of the best available compression ratios, at a much lower complexity level. We discuss the principles underlying the design of LOCO-I, and its standardization into JPEG-LS.
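The fixed predictor at the heart of LOCO-I is the median edge detector (MED), which switches between the left neighbour, the upper neighbour, and a planar estimate depending on the upper-left neighbour. A minimal sketch of that predictor alone (illustrative only; the full JPEG-LS pipeline adds context modeling, bias cancellation, and Golomb coding of the residual):

```python
def med_predict(a, b, c):
    """Median edge detector (MED) prediction, as used in LOCO-I/JPEG-LS.

    a = left neighbour, b = above neighbour, c = upper-left neighbour.
    Picks min(a, b) at one kind of edge, max(a, b) at the other, and
    the planar estimate a + b - c in smooth regions.
    """
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c
```

The three cases amount to taking the median of a, b, and a + b - c, hence the name.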
Self-Indexing Inverted Files for Fast Text Retrieval
 ACM Transactions on Information Systems
, 1996
"... Query processing costs on large text databases are dominated by the need to retrieve and scan the inverted list of each query term. Here we show that query response time for conjunctive Boolean queries and for informal ranked queries can be dramatically reduced, at little cost in terms of storage, b ..."
Abstract

Cited by 147 (26 self)
 Add to MetaCart
Query processing costs on large text databases are dominated by the need to retrieve and scan the inverted list of each query term. Here we show that query response time for conjunctive Boolean queries and for informal ranked queries can be dramatically reduced, at little cost in terms of storage, by the inclusion of an internal index in each inverted list. This method has been applied in a retrieval system for a collection of nearly two million short documents. Our experimental results show that the self-indexing strategy adds less than 20% to the size of the inverted file, but, for Boolean queries of 5–10 terms, can reduce processing time to under one fifth of the previous cost. Similarly, ranked queries of 40–50 terms can be evaluated in as little as 25% of the previous time, with little or no loss of retrieval effectiveness.
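The benefit of an internal index can be illustrated with a simplified conjunctive merge that jumps ahead in the longer list instead of scanning it entry by entry. This is only a sketch: the paper embeds skip structures inside compressed inverted lists, whereas here uncompressed in-memory lists are assumed and binary search stands in for the skipping mechanism:

```python
import bisect

def intersect(shorter, longer):
    """Conjunctive merge of two sorted document-id lists.

    For each candidate in the shorter list, binary-search forward in
    the longer list from the last match position, mimicking the
    effect of skip pointers: most of the longer list is never read.
    """
    result, i = [], 0
    for doc in shorter:
        i = bisect.bisect_left(longer, doc, i)
        if i == len(longer):
            break
        if longer[i] == doc:
            result.append(doc)
    return result
```

Passing the running offset `i` as the `lo` bound of `bisect_left` is what keeps the merge from restarting the search at the head of the list for every term.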
Arithmetic coding revisited
 ACM Transactions on Information Systems
, 1995
"... Over the last decade, arithmetic coding has emerged as an important compression tool. It is now the method of choice for adaptive coding on multisymbol alphabets because of its speed, low storage requirements, and effectiveness of compression. This article describes a new implementation of arithmeti ..."
Abstract

Cited by 139 (2 self)
 Add to MetaCart
Over the last decade, arithmetic coding has emerged as an important compression tool. It is now the method of choice for adaptive coding on multi-symbol alphabets because of its speed, low storage requirements, and effectiveness of compression. This article describes a new implementation of arithmetic coding that incorporates several improvements over a widely used earlier version by Witten, Neal, and Cleary, which has become a de facto standard. These improvements include fewer multiplicative operations, greatly extended range of alphabet sizes and symbol probabilities, and the use of low-precision arithmetic, permitting implementation by fast shift/add operations. We also describe a modular structure that separates the coding, modeling, and probability estimation components of a compression system. To motivate the improved coder, we consider the needs of a word-based text compression program. We report a range of experimental results using this and other models. Complete source code is available.
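The core of any arithmetic coder is the successive narrowing of a probability interval, one symbol at a time; the paper's contribution is performing this with low-precision integer shift/add operations. A floating-point sketch of the interval narrowing alone (not the paper's implementation, which additionally handles renormalisation, incremental bit output, and adaptive probability estimation):

```python
def arith_interval(message, probs):
    """Narrow [low, high) once per symbol.

    Any value inside the final interval, together with the message
    length, suffices to identify the message; a real coder emits the
    bits of such a value incrementally.
    """
    # cumulative ranges: each symbol owns a slice of [0, 1)
    ranges, cum = {}, 0.0
    for sym, p in probs.items():
        ranges[sym] = (cum, cum + p)
        cum += p
    low, high = 0.0, 1.0
    for sym in message:
        lo, hi = ranges[sym]
        width = high - low
        low, high = low + width * lo, low + width * hi
    return low, high
```

The final interval width equals the product of the symbol probabilities, so its negative log is the ideal code length for the message.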
Context-based adaptive binary arithmetic coding in the H.264/AVC video compression standard
 IEEE Transactions on Circuits and Systems for Video Technology
"... (CABAC) as a normative part of the new ITUT/ISO/IEC standard H.264/AVC for video compression is presented. By combining an adaptive binary arithmetic coding technique with context modeling, a high degree of adaptation and redundancy reduction is achieved. The CABAC framework also includes a novel l ..."
Abstract

Cited by 110 (6 self)
 Add to MetaCart
Context-based adaptive binary arithmetic coding (CABAC) as a normative part of the new ITU-T/ISO/IEC standard H.264/AVC for video compression is presented. By combining an adaptive binary arithmetic coding technique with context modeling, a high degree of adaptation and redundancy reduction is achieved. The CABAC framework also includes a novel low-complexity method for binary arithmetic coding and probability estimation that is well suited for efficient hardware and software implementations. CABAC significantly outperforms the baseline entropy coding method of H.264/AVC for the typical area of envisaged target applications. For a set of test sequences representing typical material used in broadcast applications and for a range of acceptable video quality of about 30 to 38 dB, average bit-rate savings of 9%–14% are achieved. Index Terms—Binary arithmetic coding, CABAC, context modeling, entropy coding, H.264, MPEG-4 AVC.
A low-complexity modeling approach for embedded coding of wavelet coefficients
 in Proc. 1998 IEEE Data Compression Conference
, 1998
"... progressive image compression, Laplacian density, runlength coding, rate distortion We present a new lowcomplexity method for modeling and coding the bitplanes of a wavelettransformed image in a fully embedded fashion. The scheme uses a simple ordering model for embedding, based on the principle ..."
Abstract

Cited by 45 (2 self)
 Add to MetaCart
Keywords: progressive image compression, Laplacian density, run-length coding, rate-distortion. We present a new low-complexity method for modeling and coding the bit-planes of a wavelet-transformed image in a fully embedded fashion. The scheme uses a simple ordering model for embedding, based on the principle that coefficient bits that are likely to reduce the distortion the most should be described first in the encoded bitstream. The ordering model is tied to a conditioning model in a way that de-interleaves the conditioned subsequences of coefficient bits, making them amenable to coding with a very simple, adaptive
Fast and efficient lossless image compression
 in Proc. 1993 Data Compression Conference, (Snowbird)
, 1993
"... We present a new method for lossless image compression that gives compression comparable to JPEG lossless mode with about five times the speed. Our method, called FELICS, is based on a novel use of two neighboring pixels for both prediction and error modeling. For coding we use single bits, adjusted ..."
Abstract

Cited by 34 (0 self)
 Add to MetaCart
We present a new method for lossless image compression that gives compression comparable to JPEG lossless mode with about five times the speed. Our method, called FELICS, is based on a novel use of two neighboring pixels for both prediction and error modeling. For coding we use single bits, adjusted binary codes, and Golomb or Rice codes. For the latter we present and analyze a provably good method for estimating the single coding parameter.
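A Rice code with parameter k writes the quotient n >> k in unary followed by the k low-order bits, so the right k depends on how large the residuals tend to be; FELICS's contribution is a provably good way to choose it. A sketch (the exhaustive estimator below is a simple stand-in for the paper's adaptive per-context parameter selection):

```python
def rice_encode(n, k):
    """Rice code for n >= 0: quotient n >> k in unary (ones then a
    terminating zero), followed by the k low-order remainder bits."""
    code = "1" * (n >> k) + "0"
    if k:
        code += format(n & ((1 << k) - 1), f"0{k}b")
    return code

def best_k(values, k_max=8):
    """Pick the k minimising total code length over a sample of values."""
    return min(range(k_max + 1),
               key=lambda k: sum(len(rice_encode(v, k)) for v in values))
```

Small k suits small residuals (short remainders) while large k caps the unary part for large ones; `best_k` simply trades the two off over observed data.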
Parameterised Compression for Sparse Bitmaps
 Proc. ACM-SIGIR International Conference on Research and Development in Information Retrieval
, 1992
"... : Fulltext retrieval systems typically use either a bitmap or an inverted file to identify which documents contain which words, so that the documents containing any combination of words can be quickly located. Bitmaps of word occurrences are large, but are usually sparse, and thus are amenable to a ..."
Abstract

Cited by 29 (8 self)
 Add to MetaCart
Full-text retrieval systems typically use either a bitmap or an inverted file to identify which documents contain which words, so that the documents containing any combination of words can be quickly located. Bitmaps of word occurrences are large, but are usually sparse, and thus are amenable to a variety of compression techniques. Here we consider techniques in which the encoding of each bit-vector within the bitmap is parameterised, so that a different code can be used for each bit-vector. Our experimental results show that the new methods yield better compression than previous techniques. Categories and Subject Descriptors: E.4 [Coding and Information Theory]: Data compaction and compression; H.3.2 [Information Storage]: File organisation. Keywords: Full-text retrieval, data compression, document database, Huffman coding, geometric distribution, inverted file. 1 Introduction. Full-text retrieval systems are used for storing and accessing document collections such as newspaper a...
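For a bit-vector with f ones among N positions, the gaps between ones are approximately geometrically distributed, and a classic parameter choice for a Golomb code is b ≈ ln 2 × N/f, fitted per vector. A sketch of per-vector parameterised gap coding under those assumptions (the helper names are ours, and the remainder uses standard truncated binary; the paper's own methods differ in detail):

```python
import math

def golomb_encode(n, b):
    """Golomb code, parameter b: quotient in unary, remainder in
    truncated binary (the first (2^k - b) remainders get k-1 bits)."""
    q, r = divmod(n, b)
    code = "1" * q + "0"
    if b > 1:
        k = math.ceil(math.log2(b))
        cutoff = (1 << k) - b
        if r < cutoff:
            code += format(r, f"0{k - 1}b")
        else:
            code += format(r + cutoff, f"0{k}b")
    return code

def compress_bitvector(positions, n_bits):
    """Code a sparse bit-vector's gaps with its own parameter b,
    chosen from the vector's density as b = round(ln2 * N/f)."""
    b = max(1, round(0.69 * n_bits / len(positions)))
    gaps = [positions[0]] + [y - x - 1 for x, y in zip(positions, positions[1:])]
    return b, "".join(golomb_encode(g, b) for g in gaps)
```

Because b is stored once per bit-vector, a dense vector and a sparse vector in the same bitmap each get a code matched to their own gap distribution, which is the point of the parameterisation.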
Existence of Optimal Prefix Codes for Infinite Source Alphabets
"... It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal prefix code which can be constructed from Huffman codes for truncated versions of the random variable, and that the average lengths of any sequence of Huffman codes for th ..."
Abstract

Cited by 23 (2 self)
 Add to MetaCart
It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal prefix code which can be constructed from Huffman codes for truncated versions of the random variable, and that the average lengths of any sequence of Huffman codes for the truncated versions converge to that of the optimal code. Also, it is shown that every optimal infinite code achieves Kraft's inequality with equality. Index Terms—Huffman, lossless coding, prefix codes. I. INTRODUCTION An alphabet A is a finite set and A* is the set of all finite-length words formed from the elements of A. For each word w ∈ A*, let l(w) denote the word length of w. A D-ary prefix code C over an alphabet A (with |A| = D) is a subset of A* with the property that no word in C is the prefix of another word in C. Let Z+ denote the positive integers. A sequence of D-ary prefix codes C1, C2, C3, ..., converges to an infinite prefix code C if for every i ≥ 1, the it...
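The Kraft-equality property is easy to check numerically for any finite truncation: build a Huffman code for the truncated distribution and verify that the code lengths sum to exactly one under 2^-l. A sketch (binary case only; Huffman lengths computed via a min-heap of merged symbol groups, exact arithmetic via fractions):

```python
import heapq
from fractions import Fraction

def huffman_lengths(freqs):
    """Code lengths of a binary Huffman code for the given frequencies."""
    heap = [(f, [s]) for s, f in enumerate(freqs)]
    heapq.heapify(heap)
    lengths = [0] * len(freqs)
    while len(heap) > 1:
        fa, sa = heapq.heappop(heap)
        fb, sb = heapq.heappop(heap)
        for s in sa + sb:          # every symbol under the merge gains a bit
            lengths[s] += 1
        heapq.heappush(heap, (fa + fb, sa + sb))
    return lengths

def kraft_sum(lengths):
    """Exact value of the Kraft sum, sum of 2^-l over all code lengths."""
    return sum(Fraction(1, 2 ** l) for l in lengths)
```

A finite Huffman code is complete, so `kraft_sum` returns exactly 1; the theorem above extends the equality to optimal codes on countably infinite alphabets.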
Group Testing for Image Compression
 In Proceedings DCC 2000, Data Compression Conference
, 2000
"... This paper presents Group Testing for Wavelets (GTW), a novel embedded waveletbased image compression algorithm based on the concept of group testing. We explain how group testing is a generalization of the zerotree coding technique for wavelettransformed images. We also show that Golomb coding is ..."
Abstract

Cited by 18 (5 self)
 Add to MetaCart
This paper presents Group Testing for Wavelets (GTW), a novel embedded wavelet-based image compression algorithm based on the concept of group testing. We explain how group testing is a generalization of the zero-tree coding technique for wavelet-transformed images. We also show that Golomb coding is equivalent to Hwang's group testing algorithm (Hwang, 1972). GTW is similar to SPIHT (Said & Pearlman, 1996) but replaces SPIHT's significance pass with a new group-testing-based method. Although no arithmetic coding is implemented, GTW performs competitively with SPIHT's arithmetic coding variant in terms of rate-distortion performance.