Results 1 – 8 of 8
The Design and Analysis of Efficient Lossless Data Compression Systems
, 1993
Abstract

Cited by 50 (0 self)
Our thesis is that high compression efficiency for text and images can be obtained by using sophisticated statistical compression techniques, and that greatly increased speed can be achieved at only a small cost in compression efficiency. Our emphasis is on elegant design and mathematical as well as empirical analysis. We analyze arithmetic coding as it is commonly implemented and show rigorously that almost no compression is lost in the implementation. We show that high-efficiency lossless compression of both text and grayscale images can be obtained by using appropriate models in conjunction with arithmetic coding. We introduce a four-component paradigm for lossless image compression and present two methods that give state-of-the-art compression efficiency. In the text compression area, we give a small improvement on the preferred method in the literature. We show that we can often obtain significantly improved throughput at the cost of slightly reduced compression. The extra speed c...
Analysis of Arithmetic Coding for Data Compression
 INFORMATION PROCESSING AND MANAGEMENT
, 1992
Abstract

Cited by 36 (6 self)
Arithmetic coding, in conjunction with a suitable probabilistic model, can provide nearly optimal data compression. In this article we analyze the effect that the model and the particular implementation of arithmetic coding have on the code length obtained. Periodic scaling is often used in arithmetic coding implementations to reduce time and storage requirements; it also introduces a recency effect which can further affect compression. Our main contribution is introducing the concept of weighted entropy and using it to characterize in an elegant way the effect that periodic scaling has on the code length. We explain why and by how much scaling increases the code length for files with a homogeneous distribution of symbols, and we characterize the reduction in code length due to scaling for files exhibiting locality of reference. We also give a rigorous proof that the coding effects of rounding scaled weights, using integer arithmetic, and encoding end-of-file are negligible.
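As a rough illustration of the mechanism these abstracts analyze, the following sketch narrows a real interval by each symbol's probability slice. The fixed symbol distribution and float arithmetic are assumptions for illustration only; the implementations studied above use integer arithmetic with the periodic scaling the abstract describes.

```python
import math

def arith_interval(msg, probs):
    """Narrow [low, high) by each symbol's probability slice.

    Float sketch for illustration; practical coders use integer
    arithmetic with periodic scaling to avoid precision loss.
    """
    # Build the cumulative distribution: sym -> [cum_lo, cum_hi).
    cum, acc = {}, 0.0
    for sym, p in probs.items():
        cum[sym] = (acc, acc + p)
        acc += p
    low, high = 0.0, 1.0
    for sym in msg:
        lo, hi = cum[sym]
        span = high - low
        low, high = low + span * lo, low + span * hi
    return low, high  # any number in [low, high) identifies msg

low, high = arith_interval("ab", {"a": 0.5, "b": 0.5})
# The ideal code length is about -log2(high - low) bits.
bits = -math.log2(high - low)
```

The final interval width equals the product of the symbol probabilities, which is why the code length approaches the entropy of the message under the model.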
Practical Implementations of Arithmetic Coding
 IN IMAGE AND TEXT
, 1992
Abstract

Cited by 34 (6 self)
We provide a tutorial on arithmetic coding, showing how it provides nearly optimal data compression and how it can be matched with almost any probabilistic model. We indicate the main disadvantage of arithmetic coding, its slowness, and give the basis of a fast, space-efficient, approximate arithmetic coder with only minimal loss of compression efficiency. Our coder is based on the replacement of arithmetic by table lookups coupled with a new deterministic probability estimation scheme.
The Burrows-Wheeler Transform: Theory and Practice
 Lecture Notes in Computer Science
, 1999
Abstract

Cited by 5 (1 self)
In this paper we describe the Burrows-Wheeler Transform (BWT), a completely new approach to data compression which is the basis of some of the best compressors available today. Although it is easy to intuitively understand why the BWT helps compression, the analysis of BWT-based algorithms requires a careful study of every single algorithmic component. We describe two algorithms which use the BWT and we show that their compression ratio can be bounded in terms of the k-th order empirical entropy of the input string for any k ≥ 0. Intuitively, this means that these algorithms are able to make use of all the regularity which is in the input string.
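A minimal sketch of the forward transform, using the naive sorted-rotations formulation rather than the suffix-array construction practical compressors use; the `$` sentinel is an assumption for illustration:

```python
def bwt(s: str, sentinel: str = "$") -> str:
    """Burrows-Wheeler Transform via sorted rotations (O(n^2 log n) sketch)."""
    s += sentinel  # unique end marker makes the transform invertible
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

# Equal symbols cluster into runs, which is what later stages exploit:
# bwt("banana") -> "annb$aa"
```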
Move-to-Front, Distance Coding, and Inversion Frequencies Revisited
, 2007
Abstract

Cited by 4 (0 self)
Move-to-Front, Distance Coding and Inversion Frequencies are three somewhat related techniques used to process the output of the Burrows-Wheeler Transform. In this paper we analyze these techniques from the point of view of how effective they are in the task of compressing low-entropy strings, that is, strings which have many regularities and are therefore highly compressible. This is a non-trivial task since many compressors have non-constant overheads that become non-negligible when the input string is highly compressible. Because of the properties of the Burrows-Wheeler transform, being locally optimal ensures that an algorithm compresses low-entropy strings effectively. Informally, local optimality implies that an algorithm is able to effectively compress an arbitrary partition of the input string. We show that in their original formulation neither Move-to-Front, nor Distance Coding, nor Inversion Frequencies is locally optimal. Then, we describe simple variants of the above algorithms which are locally optimal. To achieve local optimality with Move-to-Front it suffices to combine it with Run-Length Encoding. To achieve local optimality with Distance Coding and Inversion Frequencies we use a novel “escape and re-enter” strategy.
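A minimal Move-to-Front sketch over a byte alphabet, illustrative only; the locally optimal variant discussed above additionally combines this with Run-Length Encoding:

```python
def mtf_encode(data: bytes) -> list[int]:
    """Move-to-Front: emit each byte's current rank, then move it to rank 0."""
    table = list(range(256))
    ranks = []
    for b in data:
        r = table.index(b)
        ranks.append(r)
        table.insert(0, table.pop(r))  # recently seen symbols get small ranks
    return ranks

# Runs in the BWT output become runs of zeros, cheap for the final coder:
# mtf_encode(b"aaabbb") -> [97, 0, 0, 98, 0, 0]
```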
A Comparison of Methods for Redundancy Reduction in Recurrence Time Coding
Abstract
Abstract — Recurrence time of a symbol in a string is defined as the number of symbols that have appeared since the last previous occurrence of the same symbol. It is one of the most fundamental quantities that can be used in universal source coding. If we count only the minimum required number of symbols occurring in the recurrence period, we can reduce some redundancy contained in recurrence time coding. The MTF (move-to-front) scheme is a typical example that shares this idea. In this correspondence, we establish three such schemes and compare them with one another from the viewpoint that they can be thought of as different attempts to realize the above idea. Index Terms — MTF, data compression, recency rank, recurrence time, source coding, universal codes
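The definition above can be sketched directly. This hypothetical helper takes one common reading of the definition, counting positions between occurrences; `None` marks a symbol's first appearance, which the schemes in the correspondence handle separately:

```python
def recurrence_times(seq):
    """Distance, in symbols, since the previous occurrence of each symbol.

    Counts positions (i - last), so an immediate repeat has distance 1;
    variants that count only intervening symbols would subtract 1.
    """
    last, out = {}, []
    for i, sym in enumerate(seq):
        out.append(i - last[sym] if sym in last else None)
        last[sym] = i
    return out

# recurrence_times("abab") -> [None, None, 2, 2]
```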
An Information-Theoretic Study on Variable-Length Source Coding with Unequal Cost
, 2000
"... Copyright by Osamu UCHIDA ..."
An Analysis of the Burrows-Wheeler Transform
, 1999
Abstract
The Burrows-Wheeler Transform (also known as Block-Sorting) is at the base of compression algorithms which are the state of the art in lossless data compression. In this paper we analyze two algorithms which use this technique. The first one is the original algorithm described by Burrows and Wheeler, which, despite its simplicity, outperforms the Gzip compressor. The second one uses an additional run-length encoding step to improve compression. We prove that the compression ratio of both algorithms can be bounded in terms of the k-th order empirical entropy of the input string for any k ≥ 0. We make no assumptions on the input and we obtain bounds which hold in the worst case, that is, for every possible input string. All previous results for Block-Sorting algorithms were concerned with the average compression ratio and have been established assuming that the input comes from a finite-order Markov source. Dipartimento di Scienze e Tecnologie Avanzate, Università del Piemonte Orienta...
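The additional run-length encoding step used by the second algorithm can be sketched as follows; this is a generic RLE, not necessarily the exact variant analyzed in the paper:

```python
def rle(s: str) -> list[tuple[str, int]]:
    """Collapse each maximal run into a (symbol, length) pair."""
    runs: list[tuple[str, int]] = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            runs.append((ch, 1))
    return runs

# Block-Sorting output is run-rich, so this shrinks it before entropy coding:
# rle("aaabba") -> [("a", 3), ("b", 2), ("a", 1)]
```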