A Fast Block-Sorting Algorithm for Lossless Data Compression
1996
Abstract

Cited by 41 (0 self)
I describe a fast block-sorting algorithm and its implementation, to be used as a front end to simple lossless data compression algorithms such as move-to-front coding. I also compare it with widely available data compression algorithms running on the same hardware. My algorithm achieves higher speed than comparable algorithms while maintaining the same good compression. Since it is a derivative of the algorithm published by M. Burrows and D.J. Wheeler, the size of the input blocks must be large to achieve good compression. Unlike their method, execution speed here does not depend on the block size used. I will also present improvements to the back end of block-sorting compression methods. 1 Introduction Today's popular lossless data compression algorithms are mainly based on the sequential data compression published by Lempel and Ziv in 1977 [1] and 1978 [2]. There were improvements such as in [3] or the developme...
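The move-to-front coding mentioned as a front end above can be sketched as follows. This is a minimal illustration of the general technique (a recency-ordered byte alphabet), not the paper's implementation:

```python
def mtf_encode(data: bytes) -> list[int]:
    """Move-to-front: emit each byte's current index in a
    recency-ordered alphabet, then move that byte to the front."""
    alphabet = list(range(256))
    out = []
    for b in data:
        i = alphabet.index(b)
        out.append(i)
        alphabet.pop(i)
        alphabet.insert(0, b)
    return out

def mtf_decode(indices: list[int]) -> bytes:
    """Inverse: look up each index in the same evolving alphabet."""
    alphabet = list(range(256))
    out = bytearray()
    for i in indices:
        b = alphabet.pop(i)
        out.append(b)
        alphabet.insert(0, b)
    return bytes(out)
```

After a block sort, equal symbols cluster together, so the MTF output is dominated by small indices and long runs of zeros, which a simple statistical coder then compresses well.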
Block Sorting Text Compression - Final Report
1996
Abstract

Cited by 18 (3 self)
A recent development in text compression is a "block sorting" algorithm which permutes the input text according to a special sort procedure and then processes the permuted text with Move-to-Front and a final statistical compressor. The technique combines good speed with excellent compression performance. This report investigates the block sorting compression algorithm, in particular trying to understand its operation and limitations. Various approaches are investigated in an attempt to improve the compression with block sorting, most of which involve a hierarchy of coding models to allow fast adaptation to local contexts. The best technique involves a new "structured" coding model, especially designed for compressing data with skewed symbol distributions. Block sorting compression is found to be related to work by Shannon in 1951 on the prediction of English text. The work confirms block sorting as a good text compression technique, with a compression approaching that of the currently be...
Symbol ranking text compression with Shannon recoding
 J. UCS
1997
Abstract

Cited by 3 (1 self)
In his work on the information content of English text in 1951, Shannon described a method of recoding the input text, a technique which has apparently lain dormant for the ensuing 45 years. Whereas traditional compressors exploit symbol frequencies and symbol contexts, Shannon's method adds the concept of "symbol ranking", as in 'the next symbol is the third most likely in the present context'. While some other recent compressors can be explained in terms of symbol ranking, few make explicit reference to the concept. This report describes an implementation of Shannon's method and shows that it forms the basis of a good text compressor.
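As a toy illustration of the symbol-ranking idea (not Shannon's procedure or the report's implementation), one can use recency within a short context as a stand-in for likelihood: each context keeps its symbols in most-recently-seen order, and the coder emits the current symbol's rank. The escape rule for symbols new to a context is a hypothetical choice made here for the sketch:

```python
from collections import defaultdict

def rank_encode(text: str, order: int = 2) -> list[int]:
    """Emit each symbol's recency rank within its preceding context.
    Novel symbols escape as len(list) + ord(ch) (hypothetical rule)."""
    ranks = defaultdict(list)   # context -> recency-ordered symbols
    out = []
    for i, ch in enumerate(text):
        ctx = text[max(0, i - order):i]
        lst = ranks[ctx]
        if ch in lst:
            r = lst.index(ch)
            lst.pop(r)
        else:
            r = len(lst) + ord(ch)  # escape: symbol new to this context
        lst.insert(0, ch)
        out.append(r)
    return out

def rank_decode(codes: list[int], order: int = 2) -> str:
    """Rebuild the text by replaying the same ranking state."""
    ranks = defaultdict(list)
    out: list[str] = []
    for r in codes:
        ctx = "".join(out[max(0, len(out) - order):])
        lst = ranks[ctx]
        if r < len(lst):
            ch = lst.pop(r)
        else:
            ch = chr(r - len(lst))
        lst.insert(0, ch)
        out.append(ch)
    return "".join(out)
```

On redundant text the ranks are mostly 0 or 1, a heavily skewed stream that a statistical back end can code compactly.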
Lossless Text Compression using Dictionaries
Abstract

Cited by 1 (0 self)
Compression is used just about everywhere. Reducing the size of compressed data and speeding its retrieval from large collections are both important today. We propose a pre-compression technique that can be applied to text files. The output of our technique can be further processed by standard compression techniques such as arithmetic coding and BZIP2, which yields a better compression ratio. The algorithm suggested here uses a dynamic dictionary created at run time and is also suitable for searching for phrases in the compressed file.
Symbol Ranking Text Compression
1996
Abstract

Cited by 1 (0 self)
In his work on the information content of English text in 1951, Shannon described a method of recoding the input text, a technique which has apparently lain dormant for the ensuing 45 years. Whereas traditional compressors exploit symbol frequencies and symbol contexts, Shannon's method adds the concept of "symbol ranking", as in 'the next symbol is the third most likely in the present context'. This report describes an implementation of his method and shows that it forms the basis of a good text compressor.¹ The recent "acb" compressor of Buynovsky is shown to belong to the general class of symbol ranking compressors. Keywords: text compression, Shannon, symbol ranking. ¹ This report has been submitted as a paper to the Journal of Universal Computer Science. It is available by anonymous ftp from ftp.cs.auckland.ac.nz /out/peterf/TechRep132. 1. Introduction In 1951 C.E. Shannon published his classic paper on the information content of English text, establishing the well-known bo...
Burrows Wheeler Compression: Principles and Reflections
Abstract
After a general description of the Burrows Wheeler Transform and a brief survey of recent work on processing its output, the paper examines the coding of the zero-runs from the MTF recoding stage, an aspect with little prior treatment. It is concluded that the original scheme proposed by Wheeler is extremely efficient and unlikely to be much improved. The paper then proposes some new interpretations and uses of the Burrows Wheeler transform, with new insights and approaches to lossless compression, perhaps including techniques from error correction. 1 Introduction to Lossless Data Compression Lossless data compression involves the compression of files such that they can later be recovered bit-wise identical to the original. Comprehensive descriptions are in the book by Bell et al. [3] or a more recent one edited by Khalid Sayood [18]. Several introductory matters must be covered, though, before any detailed discussion of specific algorithms –
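The zero-run coding examined above can be illustrated with the scheme usually attributed to Wheeler (and used, for example, in bzip2): a run of n zeros from the MTF stage is written in bijective base 2 using two symbols, conventionally called RUNA and RUNB. This is a sketch from general knowledge of that scheme, not code from the paper:

```python
def encode_run(n: int) -> list[str]:
    """Encode a run of n >= 1 zeros as RUNA/RUNB digits in
    bijective base 2 (digit values 1 and 2, least significant first)."""
    out = []
    while n > 0:
        d = 2 if n % 2 == 0 else 1
        out.append("RUNA" if d == 1 else "RUNB")
        n = (n - d) // 2
    return out

def decode_run(digits: list[str]) -> int:
    """Sum digit values 1 (RUNA) or 2 (RUNB) weighted by powers of 2."""
    n, weight = 0, 1
    for s in digits:
        n += (1 if s == "RUNA" else 2) * weight
        weight *= 2
    return n
```

Because every positive integer has exactly one bijective base-2 representation, a run of any length is coded in about log2(n) symbols with no terminator needed beyond the next non-zero MTF symbol.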
LOSSLESS IMAGE COMPRESSION USING SEQUENTIAL APPROACH
Abstract
A sequential image compression technique has been proposed in this paper for obtaining a lossless image at the receiving side. Apart from the benefit of sequential processing, the time complexity of the algorithm is significantly smaller compared to the quadratic approaches. The compressed code is stored judiciously to facilitate the decompression process for retrieving the original image. The compression algorithm has been applied successfully on bitmap images, yielding a good compression ratio.
Text Compression Methods Based on Dictionaries
Abstract
Compression is used just about everywhere. Reducing the size of compressed data and speeding its retrieval from large collections are both important today. We propose a pre-compression technique that can be applied to text files. The output of our technique can be further processed by standard compression techniques such as arithmetic coding and BZIP2, which yields a better compression ratio. The algorithm suggested here uses a dynamic dictionary created at run time and is also suitable for searching for phrases in the compressed file.
Burrows Wheeler Compression
2002
Abstract
Author’s Note. The material of this chapter, while quoting extensively from other work, is in part a summary of my own experience and thoughts in working with the Burrows-Wheeler compression algorithm. Some of it is accordingly rather less formal in style than might otherwise be the case, as I give more personal opinions on various aspects. Where my own work already appears in the public domain it is cited in the normal way, but unpublished material simply refers to “the author”. 1 Introduction. Block Sorting compression, or “Burrows-Wheeler compression” is a relatively new algorithm of good compression and speed, first presented by Burrows and Wheeler in 1994 [1], although Wheeler had discovered the basic algorithm some 10 years earlier. In contrast to most other compression algorithms it treats the incoming text as a block, or sequence of blocks, with transformations applied to each block.
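The block transformation described above can be sketched with the textbook form of the Burrows-Wheeler Transform: sort all rotations of the block and take the last column. This naive version (quadratic in the block length) is an illustration of the transform itself, not a production suffix-sorting implementation:

```python
def bwt(s: str, sentinel: str = "\0") -> str:
    """Naive BWT: append a unique sentinel, sort all rotations,
    and return the last column of the sorted rotation matrix."""
    s += sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def ibwt(last: str, sentinel: str = "\0") -> str:
    """Invert by repeatedly prepending the last column and sorting,
    rebuilding the sorted rotation matrix one column at a time."""
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    row = next(r for r in table if r.endswith(sentinel))
    return row[:-1]
```

The output contains exactly the same characters as the input, but symbols that occur in similar contexts end up adjacent, which is what makes the subsequent MTF and statistical coding stages effective.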