
CiteSeerX

Results 1 - 10 of 17,042

A universal algorithm for sequential data compression

by Jacob Ziv, Abraham Lempel - IEEE Transactions on Information Theory, 1977
"... A universal algorithm for sequential data compression is presented. Its performance is investigated with respect to a nonprobabilistic model of constrained sources. The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainabl ..."
Abstract - Cited by 1522 (7 self)
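
The Ziv-Lempel scheme replaces repeated substrings with back-references into a sliding window of previously seen data. A minimal sketch of that idea in Python, assuming a greedy parser and an illustrative (offset, length, next-byte) token format rather than the paper's exact construction:

    def lz77_compress(data: bytes, window: int = 4096, max_len: int = 15):
        """Greedy LZ77-style parse: emit (offset, length, next_byte) triples."""
        i, out = 0, []
        while i < len(data):
            best_off, best_len = 0, 0
            for j in range(max(0, i - window), i):   # candidate match starts in the window
                l = 0
                while l < max_len and i + l < len(data) - 1 and data[j + l] == data[i + l]:
                    l += 1
                if l > best_len:
                    best_off, best_len = i - j, l
            out.append((best_off, best_len, data[i + best_len]))
            i += best_len + 1
        return out

    def lz77_decompress(tokens) -> bytes:
        buf = bytearray()
        for off, length, nxt in tokens:
            for _ in range(length):
                buf.append(buf[-off])   # byte-by-byte copy handles overlapping matches
            buf.append(nxt)
        return bytes(buf)

    tokens = lz77_compress(b"abracadabra abracadabra")
    assert lz77_decompress(tokens) == b"abracadabra abracadabra"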

A block-sorting lossless data compression algorithm

by M. Burrows, D. J. Wheeler, 1994
"... We describe a block-sorting, lossless data compression algorithm, and our implementation of that algorithm. We compare the performance of our implementation with widely available data compressors running on the same hardware. The algorithm works by applying a reversible transformation to a block o ..."
Abstract - Cited by 809 (5 self)
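
The reversible transformation is the Burrows-Wheeler transform: sort all rotations of the block and keep the last column, which groups symbols with similar contexts so that a simple move-to-front/entropy coder compresses them well. A quadratic-time sketch, assuming an end-of-block sentinel character for easy inversion:

    def bwt(s: str, eob: str = "\0") -> str:
        """Forward transform: last column of the sorted rotation matrix."""
        s += eob                                   # sentinel makes rotations unique
        rots = sorted(s[i:] + s[:i] for i in range(len(s)))
        return "".join(r[-1] for r in rots)

    def ibwt(t: str, eob: str = "\0") -> str:
        """Naive inversion: rebuild the sorted rotation matrix column by column."""
        table = [""] * len(t)
        for _ in range(len(t)):
            table = sorted(t[i] + table[i] for i in range(len(t)))
        return next(row for row in table if row.endswith(eob))[:-1]

    assert ibwt(bwt("banana")) == "banana"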

Data Compression

by Debra A. Lelewer, Daniel S. Hirschberg - ACM Computing Surveys, 1987
"... This paper surveys a variety of data compression methods spanning almost forty years of research, from the work of Shannon, Fano and Huffman in the late 40's to a technique developed in 1986. The aim of data compression is to reduce redundancy in stored or communicated data, thus increasing eff ..."
Abstract - Cited by 101 (5 self)
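
Huffman's construction, one of the survey's landmarks, builds an optimal prefix code by repeatedly merging the two least-frequent subtrees. A brief illustrative sketch:

    import heapq
    from collections import Counter

    def huffman_codes(text: str) -> dict:
        # Heap entries are (frequency, tie_breaker, tree); a tree is either
        # a symbol (leaf) or a (left, right) pair (internal node).
        heap = [(f, i, sym) for i, (sym, f) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        tick = len(heap)
        while len(heap) > 1:
            f1, _, a = heapq.heappop(heap)                 # two least-frequent subtrees...
            f2, _, b = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, tick, (a, b)))  # ...merged and reinserted
            tick += 1
        codes = {}
        def walk(node, prefix):
            if isinstance(node, str):
                codes[node] = prefix or "0"                # degenerate one-symbol alphabet
            else:
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
        walk(heap[0][2], "")
        return codes

    print(huffman_codes("abracadabra"))  # the most frequent symbol gets the shortest code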

Data Compression Using Adaptive Coding and Partial String Matching

by John G. Cleary, Ian H. Witten - IEEE Transactions on Communications, 1984
"... The recently developed technique of arithmetic coding, in conjunction with a Markov model of the source, is a powerful method of data compression in situations where a linear treatment is inappropriate. Adaptive coding allows the model to be constructed dynamically by both encoder and decoder during ..."
Abstract - Cited by 442 (20 self)
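
The crucial property is that encoder and decoder grow identical models from the symbols already transmitted, so no model ever needs to be sent. A toy order-0 illustration of that lockstep idea (the paper's method conditions on longer contexts, and the arithmetic coder itself is omitted here):

    class AdaptiveModel:
        """Adaptive order-0 model: both ends start identical and apply the
        same update after each symbol, so their estimates never diverge."""
        def __init__(self, alphabet):
            self.counts = {s: 1 for s in alphabet}   # initial counts avoid zero probabilities
        def prob(self, sym):
            return self.counts[sym] / sum(self.counts.values())
        def update(self, sym):
            self.counts[sym] += 1

    enc, dec = AdaptiveModel("ab"), AdaptiveModel("ab")
    for sym in "abba":
        p = enc.prob(sym)   # an arithmetic coder would spend about -log2(p) bits on sym
        enc.update(sym)
        dec.update(sym)     # the decoder mirrors the update after decoding sym
    assert enc.counts == dec.counts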

Data compression and harmonic analysis

by David L. Donoho, Martin Vetterli, R. A. DeVore, Ingrid Daubechies - IEEE Trans. Inform. Theory, 1998
"... In this paper we review some recent interactions between harmonic analysis and data compression. The story goes back of course to Shannon’s R(D) theory... ..."
Abstract - Cited by 172 (22 self)
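
For reference, the R(D) theory alluded to is Shannon's rate-distortion function, the least achievable coding rate when average distortion up to D is tolerated:

    R(D) = \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})]\le D} I(X;\hat{X})

where the minimum runs over test channels p(x̂|x) and I denotes mutual information.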

Optimal Prefetching via Data Compression

by Jeffrey Scott Vitter, P. Krishnan, 1995
"... Caching and prefetching are important mechanisms for speeding up access time to data on secondary storage. Recent work in competitive online algorithms has uncovered several promising new algorithms for caching. In this paper we apply a form of the competitive philosophy for the first time to the pr ..."
Abstract - Cited by 258 (7 self) - Add to MetaCart
to the problem of prefetching to develop an optimal universal prefetcher in terms of fault ratio, with particular applications to large-scale databases and hypertext systems. Our prediction algorithms for prefetching are novel in that they are based on data compression techniques that are both theoretically
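
The underlying idea is that a good compressor is a good predictor: pages a compression model would assign high probability are the ones worth prefetching. A toy first-order Markov predictor in that spirit (the order-1 model and page labels are illustrative assumptions, not the paper's algorithms):

    from collections import defaultdict, Counter

    class MarkovPrefetcher:
        """Predict the next page from first-order access statistics, the way
        an order-1 compression model predicts the next symbol."""
        def __init__(self):
            self.follows = defaultdict(Counter)   # page -> histogram of successors
            self.prev = None
        def access(self, page):
            if self.prev is not None:
                self.follows[self.prev][page] += 1
            self.prev = page
        def predict(self):
            if self.prev is None or not self.follows[self.prev]:
                return None                        # no statistics yet: do not prefetch
            return self.follows[self.prev].most_common(1)[0][0]

    p = MarkovPrefetcher()
    for page in ["A", "B", "A", "B", "A"]:
        p.access(page)
    print(p.predict())  # "B": after every past "A" the next access was "B"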

Data Compression

by Dr. Yair Wiseman
"... The course details various data compression techniques used on various environments. It explores different data compression methods, explaining the theory behind each and showing how compression algorithms significantly increase the storage capacity of their system. Each technique is fully illustrat ..."
Abstract

Data Compression

by Glen G. Langdon, Jr., 1999
"... Information Source The notion of information source was formalized in [Sha 48]. Shannon dened an abstraction called a discrete information source. In Figure 4.1, we illustrate a six-symbol alphabet A, with the uniform distribution, from which several samples have been drawn at random. In the exampl ..."
Abstract - Cited by 1 (0 self)
[Figure 4.1: Example of an Information Source] ... Shannon's major contributions to information theory were in the late 1940s and the 1950s, when the analog world reigned. Today, the world is mercifully digital, and digital systems handle most of the needs of data ...
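
For the uniform six-symbol source the snippet describes, Shannon's entropy gives the lossless compression limit in bits per symbol:

    H(A) = -\sum_{i=1}^{6} \tfrac{1}{6} \log_2 \tfrac{1}{6} = \log_2 6 \approx 2.585

so no lossless code for this source can average fewer than about 2.585 bits per symbol.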

Compressive sampling

by Emmanuel J. Candès , 2006
"... Conventional wisdom and common practice in acquisition and reconstruction of images from frequency data follow the basic principle of the Nyquist density sampling theory. This principle states that to reconstruct an image, the number of Fourier samples we need to acquire must match the desired res ..."
Abstract - Cited by 1441 (15 self)
... of new data acquisition protocols that translate analog information into digital form with fewer sensors than what was considered necessary. This new sampling theory may come to underlie procedures for sampling and compressing data simultaneously. In this short survey, we provide some of the key ...
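
In the now-standard notation (not spelled out in this snippet), the claim is that an s-sparse signal x ∈ R^n can be recovered from m ≪ n linear measurements by convex optimization:

    y = \Phi x, \qquad \hat{x} = \arg\min_{z \in \mathbb{R}^n} \|z\|_1 \quad \text{subject to} \quad \Phi z = y

and for suitably random Φ the recovery is exact with high probability once m is on the order of s·log(n/s).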

Compressed sensing

by Yaakov Tsaig, David L. Donoho , 2004
"... We study the notion of Compressed Sensing (CS) as put forward in [14] and related work [20, 3, 4]. The basic idea behind CS is that a signal or image, unknown but supposed to be compressible by a known transform, (eg. wavelet or Fourier), can be subjected to fewer measurements than the nominal numbe ..."
Abstract - Cited by 3625 (22 self)
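
A tiny numerical illustration of that principle (the problem sizes, the Gaussian measurement ensemble, and the linear-programming formulation of basis pursuit are all illustrative assumptions):

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    n, m, s = 64, 24, 3                       # ambient dim, measurements, sparsity
    x = np.zeros(n)
    x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)
    y = Phi @ x                               # m << n random measurements

    # Basis pursuit min ||z||_1 s.t. Phi z = y, as an LP via z = u - v with u, v >= 0.
    c = np.ones(2 * n)                        # objective: sum(u) + sum(v) = ||z||_1
    A_eq = np.hstack([Phi, -Phi])             # constraint: Phi (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    z = res.x[:n] - res.x[n:]
    print("max recovery error:", np.max(np.abs(z - x)))  # typically ~1e-9: exact recovery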